Mar 08 03:45:00.200047 master-0 systemd[1]: Starting Kubernetes Kubelet...
Mar 08 03:45:00.879376 master-0 kubenswrapper[4045]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 08 03:45:00.879376 master-0 kubenswrapper[4045]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Mar 08 03:45:00.879376 master-0 kubenswrapper[4045]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 08 03:45:00.879376 master-0 kubenswrapper[4045]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 08 03:45:00.879376 master-0 kubenswrapper[4045]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Mar 08 03:45:00.879376 master-0 kubenswrapper[4045]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 08 03:45:00.880967 master-0 kubenswrapper[4045]: I0308 03:45:00.880733 4045 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 08 03:45:00.887777 master-0 kubenswrapper[4045]: W0308 03:45:00.887718 4045 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 08 03:45:00.887777 master-0 kubenswrapper[4045]: W0308 03:45:00.887752 4045 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 08 03:45:00.887777 master-0 kubenswrapper[4045]: W0308 03:45:00.887764 4045 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 08 03:45:00.887777 master-0 kubenswrapper[4045]: W0308 03:45:00.887775 4045 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 08 03:45:00.887777 master-0 kubenswrapper[4045]: W0308 03:45:00.887786 4045 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 08 03:45:00.888119 master-0 kubenswrapper[4045]: W0308 03:45:00.887797 4045 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 08 03:45:00.888119 master-0 kubenswrapper[4045]: W0308 03:45:00.887807 4045 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 08 03:45:00.888119 master-0 kubenswrapper[4045]: W0308 03:45:00.887816 4045 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 08 03:45:00.888119 master-0 kubenswrapper[4045]: W0308 03:45:00.887853 4045 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 08 03:45:00.888119 master-0 kubenswrapper[4045]: W0308 03:45:00.887864 4045 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 08 03:45:00.888119 master-0 kubenswrapper[4045]: W0308 03:45:00.887875 4045 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 08 03:45:00.888119 master-0 kubenswrapper[4045]: W0308 03:45:00.887885 4045 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 08 03:45:00.888119 master-0 kubenswrapper[4045]: W0308 03:45:00.887894 4045 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 08 03:45:00.888119 master-0 kubenswrapper[4045]: W0308 03:45:00.887902 4045 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 08 03:45:00.888119 master-0 kubenswrapper[4045]: W0308 03:45:00.887911 4045 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 08 03:45:00.888119 master-0 kubenswrapper[4045]: W0308 03:45:00.887920 4045 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 08 03:45:00.888119 master-0 kubenswrapper[4045]: W0308 03:45:00.887928 4045 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 08 03:45:00.888119 master-0 kubenswrapper[4045]: W0308 03:45:00.887936 4045 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 08 03:45:00.888119 master-0 kubenswrapper[4045]: W0308 03:45:00.887944 4045 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 08 03:45:00.888119 master-0 kubenswrapper[4045]: W0308 03:45:00.887952 4045 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 08 03:45:00.888119 master-0 kubenswrapper[4045]: W0308 03:45:00.887960 4045 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 08 03:45:00.888119 master-0 kubenswrapper[4045]: W0308 03:45:00.887969 4045 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 08 03:45:00.888119 master-0 kubenswrapper[4045]: W0308 03:45:00.887977 4045 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 08 03:45:00.888119 master-0 kubenswrapper[4045]: W0308 03:45:00.887985 4045 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 08 03:45:00.889093 master-0 kubenswrapper[4045]: W0308 03:45:00.887993 4045 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 08 03:45:00.889093 master-0 kubenswrapper[4045]: W0308 03:45:00.888002 4045 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 08 03:45:00.889093 master-0 kubenswrapper[4045]: W0308 03:45:00.888010 4045 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 08 03:45:00.889093 master-0 kubenswrapper[4045]: W0308 03:45:00.888018 4045 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 08 03:45:00.889093 master-0 kubenswrapper[4045]: W0308 03:45:00.888026 4045 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 08 03:45:00.889093 master-0 kubenswrapper[4045]: W0308 03:45:00.888034 4045 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 08 03:45:00.889093 master-0 kubenswrapper[4045]: W0308 03:45:00.888041 4045 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 08 03:45:00.889093 master-0 kubenswrapper[4045]: W0308 03:45:00.888049 4045 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 08 03:45:00.889093 master-0 kubenswrapper[4045]: W0308 03:45:00.888057 4045 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 08 03:45:00.889093 master-0 kubenswrapper[4045]: W0308 03:45:00.888065 4045 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 08 03:45:00.889093 master-0 kubenswrapper[4045]: W0308 03:45:00.888073 4045 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 08 03:45:00.889093 master-0 kubenswrapper[4045]: W0308 03:45:00.888081 4045 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 08 03:45:00.889093 master-0 kubenswrapper[4045]: W0308 03:45:00.888090 4045 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 08 03:45:00.889093 master-0 kubenswrapper[4045]: W0308 03:45:00.888098 4045 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 08 03:45:00.889093 master-0 kubenswrapper[4045]: W0308 03:45:00.888107 4045 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 08 03:45:00.889093 master-0 kubenswrapper[4045]: W0308 03:45:00.888114 4045 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 08 03:45:00.889093 master-0 kubenswrapper[4045]: W0308 03:45:00.888122 4045 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 08 03:45:00.889093 master-0 kubenswrapper[4045]: W0308 03:45:00.888130 4045 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 08 03:45:00.889093 master-0 kubenswrapper[4045]: W0308 03:45:00.888138 4045 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 08 03:45:00.889093 master-0 kubenswrapper[4045]: W0308 03:45:00.888147 4045 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 08 03:45:00.890119 master-0 kubenswrapper[4045]: W0308 03:45:00.888155 4045 feature_gate.go:330] unrecognized feature gate: Example
Mar 08 03:45:00.890119 master-0 kubenswrapper[4045]: W0308 03:45:00.888162 4045 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 08 03:45:00.890119 master-0 kubenswrapper[4045]: W0308 03:45:00.888173 4045 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 08 03:45:00.890119 master-0 kubenswrapper[4045]: W0308 03:45:00.888181 4045 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 08 03:45:00.890119 master-0 kubenswrapper[4045]: W0308 03:45:00.888191 4045 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 08 03:45:00.890119 master-0 kubenswrapper[4045]: W0308 03:45:00.888198 4045 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 08 03:45:00.890119 master-0 kubenswrapper[4045]: W0308 03:45:00.888206 4045 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 08 03:45:00.890119 master-0 kubenswrapper[4045]: W0308 03:45:00.888214 4045 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 08 03:45:00.890119 master-0 kubenswrapper[4045]: W0308 03:45:00.888221 4045 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 08 03:45:00.890119 master-0 kubenswrapper[4045]: W0308 03:45:00.888229 4045 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 08 03:45:00.890119 master-0 kubenswrapper[4045]: W0308 03:45:00.888237 4045 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 08 03:45:00.890119 master-0 kubenswrapper[4045]: W0308 03:45:00.888277 4045 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 08 03:45:00.890119 master-0 kubenswrapper[4045]: W0308 03:45:00.888287 4045 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 08 03:45:00.890119 master-0 kubenswrapper[4045]: W0308 03:45:00.888296 4045 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 08 03:45:00.890119 master-0 kubenswrapper[4045]: W0308 03:45:00.888303 4045 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 08 03:45:00.890119 master-0 kubenswrapper[4045]: W0308 03:45:00.888311 4045 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 08 03:45:00.890119 master-0 kubenswrapper[4045]: W0308 03:45:00.888319 4045 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 08 03:45:00.890119 master-0 kubenswrapper[4045]: W0308 03:45:00.888327 4045 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 08 03:45:00.890119 master-0 kubenswrapper[4045]: W0308 03:45:00.888335 4045 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 08 03:45:00.890119 master-0 kubenswrapper[4045]: W0308 03:45:00.888342 4045 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 08 03:45:00.891117 master-0 kubenswrapper[4045]: W0308 03:45:00.888350 4045 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 08 03:45:00.891117 master-0 kubenswrapper[4045]: W0308 03:45:00.888358 4045 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 08 03:45:00.891117 master-0 kubenswrapper[4045]: W0308 03:45:00.888365 4045 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 08 03:45:00.891117 master-0 kubenswrapper[4045]: W0308 03:45:00.888373 4045 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 08 03:45:00.891117 master-0 kubenswrapper[4045]: W0308 03:45:00.888380 4045 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 08 03:45:00.891117 master-0 kubenswrapper[4045]: W0308 03:45:00.888389 4045 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 08 03:45:00.891117 master-0 kubenswrapper[4045]: W0308 03:45:00.888397 4045 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 08 03:45:00.891117 master-0 kubenswrapper[4045]: W0308 03:45:00.888405 4045 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 08 03:45:00.891117 master-0 kubenswrapper[4045]: I0308 03:45:00.888554 4045 flags.go:64] FLAG: --address="0.0.0.0"
Mar 08 03:45:00.891117 master-0 kubenswrapper[4045]: I0308 03:45:00.888569 4045 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 08 03:45:00.891117 master-0 kubenswrapper[4045]: I0308 03:45:00.888585 4045 flags.go:64] FLAG: --anonymous-auth="true"
Mar 08 03:45:00.891117 master-0 kubenswrapper[4045]: I0308 03:45:00.888597 4045 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 08 03:45:00.891117 master-0 kubenswrapper[4045]: I0308 03:45:00.888608 4045 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 08 03:45:00.891117 master-0 kubenswrapper[4045]: I0308 03:45:00.888617 4045 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 08 03:45:00.891117 master-0 kubenswrapper[4045]: I0308 03:45:00.888628 4045 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 08 03:45:00.891117 master-0 kubenswrapper[4045]: I0308 03:45:00.888639 4045 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 08 03:45:00.891117 master-0 kubenswrapper[4045]: I0308 03:45:00.888648 4045 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 08 03:45:00.891117 master-0 kubenswrapper[4045]: I0308 03:45:00.888658 4045 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 08 03:45:00.891117 master-0 kubenswrapper[4045]: I0308 03:45:00.888667 4045 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 08 03:45:00.891117 master-0 kubenswrapper[4045]: I0308 03:45:00.888677 4045 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 08 03:45:00.891117 master-0 kubenswrapper[4045]: I0308 03:45:00.888685 4045 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 08 03:45:00.891117 master-0 kubenswrapper[4045]: I0308 03:45:00.888695 4045 flags.go:64] FLAG: --cgroup-root=""
Mar 08 03:45:00.892218 master-0 kubenswrapper[4045]: I0308 03:45:00.888703 4045 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 08 03:45:00.892218 master-0 kubenswrapper[4045]: I0308 03:45:00.888865 4045 flags.go:64] FLAG: --client-ca-file=""
Mar 08 03:45:00.892218 master-0 kubenswrapper[4045]: I0308 03:45:00.888881 4045 flags.go:64] FLAG: --cloud-config=""
Mar 08 03:45:00.892218 master-0 kubenswrapper[4045]: I0308 03:45:00.888892 4045 flags.go:64] FLAG: --cloud-provider=""
Mar 08 03:45:00.892218 master-0 kubenswrapper[4045]: I0308 03:45:00.888903 4045 flags.go:64] FLAG: --cluster-dns="[]"
Mar 08 03:45:00.892218 master-0 kubenswrapper[4045]: I0308 03:45:00.888924 4045 flags.go:64] FLAG: --cluster-domain=""
Mar 08 03:45:00.892218 master-0 kubenswrapper[4045]: I0308 03:45:00.888933 4045 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 08 03:45:00.892218 master-0 kubenswrapper[4045]: I0308 03:45:00.888943 4045 flags.go:64] FLAG: --config-dir=""
Mar 08 03:45:00.892218 master-0 kubenswrapper[4045]: I0308 03:45:00.888952 4045 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 08 03:45:00.892218 master-0 kubenswrapper[4045]: I0308 03:45:00.888966 4045 flags.go:64] FLAG: --container-log-max-files="5"
Mar 08 03:45:00.892218 master-0 kubenswrapper[4045]: I0308 03:45:00.888982 4045 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 08 03:45:00.892218 master-0 kubenswrapper[4045]: I0308 03:45:00.888992 4045 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 08 03:45:00.892218 master-0 kubenswrapper[4045]: I0308 03:45:00.889001 4045 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 08 03:45:00.892218 master-0 kubenswrapper[4045]: I0308 03:45:00.889011 4045 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 08 03:45:00.892218 master-0 kubenswrapper[4045]: I0308 03:45:00.889021 4045 flags.go:64] FLAG: --contention-profiling="false"
Mar 08 03:45:00.892218 master-0 kubenswrapper[4045]: I0308 03:45:00.889030 4045 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 08 03:45:00.892218 master-0 kubenswrapper[4045]: I0308 03:45:00.889040 4045 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 08 03:45:00.892218 master-0 kubenswrapper[4045]: I0308 03:45:00.889051 4045 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 08 03:45:00.892218 master-0 kubenswrapper[4045]: I0308 03:45:00.889061 4045 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 08 03:45:00.892218 master-0 kubenswrapper[4045]: I0308 03:45:00.889075 4045 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 08 03:45:00.892218 master-0 kubenswrapper[4045]: I0308 03:45:00.889085 4045 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 08 03:45:00.892218 master-0 kubenswrapper[4045]: I0308 03:45:00.889094 4045 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 08 03:45:00.892218 master-0 kubenswrapper[4045]: I0308 03:45:00.889104 4045 flags.go:64] FLAG: --enable-load-reader="false"
Mar 08 03:45:00.892218 master-0 kubenswrapper[4045]: I0308 03:45:00.889114 4045 flags.go:64] FLAG: --enable-server="true"
Mar 08 03:45:00.892218 master-0 kubenswrapper[4045]: I0308 03:45:00.889124 4045 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 08 03:45:00.893434 master-0 kubenswrapper[4045]: I0308 03:45:00.889136 4045 flags.go:64] FLAG: --event-burst="100"
Mar 08 03:45:00.893434 master-0 kubenswrapper[4045]: I0308 03:45:00.889147 4045 flags.go:64] FLAG: --event-qps="50"
Mar 08 03:45:00.893434 master-0 kubenswrapper[4045]: I0308 03:45:00.889157 4045 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 08 03:45:00.893434 master-0 kubenswrapper[4045]: I0308 03:45:00.889167 4045 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 08 03:45:00.893434 master-0 kubenswrapper[4045]: I0308 03:45:00.889176 4045 flags.go:64] FLAG: --eviction-hard=""
Mar 08 03:45:00.893434 master-0 kubenswrapper[4045]: I0308 03:45:00.889189 4045 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 08 03:45:00.893434 master-0 kubenswrapper[4045]: I0308 03:45:00.889198 4045 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 08 03:45:00.893434 master-0 kubenswrapper[4045]: I0308 03:45:00.889207 4045 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 08 03:45:00.893434 master-0 kubenswrapper[4045]: I0308 03:45:00.889219 4045 flags.go:64] FLAG: --eviction-soft=""
Mar 08 03:45:00.893434 master-0 kubenswrapper[4045]: I0308 03:45:00.889228 4045 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 08 03:45:00.893434 master-0 kubenswrapper[4045]: I0308 03:45:00.889237 4045 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 08 03:45:00.893434 master-0 kubenswrapper[4045]: I0308 03:45:00.889246 4045 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 08 03:45:00.893434 master-0 kubenswrapper[4045]: I0308 03:45:00.889258 4045 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 08 03:45:00.893434 master-0 kubenswrapper[4045]: I0308 03:45:00.889267 4045 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 08 03:45:00.893434 master-0 kubenswrapper[4045]: I0308 03:45:00.889276 4045 flags.go:64] FLAG: --fail-swap-on="true"
Mar 08 03:45:00.893434 master-0 kubenswrapper[4045]: I0308 03:45:00.889285 4045 flags.go:64] FLAG: --feature-gates=""
Mar 08 03:45:00.893434 master-0 kubenswrapper[4045]: I0308 03:45:00.889296 4045 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 08 03:45:00.893434 master-0 kubenswrapper[4045]: I0308 03:45:00.889306 4045 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 08 03:45:00.893434 master-0 kubenswrapper[4045]: I0308 03:45:00.889315 4045 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 08 03:45:00.893434 master-0 kubenswrapper[4045]: I0308 03:45:00.889325 4045 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 08 03:45:00.893434 master-0 kubenswrapper[4045]: I0308 03:45:00.889336 4045 flags.go:64] FLAG: --healthz-port="10248"
Mar 08 03:45:00.893434 master-0 kubenswrapper[4045]: I0308 03:45:00.889345 4045 flags.go:64] FLAG: --help="false"
Mar 08 03:45:00.893434 master-0 kubenswrapper[4045]: I0308 03:45:00.889354 4045 flags.go:64] FLAG: --hostname-override=""
Mar 08 03:45:00.893434 master-0 kubenswrapper[4045]: I0308 03:45:00.889363 4045 flags.go:64] FLAG: --housekeeping-interval="10s"
Mar 08 03:45:00.893434 master-0 kubenswrapper[4045]: I0308 03:45:00.889373 4045 flags.go:64] FLAG: --http-check-frequency="20s"
Mar 08 03:45:00.893434 master-0 kubenswrapper[4045]: I0308 03:45:00.889382 4045 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Mar 08 03:45:00.894694 master-0 kubenswrapper[4045]: I0308 03:45:00.889391 4045 flags.go:64] FLAG: --image-credential-provider-config=""
Mar 08 03:45:00.894694 master-0 kubenswrapper[4045]: I0308 03:45:00.889400 4045 flags.go:64] FLAG: --image-gc-high-threshold="85"
Mar 08 03:45:00.894694 master-0 kubenswrapper[4045]: I0308 03:45:00.889409 4045 flags.go:64] FLAG: --image-gc-low-threshold="80"
Mar 08 03:45:00.894694 master-0 kubenswrapper[4045]: I0308 03:45:00.889418 4045 flags.go:64] FLAG: --image-service-endpoint=""
Mar 08 03:45:00.894694 master-0 kubenswrapper[4045]: I0308 03:45:00.889427 4045 flags.go:64] FLAG: --kernel-memcg-notification="false"
Mar 08 03:45:00.894694 master-0 kubenswrapper[4045]: I0308 03:45:00.889436 4045 flags.go:64] FLAG: --kube-api-burst="100"
Mar 08 03:45:00.894694 master-0 kubenswrapper[4045]: I0308 03:45:00.889445 4045 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Mar 08 03:45:00.894694 master-0 kubenswrapper[4045]: I0308 03:45:00.889455 4045 flags.go:64] FLAG: --kube-api-qps="50"
Mar 08 03:45:00.894694 master-0 kubenswrapper[4045]: I0308 03:45:00.889465 4045 flags.go:64] FLAG: --kube-reserved=""
Mar 08 03:45:00.894694 master-0 kubenswrapper[4045]: I0308 03:45:00.889474 4045 flags.go:64] FLAG: --kube-reserved-cgroup=""
Mar 08 03:45:00.894694 master-0 kubenswrapper[4045]: I0308 03:45:00.889483 4045 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Mar 08 03:45:00.894694 master-0 kubenswrapper[4045]: I0308 03:45:00.889493 4045 flags.go:64] FLAG: --kubelet-cgroups=""
Mar 08 03:45:00.894694 master-0 kubenswrapper[4045]: I0308 03:45:00.889501 4045 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Mar 08 03:45:00.894694 master-0 kubenswrapper[4045]: I0308 03:45:00.889511 4045 flags.go:64] FLAG: --lock-file=""
Mar 08 03:45:00.894694 master-0 kubenswrapper[4045]: I0308 03:45:00.889520 4045 flags.go:64] FLAG: --log-cadvisor-usage="false"
Mar 08 03:45:00.894694 master-0 kubenswrapper[4045]: I0308 03:45:00.889529 4045 flags.go:64] FLAG: --log-flush-frequency="5s"
Mar 08 03:45:00.894694 master-0 kubenswrapper[4045]: I0308 03:45:00.889538 4045 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Mar 08 03:45:00.894694 master-0 kubenswrapper[4045]: I0308 03:45:00.889553 4045 flags.go:64] FLAG: --log-json-split-stream="false"
Mar 08 03:45:00.894694 master-0 kubenswrapper[4045]: I0308 03:45:00.889562 4045 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Mar 08 03:45:00.894694 master-0 kubenswrapper[4045]: I0308 03:45:00.889571 4045 flags.go:64] FLAG: --log-text-split-stream="false"
Mar 08 03:45:00.894694 master-0 kubenswrapper[4045]: I0308 03:45:00.889582 4045 flags.go:64] FLAG: --logging-format="text"
Mar 08 03:45:00.894694 master-0 kubenswrapper[4045]: I0308 03:45:00.889591 4045 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Mar 08 03:45:00.894694 master-0 kubenswrapper[4045]: I0308 03:45:00.889602 4045 flags.go:64] FLAG: --make-iptables-util-chains="true"
Mar 08 03:45:00.894694 master-0 kubenswrapper[4045]: I0308 03:45:00.889611 4045 flags.go:64] FLAG: --manifest-url=""
Mar 08 03:45:00.894694 master-0 kubenswrapper[4045]: I0308 03:45:00.889620 4045 flags.go:64] FLAG: --manifest-url-header=""
Mar 08 03:45:00.896023 master-0 kubenswrapper[4045]: I0308 03:45:00.889635 4045 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Mar 08 03:45:00.896023 master-0 kubenswrapper[4045]: I0308 03:45:00.889645 4045 flags.go:64] FLAG: --max-open-files="1000000"
Mar 08 03:45:00.896023 master-0 kubenswrapper[4045]: I0308 03:45:00.889656 4045 flags.go:64] FLAG: --max-pods="110"
Mar 08 03:45:00.896023 master-0 kubenswrapper[4045]: I0308 03:45:00.889665 4045 flags.go:64] FLAG: --maximum-dead-containers="-1"
Mar 08 03:45:00.896023 master-0 kubenswrapper[4045]: I0308 03:45:00.889674 4045 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Mar 08 03:45:00.896023 master-0 kubenswrapper[4045]: I0308 03:45:00.889683 4045 flags.go:64] FLAG: --memory-manager-policy="None"
Mar 08 03:45:00.896023 master-0 kubenswrapper[4045]: I0308 03:45:00.889693 4045 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Mar 08 03:45:00.896023 master-0 kubenswrapper[4045]: I0308 03:45:00.889702 4045 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Mar 08 03:45:00.896023 master-0 kubenswrapper[4045]: I0308 03:45:00.889711 4045 flags.go:64] FLAG: --node-ip="192.168.32.10"
Mar 08 03:45:00.896023 master-0 kubenswrapper[4045]: I0308 03:45:00.889721 4045 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Mar 08 03:45:00.896023 master-0 kubenswrapper[4045]: I0308 03:45:00.889749 4045 flags.go:64] FLAG: --node-status-max-images="50"
Mar 08 03:45:00.896023 master-0 kubenswrapper[4045]: I0308 03:45:00.889759 4045 flags.go:64] FLAG: --node-status-update-frequency="10s"
Mar 08 03:45:00.896023 master-0 kubenswrapper[4045]: I0308 03:45:00.889768 4045 flags.go:64] FLAG: --oom-score-adj="-999"
Mar 08 03:45:00.896023 master-0 kubenswrapper[4045]: I0308 03:45:00.889777 4045 flags.go:64] FLAG: --pod-cidr=""
Mar 08 03:45:00.896023 master-0 kubenswrapper[4045]: I0308 03:45:00.889786 4045 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1d605384f31a8085f78a96145c2c3dc51afe22721144196140a2699b7c07ebe3"
Mar 08 03:45:00.896023 master-0 kubenswrapper[4045]: I0308 03:45:00.889803 4045 flags.go:64] FLAG: --pod-manifest-path=""
Mar 08 03:45:00.896023 master-0 kubenswrapper[4045]: I0308 03:45:00.889812 4045 flags.go:64] FLAG: --pod-max-pids="-1"
Mar 08 03:45:00.896023 master-0 kubenswrapper[4045]: I0308 03:45:00.889821 4045 flags.go:64] FLAG: --pods-per-core="0"
Mar 08 03:45:00.896023 master-0 kubenswrapper[4045]: I0308 03:45:00.889874 4045 flags.go:64] FLAG: --port="10250"
Mar 08 03:45:00.896023 master-0 kubenswrapper[4045]: I0308 03:45:00.889884 4045 flags.go:64] FLAG: --protect-kernel-defaults="false"
Mar 08 03:45:00.896023 master-0 kubenswrapper[4045]: I0308 03:45:00.889893 4045 flags.go:64] FLAG: --provider-id=""
Mar 08 03:45:00.896023 master-0 kubenswrapper[4045]: I0308 03:45:00.889902 4045 flags.go:64] FLAG: --qos-reserved=""
Mar 08 03:45:00.896023 master-0 kubenswrapper[4045]: I0308 03:45:00.889912 4045 flags.go:64] FLAG: --read-only-port="10255"
Mar 08 03:45:00.896023 master-0 kubenswrapper[4045]: I0308 03:45:00.889920 4045 flags.go:64] FLAG: --register-node="true"
Mar 08 03:45:00.897114 master-0 kubenswrapper[4045]: I0308 03:45:00.889929 4045 flags.go:64] FLAG: --register-schedulable="true"
Mar 08 03:45:00.897114 master-0 kubenswrapper[4045]: I0308 03:45:00.889938 4045 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Mar 08 03:45:00.897114 master-0 kubenswrapper[4045]: I0308 03:45:00.889955 4045 flags.go:64] FLAG: --registry-burst="10"
Mar 08 03:45:00.897114 master-0 kubenswrapper[4045]: I0308 03:45:00.889964 4045 flags.go:64] FLAG: --registry-qps="5"
Mar 08 03:45:00.897114 master-0 kubenswrapper[4045]: I0308 03:45:00.889973 4045 flags.go:64] FLAG: --reserved-cpus=""
Mar 08 03:45:00.897114 master-0 kubenswrapper[4045]: I0308 03:45:00.889981 4045 flags.go:64] FLAG: --reserved-memory=""
Mar 08 03:45:00.897114 master-0 kubenswrapper[4045]: I0308 03:45:00.889992 4045 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Mar 08 03:45:00.897114 master-0 kubenswrapper[4045]: I0308 03:45:00.890002 4045 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Mar 08 03:45:00.897114 master-0 kubenswrapper[4045]: I0308 03:45:00.890011 4045 flags.go:64] FLAG: --rotate-certificates="false"
Mar 08 03:45:00.897114 master-0 kubenswrapper[4045]: I0308 03:45:00.890023 4045 flags.go:64] FLAG: --rotate-server-certificates="false"
Mar 08 03:45:00.897114 master-0 kubenswrapper[4045]: I0308 03:45:00.890032 4045 flags.go:64] FLAG: --runonce="false"
Mar 08 03:45:00.897114 master-0 kubenswrapper[4045]: I0308 03:45:00.890041 4045 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Mar 08 03:45:00.897114 master-0 kubenswrapper[4045]: I0308 03:45:00.890050 4045 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Mar 08 03:45:00.897114 master-0 kubenswrapper[4045]: I0308 03:45:00.890060 4045 flags.go:64] FLAG: --seccomp-default="false"
Mar 08 03:45:00.897114 master-0 kubenswrapper[4045]: I0308 03:45:00.890069 4045 flags.go:64] FLAG: --serialize-image-pulls="true"
Mar 08 03:45:00.897114 master-0 kubenswrapper[4045]: I0308 03:45:00.890078 4045 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Mar 08 03:45:00.897114 master-0 kubenswrapper[4045]: I0308 03:45:00.890087 4045 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Mar 08 03:45:00.897114 master-0 kubenswrapper[4045]: I0308 03:45:00.890097 4045 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Mar 08 03:45:00.897114 master-0 kubenswrapper[4045]: I0308 03:45:00.890107 4045 flags.go:64] FLAG: --storage-driver-password="root"
Mar 08 03:45:00.897114 master-0 kubenswrapper[4045]: I0308 03:45:00.890116 4045 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 08 03:45:00.897114 master-0 kubenswrapper[4045]: I0308 03:45:00.890125 4045 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 08 03:45:00.897114 master-0 kubenswrapper[4045]: I0308 03:45:00.890136 4045 flags.go:64] FLAG: --storage-driver-user="root"
Mar 08 03:45:00.897114 master-0 kubenswrapper[4045]: I0308 03:45:00.890147 4045 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 08 03:45:00.897114 master-0 kubenswrapper[4045]: I0308 03:45:00.890161 4045 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 08 03:45:00.897114 master-0 kubenswrapper[4045]: I0308 03:45:00.890172 4045 flags.go:64] FLAG: --system-cgroups=""
Mar 08 03:45:00.898257 master-0 kubenswrapper[4045]: I0308 03:45:00.890182 4045 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Mar 08 03:45:00.898257 master-0 kubenswrapper[4045]: I0308 03:45:00.890199 4045 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 08 03:45:00.898257 master-0 kubenswrapper[4045]: I0308 03:45:00.890210 4045 flags.go:64] FLAG: --tls-cert-file=""
Mar 08 03:45:00.898257 master-0 kubenswrapper[4045]: I0308 03:45:00.890220 4045 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 08 03:45:00.898257 master-0 kubenswrapper[4045]: I0308 03:45:00.890236 4045 flags.go:64] FLAG: --tls-min-version=""
Mar 08 03:45:00.898257 master-0 kubenswrapper[4045]: I0308 03:45:00.890246 4045 flags.go:64] FLAG: --tls-private-key-file=""
Mar 08 03:45:00.898257 master-0 kubenswrapper[4045]: I0308 03:45:00.890256 4045 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 08 03:45:00.898257 master-0 kubenswrapper[4045]: I0308 03:45:00.890267 4045 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 08 03:45:00.898257 master-0 kubenswrapper[4045]: I0308 03:45:00.890277 4045 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 08 03:45:00.898257 master-0 kubenswrapper[4045]: I0308 03:45:00.890288 4045 flags.go:64] FLAG: --v="2"
Mar 08 03:45:00.898257 master-0 kubenswrapper[4045]: I0308 03:45:00.890302 4045 flags.go:64] FLAG: --version="false"
Mar 08 03:45:00.898257 master-0 kubenswrapper[4045]: I0308 03:45:00.890315 4045 flags.go:64] FLAG: --vmodule=""
Mar 08 03:45:00.898257 master-0 kubenswrapper[4045]: I0308 03:45:00.890328 4045 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 08 03:45:00.898257 master-0 kubenswrapper[4045]: I0308 03:45:00.890339 4045 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 08 03:45:00.898257 master-0 kubenswrapper[4045]: W0308 03:45:00.890611 4045 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 08 03:45:00.898257 master-0 kubenswrapper[4045]: W0308 03:45:00.890624 4045 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 08 03:45:00.898257 master-0 kubenswrapper[4045]: W0308 03:45:00.890636 4045 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 08 03:45:00.898257 master-0 kubenswrapper[4045]: W0308 03:45:00.890645 4045 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 08 03:45:00.898257 master-0 kubenswrapper[4045]: W0308 03:45:00.890655 4045 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 08 03:45:00.898257 master-0 kubenswrapper[4045]: W0308 03:45:00.890664 4045 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 08 03:45:00.898257 master-0 kubenswrapper[4045]: W0308 03:45:00.890675 4045 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 08 03:45:00.898257 master-0 kubenswrapper[4045]: W0308 03:45:00.890683 4045 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 08 03:45:00.899267 master-0 kubenswrapper[4045]: W0308 03:45:00.890693 4045 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 08 03:45:00.899267 master-0 kubenswrapper[4045]: W0308 03:45:00.890701 4045 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 08 03:45:00.899267 master-0 kubenswrapper[4045]: W0308 03:45:00.890710 4045 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 08 03:45:00.899267 master-0 kubenswrapper[4045]: W0308 03:45:00.890718 4045 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 08 03:45:00.899267 master-0 kubenswrapper[4045]: W0308 03:45:00.890727 4045 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 08 03:45:00.899267 master-0 kubenswrapper[4045]: W0308 03:45:00.890735 4045 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 08 03:45:00.899267 master-0 kubenswrapper[4045]: W0308 03:45:00.890743 4045 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 08 03:45:00.899267 master-0 kubenswrapper[4045]: W0308 03:45:00.890751 4045 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 08 03:45:00.899267 master-0 kubenswrapper[4045]: W0308 03:45:00.890759 4045 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 08 03:45:00.899267 master-0 kubenswrapper[4045]: W0308 03:45:00.890767 4045 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 08 03:45:00.899267 master-0 kubenswrapper[4045]: W0308 03:45:00.890774 4045 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 08 03:45:00.899267 master-0 kubenswrapper[4045]: W0308 03:45:00.890782 4045 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 08 03:45:00.899267 master-0 kubenswrapper[4045]: W0308 03:45:00.890789 4045 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 08 03:45:00.899267 master-0 kubenswrapper[4045]: W0308 03:45:00.890798 4045 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 08 03:45:00.899267 master-0 kubenswrapper[4045]: W0308 03:45:00.890811 4045 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 08 03:45:00.899267 master-0 kubenswrapper[4045]: W0308 03:45:00.890819 4045 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 08 03:45:00.899267 master-0 kubenswrapper[4045]: W0308 03:45:00.890853 4045 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 08 03:45:00.899267 master-0 kubenswrapper[4045]: W0308 03:45:00.890861 4045 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 08 03:45:00.899267 master-0 kubenswrapper[4045]: W0308 03:45:00.890889 4045 feature_gate.go:330] unrecognized feature gate: 
ClusterMonitoringConfig Mar 08 03:45:00.899267 master-0 kubenswrapper[4045]: W0308 03:45:00.890897 4045 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 08 03:45:00.900195 master-0 kubenswrapper[4045]: W0308 03:45:00.890905 4045 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 08 03:45:00.900195 master-0 kubenswrapper[4045]: W0308 03:45:00.890913 4045 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 08 03:45:00.900195 master-0 kubenswrapper[4045]: W0308 03:45:00.890921 4045 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 08 03:45:00.900195 master-0 kubenswrapper[4045]: W0308 03:45:00.890929 4045 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 08 03:45:00.900195 master-0 kubenswrapper[4045]: W0308 03:45:00.890936 4045 feature_gate.go:330] unrecognized feature gate: Example Mar 08 03:45:00.900195 master-0 kubenswrapper[4045]: W0308 03:45:00.890947 4045 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 08 03:45:00.900195 master-0 kubenswrapper[4045]: W0308 03:45:00.890957 4045 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 08 03:45:00.900195 master-0 kubenswrapper[4045]: W0308 03:45:00.890969 4045 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 08 03:45:00.900195 master-0 kubenswrapper[4045]: W0308 03:45:00.890977 4045 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 08 03:45:00.900195 master-0 kubenswrapper[4045]: W0308 03:45:00.890986 4045 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 08 03:45:00.900195 master-0 kubenswrapper[4045]: W0308 03:45:00.890994 4045 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 08 03:45:00.900195 master-0 kubenswrapper[4045]: W0308 03:45:00.891003 4045 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 08 03:45:00.900195 master-0 kubenswrapper[4045]: W0308 03:45:00.891012 4045 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 08 03:45:00.900195 master-0 kubenswrapper[4045]: W0308 03:45:00.891020 4045 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 08 03:45:00.900195 master-0 kubenswrapper[4045]: W0308 03:45:00.891031 4045 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 08 03:45:00.900195 master-0 kubenswrapper[4045]: W0308 03:45:00.891039 4045 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 08 03:45:00.900195 master-0 kubenswrapper[4045]: W0308 03:45:00.891047 4045 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 08 03:45:00.900195 master-0 kubenswrapper[4045]: W0308 03:45:00.891055 4045 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 08 03:45:00.900195 master-0 kubenswrapper[4045]: W0308 03:45:00.891063 4045 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 08 03:45:00.900195 master-0 kubenswrapper[4045]: W0308 03:45:00.891071 
4045 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 08 03:45:00.901232 master-0 kubenswrapper[4045]: W0308 03:45:00.891079 4045 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 08 03:45:00.901232 master-0 kubenswrapper[4045]: W0308 03:45:00.891087 4045 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 08 03:45:00.901232 master-0 kubenswrapper[4045]: W0308 03:45:00.891095 4045 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 08 03:45:00.901232 master-0 kubenswrapper[4045]: W0308 03:45:00.891103 4045 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 08 03:45:00.901232 master-0 kubenswrapper[4045]: W0308 03:45:00.891112 4045 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 08 03:45:00.901232 master-0 kubenswrapper[4045]: W0308 03:45:00.891122 4045 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 08 03:45:00.901232 master-0 kubenswrapper[4045]: W0308 03:45:00.891137 4045 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 08 03:45:00.901232 master-0 kubenswrapper[4045]: W0308 03:45:00.891149 4045 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 08 03:45:00.901232 master-0 kubenswrapper[4045]: W0308 03:45:00.891160 4045 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 08 03:45:00.901232 master-0 kubenswrapper[4045]: W0308 03:45:00.891173 4045 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 08 03:45:00.901232 master-0 kubenswrapper[4045]: W0308 03:45:00.891185 4045 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 08 03:45:00.901232 master-0 kubenswrapper[4045]: W0308 03:45:00.891195 4045 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 08 03:45:00.901232 master-0 kubenswrapper[4045]: W0308 03:45:00.891205 4045 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 08 03:45:00.901232 master-0 kubenswrapper[4045]: W0308 03:45:00.891214 4045 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 08 03:45:00.901232 master-0 kubenswrapper[4045]: W0308 03:45:00.891223 4045 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 08 03:45:00.901232 master-0 kubenswrapper[4045]: W0308 03:45:00.891232 4045 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 08 03:45:00.901232 master-0 kubenswrapper[4045]: W0308 03:45:00.891241 4045 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 08 03:45:00.901232 master-0 kubenswrapper[4045]: W0308 03:45:00.891250 4045 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 08 03:45:00.901232 master-0 kubenswrapper[4045]: W0308 03:45:00.891259 4045 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 08 03:45:00.902310 master-0 kubenswrapper[4045]: W0308 03:45:00.891270 4045 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 08 03:45:00.902310 master-0 kubenswrapper[4045]: W0308 03:45:00.891280 4045 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Mar 08 03:45:00.902310 master-0 kubenswrapper[4045]: W0308 03:45:00.891289 4045 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 08 03:45:00.902310 master-0 kubenswrapper[4045]: W0308 03:45:00.891298 4045 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 08 03:45:00.902310 
master-0 kubenswrapper[4045]: W0308 03:45:00.891308 4045 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 08 03:45:00.902310 master-0 kubenswrapper[4045]: I0308 03:45:00.891341 4045 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 08 03:45:00.905151 master-0 kubenswrapper[4045]: I0308 03:45:00.905062 4045 server.go:491] "Kubelet version" kubeletVersion="v1.31.14" Mar 08 03:45:00.905151 master-0 kubenswrapper[4045]: I0308 03:45:00.905115 4045 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 08 03:45:00.905304 master-0 kubenswrapper[4045]: W0308 03:45:00.905239 4045 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 08 03:45:00.905304 master-0 kubenswrapper[4045]: W0308 03:45:00.905251 4045 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 08 03:45:00.905304 master-0 kubenswrapper[4045]: W0308 03:45:00.905261 4045 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 08 03:45:00.905304 master-0 kubenswrapper[4045]: W0308 03:45:00.905270 4045 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 08 03:45:00.905304 master-0 kubenswrapper[4045]: W0308 03:45:00.905278 4045 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 08 03:45:00.905304 master-0 kubenswrapper[4045]: W0308 03:45:00.905287 4045 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 08 03:45:00.905304 master-0 kubenswrapper[4045]: 
W0308 03:45:00.905295 4045 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 08 03:45:00.905304 master-0 kubenswrapper[4045]: W0308 03:45:00.905303 4045 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 08 03:45:00.905304 master-0 kubenswrapper[4045]: W0308 03:45:00.905311 4045 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 08 03:45:00.905304 master-0 kubenswrapper[4045]: W0308 03:45:00.905320 4045 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 08 03:45:00.905903 master-0 kubenswrapper[4045]: W0308 03:45:00.905329 4045 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 08 03:45:00.905903 master-0 kubenswrapper[4045]: W0308 03:45:00.905338 4045 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 08 03:45:00.905903 master-0 kubenswrapper[4045]: W0308 03:45:00.905346 4045 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 08 03:45:00.905903 master-0 kubenswrapper[4045]: W0308 03:45:00.905355 4045 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 08 03:45:00.905903 master-0 kubenswrapper[4045]: W0308 03:45:00.905366 4045 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 08 03:45:00.905903 master-0 kubenswrapper[4045]: W0308 03:45:00.905375 4045 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 08 03:45:00.905903 master-0 kubenswrapper[4045]: W0308 03:45:00.905404 4045 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 08 03:45:00.905903 master-0 kubenswrapper[4045]: W0308 03:45:00.905412 4045 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 08 03:45:00.905903 master-0 kubenswrapper[4045]: W0308 03:45:00.905421 4045 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 08 03:45:00.905903 master-0 kubenswrapper[4045]: W0308 03:45:00.905429 4045 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 08 03:45:00.905903 master-0 kubenswrapper[4045]: W0308 03:45:00.905437 4045 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 08 03:45:00.905903 master-0 kubenswrapper[4045]: W0308 03:45:00.905445 4045 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 08 03:45:00.905903 master-0 kubenswrapper[4045]: W0308 03:45:00.905454 4045 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 08 03:45:00.905903 master-0 kubenswrapper[4045]: W0308 03:45:00.905477 4045 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 08 03:45:00.905903 master-0 kubenswrapper[4045]: W0308 03:45:00.905485 4045 feature_gate.go:330] unrecognized feature gate: Example Mar 08 03:45:00.905903 master-0 kubenswrapper[4045]: W0308 03:45:00.905493 4045 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 08 03:45:00.905903 master-0 kubenswrapper[4045]: W0308 03:45:00.905500 4045 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 08 03:45:00.905903 master-0 kubenswrapper[4045]: W0308 03:45:00.905508 4045 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 08 03:45:00.905903 master-0 kubenswrapper[4045]: W0308 03:45:00.905516 4045 
feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 08 03:45:00.905903 master-0 kubenswrapper[4045]: W0308 03:45:00.905524 4045 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 08 03:45:00.906892 master-0 kubenswrapper[4045]: W0308 03:45:00.905532 4045 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 08 03:45:00.906892 master-0 kubenswrapper[4045]: W0308 03:45:00.905540 4045 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 08 03:45:00.906892 master-0 kubenswrapper[4045]: W0308 03:45:00.905547 4045 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 08 03:45:00.906892 master-0 kubenswrapper[4045]: W0308 03:45:00.905555 4045 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 08 03:45:00.906892 master-0 kubenswrapper[4045]: W0308 03:45:00.905564 4045 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 08 03:45:00.906892 master-0 kubenswrapper[4045]: W0308 03:45:00.905572 4045 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 08 03:45:00.906892 master-0 kubenswrapper[4045]: W0308 03:45:00.905580 4045 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 08 03:45:00.906892 master-0 kubenswrapper[4045]: W0308 03:45:00.905588 4045 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 08 03:45:00.906892 master-0 kubenswrapper[4045]: W0308 03:45:00.905596 4045 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 08 03:45:00.906892 master-0 kubenswrapper[4045]: W0308 03:45:00.905604 4045 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 08 03:45:00.906892 master-0 kubenswrapper[4045]: W0308 03:45:00.905611 4045 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 08 03:45:00.906892 master-0 kubenswrapper[4045]: W0308 03:45:00.905620 4045 feature_gate.go:330] unrecognized 
feature gate: InsightsRuntimeExtractor Mar 08 03:45:00.906892 master-0 kubenswrapper[4045]: W0308 03:45:00.905628 4045 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 08 03:45:00.906892 master-0 kubenswrapper[4045]: W0308 03:45:00.905638 4045 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 08 03:45:00.906892 master-0 kubenswrapper[4045]: W0308 03:45:00.905646 4045 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 08 03:45:00.906892 master-0 kubenswrapper[4045]: W0308 03:45:00.905654 4045 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Mar 08 03:45:00.906892 master-0 kubenswrapper[4045]: W0308 03:45:00.905662 4045 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 08 03:45:00.906892 master-0 kubenswrapper[4045]: W0308 03:45:00.905670 4045 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 08 03:45:00.906892 master-0 kubenswrapper[4045]: W0308 03:45:00.905678 4045 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 08 03:45:00.908233 master-0 kubenswrapper[4045]: W0308 03:45:00.905688 4045 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 08 03:45:00.908233 master-0 kubenswrapper[4045]: W0308 03:45:00.905701 4045 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 08 03:45:00.908233 master-0 kubenswrapper[4045]: W0308 03:45:00.905710 4045 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 08 03:45:00.908233 master-0 kubenswrapper[4045]: W0308 03:45:00.905719 4045 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 08 03:45:00.908233 master-0 kubenswrapper[4045]: W0308 03:45:00.905728 4045 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 08 03:45:00.908233 master-0 kubenswrapper[4045]: W0308 03:45:00.905736 4045 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 08 03:45:00.908233 master-0 kubenswrapper[4045]: W0308 03:45:00.905745 4045 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 08 03:45:00.908233 master-0 kubenswrapper[4045]: W0308 03:45:00.905754 4045 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 08 03:45:00.908233 master-0 kubenswrapper[4045]: W0308 03:45:00.905762 4045 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 08 03:45:00.908233 master-0 kubenswrapper[4045]: W0308 03:45:00.905771 4045 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 08 03:45:00.908233 master-0 kubenswrapper[4045]: W0308 03:45:00.905780 4045 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 08 03:45:00.908233 master-0 kubenswrapper[4045]: W0308 03:45:00.905789 4045 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 08 03:45:00.908233 master-0 kubenswrapper[4045]: W0308 03:45:00.905797 4045 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 08 03:45:00.908233 master-0 kubenswrapper[4045]: W0308 03:45:00.905808 4045 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 08 03:45:00.908233 master-0 kubenswrapper[4045]: W0308 03:45:00.905818 4045 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 08 03:45:00.908233 master-0 kubenswrapper[4045]: W0308 03:45:00.905872 4045 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 08 03:45:00.908233 master-0 kubenswrapper[4045]: W0308 03:45:00.905884 4045 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 08 03:45:00.908233 master-0 kubenswrapper[4045]: W0308 03:45:00.905893 4045 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 08 03:45:00.908233 master-0 kubenswrapper[4045]: W0308 03:45:00.905902 4045 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 08 03:45:00.909182 master-0 kubenswrapper[4045]: W0308 03:45:00.905911 4045 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 08 03:45:00.909182 master-0 kubenswrapper[4045]: W0308 03:45:00.905920 4045 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 08 03:45:00.909182 master-0 kubenswrapper[4045]: W0308 03:45:00.905933 4045 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 08 03:45:00.909182 master-0 kubenswrapper[4045]: W0308 03:45:00.905943 4045 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 08 03:45:00.909182 master-0 kubenswrapper[4045]: I0308 03:45:00.905957 4045 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 08 03:45:00.909182 master-0 kubenswrapper[4045]: W0308 03:45:00.906215 4045 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 08 03:45:00.909182 master-0 kubenswrapper[4045]: W0308 03:45:00.906230 4045 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 08 03:45:00.909182 master-0 kubenswrapper[4045]: W0308 03:45:00.906242 4045 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 08 03:45:00.909182 master-0 kubenswrapper[4045]: W0308 03:45:00.906253 4045 feature_gate.go:330] unrecognized feature gate: Example Mar 08 03:45:00.909182 master-0 kubenswrapper[4045]: W0308 03:45:00.906262 4045 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 08 03:45:00.909182 master-0 kubenswrapper[4045]: W0308 03:45:00.906272 4045 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 08 03:45:00.909182 master-0 kubenswrapper[4045]: W0308 03:45:00.906284 4045 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 08 03:45:00.909182 master-0 kubenswrapper[4045]: W0308 03:45:00.906297 4045 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 08 03:45:00.909182 master-0 kubenswrapper[4045]: W0308 03:45:00.906308 4045 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 08 03:45:00.909935 master-0 kubenswrapper[4045]: W0308 03:45:00.906318 4045 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 08 03:45:00.909935 master-0 kubenswrapper[4045]: W0308 03:45:00.906327 4045 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 08 03:45:00.909935 master-0 kubenswrapper[4045]: W0308 03:45:00.906337 4045 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 08 03:45:00.909935 master-0 kubenswrapper[4045]: W0308 03:45:00.906346 4045 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 08 03:45:00.909935 master-0 kubenswrapper[4045]: W0308 03:45:00.906356 4045 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 08 03:45:00.909935 master-0 kubenswrapper[4045]: W0308 03:45:00.906364 4045 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 08 03:45:00.909935 master-0 kubenswrapper[4045]: W0308 03:45:00.906373 4045 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 08 03:45:00.909935 master-0 kubenswrapper[4045]: W0308 03:45:00.906382 4045 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 08 03:45:00.909935 master-0 kubenswrapper[4045]: W0308 03:45:00.906391 4045 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 08 03:45:00.909935 master-0 kubenswrapper[4045]: W0308 03:45:00.906400 4045 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 08 03:45:00.909935 master-0 kubenswrapper[4045]: W0308 03:45:00.906408 4045 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 08 03:45:00.909935 master-0 
kubenswrapper[4045]: W0308 03:45:00.906416 4045 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 08 03:45:00.909935 master-0 kubenswrapper[4045]: W0308 03:45:00.906425 4045 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 08 03:45:00.909935 master-0 kubenswrapper[4045]: W0308 03:45:00.906433 4045 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 08 03:45:00.909935 master-0 kubenswrapper[4045]: W0308 03:45:00.906442 4045 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 08 03:45:00.909935 master-0 kubenswrapper[4045]: W0308 03:45:00.906449 4045 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 08 03:45:00.909935 master-0 kubenswrapper[4045]: W0308 03:45:00.906457 4045 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 08 03:45:00.909935 master-0 kubenswrapper[4045]: W0308 03:45:00.906465 4045 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 08 03:45:00.909935 master-0 kubenswrapper[4045]: W0308 03:45:00.906473 4045 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 08 03:45:00.909935 master-0 kubenswrapper[4045]: W0308 03:45:00.906481 4045 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 08 03:45:00.910982 master-0 kubenswrapper[4045]: W0308 03:45:00.906489 4045 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 08 03:45:00.910982 master-0 kubenswrapper[4045]: W0308 03:45:00.906497 4045 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 08 03:45:00.910982 master-0 kubenswrapper[4045]: W0308 03:45:00.906504 4045 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 08 03:45:00.910982 master-0 kubenswrapper[4045]: W0308 03:45:00.906512 4045 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 08 03:45:00.910982 master-0 kubenswrapper[4045]: W0308 03:45:00.906522 4045 
feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 08 03:45:00.910982 master-0 kubenswrapper[4045]: W0308 03:45:00.906533 4045 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 08 03:45:00.910982 master-0 kubenswrapper[4045]: W0308 03:45:00.906542 4045 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 08 03:45:00.910982 master-0 kubenswrapper[4045]: W0308 03:45:00.906551 4045 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 08 03:45:00.910982 master-0 kubenswrapper[4045]: W0308 03:45:00.906560 4045 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 08 03:45:00.910982 master-0 kubenswrapper[4045]: W0308 03:45:00.906570 4045 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 08 03:45:00.910982 master-0 kubenswrapper[4045]: W0308 03:45:00.906579 4045 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 08 03:45:00.910982 master-0 kubenswrapper[4045]: W0308 03:45:00.906588 4045 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 08 03:45:00.910982 master-0 kubenswrapper[4045]: W0308 03:45:00.906596 4045 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 08 03:45:00.910982 master-0 kubenswrapper[4045]: W0308 03:45:00.906604 4045 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 08 03:45:00.910982 master-0 kubenswrapper[4045]: W0308 03:45:00.906612 4045 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 08 03:45:00.910982 master-0 kubenswrapper[4045]: W0308 03:45:00.906620 4045 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 08 03:45:00.910982 master-0 kubenswrapper[4045]: W0308 03:45:00.906628 4045 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 08 03:45:00.910982 master-0 kubenswrapper[4045]: W0308 03:45:00.906635 4045 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 08 03:45:00.910982 master-0 kubenswrapper[4045]: W0308 03:45:00.906643 4045 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 08 03:45:00.911966 master-0 kubenswrapper[4045]: W0308 03:45:00.906653 4045 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 08 03:45:00.911966 master-0 kubenswrapper[4045]: W0308 03:45:00.906662 4045 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 08 03:45:00.911966 master-0 kubenswrapper[4045]: W0308 03:45:00.906670 4045 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 08 03:45:00.911966 master-0 kubenswrapper[4045]: W0308 03:45:00.906678 4045 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 08 03:45:00.911966 master-0 kubenswrapper[4045]: W0308 03:45:00.906686 4045 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 08 03:45:00.911966 master-0 kubenswrapper[4045]: W0308 03:45:00.906695 4045 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 08 03:45:00.911966 master-0 kubenswrapper[4045]: W0308 03:45:00.906703 4045 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 08 03:45:00.911966 master-0 kubenswrapper[4045]: W0308 03:45:00.906711 4045 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 08 03:45:00.911966 master-0 kubenswrapper[4045]: W0308 03:45:00.906718 4045 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 08 03:45:00.911966 master-0 kubenswrapper[4045]: W0308 03:45:00.906726 4045 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 08 03:45:00.911966 master-0 kubenswrapper[4045]: W0308 03:45:00.906734 4045 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 08 03:45:00.911966 master-0 kubenswrapper[4045]: W0308 03:45:00.906741 4045 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 08 03:45:00.911966 master-0 kubenswrapper[4045]: W0308 03:45:00.906750 4045 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 08 03:45:00.911966 master-0 kubenswrapper[4045]: W0308 03:45:00.906757 4045 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 08 03:45:00.911966 master-0 kubenswrapper[4045]: W0308 03:45:00.906765 4045 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 08 03:45:00.911966 master-0 kubenswrapper[4045]: W0308 03:45:00.906773 4045 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 08 03:45:00.911966 master-0 kubenswrapper[4045]: W0308 03:45:00.906780 4045 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 08 03:45:00.911966 master-0 kubenswrapper[4045]: W0308 03:45:00.906789 4045 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 08 03:45:00.911966 master-0 kubenswrapper[4045]: W0308 03:45:00.906797 4045 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 08 03:45:00.911966 master-0 kubenswrapper[4045]: W0308 03:45:00.906804 4045 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 08 03:45:00.912941 master-0 kubenswrapper[4045]: W0308 03:45:00.906812 4045 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 08 03:45:00.912941 master-0 kubenswrapper[4045]: W0308 03:45:00.906857 4045 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 08 03:45:00.912941 master-0 kubenswrapper[4045]: W0308 03:45:00.906872 4045 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 08 03:45:00.912941 master-0 kubenswrapper[4045]: W0308 03:45:00.906883 4045 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 08 03:45:00.912941 master-0 kubenswrapper[4045]: I0308 03:45:00.906895 4045 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 08 03:45:00.912941 master-0 kubenswrapper[4045]: I0308 03:45:00.909254 4045 server.go:940] "Client rotation is on, will bootstrap in background"
Mar 08 03:45:00.913538 master-0 kubenswrapper[4045]: I0308 03:45:00.913476 4045 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Mar 08 03:45:00.915127 master-0 kubenswrapper[4045]: I0308 03:45:00.915077 4045 server.go:997] "Starting client certificate rotation"
Mar 08 03:45:00.915127 master-0 kubenswrapper[4045]: I0308 03:45:00.915122 4045 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Mar 08 03:45:00.915356 master-0 kubenswrapper[4045]: I0308 03:45:00.915303 4045 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 08 03:45:00.943852 master-0 kubenswrapper[4045]: I0308 03:45:00.943689 4045 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 08 03:45:00.946739 master-0 kubenswrapper[4045]: I0308 03:45:00.946674 4045 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 08 03:45:00.949260 master-0 kubenswrapper[4045]: E0308 03:45:00.947749 4045 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.sno.openstack.lab:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 08 03:45:00.964560 master-0 kubenswrapper[4045]: I0308 03:45:00.964473 4045 log.go:25] "Validated CRI v1 runtime API"
Mar 08 03:45:00.971798 master-0 kubenswrapper[4045]: I0308 03:45:00.971722 4045 log.go:25] "Validated CRI v1 image API"
Mar 08 03:45:00.974553 master-0 kubenswrapper[4045]: I0308 03:45:00.974489 4045 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 08 03:45:00.978387 master-0 kubenswrapper[4045]: I0308 03:45:00.978325 4045 fs.go:135] Filesystem UUIDs: map[67898fbb-3e32-465e-b6f9-207afe668b6e:/dev/vda3 7B77-95E7:/dev/vda2 910678ff-f77e-4a7d-8d53-86f2ac47a823:/dev/vda4]
Mar 08 03:45:00.978387 master-0 kubenswrapper[4045]: I0308 03:45:00.978369 4045 fs.go:136] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0}]
Mar 08 03:45:01.004090 master-0 kubenswrapper[4045]: I0308 03:45:01.003515 4045 manager.go:217] Machine: {Timestamp:2026-03-08 03:45:01.002026032 +0000 UTC m=+0.612727070 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:713fb7c44cd644b5986e15157751dddb SystemUUID:713fb7c4-4cd6-44b5-986e-15157751dddb BootID:30e60e76-0e70-41ea-99da-7a4dcafd0e32 Filesystems:[{Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none} 252:16:{Name:vdb Major:252 Minor:16 Size:21474836480 Scheduler:none} 252:32:{Name:vdc Major:252 Minor:32 Size:21474836480 Scheduler:none} 252:48:{Name:vdd Major:252 Minor:48 Size:21474836480 Scheduler:none} 252:64:{Name:vde Major:252 Minor:64 Size:21474836480 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:9e:81:f6:10 Speed:0 Mtu:9000} {Name:eth0 MacAddress:fa:16:9e:81:f6:10 Speed:-1 Mtu:9000} {Name:eth1 MacAddress:fa:16:3e:26:03:3b Speed:-1 Mtu:9000} {Name:eth2 MacAddress:fa:16:3e:21:09:89 Speed:-1 Mtu:9000} {Name:ovs-system MacAddress:06:d0:49:23:c0:ac Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Mar 08 03:45:01.004090 master-0 kubenswrapper[4045]: I0308 03:45:01.004024 4045 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Mar 08 03:45:01.004343 master-0 kubenswrapper[4045]: I0308 03:45:01.004199 4045 manager.go:233] Version: {KernelVersion:5.14.0-427.111.1.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202602172219-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Mar 08 03:45:01.004685 master-0 kubenswrapper[4045]: I0308 03:45:01.004640 4045 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Mar 08 03:45:01.005173 master-0 kubenswrapper[4045]: I0308 03:45:01.005057 4045 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 08 03:45:01.005479 master-0 kubenswrapper[4045]: I0308 03:45:01.005151 4045 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"master-0","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 08 03:45:01.005600 master-0 kubenswrapper[4045]: I0308 03:45:01.005486 4045 topology_manager.go:138] "Creating topology manager with none policy"
Mar 08 03:45:01.005600 master-0 kubenswrapper[4045]: I0308 03:45:01.005506 4045 container_manager_linux.go:303] "Creating device plugin manager"
Mar 08 03:45:01.005600 master-0 kubenswrapper[4045]: I0308 03:45:01.005521 4045 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 08 03:45:01.005600 master-0 kubenswrapper[4045]: I0308 03:45:01.005558 4045 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 08 03:45:01.005786 master-0 kubenswrapper[4045]: I0308 03:45:01.005736 4045 state_mem.go:36] "Initialized new in-memory state store"
Mar 08 03:45:01.006282 master-0 kubenswrapper[4045]: I0308 03:45:01.006240 4045 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Mar 08 03:45:01.011341 master-0 kubenswrapper[4045]: I0308 03:45:01.011294 4045 kubelet.go:418] "Attempting to sync node with API server"
Mar 08 03:45:01.011341 master-0 kubenswrapper[4045]: I0308 03:45:01.011336 4045 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 08 03:45:01.011491 master-0 kubenswrapper[4045]: I0308 03:45:01.011424 4045 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Mar 08 03:45:01.011491 master-0 kubenswrapper[4045]: I0308 03:45:01.011455 4045 kubelet.go:324] "Adding apiserver pod source"
Mar 08 03:45:01.011491 master-0 kubenswrapper[4045]: I0308 03:45:01.011488 4045 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 08 03:45:01.024396 master-0 kubenswrapper[4045]: I0308 03:45:01.024299 4045 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.13-8.rhaos4.18.gitd78977c.el9" apiVersion="v1"
Mar 08 03:45:01.024802 master-0 kubenswrapper[4045]: W0308 03:45:01.024704 4045 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:45:01.024802 master-0 kubenswrapper[4045]: W0308 03:45:01.024704 4045 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:45:01.025031 master-0 kubenswrapper[4045]: E0308 03:45:01.024859 4045 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 08 03:45:01.025031 master-0 kubenswrapper[4045]: E0308 03:45:01.024880 4045 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 08 03:45:01.026565 master-0 kubenswrapper[4045]: I0308 03:45:01.026506 4045 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Mar 08 03:45:01.030492 master-0 kubenswrapper[4045]: I0308 03:45:01.030442 4045 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Mar 08 03:45:01.030492 master-0 kubenswrapper[4045]: I0308 03:45:01.030491 4045 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Mar 08 03:45:01.030628 master-0 kubenswrapper[4045]: I0308 03:45:01.030507 4045 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Mar 08 03:45:01.030628 master-0 kubenswrapper[4045]: I0308 03:45:01.030523 4045 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Mar 08 03:45:01.030628 master-0 kubenswrapper[4045]: I0308 03:45:01.030536 4045 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Mar 08 03:45:01.030628 master-0 kubenswrapper[4045]: I0308 03:45:01.030550 4045 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Mar 08 03:45:01.030628 master-0 kubenswrapper[4045]: I0308 03:45:01.030563 4045 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Mar 08 03:45:01.030628 master-0 kubenswrapper[4045]: I0308 03:45:01.030576 4045 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Mar 08 03:45:01.030628 master-0 kubenswrapper[4045]: I0308 03:45:01.030592 4045 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Mar 08 03:45:01.030628 master-0 kubenswrapper[4045]: I0308 03:45:01.030605 4045 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Mar 08 03:45:01.030628 master-0 kubenswrapper[4045]: I0308 03:45:01.030628 4045 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Mar 08 03:45:01.031087 master-0 kubenswrapper[4045]: I0308 03:45:01.030657 4045 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Mar 08 03:45:01.031087 master-0 kubenswrapper[4045]: I0308 03:45:01.030718 4045 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Mar 08 03:45:01.031431 master-0 kubenswrapper[4045]: I0308 03:45:01.031382 4045 server.go:1280] "Started kubelet"
Mar 08 03:45:01.031992 master-0 kubenswrapper[4045]: I0308 03:45:01.031885 4045 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Mar 08 03:45:01.032117 master-0 kubenswrapper[4045]: I0308 03:45:01.031940 4045 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 08 03:45:01.032117 master-0 kubenswrapper[4045]: I0308 03:45:01.032065 4045 server_v1.go:47] "podresources" method="list" useActivePods=true
Mar 08 03:45:01.032755 master-0 kubenswrapper[4045]: I0308 03:45:01.032442 4045 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:45:01.033471 master-0 kubenswrapper[4045]: I0308 03:45:01.033404 4045 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 08 03:45:01.034379 master-0 systemd[1]: Started Kubernetes Kubelet.
Mar 08 03:45:01.036998 master-0 kubenswrapper[4045]: I0308 03:45:01.034513 4045 server.go:449] "Adding debug handlers to kubelet server"
Mar 08 03:45:01.036998 master-0 kubenswrapper[4045]: I0308 03:45:01.036006 4045 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Mar 08 03:45:01.036998 master-0 kubenswrapper[4045]: I0308 03:45:01.036051 4045 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 08 03:45:01.036998 master-0 kubenswrapper[4045]: I0308 03:45:01.036695 4045 volume_manager.go:287] "The desired_state_of_world populator starts"
Mar 08 03:45:01.036998 master-0 kubenswrapper[4045]: I0308 03:45:01.036719 4045 volume_manager.go:289] "Starting Kubelet Volume Manager"
Mar 08 03:45:01.036998 master-0 kubenswrapper[4045]: I0308 03:45:01.036906 4045 desired_state_of_world_populator.go:147] "Desired state populator starts to run"
Mar 08 03:45:01.037403 master-0 kubenswrapper[4045]: I0308 03:45:01.037018 4045 reconstruct.go:97] "Volume reconstruction finished"
Mar 08 03:45:01.037403 master-0 kubenswrapper[4045]: I0308 03:45:01.037032 4045 reconciler.go:26] "Reconciler: start to sync state"
Mar 08 03:45:01.037403 master-0 kubenswrapper[4045]: E0308 03:45:01.037030 4045 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 08 03:45:01.040889 master-0 kubenswrapper[4045]: W0308 03:45:01.040453 4045 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:45:01.040889 master-0 kubenswrapper[4045]: E0308 03:45:01.040543 4045 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 08 03:45:01.041592 master-0 kubenswrapper[4045]: E0308 03:45:01.041491 4045 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="200ms"
Mar 08 03:45:01.042482 master-0 kubenswrapper[4045]: E0308 03:45:01.040750 4045 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/default/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{master-0.189ac0e6d88567c9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:45:01.031344073 +0000 UTC m=+0.642045061,LastTimestamp:2026-03-08 03:45:01.031344073 +0000 UTC m=+0.642045061,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 03:45:01.045792 master-0 kubenswrapper[4045]: I0308 03:45:01.045758 4045 factory.go:55] Registering systemd factory
Mar 08 03:45:01.046087 master-0 kubenswrapper[4045]: I0308 03:45:01.046057 4045 factory.go:221] Registration of the systemd container factory successfully
Mar 08 03:45:01.046777 master-0 kubenswrapper[4045]: I0308 03:45:01.046746 4045 factory.go:153] Registering CRI-O factory
Mar 08 03:45:01.047014 master-0 kubenswrapper[4045]: I0308 03:45:01.046984 4045 factory.go:221] Registration of the crio container factory successfully
Mar 08 03:45:01.047265 master-0 kubenswrapper[4045]: I0308 03:45:01.047237 4045 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Mar 08 03:45:01.047452 master-0 kubenswrapper[4045]: I0308 03:45:01.047426 4045 factory.go:103] Registering Raw factory
Mar 08 03:45:01.047723 master-0 kubenswrapper[4045]: I0308 03:45:01.047697 4045 manager.go:1196] Started watching for new ooms in manager
Mar 08 03:45:01.048243 master-0 kubenswrapper[4045]: E0308 03:45:01.048122 4045 kubelet.go:1495] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Mar 08 03:45:01.050887 master-0 kubenswrapper[4045]: I0308 03:45:01.050795 4045 manager.go:319] Starting recovery of all containers
Mar 08 03:45:01.079598 master-0 kubenswrapper[4045]: I0308 03:45:01.079367 4045 manager.go:324] Recovery completed
Mar 08 03:45:01.089720 master-0 kubenswrapper[4045]: I0308 03:45:01.089701 4045 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:45:01.091593 master-0 kubenswrapper[4045]: I0308 03:45:01.091548 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:45:01.091698 master-0 kubenswrapper[4045]: I0308 03:45:01.091684 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:45:01.091779 master-0 kubenswrapper[4045]: I0308 03:45:01.091766 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:45:01.092763 master-0 kubenswrapper[4045]: I0308 03:45:01.092744 4045 cpu_manager.go:225] "Starting CPU manager" policy="none"
Mar 08 03:45:01.092901 master-0 kubenswrapper[4045]: I0308 03:45:01.092886 4045 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Mar 08 03:45:01.092978 master-0 kubenswrapper[4045]: I0308 03:45:01.092966 4045 state_mem.go:36] "Initialized new in-memory state store"
Mar 08 03:45:01.099524 master-0 kubenswrapper[4045]: I0308 03:45:01.099507 4045 policy_none.go:49] "None policy: Start"
Mar 08 03:45:01.101904 master-0 kubenswrapper[4045]: I0308 03:45:01.100632 4045 memory_manager.go:170] "Starting memorymanager" policy="None"
Mar 08 03:45:01.101904 master-0 kubenswrapper[4045]: I0308 03:45:01.100787 4045 state_mem.go:35] "Initializing new in-memory state store"
Mar 08 03:45:01.138013 master-0 kubenswrapper[4045]: E0308 03:45:01.137889 4045 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 08 03:45:01.180988 master-0 kubenswrapper[4045]: I0308 03:45:01.180940 4045 manager.go:334] "Starting Device Plugin manager"
Mar 08 03:45:01.205911 master-0 kubenswrapper[4045]: I0308 03:45:01.181013 4045 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Mar 08 03:45:01.205911 master-0 kubenswrapper[4045]: I0308 03:45:01.181029 4045 server.go:79] "Starting device plugin registration server"
Mar 08 03:45:01.205911 master-0 kubenswrapper[4045]: I0308 03:45:01.181402 4045 eviction_manager.go:189] "Eviction manager: starting control loop"
Mar 08 03:45:01.205911 master-0 kubenswrapper[4045]: I0308 03:45:01.181418 4045 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 08 03:45:01.205911 master-0 kubenswrapper[4045]: E0308 03:45:01.183688 4045 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found"
Mar 08 03:45:01.205911 master-0 kubenswrapper[4045]: I0308 03:45:01.184280 4045 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Mar 08 03:45:01.205911 master-0 kubenswrapper[4045]: I0308 03:45:01.184364 4045 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Mar 08 03:45:01.205911 master-0 kubenswrapper[4045]: I0308 03:45:01.184374 4045 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Mar 08 03:45:01.205911 master-0 kubenswrapper[4045]: I0308 03:45:01.196518 4045 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Mar 08 03:45:01.205911 master-0 kubenswrapper[4045]: I0308 03:45:01.198420 4045 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Mar 08 03:45:01.205911 master-0 kubenswrapper[4045]: I0308 03:45:01.198490 4045 status_manager.go:217] "Starting to sync pod status with apiserver"
Mar 08 03:45:01.205911 master-0 kubenswrapper[4045]: I0308 03:45:01.198522 4045 kubelet.go:2335] "Starting kubelet main sync loop"
Mar 08 03:45:01.205911 master-0 kubenswrapper[4045]: E0308 03:45:01.198602 4045 kubelet.go:2359] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Mar 08 03:45:01.205911 master-0 kubenswrapper[4045]: W0308 03:45:01.199711 4045 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:45:01.205911 master-0 kubenswrapper[4045]: E0308 03:45:01.199779 4045 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 08 03:45:01.243422 master-0 kubenswrapper[4045]: E0308 03:45:01.243326 4045 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="400ms"
Mar 08 03:45:01.282552 master-0 kubenswrapper[4045]: I0308 03:45:01.282484 4045 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:45:01.283849 master-0 kubenswrapper[4045]: I0308 03:45:01.283775 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:45:01.283958 master-0 kubenswrapper[4045]: I0308 03:45:01.283864 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:45:01.283958 master-0 kubenswrapper[4045]: I0308 03:45:01.283884 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:45:01.283958 master-0 kubenswrapper[4045]: I0308 03:45:01.283939 4045 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 08 03:45:01.284814 master-0 kubenswrapper[4045]: E0308 03:45:01.284752 4045 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0"
Mar 08 03:45:01.299016 master-0 kubenswrapper[4045]: I0308 03:45:01.298939 4045 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["kube-system/bootstrap-kube-controller-manager-master-0","kube-system/bootstrap-kube-scheduler-master-0","openshift-machine-config-operator/kube-rbac-proxy-crio-master-0","openshift-etcd/etcd-master-0-master-0","openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"]
Mar 08 03:45:01.299128 master-0 kubenswrapper[4045]: I0308 03:45:01.299040 4045 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:45:01.300094 master-0 kubenswrapper[4045]: I0308 03:45:01.300046 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:45:01.300197 master-0 kubenswrapper[4045]: I0308 03:45:01.300099 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:45:01.300197 master-0 kubenswrapper[4045]: I0308 03:45:01.300119 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:45:01.300312 master-0 kubenswrapper[4045]: I0308 03:45:01.300274 4045 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:45:01.300476 master-0 kubenswrapper[4045]: I0308 03:45:01.300427 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 03:45:01.300544 master-0 kubenswrapper[4045]: I0308 03:45:01.300482 4045 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:45:01.301398 master-0 kubenswrapper[4045]: I0308 03:45:01.301359 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:45:01.301494 master-0 kubenswrapper[4045]: I0308 03:45:01.301408 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:45:01.301494 master-0 kubenswrapper[4045]: I0308 03:45:01.301428 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:45:01.301609 master-0 kubenswrapper[4045]: I0308 03:45:01.301498 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:45:01.301609 master-0 kubenswrapper[4045]: I0308 03:45:01.301529 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:45:01.301609 master-0 kubenswrapper[4045]: I0308 03:45:01.301544 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:45:01.301771 master-0 kubenswrapper[4045]: I0308 03:45:01.301647 4045 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:45:01.302068 master-0 kubenswrapper[4045]: I0308 03:45:01.302006 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 08 03:45:01.302192 master-0 kubenswrapper[4045]: I0308 03:45:01.302101 4045 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:45:01.302548 master-0 kubenswrapper[4045]: I0308 03:45:01.302476 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:45:01.302548 master-0 kubenswrapper[4045]: I0308 03:45:01.302520 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:45:01.302548 master-0 kubenswrapper[4045]: I0308 03:45:01.302537 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:45:01.302940 master-0 kubenswrapper[4045]: I0308 03:45:01.302719 4045 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:45:01.302940 master-0 kubenswrapper[4045]: I0308 03:45:01.302921 4045 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 08 03:45:01.303071 master-0 kubenswrapper[4045]: I0308 03:45:01.302969 4045 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:45:01.303802 master-0 kubenswrapper[4045]: I0308 03:45:01.303732 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:45:01.303802 master-0 kubenswrapper[4045]: I0308 03:45:01.303783 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:45:01.303802 master-0 kubenswrapper[4045]: I0308 03:45:01.303799 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:45:01.304115 master-0 kubenswrapper[4045]: I0308 03:45:01.303733 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:45:01.304115 master-0 kubenswrapper[4045]: I0308 03:45:01.304019 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:45:01.304115 master-0 kubenswrapper[4045]: I0308 03:45:01.304039 4045 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-master-0-master-0" Mar 08 03:45:01.304115 master-0 kubenswrapper[4045]: I0308 03:45:01.303951 4045 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:45:01.304115 master-0 kubenswrapper[4045]: I0308 03:45:01.304076 4045 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:45:01.304464 master-0 kubenswrapper[4045]: I0308 03:45:01.304040 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:45:01.305273 master-0 kubenswrapper[4045]: I0308 03:45:01.305206 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:45:01.305273 master-0 kubenswrapper[4045]: I0308 03:45:01.305258 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:45:01.305273 master-0 kubenswrapper[4045]: I0308 03:45:01.305275 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:45:01.305554 master-0 kubenswrapper[4045]: I0308 03:45:01.305328 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:45:01.305554 master-0 kubenswrapper[4045]: I0308 03:45:01.305359 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:45:01.305554 master-0 kubenswrapper[4045]: I0308 03:45:01.305374 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:45:01.305554 master-0 kubenswrapper[4045]: I0308 03:45:01.305333 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:45:01.305817 master-0 kubenswrapper[4045]: I0308 
03:45:01.305581 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:45:01.305817 master-0 kubenswrapper[4045]: I0308 03:45:01.305597 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:45:01.305817 master-0 kubenswrapper[4045]: I0308 03:45:01.305635 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 08 03:45:01.305817 master-0 kubenswrapper[4045]: I0308 03:45:01.305680 4045 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:45:01.307016 master-0 kubenswrapper[4045]: I0308 03:45:01.306970 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:45:01.307108 master-0 kubenswrapper[4045]: I0308 03:45:01.307083 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:45:01.307108 master-0 kubenswrapper[4045]: I0308 03:45:01.307104 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:45:01.338235 master-0 kubenswrapper[4045]: I0308 03:45:01.338146 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 08 03:45:01.338235 master-0 kubenswrapper[4045]: I0308 03:45:01.338205 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-audit-dir\") pod 
\"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 08 03:45:01.338235 master-0 kubenswrapper[4045]: I0308 03:45:01.338241 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:45:01.338499 master-0 kubenswrapper[4045]: I0308 03:45:01.338391 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 08 03:45:01.338666 master-0 kubenswrapper[4045]: I0308 03:45:01.338574 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 08 03:45:01.338760 master-0 kubenswrapper[4045]: I0308 03:45:01.338685 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-certs\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0" Mar 08 03:45:01.338900 master-0 kubenswrapper[4045]: I0308 03:45:01.338785 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: 
\"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0" Mar 08 03:45:01.339059 master-0 kubenswrapper[4045]: I0308 03:45:01.338957 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 08 03:45:01.339195 master-0 kubenswrapper[4045]: I0308 03:45:01.339144 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:45:01.339359 master-0 kubenswrapper[4045]: I0308 03:45:01.339246 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 08 03:45:01.339359 master-0 kubenswrapper[4045]: I0308 03:45:01.339319 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 08 03:45:01.339480 master-0 kubenswrapper[4045]: I0308 
03:45:01.339452 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:45:01.339597 master-0 kubenswrapper[4045]: I0308 03:45:01.339552 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:45:01.339655 master-0 kubenswrapper[4045]: I0308 03:45:01.339612 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 08 03:45:01.339655 master-0 kubenswrapper[4045]: I0308 03:45:01.339645 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 08 03:45:01.339942 master-0 kubenswrapper[4045]: I0308 03:45:01.339712 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: 
\"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 08 03:45:01.339942 master-0 kubenswrapper[4045]: I0308 03:45:01.339775 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:45:01.440521 master-0 kubenswrapper[4045]: I0308 03:45:01.440434 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 08 03:45:01.440521 master-0 kubenswrapper[4045]: I0308 03:45:01.440510 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 08 03:45:01.440761 master-0 kubenswrapper[4045]: I0308 03:45:01.440563 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:45:01.440761 master-0 kubenswrapper[4045]: I0308 03:45:01.440640 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: 
\"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 08 03:45:01.440945 master-0 kubenswrapper[4045]: I0308 03:45:01.440747 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 08 03:45:01.440945 master-0 kubenswrapper[4045]: I0308 03:45:01.440901 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 08 03:45:01.441085 master-0 kubenswrapper[4045]: I0308 03:45:01.440972 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 08 03:45:01.441085 master-0 kubenswrapper[4045]: I0308 03:45:01.441022 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:45:01.441085 master-0 kubenswrapper[4045]: I0308 03:45:01.441026 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"logs\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:45:01.441260 master-0 kubenswrapper[4045]: I0308 03:45:01.441106 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:45:01.441260 master-0 kubenswrapper[4045]: I0308 03:45:01.441114 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 08 03:45:01.441260 master-0 kubenswrapper[4045]: I0308 03:45:01.441119 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 08 03:45:01.441260 master-0 kubenswrapper[4045]: I0308 03:45:01.441204 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 08 03:45:01.441260 master-0 kubenswrapper[4045]: I0308 03:45:01.441156 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 08 03:45:01.441558 master-0 kubenswrapper[4045]: I0308 03:45:01.441286 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 08 03:45:01.441558 master-0 kubenswrapper[4045]: I0308 03:45:01.441326 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-certs\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0" Mar 08 03:45:01.441558 master-0 kubenswrapper[4045]: I0308 03:45:01.441356 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0" Mar 08 03:45:01.441558 master-0 kubenswrapper[4045]: I0308 03:45:01.441388 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 08 03:45:01.441558 master-0 kubenswrapper[4045]: I0308 03:45:01.441442 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: 
\"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-certs\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0" Mar 08 03:45:01.441558 master-0 kubenswrapper[4045]: I0308 03:45:01.441466 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 08 03:45:01.441558 master-0 kubenswrapper[4045]: I0308 03:45:01.441522 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 08 03:45:01.442008 master-0 kubenswrapper[4045]: I0308 03:45:01.441575 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0" Mar 08 03:45:01.442008 master-0 kubenswrapper[4045]: I0308 03:45:01.441622 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:45:01.442008 master-0 kubenswrapper[4045]: I0308 03:45:01.441673 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 08 03:45:01.442008 master-0 kubenswrapper[4045]: I0308 03:45:01.441680 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:45:01.442008 master-0 kubenswrapper[4045]: I0308 03:45:01.441717 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 08 03:45:01.442008 master-0 kubenswrapper[4045]: I0308 03:45:01.441755 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 08 03:45:01.442008 master-0 kubenswrapper[4045]: I0308 03:45:01.441764 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:45:01.442008 master-0 kubenswrapper[4045]: I0308 03:45:01.441866 4045 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:45:01.442008 master-0 kubenswrapper[4045]: I0308 03:45:01.441910 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 08 03:45:01.442008 master-0 kubenswrapper[4045]: I0308 03:45:01.441949 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:45:01.442008 master-0 kubenswrapper[4045]: I0308 03:45:01.442010 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:45:01.442749 master-0 kubenswrapper[4045]: I0308 03:45:01.442090 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 08 03:45:01.442749 master-0 kubenswrapper[4045]: I0308 03:45:01.442172 
4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 08 03:45:01.485907 master-0 kubenswrapper[4045]: I0308 03:45:01.485771 4045 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:45:01.489132 master-0 kubenswrapper[4045]: I0308 03:45:01.489076 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:45:01.489224 master-0 kubenswrapper[4045]: I0308 03:45:01.489174 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:45:01.489224 master-0 kubenswrapper[4045]: I0308 03:45:01.489198 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:45:01.489380 master-0 kubenswrapper[4045]: I0308 03:45:01.489341 4045 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 08 03:45:01.490711 master-0 kubenswrapper[4045]: E0308 03:45:01.490649 4045 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0"
Mar 08 03:45:01.644818 master-0 kubenswrapper[4045]: E0308 03:45:01.644595 4045 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="800ms"
Mar 08 03:45:01.652786 master-0 kubenswrapper[4045]: I0308 03:45:01.652725 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 03:45:01.664139 master-0 kubenswrapper[4045]: I0308 03:45:01.664093 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 08 03:45:01.688158 master-0 kubenswrapper[4045]: I0308 03:45:01.688076 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 08 03:45:01.699620 master-0 kubenswrapper[4045]: I0308 03:45:01.699558 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0-master-0"
Mar 08 03:45:01.720804 master-0 kubenswrapper[4045]: I0308 03:45:01.720711 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 03:45:01.891395 master-0 kubenswrapper[4045]: I0308 03:45:01.891263 4045 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:45:01.892793 master-0 kubenswrapper[4045]: I0308 03:45:01.892731 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:45:01.892793 master-0 kubenswrapper[4045]: I0308 03:45:01.892794 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:45:01.892971 master-0 kubenswrapper[4045]: I0308 03:45:01.892810 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:45:01.892971 master-0 kubenswrapper[4045]: I0308 03:45:01.892903 4045 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 08 03:45:01.894014 master-0 kubenswrapper[4045]: E0308 03:45:01.893955 4045 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0"
Mar 08 03:45:02.018947 master-0 kubenswrapper[4045]: W0308 03:45:02.018792 4045 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:45:02.018947 master-0 kubenswrapper[4045]: E0308 03:45:02.018943 4045 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 08 03:45:02.034275 master-0 kubenswrapper[4045]: I0308 03:45:02.034218 4045 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:45:02.301759 master-0 kubenswrapper[4045]: W0308 03:45:02.301548 4045 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:45:02.301759 master-0 kubenswrapper[4045]: E0308 03:45:02.301684 4045 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 08 03:45:02.325320 master-0 kubenswrapper[4045]: W0308 03:45:02.325245 4045 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:45:02.325320 master-0 kubenswrapper[4045]: E0308 03:45:02.325333 4045 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 08 03:45:02.446217 master-0 kubenswrapper[4045]: E0308 03:45:02.446123 4045 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="1.6s"
Mar 08 03:45:02.577979 master-0 kubenswrapper[4045]: W0308 03:45:02.577894 4045 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9add8df47182fc2eaf8cd78016ebe72.slice/crio-ed400b0e1b21fe5e4ef5385a05444bf39db4c2fd9c754a3d6c45427d3b29ef99 WatchSource:0}: Error finding container ed400b0e1b21fe5e4ef5385a05444bf39db4c2fd9c754a3d6c45427d3b29ef99: Status 404 returned error can't find the container with id ed400b0e1b21fe5e4ef5385a05444bf39db4c2fd9c754a3d6c45427d3b29ef99
Mar 08 03:45:02.581634 master-0 kubenswrapper[4045]: W0308 03:45:02.581565 4045 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod354f29997baa583b6238f7de9108ee10.slice/crio-604b5e18b0f1fc95cb4cabd9d6cb088bfcead3c4cba52acd70685e03b5856c7f WatchSource:0}: Error finding container 604b5e18b0f1fc95cb4cabd9d6cb088bfcead3c4cba52acd70685e03b5856c7f: Status 404 returned error can't find the container with id 604b5e18b0f1fc95cb4cabd9d6cb088bfcead3c4cba52acd70685e03b5856c7f
Mar 08 03:45:02.584320 master-0 kubenswrapper[4045]: I0308 03:45:02.584270 4045 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 08 03:45:02.611346 master-0 kubenswrapper[4045]: W0308 03:45:02.611270 4045 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:45:02.611457 master-0 kubenswrapper[4045]: E0308 03:45:02.611349 4045 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 08 03:45:02.611800 master-0 kubenswrapper[4045]: W0308 03:45:02.611715 4045 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1a56802af72ce1aac6b5077f1695ac0.slice/crio-5b8c31076d1db49fd8c133661fbbc131a58892112131cf3118f58212505e7460 WatchSource:0}: Error finding container 5b8c31076d1db49fd8c133661fbbc131a58892112131cf3118f58212505e7460: Status 404 returned error can't find the container with id 5b8c31076d1db49fd8c133661fbbc131a58892112131cf3118f58212505e7460
Mar 08 03:45:02.632257 master-0 kubenswrapper[4045]: W0308 03:45:02.632213 4045 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f77c8e18b751d90bc0dfe2d4e304050.slice/crio-5ffd7e8cf7a9593e9910a67c41b6e95af26b8d49eaf5fd007129fe49d1978425 WatchSource:0}: Error finding container 5ffd7e8cf7a9593e9910a67c41b6e95af26b8d49eaf5fd007129fe49d1978425: Status 404 returned error can't find the container with id 5ffd7e8cf7a9593e9910a67c41b6e95af26b8d49eaf5fd007129fe49d1978425
Mar 08 03:45:02.657981 master-0 kubenswrapper[4045]: W0308 03:45:02.657807 4045 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf78c05e1499b533b83f091333d61f045.slice/crio-758a2c2e2af7455b02804a595f36886f4047114b8dbd25a8393a292e35b7254e WatchSource:0}: Error finding container 758a2c2e2af7455b02804a595f36886f4047114b8dbd25a8393a292e35b7254e: Status 404 returned error can't find the container with id 758a2c2e2af7455b02804a595f36886f4047114b8dbd25a8393a292e35b7254e
Mar 08 03:45:02.694130 master-0 kubenswrapper[4045]: I0308 03:45:02.694051 4045 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:45:02.695517 master-0 kubenswrapper[4045]: I0308 03:45:02.695476 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:45:02.695517 master-0 kubenswrapper[4045]: I0308 03:45:02.695508 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:45:02.695517 master-0 kubenswrapper[4045]: I0308 03:45:02.695516 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:45:02.695692 master-0 kubenswrapper[4045]: I0308 03:45:02.695566 4045 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 08 03:45:02.696406 master-0 kubenswrapper[4045]: E0308 03:45:02.696352 4045 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0"
Mar 08 03:45:03.033871 master-0 kubenswrapper[4045]: I0308 03:45:03.033754 4045 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:45:03.145579 master-0 kubenswrapper[4045]: I0308 03:45:03.145510 4045 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 08 03:45:03.147187 master-0 kubenswrapper[4045]: E0308 03:45:03.147133 4045 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.sno.openstack.lab:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 08 03:45:03.205685 master-0 kubenswrapper[4045]: I0308 03:45:03.205528 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerStarted","Data":"758a2c2e2af7455b02804a595f36886f4047114b8dbd25a8393a292e35b7254e"}
Mar 08 03:45:03.206609 master-0 kubenswrapper[4045]: I0308 03:45:03.206553 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"5f77c8e18b751d90bc0dfe2d4e304050","Type":"ContainerStarted","Data":"5ffd7e8cf7a9593e9910a67c41b6e95af26b8d49eaf5fd007129fe49d1978425"}
Mar 08 03:45:03.209059 master-0 kubenswrapper[4045]: I0308 03:45:03.208998 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"a1a56802af72ce1aac6b5077f1695ac0","Type":"ContainerStarted","Data":"5b8c31076d1db49fd8c133661fbbc131a58892112131cf3118f58212505e7460"}
Mar 08 03:45:03.210162 master-0 kubenswrapper[4045]: I0308 03:45:03.210114 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"354f29997baa583b6238f7de9108ee10","Type":"ContainerStarted","Data":"604b5e18b0f1fc95cb4cabd9d6cb088bfcead3c4cba52acd70685e03b5856c7f"}
Mar 08 03:45:03.211316 master-0 kubenswrapper[4045]: I0308 03:45:03.211265 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerStarted","Data":"ed400b0e1b21fe5e4ef5385a05444bf39db4c2fd9c754a3d6c45427d3b29ef99"}
Mar 08 03:45:03.971055 master-0 kubenswrapper[4045]: W0308 03:45:03.970986 4045 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:45:03.971055 master-0 kubenswrapper[4045]: E0308 03:45:03.971049 4045 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 08 03:45:04.034129 master-0 kubenswrapper[4045]: I0308 03:45:04.034062 4045 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:45:04.047138 master-0 kubenswrapper[4045]: E0308 03:45:04.047081 4045 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="3.2s"
Mar 08 03:45:04.226892 master-0 kubenswrapper[4045]: W0308 03:45:04.226656 4045 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:45:04.226892 master-0 kubenswrapper[4045]: E0308 03:45:04.226737 4045 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 08 03:45:04.297523 master-0 kubenswrapper[4045]: I0308 03:45:04.297471 4045 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:45:04.298713 master-0 kubenswrapper[4045]: I0308 03:45:04.298675 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:45:04.298773 master-0 kubenswrapper[4045]: I0308 03:45:04.298720 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:45:04.298773 master-0 kubenswrapper[4045]: I0308 03:45:04.298738 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:45:04.298916 master-0 kubenswrapper[4045]: I0308 03:45:04.298798 4045 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 08 03:45:04.299734 master-0 kubenswrapper[4045]: E0308 03:45:04.299675 4045 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0"
Mar 08 03:45:04.860630 master-0 kubenswrapper[4045]: W0308 03:45:04.860570 4045 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:45:04.860630 master-0 kubenswrapper[4045]: E0308 03:45:04.860629 4045 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 08 03:45:05.033761 master-0 kubenswrapper[4045]: I0308 03:45:05.033709 4045 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:45:05.100231 master-0 kubenswrapper[4045]: W0308 03:45:05.100197 4045 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:45:05.100490 master-0 kubenswrapper[4045]: E0308 03:45:05.100250 4045 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 08 03:45:06.035132 master-0 kubenswrapper[4045]: I0308 03:45:06.035080 4045 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:45:06.221395 master-0 kubenswrapper[4045]: I0308 03:45:06.221330 4045 generic.go:334] "Generic (PLEG): container finished" podID="e9add8df47182fc2eaf8cd78016ebe72" containerID="809953c4a3d0d1d245b0d287db991ef24c93f664dbc39c226e2a89fc2ba7da3d" exitCode=0
Mar 08 03:45:06.222104 master-0 kubenswrapper[4045]: I0308 03:45:06.221526 4045 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:45:06.222104 master-0 kubenswrapper[4045]: I0308 03:45:06.221878 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerDied","Data":"809953c4a3d0d1d245b0d287db991ef24c93f664dbc39c226e2a89fc2ba7da3d"}
Mar 08 03:45:06.222590 master-0 kubenswrapper[4045]: I0308 03:45:06.222541 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:45:06.222643 master-0 kubenswrapper[4045]: I0308 03:45:06.222603 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:45:06.222643 master-0 kubenswrapper[4045]: I0308 03:45:06.222616 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:45:06.224814 master-0 kubenswrapper[4045]: I0308 03:45:06.224774 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"354f29997baa583b6238f7de9108ee10","Type":"ContainerStarted","Data":"c18b9f8b4dcef22d65b7b32df1f7077ca430d4d3cab49ca6d36290193d631e27"}
Mar 08 03:45:06.224881 master-0 kubenswrapper[4045]: I0308 03:45:06.224837 4045 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:45:06.225013 master-0 kubenswrapper[4045]: I0308 03:45:06.224840 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"354f29997baa583b6238f7de9108ee10","Type":"ContainerStarted","Data":"4573b175e4638284868f035fd979eb84c441b639f2cba6882ebb0bdabc7d53f1"}
Mar 08 03:45:06.225450 master-0 kubenswrapper[4045]: I0308 03:45:06.225424 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:45:06.225489 master-0 kubenswrapper[4045]: I0308 03:45:06.225452 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:45:06.225489 master-0 kubenswrapper[4045]: I0308 03:45:06.225461 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:45:06.670409 master-0 kubenswrapper[4045]: E0308 03:45:06.670168 4045 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/default/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{master-0.189ac0e6d88567c9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:45:01.031344073 +0000 UTC m=+0.642045061,LastTimestamp:2026-03-08 03:45:01.031344073 +0000 UTC m=+0.642045061,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 03:45:07.033753 master-0 kubenswrapper[4045]: I0308 03:45:07.033660 4045 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:45:07.228442 master-0 kubenswrapper[4045]: I0308 03:45:07.228393 4045 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_e9add8df47182fc2eaf8cd78016ebe72/kube-rbac-proxy-crio/0.log"
Mar 08 03:45:07.228915 master-0 kubenswrapper[4045]: I0308 03:45:07.228812 4045 generic.go:334] "Generic (PLEG): container finished" podID="e9add8df47182fc2eaf8cd78016ebe72" containerID="6c049cc62cac642e91ccc51178c5b9ce84741ffa408b3df71c9f7e5072ca001a" exitCode=1
Mar 08 03:45:07.228915 master-0 kubenswrapper[4045]: I0308 03:45:07.228886 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerDied","Data":"6c049cc62cac642e91ccc51178c5b9ce84741ffa408b3df71c9f7e5072ca001a"}
Mar 08 03:45:07.228980 master-0 kubenswrapper[4045]: I0308 03:45:07.228922 4045 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:45:07.229010 master-0 kubenswrapper[4045]: I0308 03:45:07.228924 4045 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:45:07.229879 master-0 kubenswrapper[4045]: I0308 03:45:07.229856 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:45:07.229936 master-0 kubenswrapper[4045]: I0308 03:45:07.229884 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:45:07.229936 master-0 kubenswrapper[4045]: I0308 03:45:07.229895 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:45:07.229988 master-0 kubenswrapper[4045]: I0308 03:45:07.229937 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:45:07.229988 master-0 kubenswrapper[4045]: I0308 03:45:07.229976 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:45:07.230040 master-0 kubenswrapper[4045]: I0308 03:45:07.229990 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:45:07.230217 master-0 kubenswrapper[4045]: I0308 03:45:07.230196 4045 scope.go:117] "RemoveContainer" containerID="6c049cc62cac642e91ccc51178c5b9ce84741ffa408b3df71c9f7e5072ca001a"
Mar 08 03:45:07.236872 master-0 kubenswrapper[4045]: I0308 03:45:07.236812 4045 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 08 03:45:07.238256 master-0 kubenswrapper[4045]: E0308 03:45:07.238218 4045 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.sno.openstack.lab:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 08 03:45:07.249131 master-0 kubenswrapper[4045]: E0308 03:45:07.249074 4045 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="6.4s"
Mar 08 03:45:07.500376 master-0 kubenswrapper[4045]: I0308 03:45:07.500262 4045 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:45:07.536349 master-0 kubenswrapper[4045]: I0308 03:45:07.536298 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:45:07.536349 master-0 kubenswrapper[4045]: I0308 03:45:07.536333 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:45:07.536349 master-0 kubenswrapper[4045]: I0308 03:45:07.536344 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:45:07.536650 master-0 kubenswrapper[4045]: I0308 03:45:07.536383 4045 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 08 03:45:07.537193 master-0 kubenswrapper[4045]: E0308 03:45:07.537149 4045 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0"
Mar 08 03:45:08.034386 master-0 kubenswrapper[4045]: I0308 03:45:08.034318 4045 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:45:08.407244 master-0 kubenswrapper[4045]: W0308 03:45:08.407120 4045 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:45:08.407244 master-0 kubenswrapper[4045]: E0308 03:45:08.407185 4045 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 08 03:45:09.014957 master-0 kubenswrapper[4045]: W0308 03:45:09.014860 4045 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:45:09.015197 master-0 kubenswrapper[4045]: E0308 03:45:09.015006 4045 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 08 03:45:09.033333 master-0 kubenswrapper[4045]: I0308 03:45:09.033289 4045 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:45:09.101429 master-0 kubenswrapper[4045]: W0308 03:45:09.101348 4045 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:45:09.101519 master-0 kubenswrapper[4045]: E0308 03:45:09.101447 4045 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 08 03:45:10.034039 master-0 kubenswrapper[4045]: I0308 03:45:10.033925 4045 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:45:10.239109 master-0 kubenswrapper[4045]: I0308 03:45:10.239023 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"a1a56802af72ce1aac6b5077f1695ac0","Type":"ContainerStarted","Data":"f5a0acfb3a3f4f285f366c3abcb3f9d3bebb3626e4a976de0dab27a634745185"}
Mar 08 03:45:10.239109 master-0 kubenswrapper[4045]: I0308 03:45:10.239067 4045 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:45:10.241045 master-0 kubenswrapper[4045]: I0308 03:45:10.240997 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:45:10.241146 master-0 kubenswrapper[4045]: I0308 03:45:10.241052 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:45:10.241146 master-0 kubenswrapper[4045]: I0308 03:45:10.241070 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:45:10.242947 master-0 kubenswrapper[4045]: I0308 03:45:10.242902 4045 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_e9add8df47182fc2eaf8cd78016ebe72/kube-rbac-proxy-crio/1.log"
Mar 08 03:45:10.243534 master-0 kubenswrapper[4045]: I0308 03:45:10.243486 4045 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_e9add8df47182fc2eaf8cd78016ebe72/kube-rbac-proxy-crio/0.log"
Mar 08 03:45:10.244098 master-0 kubenswrapper[4045]: I0308 03:45:10.244035 4045 generic.go:334] "Generic (PLEG): container finished" podID="e9add8df47182fc2eaf8cd78016ebe72" containerID="78066375e971a13b6e4a7c9120903810bb4af202f86064fd0ff8ab99a3010659" exitCode=1
Mar 08 03:45:10.244198 master-0 kubenswrapper[4045]: I0308 03:45:10.244089 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerDied","Data":"78066375e971a13b6e4a7c9120903810bb4af202f86064fd0ff8ab99a3010659"}
Mar 08 03:45:10.244198 master-0 kubenswrapper[4045]: I0308 03:45:10.244149 4045 scope.go:117] "RemoveContainer" containerID="6c049cc62cac642e91ccc51178c5b9ce84741ffa408b3df71c9f7e5072ca001a"
Mar 08 03:45:10.244319 master-0 kubenswrapper[4045]: I0308 03:45:10.244155 4045 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:45:10.245259 master-0 kubenswrapper[4045]: I0308 03:45:10.245193 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:45:10.245259 master-0 kubenswrapper[4045]: I0308 03:45:10.245252 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:45:10.245418 master-0 kubenswrapper[4045]: I0308 03:45:10.245272 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:45:10.245786 master-0 kubenswrapper[4045]: I0308 03:45:10.245721 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerStarted","Data":"724178cb9f231b822e2bf919b24049f88ede4ee540e7e7751c011ef4363756c9"}
Mar 08 03:45:10.245906 master-0 kubenswrapper[4045]: I0308 03:45:10.245743 4045 scope.go:117] "RemoveContainer" containerID="78066375e971a13b6e4a7c9120903810bb4af202f86064fd0ff8ab99a3010659"
Mar 08 03:45:10.246248 master-0 kubenswrapper[4045]: E0308 03:45:10.246192 4045 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(e9add8df47182fc2eaf8cd78016ebe72)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="e9add8df47182fc2eaf8cd78016ebe72"
Mar 08 03:45:10.249024 master-0 kubenswrapper[4045]: I0308 03:45:10.248971 4045 generic.go:334] "Generic (PLEG): container finished" podID="5f77c8e18b751d90bc0dfe2d4e304050" containerID="92b1c53e472127e182a48a6a8f941f7dd97106f322656ce4711b76ad8c4fc359" exitCode=0
Mar 08 03:45:10.249117 master-0 kubenswrapper[4045]: I0308 03:45:10.249030 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"5f77c8e18b751d90bc0dfe2d4e304050","Type":"ContainerDied","Data":"92b1c53e472127e182a48a6a8f941f7dd97106f322656ce4711b76ad8c4fc359"}
Mar 08 03:45:10.249197 master-0 kubenswrapper[4045]: I0308 03:45:10.249108 4045 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:45:10.250689 master-0 kubenswrapper[4045]: I0308 03:45:10.250636 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:45:10.250780 master-0 kubenswrapper[4045]: I0308 03:45:10.250691 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:45:10.250780 master-0 kubenswrapper[4045]: I0308 03:45:10.250714 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:45:10.255807 master-0 kubenswrapper[4045]: I0308 03:45:10.255779 4045 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:45:10.260802 master-0 kubenswrapper[4045]: I0308 03:45:10.260753 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:45:10.260986 master-0 kubenswrapper[4045]: I0308 03:45:10.260807 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:45:10.260986 master-0 kubenswrapper[4045]: I0308 03:45:10.260881 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:45:11.188683 master-0 kubenswrapper[4045]: E0308 03:45:11.188635 4045 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found"
Mar 08 03:45:11.255301 master-0 kubenswrapper[4045]: I0308 03:45:11.255214 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"5f77c8e18b751d90bc0dfe2d4e304050","Type":"ContainerStarted","Data":"a87790639e12c044cf9f716dfd6c742c89b97ffde357a755afcc44a38db6328d"}
Mar 08 03:45:11.258316 master-0 kubenswrapper[4045]: I0308 03:45:11.258283 4045 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_e9add8df47182fc2eaf8cd78016ebe72/kube-rbac-proxy-crio/1.log"
Mar 08 03:45:11.258926 master-0 kubenswrapper[4045]: I0308 03:45:11.258902 4045 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:45:11.259045 master-0 kubenswrapper[4045]: I0308 03:45:11.258992 4045 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:45:11.259646 master-0 kubenswrapper[4045]: I0308 03:45:11.259617 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:45:11.259701 master-0 kubenswrapper[4045]: I0308 03:45:11.259651 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:45:11.259701 master-0 kubenswrapper[4045]: I0308 03:45:11.259660 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:45:11.259991 master-0 kubenswrapper[4045]: I0308 03:45:11.259964 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:45:11.260040 master-0 kubenswrapper[4045]: I0308 03:45:11.260027 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:45:11.260086 master-0 kubenswrapper[4045]: I0308 03:45:11.260041 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:45:11.260568 master-0 kubenswrapper[4045]: I0308 03:45:11.260542 4045 scope.go:117] "RemoveContainer" containerID="78066375e971a13b6e4a7c9120903810bb4af202f86064fd0ff8ab99a3010659"
Mar 08 03:45:11.260913 master-0 kubenswrapper[4045]: E0308 03:45:11.260880 4045 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(e9add8df47182fc2eaf8cd78016ebe72)\""
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="e9add8df47182fc2eaf8cd78016ebe72" Mar 08 03:45:11.908799 master-0 kubenswrapper[4045]: I0308 03:45:11.907561 4045 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 03:45:11.908799 master-0 kubenswrapper[4045]: W0308 03:45:11.907591 4045 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 08 03:45:11.908799 master-0 kubenswrapper[4045]: E0308 03:45:11.907680 4045 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 08 03:45:12.046873 master-0 kubenswrapper[4045]: I0308 03:45:12.046559 4045 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 03:45:13.043417 master-0 kubenswrapper[4045]: I0308 03:45:13.043168 4045 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 03:45:13.657084 master-0 kubenswrapper[4045]: E0308 03:45:13.656932 4045 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"master-0\" is forbidden: 
User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 08 03:45:13.938281 master-0 kubenswrapper[4045]: I0308 03:45:13.938213 4045 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:45:13.939749 master-0 kubenswrapper[4045]: I0308 03:45:13.939691 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:45:13.939910 master-0 kubenswrapper[4045]: I0308 03:45:13.939764 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:45:13.939910 master-0 kubenswrapper[4045]: I0308 03:45:13.939783 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:45:13.939910 master-0 kubenswrapper[4045]: I0308 03:45:13.939886 4045 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 08 03:45:13.947642 master-0 kubenswrapper[4045]: E0308 03:45:13.947563 4045 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="master-0" Mar 08 03:45:14.038807 master-0 kubenswrapper[4045]: I0308 03:45:14.038693 4045 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 03:45:14.268655 master-0 kubenswrapper[4045]: I0308 03:45:14.268407 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"5f77c8e18b751d90bc0dfe2d4e304050","Type":"ContainerStarted","Data":"512d196861598af69e92dd9aa3d25b53c40e97b92520ddd9df4d73c8065df7e5"} Mar 
08 03:45:14.268655 master-0 kubenswrapper[4045]: I0308 03:45:14.268540 4045 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:45:14.270161 master-0 kubenswrapper[4045]: I0308 03:45:14.270109 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:45:14.270261 master-0 kubenswrapper[4045]: I0308 03:45:14.270164 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:45:14.270261 master-0 kubenswrapper[4045]: I0308 03:45:14.270182 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:45:14.272841 master-0 kubenswrapper[4045]: I0308 03:45:14.272761 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerStarted","Data":"b00978d6151280d243ba1f6c8276b934ba5c5276b57bc3800284f048820f905f"} Mar 08 03:45:14.272943 master-0 kubenswrapper[4045]: I0308 03:45:14.272893 4045 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:45:14.274051 master-0 kubenswrapper[4045]: I0308 03:45:14.273992 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:45:14.274127 master-0 kubenswrapper[4045]: I0308 03:45:14.274059 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:45:14.274127 master-0 kubenswrapper[4045]: I0308 03:45:14.274081 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:45:15.040003 master-0 kubenswrapper[4045]: I0308 03:45:15.039917 4045 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: 
csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 03:45:15.178451 master-0 kubenswrapper[4045]: I0308 03:45:15.178372 4045 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 08 03:45:15.185325 master-0 kubenswrapper[4045]: I0308 03:45:15.185267 4045 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 08 03:45:15.275429 master-0 kubenswrapper[4045]: I0308 03:45:15.275360 4045 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:45:15.276170 master-0 kubenswrapper[4045]: I0308 03:45:15.275610 4045 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 08 03:45:15.276170 master-0 kubenswrapper[4045]: I0308 03:45:15.275959 4045 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:45:15.276477 master-0 kubenswrapper[4045]: I0308 03:45:15.276431 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:45:15.276477 master-0 kubenswrapper[4045]: I0308 03:45:15.276488 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:45:15.276641 master-0 kubenswrapper[4045]: I0308 03:45:15.276510 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:45:15.277604 master-0 kubenswrapper[4045]: I0308 03:45:15.277556 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:45:15.277690 master-0 kubenswrapper[4045]: I0308 03:45:15.277615 4045 
kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:45:15.277690 master-0 kubenswrapper[4045]: I0308 03:45:15.277634 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:45:15.282667 master-0 kubenswrapper[4045]: I0308 03:45:15.282592 4045 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 08 03:45:15.299581 master-0 kubenswrapper[4045]: I0308 03:45:15.299440 4045 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 08 03:45:15.319347 master-0 kubenswrapper[4045]: I0308 03:45:15.319272 4045 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 08 03:45:15.963347 master-0 kubenswrapper[4045]: I0308 03:45:15.963173 4045 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:45:15.969998 master-0 kubenswrapper[4045]: I0308 03:45:15.969942 4045 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:45:16.040645 master-0 kubenswrapper[4045]: I0308 03:45:16.040600 4045 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 03:45:16.237271 master-0 kubenswrapper[4045]: I0308 03:45:16.237118 4045 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:45:16.277989 master-0 kubenswrapper[4045]: I0308 03:45:16.277917 4045 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Mar 08 03:45:16.278707 master-0 kubenswrapper[4045]: I0308 03:45:16.278029 4045 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:45:16.279281 master-0 kubenswrapper[4045]: I0308 03:45:16.279229 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:45:16.279281 master-0 kubenswrapper[4045]: I0308 03:45:16.279255 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:45:16.279442 master-0 kubenswrapper[4045]: I0308 03:45:16.279287 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:45:16.279442 master-0 kubenswrapper[4045]: I0308 03:45:16.279298 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:45:16.279442 master-0 kubenswrapper[4045]: I0308 03:45:16.279306 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:45:16.279442 master-0 kubenswrapper[4045]: I0308 03:45:16.279318 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:45:16.686344 master-0 kubenswrapper[4045]: E0308 03:45:16.686064 4045 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189ac0e6d88567c9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:45:01.031344073 +0000 UTC 
m=+0.642045061,LastTimestamp:2026-03-08 03:45:01.031344073 +0000 UTC m=+0.642045061,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:45:16.694103 master-0 kubenswrapper[4045]: E0308 03:45:16.693953 4045 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189ac0e6dc1de6bb default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:45:01.091669691 +0000 UTC m=+0.702370679,LastTimestamp:2026-03-08 03:45:01.091669691 +0000 UTC m=+0.702370679,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:45:16.700890 master-0 kubenswrapper[4045]: E0308 03:45:16.700732 4045 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189ac0e6dc1f1e75 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:45:01.091749493 +0000 UTC m=+0.702450461,LastTimestamp:2026-03-08 03:45:01.091749493 +0000 UTC m=+0.702450461,Count:1,Type:Normal,EventTime:0001-01-01 
00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:45:16.707760 master-0 kubenswrapper[4045]: E0308 03:45:16.707643 4045 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189ac0e6dc20a81d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:45:01.091850269 +0000 UTC m=+0.702551257,LastTimestamp:2026-03-08 03:45:01.091850269 +0000 UTC m=+0.702551257,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:45:16.713987 master-0 kubenswrapper[4045]: E0308 03:45:16.713868 4045 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189ac0e6e1a27380 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:45:01.18424256 +0000 UTC m=+0.794943528,LastTimestamp:2026-03-08 03:45:01.18424256 +0000 UTC m=+0.794943528,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:45:16.721976 master-0 
kubenswrapper[4045]: E0308 03:45:16.721798 4045 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189ac0e6dc1de6bb\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189ac0e6dc1de6bb default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:45:01.091669691 +0000 UTC m=+0.702370679,LastTimestamp:2026-03-08 03:45:01.283817111 +0000 UTC m=+0.894518099,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:45:16.728648 master-0 kubenswrapper[4045]: E0308 03:45:16.728531 4045 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189ac0e6dc1f1e75\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189ac0e6dc1f1e75 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:45:01.091749493 +0000 UTC m=+0.702450461,LastTimestamp:2026-03-08 03:45:01.283877627 +0000 UTC m=+0.894578625,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:45:16.735964 master-0 kubenswrapper[4045]: E0308 03:45:16.735787 4045 event.go:359] "Server 
rejected event (will not retry!)" err="events \"master-0.189ac0e6dc20a81d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189ac0e6dc20a81d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:45:01.091850269 +0000 UTC m=+0.702551257,LastTimestamp:2026-03-08 03:45:01.283897942 +0000 UTC m=+0.894598930,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:45:16.742718 master-0 kubenswrapper[4045]: E0308 03:45:16.742522 4045 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189ac0e6dc1de6bb\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189ac0e6dc1de6bb default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:45:01.091669691 +0000 UTC m=+0.702370679,LastTimestamp:2026-03-08 03:45:01.300079334 +0000 UTC m=+0.910780322,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:45:16.749607 master-0 kubenswrapper[4045]: E0308 03:45:16.749480 4045 event.go:359] "Server rejected event (will not retry!)" err="events 
\"master-0.189ac0e6dc1f1e75\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189ac0e6dc1f1e75 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:45:01.091749493 +0000 UTC m=+0.702450461,LastTimestamp:2026-03-08 03:45:01.300113337 +0000 UTC m=+0.910814335,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:45:16.756040 master-0 kubenswrapper[4045]: E0308 03:45:16.755863 4045 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189ac0e6dc20a81d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189ac0e6dc20a81d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:45:01.091850269 +0000 UTC m=+0.702551257,LastTimestamp:2026-03-08 03:45:01.300129023 +0000 UTC m=+0.910830021,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:45:16.764191 master-0 kubenswrapper[4045]: E0308 03:45:16.764032 4045 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189ac0e6dc1de6bb\" is forbidden: User \"system:anonymous\" 
cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189ac0e6dc1de6bb default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:45:01.091669691 +0000 UTC m=+0.702370679,LastTimestamp:2026-03-08 03:45:01.301395884 +0000 UTC m=+0.912096882,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:45:16.771345 master-0 kubenswrapper[4045]: E0308 03:45:16.771170 4045 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189ac0e6dc1f1e75\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189ac0e6dc1f1e75 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:45:01.091749493 +0000 UTC m=+0.702450461,LastTimestamp:2026-03-08 03:45:01.301420399 +0000 UTC m=+0.912121397,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:45:16.778818 master-0 kubenswrapper[4045]: E0308 03:45:16.778643 4045 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189ac0e6dc20a81d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace 
\"default\"" event="&Event{ObjectMeta:{master-0.189ac0e6dc20a81d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:45:01.091850269 +0000 UTC m=+0.702551257,LastTimestamp:2026-03-08 03:45:01.301438025 +0000 UTC m=+0.912139013,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:45:16.793189 master-0 kubenswrapper[4045]: E0308 03:45:16.792952 4045 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189ac0e6dc1de6bb\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189ac0e6dc1de6bb default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:45:01.091669691 +0000 UTC m=+0.702370679,LastTimestamp:2026-03-08 03:45:01.301520466 +0000 UTC m=+0.912221454,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:45:16.800089 master-0 kubenswrapper[4045]: E0308 03:45:16.799958 4045 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189ac0e6dc1f1e75\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189ac0e6dc1f1e75 
default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:45:01.091749493 +0000 UTC m=+0.702450461,LastTimestamp:2026-03-08 03:45:01.301538731 +0000 UTC m=+0.912239719,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:45:16.807429 master-0 kubenswrapper[4045]: E0308 03:45:16.807212 4045 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189ac0e6dc20a81d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189ac0e6dc20a81d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:45:01.091850269 +0000 UTC m=+0.702551257,LastTimestamp:2026-03-08 03:45:01.301553318 +0000 UTC m=+0.912254306,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:45:16.815328 master-0 kubenswrapper[4045]: E0308 03:45:16.815113 4045 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189ac0e6dc1de6bb\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189ac0e6dc1de6bb default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:45:01.091669691 +0000 UTC m=+0.702370679,LastTimestamp:2026-03-08 03:45:01.302502351 +0000 UTC m=+0.913203339,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:45:16.822693 master-0 kubenswrapper[4045]: E0308 03:45:16.822518 4045 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189ac0e6dc1f1e75\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189ac0e6dc1f1e75 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:45:01.091749493 +0000 UTC m=+0.702450461,LastTimestamp:2026-03-08 03:45:01.302530844 +0000 UTC m=+0.913231842,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:45:16.830154 master-0 kubenswrapper[4045]: E0308 03:45:16.829984 4045 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189ac0e6dc20a81d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189ac0e6dc20a81d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:45:01.091850269 +0000 UTC m=+0.702551257,LastTimestamp:2026-03-08 03:45:01.302546261 +0000 UTC m=+0.913247249,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:45:16.836380 master-0 kubenswrapper[4045]: E0308 03:45:16.836230 4045 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189ac0e6dc1de6bb\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189ac0e6dc1de6bb default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:45:01.091669691 +0000 UTC m=+0.702370679,LastTimestamp:2026-03-08 03:45:01.303774771 +0000 UTC m=+0.914475759,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:45:16.843057 master-0 kubenswrapper[4045]: E0308 03:45:16.842818 4045 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189ac0e6dc1f1e75\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189ac0e6dc1f1e75 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:45:01.091749493 +0000 UTC m=+0.702450461,LastTimestamp:2026-03-08 03:45:01.303794116 +0000 UTC m=+0.914495104,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:45:16.849820 master-0 kubenswrapper[4045]: E0308 03:45:16.849688 4045 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189ac0e6dc20a81d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189ac0e6dc20a81d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:45:01.091850269 +0000 UTC m=+0.702551257,LastTimestamp:2026-03-08 03:45:01.303809813 +0000 UTC m=+0.914510811,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:45:16.856517 master-0 kubenswrapper[4045]: E0308 03:45:16.856390 4045 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189ac0e6dc1de6bb\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189ac0e6dc1de6bb default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:45:01.091669691 +0000 UTC m=+0.702370679,LastTimestamp:2026-03-08 03:45:01.303969816 +0000 UTC m=+0.914670814,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:45:16.863208 master-0 kubenswrapper[4045]: E0308 03:45:16.863080 4045 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189ac0e6dc1f1e75\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189ac0e6dc1f1e75 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:45:01.091749493 +0000 UTC m=+0.702450461,LastTimestamp:2026-03-08 03:45:01.304033672 +0000 UTC m=+0.914734670,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:45:16.872120 master-0 kubenswrapper[4045]: E0308 03:45:16.871882 4045 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189ac0e735138abb openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8677f7a973553c25d282bc249fc8bc0f5aa42fb144ea0956d1f04c5a6cd80501\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:45:02.584163003 +0000 UTC m=+2.194863991,LastTimestamp:2026-03-08 03:45:02.584163003 +0000 UTC m=+2.194863991,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:45:16.879401 master-0 kubenswrapper[4045]: E0308 03:45:16.879231 4045 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189ac0e7359c1194 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:354f29997baa583b6238f7de9108ee10,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cc20748723f55f960cfb6328d1591880bbd1b3452155633996d4f41fc7c5f46b\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:45:02.59311042 +0000 UTC m=+2.203811378,LastTimestamp:2026-03-08 03:45:02.59311042 +0000 UTC m=+2.203811378,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:45:16.885550 master-0 kubenswrapper[4045]: E0308 03:45:16.885407 4045 event.go:359] "Server rejected event (will not retry!)" err="events is 
forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.189ac0e7372767b8 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:a1a56802af72ce1aac6b5077f1695ac0,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:45:02.619019192 +0000 UTC m=+2.229720180,LastTimestamp:2026-03-08 03:45:02.619019192 +0000 UTC m=+2.229720180,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:45:16.891929 master-0 kubenswrapper[4045]: E0308 03:45:16.891761 4045 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189ac0e738615c2e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:45:02.639594542 +0000 UTC m=+2.250295500,LastTimestamp:2026-03-08 03:45:02.639594542 +0000 UTC 
m=+2.250295500,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:45:16.897552 master-0 kubenswrapper[4045]: E0308 03:45:16.897366 4045 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189ac0e739a0c1b6 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:f78c05e1499b533b83f091333d61f045,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:45:02.660526518 +0000 UTC m=+2.271227466,LastTimestamp:2026-03-08 03:45:02.660526518 +0000 UTC m=+2.271227466,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:45:16.909558 master-0 kubenswrapper[4045]: E0308 03:45:16.908951 4045 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189ac0e7ca73aed3 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8677f7a973553c25d282bc249fc8bc0f5aa42fb144ea0956d1f04c5a6cd80501\" in 2.506s (2.506s including waiting). Image size: 465086330 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:45:05.090268883 +0000 UTC m=+4.700969831,LastTimestamp:2026-03-08 03:45:05.090268883 +0000 UTC m=+4.700969831,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:45:16.918251 master-0 kubenswrapper[4045]: E0308 03:45:16.917672 4045 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189ac0e7cbb17686 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:354f29997baa583b6238f7de9108ee10,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cc20748723f55f960cfb6328d1591880bbd1b3452155633996d4f41fc7c5f46b\" in 2.517s (2.517s including waiting). 
Image size: 529324693 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:45:05.111094918 +0000 UTC m=+4.721795876,LastTimestamp:2026-03-08 03:45:05.111094918 +0000 UTC m=+4.721795876,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:45:16.924690 master-0 kubenswrapper[4045]: E0308 03:45:16.924602 4045 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189ac0e7d4e64777 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container: setup,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:45:05.265551223 +0000 UTC m=+4.876252181,LastTimestamp:2026-03-08 03:45:05.265551223 +0000 UTC m=+4.876252181,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:45:16.931863 master-0 kubenswrapper[4045]: E0308 03:45:16.931634 4045 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189ac0e7d5840e96 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:354f29997baa583b6238f7de9108ee10,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container: etcdctl,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:45:05.27589135 +0000 UTC m=+4.886592308,LastTimestamp:2026-03-08 03:45:05.27589135 +0000 UTC m=+4.886592308,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:45:16.937789 master-0 kubenswrapper[4045]: E0308 03:45:16.937622 4045 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189ac0e7d67918c9 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:45:05.291950281 +0000 UTC m=+4.902651229,LastTimestamp:2026-03-08 03:45:05.291950281 +0000 UTC m=+4.902651229,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:45:16.944876 master-0 kubenswrapper[4045]: E0308 03:45:16.944689 4045 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-master-0-master-0.189ac0e7d7028d91 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:354f29997baa583b6238f7de9108ee10,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:45:05.300958609 +0000 UTC m=+4.911659567,LastTimestamp:2026-03-08 03:45:05.300958609 +0000 UTC m=+4.911659567,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:45:16.955945 master-0 kubenswrapper[4045]: E0308 03:45:16.955711 4045 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189ac0e7d79fccfc openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:354f29997baa583b6238f7de9108ee10,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cc20748723f55f960cfb6328d1591880bbd1b3452155633996d4f41fc7c5f46b\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:45:05.311263996 +0000 UTC m=+4.921964964,LastTimestamp:2026-03-08 03:45:05.311263996 +0000 UTC m=+4.921964964,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:45:16.963333 master-0 kubenswrapper[4045]: E0308 03:45:16.963194 4045 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189ac0e7e401f176 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:354f29997baa583b6238f7de9108ee10,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container: etcd,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:45:05.519022454 +0000 UTC m=+5.129723412,LastTimestamp:2026-03-08 03:45:05.519022454 +0000 UTC m=+5.129723412,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:45:16.969568 master-0 kubenswrapper[4045]: E0308 03:45:16.969460 4045 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189ac0e7e4b1b247 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:354f29997baa583b6238f7de9108ee10,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:45:05.530540615 +0000 UTC m=+5.141241573,LastTimestamp:2026-03-08 03:45:05.530540615 +0000 UTC m=+5.141241573,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:45:16.974329 master-0 kubenswrapper[4045]: E0308 03:45:16.974174 4045 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189ac0e80e30d8be openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8677f7a973553c25d282bc249fc8bc0f5aa42fb144ea0956d1f04c5a6cd80501\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:45:06.22673939 +0000 UTC m=+5.837440348,LastTimestamp:2026-03-08 03:45:06.22673939 +0000 UTC m=+5.837440348,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:45:16.980967 master-0 kubenswrapper[4045]: E0308 03:45:16.979547 4045 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189ac0e818f47d62 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container: kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:45:06.407333218 +0000 UTC m=+6.018034186,LastTimestamp:2026-03-08 
03:45:06.407333218 +0000 UTC m=+6.018034186,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:45:16.992258 master-0 kubenswrapper[4045]: E0308 03:45:16.992096 4045 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189ac0e819d5926b openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:45:06.422084203 +0000 UTC m=+6.032785171,LastTimestamp:2026-03-08 03:45:06.422084203 +0000 UTC m=+6.032785171,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:45:17.000106 master-0 kubenswrapper[4045]: E0308 03:45:16.999973 4045 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189ac0e80e30d8be\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189ac0e80e30d8be openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8677f7a973553c25d282bc249fc8bc0f5aa42fb144ea0956d1f04c5a6cd80501\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:45:06.22673939 +0000 UTC m=+5.837440348,LastTimestamp:2026-03-08 03:45:09.594257909 +0000 UTC m=+9.204958877,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:45:17.005103 master-0 kubenswrapper[4045]: E0308 03:45:17.003968 4045 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189ac0e8dca03cdd openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\" in 7.05s (7.05s including waiting). 
Image size: 943837171 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:45:09.690146013 +0000 UTC m=+9.300847021,LastTimestamp:2026-03-08 03:45:09.690146013 +0000 UTC m=+9.300847021,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:45:17.010237 master-0 kubenswrapper[4045]: E0308 03:45:17.010110 4045 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.189ac0e8dee6a78d kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:a1a56802af72ce1aac6b5077f1695ac0,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\" in 7.109s (7.109s including waiting). 
Image size: 943837171 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:45:09.728315277 +0000 UTC m=+9.339016225,LastTimestamp:2026-03-08 03:45:09.728315277 +0000 UTC m=+9.339016225,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:45:17.017089 master-0 kubenswrapper[4045]: E0308 03:45:17.016949 4045 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189ac0e8e01a75b1 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:f78c05e1499b533b83f091333d61f045,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\" in 7.087s (7.087s including waiting). 
Image size: 943837171 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:45:09.748487601 +0000 UTC m=+9.359188579,LastTimestamp:2026-03-08 03:45:09.748487601 +0000 UTC m=+9.359188579,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 03:45:17.022604 master-0 kubenswrapper[4045]: E0308 03:45:17.022404 4045 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189ac0e818f47d62\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189ac0e818f47d62 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container: kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:45:06.407333218 +0000 UTC m=+6.018034186,LastTimestamp:2026-03-08 03:45:09.87509587 +0000 UTC m=+9.485796838,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 03:45:17.029991 master-0 kubenswrapper[4045]: E0308 03:45:17.029877 4045 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189ac0e819d5926b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189ac0e819d5926b openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:45:06.422084203 +0000 UTC m=+6.032785171,LastTimestamp:2026-03-08 03:45:09.898907781 +0000 UTC m=+9.509608749,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 03:45:17.036606 master-0 kubenswrapper[4045]: I0308 03:45:17.036573 4045 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 08 03:45:17.037330 master-0 kubenswrapper[4045]: E0308 03:45:17.037137 4045 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.189ac0e8eba7a70d kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:a1a56802af72ce1aac6b5077f1695ac0,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container: kube-scheduler,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:45:09.942290189 +0000 UTC m=+9.552991157,LastTimestamp:2026-03-08 03:45:09.942290189 +0000 UTC m=+9.552991157,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 03:45:17.040888 master-0 kubenswrapper[4045]: E0308 03:45:17.040411 4045 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.189ac0e8ec3adf3b kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:a1a56802af72ce1aac6b5077f1695ac0,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:45:09.951938363 +0000 UTC m=+9.562639341,LastTimestamp:2026-03-08 03:45:09.951938363 +0000 UTC m=+9.562639341,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 03:45:17.044588 master-0 kubenswrapper[4045]: E0308 03:45:17.044416 4045 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189ac0e8ed16616d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container: setup,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:45:09.966324077 +0000 UTC m=+9.577025035,LastTimestamp:2026-03-08 03:45:09.966324077 +0000 UTC m=+9.577025035,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 03:45:17.051023 master-0 kubenswrapper[4045]: E0308 03:45:17.050930 4045 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189ac0e8edaa9d06 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:45:09.976038662 +0000 UTC m=+9.586739620,LastTimestamp:2026-03-08 03:45:09.976038662 +0000 UTC m=+9.586739620,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 03:45:17.056060 master-0 kubenswrapper[4045]: E0308 03:45:17.055898 4045 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189ac0e8f4569d0f kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:f78c05e1499b533b83f091333d61f045,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container: kube-controller-manager,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:45:10.087974159 +0000 UTC m=+9.698675117,LastTimestamp:2026-03-08 03:45:10.087974159 +0000 UTC m=+9.698675117,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 03:45:17.061752 master-0 kubenswrapper[4045]: E0308 03:45:17.061639 4045 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189ac0e8f516fab1 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:f78c05e1499b533b83f091333d61f045,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:45:10.100581041 +0000 UTC m=+9.711282039,LastTimestamp:2026-03-08 03:45:10.100581041 +0000 UTC m=+9.711282039,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 03:45:17.066658 master-0 kubenswrapper[4045]: E0308 03:45:17.066528 4045 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189ac0e8f52a49f6 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:f78c05e1499b533b83f091333d61f045,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a324f47cf789c0480fa4bcb0812152abc3cd844318bab193108fe4349eed609\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:45:10.101846518 +0000 UTC m=+9.712547486,LastTimestamp:2026-03-08 03:45:10.101846518 +0000 UTC m=+9.712547486,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 03:45:17.071999 master-0 kubenswrapper[4045]: E0308 03:45:17.071763 4045 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189ac0e8fdc3aa6b openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:BackOff,Message:Back-off restarting failed container kube-rbac-proxy-crio in pod kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(e9add8df47182fc2eaf8cd78016ebe72),Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:45:10.246115947 +0000 UTC m=+9.856816945,LastTimestamp:2026-03-08 03:45:10.246115947 +0000 UTC m=+9.856816945,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 03:45:17.077203 master-0 kubenswrapper[4045]: E0308 03:45:17.077016 4045 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189ac0e8fe55be5d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:45:10.255689309 +0000 UTC m=+9.866390297,LastTimestamp:2026-03-08 03:45:10.255689309 +0000 UTC m=+9.866390297,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 03:45:17.083184 master-0 kubenswrapper[4045]: E0308 03:45:17.083007 4045 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189ac0e90e4aee1e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container: kube-apiserver,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:45:10.523416094 +0000 UTC m=+10.134117092,LastTimestamp:2026-03-08 03:45:10.523416094 +0000 UTC m=+10.134117092,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 03:45:17.089425 master-0 kubenswrapper[4045]: E0308 03:45:17.089267 4045 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189ac0e90f22af81 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:45:10.537555841 +0000 UTC m=+10.148256829,LastTimestamp:2026-03-08 03:45:10.537555841 +0000 UTC m=+10.148256829,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 03:45:17.096515 master-0 kubenswrapper[4045]: E0308 03:45:17.096374 4045 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189ac0e90f3133d2 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5500329ab50804678fb8a90b96bf2a469bca16b620fb6dd2f5f5a17106e94898\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:45:10.538507218 +0000 UTC m=+10.149208216,LastTimestamp:2026-03-08 03:45:10.538507218 +0000 UTC m=+10.149208216,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 03:45:17.105158 master-0 kubenswrapper[4045]: E0308 03:45:17.105025 4045 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189ac0e8fdc3aa6b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189ac0e8fdc3aa6b openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:BackOff,Message:Back-off restarting failed container kube-rbac-proxy-crio in pod kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(e9add8df47182fc2eaf8cd78016ebe72),Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:45:10.246115947 +0000 UTC m=+9.856816945,LastTimestamp:2026-03-08 03:45:11.260816342 +0000 UTC m=+10.871517310,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 03:45:17.112817 master-0 kubenswrapper[4045]: E0308 03:45:17.112633 4045 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189ac0e9d3bf3cbd kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:f78c05e1499b533b83f091333d61f045,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a324f47cf789c0480fa4bcb0812152abc3cd844318bab193108fe4349eed609\" in 3.734s (3.734s including waiting). Image size: 505242594 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:45:13.836149949 +0000 UTC m=+13.446850947,LastTimestamp:2026-03-08 03:45:13.836149949 +0000 UTC m=+13.446850947,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 03:45:17.120352 master-0 kubenswrapper[4045]: E0308 03:45:17.120192 4045 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189ac0e9d549ef35 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5500329ab50804678fb8a90b96bf2a469bca16b620fb6dd2f5f5a17106e94898\" in 3.323s (3.323s including waiting). Image size: 514980169 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:45:13.862016821 +0000 UTC m=+13.472717819,LastTimestamp:2026-03-08 03:45:13.862016821 +0000 UTC m=+13.472717819,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 03:45:17.127209 master-0 kubenswrapper[4045]: E0308 03:45:17.127078 4045 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189ac0e9e2b5ff3f kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:f78c05e1499b533b83f091333d61f045,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container: cluster-policy-controller,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:45:14.087202623 +0000 UTC m=+13.697903611,LastTimestamp:2026-03-08 03:45:14.087202623 +0000 UTC m=+13.697903611,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 03:45:17.134357 master-0 kubenswrapper[4045]: E0308 03:45:17.134204 4045 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189ac0e9e2bd199f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container: kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:45:14.087668127 +0000 UTC m=+13.698369115,LastTimestamp:2026-03-08 03:45:14.087668127 +0000 UTC m=+13.698369115,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 03:45:17.140932 master-0 kubenswrapper[4045]: E0308 03:45:17.140807 4045 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189ac0e9e393745d kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:f78c05e1499b533b83f091333d61f045,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:45:14.101716061 +0000 UTC m=+13.712417049,LastTimestamp:2026-03-08 03:45:14.101716061 +0000 UTC m=+13.712417049,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 03:45:17.147841 master-0 kubenswrapper[4045]: E0308 03:45:17.147706 4045 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189ac0e9e3a233dd openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:45:14.102682589 +0000 UTC m=+13.713383587,LastTimestamp:2026-03-08 03:45:14.102682589 +0000 UTC m=+13.713383587,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 03:45:17.280635 master-0 kubenswrapper[4045]: I0308 03:45:17.280482 4045 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:45:17.280635 master-0 kubenswrapper[4045]: I0308 03:45:17.280538 4045 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:45:17.281764 master-0 kubenswrapper[4045]: I0308 03:45:17.281690 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:45:17.281859 master-0 kubenswrapper[4045]: I0308 03:45:17.281692 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:45:17.281859 master-0 kubenswrapper[4045]: I0308 03:45:17.281817 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:45:17.281943 master-0 kubenswrapper[4045]: I0308 03:45:17.281878 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:45:17.282035 master-0 kubenswrapper[4045]: I0308 03:45:17.281771 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:45:17.282082 master-0 kubenswrapper[4045]: I0308 03:45:17.282034 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:45:18.040591 master-0 kubenswrapper[4045]: I0308 03:45:18.040545 4045 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 08 03:45:19.051415 master-0 kubenswrapper[4045]: I0308 03:45:19.051368 4045 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 08 03:45:19.357489 master-0 kubenswrapper[4045]: I0308 03:45:19.357083 4045 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 03:45:19.357489 master-0 kubenswrapper[4045]: I0308 03:45:19.357262 4045 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:45:19.358484 master-0 kubenswrapper[4045]: I0308 03:45:19.358417 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:45:19.358619 master-0 kubenswrapper[4045]: I0308 03:45:19.358502 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:45:19.358619 master-0 kubenswrapper[4045]: I0308 03:45:19.358526 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:45:19.363572 master-0 kubenswrapper[4045]: I0308 03:45:19.363524 4045 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 03:45:19.457809 master-0 kubenswrapper[4045]: W0308 03:45:19.457745 4045 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "master-0" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope
Mar 08 03:45:19.457968 master-0 kubenswrapper[4045]: E0308 03:45:19.457860 4045 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"master-0\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError"
Mar 08 03:45:20.040392 master-0 kubenswrapper[4045]: I0308 03:45:20.040290 4045 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 08 03:45:20.289210 master-0 kubenswrapper[4045]: I0308 03:45:20.288365 4045 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:45:20.289210 master-0 kubenswrapper[4045]: I0308 03:45:20.288590 4045 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 03:45:20.290876 master-0 kubenswrapper[4045]: I0308 03:45:20.290797 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:45:20.291002 master-0 kubenswrapper[4045]: I0308 03:45:20.290887 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:45:20.291002 master-0 kubenswrapper[4045]: I0308 03:45:20.290907 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:45:20.295327 master-0 kubenswrapper[4045]: I0308 03:45:20.295281 4045 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 03:45:20.669641 master-0 kubenswrapper[4045]: E0308 03:45:20.667977 4045 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"master-0\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 08 03:45:20.766008 master-0 kubenswrapper[4045]: W0308 03:45:20.765943 4045 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope
Mar 08 03:45:20.766008 master-0 kubenswrapper[4045]: E0308 03:45:20.766010 4045 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError"
Mar 08 03:45:20.836099 master-0 kubenswrapper[4045]: I0308 03:45:20.836013 4045 csr.go:261] certificate signing request csr-knlqm is approved, waiting to be issued
Mar 08 03:45:20.947795 master-0 kubenswrapper[4045]: I0308 03:45:20.947660 4045 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:45:20.949238 master-0 kubenswrapper[4045]: I0308 03:45:20.949177 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:45:20.949349 master-0 kubenswrapper[4045]: I0308 03:45:20.949244 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:45:20.949349 master-0 kubenswrapper[4045]: I0308 03:45:20.949262 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:45:20.949349 master-0 kubenswrapper[4045]: I0308 03:45:20.949317 4045 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 08 03:45:20.957847 master-0 kubenswrapper[4045]: E0308 03:45:20.957737 4045 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="master-0"
Mar 08 03:45:21.043044 master-0 kubenswrapper[4045]: I0308 03:45:21.042956 4045 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 08 03:45:21.189359 master-0 kubenswrapper[4045]: E0308 03:45:21.189267 4045 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found"
Mar 08 03:45:21.291444 master-0 kubenswrapper[4045]: I0308 03:45:21.291289 4045 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:45:21.292893 master-0 kubenswrapper[4045]: I0308 03:45:21.292855 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:45:21.293020 master-0 kubenswrapper[4045]: I0308 03:45:21.292914 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:45:21.293020 master-0 kubenswrapper[4045]: I0308 03:45:21.292941 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:45:21.637779 master-0 kubenswrapper[4045]: W0308 03:45:21.637612 4045 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope
Mar 08 03:45:21.637779 master-0 kubenswrapper[4045]: E0308 03:45:21.637710 4045 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError"
Mar 08 03:45:22.041582 master-0 kubenswrapper[4045]: I0308 03:45:22.041521 4045 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 08 03:45:22.294140 master-0 kubenswrapper[4045]: I0308 03:45:22.293986 4045 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:45:22.295467 master-0 kubenswrapper[4045]: I0308 03:45:22.295404 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:45:22.295467 master-0 kubenswrapper[4045]: I0308 03:45:22.295466 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:45:22.295642 master-0 kubenswrapper[4045]: I0308 03:45:22.295489 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:45:23.041251 master-0 kubenswrapper[4045]: I0308 03:45:23.041181 4045 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 08 03:45:23.599944 master-0 kubenswrapper[4045]: W0308 03:45:23.599872 4045 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
Mar 08 03:45:23.600916 master-0 kubenswrapper[4045]: E0308 03:45:23.599951 4045 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError"
Mar 08 03:45:24.041076 master-0 kubenswrapper[4045]: I0308 03:45:24.041006 4045 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 08 03:45:25.041059 master-0 kubenswrapper[4045]: I0308 03:45:25.040921 4045 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 08 03:45:26.040413 master-0 kubenswrapper[4045]: I0308 03:45:26.040281 4045 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 08 03:45:26.199292 master-0 kubenswrapper[4045]: I0308 03:45:26.199190 4045 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:45:26.200591 master-0 kubenswrapper[4045]: I0308 03:45:26.200525 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:45:26.200671 master-0 kubenswrapper[4045]: I0308 03:45:26.200628 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:45:26.200671 master-0 kubenswrapper[4045]: I0308 03:45:26.200654 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:45:26.201179 master-0 kubenswrapper[4045]: I0308 03:45:26.201142 4045 scope.go:117] "RemoveContainer" containerID="78066375e971a13b6e4a7c9120903810bb4af202f86064fd0ff8ab99a3010659"
Mar 08 03:45:26.213880 master-0 kubenswrapper[4045]: E0308 03:45:26.213686 4045 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189ac0e80e30d8be\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189ac0e80e30d8be openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8677f7a973553c25d282bc249fc8bc0f5aa42fb144ea0956d1f04c5a6cd80501\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:45:06.22673939 +0000 UTC m=+5.837440348,LastTimestamp:2026-03-08 03:45:26.20485774 +0000 UTC m=+25.815558728,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 03:45:26.247708 master-0 kubenswrapper[4045]: I0308 03:45:26.247661 4045 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 03:45:26.247949 master-0 kubenswrapper[4045]: I0308 03:45:26.247856 4045 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:45:26.249321 master-0 kubenswrapper[4045]: I0308 03:45:26.249241 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:45:26.249321 master-0 kubenswrapper[4045]: I0308 03:45:26.249291 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:45:26.249499 master-0 kubenswrapper[4045]: I0308 03:45:26.249340 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:45:26.483998 master-0 kubenswrapper[4045]: E0308 03:45:26.483805 4045 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189ac0e818f47d62\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189ac0e818f47d62 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] []
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container: kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:45:06.407333218 +0000 UTC m=+6.018034186,LastTimestamp:2026-03-08 03:45:26.475925223 +0000 UTC m=+26.086626211,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:45:26.500466 master-0 kubenswrapper[4045]: E0308 03:45:26.500308 4045 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189ac0e819d5926b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189ac0e819d5926b openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:45:06.422084203 +0000 UTC m=+6.032785171,LastTimestamp:2026-03-08 03:45:26.493095089 +0000 UTC m=+26.103796077,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:45:27.040615 master-0 kubenswrapper[4045]: I0308 03:45:27.040562 4045 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is 
forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 03:45:27.307480 master-0 kubenswrapper[4045]: I0308 03:45:27.307275 4045 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_e9add8df47182fc2eaf8cd78016ebe72/kube-rbac-proxy-crio/2.log" Mar 08 03:45:27.308791 master-0 kubenswrapper[4045]: I0308 03:45:27.308669 4045 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_e9add8df47182fc2eaf8cd78016ebe72/kube-rbac-proxy-crio/1.log" Mar 08 03:45:27.309419 master-0 kubenswrapper[4045]: I0308 03:45:27.309328 4045 generic.go:334] "Generic (PLEG): container finished" podID="e9add8df47182fc2eaf8cd78016ebe72" containerID="1611cfa5e175032b10c844270b1926150f7a6bf4a58e7bfa0e9ab7a757d448fe" exitCode=1 Mar 08 03:45:27.309419 master-0 kubenswrapper[4045]: I0308 03:45:27.309395 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerDied","Data":"1611cfa5e175032b10c844270b1926150f7a6bf4a58e7bfa0e9ab7a757d448fe"} Mar 08 03:45:27.309594 master-0 kubenswrapper[4045]: I0308 03:45:27.309467 4045 scope.go:117] "RemoveContainer" containerID="78066375e971a13b6e4a7c9120903810bb4af202f86064fd0ff8ab99a3010659" Mar 08 03:45:27.309736 master-0 kubenswrapper[4045]: I0308 03:45:27.309705 4045 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:45:27.311903 master-0 kubenswrapper[4045]: I0308 03:45:27.311265 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:45:27.311903 master-0 kubenswrapper[4045]: I0308 03:45:27.311310 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" 
event="NodeHasNoDiskPressure" Mar 08 03:45:27.311903 master-0 kubenswrapper[4045]: I0308 03:45:27.311328 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:45:27.313527 master-0 kubenswrapper[4045]: I0308 03:45:27.312639 4045 scope.go:117] "RemoveContainer" containerID="1611cfa5e175032b10c844270b1926150f7a6bf4a58e7bfa0e9ab7a757d448fe" Mar 08 03:45:27.313527 master-0 kubenswrapper[4045]: E0308 03:45:27.312905 4045 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(e9add8df47182fc2eaf8cd78016ebe72)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="e9add8df47182fc2eaf8cd78016ebe72" Mar 08 03:45:27.320945 master-0 kubenswrapper[4045]: E0308 03:45:27.320741 4045 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189ac0e8fdc3aa6b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189ac0e8fdc3aa6b openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:BackOff,Message:Back-off restarting failed container kube-rbac-proxy-crio in pod kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(e9add8df47182fc2eaf8cd78016ebe72),Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:45:10.246115947 +0000 UTC m=+9.856816945,LastTimestamp:2026-03-08 
03:45:27.312858573 +0000 UTC m=+26.923559571,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:45:27.675989 master-0 kubenswrapper[4045]: E0308 03:45:27.675857 4045 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"master-0\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 08 03:45:27.958627 master-0 kubenswrapper[4045]: I0308 03:45:27.958529 4045 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:45:27.960699 master-0 kubenswrapper[4045]: I0308 03:45:27.960047 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:45:27.960699 master-0 kubenswrapper[4045]: I0308 03:45:27.960111 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:45:27.960699 master-0 kubenswrapper[4045]: I0308 03:45:27.960135 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:45:27.960699 master-0 kubenswrapper[4045]: I0308 03:45:27.960230 4045 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 08 03:45:27.967789 master-0 kubenswrapper[4045]: E0308 03:45:27.967739 4045 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="master-0" Mar 08 03:45:28.039713 master-0 kubenswrapper[4045]: I0308 03:45:28.039633 4045 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" 
in API group "storage.k8s.io" at the cluster scope Mar 08 03:45:28.314673 master-0 kubenswrapper[4045]: I0308 03:45:28.314547 4045 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_e9add8df47182fc2eaf8cd78016ebe72/kube-rbac-proxy-crio/2.log" Mar 08 03:45:29.040788 master-0 kubenswrapper[4045]: I0308 03:45:29.040656 4045 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 03:45:30.038051 master-0 kubenswrapper[4045]: I0308 03:45:30.037985 4045 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 03:45:30.070301 master-0 kubenswrapper[4045]: I0308 03:45:30.070199 4045 csr.go:257] certificate signing request csr-knlqm is issued Mar 08 03:45:30.915769 master-0 kubenswrapper[4045]: I0308 03:45:30.915710 4045 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 08 03:45:31.051429 master-0 kubenswrapper[4045]: I0308 03:45:31.051360 4045 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Mar 08 03:45:31.071153 master-0 kubenswrapper[4045]: I0308 03:45:31.071108 4045 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Mar 08 03:45:31.072966 master-0 kubenswrapper[4045]: I0308 03:45:31.072156 4045 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-03-09 03:37:12 +0000 UTC, rotation deadline is 2026-03-08 23:36:59.331336031 +0000 UTC Mar 08 03:45:31.073112 master-0 kubenswrapper[4045]: I0308 03:45:31.072919 4045 certificate_manager.go:356] 
kubernetes.io/kube-apiserver-client-kubelet: Waiting 19h51m28.258429739s for next certificate rotation Mar 08 03:45:31.128427 master-0 kubenswrapper[4045]: I0308 03:45:31.128374 4045 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Mar 08 03:45:31.190074 master-0 kubenswrapper[4045]: E0308 03:45:31.190037 4045 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found" Mar 08 03:45:31.399866 master-0 kubenswrapper[4045]: I0308 03:45:31.399806 4045 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Mar 08 03:45:31.399866 master-0 kubenswrapper[4045]: E0308 03:45:31.399864 4045 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "master-0" not found Mar 08 03:45:31.421092 master-0 kubenswrapper[4045]: I0308 03:45:31.421035 4045 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Mar 08 03:45:31.437290 master-0 kubenswrapper[4045]: I0308 03:45:31.437240 4045 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Mar 08 03:45:31.491568 master-0 kubenswrapper[4045]: I0308 03:45:31.491463 4045 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Mar 08 03:45:31.754989 master-0 kubenswrapper[4045]: I0308 03:45:31.754879 4045 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Mar 08 03:45:31.754989 master-0 kubenswrapper[4045]: E0308 03:45:31.754916 4045 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "master-0" not found Mar 08 03:45:31.853180 master-0 kubenswrapper[4045]: I0308 03:45:31.853110 4045 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Mar 08 03:45:31.868779 master-0 kubenswrapper[4045]: I0308 03:45:31.868711 4045 
nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Mar 08 03:45:31.927768 master-0 kubenswrapper[4045]: I0308 03:45:31.927707 4045 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Mar 08 03:45:32.183074 master-0 kubenswrapper[4045]: I0308 03:45:32.182988 4045 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Mar 08 03:45:32.183495 master-0 kubenswrapper[4045]: E0308 03:45:32.183479 4045 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "master-0" not found Mar 08 03:45:32.781625 master-0 kubenswrapper[4045]: I0308 03:45:32.781555 4045 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Mar 08 03:45:32.797202 master-0 kubenswrapper[4045]: I0308 03:45:32.797166 4045 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Mar 08 03:45:32.858160 master-0 kubenswrapper[4045]: I0308 03:45:32.858092 4045 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Mar 08 03:45:33.124996 master-0 kubenswrapper[4045]: I0308 03:45:33.124879 4045 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Mar 08 03:45:33.125264 master-0 kubenswrapper[4045]: E0308 03:45:33.125240 4045 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "master-0" not found Mar 08 03:45:34.681853 master-0 kubenswrapper[4045]: E0308 03:45:34.681754 4045 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"master-0\" not found" node="master-0" Mar 08 03:45:34.968594 master-0 kubenswrapper[4045]: I0308 03:45:34.968521 4045 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:45:34.970499 master-0 kubenswrapper[4045]: I0308 03:45:34.970453 4045 
kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:45:34.970727 master-0 kubenswrapper[4045]: I0308 03:45:34.970703 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:45:34.970946 master-0 kubenswrapper[4045]: I0308 03:45:34.970914 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:45:34.971233 master-0 kubenswrapper[4045]: I0308 03:45:34.971200 4045 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 08 03:45:34.985055 master-0 kubenswrapper[4045]: I0308 03:45:34.984928 4045 kubelet_node_status.go:79] "Successfully registered node" node="master-0" Mar 08 03:45:34.985055 master-0 kubenswrapper[4045]: E0308 03:45:34.985015 4045 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": node \"master-0\" not found" Mar 08 03:45:35.006697 master-0 kubenswrapper[4045]: E0308 03:45:35.006631 4045 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 08 03:45:35.059491 master-0 kubenswrapper[4045]: I0308 03:45:35.059411 4045 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 08 03:45:35.074709 master-0 kubenswrapper[4045]: I0308 03:45:35.074638 4045 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 08 03:45:35.107123 master-0 kubenswrapper[4045]: E0308 03:45:35.107052 4045 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 08 03:45:35.208224 master-0 kubenswrapper[4045]: E0308 03:45:35.208129 4045 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 08 03:45:35.308500 master-0 kubenswrapper[4045]: E0308 
03:45:35.308329 4045 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 08 03:45:35.409049 master-0 kubenswrapper[4045]: E0308 03:45:35.409003 4045 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 08 03:45:35.510020 master-0 kubenswrapper[4045]: E0308 03:45:35.509919 4045 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 08 03:45:35.610225 master-0 kubenswrapper[4045]: E0308 03:45:35.610060 4045 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 08 03:45:35.710962 master-0 kubenswrapper[4045]: E0308 03:45:35.710870 4045 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 08 03:45:35.811203 master-0 kubenswrapper[4045]: E0308 03:45:35.811136 4045 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 08 03:45:35.912350 master-0 kubenswrapper[4045]: E0308 03:45:35.912207 4045 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 08 03:45:36.013343 master-0 kubenswrapper[4045]: E0308 03:45:36.013249 4045 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 08 03:45:36.113605 master-0 kubenswrapper[4045]: E0308 03:45:36.113509 4045 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 08 03:45:36.214783 master-0 kubenswrapper[4045]: E0308 03:45:36.214713 4045 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 08 03:45:36.314967 master-0 kubenswrapper[4045]: E0308 03:45:36.314808 4045 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" 
Mar 08 03:45:36.415423 master-0 kubenswrapper[4045]: E0308 03:45:36.415322 4045 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 08 03:45:36.516545 master-0 kubenswrapper[4045]: E0308 03:45:36.516350 4045 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 08 03:45:36.617297 master-0 kubenswrapper[4045]: E0308 03:45:36.617228 4045 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 08 03:45:36.717807 master-0 kubenswrapper[4045]: E0308 03:45:36.717735 4045 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 08 03:45:36.818081 master-0 kubenswrapper[4045]: E0308 03:45:36.817910 4045 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 08 03:45:36.918920 master-0 kubenswrapper[4045]: E0308 03:45:36.918792 4045 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 08 03:45:36.969874 master-0 kubenswrapper[4045]: I0308 03:45:36.969783 4045 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 08 03:45:37.031258 master-0 kubenswrapper[4045]: I0308 03:45:37.031182 4045 apiserver.go:52] "Watching apiserver" Mar 08 03:45:37.038232 master-0 kubenswrapper[4045]: I0308 03:45:37.038163 4045 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 08 03:45:37.038507 master-0 kubenswrapper[4045]: I0308 03:45:37.038404 4045 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/network-operator-7c649bf6d4-99d2k"] Mar 08 03:45:37.038807 master-0 kubenswrapper[4045]: I0308 03:45:37.038767 4045 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-7c649bf6d4-99d2k" Mar 08 03:45:37.041584 master-0 kubenswrapper[4045]: I0308 03:45:37.041385 4045 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 08 03:45:37.043391 master-0 kubenswrapper[4045]: I0308 03:45:37.043338 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 08 03:45:37.044233 master-0 kubenswrapper[4045]: I0308 03:45:37.044063 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 08 03:45:37.077639 master-0 kubenswrapper[4045]: I0308 03:45:37.077558 4045 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["assisted-installer/assisted-installer-controller-66tqt"] Mar 08 03:45:37.078081 master-0 kubenswrapper[4045]: I0308 03:45:37.078062 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="assisted-installer/assisted-installer-controller-66tqt" Mar 08 03:45:37.080756 master-0 kubenswrapper[4045]: I0308 03:45:37.080701 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"assisted-installer"/"assisted-installer-controller-config" Mar 08 03:45:37.081717 master-0 kubenswrapper[4045]: I0308 03:45:37.081657 4045 reflector.go:368] Caches populated for *v1.Secret from object-"assisted-installer"/"assisted-installer-controller-secret" Mar 08 03:45:37.082330 master-0 kubenswrapper[4045]: I0308 03:45:37.082289 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"assisted-installer"/"kube-root-ca.crt" Mar 08 03:45:37.084758 master-0 kubenswrapper[4045]: I0308 03:45:37.084718 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"assisted-installer"/"openshift-service-ca.crt" Mar 08 03:45:37.136732 master-0 kubenswrapper[4045]: I0308 03:45:37.136674 4045 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-cluster-version/cluster-version-operator-745944c6b7-gvmnp"] Mar 08 03:45:37.137156 master-0 kubenswrapper[4045]: I0308 03:45:37.137118 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-745944c6b7-gvmnp" Mar 08 03:45:37.137694 master-0 kubenswrapper[4045]: I0308 03:45:37.137644 4045 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Mar 08 03:45:37.139615 master-0 kubenswrapper[4045]: I0308 03:45:37.139577 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 08 03:45:37.139787 master-0 kubenswrapper[4045]: I0308 03:45:37.139745 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 08 03:45:37.141250 master-0 kubenswrapper[4045]: I0308 03:45:37.141206 4045 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 08 03:45:37.203978 master-0 kubenswrapper[4045]: I0308 03:45:37.203890 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7-sno-bootstrap-files\") pod \"assisted-installer-controller-66tqt\" (UID: \"7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7\") " pod="assisted-installer/assisted-installer-controller-66tqt" Mar 08 03:45:37.203978 master-0 kubenswrapper[4045]: I0308 03:45:37.203938 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/349d438d-d124-4d34-a172-4160e766c680-service-ca\") pod \"cluster-version-operator-745944c6b7-gvmnp\" (UID: \"349d438d-d124-4d34-a172-4160e766c680\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-gvmnp" Mar 08 
03:45:37.203978 master-0 kubenswrapper[4045]: I0308 03:45:37.203962 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7-host-ca-bundle\") pod \"assisted-installer-controller-66tqt\" (UID: \"7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7\") " pod="assisted-installer/assisted-installer-controller-66tqt" Mar 08 03:45:37.204928 master-0 kubenswrapper[4045]: I0308 03:45:37.204053 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7-host-resolv-conf\") pod \"assisted-installer-controller-66tqt\" (UID: \"7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7\") " pod="assisted-installer/assisted-installer-controller-66tqt" Mar 08 03:45:37.204928 master-0 kubenswrapper[4045]: I0308 03:45:37.204111 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7-host-var-run-resolv-conf\") pod \"assisted-installer-controller-66tqt\" (UID: \"7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7\") " pod="assisted-installer/assisted-installer-controller-66tqt" Mar 08 03:45:37.204928 master-0 kubenswrapper[4045]: I0308 03:45:37.204154 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/349d438d-d124-4d34-a172-4160e766c680-kube-api-access\") pod \"cluster-version-operator-745944c6b7-gvmnp\" (UID: \"349d438d-d124-4d34-a172-4160e766c680\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-gvmnp" Mar 08 03:45:37.204928 master-0 kubenswrapper[4045]: I0308 03:45:37.204192 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-hf776\" (UniqueName: \"kubernetes.io/projected/7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7-kube-api-access-hf776\") pod \"assisted-installer-controller-66tqt\" (UID: \"7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7\") " pod="assisted-installer/assisted-installer-controller-66tqt" Mar 08 03:45:37.204928 master-0 kubenswrapper[4045]: I0308 03:45:37.204266 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/349d438d-d124-4d34-a172-4160e766c680-etc-ssl-certs\") pod \"cluster-version-operator-745944c6b7-gvmnp\" (UID: \"349d438d-d124-4d34-a172-4160e766c680\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-gvmnp" Mar 08 03:45:37.204928 master-0 kubenswrapper[4045]: I0308 03:45:37.204300 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/349d438d-d124-4d34-a172-4160e766c680-etc-cvo-updatepayloads\") pod \"cluster-version-operator-745944c6b7-gvmnp\" (UID: \"349d438d-d124-4d34-a172-4160e766c680\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-gvmnp" Mar 08 03:45:37.204928 master-0 kubenswrapper[4045]: I0308 03:45:37.204337 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/349d438d-d124-4d34-a172-4160e766c680-serving-cert\") pod \"cluster-version-operator-745944c6b7-gvmnp\" (UID: \"349d438d-d124-4d34-a172-4160e766c680\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-gvmnp" Mar 08 03:45:37.204928 master-0 kubenswrapper[4045]: I0308 03:45:37.204456 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3ddfd0e7-fe76-41bc-b316-94505df81002-metrics-tls\") pod \"network-operator-7c649bf6d4-99d2k\" 
(UID: \"3ddfd0e7-fe76-41bc-b316-94505df81002\") " pod="openshift-network-operator/network-operator-7c649bf6d4-99d2k" Mar 08 03:45:37.204928 master-0 kubenswrapper[4045]: I0308 03:45:37.204570 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgc7c\" (UniqueName: \"kubernetes.io/projected/3ddfd0e7-fe76-41bc-b316-94505df81002-kube-api-access-bgc7c\") pod \"network-operator-7c649bf6d4-99d2k\" (UID: \"3ddfd0e7-fe76-41bc-b316-94505df81002\") " pod="openshift-network-operator/network-operator-7c649bf6d4-99d2k" Mar 08 03:45:37.204928 master-0 kubenswrapper[4045]: I0308 03:45:37.204668 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/3ddfd0e7-fe76-41bc-b316-94505df81002-host-etc-kube\") pod \"network-operator-7c649bf6d4-99d2k\" (UID: \"3ddfd0e7-fe76-41bc-b316-94505df81002\") " pod="openshift-network-operator/network-operator-7c649bf6d4-99d2k" Mar 08 03:45:37.305288 master-0 kubenswrapper[4045]: I0308 03:45:37.305227 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/3ddfd0e7-fe76-41bc-b316-94505df81002-host-etc-kube\") pod \"network-operator-7c649bf6d4-99d2k\" (UID: \"3ddfd0e7-fe76-41bc-b316-94505df81002\") " pod="openshift-network-operator/network-operator-7c649bf6d4-99d2k" Mar 08 03:45:37.305618 master-0 kubenswrapper[4045]: I0308 03:45:37.305553 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/3ddfd0e7-fe76-41bc-b316-94505df81002-host-etc-kube\") pod \"network-operator-7c649bf6d4-99d2k\" (UID: \"3ddfd0e7-fe76-41bc-b316-94505df81002\") " pod="openshift-network-operator/network-operator-7c649bf6d4-99d2k" Mar 08 03:45:37.305764 master-0 kubenswrapper[4045]: I0308 03:45:37.305735 4045 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7-sno-bootstrap-files\") pod \"assisted-installer-controller-66tqt\" (UID: \"7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7\") " pod="assisted-installer/assisted-installer-controller-66tqt" Mar 08 03:45:37.305942 master-0 kubenswrapper[4045]: I0308 03:45:37.305798 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7-sno-bootstrap-files\") pod \"assisted-installer-controller-66tqt\" (UID: \"7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7\") " pod="assisted-installer/assisted-installer-controller-66tqt" Mar 08 03:45:37.306177 master-0 kubenswrapper[4045]: I0308 03:45:37.306069 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/349d438d-d124-4d34-a172-4160e766c680-service-ca\") pod \"cluster-version-operator-745944c6b7-gvmnp\" (UID: \"349d438d-d124-4d34-a172-4160e766c680\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-gvmnp" Mar 08 03:45:37.306177 master-0 kubenswrapper[4045]: I0308 03:45:37.306166 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7-host-ca-bundle\") pod \"assisted-installer-controller-66tqt\" (UID: \"7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7\") " pod="assisted-installer/assisted-installer-controller-66tqt" Mar 08 03:45:37.306320 master-0 kubenswrapper[4045]: I0308 03:45:37.306254 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7-host-ca-bundle\") pod \"assisted-installer-controller-66tqt\" (UID: \"7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7\") " 
pod="assisted-installer/assisted-installer-controller-66tqt" Mar 08 03:45:37.306320 master-0 kubenswrapper[4045]: I0308 03:45:37.306311 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7-host-resolv-conf\") pod \"assisted-installer-controller-66tqt\" (UID: \"7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7\") " pod="assisted-installer/assisted-installer-controller-66tqt" Mar 08 03:45:37.306485 master-0 kubenswrapper[4045]: I0308 03:45:37.306347 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7-host-var-run-resolv-conf\") pod \"assisted-installer-controller-66tqt\" (UID: \"7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7\") " pod="assisted-installer/assisted-installer-controller-66tqt" Mar 08 03:45:37.306485 master-0 kubenswrapper[4045]: I0308 03:45:37.306382 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hf776\" (UniqueName: \"kubernetes.io/projected/7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7-kube-api-access-hf776\") pod \"assisted-installer-controller-66tqt\" (UID: \"7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7\") " pod="assisted-installer/assisted-installer-controller-66tqt" Mar 08 03:45:37.306485 master-0 kubenswrapper[4045]: I0308 03:45:37.306430 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/349d438d-d124-4d34-a172-4160e766c680-etc-ssl-certs\") pod \"cluster-version-operator-745944c6b7-gvmnp\" (UID: \"349d438d-d124-4d34-a172-4160e766c680\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-gvmnp" Mar 08 03:45:37.306739 master-0 kubenswrapper[4045]: I0308 03:45:37.306500 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" 
(UniqueName: \"kubernetes.io/host-path/349d438d-d124-4d34-a172-4160e766c680-etc-cvo-updatepayloads\") pod \"cluster-version-operator-745944c6b7-gvmnp\" (UID: \"349d438d-d124-4d34-a172-4160e766c680\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-gvmnp" Mar 08 03:45:37.306739 master-0 kubenswrapper[4045]: I0308 03:45:37.306534 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/349d438d-d124-4d34-a172-4160e766c680-serving-cert\") pod \"cluster-version-operator-745944c6b7-gvmnp\" (UID: \"349d438d-d124-4d34-a172-4160e766c680\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-gvmnp" Mar 08 03:45:37.306739 master-0 kubenswrapper[4045]: I0308 03:45:37.306564 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/349d438d-d124-4d34-a172-4160e766c680-kube-api-access\") pod \"cluster-version-operator-745944c6b7-gvmnp\" (UID: \"349d438d-d124-4d34-a172-4160e766c680\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-gvmnp" Mar 08 03:45:37.306739 master-0 kubenswrapper[4045]: I0308 03:45:37.306604 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3ddfd0e7-fe76-41bc-b316-94505df81002-metrics-tls\") pod \"network-operator-7c649bf6d4-99d2k\" (UID: \"3ddfd0e7-fe76-41bc-b316-94505df81002\") " pod="openshift-network-operator/network-operator-7c649bf6d4-99d2k" Mar 08 03:45:37.306739 master-0 kubenswrapper[4045]: I0308 03:45:37.306640 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgc7c\" (UniqueName: \"kubernetes.io/projected/3ddfd0e7-fe76-41bc-b316-94505df81002-kube-api-access-bgc7c\") pod \"network-operator-7c649bf6d4-99d2k\" (UID: \"3ddfd0e7-fe76-41bc-b316-94505df81002\") " 
pod="openshift-network-operator/network-operator-7c649bf6d4-99d2k" Mar 08 03:45:37.307200 master-0 kubenswrapper[4045]: I0308 03:45:37.307153 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7-host-resolv-conf\") pod \"assisted-installer-controller-66tqt\" (UID: \"7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7\") " pod="assisted-installer/assisted-installer-controller-66tqt" Mar 08 03:45:37.307394 master-0 kubenswrapper[4045]: I0308 03:45:37.307360 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/349d438d-d124-4d34-a172-4160e766c680-etc-cvo-updatepayloads\") pod \"cluster-version-operator-745944c6b7-gvmnp\" (UID: \"349d438d-d124-4d34-a172-4160e766c680\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-gvmnp" Mar 08 03:45:37.307643 master-0 kubenswrapper[4045]: E0308 03:45:37.307572 4045 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 08 03:45:37.307643 master-0 kubenswrapper[4045]: I0308 03:45:37.307616 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/349d438d-d124-4d34-a172-4160e766c680-service-ca\") pod \"cluster-version-operator-745944c6b7-gvmnp\" (UID: \"349d438d-d124-4d34-a172-4160e766c680\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-gvmnp" Mar 08 03:45:37.307795 master-0 kubenswrapper[4045]: E0308 03:45:37.307770 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/349d438d-d124-4d34-a172-4160e766c680-serving-cert podName:349d438d-d124-4d34-a172-4160e766c680 nodeName:}" failed. No retries permitted until 2026-03-08 03:45:37.807650676 +0000 UTC m=+37.418351674 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/349d438d-d124-4d34-a172-4160e766c680-serving-cert") pod "cluster-version-operator-745944c6b7-gvmnp" (UID: "349d438d-d124-4d34-a172-4160e766c680") : secret "cluster-version-operator-serving-cert" not found Mar 08 03:45:37.308152 master-0 kubenswrapper[4045]: I0308 03:45:37.308120 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7-host-var-run-resolv-conf\") pod \"assisted-installer-controller-66tqt\" (UID: \"7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7\") " pod="assisted-installer/assisted-installer-controller-66tqt" Mar 08 03:45:37.308300 master-0 kubenswrapper[4045]: I0308 03:45:37.308235 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/349d438d-d124-4d34-a172-4160e766c680-etc-ssl-certs\") pod \"cluster-version-operator-745944c6b7-gvmnp\" (UID: \"349d438d-d124-4d34-a172-4160e766c680\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-gvmnp" Mar 08 03:45:37.309538 master-0 kubenswrapper[4045]: I0308 03:45:37.309238 4045 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 08 03:45:37.318358 master-0 kubenswrapper[4045]: I0308 03:45:37.318270 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3ddfd0e7-fe76-41bc-b316-94505df81002-metrics-tls\") pod \"network-operator-7c649bf6d4-99d2k\" (UID: \"3ddfd0e7-fe76-41bc-b316-94505df81002\") " pod="openshift-network-operator/network-operator-7c649bf6d4-99d2k" Mar 08 03:45:37.339640 master-0 kubenswrapper[4045]: I0308 03:45:37.339404 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/349d438d-d124-4d34-a172-4160e766c680-kube-api-access\") pod \"cluster-version-operator-745944c6b7-gvmnp\" (UID: \"349d438d-d124-4d34-a172-4160e766c680\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-gvmnp" Mar 08 03:45:37.339917 master-0 kubenswrapper[4045]: I0308 03:45:37.339764 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgc7c\" (UniqueName: \"kubernetes.io/projected/3ddfd0e7-fe76-41bc-b316-94505df81002-kube-api-access-bgc7c\") pod \"network-operator-7c649bf6d4-99d2k\" (UID: \"3ddfd0e7-fe76-41bc-b316-94505df81002\") " pod="openshift-network-operator/network-operator-7c649bf6d4-99d2k" Mar 08 03:45:37.343718 master-0 kubenswrapper[4045]: I0308 03:45:37.343615 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hf776\" (UniqueName: \"kubernetes.io/projected/7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7-kube-api-access-hf776\") pod \"assisted-installer-controller-66tqt\" (UID: \"7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7\") " pod="assisted-installer/assisted-installer-controller-66tqt" Mar 08 03:45:37.359221 master-0 kubenswrapper[4045]: I0308 03:45:37.359168 4045 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-7c649bf6d4-99d2k" Mar 08 03:45:37.411855 master-0 kubenswrapper[4045]: I0308 03:45:37.411475 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="assisted-installer/assisted-installer-controller-66tqt" Mar 08 03:45:37.432351 master-0 kubenswrapper[4045]: W0308 03:45:37.432275 4045 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d1b9b65_dba7_48fc_bc59_faa8f3cfcca7.slice/crio-d29086141609fa12579213578ed2d780ee581ff60e20ceb99a14fefd44548805 WatchSource:0}: Error finding container d29086141609fa12579213578ed2d780ee581ff60e20ceb99a14fefd44548805: Status 404 returned error can't find the container with id d29086141609fa12579213578ed2d780ee581ff60e20ceb99a14fefd44548805 Mar 08 03:45:37.810619 master-0 kubenswrapper[4045]: I0308 03:45:37.810560 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/349d438d-d124-4d34-a172-4160e766c680-serving-cert\") pod \"cluster-version-operator-745944c6b7-gvmnp\" (UID: \"349d438d-d124-4d34-a172-4160e766c680\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-gvmnp" Mar 08 03:45:37.811671 master-0 kubenswrapper[4045]: E0308 03:45:37.810879 4045 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 08 03:45:37.811942 master-0 kubenswrapper[4045]: E0308 03:45:37.811917 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/349d438d-d124-4d34-a172-4160e766c680-serving-cert podName:349d438d-d124-4d34-a172-4160e766c680 nodeName:}" failed. No retries permitted until 2026-03-08 03:45:38.811840005 +0000 UTC m=+38.422541003 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/349d438d-d124-4d34-a172-4160e766c680-serving-cert") pod "cluster-version-operator-745944c6b7-gvmnp" (UID: "349d438d-d124-4d34-a172-4160e766c680") : secret "cluster-version-operator-serving-cert" not found Mar 08 03:45:38.344851 master-0 kubenswrapper[4045]: I0308 03:45:38.344788 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7c649bf6d4-99d2k" event={"ID":"3ddfd0e7-fe76-41bc-b316-94505df81002","Type":"ContainerStarted","Data":"22cc81f0c9d90fe64f682c3bbb7bbcefc904c4ee2c036d7eedf6b66887f69fae"} Mar 08 03:45:38.346676 master-0 kubenswrapper[4045]: I0308 03:45:38.346577 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="assisted-installer/assisted-installer-controller-66tqt" event={"ID":"7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7","Type":"ContainerStarted","Data":"d29086141609fa12579213578ed2d780ee581ff60e20ceb99a14fefd44548805"} Mar 08 03:45:38.634240 master-0 kubenswrapper[4045]: I0308 03:45:38.634089 4045 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 08 03:45:38.819943 master-0 kubenswrapper[4045]: I0308 03:45:38.819759 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/349d438d-d124-4d34-a172-4160e766c680-serving-cert\") pod \"cluster-version-operator-745944c6b7-gvmnp\" (UID: \"349d438d-d124-4d34-a172-4160e766c680\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-gvmnp" Mar 08 03:45:38.820887 master-0 kubenswrapper[4045]: E0308 03:45:38.820052 4045 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 08 03:45:38.820887 master-0 kubenswrapper[4045]: E0308 03:45:38.820207 4045 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/349d438d-d124-4d34-a172-4160e766c680-serving-cert podName:349d438d-d124-4d34-a172-4160e766c680 nodeName:}" failed. No retries permitted until 2026-03-08 03:45:40.820166302 +0000 UTC m=+40.430867300 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/349d438d-d124-4d34-a172-4160e766c680-serving-cert") pod "cluster-version-operator-745944c6b7-gvmnp" (UID: "349d438d-d124-4d34-a172-4160e766c680") : secret "cluster-version-operator-serving-cert" not found Mar 08 03:45:40.616094 master-0 kubenswrapper[4045]: I0308 03:45:40.616029 4045 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 08 03:45:40.835068 master-0 kubenswrapper[4045]: I0308 03:45:40.834981 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/349d438d-d124-4d34-a172-4160e766c680-serving-cert\") pod \"cluster-version-operator-745944c6b7-gvmnp\" (UID: \"349d438d-d124-4d34-a172-4160e766c680\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-gvmnp" Mar 08 03:45:40.835298 master-0 kubenswrapper[4045]: E0308 03:45:40.835241 4045 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 08 03:45:40.835453 master-0 kubenswrapper[4045]: E0308 03:45:40.835428 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/349d438d-d124-4d34-a172-4160e766c680-serving-cert podName:349d438d-d124-4d34-a172-4160e766c680 nodeName:}" failed. No retries permitted until 2026-03-08 03:45:44.835391153 +0000 UTC m=+44.446092101 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/349d438d-d124-4d34-a172-4160e766c680-serving-cert") pod "cluster-version-operator-745944c6b7-gvmnp" (UID: "349d438d-d124-4d34-a172-4160e766c680") : secret "cluster-version-operator-serving-cert" not found Mar 08 03:45:41.642600 master-0 kubenswrapper[4045]: I0308 03:45:41.638628 4045 csr.go:261] certificate signing request csr-c477h is approved, waiting to be issued Mar 08 03:45:41.648880 master-0 kubenswrapper[4045]: I0308 03:45:41.648614 4045 csr.go:257] certificate signing request csr-c477h is issued Mar 08 03:45:42.210972 master-0 kubenswrapper[4045]: I0308 03:45:42.210910 4045 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"] Mar 08 03:45:42.211349 master-0 kubenswrapper[4045]: I0308 03:45:42.211317 4045 scope.go:117] "RemoveContainer" containerID="1611cfa5e175032b10c844270b1926150f7a6bf4a58e7bfa0e9ab7a757d448fe" Mar 08 03:45:42.211593 master-0 kubenswrapper[4045]: E0308 03:45:42.211566 4045 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(e9add8df47182fc2eaf8cd78016ebe72)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="e9add8df47182fc2eaf8cd78016ebe72" Mar 08 03:45:42.355505 master-0 kubenswrapper[4045]: I0308 03:45:42.355462 4045 scope.go:117] "RemoveContainer" containerID="1611cfa5e175032b10c844270b1926150f7a6bf4a58e7bfa0e9ab7a757d448fe" Mar 08 03:45:42.355731 master-0 kubenswrapper[4045]: E0308 03:45:42.355636 4045 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-rbac-proxy-crio 
pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(e9add8df47182fc2eaf8cd78016ebe72)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="e9add8df47182fc2eaf8cd78016ebe72" Mar 08 03:45:42.650406 master-0 kubenswrapper[4045]: I0308 03:45:42.650273 4045 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-03-09 03:37:12 +0000 UTC, rotation deadline is 2026-03-09 00:58:59.127517373 +0000 UTC Mar 08 03:45:42.650406 master-0 kubenswrapper[4045]: I0308 03:45:42.650322 4045 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 21h13m16.477200006s for next certificate rotation Mar 08 03:45:43.003131 master-0 kubenswrapper[4045]: E0308 03:45:43.003073 4045 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 03:45:43.003131 master-0 kubenswrapper[4045]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a149ed17b20a7577fceacfc5198f8b7b3edf314ee22f77bd6ab87f06a3aa17f3,Command:[/bin/bash -c #!/bin/bash Mar 08 03:45:43.003131 master-0 kubenswrapper[4045]: set -o allexport Mar 08 03:45:43.003131 master-0 kubenswrapper[4045]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 08 03:45:43.003131 master-0 kubenswrapper[4045]: source /etc/kubernetes/apiserver-url.env Mar 08 03:45:43.003131 master-0 kubenswrapper[4045]: else Mar 08 03:45:43.003131 master-0 kubenswrapper[4045]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 08 03:45:43.003131 master-0 kubenswrapper[4045]: exit 1 Mar 08 03:45:43.003131 master-0 kubenswrapper[4045]: fi Mar 08 03:45:43.003131 master-0 kubenswrapper[4045]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 08 03:45:43.003131 master-0 kubenswrapper[4045]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.34,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9242604e78efada5aeb232d73a7963f806b754213f5d92b1dffc9b493d7b5a65,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8677f7a973553c25d282bc249fc8bc0f5aa42fb144ea0956d1f04c5a6cd80501,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:381e96959e3c3b08a3e2715e6024697ae14af31bd0378b49f583e984b3b9a192,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5230462066ab36e3025524e948dd33fa6f51ee29a4f91fa469bfc268568b5fd9,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0c03cb25dc6f6a865529ebc979e8d7d08492b28fd3fb93beddf30e1cb06f1245,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ace4dcd008420277d915fe983b07bbb50fb3ab0673f28d0166424a75bc2137e7,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e207c762b7802ee0e54507d21ed1f25b19eddc511a4b824934c16c163193be6a,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8f0fda36e9a2040dbe0537361dcd73658df4e669d846f8101a8f9f29f0be9a7,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b19b9d0e5437b0bb19cafc3fb516f654c911cdf11184c0de9a27b43c6b80c9ce,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:82f121f9d021a9843b9458f9f222c40f292f2c21dcfcf00f05daacaca8a949c0,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,}
,EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ac6f0695d3386e6d601f4ae507940981352fa3ad884b0fed6fb25698c5e6f916,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:89cb093f319eaa04acfe9431b8697bffbc71ab670546f7ed257daa332165c626,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a149ed17b20a7577fceacfc5198f8b7b3edf314ee22f77bd6ab87f06a3aa17f3,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a149ed17b20a7577fceacfc5198f8b7b3edf314ee22f77bd6ab87f06a3aa17f3,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a149ed17b20a7577fceacfc5198f8b7b3edf314ee22f77bd6ab87f06a3aa17f3,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3aa7c84e73a2a19cc9baca38b7e86dfcde579aa88221647c332c83f047d5ae6d,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1575be013a898f153cbf012aeaf28ce720022f934dc05bdffbe479e30999d460,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5bfe4d3125d98cc501d5a529d3ae2497106a2bbb5a6dd06df7c0e0930d168212,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b62afe74fdcb011a4a8c8fa5572dbab2514dda673ae4be4c6beaef92d28216ba,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceF
ieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bgc7c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-7c649bf6d4-99d2k_openshift-network-operator(3ddfd0e7-fe76-41bc-b316-94505df81002): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 08 03:45:43.003131 master-0 kubenswrapper[4045]: > logger="UnhandledError" Mar 08 03:45:43.004278 master-0 kubenswrapper[4045]: E0308 03:45:43.004243 4045 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-7c649bf6d4-99d2k" podUID="3ddfd0e7-fe76-41bc-b316-94505df81002" Mar 08 03:45:43.038089 master-0 kubenswrapper[4045]: E0308 03:45:43.038011 4045 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:assisted-installer-controller,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9c946fdc5a4cd16ff998c17844780e7efc38f7f38b97a8a40d75cd77b318ddef,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CLUSTER_ID,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:assisted-installer-controller-config,},Key:cluster-id,Optional:nil,},SecretKeyRef:nil,},},EnvVar{Name:INVENTORY_URL,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:assisted-installer-controller-config,},Key:inventory-url,Optional:nil,},SecretKeyRef:nil,},},EnvVar{Name:PULL_SECRET_TOKEN,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:assisted-installer-controller-secret,},Key:pull-secret-token,Optional:nil,},},},EnvVar{Name:CA_CERT_PATH,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:assisted-installer-controller-config,},Key:ca-cert-path,Optional:*true,},SecretKeyRef:nil,},},EnvVar{Name:SKIP_CERT_VERIFICATION,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:assisted-installer-controller-config,},Key:skip-cert-verification,Optional:*true,},SecretKeyRef:nil,},},EnvVar{Name:OPENSHIFT_VERSION,Value:4.18.34,ValueFrom:nil,},EnvVar{Name:NOTIFY_NUM_REBOOTS,Value:true,ValueFrom:nil,},EnvVar{Name:HIGH_AVAILABILITY_MODE,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:assisted-installer-controller-config,},Key:high-availability-mode,Optional:*true,},SecretKeyRe
f:nil,},},EnvVar{Name:CHECK_CLUSTER_VERSION,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:assisted-installer-controller-config,},Key:check-cluster-version,Optional:*true,},SecretKeyRef:nil,},},EnvVar{Name:MUST_GATHER_IMAGE,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:assisted-installer-controller-config,},Key:must-gather-image,Optional:*true,},SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-ca-bundle,ReadOnly:false,MountPath:/etc/pki,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-run-resolv-conf,ReadOnly:false,MountPath:/tmp/var-run-resolv.conf,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-resolv-conf,ReadOnly:false,MountPath:/tmp/host-resolv.conf,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:sno-bootstrap-files,ReadOnly:false,MountPath:/tmp/bootstrap-secrets,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hf776,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[KILL MKNOD SETGID 
SETUID],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000120000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod assisted-installer-controller-66tqt_assisted-installer(7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 08 03:45:43.039751 master-0 kubenswrapper[4045]: E0308 03:45:43.039295 4045 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"assisted-installer-controller\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="assisted-installer/assisted-installer-controller-66tqt" podUID="7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7" Mar 08 03:45:43.366601 master-0 kubenswrapper[4045]: E0308 03:45:43.366491 4045 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 03:45:43.366601 master-0 kubenswrapper[4045]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a149ed17b20a7577fceacfc5198f8b7b3edf314ee22f77bd6ab87f06a3aa17f3,Command:[/bin/bash -c #!/bin/bash Mar 08 03:45:43.366601 master-0 kubenswrapper[4045]: set -o allexport Mar 08 03:45:43.366601 master-0 kubenswrapper[4045]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 08 03:45:43.366601 master-0 kubenswrapper[4045]: source /etc/kubernetes/apiserver-url.env Mar 08 03:45:43.366601 master-0 kubenswrapper[4045]: else Mar 08 03:45:43.366601 master-0 kubenswrapper[4045]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 08 03:45:43.366601 master-0 kubenswrapper[4045]: exit 1 Mar 08 03:45:43.366601 
master-0 kubenswrapper[4045]: fi Mar 08 03:45:43.366601 master-0 kubenswrapper[4045]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 08 03:45:43.366601 master-0 kubenswrapper[4045]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.34,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9242604e78efada5aeb232d73a7963f806b754213f5d92b1dffc9b493d7b5a65,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8677f7a973553c25d282bc249fc8bc0f5aa42fb144ea0956d1f04c5a6cd80501,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:381e96959e3c3b08a3e2715e6024697ae14af31bd0378b49f583e984b3b9a192,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5230462066ab36e3025524e948dd33fa6f51ee29a4f91fa469bfc268568b5fd9,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0c03cb25dc6f6a865529ebc979e8d7d08492b28fd3fb93beddf30e1cb06f1245,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ace4dcd008420277d915fe983b07bbb50fb3ab0673f28d0166424a75bc2137e7,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e207c762b7802ee0e54507d21ed1f25b19eddc511a4b824934c16c163193be6a,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8f0fda36e9a2040dbe0537361dcd73658df4e669d846f8101a8f9f29f0be9a7,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b19b9d0e5437b0bb19cafc3fb516f654c911cdf11184c0de9a27b43c6b80c9ce,ValueFrom:nil,},EnvVar{Name:OVN_IMAG
E,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:82f121f9d021a9843b9458f9f222c40f292f2c21dcfcf00f05daacaca8a949c0,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ac6f0695d3386e6d601f4ae507940981352fa3ad884b0fed6fb25698c5e6f916,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:89cb093f319eaa04acfe9431b8697bffbc71ab670546f7ed257daa332165c626,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a149ed17b20a7577fceacfc5198f8b7b3edf314ee22f77bd6ab87f06a3aa17f3,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a149ed17b20a7577fceacfc5198f8b7b3edf314ee22f77bd6ab87f06a3aa17f3,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a149ed17b20a7577fceacfc5198f8b7b3edf314ee22f77bd6ab87f06a3aa17f3,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3aa7c84e73a2a19cc9baca38b7e86dfcde579aa88221647c332c83f047d5ae6d,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1575be013a898f153cbf012aeaf28ce720022f934dc05bdffbe479e30999d460,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5bfe4d3125d98cc501d5a529d3ae2497106a2bbb5a6dd06df7c0e0930d168212,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b62afe
74fdcb011a4a8c8fa5572dbab2514dda673ae4be4c6beaef92d28216ba,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bgc7c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-7c649bf6d4-99d2k_openshift-network-operator(3ddfd0e7-fe76-41bc-b316-94505df81002): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 08 03:45:43.366601 master-0 kubenswrapper[4045]: > logger="UnhandledError" Mar 08 03:45:43.367460 master-0 kubenswrapper[4045]: E0308 03:45:43.366553 4045 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:assisted-installer-controller,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9c946fdc5a4cd16ff998c17844780e7efc38f7f38b97a8a40d75cd77b318ddef,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CLUSTER_ID,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:assisted-installer-controller-config,},Key:cluster-id,Optional:nil,},SecretKeyRef:nil,},},EnvVar{Name:INVENTORY_URL,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:assisted-installer-controller-config,},Key:inventory-url,Optional:nil,},SecretKeyRef:nil,},},EnvVar{Name:PULL_SECRET_TOKEN,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:assisted-installer-controller-secret,},Key:pull-secret-token,Optional:nil,},},},EnvVar{Name:CA_CERT_PATH,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:assisted-installer-controller-config,},Key:ca-cert-path,Optional:*true,},SecretKeyRef:nil,},},EnvVar{Name:SKIP_CERT_VERIFICATION,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:assisted-installer-controller-config,},Key:skip-cert-verification,Optional:*true,},SecretKeyRef:nil,},},EnvVar{Name:OPENSHIFT_VERSION,Value:4.18.34,ValueFrom:nil,},EnvVar{Name:NOTIFY_NUM_REBOOTS,Value:true,ValueFrom:nil,},EnvVar{Name:HIGH_AVAILABILITY_MODE,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:assisted-installer-controller-config,},Key:high-availability-mode,Optional:*true,},SecretKeyRe
f:nil,},},EnvVar{Name:CHECK_CLUSTER_VERSION,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:assisted-installer-controller-config,},Key:check-cluster-version,Optional:*true,},SecretKeyRef:nil,},},EnvVar{Name:MUST_GATHER_IMAGE,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:assisted-installer-controller-config,},Key:must-gather-image,Optional:*true,},SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-ca-bundle,ReadOnly:false,MountPath:/etc/pki,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-run-resolv-conf,ReadOnly:false,MountPath:/tmp/var-run-resolv.conf,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-resolv-conf,ReadOnly:false,MountPath:/tmp/host-resolv.conf,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:sno-bootstrap-files,ReadOnly:false,MountPath:/tmp/bootstrap-secrets,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hf776,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[KILL MKNOD SETGID 
SETUID],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000120000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod assisted-installer-controller-66tqt_assisted-installer(7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 08 03:45:43.367982 master-0 kubenswrapper[4045]: E0308 03:45:43.367901 4045 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"assisted-installer-controller\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="assisted-installer/assisted-installer-controller-66tqt" podUID="7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7" Mar 08 03:45:43.367982 master-0 kubenswrapper[4045]: E0308 03:45:43.367935 4045 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-7c649bf6d4-99d2k" podUID="3ddfd0e7-fe76-41bc-b316-94505df81002" Mar 08 03:45:43.650680 master-0 kubenswrapper[4045]: I0308 03:45:43.650504 4045 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-03-09 03:37:12 +0000 UTC, rotation deadline is 2026-03-08 23:46:24.681686615 +0000 UTC Mar 08 03:45:43.650680 master-0 kubenswrapper[4045]: I0308 03:45:43.650566 4045 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 20h0m41.031126051s for next certificate rotation Mar 08 03:45:44.867397 master-0 
kubenswrapper[4045]: I0308 03:45:44.867294 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/349d438d-d124-4d34-a172-4160e766c680-serving-cert\") pod \"cluster-version-operator-745944c6b7-gvmnp\" (UID: \"349d438d-d124-4d34-a172-4160e766c680\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-gvmnp" Mar 08 03:45:44.868326 master-0 kubenswrapper[4045]: E0308 03:45:44.867488 4045 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 08 03:45:44.868326 master-0 kubenswrapper[4045]: E0308 03:45:44.867591 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/349d438d-d124-4d34-a172-4160e766c680-serving-cert podName:349d438d-d124-4d34-a172-4160e766c680 nodeName:}" failed. No retries permitted until 2026-03-08 03:45:52.86755875 +0000 UTC m=+52.478259748 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/349d438d-d124-4d34-a172-4160e766c680-serving-cert") pod "cluster-version-operator-745944c6b7-gvmnp" (UID: "349d438d-d124-4d34-a172-4160e766c680") : secret "cluster-version-operator-serving-cert" not found Mar 08 03:45:45.127369 master-0 kubenswrapper[4045]: I0308 03:45:45.127218 4045 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 08 03:45:52.927078 master-0 kubenswrapper[4045]: I0308 03:45:52.926895 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/349d438d-d124-4d34-a172-4160e766c680-serving-cert\") pod \"cluster-version-operator-745944c6b7-gvmnp\" (UID: \"349d438d-d124-4d34-a172-4160e766c680\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-gvmnp" Mar 08 03:45:52.927078 master-0 kubenswrapper[4045]: E0308 03:45:52.927079 4045 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 08 03:45:52.929423 master-0 kubenswrapper[4045]: E0308 03:45:52.927168 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/349d438d-d124-4d34-a172-4160e766c680-serving-cert podName:349d438d-d124-4d34-a172-4160e766c680 nodeName:}" failed. No retries permitted until 2026-03-08 03:46:08.927140195 +0000 UTC m=+68.537841193 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/349d438d-d124-4d34-a172-4160e766c680-serving-cert") pod "cluster-version-operator-745944c6b7-gvmnp" (UID: "349d438d-d124-4d34-a172-4160e766c680") : secret "cluster-version-operator-serving-cert" not found Mar 08 03:45:54.383910 master-0 kubenswrapper[4045]: I0308 03:45:54.383796 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="assisted-installer/assisted-installer-controller-66tqt" event={"ID":"7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7","Type":"ContainerStarted","Data":"d11b35d3ea3d0150cbdfe887feb70180d8c9d1802a844e12699e549dc588011a"} Mar 08 03:45:54.398763 master-0 kubenswrapper[4045]: I0308 03:45:54.398679 4045 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="assisted-installer/assisted-installer-controller-66tqt" podStartSLOduration=285.79751122 podStartE2EDuration="4m51.398655183s" podCreationTimestamp="2026-03-08 03:41:03 +0000 UTC" firstStartedPulling="2026-03-08 03:45:37.436557992 +0000 UTC m=+37.047258990" lastFinishedPulling="2026-03-08 03:45:43.037701995 +0000 UTC m=+42.648402953" observedRunningTime="2026-03-08 03:45:54.398276953 +0000 UTC m=+54.008977951" watchObservedRunningTime="2026-03-08 03:45:54.398655183 +0000 UTC m=+54.009356171" Mar 08 03:45:55.387675 master-0 kubenswrapper[4045]: I0308 03:45:55.387239 4045 generic.go:334] "Generic (PLEG): container finished" podID="7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7" containerID="d11b35d3ea3d0150cbdfe887feb70180d8c9d1802a844e12699e549dc588011a" exitCode=0 Mar 08 03:45:55.388343 master-0 kubenswrapper[4045]: I0308 03:45:55.387355 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="assisted-installer/assisted-installer-controller-66tqt" event={"ID":"7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7","Type":"ContainerDied","Data":"d11b35d3ea3d0150cbdfe887feb70180d8c9d1802a844e12699e549dc588011a"} Mar 08 03:45:55.390569 master-0 kubenswrapper[4045]: I0308 03:45:55.390525 4045 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7c649bf6d4-99d2k" event={"ID":"3ddfd0e7-fe76-41bc-b316-94505df81002","Type":"ContainerStarted","Data":"8f8cad46e77715e164ec2e62df8c2ee60a2b96fa8c918baa1589a3082317e15b"} Mar 08 03:45:56.424512 master-0 kubenswrapper[4045]: I0308 03:45:56.424451 4045 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="assisted-installer/assisted-installer-controller-66tqt" Mar 08 03:45:56.444008 master-0 kubenswrapper[4045]: I0308 03:45:56.443930 4045 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/network-operator-7c649bf6d4-99d2k" podStartSLOduration=14.824670693 podStartE2EDuration="20.443905573s" podCreationTimestamp="2026-03-08 03:45:36 +0000 UTC" firstStartedPulling="2026-03-08 03:45:37.38352875 +0000 UTC m=+36.994229748" lastFinishedPulling="2026-03-08 03:45:43.00276367 +0000 UTC m=+42.613464628" observedRunningTime="2026-03-08 03:45:55.427952349 +0000 UTC m=+55.038653397" watchObservedRunningTime="2026-03-08 03:45:56.443905573 +0000 UTC m=+56.054606551" Mar 08 03:45:56.557093 master-0 kubenswrapper[4045]: I0308 03:45:56.556969 4045 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7-host-var-run-resolv-conf\") pod \"7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7\" (UID: \"7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7\") " Mar 08 03:45:56.557093 master-0 kubenswrapper[4045]: I0308 03:45:56.557066 4045 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hf776\" (UniqueName: \"kubernetes.io/projected/7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7-kube-api-access-hf776\") pod \"7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7\" (UID: \"7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7\") " Mar 08 03:45:56.557428 master-0 kubenswrapper[4045]: I0308 03:45:56.557114 4045 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7-host-ca-bundle\") pod \"7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7\" (UID: \"7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7\") " Mar 08 03:45:56.557428 master-0 kubenswrapper[4045]: I0308 03:45:56.557150 4045 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7-host-var-run-resolv-conf" (OuterVolumeSpecName: "host-var-run-resolv-conf") pod "7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7" (UID: "7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7"). InnerVolumeSpecName "host-var-run-resolv-conf". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:45:56.557428 master-0 kubenswrapper[4045]: I0308 03:45:56.557214 4045 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7-sno-bootstrap-files" (OuterVolumeSpecName: "sno-bootstrap-files") pod "7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7" (UID: "7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7"). InnerVolumeSpecName "sno-bootstrap-files". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:45:56.557428 master-0 kubenswrapper[4045]: I0308 03:45:56.557162 4045 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7-sno-bootstrap-files\") pod \"7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7\" (UID: \"7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7\") " Mar 08 03:45:56.557428 master-0 kubenswrapper[4045]: I0308 03:45:56.557292 4045 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7-host-ca-bundle" (OuterVolumeSpecName: "host-ca-bundle") pod "7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7" (UID: "7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7"). InnerVolumeSpecName "host-ca-bundle". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:45:56.557428 master-0 kubenswrapper[4045]: I0308 03:45:56.557316 4045 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7-host-resolv-conf\") pod \"7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7\" (UID: \"7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7\") " Mar 08 03:45:56.557428 master-0 kubenswrapper[4045]: I0308 03:45:56.557352 4045 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7-host-resolv-conf" (OuterVolumeSpecName: "host-resolv-conf") pod "7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7" (UID: "7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7"). InnerVolumeSpecName "host-resolv-conf". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:45:56.557929 master-0 kubenswrapper[4045]: I0308 03:45:56.557497 4045 reconciler_common.go:293] "Volume detached for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7-host-var-run-resolv-conf\") on node \"master-0\" DevicePath \"\"" Mar 08 03:45:56.557929 master-0 kubenswrapper[4045]: I0308 03:45:56.557523 4045 reconciler_common.go:293] "Volume detached for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7-host-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 03:45:56.557929 master-0 kubenswrapper[4045]: I0308 03:45:56.557541 4045 reconciler_common.go:293] "Volume detached for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7-sno-bootstrap-files\") on node \"master-0\" DevicePath \"\"" Mar 08 03:45:56.557929 master-0 kubenswrapper[4045]: I0308 03:45:56.557558 4045 reconciler_common.go:293] "Volume detached for volume \"host-resolv-conf\" (UniqueName: 
\"kubernetes.io/host-path/7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7-host-resolv-conf\") on node \"master-0\" DevicePath \"\"" Mar 08 03:45:56.567084 master-0 kubenswrapper[4045]: I0308 03:45:56.567008 4045 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7-kube-api-access-hf776" (OuterVolumeSpecName: "kube-api-access-hf776") pod "7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7" (UID: "7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7"). InnerVolumeSpecName "kube-api-access-hf776". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:45:56.658013 master-0 kubenswrapper[4045]: I0308 03:45:56.657814 4045 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hf776\" (UniqueName: \"kubernetes.io/projected/7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7-kube-api-access-hf776\") on node \"master-0\" DevicePath \"\"" Mar 08 03:45:57.200808 master-0 kubenswrapper[4045]: I0308 03:45:57.200337 4045 scope.go:117] "RemoveContainer" containerID="1611cfa5e175032b10c844270b1926150f7a6bf4a58e7bfa0e9ab7a757d448fe" Mar 08 03:45:57.400121 master-0 kubenswrapper[4045]: I0308 03:45:57.400046 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="assisted-installer/assisted-installer-controller-66tqt" event={"ID":"7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7","Type":"ContainerDied","Data":"d29086141609fa12579213578ed2d780ee581ff60e20ceb99a14fefd44548805"} Mar 08 03:45:57.400121 master-0 kubenswrapper[4045]: I0308 03:45:57.400119 4045 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d29086141609fa12579213578ed2d780ee581ff60e20ceb99a14fefd44548805" Mar 08 03:45:57.400308 master-0 kubenswrapper[4045]: I0308 03:45:57.400217 4045 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="assisted-installer/assisted-installer-controller-66tqt" Mar 08 03:45:57.406358 master-0 kubenswrapper[4045]: I0308 03:45:57.406312 4045 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/mtu-prober-6hl29"] Mar 08 03:45:57.406559 master-0 kubenswrapper[4045]: E0308 03:45:57.406419 4045 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7" containerName="assisted-installer-controller" Mar 08 03:45:57.406559 master-0 kubenswrapper[4045]: I0308 03:45:57.406438 4045 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7" containerName="assisted-installer-controller" Mar 08 03:45:57.406559 master-0 kubenswrapper[4045]: I0308 03:45:57.406477 4045 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7" containerName="assisted-installer-controller" Mar 08 03:45:57.406715 master-0 kubenswrapper[4045]: I0308 03:45:57.406697 4045 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/mtu-prober-6hl29" Mar 08 03:45:57.563413 master-0 kubenswrapper[4045]: I0308 03:45:57.563321 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5x782\" (UniqueName: \"kubernetes.io/projected/689a1fe4-9189-4a55-a61a-94a155b8040d-kube-api-access-5x782\") pod \"mtu-prober-6hl29\" (UID: \"689a1fe4-9189-4a55-a61a-94a155b8040d\") " pod="openshift-network-operator/mtu-prober-6hl29" Mar 08 03:45:57.664236 master-0 kubenswrapper[4045]: I0308 03:45:57.664137 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5x782\" (UniqueName: \"kubernetes.io/projected/689a1fe4-9189-4a55-a61a-94a155b8040d-kube-api-access-5x782\") pod \"mtu-prober-6hl29\" (UID: \"689a1fe4-9189-4a55-a61a-94a155b8040d\") " pod="openshift-network-operator/mtu-prober-6hl29" Mar 08 03:45:57.694191 master-0 kubenswrapper[4045]: I0308 03:45:57.694147 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5x782\" (UniqueName: \"kubernetes.io/projected/689a1fe4-9189-4a55-a61a-94a155b8040d-kube-api-access-5x782\") pod \"mtu-prober-6hl29\" (UID: \"689a1fe4-9189-4a55-a61a-94a155b8040d\") " pod="openshift-network-operator/mtu-prober-6hl29" Mar 08 03:45:57.743881 master-0 kubenswrapper[4045]: I0308 03:45:57.743733 4045 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/mtu-prober-6hl29" Mar 08 03:45:57.764707 master-0 kubenswrapper[4045]: W0308 03:45:57.764642 4045 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod689a1fe4_9189_4a55_a61a_94a155b8040d.slice/crio-e5952924b1f9e22bfe7dad849c062189ae8d85553be131f53d8d3ab00c359665 WatchSource:0}: Error finding container e5952924b1f9e22bfe7dad849c062189ae8d85553be131f53d8d3ab00c359665: Status 404 returned error can't find the container with id e5952924b1f9e22bfe7dad849c062189ae8d85553be131f53d8d3ab00c359665 Mar 08 03:45:58.405374 master-0 kubenswrapper[4045]: I0308 03:45:58.405159 4045 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_e9add8df47182fc2eaf8cd78016ebe72/kube-rbac-proxy-crio/2.log" Mar 08 03:45:58.405964 master-0 kubenswrapper[4045]: I0308 03:45:58.405891 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerStarted","Data":"2c8736855304b1b6928cbfdc88bfeac2e98662a8092340731da4a5d87e7dfa39"} Mar 08 03:45:58.413565 master-0 kubenswrapper[4045]: I0308 03:45:58.412806 4045 generic.go:334] "Generic (PLEG): container finished" podID="689a1fe4-9189-4a55-a61a-94a155b8040d" containerID="4998ca5636ddd8d905b67c8fb24fdf903c161f9cdcca4bdb3b01719d5f1d5376" exitCode=0 Mar 08 03:45:58.413565 master-0 kubenswrapper[4045]: I0308 03:45:58.412891 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/mtu-prober-6hl29" event={"ID":"689a1fe4-9189-4a55-a61a-94a155b8040d","Type":"ContainerDied","Data":"4998ca5636ddd8d905b67c8fb24fdf903c161f9cdcca4bdb3b01719d5f1d5376"} Mar 08 03:45:58.413565 master-0 kubenswrapper[4045]: I0308 03:45:58.412926 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-operator/mtu-prober-6hl29" event={"ID":"689a1fe4-9189-4a55-a61a-94a155b8040d","Type":"ContainerStarted","Data":"e5952924b1f9e22bfe7dad849c062189ae8d85553be131f53d8d3ab00c359665"} Mar 08 03:45:58.449772 master-0 kubenswrapper[4045]: I0308 03:45:58.449670 4045 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podStartSLOduration=16.449637016 podStartE2EDuration="16.449637016s" podCreationTimestamp="2026-03-08 03:45:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:45:58.429951405 +0000 UTC m=+58.040652353" watchObservedRunningTime="2026-03-08 03:45:58.449637016 +0000 UTC m=+58.060338005" Mar 08 03:45:59.440182 master-0 kubenswrapper[4045]: I0308 03:45:59.440152 4045 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/mtu-prober-6hl29" Mar 08 03:45:59.578233 master-0 kubenswrapper[4045]: I0308 03:45:59.578169 4045 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5x782\" (UniqueName: \"kubernetes.io/projected/689a1fe4-9189-4a55-a61a-94a155b8040d-kube-api-access-5x782\") pod \"689a1fe4-9189-4a55-a61a-94a155b8040d\" (UID: \"689a1fe4-9189-4a55-a61a-94a155b8040d\") " Mar 08 03:45:59.583757 master-0 kubenswrapper[4045]: I0308 03:45:59.583675 4045 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/689a1fe4-9189-4a55-a61a-94a155b8040d-kube-api-access-5x782" (OuterVolumeSpecName: "kube-api-access-5x782") pod "689a1fe4-9189-4a55-a61a-94a155b8040d" (UID: "689a1fe4-9189-4a55-a61a-94a155b8040d"). InnerVolumeSpecName "kube-api-access-5x782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:45:59.679414 master-0 kubenswrapper[4045]: I0308 03:45:59.679227 4045 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5x782\" (UniqueName: \"kubernetes.io/projected/689a1fe4-9189-4a55-a61a-94a155b8040d-kube-api-access-5x782\") on node \"master-0\" DevicePath \"\"" Mar 08 03:46:00.420796 master-0 kubenswrapper[4045]: I0308 03:46:00.420712 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/mtu-prober-6hl29" event={"ID":"689a1fe4-9189-4a55-a61a-94a155b8040d","Type":"ContainerDied","Data":"e5952924b1f9e22bfe7dad849c062189ae8d85553be131f53d8d3ab00c359665"} Mar 08 03:46:00.420796 master-0 kubenswrapper[4045]: I0308 03:46:00.420777 4045 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5952924b1f9e22bfe7dad849c062189ae8d85553be131f53d8d3ab00c359665" Mar 08 03:46:00.421159 master-0 kubenswrapper[4045]: I0308 03:46:00.420795 4045 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/mtu-prober-6hl29" Mar 08 03:46:02.428684 master-0 kubenswrapper[4045]: I0308 03:46:02.428635 4045 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-network-operator/mtu-prober-6hl29"] Mar 08 03:46:02.432131 master-0 kubenswrapper[4045]: I0308 03:46:02.432073 4045 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-network-operator/mtu-prober-6hl29"] Mar 08 03:46:03.204331 master-0 kubenswrapper[4045]: I0308 03:46:03.204246 4045 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="689a1fe4-9189-4a55-a61a-94a155b8040d" path="/var/lib/kubelet/pods/689a1fe4-9189-4a55-a61a-94a155b8040d/volumes" Mar 08 03:46:07.285951 master-0 kubenswrapper[4045]: I0308 03:46:07.283661 4045 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-rpppb"] Mar 08 03:46:07.285951 master-0 kubenswrapper[4045]: E0308 03:46:07.283972 4045 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="689a1fe4-9189-4a55-a61a-94a155b8040d" containerName="prober" Mar 08 03:46:07.285951 master-0 kubenswrapper[4045]: I0308 03:46:07.284008 4045 state_mem.go:107] "Deleted CPUSet assignment" podUID="689a1fe4-9189-4a55-a61a-94a155b8040d" containerName="prober" Mar 08 03:46:07.285951 master-0 kubenswrapper[4045]: I0308 03:46:07.284276 4045 memory_manager.go:354] "RemoveStaleState removing state" podUID="689a1fe4-9189-4a55-a61a-94a155b8040d" containerName="prober" Mar 08 03:46:07.285951 master-0 kubenswrapper[4045]: I0308 03:46:07.284634 4045 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-rpppb" Mar 08 03:46:07.288075 master-0 kubenswrapper[4045]: I0308 03:46:07.287967 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 08 03:46:07.288075 master-0 kubenswrapper[4045]: I0308 03:46:07.287967 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 08 03:46:07.288245 master-0 kubenswrapper[4045]: I0308 03:46:07.287967 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 08 03:46:07.288245 master-0 kubenswrapper[4045]: I0308 03:46:07.288187 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 08 03:46:07.436081 master-0 kubenswrapper[4045]: I0308 03:46:07.435985 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-os-release\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:46:07.436081 master-0 kubenswrapper[4045]: I0308 03:46:07.436056 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-host-run-k8s-cni-cncf-io\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:46:07.436081 master-0 kubenswrapper[4045]: I0308 03:46:07.436091 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-etc-kubernetes\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " 
pod="openshift-multus/multus-rpppb" Mar 08 03:46:07.436395 master-0 kubenswrapper[4045]: I0308 03:46:07.436168 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-multus-socket-dir-parent\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:46:07.436395 master-0 kubenswrapper[4045]: I0308 03:46:07.436214 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-host-var-lib-cni-bin\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:46:07.436395 master-0 kubenswrapper[4045]: I0308 03:46:07.436251 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-cnibin\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:46:07.436395 master-0 kubenswrapper[4045]: I0308 03:46:07.436283 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-host-run-netns\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:46:07.436395 master-0 kubenswrapper[4045]: I0308 03:46:07.436311 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-multus-daemon-config\") pod \"multus-rpppb\" (UID: 
\"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:46:07.436395 master-0 kubenswrapper[4045]: I0308 03:46:07.436341 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-host-run-multus-certs\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:46:07.436395 master-0 kubenswrapper[4045]: I0308 03:46:07.436371 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-system-cni-dir\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:46:07.436395 master-0 kubenswrapper[4045]: I0308 03:46:07.436397 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-cni-binary-copy\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:46:07.436681 master-0 kubenswrapper[4045]: I0308 03:46:07.436438 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-host-var-lib-kubelet\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:46:07.436681 master-0 kubenswrapper[4045]: I0308 03:46:07.436501 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-multus-cni-dir\") pod \"multus-rpppb\" 
(UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:46:07.436681 master-0 kubenswrapper[4045]: I0308 03:46:07.436550 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-hostroot\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:46:07.436681 master-0 kubenswrapper[4045]: I0308 03:46:07.436578 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-multus-conf-dir\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:46:07.436681 master-0 kubenswrapper[4045]: I0308 03:46:07.436606 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nk8r\" (UniqueName: \"kubernetes.io/projected/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-kube-api-access-7nk8r\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:46:07.436953 master-0 kubenswrapper[4045]: I0308 03:46:07.436707 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-host-var-lib-cni-multus\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:46:07.482859 master-0 kubenswrapper[4045]: I0308 03:46:07.482764 4045 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-g564l"] Mar 08 03:46:07.485096 master-0 kubenswrapper[4045]: I0308 03:46:07.485055 4045 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-g564l" Mar 08 03:46:07.489170 master-0 kubenswrapper[4045]: I0308 03:46:07.489109 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-config" Mar 08 03:46:07.489357 master-0 kubenswrapper[4045]: I0308 03:46:07.489317 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 08 03:46:07.538176 master-0 kubenswrapper[4045]: I0308 03:46:07.538012 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-host-run-k8s-cni-cncf-io\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:46:07.538176 master-0 kubenswrapper[4045]: I0308 03:46:07.538088 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-etc-kubernetes\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:46:07.538176 master-0 kubenswrapper[4045]: I0308 03:46:07.538116 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-multus-socket-dir-parent\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:46:07.538176 master-0 kubenswrapper[4045]: I0308 03:46:07.538181 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-etc-kubernetes\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " 
pod="openshift-multus/multus-rpppb" Mar 08 03:46:07.538468 master-0 kubenswrapper[4045]: I0308 03:46:07.538220 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-host-var-lib-cni-bin\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:46:07.538468 master-0 kubenswrapper[4045]: I0308 03:46:07.538244 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-cnibin\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:46:07.538468 master-0 kubenswrapper[4045]: I0308 03:46:07.538266 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-host-run-netns\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:46:07.538468 master-0 kubenswrapper[4045]: I0308 03:46:07.538287 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-multus-daemon-config\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:46:07.538468 master-0 kubenswrapper[4045]: I0308 03:46:07.538324 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-host-run-netns\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:46:07.538468 master-0 kubenswrapper[4045]: I0308 03:46:07.538325 4045 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-multus-socket-dir-parent\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:46:07.538468 master-0 kubenswrapper[4045]: I0308 03:46:07.538356 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-host-var-lib-cni-bin\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:46:07.538468 master-0 kubenswrapper[4045]: I0308 03:46:07.538393 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-host-run-k8s-cni-cncf-io\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:46:07.538753 master-0 kubenswrapper[4045]: I0308 03:46:07.538496 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-host-run-multus-certs\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:46:07.538753 master-0 kubenswrapper[4045]: I0308 03:46:07.538607 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-host-run-multus-certs\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:46:07.538857 master-0 kubenswrapper[4045]: I0308 03:46:07.538777 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-system-cni-dir\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:46:07.538857 master-0 kubenswrapper[4045]: I0308 03:46:07.538849 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-cni-binary-copy\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:46:07.538947 master-0 kubenswrapper[4045]: I0308 03:46:07.538884 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-host-var-lib-kubelet\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:46:07.538947 master-0 kubenswrapper[4045]: I0308 03:46:07.538848 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-cnibin\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:46:07.538947 master-0 kubenswrapper[4045]: I0308 03:46:07.538921 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-multus-cni-dir\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:46:07.539058 master-0 kubenswrapper[4045]: I0308 03:46:07.538958 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-hostroot\") pod \"multus-rpppb\" 
(UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:46:07.539058 master-0 kubenswrapper[4045]: I0308 03:46:07.538963 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-host-var-lib-kubelet\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:46:07.539058 master-0 kubenswrapper[4045]: I0308 03:46:07.538991 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-multus-conf-dir\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:46:07.539058 master-0 kubenswrapper[4045]: I0308 03:46:07.539005 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-hostroot\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:46:07.539058 master-0 kubenswrapper[4045]: I0308 03:46:07.538961 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-system-cni-dir\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:46:07.539058 master-0 kubenswrapper[4045]: I0308 03:46:07.539024 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nk8r\" (UniqueName: \"kubernetes.io/projected/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-kube-api-access-7nk8r\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:46:07.539285 master-0 
kubenswrapper[4045]: I0308 03:46:07.539152 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-multus-cni-dir\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:46:07.539285 master-0 kubenswrapper[4045]: I0308 03:46:07.539162 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-multus-conf-dir\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:46:07.539285 master-0 kubenswrapper[4045]: I0308 03:46:07.539208 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-host-var-lib-cni-multus\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:46:07.539285 master-0 kubenswrapper[4045]: I0308 03:46:07.539187 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-host-var-lib-cni-multus\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:46:07.539285 master-0 kubenswrapper[4045]: I0308 03:46:07.539263 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-os-release\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:46:07.539475 master-0 kubenswrapper[4045]: I0308 03:46:07.539352 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-os-release\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:46:07.539586 master-0 kubenswrapper[4045]: I0308 03:46:07.539543 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-multus-daemon-config\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:46:07.539635 master-0 kubenswrapper[4045]: I0308 03:46:07.539603 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-cni-binary-copy\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:46:07.556922 master-0 kubenswrapper[4045]: I0308 03:46:07.556838 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nk8r\" (UniqueName: \"kubernetes.io/projected/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-kube-api-access-7nk8r\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:46:07.609369 master-0 kubenswrapper[4045]: I0308 03:46:07.609252 4045 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-rpppb" Mar 08 03:46:07.624741 master-0 kubenswrapper[4045]: W0308 03:46:07.624679 4045 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod093f17f0_2818_4e24_b3c3_6ab4da9d21fb.slice/crio-0ec1fcf833bb575029f4371f595adf3e92b6ae14914f83458d311cb85210d774 WatchSource:0}: Error finding container 0ec1fcf833bb575029f4371f595adf3e92b6ae14914f83458d311cb85210d774: Status 404 returned error can't find the container with id 0ec1fcf833bb575029f4371f595adf3e92b6ae14914f83458d311cb85210d774 Mar 08 03:46:07.639620 master-0 kubenswrapper[4045]: I0308 03:46:07.639546 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4a19441e-e61b-4d58-85db-813ae88e1f9b-cni-binary-copy\") pod \"multus-additional-cni-plugins-g564l\" (UID: \"4a19441e-e61b-4d58-85db-813ae88e1f9b\") " pod="openshift-multus/multus-additional-cni-plugins-g564l" Mar 08 03:46:07.639620 master-0 kubenswrapper[4045]: I0308 03:46:07.639604 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dw7bx\" (UniqueName: \"kubernetes.io/projected/4a19441e-e61b-4d58-85db-813ae88e1f9b-kube-api-access-dw7bx\") pod \"multus-additional-cni-plugins-g564l\" (UID: \"4a19441e-e61b-4d58-85db-813ae88e1f9b\") " pod="openshift-multus/multus-additional-cni-plugins-g564l" Mar 08 03:46:07.639956 master-0 kubenswrapper[4045]: I0308 03:46:07.639637 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4a19441e-e61b-4d58-85db-813ae88e1f9b-system-cni-dir\") pod \"multus-additional-cni-plugins-g564l\" (UID: \"4a19441e-e61b-4d58-85db-813ae88e1f9b\") " pod="openshift-multus/multus-additional-cni-plugins-g564l" Mar 08 03:46:07.639956 master-0 
kubenswrapper[4045]: I0308 03:46:07.639743 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4a19441e-e61b-4d58-85db-813ae88e1f9b-cnibin\") pod \"multus-additional-cni-plugins-g564l\" (UID: \"4a19441e-e61b-4d58-85db-813ae88e1f9b\") " pod="openshift-multus/multus-additional-cni-plugins-g564l" Mar 08 03:46:07.639956 master-0 kubenswrapper[4045]: I0308 03:46:07.639800 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4a19441e-e61b-4d58-85db-813ae88e1f9b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-g564l\" (UID: \"4a19441e-e61b-4d58-85db-813ae88e1f9b\") " pod="openshift-multus/multus-additional-cni-plugins-g564l" Mar 08 03:46:07.639956 master-0 kubenswrapper[4045]: I0308 03:46:07.639878 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4a19441e-e61b-4d58-85db-813ae88e1f9b-os-release\") pod \"multus-additional-cni-plugins-g564l\" (UID: \"4a19441e-e61b-4d58-85db-813ae88e1f9b\") " pod="openshift-multus/multus-additional-cni-plugins-g564l" Mar 08 03:46:07.639956 master-0 kubenswrapper[4045]: I0308 03:46:07.639911 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4a19441e-e61b-4d58-85db-813ae88e1f9b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-g564l\" (UID: \"4a19441e-e61b-4d58-85db-813ae88e1f9b\") " pod="openshift-multus/multus-additional-cni-plugins-g564l" Mar 08 03:46:07.639956 master-0 kubenswrapper[4045]: I0308 03:46:07.639947 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-configmap\" (UniqueName: 
\"kubernetes.io/configmap/4a19441e-e61b-4d58-85db-813ae88e1f9b-whereabouts-configmap\") pod \"multus-additional-cni-plugins-g564l\" (UID: \"4a19441e-e61b-4d58-85db-813ae88e1f9b\") " pod="openshift-multus/multus-additional-cni-plugins-g564l" Mar 08 03:46:07.751236 master-0 kubenswrapper[4045]: I0308 03:46:07.740748 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4a19441e-e61b-4d58-85db-813ae88e1f9b-os-release\") pod \"multus-additional-cni-plugins-g564l\" (UID: \"4a19441e-e61b-4d58-85db-813ae88e1f9b\") " pod="openshift-multus/multus-additional-cni-plugins-g564l" Mar 08 03:46:07.751236 master-0 kubenswrapper[4045]: I0308 03:46:07.740858 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4a19441e-e61b-4d58-85db-813ae88e1f9b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-g564l\" (UID: \"4a19441e-e61b-4d58-85db-813ae88e1f9b\") " pod="openshift-multus/multus-additional-cni-plugins-g564l" Mar 08 03:46:07.751236 master-0 kubenswrapper[4045]: I0308 03:46:07.740912 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/4a19441e-e61b-4d58-85db-813ae88e1f9b-whereabouts-configmap\") pod \"multus-additional-cni-plugins-g564l\" (UID: \"4a19441e-e61b-4d58-85db-813ae88e1f9b\") " pod="openshift-multus/multus-additional-cni-plugins-g564l" Mar 08 03:46:07.751236 master-0 kubenswrapper[4045]: I0308 03:46:07.740946 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4a19441e-e61b-4d58-85db-813ae88e1f9b-os-release\") pod \"multus-additional-cni-plugins-g564l\" (UID: \"4a19441e-e61b-4d58-85db-813ae88e1f9b\") " pod="openshift-multus/multus-additional-cni-plugins-g564l" Mar 08 03:46:07.751236 master-0 kubenswrapper[4045]: I0308 03:46:07.741096 
4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4a19441e-e61b-4d58-85db-813ae88e1f9b-cni-binary-copy\") pod \"multus-additional-cni-plugins-g564l\" (UID: \"4a19441e-e61b-4d58-85db-813ae88e1f9b\") " pod="openshift-multus/multus-additional-cni-plugins-g564l" Mar 08 03:46:07.751236 master-0 kubenswrapper[4045]: I0308 03:46:07.741169 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dw7bx\" (UniqueName: \"kubernetes.io/projected/4a19441e-e61b-4d58-85db-813ae88e1f9b-kube-api-access-dw7bx\") pod \"multus-additional-cni-plugins-g564l\" (UID: \"4a19441e-e61b-4d58-85db-813ae88e1f9b\") " pod="openshift-multus/multus-additional-cni-plugins-g564l" Mar 08 03:46:07.751236 master-0 kubenswrapper[4045]: I0308 03:46:07.741226 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4a19441e-e61b-4d58-85db-813ae88e1f9b-system-cni-dir\") pod \"multus-additional-cni-plugins-g564l\" (UID: \"4a19441e-e61b-4d58-85db-813ae88e1f9b\") " pod="openshift-multus/multus-additional-cni-plugins-g564l" Mar 08 03:46:07.751236 master-0 kubenswrapper[4045]: I0308 03:46:07.741289 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4a19441e-e61b-4d58-85db-813ae88e1f9b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-g564l\" (UID: \"4a19441e-e61b-4d58-85db-813ae88e1f9b\") " pod="openshift-multus/multus-additional-cni-plugins-g564l" Mar 08 03:46:07.751236 master-0 kubenswrapper[4045]: I0308 03:46:07.741362 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4a19441e-e61b-4d58-85db-813ae88e1f9b-system-cni-dir\") pod \"multus-additional-cni-plugins-g564l\" (UID: \"4a19441e-e61b-4d58-85db-813ae88e1f9b\") " 
pod="openshift-multus/multus-additional-cni-plugins-g564l"
Mar 08 03:46:07.751236 master-0 kubenswrapper[4045]: I0308 03:46:07.741499 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4a19441e-e61b-4d58-85db-813ae88e1f9b-cnibin\") pod \"multus-additional-cni-plugins-g564l\" (UID: \"4a19441e-e61b-4d58-85db-813ae88e1f9b\") " pod="openshift-multus/multus-additional-cni-plugins-g564l"
Mar 08 03:46:07.751236 master-0 kubenswrapper[4045]: I0308 03:46:07.741702 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4a19441e-e61b-4d58-85db-813ae88e1f9b-cnibin\") pod \"multus-additional-cni-plugins-g564l\" (UID: \"4a19441e-e61b-4d58-85db-813ae88e1f9b\") " pod="openshift-multus/multus-additional-cni-plugins-g564l"
Mar 08 03:46:07.751236 master-0 kubenswrapper[4045]: I0308 03:46:07.741785 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4a19441e-e61b-4d58-85db-813ae88e1f9b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-g564l\" (UID: \"4a19441e-e61b-4d58-85db-813ae88e1f9b\") " pod="openshift-multus/multus-additional-cni-plugins-g564l"
Mar 08 03:46:07.751236 master-0 kubenswrapper[4045]: I0308 03:46:07.742349 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/4a19441e-e61b-4d58-85db-813ae88e1f9b-whereabouts-configmap\") pod \"multus-additional-cni-plugins-g564l\" (UID: \"4a19441e-e61b-4d58-85db-813ae88e1f9b\") " pod="openshift-multus/multus-additional-cni-plugins-g564l"
Mar 08 03:46:07.751236 master-0 kubenswrapper[4045]: I0308 03:46:07.742528 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4a19441e-e61b-4d58-85db-813ae88e1f9b-cni-binary-copy\") pod \"multus-additional-cni-plugins-g564l\" (UID: \"4a19441e-e61b-4d58-85db-813ae88e1f9b\") " pod="openshift-multus/multus-additional-cni-plugins-g564l"
Mar 08 03:46:07.751236 master-0 kubenswrapper[4045]: I0308 03:46:07.742751 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4a19441e-e61b-4d58-85db-813ae88e1f9b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-g564l\" (UID: \"4a19441e-e61b-4d58-85db-813ae88e1f9b\") " pod="openshift-multus/multus-additional-cni-plugins-g564l"
Mar 08 03:46:07.780475 master-0 kubenswrapper[4045]: I0308 03:46:07.780377 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dw7bx\" (UniqueName: \"kubernetes.io/projected/4a19441e-e61b-4d58-85db-813ae88e1f9b-kube-api-access-dw7bx\") pod \"multus-additional-cni-plugins-g564l\" (UID: \"4a19441e-e61b-4d58-85db-813ae88e1f9b\") " pod="openshift-multus/multus-additional-cni-plugins-g564l"
Mar 08 03:46:07.826241 master-0 kubenswrapper[4045]: I0308 03:46:07.826067 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-g564l"
Mar 08 03:46:07.842448 master-0 kubenswrapper[4045]: W0308 03:46:07.842227 4045 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a19441e_e61b_4d58_85db_813ae88e1f9b.slice/crio-ec83c044c04d6837d5d5f7d4c71e74473794e6ee1e718df488cf45a934fcc03a WatchSource:0}: Error finding container ec83c044c04d6837d5d5f7d4c71e74473794e6ee1e718df488cf45a934fcc03a: Status 404 returned error can't find the container with id ec83c044c04d6837d5d5f7d4c71e74473794e6ee1e718df488cf45a934fcc03a
Mar 08 03:46:08.278857 master-0 kubenswrapper[4045]: I0308 03:46:08.276237 4045 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-schjl"]
Mar 08 03:46:08.278857 master-0 kubenswrapper[4045]: I0308 03:46:08.277783 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-schjl"
Mar 08 03:46:08.278857 master-0 kubenswrapper[4045]: E0308 03:46:08.278099 4045 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-schjl" podUID="d5044ffd-0686-4679-9894-e696faf33699"
Mar 08 03:46:08.444288 master-0 kubenswrapper[4045]: I0308 03:46:08.444212 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rpppb" event={"ID":"093f17f0-2818-4e24-b3c3-6ab4da9d21fb","Type":"ContainerStarted","Data":"0ec1fcf833bb575029f4371f595adf3e92b6ae14914f83458d311cb85210d774"}
Mar 08 03:46:08.447016 master-0 kubenswrapper[4045]: I0308 03:46:08.446946 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g564l" event={"ID":"4a19441e-e61b-4d58-85db-813ae88e1f9b","Type":"ContainerStarted","Data":"ec83c044c04d6837d5d5f7d4c71e74473794e6ee1e718df488cf45a934fcc03a"}
Mar 08 03:46:08.448342 master-0 kubenswrapper[4045]: I0308 03:46:08.448281 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d5044ffd-0686-4679-9894-e696faf33699-metrics-certs\") pod \"network-metrics-daemon-schjl\" (UID: \"d5044ffd-0686-4679-9894-e696faf33699\") " pod="openshift-multus/network-metrics-daemon-schjl"
Mar 08 03:46:08.448457 master-0 kubenswrapper[4045]: I0308 03:46:08.448364 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmhtb\" (UniqueName: \"kubernetes.io/projected/d5044ffd-0686-4679-9894-e696faf33699-kube-api-access-mmhtb\") pod \"network-metrics-daemon-schjl\" (UID: \"d5044ffd-0686-4679-9894-e696faf33699\") " pod="openshift-multus/network-metrics-daemon-schjl"
Mar 08 03:46:08.549505 master-0 kubenswrapper[4045]: I0308 03:46:08.549289 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmhtb\" (UniqueName: \"kubernetes.io/projected/d5044ffd-0686-4679-9894-e696faf33699-kube-api-access-mmhtb\") pod \"network-metrics-daemon-schjl\" (UID: \"d5044ffd-0686-4679-9894-e696faf33699\") " pod="openshift-multus/network-metrics-daemon-schjl"
Mar 08 03:46:08.549505 master-0 kubenswrapper[4045]: I0308 03:46:08.549352 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d5044ffd-0686-4679-9894-e696faf33699-metrics-certs\") pod \"network-metrics-daemon-schjl\" (UID: \"d5044ffd-0686-4679-9894-e696faf33699\") " pod="openshift-multus/network-metrics-daemon-schjl"
Mar 08 03:46:08.549505 master-0 kubenswrapper[4045]: E0308 03:46:08.549454 4045 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 08 03:46:08.549505 master-0 kubenswrapper[4045]: E0308 03:46:08.549510 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d5044ffd-0686-4679-9894-e696faf33699-metrics-certs podName:d5044ffd-0686-4679-9894-e696faf33699 nodeName:}" failed. No retries permitted until 2026-03-08 03:46:09.049492581 +0000 UTC m=+68.660193539 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d5044ffd-0686-4679-9894-e696faf33699-metrics-certs") pod "network-metrics-daemon-schjl" (UID: "d5044ffd-0686-4679-9894-e696faf33699") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 08 03:46:08.570222 master-0 kubenswrapper[4045]: I0308 03:46:08.570177 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmhtb\" (UniqueName: \"kubernetes.io/projected/d5044ffd-0686-4679-9894-e696faf33699-kube-api-access-mmhtb\") pod \"network-metrics-daemon-schjl\" (UID: \"d5044ffd-0686-4679-9894-e696faf33699\") " pod="openshift-multus/network-metrics-daemon-schjl"
Mar 08 03:46:08.953277 master-0 kubenswrapper[4045]: I0308 03:46:08.953222 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/349d438d-d124-4d34-a172-4160e766c680-serving-cert\") pod \"cluster-version-operator-745944c6b7-gvmnp\" (UID: \"349d438d-d124-4d34-a172-4160e766c680\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-gvmnp"
Mar 08 03:46:08.953570 master-0 kubenswrapper[4045]: E0308 03:46:08.953406 4045 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found
Mar 08 03:46:08.953570 master-0 kubenswrapper[4045]: E0308 03:46:08.953506 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/349d438d-d124-4d34-a172-4160e766c680-serving-cert podName:349d438d-d124-4d34-a172-4160e766c680 nodeName:}" failed. No retries permitted until 2026-03-08 03:46:40.953480929 +0000 UTC m=+100.564181887 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/349d438d-d124-4d34-a172-4160e766c680-serving-cert") pod "cluster-version-operator-745944c6b7-gvmnp" (UID: "349d438d-d124-4d34-a172-4160e766c680") : secret "cluster-version-operator-serving-cert" not found
Mar 08 03:46:09.054948 master-0 kubenswrapper[4045]: I0308 03:46:09.054864 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d5044ffd-0686-4679-9894-e696faf33699-metrics-certs\") pod \"network-metrics-daemon-schjl\" (UID: \"d5044ffd-0686-4679-9894-e696faf33699\") " pod="openshift-multus/network-metrics-daemon-schjl"
Mar 08 03:46:09.055140 master-0 kubenswrapper[4045]: E0308 03:46:09.055038 4045 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 08 03:46:09.055140 master-0 kubenswrapper[4045]: E0308 03:46:09.055119 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d5044ffd-0686-4679-9894-e696faf33699-metrics-certs podName:d5044ffd-0686-4679-9894-e696faf33699 nodeName:}" failed. No retries permitted until 2026-03-08 03:46:10.055101031 +0000 UTC m=+69.665801989 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d5044ffd-0686-4679-9894-e696faf33699-metrics-certs") pod "network-metrics-daemon-schjl" (UID: "d5044ffd-0686-4679-9894-e696faf33699") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 08 03:46:10.062287 master-0 kubenswrapper[4045]: I0308 03:46:10.061994 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d5044ffd-0686-4679-9894-e696faf33699-metrics-certs\") pod \"network-metrics-daemon-schjl\" (UID: \"d5044ffd-0686-4679-9894-e696faf33699\") " pod="openshift-multus/network-metrics-daemon-schjl"
Mar 08 03:46:10.062287 master-0 kubenswrapper[4045]: E0308 03:46:10.062139 4045 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 08 03:46:10.062786 master-0 kubenswrapper[4045]: E0308 03:46:10.062349 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d5044ffd-0686-4679-9894-e696faf33699-metrics-certs podName:d5044ffd-0686-4679-9894-e696faf33699 nodeName:}" failed. No retries permitted until 2026-03-08 03:46:12.062334182 +0000 UTC m=+71.673035140 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d5044ffd-0686-4679-9894-e696faf33699-metrics-certs") pod "network-metrics-daemon-schjl" (UID: "d5044ffd-0686-4679-9894-e696faf33699") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 08 03:46:10.199450 master-0 kubenswrapper[4045]: I0308 03:46:10.199420 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-schjl"
Mar 08 03:46:10.199757 master-0 kubenswrapper[4045]: E0308 03:46:10.199539 4045 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-schjl" podUID="d5044ffd-0686-4679-9894-e696faf33699"
Mar 08 03:46:10.454856 master-0 kubenswrapper[4045]: I0308 03:46:10.454080 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g564l" event={"ID":"4a19441e-e61b-4d58-85db-813ae88e1f9b","Type":"ContainerStarted","Data":"46250aae369897400569e4111703b276aadaa65120ad7d4c39a342c4f39e31c8"}
Mar 08 03:46:11.464572 master-0 kubenswrapper[4045]: I0308 03:46:11.464522 4045 generic.go:334] "Generic (PLEG): container finished" podID="4a19441e-e61b-4d58-85db-813ae88e1f9b" containerID="46250aae369897400569e4111703b276aadaa65120ad7d4c39a342c4f39e31c8" exitCode=0
Mar 08 03:46:11.465673 master-0 kubenswrapper[4045]: I0308 03:46:11.465539 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g564l" event={"ID":"4a19441e-e61b-4d58-85db-813ae88e1f9b","Type":"ContainerDied","Data":"46250aae369897400569e4111703b276aadaa65120ad7d4c39a342c4f39e31c8"}
Mar 08 03:46:12.086280 master-0 kubenswrapper[4045]: I0308 03:46:12.086219 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d5044ffd-0686-4679-9894-e696faf33699-metrics-certs\") pod \"network-metrics-daemon-schjl\" (UID: \"d5044ffd-0686-4679-9894-e696faf33699\") " pod="openshift-multus/network-metrics-daemon-schjl"
Mar 08 03:46:12.086475 master-0 kubenswrapper[4045]: E0308 03:46:12.086393 4045 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 08 03:46:12.086526 master-0 kubenswrapper[4045]: E0308 03:46:12.086503 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d5044ffd-0686-4679-9894-e696faf33699-metrics-certs podName:d5044ffd-0686-4679-9894-e696faf33699 nodeName:}" failed. No retries permitted until 2026-03-08 03:46:16.08647468 +0000 UTC m=+75.697175658 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d5044ffd-0686-4679-9894-e696faf33699-metrics-certs") pod "network-metrics-daemon-schjl" (UID: "d5044ffd-0686-4679-9894-e696faf33699") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 08 03:46:12.199640 master-0 kubenswrapper[4045]: I0308 03:46:12.199587 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-schjl"
Mar 08 03:46:12.199892 master-0 kubenswrapper[4045]: E0308 03:46:12.199719 4045 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-schjl" podUID="d5044ffd-0686-4679-9894-e696faf33699"
Mar 08 03:46:14.199414 master-0 kubenswrapper[4045]: I0308 03:46:14.198865 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-schjl"
Mar 08 03:46:14.199414 master-0 kubenswrapper[4045]: E0308 03:46:14.199017 4045 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-schjl" podUID="d5044ffd-0686-4679-9894-e696faf33699"
Mar 08 03:46:16.120372 master-0 kubenswrapper[4045]: I0308 03:46:16.120249 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d5044ffd-0686-4679-9894-e696faf33699-metrics-certs\") pod \"network-metrics-daemon-schjl\" (UID: \"d5044ffd-0686-4679-9894-e696faf33699\") " pod="openshift-multus/network-metrics-daemon-schjl"
Mar 08 03:46:16.121270 master-0 kubenswrapper[4045]: E0308 03:46:16.120464 4045 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 08 03:46:16.121270 master-0 kubenswrapper[4045]: E0308 03:46:16.120557 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d5044ffd-0686-4679-9894-e696faf33699-metrics-certs podName:d5044ffd-0686-4679-9894-e696faf33699 nodeName:}" failed. No retries permitted until 2026-03-08 03:46:24.120533976 +0000 UTC m=+83.731234944 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d5044ffd-0686-4679-9894-e696faf33699-metrics-certs") pod "network-metrics-daemon-schjl" (UID: "d5044ffd-0686-4679-9894-e696faf33699") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 08 03:46:16.199306 master-0 kubenswrapper[4045]: I0308 03:46:16.199257 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-schjl"
Mar 08 03:46:16.201212 master-0 kubenswrapper[4045]: E0308 03:46:16.199407 4045 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-schjl" podUID="d5044ffd-0686-4679-9894-e696faf33699"
Mar 08 03:46:18.199042 master-0 kubenswrapper[4045]: I0308 03:46:18.198780 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-schjl"
Mar 08 03:46:18.200289 master-0 kubenswrapper[4045]: E0308 03:46:18.199295 4045 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-schjl" podUID="d5044ffd-0686-4679-9894-e696faf33699"
Mar 08 03:46:18.211710 master-0 kubenswrapper[4045]: W0308 03:46:18.211651 4045 warnings.go:70] would violate PodSecurity "restricted:latest": host namespaces (hostNetwork=true), hostPort (container "etcd" uses hostPorts 2379, 2380), privileged (containers "etcdctl", "etcd" must not set securityContext.privileged=true), allowPrivilegeEscalation != false (containers "etcdctl", "etcd" must set securityContext.allowPrivilegeEscalation=false), unrestricted capabilities (containers "etcdctl", "etcd" must set securityContext.capabilities.drop=["ALL"]), restricted volume types (volumes "certs", "data-dir" use restricted volume type "hostPath"), runAsNonRoot != true (pod or containers "etcdctl", "etcd" must set securityContext.runAsNonRoot=true), seccompProfile (pod or containers "etcdctl", "etcd" must set securityContext.seccompProfile.type to "RuntimeDefault" or "Localhost")
Mar 08 03:46:18.212996 master-0 kubenswrapper[4045]: I0308 03:46:18.212957 4045 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-master-0-master-0"]
Mar 08 03:46:19.681601 master-0 kubenswrapper[4045]: I0308 03:46:19.677169 4045 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-bjjfh"]
Mar 08 03:46:19.681601 master-0 kubenswrapper[4045]: I0308 03:46:19.677451 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-bjjfh"
Mar 08 03:46:19.681601 master-0 kubenswrapper[4045]: I0308 03:46:19.678783 4045 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 08 03:46:19.681601 master-0 kubenswrapper[4045]: I0308 03:46:19.679442 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Mar 08 03:46:19.681601 master-0 kubenswrapper[4045]: I0308 03:46:19.679564 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Mar 08 03:46:19.681601 master-0 kubenswrapper[4045]: I0308 03:46:19.679781 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Mar 08 03:46:19.681601 master-0 kubenswrapper[4045]: I0308 03:46:19.679955 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Mar 08 03:46:19.721949 master-0 kubenswrapper[4045]: I0308 03:46:19.721843 4045 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-master-0-master-0" podStartSLOduration=1.721799233 podStartE2EDuration="1.721799233s" podCreationTimestamp="2026-03-08 03:46:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:46:19.701949205 +0000 UTC m=+79.312650173" watchObservedRunningTime="2026-03-08 03:46:19.721799233 +0000 UTC m=+79.332500191"
Mar 08 03:46:19.849523 master-0 kubenswrapper[4045]: I0308 03:46:19.849070 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6gml\" (UniqueName: \"kubernetes.io/projected/7e5935ea-8d95-45e3-b836-c7892953ef3d-kube-api-access-c6gml\") pod \"ovnkube-control-plane-66b55d57d-bjjfh\" (UID: \"7e5935ea-8d95-45e3-b836-c7892953ef3d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-bjjfh"
Mar 08 03:46:19.849523 master-0 kubenswrapper[4045]: I0308 03:46:19.849132 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7e5935ea-8d95-45e3-b836-c7892953ef3d-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-66b55d57d-bjjfh\" (UID: \"7e5935ea-8d95-45e3-b836-c7892953ef3d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-bjjfh"
Mar 08 03:46:19.849523 master-0 kubenswrapper[4045]: I0308 03:46:19.849153 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7e5935ea-8d95-45e3-b836-c7892953ef3d-ovnkube-config\") pod \"ovnkube-control-plane-66b55d57d-bjjfh\" (UID: \"7e5935ea-8d95-45e3-b836-c7892953ef3d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-bjjfh"
Mar 08 03:46:19.849523 master-0 kubenswrapper[4045]: I0308 03:46:19.849169 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7e5935ea-8d95-45e3-b836-c7892953ef3d-env-overrides\") pod \"ovnkube-control-plane-66b55d57d-bjjfh\" (UID: \"7e5935ea-8d95-45e3-b836-c7892953ef3d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-bjjfh"
Mar 08 03:46:19.897535 master-0 kubenswrapper[4045]: I0308 03:46:19.897468 4045 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-77cnk"]
Mar 08 03:46:19.898727 master-0 kubenswrapper[4045]: I0308 03:46:19.898682 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-77cnk"
Mar 08 03:46:19.901138 master-0 kubenswrapper[4045]: I0308 03:46:19.900955 4045 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Mar 08 03:46:19.901614 master-0 kubenswrapper[4045]: I0308 03:46:19.901576 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Mar 08 03:46:19.950051 master-0 kubenswrapper[4045]: I0308 03:46:19.949966 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6gml\" (UniqueName: \"kubernetes.io/projected/7e5935ea-8d95-45e3-b836-c7892953ef3d-kube-api-access-c6gml\") pod \"ovnkube-control-plane-66b55d57d-bjjfh\" (UID: \"7e5935ea-8d95-45e3-b836-c7892953ef3d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-bjjfh"
Mar 08 03:46:19.950051 master-0 kubenswrapper[4045]: I0308 03:46:19.950052 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7e5935ea-8d95-45e3-b836-c7892953ef3d-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-66b55d57d-bjjfh\" (UID: \"7e5935ea-8d95-45e3-b836-c7892953ef3d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-bjjfh"
Mar 08 03:46:19.950383 master-0 kubenswrapper[4045]: I0308 03:46:19.950109 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7e5935ea-8d95-45e3-b836-c7892953ef3d-ovnkube-config\") pod \"ovnkube-control-plane-66b55d57d-bjjfh\" (UID: \"7e5935ea-8d95-45e3-b836-c7892953ef3d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-bjjfh"
Mar 08 03:46:19.950383 master-0 kubenswrapper[4045]: I0308 03:46:19.950136 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7e5935ea-8d95-45e3-b836-c7892953ef3d-env-overrides\") pod \"ovnkube-control-plane-66b55d57d-bjjfh\" (UID: \"7e5935ea-8d95-45e3-b836-c7892953ef3d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-bjjfh"
Mar 08 03:46:19.951322 master-0 kubenswrapper[4045]: I0308 03:46:19.951288 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7e5935ea-8d95-45e3-b836-c7892953ef3d-env-overrides\") pod \"ovnkube-control-plane-66b55d57d-bjjfh\" (UID: \"7e5935ea-8d95-45e3-b836-c7892953ef3d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-bjjfh"
Mar 08 03:46:19.952057 master-0 kubenswrapper[4045]: I0308 03:46:19.951977 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7e5935ea-8d95-45e3-b836-c7892953ef3d-ovnkube-config\") pod \"ovnkube-control-plane-66b55d57d-bjjfh\" (UID: \"7e5935ea-8d95-45e3-b836-c7892953ef3d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-bjjfh"
Mar 08 03:46:19.954085 master-0 kubenswrapper[4045]: I0308 03:46:19.954035 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7e5935ea-8d95-45e3-b836-c7892953ef3d-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-66b55d57d-bjjfh\" (UID: \"7e5935ea-8d95-45e3-b836-c7892953ef3d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-bjjfh"
Mar 08 03:46:19.972591 master-0 kubenswrapper[4045]: I0308 03:46:19.972550 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6gml\" (UniqueName: \"kubernetes.io/projected/7e5935ea-8d95-45e3-b836-c7892953ef3d-kube-api-access-c6gml\") pod \"ovnkube-control-plane-66b55d57d-bjjfh\" (UID: \"7e5935ea-8d95-45e3-b836-c7892953ef3d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-bjjfh"
Mar 08 03:46:20.002601 master-0 kubenswrapper[4045]: I0308 03:46:20.002543 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-bjjfh"
Mar 08 03:46:20.051442 master-0 kubenswrapper[4045]: I0308 03:46:20.051356 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-node-log\") pod \"ovnkube-node-77cnk\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-77cnk"
Mar 08 03:46:20.051442 master-0 kubenswrapper[4045]: I0308 03:46:20.051438 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-host-run-netns\") pod \"ovnkube-node-77cnk\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-77cnk"
Mar 08 03:46:20.051442 master-0 kubenswrapper[4045]: I0308 03:46:20.051460 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-run-openvswitch\") pod \"ovnkube-node-77cnk\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-77cnk"
Mar 08 03:46:20.051736 master-0 kubenswrapper[4045]: I0308 03:46:20.051505 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k967k\" (UniqueName: \"kubernetes.io/projected/bead1d9b-4518-46a9-bb0b-50316252eb1c-kube-api-access-k967k\") pod \"ovnkube-node-77cnk\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-77cnk"
Mar 08 03:46:20.051736 master-0 kubenswrapper[4045]: I0308 03:46:20.051561 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-host-kubelet\") pod \"ovnkube-node-77cnk\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-77cnk"
Mar 08 03:46:20.051736 master-0 kubenswrapper[4045]: I0308 03:46:20.051629 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-var-lib-openvswitch\") pod \"ovnkube-node-77cnk\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-77cnk"
Mar 08 03:46:20.051736 master-0 kubenswrapper[4045]: I0308 03:46:20.051666 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-log-socket\") pod \"ovnkube-node-77cnk\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-77cnk"
Mar 08 03:46:20.051736 master-0 kubenswrapper[4045]: I0308 03:46:20.051692 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-etc-openvswitch\") pod \"ovnkube-node-77cnk\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-77cnk"
Mar 08 03:46:20.051736 master-0 kubenswrapper[4045]: I0308 03:46:20.051726 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-run-systemd\") pod \"ovnkube-node-77cnk\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-77cnk"
Mar 08 03:46:20.051996 master-0 kubenswrapper[4045]: I0308 03:46:20.051749 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-run-ovn\") pod \"ovnkube-node-77cnk\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-77cnk"
Mar 08 03:46:20.051996 master-0 kubenswrapper[4045]: I0308 03:46:20.051778 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-77cnk\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-77cnk"
Mar 08 03:46:20.051996 master-0 kubenswrapper[4045]: I0308 03:46:20.051806 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bead1d9b-4518-46a9-bb0b-50316252eb1c-ovn-node-metrics-cert\") pod \"ovnkube-node-77cnk\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-77cnk"
Mar 08 03:46:20.051996 master-0 kubenswrapper[4045]: I0308 03:46:20.051931 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bead1d9b-4518-46a9-bb0b-50316252eb1c-env-overrides\") pod \"ovnkube-node-77cnk\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-77cnk"
Mar 08 03:46:20.051996 master-0 kubenswrapper[4045]: I0308 03:46:20.051977 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bead1d9b-4518-46a9-bb0b-50316252eb1c-ovnkube-script-lib\") pod \"ovnkube-node-77cnk\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-77cnk"
Mar 08 03:46:20.052167 master-0 kubenswrapper[4045]: I0308 03:46:20.052010 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-systemd-units\") pod \"ovnkube-node-77cnk\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-77cnk"
Mar 08 03:46:20.052167 master-0 kubenswrapper[4045]: I0308 03:46:20.052042 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-host-run-ovn-kubernetes\") pod \"ovnkube-node-77cnk\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-77cnk"
Mar 08 03:46:20.052167 master-0 kubenswrapper[4045]: I0308 03:46:20.052071 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-host-cni-netd\") pod \"ovnkube-node-77cnk\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-77cnk"
Mar 08 03:46:20.052167 master-0 kubenswrapper[4045]: I0308 03:46:20.052103 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-host-cni-bin\") pod \"ovnkube-node-77cnk\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-77cnk"
Mar 08 03:46:20.052167 master-0 kubenswrapper[4045]: I0308 03:46:20.052150 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bead1d9b-4518-46a9-bb0b-50316252eb1c-ovnkube-config\") pod \"ovnkube-node-77cnk\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-77cnk"
Mar 08 03:46:20.052338 master-0 kubenswrapper[4045]: I0308 03:46:20.052186 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-host-slash\") pod \"ovnkube-node-77cnk\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-77cnk"
Mar 08 03:46:20.152609 master-0 kubenswrapper[4045]: I0308 03:46:20.152534 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-host-kubelet\") pod \"ovnkube-node-77cnk\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-77cnk"
Mar 08 03:46:20.152609 master-0 kubenswrapper[4045]: I0308 03:46:20.152591 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-var-lib-openvswitch\") pod \"ovnkube-node-77cnk\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-77cnk"
Mar 08 03:46:20.152943 master-0 kubenswrapper[4045]: I0308 03:46:20.152684 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-host-kubelet\") pod \"ovnkube-node-77cnk\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-77cnk"
Mar 08 03:46:20.152943 master-0 kubenswrapper[4045]: I0308 03:46:20.152765 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-log-socket\") pod \"ovnkube-node-77cnk\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-77cnk"
Mar 08 03:46:20.152943 master-0 kubenswrapper[4045]: I0308 03:46:20.152793 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-etc-openvswitch\") pod \"ovnkube-node-77cnk\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-77cnk"
Mar 08 03:46:20.152943 master-0 kubenswrapper[4045]: I0308 03:46:20.152815 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-run-systemd\") pod \"ovnkube-node-77cnk\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-77cnk"
Mar 08 03:46:20.152943 master-0 kubenswrapper[4045]: I0308 03:46:20.152845 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-run-ovn\") pod \"ovnkube-node-77cnk\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-77cnk"
Mar 08 03:46:20.152943 master-0 kubenswrapper[4045]: I0308 03:46:20.152872 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-77cnk\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-77cnk"
Mar 08 03:46:20.152943 master-0 kubenswrapper[4045]: I0308 03:46:20.152894 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume
\"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bead1d9b-4518-46a9-bb0b-50316252eb1c-ovn-node-metrics-cert\") pod \"ovnkube-node-77cnk\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-77cnk" Mar 08 03:46:20.152943 master-0 kubenswrapper[4045]: I0308 03:46:20.152927 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-systemd-units\") pod \"ovnkube-node-77cnk\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-77cnk" Mar 08 03:46:20.152943 master-0 kubenswrapper[4045]: I0308 03:46:20.152952 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-host-run-ovn-kubernetes\") pod \"ovnkube-node-77cnk\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-77cnk" Mar 08 03:46:20.153189 master-0 kubenswrapper[4045]: I0308 03:46:20.152974 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-host-cni-netd\") pod \"ovnkube-node-77cnk\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-77cnk" Mar 08 03:46:20.153189 master-0 kubenswrapper[4045]: I0308 03:46:20.152992 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bead1d9b-4518-46a9-bb0b-50316252eb1c-env-overrides\") pod \"ovnkube-node-77cnk\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-77cnk" Mar 08 03:46:20.153189 master-0 kubenswrapper[4045]: I0308 03:46:20.153008 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bead1d9b-4518-46a9-bb0b-50316252eb1c-ovnkube-script-lib\") pod \"ovnkube-node-77cnk\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-77cnk" Mar 08 03:46:20.153189 master-0 kubenswrapper[4045]: I0308 03:46:20.153033 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-host-cni-bin\") pod \"ovnkube-node-77cnk\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-77cnk" Mar 08 03:46:20.153189 master-0 kubenswrapper[4045]: I0308 03:46:20.153065 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bead1d9b-4518-46a9-bb0b-50316252eb1c-ovnkube-config\") pod \"ovnkube-node-77cnk\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-77cnk" Mar 08 03:46:20.153189 master-0 kubenswrapper[4045]: I0308 03:46:20.153090 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-host-slash\") pod \"ovnkube-node-77cnk\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-77cnk" Mar 08 03:46:20.153189 master-0 kubenswrapper[4045]: I0308 03:46:20.153110 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-host-run-netns\") pod \"ovnkube-node-77cnk\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-77cnk" Mar 08 03:46:20.153189 master-0 kubenswrapper[4045]: I0308 03:46:20.153128 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-run-openvswitch\") pod \"ovnkube-node-77cnk\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-77cnk" Mar 08 03:46:20.153189 master-0 kubenswrapper[4045]: I0308 03:46:20.153144 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-node-log\") pod \"ovnkube-node-77cnk\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-77cnk" Mar 08 03:46:20.153189 master-0 kubenswrapper[4045]: I0308 03:46:20.153165 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k967k\" (UniqueName: \"kubernetes.io/projected/bead1d9b-4518-46a9-bb0b-50316252eb1c-kube-api-access-k967k\") pod \"ovnkube-node-77cnk\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-77cnk" Mar 08 03:46:20.153463 master-0 kubenswrapper[4045]: I0308 03:46:20.153409 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-host-run-ovn-kubernetes\") pod \"ovnkube-node-77cnk\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-77cnk" Mar 08 03:46:20.153543 master-0 kubenswrapper[4045]: I0308 03:46:20.153506 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-host-cni-netd\") pod \"ovnkube-node-77cnk\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-77cnk" Mar 08 03:46:20.153543 master-0 kubenswrapper[4045]: I0308 03:46:20.153503 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-var-lib-openvswitch\") pod \"ovnkube-node-77cnk\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-77cnk" Mar 08 03:46:20.153615 master-0 kubenswrapper[4045]: I0308 03:46:20.153573 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-etc-openvswitch\") pod \"ovnkube-node-77cnk\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-77cnk" Mar 08 03:46:20.153651 master-0 kubenswrapper[4045]: I0308 03:46:20.153613 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-log-socket\") pod \"ovnkube-node-77cnk\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-77cnk" Mar 08 03:46:20.153713 master-0 kubenswrapper[4045]: I0308 03:46:20.153678 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-run-systemd\") pod \"ovnkube-node-77cnk\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-77cnk" Mar 08 03:46:20.153752 master-0 kubenswrapper[4045]: I0308 03:46:20.153734 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-run-ovn\") pod \"ovnkube-node-77cnk\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-77cnk" Mar 08 03:46:20.153830 master-0 kubenswrapper[4045]: I0308 03:46:20.153785 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-77cnk\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-77cnk" Mar 08 03:46:20.154312 master-0 kubenswrapper[4045]: I0308 03:46:20.154275 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bead1d9b-4518-46a9-bb0b-50316252eb1c-ovnkube-script-lib\") pod \"ovnkube-node-77cnk\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-77cnk" Mar 08 03:46:20.154584 master-0 kubenswrapper[4045]: I0308 03:46:20.154537 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-host-slash\") pod \"ovnkube-node-77cnk\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-77cnk" Mar 08 03:46:20.154620 master-0 kubenswrapper[4045]: I0308 03:46:20.154588 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bead1d9b-4518-46a9-bb0b-50316252eb1c-env-overrides\") pod \"ovnkube-node-77cnk\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-77cnk" Mar 08 03:46:20.154654 master-0 kubenswrapper[4045]: I0308 03:46:20.154616 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-host-cni-bin\") pod \"ovnkube-node-77cnk\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-77cnk" Mar 08 03:46:20.154654 master-0 kubenswrapper[4045]: I0308 03:46:20.154629 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-host-run-netns\") pod \"ovnkube-node-77cnk\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-77cnk" Mar 08 03:46:20.154711 master-0 kubenswrapper[4045]: I0308 03:46:20.154668 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-run-openvswitch\") pod \"ovnkube-node-77cnk\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-77cnk" Mar 08 03:46:20.154711 master-0 kubenswrapper[4045]: I0308 03:46:20.154671 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-node-log\") pod \"ovnkube-node-77cnk\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-77cnk" Mar 08 03:46:20.154711 master-0 kubenswrapper[4045]: I0308 03:46:20.154696 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-systemd-units\") pod \"ovnkube-node-77cnk\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-77cnk" Mar 08 03:46:20.155635 master-0 kubenswrapper[4045]: I0308 03:46:20.155596 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bead1d9b-4518-46a9-bb0b-50316252eb1c-ovnkube-config\") pod \"ovnkube-node-77cnk\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-77cnk" Mar 08 03:46:20.157242 master-0 kubenswrapper[4045]: I0308 03:46:20.157200 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/bead1d9b-4518-46a9-bb0b-50316252eb1c-ovn-node-metrics-cert\") pod \"ovnkube-node-77cnk\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-77cnk" Mar 08 03:46:20.176419 master-0 kubenswrapper[4045]: I0308 03:46:20.176371 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k967k\" (UniqueName: \"kubernetes.io/projected/bead1d9b-4518-46a9-bb0b-50316252eb1c-kube-api-access-k967k\") pod \"ovnkube-node-77cnk\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-77cnk" Mar 08 03:46:20.198733 master-0 kubenswrapper[4045]: I0308 03:46:20.198685 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-schjl" Mar 08 03:46:20.198969 master-0 kubenswrapper[4045]: E0308 03:46:20.198812 4045 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-schjl" podUID="d5044ffd-0686-4679-9894-e696faf33699" Mar 08 03:46:20.211027 master-0 kubenswrapper[4045]: I0308 03:46:20.210915 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-77cnk" Mar 08 03:46:22.199313 master-0 kubenswrapper[4045]: I0308 03:46:22.199232 4045 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-schjl" Mar 08 03:46:22.200151 master-0 kubenswrapper[4045]: E0308 03:46:22.199483 4045 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-schjl" podUID="d5044ffd-0686-4679-9894-e696faf33699" Mar 08 03:46:22.351662 master-0 kubenswrapper[4045]: W0308 03:46:22.351574 4045 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e5935ea_8d95_45e3_b836_c7892953ef3d.slice/crio-c5cc16f26a63d054e0857f2a2f1278a7512a2a20bea66d9521aa218fb1539d3c WatchSource:0}: Error finding container c5cc16f26a63d054e0857f2a2f1278a7512a2a20bea66d9521aa218fb1539d3c: Status 404 returned error can't find the container with id c5cc16f26a63d054e0857f2a2f1278a7512a2a20bea66d9521aa218fb1539d3c Mar 08 03:46:22.352812 master-0 kubenswrapper[4045]: W0308 03:46:22.352592 4045 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbead1d9b_4518_46a9_bb0b_50316252eb1c.slice/crio-bf7e5b799b502d9b50bfc698c644dababdb09d8ea5fe3988d4811d2c5a723d8c WatchSource:0}: Error finding container bf7e5b799b502d9b50bfc698c644dababdb09d8ea5fe3988d4811d2c5a723d8c: Status 404 returned error can't find the container with id bf7e5b799b502d9b50bfc698c644dababdb09d8ea5fe3988d4811d2c5a723d8c Mar 08 03:46:22.491655 master-0 kubenswrapper[4045]: I0308 03:46:22.491566 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-bjjfh" 
event={"ID":"7e5935ea-8d95-45e3-b836-c7892953ef3d","Type":"ContainerStarted","Data":"c5cc16f26a63d054e0857f2a2f1278a7512a2a20bea66d9521aa218fb1539d3c"} Mar 08 03:46:22.493998 master-0 kubenswrapper[4045]: I0308 03:46:22.493947 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-77cnk" event={"ID":"bead1d9b-4518-46a9-bb0b-50316252eb1c","Type":"ContainerStarted","Data":"bf7e5b799b502d9b50bfc698c644dababdb09d8ea5fe3988d4811d2c5a723d8c"} Mar 08 03:46:22.875588 master-0 kubenswrapper[4045]: I0308 03:46:22.875463 4045 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-xmgpj"] Mar 08 03:46:22.876400 master-0 kubenswrapper[4045]: I0308 03:46:22.876027 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xmgpj" Mar 08 03:46:22.877767 master-0 kubenswrapper[4045]: E0308 03:46:22.877302 4045 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xmgpj" podUID="e93b5361-30e6-44fd-a59e-2bc410c59480" Mar 08 03:46:22.977062 master-0 kubenswrapper[4045]: I0308 03:46:22.976952 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kc5q\" (UniqueName: \"kubernetes.io/projected/e93b5361-30e6-44fd-a59e-2bc410c59480-kube-api-access-4kc5q\") pod \"network-check-target-xmgpj\" (UID: \"e93b5361-30e6-44fd-a59e-2bc410c59480\") " pod="openshift-network-diagnostics/network-check-target-xmgpj" Mar 08 03:46:23.078856 master-0 kubenswrapper[4045]: I0308 03:46:23.078363 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kc5q\" (UniqueName: \"kubernetes.io/projected/e93b5361-30e6-44fd-a59e-2bc410c59480-kube-api-access-4kc5q\") pod \"network-check-target-xmgpj\" (UID: \"e93b5361-30e6-44fd-a59e-2bc410c59480\") " pod="openshift-network-diagnostics/network-check-target-xmgpj" Mar 08 03:46:23.095188 master-0 kubenswrapper[4045]: E0308 03:46:23.094810 4045 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 03:46:23.095188 master-0 kubenswrapper[4045]: E0308 03:46:23.094881 4045 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 03:46:23.095188 master-0 kubenswrapper[4045]: E0308 03:46:23.094897 4045 projected.go:194] Error preparing data for projected volume kube-api-access-4kc5q for pod openshift-network-diagnostics/network-check-target-xmgpj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 03:46:23.095188 master-0 kubenswrapper[4045]: E0308 03:46:23.094967 4045 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/projected/e93b5361-30e6-44fd-a59e-2bc410c59480-kube-api-access-4kc5q podName:e93b5361-30e6-44fd-a59e-2bc410c59480 nodeName:}" failed. No retries permitted until 2026-03-08 03:46:23.594945335 +0000 UTC m=+83.205646303 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-4kc5q" (UniqueName: "kubernetes.io/projected/e93b5361-30e6-44fd-a59e-2bc410c59480-kube-api-access-4kc5q") pod "network-check-target-xmgpj" (UID: "e93b5361-30e6-44fd-a59e-2bc410c59480") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 03:46:23.498038 master-0 kubenswrapper[4045]: I0308 03:46:23.497993 4045 generic.go:334] "Generic (PLEG): container finished" podID="4a19441e-e61b-4d58-85db-813ae88e1f9b" containerID="3fee6f2c5c3a300e3baa43dfa5cafdfb86c438810bfee3e881783f584b000768" exitCode=0 Mar 08 03:46:23.498504 master-0 kubenswrapper[4045]: I0308 03:46:23.498048 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g564l" event={"ID":"4a19441e-e61b-4d58-85db-813ae88e1f9b","Type":"ContainerDied","Data":"3fee6f2c5c3a300e3baa43dfa5cafdfb86c438810bfee3e881783f584b000768"} Mar 08 03:46:23.501363 master-0 kubenswrapper[4045]: I0308 03:46:23.501346 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-bjjfh" event={"ID":"7e5935ea-8d95-45e3-b836-c7892953ef3d","Type":"ContainerStarted","Data":"3e22396eeb28c7316f58c28a7c2e113bfdb84d8e2c54810a985cd46330595d01"} Mar 08 03:46:23.502981 master-0 kubenswrapper[4045]: I0308 03:46:23.502949 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rpppb" event={"ID":"093f17f0-2818-4e24-b3c3-6ab4da9d21fb","Type":"ContainerStarted","Data":"7433817f7ff7f07218ce884c99740cbd84f7966ba34a1b30f3935a6751d309f7"} Mar 08 03:46:23.528456 master-0 
kubenswrapper[4045]: I0308 03:46:23.527678 4045 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-rpppb" podStartSLOduration=1.700483938 podStartE2EDuration="16.527635221s" podCreationTimestamp="2026-03-08 03:46:07 +0000 UTC" firstStartedPulling="2026-03-08 03:46:07.627210769 +0000 UTC m=+67.237911757" lastFinishedPulling="2026-03-08 03:46:22.454362032 +0000 UTC m=+82.065063040" observedRunningTime="2026-03-08 03:46:23.526923574 +0000 UTC m=+83.137624532" watchObservedRunningTime="2026-03-08 03:46:23.527635221 +0000 UTC m=+83.138336179" Mar 08 03:46:23.685800 master-0 kubenswrapper[4045]: I0308 03:46:23.685661 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kc5q\" (UniqueName: \"kubernetes.io/projected/e93b5361-30e6-44fd-a59e-2bc410c59480-kube-api-access-4kc5q\") pod \"network-check-target-xmgpj\" (UID: \"e93b5361-30e6-44fd-a59e-2bc410c59480\") " pod="openshift-network-diagnostics/network-check-target-xmgpj" Mar 08 03:46:23.686008 master-0 kubenswrapper[4045]: E0308 03:46:23.685848 4045 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 03:46:23.686008 master-0 kubenswrapper[4045]: E0308 03:46:23.685869 4045 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 03:46:23.686008 master-0 kubenswrapper[4045]: E0308 03:46:23.685881 4045 projected.go:194] Error preparing data for projected volume kube-api-access-4kc5q for pod openshift-network-diagnostics/network-check-target-xmgpj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 03:46:23.686008 master-0 kubenswrapper[4045]: E0308 03:46:23.685924 4045 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e93b5361-30e6-44fd-a59e-2bc410c59480-kube-api-access-4kc5q podName:e93b5361-30e6-44fd-a59e-2bc410c59480 nodeName:}" failed. No retries permitted until 2026-03-08 03:46:24.68591095 +0000 UTC m=+84.296611908 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-4kc5q" (UniqueName: "kubernetes.io/projected/e93b5361-30e6-44fd-a59e-2bc410c59480-kube-api-access-4kc5q") pod "network-check-target-xmgpj" (UID: "e93b5361-30e6-44fd-a59e-2bc410c59480") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 03:46:24.191869 master-0 kubenswrapper[4045]: I0308 03:46:24.191763 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d5044ffd-0686-4679-9894-e696faf33699-metrics-certs\") pod \"network-metrics-daemon-schjl\" (UID: \"d5044ffd-0686-4679-9894-e696faf33699\") " pod="openshift-multus/network-metrics-daemon-schjl" Mar 08 03:46:24.192141 master-0 kubenswrapper[4045]: E0308 03:46:24.191902 4045 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 03:46:24.192141 master-0 kubenswrapper[4045]: E0308 03:46:24.191955 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d5044ffd-0686-4679-9894-e696faf33699-metrics-certs podName:d5044ffd-0686-4679-9894-e696faf33699 nodeName:}" failed. No retries permitted until 2026-03-08 03:46:40.191942065 +0000 UTC m=+99.802643023 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d5044ffd-0686-4679-9894-e696faf33699-metrics-certs") pod "network-metrics-daemon-schjl" (UID: "d5044ffd-0686-4679-9894-e696faf33699") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 03:46:24.199985 master-0 kubenswrapper[4045]: I0308 03:46:24.199626 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-schjl" Mar 08 03:46:24.200339 master-0 kubenswrapper[4045]: E0308 03:46:24.199814 4045 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-schjl" podUID="d5044ffd-0686-4679-9894-e696faf33699" Mar 08 03:46:24.695994 master-0 kubenswrapper[4045]: I0308 03:46:24.695860 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kc5q\" (UniqueName: \"kubernetes.io/projected/e93b5361-30e6-44fd-a59e-2bc410c59480-kube-api-access-4kc5q\") pod \"network-check-target-xmgpj\" (UID: \"e93b5361-30e6-44fd-a59e-2bc410c59480\") " pod="openshift-network-diagnostics/network-check-target-xmgpj" Mar 08 03:46:24.696966 master-0 kubenswrapper[4045]: E0308 03:46:24.696471 4045 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 03:46:24.696966 master-0 kubenswrapper[4045]: E0308 03:46:24.696516 4045 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 03:46:24.696966 master-0 kubenswrapper[4045]: E0308 03:46:24.696532 4045 projected.go:194] 
Error preparing data for projected volume kube-api-access-4kc5q for pod openshift-network-diagnostics/network-check-target-xmgpj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 08 03:46:24.696966 master-0 kubenswrapper[4045]: E0308 03:46:24.696593 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e93b5361-30e6-44fd-a59e-2bc410c59480-kube-api-access-4kc5q podName:e93b5361-30e6-44fd-a59e-2bc410c59480 nodeName:}" failed. No retries permitted until 2026-03-08 03:46:26.696570715 +0000 UTC m=+86.307271663 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-4kc5q" (UniqueName: "kubernetes.io/projected/e93b5361-30e6-44fd-a59e-2bc410c59480-kube-api-access-4kc5q") pod "network-check-target-xmgpj" (UID: "e93b5361-30e6-44fd-a59e-2bc410c59480") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 08 03:46:25.198939 master-0 kubenswrapper[4045]: I0308 03:46:25.198881 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xmgpj"
Mar 08 03:46:25.199146 master-0 kubenswrapper[4045]: E0308 03:46:25.198988 4045 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xmgpj" podUID="e93b5361-30e6-44fd-a59e-2bc410c59480"
Mar 08 03:46:25.483175 master-0 kubenswrapper[4045]: I0308 03:46:25.482355 4045 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-node-identity/network-node-identity-ggzm8"]
Mar 08 03:46:25.483175 master-0 kubenswrapper[4045]: I0308 03:46:25.482891 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-ggzm8"
Mar 08 03:46:25.488493 master-0 kubenswrapper[4045]: I0308 03:46:25.487816 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Mar 08 03:46:25.489746 master-0 kubenswrapper[4045]: I0308 03:46:25.489717 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Mar 08 03:46:25.493866 master-0 kubenswrapper[4045]: I0308 03:46:25.491166 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Mar 08 03:46:25.493866 master-0 kubenswrapper[4045]: I0308 03:46:25.491774 4045 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Mar 08 03:46:25.494091 master-0 kubenswrapper[4045]: I0308 03:46:25.493984 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Mar 08 03:46:25.509530 master-0 kubenswrapper[4045]: I0308 03:46:25.509483 4045 generic.go:334] "Generic (PLEG): container finished" podID="4a19441e-e61b-4d58-85db-813ae88e1f9b" containerID="5b762e908370687f296be27f837eacb773e5b2c7f10d7523e57f3d511196e87d" exitCode=0
Mar 08 03:46:25.509530 master-0 kubenswrapper[4045]: I0308 03:46:25.509524 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g564l" event={"ID":"4a19441e-e61b-4d58-85db-813ae88e1f9b","Type":"ContainerDied","Data":"5b762e908370687f296be27f837eacb773e5b2c7f10d7523e57f3d511196e87d"}
Mar 08 03:46:25.607237 master-0 kubenswrapper[4045]: I0308 03:46:25.607175 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/164586b1-f133-4427-8ab6-eb0839b79738-env-overrides\") pod \"network-node-identity-ggzm8\" (UID: \"164586b1-f133-4427-8ab6-eb0839b79738\") " pod="openshift-network-node-identity/network-node-identity-ggzm8"
Mar 08 03:46:25.607391 master-0 kubenswrapper[4045]: I0308 03:46:25.607258 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4stz\" (UniqueName: \"kubernetes.io/projected/164586b1-f133-4427-8ab6-eb0839b79738-kube-api-access-r4stz\") pod \"network-node-identity-ggzm8\" (UID: \"164586b1-f133-4427-8ab6-eb0839b79738\") " pod="openshift-network-node-identity/network-node-identity-ggzm8"
Mar 08 03:46:25.607391 master-0 kubenswrapper[4045]: I0308 03:46:25.607292 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/164586b1-f133-4427-8ab6-eb0839b79738-webhook-cert\") pod \"network-node-identity-ggzm8\" (UID: \"164586b1-f133-4427-8ab6-eb0839b79738\") " pod="openshift-network-node-identity/network-node-identity-ggzm8"
Mar 08 03:46:25.607391 master-0 kubenswrapper[4045]: I0308 03:46:25.607314 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/164586b1-f133-4427-8ab6-eb0839b79738-ovnkube-identity-cm\") pod \"network-node-identity-ggzm8\" (UID: \"164586b1-f133-4427-8ab6-eb0839b79738\") " pod="openshift-network-node-identity/network-node-identity-ggzm8"
Mar 08 03:46:25.708568 master-0 kubenswrapper[4045]: I0308 03:46:25.708507 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4stz\" (UniqueName: \"kubernetes.io/projected/164586b1-f133-4427-8ab6-eb0839b79738-kube-api-access-r4stz\") pod \"network-node-identity-ggzm8\" (UID: \"164586b1-f133-4427-8ab6-eb0839b79738\") " pod="openshift-network-node-identity/network-node-identity-ggzm8"
Mar 08 03:46:25.709159 master-0 kubenswrapper[4045]: I0308 03:46:25.709112 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/164586b1-f133-4427-8ab6-eb0839b79738-webhook-cert\") pod \"network-node-identity-ggzm8\" (UID: \"164586b1-f133-4427-8ab6-eb0839b79738\") " pod="openshift-network-node-identity/network-node-identity-ggzm8"
Mar 08 03:46:25.709216 master-0 kubenswrapper[4045]: I0308 03:46:25.709175 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/164586b1-f133-4427-8ab6-eb0839b79738-ovnkube-identity-cm\") pod \"network-node-identity-ggzm8\" (UID: \"164586b1-f133-4427-8ab6-eb0839b79738\") " pod="openshift-network-node-identity/network-node-identity-ggzm8"
Mar 08 03:46:25.709323 master-0 kubenswrapper[4045]: I0308 03:46:25.709233 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/164586b1-f133-4427-8ab6-eb0839b79738-env-overrides\") pod \"network-node-identity-ggzm8\" (UID: \"164586b1-f133-4427-8ab6-eb0839b79738\") " pod="openshift-network-node-identity/network-node-identity-ggzm8"
Mar 08 03:46:25.710066 master-0 kubenswrapper[4045]: I0308 03:46:25.710042 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/164586b1-f133-4427-8ab6-eb0839b79738-env-overrides\") pod \"network-node-identity-ggzm8\" (UID: \"164586b1-f133-4427-8ab6-eb0839b79738\") " pod="openshift-network-node-identity/network-node-identity-ggzm8"
Mar 08 03:46:25.711019 master-0 kubenswrapper[4045]: I0308 03:46:25.710987 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/164586b1-f133-4427-8ab6-eb0839b79738-ovnkube-identity-cm\") pod \"network-node-identity-ggzm8\" (UID: \"164586b1-f133-4427-8ab6-eb0839b79738\") " pod="openshift-network-node-identity/network-node-identity-ggzm8"
Mar 08 03:46:25.718977 master-0 kubenswrapper[4045]: I0308 03:46:25.718937 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/164586b1-f133-4427-8ab6-eb0839b79738-webhook-cert\") pod \"network-node-identity-ggzm8\" (UID: \"164586b1-f133-4427-8ab6-eb0839b79738\") " pod="openshift-network-node-identity/network-node-identity-ggzm8"
Mar 08 03:46:25.726941 master-0 kubenswrapper[4045]: I0308 03:46:25.726910 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4stz\" (UniqueName: \"kubernetes.io/projected/164586b1-f133-4427-8ab6-eb0839b79738-kube-api-access-r4stz\") pod \"network-node-identity-ggzm8\" (UID: \"164586b1-f133-4427-8ab6-eb0839b79738\") " pod="openshift-network-node-identity/network-node-identity-ggzm8"
Mar 08 03:46:25.808243 master-0 kubenswrapper[4045]: I0308 03:46:25.808109 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-ggzm8"
Mar 08 03:46:26.199623 master-0 kubenswrapper[4045]: I0308 03:46:26.199579 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-schjl"
Mar 08 03:46:26.199925 master-0 kubenswrapper[4045]: E0308 03:46:26.199710 4045 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-schjl" podUID="d5044ffd-0686-4679-9894-e696faf33699"
Mar 08 03:46:26.212188 master-0 kubenswrapper[4045]: I0308 03:46:26.212134 4045 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"]
Mar 08 03:46:26.514755 master-0 kubenswrapper[4045]: I0308 03:46:26.514693 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-ggzm8" event={"ID":"164586b1-f133-4427-8ab6-eb0839b79738","Type":"ContainerStarted","Data":"6588c21791f0b9fd7a866ced5165aad3ddf504a15e8585434bc4836ba3395293"}
Mar 08 03:46:26.715813 master-0 kubenswrapper[4045]: I0308 03:46:26.715761 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kc5q\" (UniqueName: \"kubernetes.io/projected/e93b5361-30e6-44fd-a59e-2bc410c59480-kube-api-access-4kc5q\") pod \"network-check-target-xmgpj\" (UID: \"e93b5361-30e6-44fd-a59e-2bc410c59480\") " pod="openshift-network-diagnostics/network-check-target-xmgpj"
Mar 08 03:46:26.718344 master-0 kubenswrapper[4045]: E0308 03:46:26.715901 4045 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 08 03:46:26.718344 master-0 kubenswrapper[4045]: E0308 03:46:26.715919 4045 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 08 03:46:26.718344 master-0 kubenswrapper[4045]: E0308 03:46:26.715930 4045 projected.go:194] Error preparing data for projected volume kube-api-access-4kc5q for pod openshift-network-diagnostics/network-check-target-xmgpj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 08 03:46:26.718344 master-0 kubenswrapper[4045]: E0308 03:46:26.715983 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e93b5361-30e6-44fd-a59e-2bc410c59480-kube-api-access-4kc5q podName:e93b5361-30e6-44fd-a59e-2bc410c59480 nodeName:}" failed. No retries permitted until 2026-03-08 03:46:30.715970049 +0000 UTC m=+90.326671007 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-4kc5q" (UniqueName: "kubernetes.io/projected/e93b5361-30e6-44fd-a59e-2bc410c59480-kube-api-access-4kc5q") pod "network-check-target-xmgpj" (UID: "e93b5361-30e6-44fd-a59e-2bc410c59480") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 08 03:46:27.199552 master-0 kubenswrapper[4045]: I0308 03:46:27.199468 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xmgpj"
Mar 08 03:46:27.199805 master-0 kubenswrapper[4045]: E0308 03:46:27.199630 4045 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xmgpj" podUID="e93b5361-30e6-44fd-a59e-2bc410c59480"
Mar 08 03:46:27.521675 master-0 kubenswrapper[4045]: I0308 03:46:27.521498 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g564l" event={"ID":"4a19441e-e61b-4d58-85db-813ae88e1f9b","Type":"ContainerDied","Data":"6808f9225d491b34c3cf7a01d443c6732f48fff26280b582719d87525223329a"}
Mar 08 03:46:27.522004 master-0 kubenswrapper[4045]: I0308 03:46:27.521330 4045 generic.go:334] "Generic (PLEG): container finished" podID="4a19441e-e61b-4d58-85db-813ae88e1f9b" containerID="6808f9225d491b34c3cf7a01d443c6732f48fff26280b582719d87525223329a" exitCode=0
Mar 08 03:46:28.199388 master-0 kubenswrapper[4045]: I0308 03:46:28.199319 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-schjl"
Mar 08 03:46:28.200032 master-0 kubenswrapper[4045]: E0308 03:46:28.199476 4045 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-schjl" podUID="d5044ffd-0686-4679-9894-e696faf33699"
Mar 08 03:46:28.225388 master-0 kubenswrapper[4045]: I0308 03:46:28.225278 4045 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["kube-system/bootstrap-kube-controller-manager-master-0"]
Mar 08 03:46:28.230174 master-0 kubenswrapper[4045]: I0308 03:46:28.230108 4045 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podStartSLOduration=2.230094945 podStartE2EDuration="2.230094945s" podCreationTimestamp="2026-03-08 03:46:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:46:28.22987851 +0000 UTC m=+87.840579478" watchObservedRunningTime="2026-03-08 03:46:28.230094945 +0000 UTC m=+87.840795903"
Mar 08 03:46:29.199842 master-0 kubenswrapper[4045]: I0308 03:46:29.199715 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xmgpj"
Mar 08 03:46:29.200776 master-0 kubenswrapper[4045]: E0308 03:46:29.200126 4045 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xmgpj" podUID="e93b5361-30e6-44fd-a59e-2bc410c59480"
Mar 08 03:46:30.199054 master-0 kubenswrapper[4045]: I0308 03:46:30.199016 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-schjl"
Mar 08 03:46:30.199233 master-0 kubenswrapper[4045]: E0308 03:46:30.199137 4045 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-schjl" podUID="d5044ffd-0686-4679-9894-e696faf33699"
Mar 08 03:46:30.749283 master-0 kubenswrapper[4045]: I0308 03:46:30.749212 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kc5q\" (UniqueName: \"kubernetes.io/projected/e93b5361-30e6-44fd-a59e-2bc410c59480-kube-api-access-4kc5q\") pod \"network-check-target-xmgpj\" (UID: \"e93b5361-30e6-44fd-a59e-2bc410c59480\") " pod="openshift-network-diagnostics/network-check-target-xmgpj"
Mar 08 03:46:30.749854 master-0 kubenswrapper[4045]: E0308 03:46:30.749346 4045 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 08 03:46:30.749854 master-0 kubenswrapper[4045]: E0308 03:46:30.749365 4045 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 08 03:46:30.749854 master-0 kubenswrapper[4045]: E0308 03:46:30.749376 4045 projected.go:194] Error preparing data for projected volume kube-api-access-4kc5q for pod openshift-network-diagnostics/network-check-target-xmgpj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 08 03:46:30.749854 master-0 kubenswrapper[4045]: E0308 03:46:30.749419 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e93b5361-30e6-44fd-a59e-2bc410c59480-kube-api-access-4kc5q podName:e93b5361-30e6-44fd-a59e-2bc410c59480 nodeName:}" failed. No retries permitted until 2026-03-08 03:46:38.749405949 +0000 UTC m=+98.360106907 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-4kc5q" (UniqueName: "kubernetes.io/projected/e93b5361-30e6-44fd-a59e-2bc410c59480-kube-api-access-4kc5q") pod "network-check-target-xmgpj" (UID: "e93b5361-30e6-44fd-a59e-2bc410c59480") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 08 03:46:31.199135 master-0 kubenswrapper[4045]: I0308 03:46:31.199082 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xmgpj"
Mar 08 03:46:31.200082 master-0 kubenswrapper[4045]: E0308 03:46:31.200022 4045 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xmgpj" podUID="e93b5361-30e6-44fd-a59e-2bc410c59480"
Mar 08 03:46:32.199576 master-0 kubenswrapper[4045]: I0308 03:46:32.199499 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-schjl"
Mar 08 03:46:32.200152 master-0 kubenswrapper[4045]: E0308 03:46:32.199707 4045 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-schjl" podUID="d5044ffd-0686-4679-9894-e696faf33699"
Mar 08 03:46:33.199081 master-0 kubenswrapper[4045]: I0308 03:46:33.198975 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xmgpj"
Mar 08 03:46:33.199325 master-0 kubenswrapper[4045]: E0308 03:46:33.199159 4045 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xmgpj" podUID="e93b5361-30e6-44fd-a59e-2bc410c59480"
Mar 08 03:46:34.198693 master-0 kubenswrapper[4045]: I0308 03:46:34.198646 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-schjl"
Mar 08 03:46:34.199558 master-0 kubenswrapper[4045]: E0308 03:46:34.198753 4045 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-schjl" podUID="d5044ffd-0686-4679-9894-e696faf33699"
Mar 08 03:46:35.199059 master-0 kubenswrapper[4045]: I0308 03:46:35.198983 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xmgpj"
Mar 08 03:46:35.199059 master-0 kubenswrapper[4045]: E0308 03:46:35.199121 4045 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xmgpj" podUID="e93b5361-30e6-44fd-a59e-2bc410c59480"
Mar 08 03:46:36.199840 master-0 kubenswrapper[4045]: I0308 03:46:36.199763 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-schjl"
Mar 08 03:46:36.200392 master-0 kubenswrapper[4045]: E0308 03:46:36.199987 4045 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-schjl" podUID="d5044ffd-0686-4679-9894-e696faf33699"
Mar 08 03:46:37.201653 master-0 kubenswrapper[4045]: I0308 03:46:37.200374 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xmgpj"
Mar 08 03:46:37.201653 master-0 kubenswrapper[4045]: E0308 03:46:37.200482 4045 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xmgpj" podUID="e93b5361-30e6-44fd-a59e-2bc410c59480"
Mar 08 03:46:37.234187 master-0 kubenswrapper[4045]: I0308 03:46:37.234042 4045 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/bootstrap-kube-controller-manager-master-0" podStartSLOduration=9.234030666 podStartE2EDuration="9.234030666s" podCreationTimestamp="2026-03-08 03:46:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:46:31.215070711 +0000 UTC m=+90.825771669" watchObservedRunningTime="2026-03-08 03:46:37.234030666 +0000 UTC m=+96.844731634"
Mar 08 03:46:37.234387 master-0 kubenswrapper[4045]: I0308 03:46:37.234363 4045 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["kube-system/bootstrap-kube-scheduler-master-0"]
Mar 08 03:46:38.199527 master-0 kubenswrapper[4045]: I0308 03:46:38.198993 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-schjl"
Mar 08 03:46:38.199527 master-0 kubenswrapper[4045]: E0308 03:46:38.199124 4045 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-schjl" podUID="d5044ffd-0686-4679-9894-e696faf33699"
Mar 08 03:46:38.813110 master-0 kubenswrapper[4045]: I0308 03:46:38.813018 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kc5q\" (UniqueName: \"kubernetes.io/projected/e93b5361-30e6-44fd-a59e-2bc410c59480-kube-api-access-4kc5q\") pod \"network-check-target-xmgpj\" (UID: \"e93b5361-30e6-44fd-a59e-2bc410c59480\") " pod="openshift-network-diagnostics/network-check-target-xmgpj"
Mar 08 03:46:38.814082 master-0 kubenswrapper[4045]: E0308 03:46:38.813258 4045 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 08 03:46:38.814082 master-0 kubenswrapper[4045]: E0308 03:46:38.813308 4045 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 08 03:46:38.814082 master-0 kubenswrapper[4045]: E0308 03:46:38.813320 4045 projected.go:194] Error preparing data for projected volume kube-api-access-4kc5q for pod openshift-network-diagnostics/network-check-target-xmgpj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 08 03:46:38.814082 master-0 kubenswrapper[4045]: E0308 03:46:38.813378 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e93b5361-30e6-44fd-a59e-2bc410c59480-kube-api-access-4kc5q podName:e93b5361-30e6-44fd-a59e-2bc410c59480 nodeName:}" failed. No retries permitted until 2026-03-08 03:46:54.813362241 +0000 UTC m=+114.424063199 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-4kc5q" (UniqueName: "kubernetes.io/projected/e93b5361-30e6-44fd-a59e-2bc410c59480-kube-api-access-4kc5q") pod "network-check-target-xmgpj" (UID: "e93b5361-30e6-44fd-a59e-2bc410c59480") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 08 03:46:39.198863 master-0 kubenswrapper[4045]: I0308 03:46:39.198795 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xmgpj"
Mar 08 03:46:39.199079 master-0 kubenswrapper[4045]: E0308 03:46:39.199024 4045 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xmgpj" podUID="e93b5361-30e6-44fd-a59e-2bc410c59480"
Mar 08 03:46:40.198859 master-0 kubenswrapper[4045]: I0308 03:46:40.198769 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-schjl"
Mar 08 03:46:40.199660 master-0 kubenswrapper[4045]: E0308 03:46:40.198898 4045 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-schjl" podUID="d5044ffd-0686-4679-9894-e696faf33699"
Mar 08 03:46:40.223659 master-0 kubenswrapper[4045]: I0308 03:46:40.223623 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d5044ffd-0686-4679-9894-e696faf33699-metrics-certs\") pod \"network-metrics-daemon-schjl\" (UID: \"d5044ffd-0686-4679-9894-e696faf33699\") " pod="openshift-multus/network-metrics-daemon-schjl"
Mar 08 03:46:40.223791 master-0 kubenswrapper[4045]: E0308 03:46:40.223671 4045 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 08 03:46:40.223791 master-0 kubenswrapper[4045]: E0308 03:46:40.223725 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d5044ffd-0686-4679-9894-e696faf33699-metrics-certs podName:d5044ffd-0686-4679-9894-e696faf33699 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:12.223708314 +0000 UTC m=+131.834409272 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d5044ffd-0686-4679-9894-e696faf33699-metrics-certs") pod "network-metrics-daemon-schjl" (UID: "d5044ffd-0686-4679-9894-e696faf33699") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 08 03:46:41.031126 master-0 kubenswrapper[4045]: I0308 03:46:41.031071 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/349d438d-d124-4d34-a172-4160e766c680-serving-cert\") pod \"cluster-version-operator-745944c6b7-gvmnp\" (UID: \"349d438d-d124-4d34-a172-4160e766c680\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-gvmnp"
Mar 08 03:46:41.031313 master-0 kubenswrapper[4045]: E0308 03:46:41.031219 4045 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found
Mar 08 03:46:41.031313 master-0 kubenswrapper[4045]: E0308 03:46:41.031286 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/349d438d-d124-4d34-a172-4160e766c680-serving-cert podName:349d438d-d124-4d34-a172-4160e766c680 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:45.031265702 +0000 UTC m=+164.641966660 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/349d438d-d124-4d34-a172-4160e766c680-serving-cert") pod "cluster-version-operator-745944c6b7-gvmnp" (UID: "349d438d-d124-4d34-a172-4160e766c680") : secret "cluster-version-operator-serving-cert" not found
Mar 08 03:46:41.198854 master-0 kubenswrapper[4045]: I0308 03:46:41.198794 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xmgpj"
Mar 08 03:46:41.199863 master-0 kubenswrapper[4045]: E0308 03:46:41.199800 4045 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xmgpj" podUID="e93b5361-30e6-44fd-a59e-2bc410c59480"
Mar 08 03:46:41.564717 master-0 kubenswrapper[4045]: I0308 03:46:41.564489 4045 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/bootstrap-kube-scheduler-master-0" podStartSLOduration=4.5644499750000005 podStartE2EDuration="4.564449975s" podCreationTimestamp="2026-03-08 03:46:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:46:41.562299275 +0000 UTC m=+101.173000263" watchObservedRunningTime="2026-03-08 03:46:41.564449975 +0000 UTC m=+101.175150963"
Mar 08 03:46:42.198855 master-0 kubenswrapper[4045]: I0308 03:46:42.198746 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-schjl"
Mar 08 03:46:42.199266 master-0 kubenswrapper[4045]: E0308 03:46:42.198978 4045 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-schjl" podUID="d5044ffd-0686-4679-9894-e696faf33699"
Mar 08 03:46:43.199964 master-0 kubenswrapper[4045]: I0308 03:46:43.199812 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xmgpj"
Mar 08 03:46:43.200714 master-0 kubenswrapper[4045]: E0308 03:46:43.200077 4045 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xmgpj" podUID="e93b5361-30e6-44fd-a59e-2bc410c59480"
Mar 08 03:46:44.198759 master-0 kubenswrapper[4045]: I0308 03:46:44.198719 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-schjl"
Mar 08 03:46:44.198985 master-0 kubenswrapper[4045]: E0308 03:46:44.198942 4045 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-schjl" podUID="d5044ffd-0686-4679-9894-e696faf33699"
Mar 08 03:46:44.563005 master-0 kubenswrapper[4045]: I0308 03:46:44.562462 4045 generic.go:334] "Generic (PLEG): container finished" podID="4a19441e-e61b-4d58-85db-813ae88e1f9b" containerID="e3fd81814e45a4dba9c86317c3e1475a8abfc09eed557e2f1e1628bc58babab2" exitCode=0
Mar 08 03:46:44.563005 master-0 kubenswrapper[4045]: I0308 03:46:44.562514 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g564l" event={"ID":"4a19441e-e61b-4d58-85db-813ae88e1f9b","Type":"ContainerDied","Data":"e3fd81814e45a4dba9c86317c3e1475a8abfc09eed557e2f1e1628bc58babab2"}
Mar 08 03:46:44.565719 master-0 kubenswrapper[4045]: I0308 03:46:44.565663 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-bjjfh" event={"ID":"7e5935ea-8d95-45e3-b836-c7892953ef3d","Type":"ContainerStarted","Data":"7fe9302ada8235a3afd5b8f3fc53b3d920a5fbae69778891c3722690a5eb8590"}
Mar 08 03:46:44.567648 master-0 kubenswrapper[4045]: I0308 03:46:44.567592 4045 generic.go:334] "Generic (PLEG): container finished" podID="bead1d9b-4518-46a9-bb0b-50316252eb1c" containerID="a7aef1217ff8dd575e82935ee2c21e6786c8c9c8970173f89e65462060324295" exitCode=0
Mar 08 03:46:44.567736 master-0 kubenswrapper[4045]: I0308 03:46:44.567680 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-77cnk" event={"ID":"bead1d9b-4518-46a9-bb0b-50316252eb1c","Type":"ContainerDied","Data":"a7aef1217ff8dd575e82935ee2c21e6786c8c9c8970173f89e65462060324295"}
Mar 08 03:46:44.578100 master-0 kubenswrapper[4045]: I0308 03:46:44.578045 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-ggzm8" event={"ID":"164586b1-f133-4427-8ab6-eb0839b79738","Type":"ContainerStarted","Data":"eca6f5647fbdf9b3ef8c7044a7fb91cd16de860543c74991829e340da4a238fe"}
Mar 08 03:46:44.578100 master-0 kubenswrapper[4045]: I0308 03:46:44.578092 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-ggzm8" event={"ID":"164586b1-f133-4427-8ab6-eb0839b79738","Type":"ContainerStarted","Data":"95ebb0d3c7a2d58de076ad2d0e4bb2608c677ee3a5e1cb18663eca2276ae9b72"}
Mar 08 03:46:44.607377 master-0 kubenswrapper[4045]: I0308 03:46:44.607287 4045 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-node-identity/network-node-identity-ggzm8" podStartSLOduration=1.344213018 podStartE2EDuration="19.607266633s" podCreationTimestamp="2026-03-08 03:46:25 +0000 UTC" firstStartedPulling="2026-03-08 03:46:25.824006171 +0000 UTC m=+85.434707129" lastFinishedPulling="2026-03-08 03:46:44.087059756 +0000 UTC m=+103.697760744" observedRunningTime="2026-03-08 03:46:44.606351782 +0000 UTC m=+104.217052770" watchObservedRunningTime="2026-03-08 03:46:44.607266633 +0000 UTC m=+104.217967631"
Mar 08 03:46:44.670983 master-0 kubenswrapper[4045]: I0308 03:46:44.666197 4045 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-bjjfh" podStartSLOduration=4.207333644 podStartE2EDuration="25.666168642s" podCreationTimestamp="2026-03-08 03:46:19 +0000 UTC" firstStartedPulling="2026-03-08 03:46:22.628226568 +0000 UTC m=+82.238927526" lastFinishedPulling="2026-03-08 03:46:44.087061556 +0000 UTC m=+103.697762524" observedRunningTime="2026-03-08 03:46:44.634735071 +0000 UTC m=+104.245436069" watchObservedRunningTime="2026-03-08 03:46:44.666168642 +0000 UTC m=+104.276869630"
Mar 08 03:46:45.199649 master-0 kubenswrapper[4045]: I0308 03:46:45.199328 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xmgpj"
Mar 08 03:46:45.199789 master-0 kubenswrapper[4045]: E0308 03:46:45.199728 4045 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xmgpj" podUID="e93b5361-30e6-44fd-a59e-2bc410c59480"
Mar 08 03:46:45.318091 master-0 kubenswrapper[4045]: I0308 03:46:45.318024 4045 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-77cnk"]
Mar 08 03:46:45.586447 master-0 kubenswrapper[4045]: I0308 03:46:45.586310 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-77cnk" event={"ID":"bead1d9b-4518-46a9-bb0b-50316252eb1c","Type":"ContainerStarted","Data":"4d481f1172ff59ed351fa2a42b4d240305d289745eaec056f759d0eb06f2433d"}
Mar 08 03:46:45.586447 master-0 kubenswrapper[4045]: I0308 03:46:45.586361 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-77cnk" event={"ID":"bead1d9b-4518-46a9-bb0b-50316252eb1c","Type":"ContainerStarted","Data":"added2934331879c4be6e788743b0207f2205fcc24178ac7510edf4a1e18f2c1"}
Mar 08 03:46:45.586447 master-0 kubenswrapper[4045]: I0308 03:46:45.586376 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-77cnk" event={"ID":"bead1d9b-4518-46a9-bb0b-50316252eb1c","Type":"ContainerStarted","Data":"c3bba27bb7d1e48a9e3c0a737f52dc765a3c2662df797adf3042f2f12df4fbe1"}
Mar 08 03:46:45.586447 master-0 kubenswrapper[4045]: I0308 03:46:45.586397 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-77cnk"
event={"ID":"bead1d9b-4518-46a9-bb0b-50316252eb1c","Type":"ContainerStarted","Data":"8f2375b1b30b12e920ef023118d16bc0d52d883528b42117bf2fe57d5e36d582"} Mar 08 03:46:45.586447 master-0 kubenswrapper[4045]: I0308 03:46:45.586408 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-77cnk" event={"ID":"bead1d9b-4518-46a9-bb0b-50316252eb1c","Type":"ContainerStarted","Data":"162bfc7ea9c50f70286b0884bf41923bf20ff8ea7664c8a71743a1c7c8573a1c"} Mar 08 03:46:45.586447 master-0 kubenswrapper[4045]: I0308 03:46:45.586420 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-77cnk" event={"ID":"bead1d9b-4518-46a9-bb0b-50316252eb1c","Type":"ContainerStarted","Data":"b5c04e0d909b0e59758f566b9d8ba4fceeebd98c7282262f63efcc5e2e1556aa"} Mar 08 03:46:45.591953 master-0 kubenswrapper[4045]: I0308 03:46:45.591893 4045 generic.go:334] "Generic (PLEG): container finished" podID="4a19441e-e61b-4d58-85db-813ae88e1f9b" containerID="1362430063d3256452d2e164a40d29841ede63c548ec607c9dddfdd02a33cead" exitCode=0 Mar 08 03:46:45.593423 master-0 kubenswrapper[4045]: I0308 03:46:45.593366 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g564l" event={"ID":"4a19441e-e61b-4d58-85db-813ae88e1f9b","Type":"ContainerDied","Data":"1362430063d3256452d2e164a40d29841ede63c548ec607c9dddfdd02a33cead"} Mar 08 03:46:46.199936 master-0 kubenswrapper[4045]: I0308 03:46:46.199863 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-schjl" Mar 08 03:46:46.200174 master-0 kubenswrapper[4045]: E0308 03:46:46.200069 4045 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-schjl" podUID="d5044ffd-0686-4679-9894-e696faf33699" Mar 08 03:46:46.600332 master-0 kubenswrapper[4045]: I0308 03:46:46.600204 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g564l" event={"ID":"4a19441e-e61b-4d58-85db-813ae88e1f9b","Type":"ContainerStarted","Data":"2248160d030097a9a3013ae3d7089c5ed17f3b5dd45a9b13e6f9549c78a0eb68"} Mar 08 03:46:47.199459 master-0 kubenswrapper[4045]: I0308 03:46:47.199360 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xmgpj" Mar 08 03:46:47.199772 master-0 kubenswrapper[4045]: E0308 03:46:47.199571 4045 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xmgpj" podUID="e93b5361-30e6-44fd-a59e-2bc410c59480" Mar 08 03:46:47.609949 master-0 kubenswrapper[4045]: I0308 03:46:47.609877 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-77cnk" event={"ID":"bead1d9b-4518-46a9-bb0b-50316252eb1c","Type":"ContainerStarted","Data":"0c56e83d2b8fbc6b8ea63ba1babdcd52ef268c17322e406b122313f96a406043"} Mar 08 03:46:48.199343 master-0 kubenswrapper[4045]: I0308 03:46:48.199243 4045 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-schjl" Mar 08 03:46:48.199624 master-0 kubenswrapper[4045]: E0308 03:46:48.199414 4045 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-schjl" podUID="d5044ffd-0686-4679-9894-e696faf33699" Mar 08 03:46:49.200050 master-0 kubenswrapper[4045]: I0308 03:46:49.199929 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xmgpj" Mar 08 03:46:49.200976 master-0 kubenswrapper[4045]: E0308 03:46:49.200095 4045 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xmgpj" podUID="e93b5361-30e6-44fd-a59e-2bc410c59480" Mar 08 03:46:50.199961 master-0 kubenswrapper[4045]: I0308 03:46:50.199489 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-schjl" Mar 08 03:46:50.199961 master-0 kubenswrapper[4045]: E0308 03:46:50.199933 4045 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-schjl" podUID="d5044ffd-0686-4679-9894-e696faf33699" Mar 08 03:46:50.628366 master-0 kubenswrapper[4045]: I0308 03:46:50.628192 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-77cnk" event={"ID":"bead1d9b-4518-46a9-bb0b-50316252eb1c","Type":"ContainerStarted","Data":"3f44665a0e4601a97013bbddc2ac5c3b1b785966923a1d4ccdeea57349497e62"} Mar 08 03:46:50.628572 master-0 kubenswrapper[4045]: I0308 03:46:50.628497 4045 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-77cnk" podUID="bead1d9b-4518-46a9-bb0b-50316252eb1c" containerName="ovn-controller" containerID="cri-o://b5c04e0d909b0e59758f566b9d8ba4fceeebd98c7282262f63efcc5e2e1556aa" gracePeriod=30 Mar 08 03:46:50.628657 master-0 kubenswrapper[4045]: I0308 03:46:50.628616 4045 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-77cnk" Mar 08 03:46:50.628716 master-0 kubenswrapper[4045]: I0308 03:46:50.628690 4045 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-77cnk" podUID="bead1d9b-4518-46a9-bb0b-50316252eb1c" containerName="nbdb" containerID="cri-o://4d481f1172ff59ed351fa2a42b4d240305d289745eaec056f759d0eb06f2433d" gracePeriod=30 Mar 08 03:46:50.628783 master-0 kubenswrapper[4045]: I0308 03:46:50.628655 4045 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-77cnk" podUID="bead1d9b-4518-46a9-bb0b-50316252eb1c" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://c3bba27bb7d1e48a9e3c0a737f52dc765a3c2662df797adf3042f2f12df4fbe1" gracePeriod=30 Mar 08 03:46:50.628783 master-0 kubenswrapper[4045]: I0308 03:46:50.628734 4045 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-77cnk" 
podUID="bead1d9b-4518-46a9-bb0b-50316252eb1c" containerName="kube-rbac-proxy-node" containerID="cri-o://8f2375b1b30b12e920ef023118d16bc0d52d883528b42117bf2fe57d5e36d582" gracePeriod=30 Mar 08 03:46:50.628922 master-0 kubenswrapper[4045]: I0308 03:46:50.628769 4045 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-77cnk" podUID="bead1d9b-4518-46a9-bb0b-50316252eb1c" containerName="northd" containerID="cri-o://added2934331879c4be6e788743b0207f2205fcc24178ac7510edf4a1e18f2c1" gracePeriod=30 Mar 08 03:46:50.628922 master-0 kubenswrapper[4045]: I0308 03:46:50.628750 4045 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-77cnk" podUID="bead1d9b-4518-46a9-bb0b-50316252eb1c" containerName="ovn-acl-logging" containerID="cri-o://162bfc7ea9c50f70286b0884bf41923bf20ff8ea7664c8a71743a1c7c8573a1c" gracePeriod=30 Mar 08 03:46:50.629027 master-0 kubenswrapper[4045]: I0308 03:46:50.628899 4045 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-77cnk" podUID="bead1d9b-4518-46a9-bb0b-50316252eb1c" containerName="sbdb" containerID="cri-o://0c56e83d2b8fbc6b8ea63ba1babdcd52ef268c17322e406b122313f96a406043" gracePeriod=30 Mar 08 03:46:50.643020 master-0 kubenswrapper[4045]: E0308 03:46:50.642651 4045 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0c56e83d2b8fbc6b8ea63ba1babdcd52ef268c17322e406b122313f96a406043" cmd=["/bin/bash","-c","set -xeo pipefail\n. 
/ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Mar 08 03:46:50.658199 master-0 kubenswrapper[4045]: E0308 03:46:50.647247 4045 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0c56e83d2b8fbc6b8ea63ba1babdcd52ef268c17322e406b122313f96a406043" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Mar 08 03:46:50.658199 master-0 kubenswrapper[4045]: E0308 03:46:50.648967 4045 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0c56e83d2b8fbc6b8ea63ba1babdcd52ef268c17322e406b122313f96a406043" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Mar 08 03:46:50.658199 master-0 kubenswrapper[4045]: E0308 03:46:50.649018 4045 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-77cnk" podUID="bead1d9b-4518-46a9-bb0b-50316252eb1c" containerName="sbdb" Mar 08 03:46:50.664731 master-0 kubenswrapper[4045]: I0308 03:46:50.662818 4045 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-77cnk" podUID="bead1d9b-4518-46a9-bb0b-50316252eb1c" containerName="ovnkube-controller" containerID="cri-o://3f44665a0e4601a97013bbddc2ac5c3b1b785966923a1d4ccdeea57349497e62" gracePeriod=30 Mar 08 03:46:50.664731 master-0 kubenswrapper[4045]: I0308 03:46:50.664170 4045 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-77cnk" 
podStartSLOduration=9.961614493 podStartE2EDuration="31.664142323s" podCreationTimestamp="2026-03-08 03:46:19 +0000 UTC" firstStartedPulling="2026-03-08 03:46:22.357937519 +0000 UTC m=+81.968638517" lastFinishedPulling="2026-03-08 03:46:44.060465349 +0000 UTC m=+103.671166347" observedRunningTime="2026-03-08 03:46:50.663585409 +0000 UTC m=+110.274286457" watchObservedRunningTime="2026-03-08 03:46:50.664142323 +0000 UTC m=+110.274843321" Mar 08 03:46:50.664731 master-0 kubenswrapper[4045]: I0308 03:46:50.664678 4045 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-g564l" podStartSLOduration=7.507687305 podStartE2EDuration="43.664667035s" podCreationTimestamp="2026-03-08 03:46:07 +0000 UTC" firstStartedPulling="2026-03-08 03:46:07.846755092 +0000 UTC m=+67.457456090" lastFinishedPulling="2026-03-08 03:46:44.003734852 +0000 UTC m=+103.614435820" observedRunningTime="2026-03-08 03:46:46.632134676 +0000 UTC m=+106.242835694" watchObservedRunningTime="2026-03-08 03:46:50.664667035 +0000 UTC m=+110.275368043" Mar 08 03:46:51.198913 master-0 kubenswrapper[4045]: I0308 03:46:51.198865 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xmgpj" Mar 08 03:46:51.199613 master-0 kubenswrapper[4045]: E0308 03:46:51.199567 4045 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xmgpj" podUID="e93b5361-30e6-44fd-a59e-2bc410c59480" Mar 08 03:46:51.310324 master-0 kubenswrapper[4045]: I0308 03:46:51.310268 4045 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-77cnk_bead1d9b-4518-46a9-bb0b-50316252eb1c/ovnkube-controller/0.log" Mar 08 03:46:51.313096 master-0 kubenswrapper[4045]: I0308 03:46:51.313051 4045 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-77cnk_bead1d9b-4518-46a9-bb0b-50316252eb1c/kube-rbac-proxy-ovn-metrics/0.log" Mar 08 03:46:51.313790 master-0 kubenswrapper[4045]: I0308 03:46:51.313751 4045 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-77cnk_bead1d9b-4518-46a9-bb0b-50316252eb1c/kube-rbac-proxy-node/0.log" Mar 08 03:46:51.314485 master-0 kubenswrapper[4045]: I0308 03:46:51.314447 4045 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-77cnk_bead1d9b-4518-46a9-bb0b-50316252eb1c/ovn-acl-logging/0.log" Mar 08 03:46:51.315346 master-0 kubenswrapper[4045]: I0308 03:46:51.315301 4045 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-77cnk_bead1d9b-4518-46a9-bb0b-50316252eb1c/ovn-controller/0.log" Mar 08 03:46:51.316003 master-0 kubenswrapper[4045]: I0308 03:46:51.315963 4045 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-77cnk" Mar 08 03:46:51.363386 master-0 kubenswrapper[4045]: I0308 03:46:51.363318 4045 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-host-run-ovn-kubernetes\") pod \"bead1d9b-4518-46a9-bb0b-50316252eb1c\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") " Mar 08 03:46:51.363577 master-0 kubenswrapper[4045]: I0308 03:46:51.363414 4045 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k967k\" (UniqueName: \"kubernetes.io/projected/bead1d9b-4518-46a9-bb0b-50316252eb1c-kube-api-access-k967k\") pod \"bead1d9b-4518-46a9-bb0b-50316252eb1c\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") " Mar 08 03:46:51.363577 master-0 kubenswrapper[4045]: I0308 03:46:51.363466 4045 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-systemd-units\") pod \"bead1d9b-4518-46a9-bb0b-50316252eb1c\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") " Mar 08 03:46:51.363577 master-0 kubenswrapper[4045]: I0308 03:46:51.363416 4045 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "bead1d9b-4518-46a9-bb0b-50316252eb1c" (UID: "bead1d9b-4518-46a9-bb0b-50316252eb1c"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:46:51.363577 master-0 kubenswrapper[4045]: I0308 03:46:51.363523 4045 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-host-kubelet\") pod \"bead1d9b-4518-46a9-bb0b-50316252eb1c\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") " Mar 08 03:46:51.363728 master-0 kubenswrapper[4045]: I0308 03:46:51.363579 4045 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bead1d9b-4518-46a9-bb0b-50316252eb1c-ovn-node-metrics-cert\") pod \"bead1d9b-4518-46a9-bb0b-50316252eb1c\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") " Mar 08 03:46:51.363728 master-0 kubenswrapper[4045]: I0308 03:46:51.363628 4045 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"bead1d9b-4518-46a9-bb0b-50316252eb1c\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") " Mar 08 03:46:51.363728 master-0 kubenswrapper[4045]: I0308 03:46:51.363586 4045 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "bead1d9b-4518-46a9-bb0b-50316252eb1c" (UID: "bead1d9b-4518-46a9-bb0b-50316252eb1c"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:46:51.363728 master-0 kubenswrapper[4045]: I0308 03:46:51.363615 4045 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "bead1d9b-4518-46a9-bb0b-50316252eb1c" (UID: "bead1d9b-4518-46a9-bb0b-50316252eb1c"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:46:51.363728 master-0 kubenswrapper[4045]: I0308 03:46:51.363689 4045 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bead1d9b-4518-46a9-bb0b-50316252eb1c-ovnkube-script-lib\") pod \"bead1d9b-4518-46a9-bb0b-50316252eb1c\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") " Mar 08 03:46:51.363728 master-0 kubenswrapper[4045]: I0308 03:46:51.363699 4045 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "bead1d9b-4518-46a9-bb0b-50316252eb1c" (UID: "bead1d9b-4518-46a9-bb0b-50316252eb1c"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:46:51.364008 master-0 kubenswrapper[4045]: I0308 03:46:51.363734 4045 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-run-openvswitch\") pod \"bead1d9b-4518-46a9-bb0b-50316252eb1c\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") " Mar 08 03:46:51.364008 master-0 kubenswrapper[4045]: I0308 03:46:51.363784 4045 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-node-log\") pod \"bead1d9b-4518-46a9-bb0b-50316252eb1c\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") " Mar 08 03:46:51.364008 master-0 kubenswrapper[4045]: I0308 03:46:51.363872 4045 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bead1d9b-4518-46a9-bb0b-50316252eb1c-env-overrides\") pod \"bead1d9b-4518-46a9-bb0b-50316252eb1c\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") " Mar 08 03:46:51.364008 master-0 kubenswrapper[4045]: I0308 03:46:51.363923 4045 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-node-log" (OuterVolumeSpecName: "node-log") pod "bead1d9b-4518-46a9-bb0b-50316252eb1c" (UID: "bead1d9b-4518-46a9-bb0b-50316252eb1c"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:46:51.364008 master-0 kubenswrapper[4045]: I0308 03:46:51.363930 4045 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "bead1d9b-4518-46a9-bb0b-50316252eb1c" (UID: "bead1d9b-4518-46a9-bb0b-50316252eb1c"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:46:51.364008 master-0 kubenswrapper[4045]: I0308 03:46:51.363976 4045 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-host-cni-netd\") pod \"bead1d9b-4518-46a9-bb0b-50316252eb1c\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") " Mar 08 03:46:51.364008 master-0 kubenswrapper[4045]: I0308 03:46:51.364007 4045 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-host-cni-bin\") pod \"bead1d9b-4518-46a9-bb0b-50316252eb1c\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") " Mar 08 03:46:51.364335 master-0 kubenswrapper[4045]: I0308 03:46:51.364028 4045 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-host-run-netns\") pod \"bead1d9b-4518-46a9-bb0b-50316252eb1c\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") " Mar 08 03:46:51.364335 master-0 kubenswrapper[4045]: I0308 03:46:51.364083 4045 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "bead1d9b-4518-46a9-bb0b-50316252eb1c" (UID: "bead1d9b-4518-46a9-bb0b-50316252eb1c"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:46:51.364335 master-0 kubenswrapper[4045]: I0308 03:46:51.364092 4045 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-run-ovn\") pod \"bead1d9b-4518-46a9-bb0b-50316252eb1c\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") " Mar 08 03:46:51.364335 master-0 kubenswrapper[4045]: I0308 03:46:51.364089 4045 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "bead1d9b-4518-46a9-bb0b-50316252eb1c" (UID: "bead1d9b-4518-46a9-bb0b-50316252eb1c"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:46:51.364335 master-0 kubenswrapper[4045]: I0308 03:46:51.364146 4045 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "bead1d9b-4518-46a9-bb0b-50316252eb1c" (UID: "bead1d9b-4518-46a9-bb0b-50316252eb1c"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:46:51.364335 master-0 kubenswrapper[4045]: I0308 03:46:51.364116 4045 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "bead1d9b-4518-46a9-bb0b-50316252eb1c" (UID: "bead1d9b-4518-46a9-bb0b-50316252eb1c"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:46:51.364335 master-0 kubenswrapper[4045]: I0308 03:46:51.364121 4045 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bead1d9b-4518-46a9-bb0b-50316252eb1c-ovnkube-config\") pod \"bead1d9b-4518-46a9-bb0b-50316252eb1c\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") " Mar 08 03:46:51.364335 master-0 kubenswrapper[4045]: I0308 03:46:51.364241 4045 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-host-slash\") pod \"bead1d9b-4518-46a9-bb0b-50316252eb1c\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") " Mar 08 03:46:51.364335 master-0 kubenswrapper[4045]: I0308 03:46:51.364284 4045 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-host-slash" (OuterVolumeSpecName: "host-slash") pod "bead1d9b-4518-46a9-bb0b-50316252eb1c" (UID: "bead1d9b-4518-46a9-bb0b-50316252eb1c"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:46:51.364727 master-0 kubenswrapper[4045]: I0308 03:46:51.364364 4045 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-var-lib-openvswitch\") pod \"bead1d9b-4518-46a9-bb0b-50316252eb1c\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") " Mar 08 03:46:51.364727 master-0 kubenswrapper[4045]: I0308 03:46:51.364388 4045 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "bead1d9b-4518-46a9-bb0b-50316252eb1c" (UID: "bead1d9b-4518-46a9-bb0b-50316252eb1c"). 
InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 03:46:51.364727 master-0 kubenswrapper[4045]: I0308 03:46:51.364394 4045 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-etc-openvswitch\") pod \"bead1d9b-4518-46a9-bb0b-50316252eb1c\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") "
Mar 08 03:46:51.364727 master-0 kubenswrapper[4045]: I0308 03:46:51.364421 4045 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "bead1d9b-4518-46a9-bb0b-50316252eb1c" (UID: "bead1d9b-4518-46a9-bb0b-50316252eb1c"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 03:46:51.364727 master-0 kubenswrapper[4045]: I0308 03:46:51.364452 4045 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-log-socket\") pod \"bead1d9b-4518-46a9-bb0b-50316252eb1c\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") "
Mar 08 03:46:51.364727 master-0 kubenswrapper[4045]: I0308 03:46:51.364474 4045 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-run-systemd\") pod \"bead1d9b-4518-46a9-bb0b-50316252eb1c\" (UID: \"bead1d9b-4518-46a9-bb0b-50316252eb1c\") "
Mar 08 03:46:51.364727 master-0 kubenswrapper[4045]: I0308 03:46:51.364523 4045 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-log-socket" (OuterVolumeSpecName: "log-socket") pod "bead1d9b-4518-46a9-bb0b-50316252eb1c" (UID: "bead1d9b-4518-46a9-bb0b-50316252eb1c"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 03:46:51.364727 master-0 kubenswrapper[4045]: I0308 03:46:51.364563 4045 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bead1d9b-4518-46a9-bb0b-50316252eb1c-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "bead1d9b-4518-46a9-bb0b-50316252eb1c" (UID: "bead1d9b-4518-46a9-bb0b-50316252eb1c"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 03:46:51.364727 master-0 kubenswrapper[4045]: I0308 03:46:51.364618 4045 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bead1d9b-4518-46a9-bb0b-50316252eb1c-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "bead1d9b-4518-46a9-bb0b-50316252eb1c" (UID: "bead1d9b-4518-46a9-bb0b-50316252eb1c"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 03:46:51.364727 master-0 kubenswrapper[4045]: I0308 03:46:51.364639 4045 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-host-var-lib-cni-networks-ovn-kubernetes\") on node \"master-0\" DevicePath \"\""
Mar 08 03:46:51.364727 master-0 kubenswrapper[4045]: I0308 03:46:51.364698 4045 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-run-openvswitch\") on node \"master-0\" DevicePath \"\""
Mar 08 03:46:51.364727 master-0 kubenswrapper[4045]: I0308 03:46:51.364729 4045 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-node-log\") on node \"master-0\" DevicePath \"\""
Mar 08 03:46:51.365165 master-0 kubenswrapper[4045]: I0308 03:46:51.364757 4045 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-host-cni-netd\") on node \"master-0\" DevicePath \"\""
Mar 08 03:46:51.365165 master-0 kubenswrapper[4045]: I0308 03:46:51.364786 4045 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-host-cni-bin\") on node \"master-0\" DevicePath \"\""
Mar 08 03:46:51.365165 master-0 kubenswrapper[4045]: I0308 03:46:51.364812 4045 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-host-run-netns\") on node \"master-0\" DevicePath \"\""
Mar 08 03:46:51.365165 master-0 kubenswrapper[4045]: I0308 03:46:51.364780 4045 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bead1d9b-4518-46a9-bb0b-50316252eb1c-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "bead1d9b-4518-46a9-bb0b-50316252eb1c" (UID: "bead1d9b-4518-46a9-bb0b-50316252eb1c"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 03:46:51.365165 master-0 kubenswrapper[4045]: I0308 03:46:51.364880 4045 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-run-ovn\") on node \"master-0\" DevicePath \"\""
Mar 08 03:46:51.365165 master-0 kubenswrapper[4045]: I0308 03:46:51.364971 4045 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bead1d9b-4518-46a9-bb0b-50316252eb1c-ovnkube-config\") on node \"master-0\" DevicePath \"\""
Mar 08 03:46:51.365165 master-0 kubenswrapper[4045]: I0308 03:46:51.365029 4045 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-host-slash\") on node \"master-0\" DevicePath \"\""
Mar 08 03:46:51.365165 master-0 kubenswrapper[4045]: I0308 03:46:51.365057 4045 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-var-lib-openvswitch\") on node \"master-0\" DevicePath \"\""
Mar 08 03:46:51.365165 master-0 kubenswrapper[4045]: I0308 03:46:51.365083 4045 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-etc-openvswitch\") on node \"master-0\" DevicePath \"\""
Mar 08 03:46:51.365165 master-0 kubenswrapper[4045]: I0308 03:46:51.365112 4045 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-log-socket\") on node \"master-0\" DevicePath \"\""
Mar 08 03:46:51.365165 master-0 kubenswrapper[4045]: I0308 03:46:51.365139 4045 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-host-run-ovn-kubernetes\") on node \"master-0\" DevicePath \"\""
Mar 08 03:46:51.365165 master-0 kubenswrapper[4045]: I0308 03:46:51.365168 4045 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-systemd-units\") on node \"master-0\" DevicePath \"\""
Mar 08 03:46:51.365593 master-0 kubenswrapper[4045]: I0308 03:46:51.365196 4045 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-host-kubelet\") on node \"master-0\" DevicePath \"\""
Mar 08 03:46:51.374856 master-0 kubenswrapper[4045]: I0308 03:46:51.373338 4045 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bead1d9b-4518-46a9-bb0b-50316252eb1c-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "bead1d9b-4518-46a9-bb0b-50316252eb1c" (UID: "bead1d9b-4518-46a9-bb0b-50316252eb1c"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 03:46:51.379852 master-0 kubenswrapper[4045]: I0308 03:46:51.378004 4045 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bead1d9b-4518-46a9-bb0b-50316252eb1c-kube-api-access-k967k" (OuterVolumeSpecName: "kube-api-access-k967k") pod "bead1d9b-4518-46a9-bb0b-50316252eb1c" (UID: "bead1d9b-4518-46a9-bb0b-50316252eb1c"). InnerVolumeSpecName "kube-api-access-k967k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 03:46:51.379852 master-0 kubenswrapper[4045]: I0308 03:46:51.378273 4045 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "bead1d9b-4518-46a9-bb0b-50316252eb1c" (UID: "bead1d9b-4518-46a9-bb0b-50316252eb1c"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 03:46:51.411711 master-0 kubenswrapper[4045]: I0308 03:46:51.411661 4045 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-jc6rf"]
Mar 08 03:46:51.411891 master-0 kubenswrapper[4045]: E0308 03:46:51.411781 4045 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bead1d9b-4518-46a9-bb0b-50316252eb1c" containerName="nbdb"
Mar 08 03:46:51.411891 master-0 kubenswrapper[4045]: I0308 03:46:51.411799 4045 state_mem.go:107] "Deleted CPUSet assignment" podUID="bead1d9b-4518-46a9-bb0b-50316252eb1c" containerName="nbdb"
Mar 08 03:46:51.411891 master-0 kubenswrapper[4045]: E0308 03:46:51.411812 4045 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bead1d9b-4518-46a9-bb0b-50316252eb1c" containerName="sbdb"
Mar 08 03:46:51.411891 master-0 kubenswrapper[4045]: I0308 03:46:51.411846 4045 state_mem.go:107] "Deleted CPUSet assignment" podUID="bead1d9b-4518-46a9-bb0b-50316252eb1c" containerName="sbdb"
Mar 08 03:46:51.411891 master-0 kubenswrapper[4045]: E0308 03:46:51.411857 4045 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bead1d9b-4518-46a9-bb0b-50316252eb1c" containerName="kube-rbac-proxy-node"
Mar 08 03:46:51.411891 master-0 kubenswrapper[4045]: I0308 03:46:51.411866 4045 state_mem.go:107] "Deleted CPUSet assignment" podUID="bead1d9b-4518-46a9-bb0b-50316252eb1c" containerName="kube-rbac-proxy-node"
Mar 08 03:46:51.411891 master-0 kubenswrapper[4045]: E0308 03:46:51.411877 4045 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bead1d9b-4518-46a9-bb0b-50316252eb1c" containerName="northd"
Mar 08 03:46:51.411891 master-0 kubenswrapper[4045]: I0308 03:46:51.411889 4045 state_mem.go:107] "Deleted CPUSet assignment" podUID="bead1d9b-4518-46a9-bb0b-50316252eb1c" containerName="northd"
Mar 08 03:46:51.411891 master-0 kubenswrapper[4045]: E0308 03:46:51.411898 4045 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bead1d9b-4518-46a9-bb0b-50316252eb1c" containerName="ovnkube-controller"
Mar 08 03:46:51.412135 master-0 kubenswrapper[4045]: I0308 03:46:51.411907 4045 state_mem.go:107] "Deleted CPUSet assignment" podUID="bead1d9b-4518-46a9-bb0b-50316252eb1c" containerName="ovnkube-controller"
Mar 08 03:46:51.412135 master-0 kubenswrapper[4045]: E0308 03:46:51.411918 4045 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bead1d9b-4518-46a9-bb0b-50316252eb1c" containerName="ovn-acl-logging"
Mar 08 03:46:51.412135 master-0 kubenswrapper[4045]: I0308 03:46:51.411928 4045 state_mem.go:107] "Deleted CPUSet assignment" podUID="bead1d9b-4518-46a9-bb0b-50316252eb1c" containerName="ovn-acl-logging"
Mar 08 03:46:51.412135 master-0 kubenswrapper[4045]: E0308 03:46:51.411939 4045 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bead1d9b-4518-46a9-bb0b-50316252eb1c" containerName="kubecfg-setup"
Mar 08 03:46:51.412135 master-0 kubenswrapper[4045]: I0308 03:46:51.411949 4045 state_mem.go:107] "Deleted CPUSet assignment" podUID="bead1d9b-4518-46a9-bb0b-50316252eb1c" containerName="kubecfg-setup"
Mar 08 03:46:51.412135 master-0 kubenswrapper[4045]: E0308 03:46:51.411959 4045 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bead1d9b-4518-46a9-bb0b-50316252eb1c" containerName="ovn-controller"
Mar 08 03:46:51.412135 master-0 kubenswrapper[4045]: I0308 03:46:51.411968 4045 state_mem.go:107] "Deleted CPUSet assignment" podUID="bead1d9b-4518-46a9-bb0b-50316252eb1c" containerName="ovn-controller"
Mar 08 03:46:51.412135 master-0 kubenswrapper[4045]: E0308 03:46:51.411978 4045 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bead1d9b-4518-46a9-bb0b-50316252eb1c" containerName="kube-rbac-proxy-ovn-metrics"
Mar 08 03:46:51.412135 master-0 kubenswrapper[4045]: I0308 03:46:51.411989 4045 state_mem.go:107] "Deleted CPUSet assignment" podUID="bead1d9b-4518-46a9-bb0b-50316252eb1c" containerName="kube-rbac-proxy-ovn-metrics"
Mar 08 03:46:51.412135 master-0 kubenswrapper[4045]: I0308 03:46:51.412036 4045 memory_manager.go:354] "RemoveStaleState removing state" podUID="bead1d9b-4518-46a9-bb0b-50316252eb1c" containerName="kube-rbac-proxy-node"
Mar 08 03:46:51.412135 master-0 kubenswrapper[4045]: I0308 03:46:51.412046 4045 memory_manager.go:354] "RemoveStaleState removing state" podUID="bead1d9b-4518-46a9-bb0b-50316252eb1c" containerName="northd"
Mar 08 03:46:51.412135 master-0 kubenswrapper[4045]: I0308 03:46:51.412054 4045 memory_manager.go:354] "RemoveStaleState removing state" podUID="bead1d9b-4518-46a9-bb0b-50316252eb1c" containerName="sbdb"
Mar 08 03:46:51.412135 master-0 kubenswrapper[4045]: I0308 03:46:51.412062 4045 memory_manager.go:354] "RemoveStaleState removing state" podUID="bead1d9b-4518-46a9-bb0b-50316252eb1c" containerName="ovnkube-controller"
Mar 08 03:46:51.412135 master-0 kubenswrapper[4045]: I0308 03:46:51.412070 4045 memory_manager.go:354] "RemoveStaleState removing state" podUID="bead1d9b-4518-46a9-bb0b-50316252eb1c" containerName="kube-rbac-proxy-ovn-metrics"
Mar 08 03:46:51.412135 master-0 kubenswrapper[4045]: I0308 03:46:51.412077 4045 memory_manager.go:354] "RemoveStaleState removing state" podUID="bead1d9b-4518-46a9-bb0b-50316252eb1c" containerName="ovn-acl-logging"
Mar 08 03:46:51.412135 master-0 kubenswrapper[4045]: I0308 03:46:51.412085 4045 memory_manager.go:354] "RemoveStaleState removing state" podUID="bead1d9b-4518-46a9-bb0b-50316252eb1c" containerName="nbdb"
Mar 08 03:46:51.412135 master-0 kubenswrapper[4045]: I0308 03:46:51.412095 4045 memory_manager.go:354] "RemoveStaleState removing state" podUID="bead1d9b-4518-46a9-bb0b-50316252eb1c" containerName="ovn-controller"
Mar 08 03:46:51.412841 master-0 kubenswrapper[4045]: I0308 03:46:51.412795 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:46:51.465910 master-0 kubenswrapper[4045]: I0308 03:46:51.465772 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-host-cni-bin\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:46:51.465910 master-0 kubenswrapper[4045]: I0308 03:46:51.465889 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-ovn-node-metrics-cert\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:46:51.466085 master-0 kubenswrapper[4045]: I0308 03:46:51.465967 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-host-run-ovn-kubernetes\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:46:51.466085 master-0 kubenswrapper[4045]: I0308 03:46:51.466002 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-log-socket\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:46:51.466085 master-0 kubenswrapper[4045]: I0308 03:46:51.466035 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-host-kubelet\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:46:51.466085 master-0 kubenswrapper[4045]: I0308 03:46:51.466069 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-systemd-units\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:46:51.466183 master-0 kubenswrapper[4045]: I0308 03:46:51.466112 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-host-run-netns\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:46:51.466183 master-0 kubenswrapper[4045]: I0308 03:46:51.466143 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-ovnkube-config\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:46:51.466232 master-0 kubenswrapper[4045]: I0308 03:46:51.466205 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-env-overrides\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:46:51.466256 master-0 kubenswrapper[4045]: I0308 03:46:51.466242 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vklx\" (UniqueName: \"kubernetes.io/projected/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-kube-api-access-2vklx\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:46:51.466289 master-0 kubenswrapper[4045]: I0308 03:46:51.466270 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-run-systemd\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:46:51.466289 master-0 kubenswrapper[4045]: I0308 03:46:51.466284 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-etc-openvswitch\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:46:51.466337 master-0 kubenswrapper[4045]: I0308 03:46:51.466315 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:46:51.466451 master-0 kubenswrapper[4045]: I0308 03:46:51.466392 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-host-slash\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:46:51.466497 master-0 kubenswrapper[4045]: I0308 03:46:51.466477 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-run-openvswitch\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:46:51.466553 master-0 kubenswrapper[4045]: I0308 03:46:51.466523 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-run-ovn\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:46:51.466604 master-0 kubenswrapper[4045]: I0308 03:46:51.466579 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-var-lib-openvswitch\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:46:51.466634 master-0 kubenswrapper[4045]: I0308 03:46:51.466620 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-host-cni-netd\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:46:51.466676 master-0 kubenswrapper[4045]: I0308 03:46:51.466653 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-ovnkube-script-lib\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:46:51.466747 master-0 kubenswrapper[4045]: I0308 03:46:51.466714 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-node-log\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:46:51.466800 master-0 kubenswrapper[4045]: I0308 03:46:51.466780 4045 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/bead1d9b-4518-46a9-bb0b-50316252eb1c-run-systemd\") on node \"master-0\" DevicePath \"\""
Mar 08 03:46:51.466856 master-0 kubenswrapper[4045]: I0308 03:46:51.466808 4045 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k967k\" (UniqueName: \"kubernetes.io/projected/bead1d9b-4518-46a9-bb0b-50316252eb1c-kube-api-access-k967k\") on node \"master-0\" DevicePath \"\""
Mar 08 03:46:51.466886 master-0 kubenswrapper[4045]: I0308 03:46:51.466867 4045 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bead1d9b-4518-46a9-bb0b-50316252eb1c-ovn-node-metrics-cert\") on node \"master-0\" DevicePath \"\""
Mar 08 03:46:51.466919 master-0 kubenswrapper[4045]: I0308 03:46:51.466889 4045 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bead1d9b-4518-46a9-bb0b-50316252eb1c-ovnkube-script-lib\") on node \"master-0\" DevicePath \"\""
Mar 08 03:46:51.466919 master-0 kubenswrapper[4045]: I0308 03:46:51.466907 4045 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bead1d9b-4518-46a9-bb0b-50316252eb1c-env-overrides\") on node \"master-0\" DevicePath \"\""
Mar 08 03:46:51.567284 master-0 kubenswrapper[4045]: I0308 03:46:51.567217 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-node-log\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:46:51.567284 master-0 kubenswrapper[4045]: I0308 03:46:51.567278 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-host-cni-bin\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:46:51.567493 master-0 kubenswrapper[4045]: I0308 03:46:51.567401 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-node-log\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:46:51.567532 master-0 kubenswrapper[4045]: I0308 03:46:51.567492 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-ovn-node-metrics-cert\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:46:51.567734 master-0 kubenswrapper[4045]: I0308 03:46:51.567697 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-host-cni-bin\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:46:51.567812 master-0 kubenswrapper[4045]: I0308 03:46:51.567776 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-host-run-ovn-kubernetes\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:46:51.567935 master-0 kubenswrapper[4045]: I0308 03:46:51.567900 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-log-socket\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:46:51.567935 master-0 kubenswrapper[4045]: I0308 03:46:51.567913 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-host-run-ovn-kubernetes\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:46:51.567989 master-0 kubenswrapper[4045]: I0308 03:46:51.567951 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-host-kubelet\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:46:51.567989 master-0 kubenswrapper[4045]: I0308 03:46:51.567982 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-systemd-units\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:46:51.568149 master-0 kubenswrapper[4045]: I0308 03:46:51.568109 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-host-kubelet\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:46:51.568192 master-0 kubenswrapper[4045]: I0308 03:46:51.568115 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-log-socket\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:46:51.568192 master-0 kubenswrapper[4045]: I0308 03:46:51.568017 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-host-run-netns\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:46:51.568248 master-0 kubenswrapper[4045]: I0308 03:46:51.568196 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-systemd-units\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:46:51.568277 master-0 kubenswrapper[4045]: I0308 03:46:51.568246 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-host-run-netns\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:46:51.568277 master-0 kubenswrapper[4045]: I0308 03:46:51.568245 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-ovnkube-config\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:46:51.568526 master-0 kubenswrapper[4045]: I0308 03:46:51.568481 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-env-overrides\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:46:51.568700 master-0 kubenswrapper[4045]: I0308 03:46:51.568652 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vklx\" (UniqueName: \"kubernetes.io/projected/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-kube-api-access-2vklx\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:46:51.568762 master-0 kubenswrapper[4045]: I0308 03:46:51.568739 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-etc-openvswitch\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:46:51.568847 master-0 kubenswrapper[4045]: I0308 03:46:51.568807 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-run-systemd\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:46:51.568899 master-0 kubenswrapper[4045]: I0308 03:46:51.568871 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-host-slash\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:46:51.568939 master-0 kubenswrapper[4045]: I0308 03:46:51.568907 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-run-openvswitch\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:46:51.568979 master-0 kubenswrapper[4045]: I0308 03:46:51.568962 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:46:51.569035 master-0 kubenswrapper[4045]: I0308 03:46:51.569010 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-run-ovn\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:46:51.569083 master-0 kubenswrapper[4045]: I0308 03:46:51.569059 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-var-lib-openvswitch\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:46:51.569127 master-0 kubenswrapper[4045]: I0308 03:46:51.569066 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-etc-openvswitch\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:46:51.569127 master-0 kubenswrapper[4045]: I0308 03:46:51.569094 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-host-cni-netd\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:46:51.569127 master-0 kubenswrapper[4045]: I0308 03:46:51.569100 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-host-slash\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:46:51.569237 master-0 kubenswrapper[4045]: I0308 03:46:51.569007 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-run-systemd\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:46:51.569237 master-0 kubenswrapper[4045]: I0308 03:46:51.569214 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-run-openvswitch\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:46:51.569298 master-0 kubenswrapper[4045]: I0308 03:46:51.569248 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-host-cni-netd\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:46:51.569298 master-0 kubenswrapper[4045]: I0308 03:46:51.569221 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-var-lib-openvswitch\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:46:51.569364 master-0 kubenswrapper[4045]: I0308 03:46:51.569290 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-run-ovn\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:46:51.569364 master-0 kubenswrapper[4045]: I0308 03:46:51.569296 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-ovnkube-script-lib\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:46:51.569438 master-0 kubenswrapper[4045]: I0308 03:46:51.569374 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\")
" pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:46:51.569541 master-0 kubenswrapper[4045]: I0308 03:46:51.569501 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-env-overrides\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:46:51.569658 master-0 kubenswrapper[4045]: I0308 03:46:51.569626 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-ovnkube-config\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:46:51.570395 master-0 kubenswrapper[4045]: I0308 03:46:51.570348 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-ovnkube-script-lib\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:46:51.572866 master-0 kubenswrapper[4045]: I0308 03:46:51.572818 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-ovn-node-metrics-cert\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:46:51.587932 master-0 kubenswrapper[4045]: I0308 03:46:51.587898 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vklx\" (UniqueName: \"kubernetes.io/projected/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-kube-api-access-2vklx\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:46:51.633775 master-0 kubenswrapper[4045]: I0308 03:46:51.633722 4045 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-77cnk_bead1d9b-4518-46a9-bb0b-50316252eb1c/ovnkube-controller/0.log" Mar 08 03:46:51.635961 master-0 kubenswrapper[4045]: I0308 03:46:51.635940 4045 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-77cnk_bead1d9b-4518-46a9-bb0b-50316252eb1c/kube-rbac-proxy-ovn-metrics/0.log" Mar 08 03:46:51.636528 master-0 kubenswrapper[4045]: I0308 03:46:51.636499 4045 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-77cnk_bead1d9b-4518-46a9-bb0b-50316252eb1c/kube-rbac-proxy-node/0.log" Mar 08 03:46:51.637121 master-0 kubenswrapper[4045]: I0308 03:46:51.637105 4045 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-77cnk_bead1d9b-4518-46a9-bb0b-50316252eb1c/ovn-acl-logging/0.log" Mar 08 03:46:51.637934 master-0 kubenswrapper[4045]: I0308 03:46:51.637902 4045 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-77cnk_bead1d9b-4518-46a9-bb0b-50316252eb1c/ovn-controller/0.log" Mar 08 03:46:51.638456 master-0 kubenswrapper[4045]: I0308 03:46:51.638433 4045 generic.go:334] "Generic (PLEG): container finished" podID="bead1d9b-4518-46a9-bb0b-50316252eb1c" containerID="3f44665a0e4601a97013bbddc2ac5c3b1b785966923a1d4ccdeea57349497e62" exitCode=1 Mar 08 03:46:51.638491 master-0 kubenswrapper[4045]: I0308 03:46:51.638457 4045 generic.go:334] "Generic (PLEG): container finished" podID="bead1d9b-4518-46a9-bb0b-50316252eb1c" containerID="0c56e83d2b8fbc6b8ea63ba1babdcd52ef268c17322e406b122313f96a406043" exitCode=0 Mar 08 03:46:51.638491 master-0 kubenswrapper[4045]: I0308 03:46:51.638465 4045 generic.go:334] "Generic (PLEG): container finished" podID="bead1d9b-4518-46a9-bb0b-50316252eb1c" 
containerID="4d481f1172ff59ed351fa2a42b4d240305d289745eaec056f759d0eb06f2433d" exitCode=0 Mar 08 03:46:51.638491 master-0 kubenswrapper[4045]: I0308 03:46:51.638473 4045 generic.go:334] "Generic (PLEG): container finished" podID="bead1d9b-4518-46a9-bb0b-50316252eb1c" containerID="added2934331879c4be6e788743b0207f2205fcc24178ac7510edf4a1e18f2c1" exitCode=0 Mar 08 03:46:51.638491 master-0 kubenswrapper[4045]: I0308 03:46:51.638480 4045 generic.go:334] "Generic (PLEG): container finished" podID="bead1d9b-4518-46a9-bb0b-50316252eb1c" containerID="c3bba27bb7d1e48a9e3c0a737f52dc765a3c2662df797adf3042f2f12df4fbe1" exitCode=143 Mar 08 03:46:51.638491 master-0 kubenswrapper[4045]: I0308 03:46:51.638487 4045 generic.go:334] "Generic (PLEG): container finished" podID="bead1d9b-4518-46a9-bb0b-50316252eb1c" containerID="8f2375b1b30b12e920ef023118d16bc0d52d883528b42117bf2fe57d5e36d582" exitCode=143 Mar 08 03:46:51.638612 master-0 kubenswrapper[4045]: I0308 03:46:51.638493 4045 generic.go:334] "Generic (PLEG): container finished" podID="bead1d9b-4518-46a9-bb0b-50316252eb1c" containerID="162bfc7ea9c50f70286b0884bf41923bf20ff8ea7664c8a71743a1c7c8573a1c" exitCode=143 Mar 08 03:46:51.638612 master-0 kubenswrapper[4045]: I0308 03:46:51.638482 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-77cnk" event={"ID":"bead1d9b-4518-46a9-bb0b-50316252eb1c","Type":"ContainerDied","Data":"3f44665a0e4601a97013bbddc2ac5c3b1b785966923a1d4ccdeea57349497e62"} Mar 08 03:46:51.638612 master-0 kubenswrapper[4045]: I0308 03:46:51.638557 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-77cnk" event={"ID":"bead1d9b-4518-46a9-bb0b-50316252eb1c","Type":"ContainerDied","Data":"0c56e83d2b8fbc6b8ea63ba1babdcd52ef268c17322e406b122313f96a406043"} Mar 08 03:46:51.638612 master-0 kubenswrapper[4045]: I0308 03:46:51.638582 4045 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-77cnk" Mar 08 03:46:51.638612 master-0 kubenswrapper[4045]: I0308 03:46:51.638601 4045 scope.go:117] "RemoveContainer" containerID="3f44665a0e4601a97013bbddc2ac5c3b1b785966923a1d4ccdeea57349497e62" Mar 08 03:46:51.638729 master-0 kubenswrapper[4045]: I0308 03:46:51.638502 4045 generic.go:334] "Generic (PLEG): container finished" podID="bead1d9b-4518-46a9-bb0b-50316252eb1c" containerID="b5c04e0d909b0e59758f566b9d8ba4fceeebd98c7282262f63efcc5e2e1556aa" exitCode=143 Mar 08 03:46:51.638729 master-0 kubenswrapper[4045]: I0308 03:46:51.638582 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-77cnk" event={"ID":"bead1d9b-4518-46a9-bb0b-50316252eb1c","Type":"ContainerDied","Data":"4d481f1172ff59ed351fa2a42b4d240305d289745eaec056f759d0eb06f2433d"} Mar 08 03:46:51.638778 master-0 kubenswrapper[4045]: I0308 03:46:51.638742 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-77cnk" event={"ID":"bead1d9b-4518-46a9-bb0b-50316252eb1c","Type":"ContainerDied","Data":"added2934331879c4be6e788743b0207f2205fcc24178ac7510edf4a1e18f2c1"} Mar 08 03:46:51.638778 master-0 kubenswrapper[4045]: I0308 03:46:51.638760 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-77cnk" event={"ID":"bead1d9b-4518-46a9-bb0b-50316252eb1c","Type":"ContainerDied","Data":"c3bba27bb7d1e48a9e3c0a737f52dc765a3c2662df797adf3042f2f12df4fbe1"} Mar 08 03:46:51.638778 master-0 kubenswrapper[4045]: I0308 03:46:51.638770 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-77cnk" event={"ID":"bead1d9b-4518-46a9-bb0b-50316252eb1c","Type":"ContainerDied","Data":"8f2375b1b30b12e920ef023118d16bc0d52d883528b42117bf2fe57d5e36d582"} Mar 08 03:46:51.638903 master-0 kubenswrapper[4045]: I0308 03:46:51.638780 4045 pod_container_deletor.go:114] "Failed to issue the request to 
remove container" containerID={"Type":"cri-o","ID":"162bfc7ea9c50f70286b0884bf41923bf20ff8ea7664c8a71743a1c7c8573a1c"} Mar 08 03:46:51.638954 master-0 kubenswrapper[4045]: I0308 03:46:51.638904 4045 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b5c04e0d909b0e59758f566b9d8ba4fceeebd98c7282262f63efcc5e2e1556aa"} Mar 08 03:46:51.638954 master-0 kubenswrapper[4045]: I0308 03:46:51.638910 4045 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a7aef1217ff8dd575e82935ee2c21e6786c8c9c8970173f89e65462060324295"} Mar 08 03:46:51.638954 master-0 kubenswrapper[4045]: I0308 03:46:51.638919 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-77cnk" event={"ID":"bead1d9b-4518-46a9-bb0b-50316252eb1c","Type":"ContainerDied","Data":"162bfc7ea9c50f70286b0884bf41923bf20ff8ea7664c8a71743a1c7c8573a1c"} Mar 08 03:46:51.638954 master-0 kubenswrapper[4045]: I0308 03:46:51.638927 4045 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3f44665a0e4601a97013bbddc2ac5c3b1b785966923a1d4ccdeea57349497e62"} Mar 08 03:46:51.638954 master-0 kubenswrapper[4045]: I0308 03:46:51.638934 4045 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0c56e83d2b8fbc6b8ea63ba1babdcd52ef268c17322e406b122313f96a406043"} Mar 08 03:46:51.638954 master-0 kubenswrapper[4045]: I0308 03:46:51.638942 4045 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4d481f1172ff59ed351fa2a42b4d240305d289745eaec056f759d0eb06f2433d"} Mar 08 03:46:51.639083 master-0 kubenswrapper[4045]: I0308 03:46:51.638947 4045 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"added2934331879c4be6e788743b0207f2205fcc24178ac7510edf4a1e18f2c1"} Mar 08 03:46:51.639083 master-0 kubenswrapper[4045]: I0308 03:46:51.638971 4045 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c3bba27bb7d1e48a9e3c0a737f52dc765a3c2662df797adf3042f2f12df4fbe1"} Mar 08 03:46:51.639083 master-0 kubenswrapper[4045]: I0308 03:46:51.638976 4045 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8f2375b1b30b12e920ef023118d16bc0d52d883528b42117bf2fe57d5e36d582"} Mar 08 03:46:51.639083 master-0 kubenswrapper[4045]: I0308 03:46:51.638981 4045 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"162bfc7ea9c50f70286b0884bf41923bf20ff8ea7664c8a71743a1c7c8573a1c"} Mar 08 03:46:51.639083 master-0 kubenswrapper[4045]: I0308 03:46:51.638986 4045 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b5c04e0d909b0e59758f566b9d8ba4fceeebd98c7282262f63efcc5e2e1556aa"} Mar 08 03:46:51.639083 master-0 kubenswrapper[4045]: I0308 03:46:51.638991 4045 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a7aef1217ff8dd575e82935ee2c21e6786c8c9c8970173f89e65462060324295"} Mar 08 03:46:51.639083 master-0 kubenswrapper[4045]: I0308 03:46:51.639000 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-77cnk" event={"ID":"bead1d9b-4518-46a9-bb0b-50316252eb1c","Type":"ContainerDied","Data":"b5c04e0d909b0e59758f566b9d8ba4fceeebd98c7282262f63efcc5e2e1556aa"} Mar 08 03:46:51.639083 master-0 kubenswrapper[4045]: I0308 03:46:51.639011 4045 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3f44665a0e4601a97013bbddc2ac5c3b1b785966923a1d4ccdeea57349497e62"} Mar 
08 03:46:51.639083 master-0 kubenswrapper[4045]: I0308 03:46:51.639017 4045 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0c56e83d2b8fbc6b8ea63ba1babdcd52ef268c17322e406b122313f96a406043"} Mar 08 03:46:51.639083 master-0 kubenswrapper[4045]: I0308 03:46:51.639064 4045 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4d481f1172ff59ed351fa2a42b4d240305d289745eaec056f759d0eb06f2433d"} Mar 08 03:46:51.639083 master-0 kubenswrapper[4045]: I0308 03:46:51.639072 4045 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"added2934331879c4be6e788743b0207f2205fcc24178ac7510edf4a1e18f2c1"} Mar 08 03:46:51.639083 master-0 kubenswrapper[4045]: I0308 03:46:51.639077 4045 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c3bba27bb7d1e48a9e3c0a737f52dc765a3c2662df797adf3042f2f12df4fbe1"} Mar 08 03:46:51.639083 master-0 kubenswrapper[4045]: I0308 03:46:51.639082 4045 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8f2375b1b30b12e920ef023118d16bc0d52d883528b42117bf2fe57d5e36d582"} Mar 08 03:46:51.639083 master-0 kubenswrapper[4045]: I0308 03:46:51.639087 4045 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"162bfc7ea9c50f70286b0884bf41923bf20ff8ea7664c8a71743a1c7c8573a1c"} Mar 08 03:46:51.639593 master-0 kubenswrapper[4045]: I0308 03:46:51.639092 4045 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b5c04e0d909b0e59758f566b9d8ba4fceeebd98c7282262f63efcc5e2e1556aa"} Mar 08 03:46:51.639593 master-0 kubenswrapper[4045]: I0308 03:46:51.639097 4045 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"a7aef1217ff8dd575e82935ee2c21e6786c8c9c8970173f89e65462060324295"} Mar 08 03:46:51.639593 master-0 kubenswrapper[4045]: I0308 03:46:51.639105 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-77cnk" event={"ID":"bead1d9b-4518-46a9-bb0b-50316252eb1c","Type":"ContainerDied","Data":"bf7e5b799b502d9b50bfc698c644dababdb09d8ea5fe3988d4811d2c5a723d8c"} Mar 08 03:46:51.639593 master-0 kubenswrapper[4045]: I0308 03:46:51.639114 4045 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3f44665a0e4601a97013bbddc2ac5c3b1b785966923a1d4ccdeea57349497e62"} Mar 08 03:46:51.639593 master-0 kubenswrapper[4045]: I0308 03:46:51.639120 4045 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0c56e83d2b8fbc6b8ea63ba1babdcd52ef268c17322e406b122313f96a406043"} Mar 08 03:46:51.639593 master-0 kubenswrapper[4045]: I0308 03:46:51.639125 4045 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4d481f1172ff59ed351fa2a42b4d240305d289745eaec056f759d0eb06f2433d"} Mar 08 03:46:51.639593 master-0 kubenswrapper[4045]: I0308 03:46:51.639129 4045 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"added2934331879c4be6e788743b0207f2205fcc24178ac7510edf4a1e18f2c1"} Mar 08 03:46:51.639593 master-0 kubenswrapper[4045]: I0308 03:46:51.639134 4045 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c3bba27bb7d1e48a9e3c0a737f52dc765a3c2662df797adf3042f2f12df4fbe1"} Mar 08 03:46:51.639593 master-0 kubenswrapper[4045]: I0308 03:46:51.639140 4045 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8f2375b1b30b12e920ef023118d16bc0d52d883528b42117bf2fe57d5e36d582"} Mar 
08 03:46:51.639593 master-0 kubenswrapper[4045]: I0308 03:46:51.639145 4045 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"162bfc7ea9c50f70286b0884bf41923bf20ff8ea7664c8a71743a1c7c8573a1c"} Mar 08 03:46:51.639593 master-0 kubenswrapper[4045]: I0308 03:46:51.639186 4045 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b5c04e0d909b0e59758f566b9d8ba4fceeebd98c7282262f63efcc5e2e1556aa"} Mar 08 03:46:51.639593 master-0 kubenswrapper[4045]: I0308 03:46:51.639195 4045 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a7aef1217ff8dd575e82935ee2c21e6786c8c9c8970173f89e65462060324295"} Mar 08 03:46:51.673564 master-0 kubenswrapper[4045]: I0308 03:46:51.673514 4045 scope.go:117] "RemoveContainer" containerID="0c56e83d2b8fbc6b8ea63ba1babdcd52ef268c17322e406b122313f96a406043" Mar 08 03:46:51.693733 master-0 kubenswrapper[4045]: I0308 03:46:51.690481 4045 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-77cnk"] Mar 08 03:46:51.694673 master-0 kubenswrapper[4045]: I0308 03:46:51.694620 4045 scope.go:117] "RemoveContainer" containerID="4d481f1172ff59ed351fa2a42b4d240305d289745eaec056f759d0eb06f2433d" Mar 08 03:46:51.701073 master-0 kubenswrapper[4045]: I0308 03:46:51.701030 4045 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-77cnk"] Mar 08 03:46:51.714500 master-0 kubenswrapper[4045]: I0308 03:46:51.714457 4045 scope.go:117] "RemoveContainer" containerID="added2934331879c4be6e788743b0207f2205fcc24178ac7510edf4a1e18f2c1" Mar 08 03:46:51.728371 master-0 kubenswrapper[4045]: I0308 03:46:51.728334 4045 scope.go:117] "RemoveContainer" containerID="c3bba27bb7d1e48a9e3c0a737f52dc765a3c2662df797adf3042f2f12df4fbe1" Mar 08 03:46:51.733915 master-0 kubenswrapper[4045]: I0308 03:46:51.733883 4045 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:46:51.745792 master-0 kubenswrapper[4045]: I0308 03:46:51.745748 4045 scope.go:117] "RemoveContainer" containerID="8f2375b1b30b12e920ef023118d16bc0d52d883528b42117bf2fe57d5e36d582" Mar 08 03:46:51.754032 master-0 kubenswrapper[4045]: W0308 03:46:51.753979 4045 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0f5f3f3_0856_4da3_9157_15f65c6aba6e.slice/crio-e8733f46dd1d2647e586c0cc9b5a4ebea38d695f856a8c74190015b70d99a33e WatchSource:0}: Error finding container e8733f46dd1d2647e586c0cc9b5a4ebea38d695f856a8c74190015b70d99a33e: Status 404 returned error can't find the container with id e8733f46dd1d2647e586c0cc9b5a4ebea38d695f856a8c74190015b70d99a33e Mar 08 03:46:51.779277 master-0 kubenswrapper[4045]: I0308 03:46:51.779246 4045 scope.go:117] "RemoveContainer" containerID="162bfc7ea9c50f70286b0884bf41923bf20ff8ea7664c8a71743a1c7c8573a1c" Mar 08 03:46:51.789706 master-0 kubenswrapper[4045]: I0308 03:46:51.789660 4045 scope.go:117] "RemoveContainer" containerID="b5c04e0d909b0e59758f566b9d8ba4fceeebd98c7282262f63efcc5e2e1556aa" Mar 08 03:46:51.802448 master-0 kubenswrapper[4045]: I0308 03:46:51.802406 4045 scope.go:117] "RemoveContainer" containerID="a7aef1217ff8dd575e82935ee2c21e6786c8c9c8970173f89e65462060324295" Mar 08 03:46:51.819947 master-0 kubenswrapper[4045]: I0308 03:46:51.819915 4045 scope.go:117] "RemoveContainer" containerID="3f44665a0e4601a97013bbddc2ac5c3b1b785966923a1d4ccdeea57349497e62" Mar 08 03:46:51.820418 master-0 kubenswrapper[4045]: E0308 03:46:51.820368 4045 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f44665a0e4601a97013bbddc2ac5c3b1b785966923a1d4ccdeea57349497e62\": container with ID starting with 3f44665a0e4601a97013bbddc2ac5c3b1b785966923a1d4ccdeea57349497e62 not found: ID 
does not exist" containerID="3f44665a0e4601a97013bbddc2ac5c3b1b785966923a1d4ccdeea57349497e62" Mar 08 03:46:51.820456 master-0 kubenswrapper[4045]: I0308 03:46:51.820422 4045 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f44665a0e4601a97013bbddc2ac5c3b1b785966923a1d4ccdeea57349497e62"} err="failed to get container status \"3f44665a0e4601a97013bbddc2ac5c3b1b785966923a1d4ccdeea57349497e62\": rpc error: code = NotFound desc = could not find container \"3f44665a0e4601a97013bbddc2ac5c3b1b785966923a1d4ccdeea57349497e62\": container with ID starting with 3f44665a0e4601a97013bbddc2ac5c3b1b785966923a1d4ccdeea57349497e62 not found: ID does not exist" Mar 08 03:46:51.820490 master-0 kubenswrapper[4045]: I0308 03:46:51.820460 4045 scope.go:117] "RemoveContainer" containerID="0c56e83d2b8fbc6b8ea63ba1babdcd52ef268c17322e406b122313f96a406043" Mar 08 03:46:51.821219 master-0 kubenswrapper[4045]: E0308 03:46:51.821170 4045 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c56e83d2b8fbc6b8ea63ba1babdcd52ef268c17322e406b122313f96a406043\": container with ID starting with 0c56e83d2b8fbc6b8ea63ba1babdcd52ef268c17322e406b122313f96a406043 not found: ID does not exist" containerID="0c56e83d2b8fbc6b8ea63ba1babdcd52ef268c17322e406b122313f96a406043" Mar 08 03:46:51.821266 master-0 kubenswrapper[4045]: I0308 03:46:51.821216 4045 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c56e83d2b8fbc6b8ea63ba1babdcd52ef268c17322e406b122313f96a406043"} err="failed to get container status \"0c56e83d2b8fbc6b8ea63ba1babdcd52ef268c17322e406b122313f96a406043\": rpc error: code = NotFound desc = could not find container \"0c56e83d2b8fbc6b8ea63ba1babdcd52ef268c17322e406b122313f96a406043\": container with ID starting with 0c56e83d2b8fbc6b8ea63ba1babdcd52ef268c17322e406b122313f96a406043 not found: ID does not exist" Mar 08 03:46:51.821266 
master-0 kubenswrapper[4045]: I0308 03:46:51.821244 4045 scope.go:117] "RemoveContainer" containerID="4d481f1172ff59ed351fa2a42b4d240305d289745eaec056f759d0eb06f2433d" Mar 08 03:46:51.821685 master-0 kubenswrapper[4045]: E0308 03:46:51.821644 4045 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d481f1172ff59ed351fa2a42b4d240305d289745eaec056f759d0eb06f2433d\": container with ID starting with 4d481f1172ff59ed351fa2a42b4d240305d289745eaec056f759d0eb06f2433d not found: ID does not exist" containerID="4d481f1172ff59ed351fa2a42b4d240305d289745eaec056f759d0eb06f2433d" Mar 08 03:46:51.821733 master-0 kubenswrapper[4045]: I0308 03:46:51.821683 4045 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d481f1172ff59ed351fa2a42b4d240305d289745eaec056f759d0eb06f2433d"} err="failed to get container status \"4d481f1172ff59ed351fa2a42b4d240305d289745eaec056f759d0eb06f2433d\": rpc error: code = NotFound desc = could not find container \"4d481f1172ff59ed351fa2a42b4d240305d289745eaec056f759d0eb06f2433d\": container with ID starting with 4d481f1172ff59ed351fa2a42b4d240305d289745eaec056f759d0eb06f2433d not found: ID does not exist" Mar 08 03:46:51.821733 master-0 kubenswrapper[4045]: I0308 03:46:51.821707 4045 scope.go:117] "RemoveContainer" containerID="added2934331879c4be6e788743b0207f2205fcc24178ac7510edf4a1e18f2c1" Mar 08 03:46:51.822336 master-0 kubenswrapper[4045]: E0308 03:46:51.822289 4045 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"added2934331879c4be6e788743b0207f2205fcc24178ac7510edf4a1e18f2c1\": container with ID starting with added2934331879c4be6e788743b0207f2205fcc24178ac7510edf4a1e18f2c1 not found: ID does not exist" containerID="added2934331879c4be6e788743b0207f2205fcc24178ac7510edf4a1e18f2c1" Mar 08 03:46:51.822390 master-0 kubenswrapper[4045]: I0308 03:46:51.822333 4045 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"added2934331879c4be6e788743b0207f2205fcc24178ac7510edf4a1e18f2c1"} err="failed to get container status \"added2934331879c4be6e788743b0207f2205fcc24178ac7510edf4a1e18f2c1\": rpc error: code = NotFound desc = could not find container \"added2934331879c4be6e788743b0207f2205fcc24178ac7510edf4a1e18f2c1\": container with ID starting with added2934331879c4be6e788743b0207f2205fcc24178ac7510edf4a1e18f2c1 not found: ID does not exist" Mar 08 03:46:51.822390 master-0 kubenswrapper[4045]: I0308 03:46:51.822361 4045 scope.go:117] "RemoveContainer" containerID="c3bba27bb7d1e48a9e3c0a737f52dc765a3c2662df797adf3042f2f12df4fbe1" Mar 08 03:46:51.822769 master-0 kubenswrapper[4045]: E0308 03:46:51.822718 4045 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3bba27bb7d1e48a9e3c0a737f52dc765a3c2662df797adf3042f2f12df4fbe1\": container with ID starting with c3bba27bb7d1e48a9e3c0a737f52dc765a3c2662df797adf3042f2f12df4fbe1 not found: ID does not exist" containerID="c3bba27bb7d1e48a9e3c0a737f52dc765a3c2662df797adf3042f2f12df4fbe1" Mar 08 03:46:51.822817 master-0 kubenswrapper[4045]: I0308 03:46:51.822766 4045 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3bba27bb7d1e48a9e3c0a737f52dc765a3c2662df797adf3042f2f12df4fbe1"} err="failed to get container status \"c3bba27bb7d1e48a9e3c0a737f52dc765a3c2662df797adf3042f2f12df4fbe1\": rpc error: code = NotFound desc = could not find container \"c3bba27bb7d1e48a9e3c0a737f52dc765a3c2662df797adf3042f2f12df4fbe1\": container with ID starting with c3bba27bb7d1e48a9e3c0a737f52dc765a3c2662df797adf3042f2f12df4fbe1 not found: ID does not exist" Mar 08 03:46:51.822817 master-0 kubenswrapper[4045]: I0308 03:46:51.822799 4045 scope.go:117] "RemoveContainer" containerID="8f2375b1b30b12e920ef023118d16bc0d52d883528b42117bf2fe57d5e36d582" Mar 08 
03:46:51.823295 master-0 kubenswrapper[4045]: E0308 03:46:51.823258 4045 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f2375b1b30b12e920ef023118d16bc0d52d883528b42117bf2fe57d5e36d582\": container with ID starting with 8f2375b1b30b12e920ef023118d16bc0d52d883528b42117bf2fe57d5e36d582 not found: ID does not exist" containerID="8f2375b1b30b12e920ef023118d16bc0d52d883528b42117bf2fe57d5e36d582" Mar 08 03:46:51.823295 master-0 kubenswrapper[4045]: I0308 03:46:51.823284 4045 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f2375b1b30b12e920ef023118d16bc0d52d883528b42117bf2fe57d5e36d582"} err="failed to get container status \"8f2375b1b30b12e920ef023118d16bc0d52d883528b42117bf2fe57d5e36d582\": rpc error: code = NotFound desc = could not find container \"8f2375b1b30b12e920ef023118d16bc0d52d883528b42117bf2fe57d5e36d582\": container with ID starting with 8f2375b1b30b12e920ef023118d16bc0d52d883528b42117bf2fe57d5e36d582 not found: ID does not exist" Mar 08 03:46:51.823375 master-0 kubenswrapper[4045]: I0308 03:46:51.823300 4045 scope.go:117] "RemoveContainer" containerID="162bfc7ea9c50f70286b0884bf41923bf20ff8ea7664c8a71743a1c7c8573a1c" Mar 08 03:46:51.823609 master-0 kubenswrapper[4045]: E0308 03:46:51.823560 4045 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"162bfc7ea9c50f70286b0884bf41923bf20ff8ea7664c8a71743a1c7c8573a1c\": container with ID starting with 162bfc7ea9c50f70286b0884bf41923bf20ff8ea7664c8a71743a1c7c8573a1c not found: ID does not exist" containerID="162bfc7ea9c50f70286b0884bf41923bf20ff8ea7664c8a71743a1c7c8573a1c" Mar 08 03:46:51.823659 master-0 kubenswrapper[4045]: I0308 03:46:51.823604 4045 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"162bfc7ea9c50f70286b0884bf41923bf20ff8ea7664c8a71743a1c7c8573a1c"} err="failed 
to get container status \"162bfc7ea9c50f70286b0884bf41923bf20ff8ea7664c8a71743a1c7c8573a1c\": rpc error: code = NotFound desc = could not find container \"162bfc7ea9c50f70286b0884bf41923bf20ff8ea7664c8a71743a1c7c8573a1c\": container with ID starting with 162bfc7ea9c50f70286b0884bf41923bf20ff8ea7664c8a71743a1c7c8573a1c not found: ID does not exist" Mar 08 03:46:51.823659 master-0 kubenswrapper[4045]: I0308 03:46:51.823633 4045 scope.go:117] "RemoveContainer" containerID="b5c04e0d909b0e59758f566b9d8ba4fceeebd98c7282262f63efcc5e2e1556aa" Mar 08 03:46:51.824277 master-0 kubenswrapper[4045]: E0308 03:46:51.824235 4045 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5c04e0d909b0e59758f566b9d8ba4fceeebd98c7282262f63efcc5e2e1556aa\": container with ID starting with b5c04e0d909b0e59758f566b9d8ba4fceeebd98c7282262f63efcc5e2e1556aa not found: ID does not exist" containerID="b5c04e0d909b0e59758f566b9d8ba4fceeebd98c7282262f63efcc5e2e1556aa" Mar 08 03:46:51.824345 master-0 kubenswrapper[4045]: I0308 03:46:51.824272 4045 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5c04e0d909b0e59758f566b9d8ba4fceeebd98c7282262f63efcc5e2e1556aa"} err="failed to get container status \"b5c04e0d909b0e59758f566b9d8ba4fceeebd98c7282262f63efcc5e2e1556aa\": rpc error: code = NotFound desc = could not find container \"b5c04e0d909b0e59758f566b9d8ba4fceeebd98c7282262f63efcc5e2e1556aa\": container with ID starting with b5c04e0d909b0e59758f566b9d8ba4fceeebd98c7282262f63efcc5e2e1556aa not found: ID does not exist" Mar 08 03:46:51.824345 master-0 kubenswrapper[4045]: I0308 03:46:51.824294 4045 scope.go:117] "RemoveContainer" containerID="a7aef1217ff8dd575e82935ee2c21e6786c8c9c8970173f89e65462060324295" Mar 08 03:46:51.824640 master-0 kubenswrapper[4045]: E0308 03:46:51.824583 4045 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not 
find container \"a7aef1217ff8dd575e82935ee2c21e6786c8c9c8970173f89e65462060324295\": container with ID starting with a7aef1217ff8dd575e82935ee2c21e6786c8c9c8970173f89e65462060324295 not found: ID does not exist" containerID="a7aef1217ff8dd575e82935ee2c21e6786c8c9c8970173f89e65462060324295" Mar 08 03:46:51.824780 master-0 kubenswrapper[4045]: I0308 03:46:51.824697 4045 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7aef1217ff8dd575e82935ee2c21e6786c8c9c8970173f89e65462060324295"} err="failed to get container status \"a7aef1217ff8dd575e82935ee2c21e6786c8c9c8970173f89e65462060324295\": rpc error: code = NotFound desc = could not find container \"a7aef1217ff8dd575e82935ee2c21e6786c8c9c8970173f89e65462060324295\": container with ID starting with a7aef1217ff8dd575e82935ee2c21e6786c8c9c8970173f89e65462060324295 not found: ID does not exist" Mar 08 03:46:51.824841 master-0 kubenswrapper[4045]: I0308 03:46:51.824776 4045 scope.go:117] "RemoveContainer" containerID="3f44665a0e4601a97013bbddc2ac5c3b1b785966923a1d4ccdeea57349497e62" Mar 08 03:46:51.825311 master-0 kubenswrapper[4045]: I0308 03:46:51.825259 4045 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f44665a0e4601a97013bbddc2ac5c3b1b785966923a1d4ccdeea57349497e62"} err="failed to get container status \"3f44665a0e4601a97013bbddc2ac5c3b1b785966923a1d4ccdeea57349497e62\": rpc error: code = NotFound desc = could not find container \"3f44665a0e4601a97013bbddc2ac5c3b1b785966923a1d4ccdeea57349497e62\": container with ID starting with 3f44665a0e4601a97013bbddc2ac5c3b1b785966923a1d4ccdeea57349497e62 not found: ID does not exist" Mar 08 03:46:51.825311 master-0 kubenswrapper[4045]: I0308 03:46:51.825301 4045 scope.go:117] "RemoveContainer" containerID="0c56e83d2b8fbc6b8ea63ba1babdcd52ef268c17322e406b122313f96a406043" Mar 08 03:46:51.825763 master-0 kubenswrapper[4045]: I0308 03:46:51.825707 4045 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c56e83d2b8fbc6b8ea63ba1babdcd52ef268c17322e406b122313f96a406043"} err="failed to get container status \"0c56e83d2b8fbc6b8ea63ba1babdcd52ef268c17322e406b122313f96a406043\": rpc error: code = NotFound desc = could not find container \"0c56e83d2b8fbc6b8ea63ba1babdcd52ef268c17322e406b122313f96a406043\": container with ID starting with 0c56e83d2b8fbc6b8ea63ba1babdcd52ef268c17322e406b122313f96a406043 not found: ID does not exist" Mar 08 03:46:51.825763 master-0 kubenswrapper[4045]: I0308 03:46:51.825757 4045 scope.go:117] "RemoveContainer" containerID="4d481f1172ff59ed351fa2a42b4d240305d289745eaec056f759d0eb06f2433d" Mar 08 03:46:51.826275 master-0 kubenswrapper[4045]: I0308 03:46:51.826236 4045 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d481f1172ff59ed351fa2a42b4d240305d289745eaec056f759d0eb06f2433d"} err="failed to get container status \"4d481f1172ff59ed351fa2a42b4d240305d289745eaec056f759d0eb06f2433d\": rpc error: code = NotFound desc = could not find container \"4d481f1172ff59ed351fa2a42b4d240305d289745eaec056f759d0eb06f2433d\": container with ID starting with 4d481f1172ff59ed351fa2a42b4d240305d289745eaec056f759d0eb06f2433d not found: ID does not exist" Mar 08 03:46:51.826275 master-0 kubenswrapper[4045]: I0308 03:46:51.826270 4045 scope.go:117] "RemoveContainer" containerID="added2934331879c4be6e788743b0207f2205fcc24178ac7510edf4a1e18f2c1" Mar 08 03:46:51.826942 master-0 kubenswrapper[4045]: I0308 03:46:51.826907 4045 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"added2934331879c4be6e788743b0207f2205fcc24178ac7510edf4a1e18f2c1"} err="failed to get container status \"added2934331879c4be6e788743b0207f2205fcc24178ac7510edf4a1e18f2c1\": rpc error: code = NotFound desc = could not find container \"added2934331879c4be6e788743b0207f2205fcc24178ac7510edf4a1e18f2c1\": container with ID starting with 
added2934331879c4be6e788743b0207f2205fcc24178ac7510edf4a1e18f2c1 not found: ID does not exist" Mar 08 03:46:51.826942 master-0 kubenswrapper[4045]: I0308 03:46:51.826935 4045 scope.go:117] "RemoveContainer" containerID="c3bba27bb7d1e48a9e3c0a737f52dc765a3c2662df797adf3042f2f12df4fbe1" Mar 08 03:46:51.827343 master-0 kubenswrapper[4045]: I0308 03:46:51.827308 4045 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3bba27bb7d1e48a9e3c0a737f52dc765a3c2662df797adf3042f2f12df4fbe1"} err="failed to get container status \"c3bba27bb7d1e48a9e3c0a737f52dc765a3c2662df797adf3042f2f12df4fbe1\": rpc error: code = NotFound desc = could not find container \"c3bba27bb7d1e48a9e3c0a737f52dc765a3c2662df797adf3042f2f12df4fbe1\": container with ID starting with c3bba27bb7d1e48a9e3c0a737f52dc765a3c2662df797adf3042f2f12df4fbe1 not found: ID does not exist" Mar 08 03:46:51.827343 master-0 kubenswrapper[4045]: I0308 03:46:51.827334 4045 scope.go:117] "RemoveContainer" containerID="8f2375b1b30b12e920ef023118d16bc0d52d883528b42117bf2fe57d5e36d582" Mar 08 03:46:51.827774 master-0 kubenswrapper[4045]: I0308 03:46:51.827728 4045 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f2375b1b30b12e920ef023118d16bc0d52d883528b42117bf2fe57d5e36d582"} err="failed to get container status \"8f2375b1b30b12e920ef023118d16bc0d52d883528b42117bf2fe57d5e36d582\": rpc error: code = NotFound desc = could not find container \"8f2375b1b30b12e920ef023118d16bc0d52d883528b42117bf2fe57d5e36d582\": container with ID starting with 8f2375b1b30b12e920ef023118d16bc0d52d883528b42117bf2fe57d5e36d582 not found: ID does not exist" Mar 08 03:46:51.827814 master-0 kubenswrapper[4045]: I0308 03:46:51.827772 4045 scope.go:117] "RemoveContainer" containerID="162bfc7ea9c50f70286b0884bf41923bf20ff8ea7664c8a71743a1c7c8573a1c" Mar 08 03:46:51.828361 master-0 kubenswrapper[4045]: I0308 03:46:51.828318 4045 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"162bfc7ea9c50f70286b0884bf41923bf20ff8ea7664c8a71743a1c7c8573a1c"} err="failed to get container status \"162bfc7ea9c50f70286b0884bf41923bf20ff8ea7664c8a71743a1c7c8573a1c\": rpc error: code = NotFound desc = could not find container \"162bfc7ea9c50f70286b0884bf41923bf20ff8ea7664c8a71743a1c7c8573a1c\": container with ID starting with 162bfc7ea9c50f70286b0884bf41923bf20ff8ea7664c8a71743a1c7c8573a1c not found: ID does not exist" Mar 08 03:46:51.828361 master-0 kubenswrapper[4045]: I0308 03:46:51.828355 4045 scope.go:117] "RemoveContainer" containerID="b5c04e0d909b0e59758f566b9d8ba4fceeebd98c7282262f63efcc5e2e1556aa" Mar 08 03:46:51.828788 master-0 kubenswrapper[4045]: I0308 03:46:51.828722 4045 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5c04e0d909b0e59758f566b9d8ba4fceeebd98c7282262f63efcc5e2e1556aa"} err="failed to get container status \"b5c04e0d909b0e59758f566b9d8ba4fceeebd98c7282262f63efcc5e2e1556aa\": rpc error: code = NotFound desc = could not find container \"b5c04e0d909b0e59758f566b9d8ba4fceeebd98c7282262f63efcc5e2e1556aa\": container with ID starting with b5c04e0d909b0e59758f566b9d8ba4fceeebd98c7282262f63efcc5e2e1556aa not found: ID does not exist" Mar 08 03:46:51.828788 master-0 kubenswrapper[4045]: I0308 03:46:51.828780 4045 scope.go:117] "RemoveContainer" containerID="a7aef1217ff8dd575e82935ee2c21e6786c8c9c8970173f89e65462060324295" Mar 08 03:46:51.829314 master-0 kubenswrapper[4045]: I0308 03:46:51.829270 4045 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7aef1217ff8dd575e82935ee2c21e6786c8c9c8970173f89e65462060324295"} err="failed to get container status \"a7aef1217ff8dd575e82935ee2c21e6786c8c9c8970173f89e65462060324295\": rpc error: code = NotFound desc = could not find container \"a7aef1217ff8dd575e82935ee2c21e6786c8c9c8970173f89e65462060324295\": container with ID starting with 
a7aef1217ff8dd575e82935ee2c21e6786c8c9c8970173f89e65462060324295 not found: ID does not exist" Mar 08 03:46:51.829314 master-0 kubenswrapper[4045]: I0308 03:46:51.829307 4045 scope.go:117] "RemoveContainer" containerID="3f44665a0e4601a97013bbddc2ac5c3b1b785966923a1d4ccdeea57349497e62" Mar 08 03:46:51.829744 master-0 kubenswrapper[4045]: I0308 03:46:51.829698 4045 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f44665a0e4601a97013bbddc2ac5c3b1b785966923a1d4ccdeea57349497e62"} err="failed to get container status \"3f44665a0e4601a97013bbddc2ac5c3b1b785966923a1d4ccdeea57349497e62\": rpc error: code = NotFound desc = could not find container \"3f44665a0e4601a97013bbddc2ac5c3b1b785966923a1d4ccdeea57349497e62\": container with ID starting with 3f44665a0e4601a97013bbddc2ac5c3b1b785966923a1d4ccdeea57349497e62 not found: ID does not exist" Mar 08 03:46:51.829744 master-0 kubenswrapper[4045]: I0308 03:46:51.829740 4045 scope.go:117] "RemoveContainer" containerID="0c56e83d2b8fbc6b8ea63ba1babdcd52ef268c17322e406b122313f96a406043" Mar 08 03:46:51.831109 master-0 kubenswrapper[4045]: I0308 03:46:51.831057 4045 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c56e83d2b8fbc6b8ea63ba1babdcd52ef268c17322e406b122313f96a406043"} err="failed to get container status \"0c56e83d2b8fbc6b8ea63ba1babdcd52ef268c17322e406b122313f96a406043\": rpc error: code = NotFound desc = could not find container \"0c56e83d2b8fbc6b8ea63ba1babdcd52ef268c17322e406b122313f96a406043\": container with ID starting with 0c56e83d2b8fbc6b8ea63ba1babdcd52ef268c17322e406b122313f96a406043 not found: ID does not exist" Mar 08 03:46:51.831148 master-0 kubenswrapper[4045]: I0308 03:46:51.831120 4045 scope.go:117] "RemoveContainer" containerID="4d481f1172ff59ed351fa2a42b4d240305d289745eaec056f759d0eb06f2433d" Mar 08 03:46:51.831622 master-0 kubenswrapper[4045]: I0308 03:46:51.831589 4045 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d481f1172ff59ed351fa2a42b4d240305d289745eaec056f759d0eb06f2433d"} err="failed to get container status \"4d481f1172ff59ed351fa2a42b4d240305d289745eaec056f759d0eb06f2433d\": rpc error: code = NotFound desc = could not find container \"4d481f1172ff59ed351fa2a42b4d240305d289745eaec056f759d0eb06f2433d\": container with ID starting with 4d481f1172ff59ed351fa2a42b4d240305d289745eaec056f759d0eb06f2433d not found: ID does not exist" Mar 08 03:46:51.831622 master-0 kubenswrapper[4045]: I0308 03:46:51.831612 4045 scope.go:117] "RemoveContainer" containerID="added2934331879c4be6e788743b0207f2205fcc24178ac7510edf4a1e18f2c1" Mar 08 03:46:51.832068 master-0 kubenswrapper[4045]: I0308 03:46:51.832018 4045 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"added2934331879c4be6e788743b0207f2205fcc24178ac7510edf4a1e18f2c1"} err="failed to get container status \"added2934331879c4be6e788743b0207f2205fcc24178ac7510edf4a1e18f2c1\": rpc error: code = NotFound desc = could not find container \"added2934331879c4be6e788743b0207f2205fcc24178ac7510edf4a1e18f2c1\": container with ID starting with added2934331879c4be6e788743b0207f2205fcc24178ac7510edf4a1e18f2c1 not found: ID does not exist" Mar 08 03:46:51.832111 master-0 kubenswrapper[4045]: I0308 03:46:51.832067 4045 scope.go:117] "RemoveContainer" containerID="c3bba27bb7d1e48a9e3c0a737f52dc765a3c2662df797adf3042f2f12df4fbe1" Mar 08 03:46:51.832427 master-0 kubenswrapper[4045]: I0308 03:46:51.832391 4045 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3bba27bb7d1e48a9e3c0a737f52dc765a3c2662df797adf3042f2f12df4fbe1"} err="failed to get container status \"c3bba27bb7d1e48a9e3c0a737f52dc765a3c2662df797adf3042f2f12df4fbe1\": rpc error: code = NotFound desc = could not find container \"c3bba27bb7d1e48a9e3c0a737f52dc765a3c2662df797adf3042f2f12df4fbe1\": container with ID starting with 
c3bba27bb7d1e48a9e3c0a737f52dc765a3c2662df797adf3042f2f12df4fbe1 not found: ID does not exist" Mar 08 03:46:51.832427 master-0 kubenswrapper[4045]: I0308 03:46:51.832420 4045 scope.go:117] "RemoveContainer" containerID="8f2375b1b30b12e920ef023118d16bc0d52d883528b42117bf2fe57d5e36d582" Mar 08 03:46:51.832911 master-0 kubenswrapper[4045]: I0308 03:46:51.832872 4045 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f2375b1b30b12e920ef023118d16bc0d52d883528b42117bf2fe57d5e36d582"} err="failed to get container status \"8f2375b1b30b12e920ef023118d16bc0d52d883528b42117bf2fe57d5e36d582\": rpc error: code = NotFound desc = could not find container \"8f2375b1b30b12e920ef023118d16bc0d52d883528b42117bf2fe57d5e36d582\": container with ID starting with 8f2375b1b30b12e920ef023118d16bc0d52d883528b42117bf2fe57d5e36d582 not found: ID does not exist" Mar 08 03:46:51.832911 master-0 kubenswrapper[4045]: I0308 03:46:51.832902 4045 scope.go:117] "RemoveContainer" containerID="162bfc7ea9c50f70286b0884bf41923bf20ff8ea7664c8a71743a1c7c8573a1c" Mar 08 03:46:51.833270 master-0 kubenswrapper[4045]: I0308 03:46:51.833226 4045 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"162bfc7ea9c50f70286b0884bf41923bf20ff8ea7664c8a71743a1c7c8573a1c"} err="failed to get container status \"162bfc7ea9c50f70286b0884bf41923bf20ff8ea7664c8a71743a1c7c8573a1c\": rpc error: code = NotFound desc = could not find container \"162bfc7ea9c50f70286b0884bf41923bf20ff8ea7664c8a71743a1c7c8573a1c\": container with ID starting with 162bfc7ea9c50f70286b0884bf41923bf20ff8ea7664c8a71743a1c7c8573a1c not found: ID does not exist" Mar 08 03:46:51.833302 master-0 kubenswrapper[4045]: I0308 03:46:51.833272 4045 scope.go:117] "RemoveContainer" containerID="b5c04e0d909b0e59758f566b9d8ba4fceeebd98c7282262f63efcc5e2e1556aa" Mar 08 03:46:51.833683 master-0 kubenswrapper[4045]: I0308 03:46:51.833643 4045 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5c04e0d909b0e59758f566b9d8ba4fceeebd98c7282262f63efcc5e2e1556aa"} err="failed to get container status \"b5c04e0d909b0e59758f566b9d8ba4fceeebd98c7282262f63efcc5e2e1556aa\": rpc error: code = NotFound desc = could not find container \"b5c04e0d909b0e59758f566b9d8ba4fceeebd98c7282262f63efcc5e2e1556aa\": container with ID starting with b5c04e0d909b0e59758f566b9d8ba4fceeebd98c7282262f63efcc5e2e1556aa not found: ID does not exist" Mar 08 03:46:51.833718 master-0 kubenswrapper[4045]: I0308 03:46:51.833682 4045 scope.go:117] "RemoveContainer" containerID="a7aef1217ff8dd575e82935ee2c21e6786c8c9c8970173f89e65462060324295" Mar 08 03:46:51.834153 master-0 kubenswrapper[4045]: I0308 03:46:51.834115 4045 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7aef1217ff8dd575e82935ee2c21e6786c8c9c8970173f89e65462060324295"} err="failed to get container status \"a7aef1217ff8dd575e82935ee2c21e6786c8c9c8970173f89e65462060324295\": rpc error: code = NotFound desc = could not find container \"a7aef1217ff8dd575e82935ee2c21e6786c8c9c8970173f89e65462060324295\": container with ID starting with a7aef1217ff8dd575e82935ee2c21e6786c8c9c8970173f89e65462060324295 not found: ID does not exist" Mar 08 03:46:51.834199 master-0 kubenswrapper[4045]: I0308 03:46:51.834156 4045 scope.go:117] "RemoveContainer" containerID="3f44665a0e4601a97013bbddc2ac5c3b1b785966923a1d4ccdeea57349497e62" Mar 08 03:46:51.834592 master-0 kubenswrapper[4045]: I0308 03:46:51.834534 4045 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f44665a0e4601a97013bbddc2ac5c3b1b785966923a1d4ccdeea57349497e62"} err="failed to get container status \"3f44665a0e4601a97013bbddc2ac5c3b1b785966923a1d4ccdeea57349497e62\": rpc error: code = NotFound desc = could not find container \"3f44665a0e4601a97013bbddc2ac5c3b1b785966923a1d4ccdeea57349497e62\": container with ID starting with 
3f44665a0e4601a97013bbddc2ac5c3b1b785966923a1d4ccdeea57349497e62 not found: ID does not exist" Mar 08 03:46:51.834592 master-0 kubenswrapper[4045]: I0308 03:46:51.834580 4045 scope.go:117] "RemoveContainer" containerID="0c56e83d2b8fbc6b8ea63ba1babdcd52ef268c17322e406b122313f96a406043" Mar 08 03:46:51.834994 master-0 kubenswrapper[4045]: I0308 03:46:51.834949 4045 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c56e83d2b8fbc6b8ea63ba1babdcd52ef268c17322e406b122313f96a406043"} err="failed to get container status \"0c56e83d2b8fbc6b8ea63ba1babdcd52ef268c17322e406b122313f96a406043\": rpc error: code = NotFound desc = could not find container \"0c56e83d2b8fbc6b8ea63ba1babdcd52ef268c17322e406b122313f96a406043\": container with ID starting with 0c56e83d2b8fbc6b8ea63ba1babdcd52ef268c17322e406b122313f96a406043 not found: ID does not exist" Mar 08 03:46:51.835030 master-0 kubenswrapper[4045]: I0308 03:46:51.834991 4045 scope.go:117] "RemoveContainer" containerID="4d481f1172ff59ed351fa2a42b4d240305d289745eaec056f759d0eb06f2433d" Mar 08 03:46:51.835530 master-0 kubenswrapper[4045]: I0308 03:46:51.835485 4045 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d481f1172ff59ed351fa2a42b4d240305d289745eaec056f759d0eb06f2433d"} err="failed to get container status \"4d481f1172ff59ed351fa2a42b4d240305d289745eaec056f759d0eb06f2433d\": rpc error: code = NotFound desc = could not find container \"4d481f1172ff59ed351fa2a42b4d240305d289745eaec056f759d0eb06f2433d\": container with ID starting with 4d481f1172ff59ed351fa2a42b4d240305d289745eaec056f759d0eb06f2433d not found: ID does not exist" Mar 08 03:46:51.835530 master-0 kubenswrapper[4045]: I0308 03:46:51.835522 4045 scope.go:117] "RemoveContainer" containerID="added2934331879c4be6e788743b0207f2205fcc24178ac7510edf4a1e18f2c1" Mar 08 03:46:51.835856 master-0 kubenswrapper[4045]: I0308 03:46:51.835799 4045 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"added2934331879c4be6e788743b0207f2205fcc24178ac7510edf4a1e18f2c1"} err="failed to get container status \"added2934331879c4be6e788743b0207f2205fcc24178ac7510edf4a1e18f2c1\": rpc error: code = NotFound desc = could not find container \"added2934331879c4be6e788743b0207f2205fcc24178ac7510edf4a1e18f2c1\": container with ID starting with added2934331879c4be6e788743b0207f2205fcc24178ac7510edf4a1e18f2c1 not found: ID does not exist" Mar 08 03:46:51.835891 master-0 kubenswrapper[4045]: I0308 03:46:51.835858 4045 scope.go:117] "RemoveContainer" containerID="c3bba27bb7d1e48a9e3c0a737f52dc765a3c2662df797adf3042f2f12df4fbe1" Mar 08 03:46:51.836326 master-0 kubenswrapper[4045]: I0308 03:46:51.836277 4045 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3bba27bb7d1e48a9e3c0a737f52dc765a3c2662df797adf3042f2f12df4fbe1"} err="failed to get container status \"c3bba27bb7d1e48a9e3c0a737f52dc765a3c2662df797adf3042f2f12df4fbe1\": rpc error: code = NotFound desc = could not find container \"c3bba27bb7d1e48a9e3c0a737f52dc765a3c2662df797adf3042f2f12df4fbe1\": container with ID starting with c3bba27bb7d1e48a9e3c0a737f52dc765a3c2662df797adf3042f2f12df4fbe1 not found: ID does not exist" Mar 08 03:46:51.836359 master-0 kubenswrapper[4045]: I0308 03:46:51.836321 4045 scope.go:117] "RemoveContainer" containerID="8f2375b1b30b12e920ef023118d16bc0d52d883528b42117bf2fe57d5e36d582" Mar 08 03:46:51.836840 master-0 kubenswrapper[4045]: I0308 03:46:51.836764 4045 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f2375b1b30b12e920ef023118d16bc0d52d883528b42117bf2fe57d5e36d582"} err="failed to get container status \"8f2375b1b30b12e920ef023118d16bc0d52d883528b42117bf2fe57d5e36d582\": rpc error: code = NotFound desc = could not find container \"8f2375b1b30b12e920ef023118d16bc0d52d883528b42117bf2fe57d5e36d582\": container with ID starting with 
8f2375b1b30b12e920ef023118d16bc0d52d883528b42117bf2fe57d5e36d582 not found: ID does not exist" Mar 08 03:46:51.836894 master-0 kubenswrapper[4045]: I0308 03:46:51.836845 4045 scope.go:117] "RemoveContainer" containerID="162bfc7ea9c50f70286b0884bf41923bf20ff8ea7664c8a71743a1c7c8573a1c" Mar 08 03:46:51.837285 master-0 kubenswrapper[4045]: I0308 03:46:51.837238 4045 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"162bfc7ea9c50f70286b0884bf41923bf20ff8ea7664c8a71743a1c7c8573a1c"} err="failed to get container status \"162bfc7ea9c50f70286b0884bf41923bf20ff8ea7664c8a71743a1c7c8573a1c\": rpc error: code = NotFound desc = could not find container \"162bfc7ea9c50f70286b0884bf41923bf20ff8ea7664c8a71743a1c7c8573a1c\": container with ID starting with 162bfc7ea9c50f70286b0884bf41923bf20ff8ea7664c8a71743a1c7c8573a1c not found: ID does not exist" Mar 08 03:46:51.837322 master-0 kubenswrapper[4045]: I0308 03:46:51.837281 4045 scope.go:117] "RemoveContainer" containerID="b5c04e0d909b0e59758f566b9d8ba4fceeebd98c7282262f63efcc5e2e1556aa" Mar 08 03:46:51.837698 master-0 kubenswrapper[4045]: I0308 03:46:51.837653 4045 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5c04e0d909b0e59758f566b9d8ba4fceeebd98c7282262f63efcc5e2e1556aa"} err="failed to get container status \"b5c04e0d909b0e59758f566b9d8ba4fceeebd98c7282262f63efcc5e2e1556aa\": rpc error: code = NotFound desc = could not find container \"b5c04e0d909b0e59758f566b9d8ba4fceeebd98c7282262f63efcc5e2e1556aa\": container with ID starting with b5c04e0d909b0e59758f566b9d8ba4fceeebd98c7282262f63efcc5e2e1556aa not found: ID does not exist" Mar 08 03:46:51.837698 master-0 kubenswrapper[4045]: I0308 03:46:51.837691 4045 scope.go:117] "RemoveContainer" containerID="a7aef1217ff8dd575e82935ee2c21e6786c8c9c8970173f89e65462060324295" Mar 08 03:46:51.838252 master-0 kubenswrapper[4045]: I0308 03:46:51.838194 4045 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7aef1217ff8dd575e82935ee2c21e6786c8c9c8970173f89e65462060324295"} err="failed to get container status \"a7aef1217ff8dd575e82935ee2c21e6786c8c9c8970173f89e65462060324295\": rpc error: code = NotFound desc = could not find container \"a7aef1217ff8dd575e82935ee2c21e6786c8c9c8970173f89e65462060324295\": container with ID starting with a7aef1217ff8dd575e82935ee2c21e6786c8c9c8970173f89e65462060324295 not found: ID does not exist" Mar 08 03:46:51.838404 master-0 kubenswrapper[4045]: I0308 03:46:51.838269 4045 scope.go:117] "RemoveContainer" containerID="3f44665a0e4601a97013bbddc2ac5c3b1b785966923a1d4ccdeea57349497e62" Mar 08 03:46:51.838968 master-0 kubenswrapper[4045]: I0308 03:46:51.838907 4045 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f44665a0e4601a97013bbddc2ac5c3b1b785966923a1d4ccdeea57349497e62"} err="failed to get container status \"3f44665a0e4601a97013bbddc2ac5c3b1b785966923a1d4ccdeea57349497e62\": rpc error: code = NotFound desc = could not find container \"3f44665a0e4601a97013bbddc2ac5c3b1b785966923a1d4ccdeea57349497e62\": container with ID starting with 3f44665a0e4601a97013bbddc2ac5c3b1b785966923a1d4ccdeea57349497e62 not found: ID does not exist" Mar 08 03:46:51.838968 master-0 kubenswrapper[4045]: I0308 03:46:51.838948 4045 scope.go:117] "RemoveContainer" containerID="0c56e83d2b8fbc6b8ea63ba1babdcd52ef268c17322e406b122313f96a406043" Mar 08 03:46:51.839520 master-0 kubenswrapper[4045]: I0308 03:46:51.839445 4045 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c56e83d2b8fbc6b8ea63ba1babdcd52ef268c17322e406b122313f96a406043"} err="failed to get container status \"0c56e83d2b8fbc6b8ea63ba1babdcd52ef268c17322e406b122313f96a406043\": rpc error: code = NotFound desc = could not find container \"0c56e83d2b8fbc6b8ea63ba1babdcd52ef268c17322e406b122313f96a406043\": container with ID starting with 
0c56e83d2b8fbc6b8ea63ba1babdcd52ef268c17322e406b122313f96a406043 not found: ID does not exist" Mar 08 03:46:51.839520 master-0 kubenswrapper[4045]: I0308 03:46:51.839495 4045 scope.go:117] "RemoveContainer" containerID="4d481f1172ff59ed351fa2a42b4d240305d289745eaec056f759d0eb06f2433d" Mar 08 03:46:51.839970 master-0 kubenswrapper[4045]: I0308 03:46:51.839911 4045 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d481f1172ff59ed351fa2a42b4d240305d289745eaec056f759d0eb06f2433d"} err="failed to get container status \"4d481f1172ff59ed351fa2a42b4d240305d289745eaec056f759d0eb06f2433d\": rpc error: code = NotFound desc = could not find container \"4d481f1172ff59ed351fa2a42b4d240305d289745eaec056f759d0eb06f2433d\": container with ID starting with 4d481f1172ff59ed351fa2a42b4d240305d289745eaec056f759d0eb06f2433d not found: ID does not exist" Mar 08 03:46:51.839970 master-0 kubenswrapper[4045]: I0308 03:46:51.839944 4045 scope.go:117] "RemoveContainer" containerID="added2934331879c4be6e788743b0207f2205fcc24178ac7510edf4a1e18f2c1" Mar 08 03:46:51.840298 master-0 kubenswrapper[4045]: I0308 03:46:51.840234 4045 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"added2934331879c4be6e788743b0207f2205fcc24178ac7510edf4a1e18f2c1"} err="failed to get container status \"added2934331879c4be6e788743b0207f2205fcc24178ac7510edf4a1e18f2c1\": rpc error: code = NotFound desc = could not find container \"added2934331879c4be6e788743b0207f2205fcc24178ac7510edf4a1e18f2c1\": container with ID starting with added2934331879c4be6e788743b0207f2205fcc24178ac7510edf4a1e18f2c1 not found: ID does not exist" Mar 08 03:46:51.840298 master-0 kubenswrapper[4045]: I0308 03:46:51.840274 4045 scope.go:117] "RemoveContainer" containerID="c3bba27bb7d1e48a9e3c0a737f52dc765a3c2662df797adf3042f2f12df4fbe1" Mar 08 03:46:51.840581 master-0 kubenswrapper[4045]: I0308 03:46:51.840527 4045 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3bba27bb7d1e48a9e3c0a737f52dc765a3c2662df797adf3042f2f12df4fbe1"} err="failed to get container status \"c3bba27bb7d1e48a9e3c0a737f52dc765a3c2662df797adf3042f2f12df4fbe1\": rpc error: code = NotFound desc = could not find container \"c3bba27bb7d1e48a9e3c0a737f52dc765a3c2662df797adf3042f2f12df4fbe1\": container with ID starting with c3bba27bb7d1e48a9e3c0a737f52dc765a3c2662df797adf3042f2f12df4fbe1 not found: ID does not exist" Mar 08 03:46:51.840581 master-0 kubenswrapper[4045]: I0308 03:46:51.840556 4045 scope.go:117] "RemoveContainer" containerID="8f2375b1b30b12e920ef023118d16bc0d52d883528b42117bf2fe57d5e36d582" Mar 08 03:46:51.841327 master-0 kubenswrapper[4045]: I0308 03:46:51.841263 4045 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f2375b1b30b12e920ef023118d16bc0d52d883528b42117bf2fe57d5e36d582"} err="failed to get container status \"8f2375b1b30b12e920ef023118d16bc0d52d883528b42117bf2fe57d5e36d582\": rpc error: code = NotFound desc = could not find container \"8f2375b1b30b12e920ef023118d16bc0d52d883528b42117bf2fe57d5e36d582\": container with ID starting with 8f2375b1b30b12e920ef023118d16bc0d52d883528b42117bf2fe57d5e36d582 not found: ID does not exist" Mar 08 03:46:52.199593 master-0 kubenswrapper[4045]: I0308 03:46:52.199550 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-schjl" Mar 08 03:46:52.199819 master-0 kubenswrapper[4045]: E0308 03:46:52.199715 4045 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-schjl" podUID="d5044ffd-0686-4679-9894-e696faf33699" Mar 08 03:46:52.646583 master-0 kubenswrapper[4045]: I0308 03:46:52.646404 4045 generic.go:334] "Generic (PLEG): container finished" podID="f0f5f3f3-0856-4da3-9157-15f65c6aba6e" containerID="e04ec38e07d8783fc2ade88328995e37c561797980580532badf766ca8953982" exitCode=0 Mar 08 03:46:52.646583 master-0 kubenswrapper[4045]: I0308 03:46:52.646459 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" event={"ID":"f0f5f3f3-0856-4da3-9157-15f65c6aba6e","Type":"ContainerDied","Data":"e04ec38e07d8783fc2ade88328995e37c561797980580532badf766ca8953982"} Mar 08 03:46:52.646583 master-0 kubenswrapper[4045]: I0308 03:46:52.646503 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" event={"ID":"f0f5f3f3-0856-4da3-9157-15f65c6aba6e","Type":"ContainerStarted","Data":"e8733f46dd1d2647e586c0cc9b5a4ebea38d695f856a8c74190015b70d99a33e"} Mar 08 03:46:53.203514 master-0 kubenswrapper[4045]: I0308 03:46:53.203111 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xmgpj" Mar 08 03:46:53.203666 master-0 kubenswrapper[4045]: E0308 03:46:53.203605 4045 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xmgpj" podUID="e93b5361-30e6-44fd-a59e-2bc410c59480" Mar 08 03:46:53.207258 master-0 kubenswrapper[4045]: I0308 03:46:53.207202 4045 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bead1d9b-4518-46a9-bb0b-50316252eb1c" path="/var/lib/kubelet/pods/bead1d9b-4518-46a9-bb0b-50316252eb1c/volumes" Mar 08 03:46:53.680433 master-0 kubenswrapper[4045]: I0308 03:46:53.680259 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" event={"ID":"f0f5f3f3-0856-4da3-9157-15f65c6aba6e","Type":"ContainerStarted","Data":"d5af0be9c5febd1a424ba8af4579e07e0c6479c09acaa22ed67931b840c36b26"} Mar 08 03:46:53.680433 master-0 kubenswrapper[4045]: I0308 03:46:53.680333 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" event={"ID":"f0f5f3f3-0856-4da3-9157-15f65c6aba6e","Type":"ContainerStarted","Data":"c364e4447e65956b5d35d4a5b6ed95b327701a8872b6a13b4c9bcf041d653034"} Mar 08 03:46:53.680433 master-0 kubenswrapper[4045]: I0308 03:46:53.680354 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" event={"ID":"f0f5f3f3-0856-4da3-9157-15f65c6aba6e","Type":"ContainerStarted","Data":"576a6240fb10bb981eb126a27ac83eb802b2639539b84f25be9d3bda7fcdefc9"} Mar 08 03:46:53.680433 master-0 kubenswrapper[4045]: I0308 03:46:53.680373 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" event={"ID":"f0f5f3f3-0856-4da3-9157-15f65c6aba6e","Type":"ContainerStarted","Data":"5b2459302e30b01e63a9385188b22521286fb55c41ccf8a385364b72629c6ed9"} Mar 08 03:46:53.680433 master-0 kubenswrapper[4045]: I0308 03:46:53.680391 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" 
event={"ID":"f0f5f3f3-0856-4da3-9157-15f65c6aba6e","Type":"ContainerStarted","Data":"dd4a003acec96043cfdc4ec7ce41949ee0557b61e56f55ab7a2c23f468c34709"}
Mar 08 03:46:53.680433 master-0 kubenswrapper[4045]: I0308 03:46:53.680408 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" event={"ID":"f0f5f3f3-0856-4da3-9157-15f65c6aba6e","Type":"ContainerStarted","Data":"bcf82b929ccc12e6c4861b2440ceabd6b67a69aa04618280ac2518eccf0a41c5"}
Mar 08 03:46:54.200000 master-0 kubenswrapper[4045]: I0308 03:46:54.199908 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-schjl"
Mar 08 03:46:54.200238 master-0 kubenswrapper[4045]: E0308 03:46:54.200100 4045 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-schjl" podUID="d5044ffd-0686-4679-9894-e696faf33699"
Mar 08 03:46:54.903807 master-0 kubenswrapper[4045]: I0308 03:46:54.903599 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kc5q\" (UniqueName: \"kubernetes.io/projected/e93b5361-30e6-44fd-a59e-2bc410c59480-kube-api-access-4kc5q\") pod \"network-check-target-xmgpj\" (UID: \"e93b5361-30e6-44fd-a59e-2bc410c59480\") " pod="openshift-network-diagnostics/network-check-target-xmgpj"
Mar 08 03:46:54.904750 master-0 kubenswrapper[4045]: E0308 03:46:54.903913 4045 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 08 03:46:54.904750 master-0 kubenswrapper[4045]: E0308 03:46:54.903975 4045 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 08 03:46:54.904750 master-0 kubenswrapper[4045]: E0308 03:46:54.903998 4045 projected.go:194] Error preparing data for projected volume kube-api-access-4kc5q for pod openshift-network-diagnostics/network-check-target-xmgpj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 08 03:46:54.904750 master-0 kubenswrapper[4045]: E0308 03:46:54.904081 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e93b5361-30e6-44fd-a59e-2bc410c59480-kube-api-access-4kc5q podName:e93b5361-30e6-44fd-a59e-2bc410c59480 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:26.904052106 +0000 UTC m=+146.514753144 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-4kc5q" (UniqueName: "kubernetes.io/projected/e93b5361-30e6-44fd-a59e-2bc410c59480-kube-api-access-4kc5q") pod "network-check-target-xmgpj" (UID: "e93b5361-30e6-44fd-a59e-2bc410c59480") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 08 03:46:55.199356 master-0 kubenswrapper[4045]: I0308 03:46:55.199286 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xmgpj"
Mar 08 03:46:55.199549 master-0 kubenswrapper[4045]: E0308 03:46:55.199467 4045 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xmgpj" podUID="e93b5361-30e6-44fd-a59e-2bc410c59480"
Mar 08 03:46:56.199580 master-0 kubenswrapper[4045]: I0308 03:46:56.199449 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-schjl"
Mar 08 03:46:56.200516 master-0 kubenswrapper[4045]: E0308 03:46:56.199674 4045 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-schjl" podUID="d5044ffd-0686-4679-9894-e696faf33699"
Mar 08 03:46:56.706167 master-0 kubenswrapper[4045]: I0308 03:46:56.706045 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" event={"ID":"f0f5f3f3-0856-4da3-9157-15f65c6aba6e","Type":"ContainerStarted","Data":"7fff976283dd27d8ca912cc3d84cf6b397497454193d758e94b4d0eea365959e"}
Mar 08 03:46:57.199113 master-0 kubenswrapper[4045]: I0308 03:46:57.199025 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xmgpj"
Mar 08 03:46:57.199319 master-0 kubenswrapper[4045]: E0308 03:46:57.199205 4045 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xmgpj" podUID="e93b5361-30e6-44fd-a59e-2bc410c59480"
Mar 08 03:46:58.199305 master-0 kubenswrapper[4045]: I0308 03:46:58.199226 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-schjl"
Mar 08 03:46:58.200464 master-0 kubenswrapper[4045]: E0308 03:46:58.199415 4045 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-schjl" podUID="d5044ffd-0686-4679-9894-e696faf33699"
Mar 08 03:46:58.716813 master-0 kubenswrapper[4045]: I0308 03:46:58.716720 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" event={"ID":"f0f5f3f3-0856-4da3-9157-15f65c6aba6e","Type":"ContainerStarted","Data":"ffa4ac058c2c77885b2158d700a76da6e27818c268da8d4b9f6dcd657d049b37"}
Mar 08 03:46:58.717277 master-0 kubenswrapper[4045]: I0308 03:46:58.717120 4045 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:46:58.717277 master-0 kubenswrapper[4045]: I0308 03:46:58.717162 4045 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:46:58.717277 master-0 kubenswrapper[4045]: I0308 03:46:58.717175 4045 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:46:58.744486 master-0 kubenswrapper[4045]: I0308 03:46:58.744420 4045 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" podStartSLOduration=7.744402476 podStartE2EDuration="7.744402476s" podCreationTimestamp="2026-03-08 03:46:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:46:58.742810128 +0000 UTC m=+118.353511126" watchObservedRunningTime="2026-03-08 03:46:58.744402476 +0000 UTC m=+118.355103434"
Mar 08 03:46:58.751361 master-0 kubenswrapper[4045]: I0308 03:46:58.751318 4045 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:46:58.751710 master-0 kubenswrapper[4045]: I0308 03:46:58.751686 4045 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:46:59.164851 master-0 kubenswrapper[4045]: I0308 03:46:59.162660 4045 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-schjl"]
Mar 08 03:46:59.164851 master-0 kubenswrapper[4045]: I0308 03:46:59.162922 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-schjl"
Mar 08 03:46:59.164851 master-0 kubenswrapper[4045]: E0308 03:46:59.163165 4045 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-schjl" podUID="d5044ffd-0686-4679-9894-e696faf33699"
Mar 08 03:46:59.168693 master-0 kubenswrapper[4045]: I0308 03:46:59.168656 4045 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-xmgpj"]
Mar 08 03:46:59.168765 master-0 kubenswrapper[4045]: I0308 03:46:59.168722 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xmgpj"
Mar 08 03:46:59.168817 master-0 kubenswrapper[4045]: E0308 03:46:59.168772 4045 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xmgpj" podUID="e93b5361-30e6-44fd-a59e-2bc410c59480"
Mar 08 03:47:01.038895 master-0 kubenswrapper[4045]: E0308 03:47:01.037498 4045 kubelet_node_status.go:497] "Node not becoming ready in time after startup"
Mar 08 03:47:01.199132 master-0 kubenswrapper[4045]: I0308 03:47:01.199051 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xmgpj"
Mar 08 03:47:01.200925 master-0 kubenswrapper[4045]: E0308 03:47:01.200850 4045 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xmgpj" podUID="e93b5361-30e6-44fd-a59e-2bc410c59480"
Mar 08 03:47:01.201155 master-0 kubenswrapper[4045]: I0308 03:47:01.200975 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-schjl"
Mar 08 03:47:01.201487 master-0 kubenswrapper[4045]: E0308 03:47:01.201421 4045 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-schjl" podUID="d5044ffd-0686-4679-9894-e696faf33699"
Mar 08 03:47:01.215761 master-0 kubenswrapper[4045]: E0308 03:47:01.215586 4045 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 08 03:47:03.198893 master-0 kubenswrapper[4045]: I0308 03:47:03.198729 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xmgpj"
Mar 08 03:47:03.200168 master-0 kubenswrapper[4045]: E0308 03:47:03.198941 4045 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xmgpj" podUID="e93b5361-30e6-44fd-a59e-2bc410c59480"
Mar 08 03:47:03.200168 master-0 kubenswrapper[4045]: I0308 03:47:03.198998 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-schjl"
Mar 08 03:47:03.200168 master-0 kubenswrapper[4045]: E0308 03:47:03.199198 4045 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-schjl" podUID="d5044ffd-0686-4679-9894-e696faf33699"
Mar 08 03:47:05.199461 master-0 kubenswrapper[4045]: I0308 03:47:05.199366 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xmgpj"
Mar 08 03:47:05.200305 master-0 kubenswrapper[4045]: I0308 03:47:05.199691 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-schjl"
Mar 08 03:47:05.200305 master-0 kubenswrapper[4045]: E0308 03:47:05.199807 4045 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xmgpj" podUID="e93b5361-30e6-44fd-a59e-2bc410c59480"
Mar 08 03:47:05.200305 master-0 kubenswrapper[4045]: E0308 03:47:05.199969 4045 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-schjl" podUID="d5044ffd-0686-4679-9894-e696faf33699"
Mar 08 03:47:07.199636 master-0 kubenswrapper[4045]: I0308 03:47:07.199511 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-schjl"
Mar 08 03:47:07.200655 master-0 kubenswrapper[4045]: I0308 03:47:07.199678 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xmgpj"
Mar 08 03:47:07.206631 master-0 kubenswrapper[4045]: I0308 03:47:07.206578 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 08 03:47:07.207100 master-0 kubenswrapper[4045]: I0308 03:47:07.207046 4045 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 08 03:47:07.209893 master-0 kubenswrapper[4045]: I0308 03:47:07.209808 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 08 03:47:07.335199 master-0 kubenswrapper[4045]: I0308 03:47:07.334705 4045 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeReady"
Mar 08 03:47:08.017337 master-0 kubenswrapper[4045]: I0308 03:47:08.017243 4045 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-5884b9cd56-vzms7"]
Mar 08 03:47:08.017772 master-0 kubenswrapper[4045]: I0308 03:47:08.017731 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-vzms7"
Mar 08 03:47:08.020700 master-0 kubenswrapper[4045]: I0308 03:47:08.020477 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 08 03:47:08.020897 master-0 kubenswrapper[4045]: I0308 03:47:08.020787 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Mar 08 03:47:08.023356 master-0 kubenswrapper[4045]: I0308 03:47:08.021256 4045 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 08 03:47:08.023356 master-0 kubenswrapper[4045]: I0308 03:47:08.021493 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 08 03:47:08.023356 master-0 kubenswrapper[4045]: I0308 03:47:08.021677 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 08 03:47:08.023356 master-0 kubenswrapper[4045]: I0308 03:47:08.022104 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 08 03:47:08.023356 master-0 kubenswrapper[4045]: I0308 03:47:08.022159 4045 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 08 03:47:08.206283 master-0 kubenswrapper[4045]: I0308 03:47:08.206112 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/5a7752f9-7b9a-451f-997a-e9f696d38b34-etcd-ca\") pod \"etcd-operator-5884b9cd56-vzms7\" (UID: \"5a7752f9-7b9a-451f-997a-e9f696d38b34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-vzms7"
Mar 08 03:47:08.206283 master-0 kubenswrapper[4045]: I0308 03:47:08.206237 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a7752f9-7b9a-451f-997a-e9f696d38b34-serving-cert\") pod \"etcd-operator-5884b9cd56-vzms7\" (UID: \"5a7752f9-7b9a-451f-997a-e9f696d38b34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-vzms7"
Mar 08 03:47:08.206283 master-0 kubenswrapper[4045]: I0308 03:47:08.206274 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8b5zb\" (UniqueName: \"kubernetes.io/projected/5a7752f9-7b9a-451f-997a-e9f696d38b34-kube-api-access-8b5zb\") pod \"etcd-operator-5884b9cd56-vzms7\" (UID: \"5a7752f9-7b9a-451f-997a-e9f696d38b34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-vzms7"
Mar 08 03:47:08.206283 master-0 kubenswrapper[4045]: I0308 03:47:08.206333 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a7752f9-7b9a-451f-997a-e9f696d38b34-config\") pod \"etcd-operator-5884b9cd56-vzms7\" (UID: \"5a7752f9-7b9a-451f-997a-e9f696d38b34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-vzms7"
Mar 08 03:47:08.207323 master-0 kubenswrapper[4045]: I0308 03:47:08.206387 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5a7752f9-7b9a-451f-997a-e9f696d38b34-etcd-client\") pod \"etcd-operator-5884b9cd56-vzms7\" (UID: \"5a7752f9-7b9a-451f-997a-e9f696d38b34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-vzms7"
Mar 08 03:47:08.207323 master-0 kubenswrapper[4045]: I0308 03:47:08.206434 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/5a7752f9-7b9a-451f-997a-e9f696d38b34-etcd-service-ca\") pod \"etcd-operator-5884b9cd56-vzms7\" (UID: \"5a7752f9-7b9a-451f-997a-e9f696d38b34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-vzms7"
Mar 08 03:47:08.211360 master-0 kubenswrapper[4045]: I0308 03:47:08.211303 4045 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-8f89dfddd-4mr6p"]
Mar 08 03:47:08.211774 master-0 kubenswrapper[4045]: I0308 03:47:08.211738 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-8f89dfddd-4mr6p"
Mar 08 03:47:08.216036 master-0 kubenswrapper[4045]: I0308 03:47:08.215970 4045 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-g6n58"]
Mar 08 03:47:08.216509 master-0 kubenswrapper[4045]: I0308 03:47:08.216460 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-g6n58"
Mar 08 03:47:08.223865 master-0 kubenswrapper[4045]: I0308 03:47:08.222293 4045 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-jghp5"]
Mar 08 03:47:08.223865 master-0 kubenswrapper[4045]: I0308 03:47:08.222933 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-jghp5"
Mar 08 03:47:08.224078 master-0 kubenswrapper[4045]: I0308 03:47:08.223990 4045 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-clqwj"]
Mar 08 03:47:08.224558 master-0 kubenswrapper[4045]: I0308 03:47:08.224502 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-clqwj"
Mar 08 03:47:08.235716 master-0 kubenswrapper[4045]: I0308 03:47:08.235623 4045 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-chpl6"]
Mar 08 03:47:08.236209 master-0 kubenswrapper[4045]: I0308 03:47:08.236174 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-chpl6"
Mar 08 03:47:08.238239 master-0 kubenswrapper[4045]: I0308 03:47:08.237256 4045 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-qlfgq"]
Mar 08 03:47:08.238239 master-0 kubenswrapper[4045]: I0308 03:47:08.237847 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-qlfgq"
Mar 08 03:47:08.239121 master-0 kubenswrapper[4045]: I0308 03:47:08.239065 4045 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-677db989d6-t77qr"]
Mar 08 03:47:08.239752 master-0 kubenswrapper[4045]: I0308 03:47:08.239708 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-677db989d6-t77qr"
Mar 08 03:47:08.241421 master-0 kubenswrapper[4045]: I0308 03:47:08.240649 4045 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69b6fc6b88-kg795"]
Mar 08 03:47:08.241421 master-0 kubenswrapper[4045]: I0308 03:47:08.241256 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-kg795"
Mar 08 03:47:08.251917 master-0 kubenswrapper[4045]: I0308 03:47:08.251621 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 08 03:47:08.254206 master-0 kubenswrapper[4045]: I0308 03:47:08.253895 4045 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-qjv52"]
Mar 08 03:47:08.254206 master-0 kubenswrapper[4045]: I0308 03:47:08.253940 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 08 03:47:08.254748 master-0 kubenswrapper[4045]: I0308 03:47:08.254507 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"service-ca-bundle"
Mar 08 03:47:08.254748 master-0 kubenswrapper[4045]: I0308 03:47:08.254544 4045 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 08 03:47:08.254748 master-0 kubenswrapper[4045]: I0308 03:47:08.254646 4045 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"openshift-insights-serving-cert"
Mar 08 03:47:08.254748 master-0 kubenswrapper[4045]: I0308 03:47:08.254558 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"kube-root-ca.crt"
Mar 08 03:47:08.255062 master-0 kubenswrapper[4045]: I0308 03:47:08.254722 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"openshift-service-ca.crt"
Mar 08 03:47:08.255062 master-0 kubenswrapper[4045]: I0308 03:47:08.255002 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 08 03:47:08.255645 master-0 kubenswrapper[4045]: I0308 03:47:08.255567 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-qjv52"
Mar 08 03:47:08.255744 master-0 kubenswrapper[4045]: I0308 03:47:08.255684 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"cluster-baremetal-operator-images"
Mar 08 03:47:08.255744 master-0 kubenswrapper[4045]: I0308 03:47:08.255715 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 08 03:47:08.256289 master-0 kubenswrapper[4045]: I0308 03:47:08.256079 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt"
Mar 08 03:47:08.256289 master-0 kubenswrapper[4045]: I0308 03:47:08.256091 4045 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls"
Mar 08 03:47:08.260749 master-0 kubenswrapper[4045]: I0308 03:47:08.260260 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 08 03:47:08.260749 master-0 kubenswrapper[4045]: I0308 03:47:08.260698 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 08 03:47:08.263639 master-0 kubenswrapper[4045]: I0308 03:47:08.261221 4045 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 08 03:47:08.263639 master-0 kubenswrapper[4045]: I0308 03:47:08.261579 4045 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-webhook-server-cert"
Mar 08 03:47:08.263639 master-0 kubenswrapper[4045]: I0308 03:47:08.261774 4045 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 08 03:47:08.263639 master-0 kubenswrapper[4045]: I0308 03:47:08.261853 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"baremetal-kube-rbac-proxy"
Mar 08 03:47:08.263639 master-0 kubenswrapper[4045]: I0308 03:47:08.262107 4045 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-tls"
Mar 08 03:47:08.263639 master-0 kubenswrapper[4045]: I0308 03:47:08.262518 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 08 03:47:08.263639 master-0 kubenswrapper[4045]: I0308 03:47:08.262569 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 08 03:47:08.263639 master-0 kubenswrapper[4045]: I0308 03:47:08.262710 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt"
Mar 08 03:47:08.263639 master-0 kubenswrapper[4045]: I0308 03:47:08.263162 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 08 03:47:08.263639 master-0 kubenswrapper[4045]: I0308 03:47:08.263344 4045 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 08 03:47:08.264307 master-0 kubenswrapper[4045]: I0308 03:47:08.264009 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 08 03:47:08.264507 master-0 kubenswrapper[4045]: I0308 03:47:08.264459 4045 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-6fhhs"]
Mar 08 03:47:08.265408 master-0 kubenswrapper[4045]: I0308 03:47:08.265128 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 08 03:47:08.275276 master-0 kubenswrapper[4045]: I0308 03:47:08.275211 4045 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 08 03:47:08.277917 master-0 kubenswrapper[4045]: I0308 03:47:08.277864 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 08 03:47:08.278799 master-0 kubenswrapper[4045]: I0308 03:47:08.278748 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 08 03:47:08.279557 master-0 kubenswrapper[4045]: I0308 03:47:08.279482 4045 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-7c6989d6c4-slm72"]
Mar 08 03:47:08.280002 master-0 kubenswrapper[4045]: I0308 03:47:08.279690 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-6fhhs"
Mar 08 03:47:08.280075 master-0 kubenswrapper[4045]: I0308 03:47:08.279759 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-slm72"
Mar 08 03:47:08.281569 master-0 kubenswrapper[4045]: I0308 03:47:08.280852 4045 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls"
Mar 08 03:47:08.281569 master-0 kubenswrapper[4045]: I0308 03:47:08.280857 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config"
Mar 08 03:47:08.281569 master-0 kubenswrapper[4045]: I0308 03:47:08.281021 4045 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-nm8fj"]
Mar 08 03:47:08.281569 master-0 kubenswrapper[4045]: I0308 03:47:08.280888 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt"
Mar 08 03:47:08.281569 master-0 kubenswrapper[4045]: I0308 03:47:08.281442 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-nm8fj"
Mar 08 03:47:08.281797 master-0 kubenswrapper[4045]: I0308 03:47:08.281608 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt"
Mar 08 03:47:08.282120 master-0 kubenswrapper[4045]: I0308 03:47:08.282073 4045 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-64bf9778cb-9sw2d"]
Mar 08 03:47:08.282956 master-0 kubenswrapper[4045]: I0308 03:47:08.282363 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-64bf9778cb-9sw2d"
Mar 08 03:47:08.283401 master-0 kubenswrapper[4045]: I0308 03:47:08.283355 4045 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-589895fbb7-xttlz"]
Mar 08 03:47:08.283939 master-0 kubenswrapper[4045]: I0308 03:47:08.283868 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-589895fbb7-xttlz"
Mar 08 03:47:08.284755 master-0 kubenswrapper[4045]: I0308 03:47:08.284722 4045 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qddlp"]
Mar 08 03:47:08.284972 master-0 kubenswrapper[4045]: I0308 03:47:08.284943 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 08 03:47:08.285092 master-0 kubenswrapper[4045]: I0308 03:47:08.285054 4045 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 08 03:47:08.285218 master-0 kubenswrapper[4045]: I0308 03:47:08.285104 4045 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert"
Mar 08 03:47:08.285218 master-0 kubenswrapper[4045]: I0308 03:47:08.285114 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qddlp"
Mar 08 03:47:08.286004 master-0 kubenswrapper[4045]: I0308 03:47:08.285958 4045 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-x9h9q"]
Mar 08 03:47:08.286623 master-0 kubenswrapper[4045]: I0308 03:47:08.286573 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-x9h9q"
Mar 08 03:47:08.287295 master-0 kubenswrapper[4045]: I0308 03:47:08.287251 4045 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-c46zz"]
Mar 08 03:47:08.288475 master-0 kubenswrapper[4045]: I0308 03:47:08.288412 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-c46zz"
Mar 08 03:47:08.288678 master-0 kubenswrapper[4045]: I0308 03:47:08.288618 4045 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-xhbrl"]
Mar 08 03:47:08.289300 master-0 kubenswrapper[4045]: I0308 03:47:08.289244 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-xhbrl"
Mar 08 03:47:08.289300 master-0 kubenswrapper[4045]: I0308 03:47:08.289283 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 08 03:47:08.289486 master-0 kubenswrapper[4045]: I0308 03:47:08.289358 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"trusted-ca-bundle"
Mar 08 03:47:08.290040 master-0 kubenswrapper[4045]: I0308 03:47:08.289985 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 08 03:47:08.290727 master-0 kubenswrapper[4045]: I0308 03:47:08.290632 4045 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 08 03:47:08.291046 master-0 kubenswrapper[4045]: I0308 03:47:08.290752 4045 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-serving-cert"
Mar 08 03:47:08.291046 master-0 kubenswrapper[4045]: I0308 03:47:08.290931 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt"
Mar 08 03:47:08.291340 master-0 kubenswrapper[4045]: I0308 03:47:08.291089 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 08 03:47:08.291418 master-0 kubenswrapper[4045]: I0308 03:47:08.291350 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 08 03:47:08.291788 master-0 kubenswrapper[4045]: I0308 03:47:08.291733 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 08 03:47:08.291960 master-0 kubenswrapper[4045]: I0308 03:47:08.291910 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 08 03:47:08.292101 master-0 kubenswrapper[4045]: I0308 03:47:08.291923 4045 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-fdb5c78b5-2vjh2"]
Mar 08 03:47:08.292690 master-0 kubenswrapper[4045]: I0308 03:47:08.292641 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt"
Mar 08 03:47:08.292786 master-0 kubenswrapper[4045]: I0308 03:47:08.292736 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 08 03:47:08.293019 master-0 kubenswrapper[4045]: I0308 03:47:08.292649 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-2vjh2"
Mar 08 03:47:08.293234 master-0 kubenswrapper[4045]: I0308 03:47:08.293184 4045 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-64488f9d78-vfgfp"]
Mar 08 03:47:08.293763 master-0 kubenswrapper[4045]: I0308 03:47:08.293717 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vfgfp"
Mar 08 03:47:08.293871 master-0 kubenswrapper[4045]: I0308 03:47:08.293768 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 08 03:47:08.294935 master-0 kubenswrapper[4045]: I0308 03:47:08.294855 4045 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-8gfmf"]
Mar 08 03:47:08.295290 master-0 kubenswrapper[4045]: I0308 03:47:08.295246 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-8gfmf"
Mar 08 03:47:08.295878 master-0 kubenswrapper[4045]: I0308 03:47:08.295776 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca"
Mar 08 03:47:08.295975 master-0 kubenswrapper[4045]: I0308 03:47:08.295912 4045 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-kt66j"]
Mar 08 03:47:08.296311 master-0 kubenswrapper[4045]: I0308 03:47:08.296263 4045 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-kt66j" Mar 08 03:47:08.297326 master-0 kubenswrapper[4045]: I0308 03:47:08.297261 4045 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-75682"] Mar 08 03:47:08.297873 master-0 kubenswrapper[4045]: I0308 03:47:08.297820 4045 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-572xh"] Mar 08 03:47:08.298129 master-0 kubenswrapper[4045]: I0308 03:47:08.298088 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-75682" Mar 08 03:47:08.300614 master-0 kubenswrapper[4045]: I0308 03:47:08.300566 4045 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 08 03:47:08.306045 master-0 kubenswrapper[4045]: I0308 03:47:08.302726 4045 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-572xh" Mar 08 03:47:08.306045 master-0 kubenswrapper[4045]: I0308 03:47:08.304567 4045 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 08 03:47:08.306045 master-0 kubenswrapper[4045]: I0308 03:47:08.304955 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt" Mar 08 03:47:08.306045 master-0 kubenswrapper[4045]: I0308 03:47:08.305489 4045 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 08 03:47:08.306045 master-0 kubenswrapper[4045]: I0308 03:47:08.305633 4045 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-8d675b596-j8pv6"] Mar 08 03:47:08.309058 master-0 kubenswrapper[4045]: I0308 03:47:08.306723 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 08 03:47:08.309058 master-0 kubenswrapper[4045]: I0308 03:47:08.307086 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 08 03:47:08.309058 master-0 kubenswrapper[4045]: I0308 03:47:08.307427 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 08 03:47:08.309058 master-0 kubenswrapper[4045]: I0308 03:47:08.307440 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 08 03:47:08.309058 master-0 kubenswrapper[4045]: I0308 03:47:08.307356 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 08 03:47:08.309058 master-0 kubenswrapper[4045]: I0308 03:47:08.307528 4045 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 08 03:47:08.309058 master-0 kubenswrapper[4045]: I0308 03:47:08.307587 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 08 03:47:08.309058 master-0 kubenswrapper[4045]: I0308 03:47:08.307618 4045 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 08 03:47:08.309058 master-0 kubenswrapper[4045]: I0308 03:47:08.307783 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 08 03:47:08.309058 master-0 kubenswrapper[4045]: I0308 03:47:08.307364 4045 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 08 03:47:08.309058 master-0 kubenswrapper[4045]: I0308 03:47:08.307906 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 08 03:47:08.309058 master-0 kubenswrapper[4045]: I0308 03:47:08.308077 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 08 03:47:08.309058 master-0 kubenswrapper[4045]: I0308 03:47:08.308161 4045 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 08 03:47:08.309058 master-0 kubenswrapper[4045]: I0308 03:47:08.308165 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 08 03:47:08.309058 master-0 kubenswrapper[4045]: I0308 03:47:08.308545 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt" Mar 
08 03:47:08.309058 master-0 kubenswrapper[4045]: I0308 03:47:08.308548 4045 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert" Mar 08 03:47:08.309058 master-0 kubenswrapper[4045]: I0308 03:47:08.308690 4045 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 08 03:47:08.309058 master-0 kubenswrapper[4045]: I0308 03:47:08.308668 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5a7752f9-7b9a-451f-997a-e9f696d38b34-etcd-client\") pod \"etcd-operator-5884b9cd56-vzms7\" (UID: \"5a7752f9-7b9a-451f-997a-e9f696d38b34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-vzms7" Mar 08 03:47:08.310753 master-0 kubenswrapper[4045]: I0308 03:47:08.309547 4045 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 08 03:47:08.312427 master-0 kubenswrapper[4045]: I0308 03:47:08.311000 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/5a7752f9-7b9a-451f-997a-e9f696d38b34-etcd-service-ca\") pod \"etcd-operator-5884b9cd56-vzms7\" (UID: \"5a7752f9-7b9a-451f-997a-e9f696d38b34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-vzms7" Mar 08 03:47:08.312427 master-0 kubenswrapper[4045]: I0308 03:47:08.311214 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/5a7752f9-7b9a-451f-997a-e9f696d38b34-etcd-ca\") pod \"etcd-operator-5884b9cd56-vzms7\" (UID: \"5a7752f9-7b9a-451f-997a-e9f696d38b34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-vzms7" Mar 08 03:47:08.312427 master-0 kubenswrapper[4045]: I0308 03:47:08.311228 4045 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 08 03:47:08.312427 master-0 kubenswrapper[4045]: I0308 03:47:08.311261 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a7752f9-7b9a-451f-997a-e9f696d38b34-serving-cert\") pod \"etcd-operator-5884b9cd56-vzms7\" (UID: \"5a7752f9-7b9a-451f-997a-e9f696d38b34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-vzms7" Mar 08 03:47:08.312427 master-0 kubenswrapper[4045]: I0308 03:47:08.311380 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8b5zb\" (UniqueName: \"kubernetes.io/projected/5a7752f9-7b9a-451f-997a-e9f696d38b34-kube-api-access-8b5zb\") pod \"etcd-operator-5884b9cd56-vzms7\" (UID: \"5a7752f9-7b9a-451f-997a-e9f696d38b34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-vzms7" Mar 08 03:47:08.312427 master-0 kubenswrapper[4045]: I0308 03:47:08.312007 4045 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-5884b9cd56-vzms7"] Mar 08 03:47:08.312427 master-0 kubenswrapper[4045]: I0308 03:47:08.311867 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a7752f9-7b9a-451f-997a-e9f696d38b34-config\") pod \"etcd-operator-5884b9cd56-vzms7\" (UID: \"5a7752f9-7b9a-451f-997a-e9f696d38b34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-vzms7" Mar 08 03:47:08.312982 master-0 kubenswrapper[4045]: I0308 03:47:08.312691 4045 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-8d675b596-j8pv6" Mar 08 03:47:08.313610 master-0 kubenswrapper[4045]: I0308 03:47:08.313382 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/5a7752f9-7b9a-451f-997a-e9f696d38b34-etcd-service-ca\") pod \"etcd-operator-5884b9cd56-vzms7\" (UID: \"5a7752f9-7b9a-451f-997a-e9f696d38b34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-vzms7" Mar 08 03:47:08.316113 master-0 kubenswrapper[4045]: I0308 03:47:08.315393 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 08 03:47:08.316113 master-0 kubenswrapper[4045]: I0308 03:47:08.315425 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 08 03:47:08.316113 master-0 kubenswrapper[4045]: I0308 03:47:08.315526 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 08 03:47:08.316113 master-0 kubenswrapper[4045]: I0308 03:47:08.315449 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a7752f9-7b9a-451f-997a-e9f696d38b34-config\") pod \"etcd-operator-5884b9cd56-vzms7\" (UID: \"5a7752f9-7b9a-451f-997a-e9f696d38b34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-vzms7" Mar 08 03:47:08.316113 master-0 kubenswrapper[4045]: I0308 03:47:08.315754 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 08 03:47:08.316113 master-0 kubenswrapper[4045]: I0308 03:47:08.315854 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/5a7752f9-7b9a-451f-997a-e9f696d38b34-etcd-ca\") pod 
\"etcd-operator-5884b9cd56-vzms7\" (UID: \"5a7752f9-7b9a-451f-997a-e9f696d38b34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-vzms7" Mar 08 03:47:08.316635 master-0 kubenswrapper[4045]: I0308 03:47:08.316453 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 08 03:47:08.317294 master-0 kubenswrapper[4045]: I0308 03:47:08.316988 4045 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 08 03:47:08.318495 master-0 kubenswrapper[4045]: I0308 03:47:08.318424 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a7752f9-7b9a-451f-997a-e9f696d38b34-serving-cert\") pod \"etcd-operator-5884b9cd56-vzms7\" (UID: \"5a7752f9-7b9a-451f-997a-e9f696d38b34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-vzms7" Mar 08 03:47:08.318613 master-0 kubenswrapper[4045]: I0308 03:47:08.318510 4045 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-8f89dfddd-4mr6p"] Mar 08 03:47:08.321461 master-0 kubenswrapper[4045]: I0308 03:47:08.320428 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 08 03:47:08.321461 master-0 kubenswrapper[4045]: I0308 03:47:08.320984 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5a7752f9-7b9a-451f-997a-e9f696d38b34-etcd-client\") pod \"etcd-operator-5884b9cd56-vzms7\" (UID: \"5a7752f9-7b9a-451f-997a-e9f696d38b34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-vzms7" Mar 08 03:47:08.321461 master-0 kubenswrapper[4045]: I0308 03:47:08.321158 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 08 03:47:08.322415 master-0 
kubenswrapper[4045]: I0308 03:47:08.322118 4045 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 08 03:47:08.322415 master-0 kubenswrapper[4045]: I0308 03:47:08.322148 4045 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 08 03:47:08.322415 master-0 kubenswrapper[4045]: I0308 03:47:08.322234 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 08 03:47:08.325679 master-0 kubenswrapper[4045]: I0308 03:47:08.325627 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 08 03:47:08.327583 master-0 kubenswrapper[4045]: I0308 03:47:08.327509 4045 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-g6n58"] Mar 08 03:47:08.329882 master-0 kubenswrapper[4045]: I0308 03:47:08.329842 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 08 03:47:08.413502 master-0 kubenswrapper[4045]: I0308 03:47:08.413424 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a-config\") pod \"kube-storage-version-migrator-operator-7f65c457f5-6fhhs\" (UID: \"5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-6fhhs" Mar 08 03:47:08.413502 master-0 kubenswrapper[4045]: I0308 03:47:08.413472 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/7ff63c73-62a3-44b4-acd3-1b3df175794f-operand-assets\") pod \"cluster-olm-operator-77899cf6d-x9h9q\" (UID: 
\"7ff63c73-62a3-44b4-acd3-1b3df175794f\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-x9h9q" Mar 08 03:47:08.413502 master-0 kubenswrapper[4045]: I0308 03:47:08.413498 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnzt7\" (UniqueName: \"kubernetes.io/projected/2dd4279d-a1a9-450a-a061-9008cd1ea8e0-kube-api-access-pnzt7\") pod \"olm-operator-d64cfc9db-qddlp\" (UID: \"2dd4279d-a1a9-450a-a061-9008cd1ea8e0\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qddlp" Mar 08 03:47:08.413794 master-0 kubenswrapper[4045]: I0308 03:47:08.413546 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/0918ba32-8e55-48d0-8e50-027c0dcb4bbd-available-featuregates\") pod \"openshift-config-operator-64488f9d78-vfgfp\" (UID: \"0918ba32-8e55-48d0-8e50-027c0dcb4bbd\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-vfgfp" Mar 08 03:47:08.413794 master-0 kubenswrapper[4045]: I0308 03:47:08.413602 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mghmh\" (UniqueName: \"kubernetes.io/projected/0918ba32-8e55-48d0-8e50-027c0dcb4bbd-kube-api-access-mghmh\") pod \"openshift-config-operator-64488f9d78-vfgfp\" (UID: \"0918ba32-8e55-48d0-8e50-027c0dcb4bbd\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-vfgfp" Mar 08 03:47:08.413794 master-0 kubenswrapper[4045]: I0308 03:47:08.413626 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d377285-0336-41b7-b48f-c44a7b563498-serving-cert\") pod \"service-ca-operator-69b6fc6b88-kg795\" (UID: \"0d377285-0336-41b7-b48f-c44a7b563498\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-kg795" Mar 08 
03:47:08.413794 master-0 kubenswrapper[4045]: I0308 03:47:08.413697 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qn5v\" (UniqueName: \"kubernetes.io/projected/0d377285-0336-41b7-b48f-c44a7b563498-kube-api-access-7qn5v\") pod \"service-ca-operator-69b6fc6b88-kg795\" (UID: \"0d377285-0336-41b7-b48f-c44a7b563498\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-kg795" Mar 08 03:47:08.413794 master-0 kubenswrapper[4045]: I0308 03:47:08.413760 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ebf1330-e044-4ff5-8b48-2d667e0c5625-config\") pod \"openshift-controller-manager-operator-8565d84698-kt66j\" (UID: \"0ebf1330-e044-4ff5-8b48-2d667e0c5625\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-kt66j" Mar 08 03:47:08.413794 master-0 kubenswrapper[4045]: I0308 03:47:08.413794 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ee586416-6f56-4ea4-ad62-95de1e6df23b-service-ca-bundle\") pod \"insights-operator-8f89dfddd-4mr6p\" (UID: \"ee586416-6f56-4ea4-ad62-95de1e6df23b\") " pod="openshift-insights/insights-operator-8f89dfddd-4mr6p" Mar 08 03:47:08.413794 master-0 kubenswrapper[4045]: I0308 03:47:08.413847 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a60bc804-52e7-422a-87fd-ac4c5aa90cb3-serving-cert\") pod \"authentication-operator-7c6989d6c4-slm72\" (UID: \"a60bc804-52e7-422a-87fd-ac4c5aa90cb3\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-slm72" Mar 08 03:47:08.414295 master-0 kubenswrapper[4045]: I0308 03:47:08.413933 4045 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e4541b7b-3f7f-4851-9bd9-26fcda5cab13-kube-api-access\") pod \"openshift-kube-scheduler-operator-5c74bfc494-g6n58\" (UID: \"e4541b7b-3f7f-4851-9bd9-26fcda5cab13\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-g6n58" Mar 08 03:47:08.414295 master-0 kubenswrapper[4045]: I0308 03:47:08.414059 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ebf1330-e044-4ff5-8b48-2d667e0c5625-serving-cert\") pod \"openshift-controller-manager-operator-8565d84698-kt66j\" (UID: \"0ebf1330-e044-4ff5-8b48-2d667e0c5625\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-kt66j" Mar 08 03:47:08.414295 master-0 kubenswrapper[4045]: I0308 03:47:08.414160 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/26180f77-0b1a-4d0f-9ed0-a12fdee69817-serving-cert\") pod \"kube-controller-manager-operator-86d7cdfdfb-chpl6\" (UID: \"26180f77-0b1a-4d0f-9ed0-a12fdee69817\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-chpl6" Mar 08 03:47:08.414295 master-0 kubenswrapper[4045]: I0308 03:47:08.414228 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfqc5\" (UniqueName: \"kubernetes.io/projected/7ff63c73-62a3-44b4-acd3-1b3df175794f-kube-api-access-vfqc5\") pod \"cluster-olm-operator-77899cf6d-x9h9q\" (UID: \"7ff63c73-62a3-44b4-acd3-1b3df175794f\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-x9h9q" Mar 08 03:47:08.414295 master-0 kubenswrapper[4045]: I0308 03:47:08.414255 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-29dpg\" (UniqueName: \"kubernetes.io/projected/c9de4939-680a-4e3e-89fd-e20ecb8b10f2-kube-api-access-29dpg\") pod \"ingress-operator-677db989d6-t77qr\" (UID: \"c9de4939-680a-4e3e-89fd-e20ecb8b10f2\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-t77qr" Mar 08 03:47:08.414690 master-0 kubenswrapper[4045]: I0308 03:47:08.414339 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x997v\" (UniqueName: \"kubernetes.io/projected/6cde5024-edf7-4fa4-8964-cabe7899578b-kube-api-access-x997v\") pod \"package-server-manager-854648ff6d-c46zz\" (UID: \"6cde5024-edf7-4fa4-8964-cabe7899578b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-c46zz" Mar 08 03:47:08.414690 master-0 kubenswrapper[4045]: I0308 03:47:08.414376 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/b3eea925-73b3-4693-8f0e-6dd26107f60a-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-6fbfc8dc8f-nm8fj\" (UID: \"b3eea925-73b3-4693-8f0e-6dd26107f60a\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-nm8fj" Mar 08 03:47:08.414690 master-0 kubenswrapper[4045]: I0308 03:47:08.414412 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/ee586416-6f56-4ea4-ad62-95de1e6df23b-snapshots\") pod \"insights-operator-8f89dfddd-4mr6p\" (UID: \"ee586416-6f56-4ea4-ad62-95de1e6df23b\") " pod="openshift-insights/insights-operator-8f89dfddd-4mr6p" Mar 08 03:47:08.414690 master-0 kubenswrapper[4045]: I0308 03:47:08.414446 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/ee586416-6f56-4ea4-ad62-95de1e6df23b-trusted-ca-bundle\") pod \"insights-operator-8f89dfddd-4mr6p\" (UID: \"ee586416-6f56-4ea4-ad62-95de1e6df23b\") " pod="openshift-insights/insights-operator-8f89dfddd-4mr6p" Mar 08 03:47:08.414690 master-0 kubenswrapper[4045]: I0308 03:47:08.414469 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0918ba32-8e55-48d0-8e50-027c0dcb4bbd-serving-cert\") pod \"openshift-config-operator-64488f9d78-vfgfp\" (UID: \"0918ba32-8e55-48d0-8e50-027c0dcb4bbd\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-vfgfp" Mar 08 03:47:08.414690 master-0 kubenswrapper[4045]: I0308 03:47:08.414552 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/7ff63c73-62a3-44b4-acd3-1b3df175794f-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-77899cf6d-x9h9q\" (UID: \"7ff63c73-62a3-44b4-acd3-1b3df175794f\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-x9h9q" Mar 08 03:47:08.414690 master-0 kubenswrapper[4045]: I0308 03:47:08.414605 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpsx7\" (UniqueName: \"kubernetes.io/projected/8efdcef9-9b31-4567-b7f9-cb59a894273d-kube-api-access-cpsx7\") pod \"dns-operator-589895fbb7-xttlz\" (UID: \"8efdcef9-9b31-4567-b7f9-cb59a894273d\") " pod="openshift-dns-operator/dns-operator-589895fbb7-xttlz" Mar 08 03:47:08.415222 master-0 kubenswrapper[4045]: I0308 03:47:08.414789 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c9de4939-680a-4e3e-89fd-e20ecb8b10f2-metrics-tls\") pod \"ingress-operator-677db989d6-t77qr\" (UID: 
\"c9de4939-680a-4e3e-89fd-e20ecb8b10f2\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-t77qr" Mar 08 03:47:08.415222 master-0 kubenswrapper[4045]: I0308 03:47:08.414945 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-jghp5\" (UID: \"d831cb23-7411-4072-8273-c167d9afca28\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-jghp5" Mar 08 03:47:08.415222 master-0 kubenswrapper[4045]: I0308 03:47:08.414981 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6sx5s\" (UniqueName: \"kubernetes.io/projected/b3eea925-73b3-4693-8f0e-6dd26107f60a-kube-api-access-6sx5s\") pod \"cluster-storage-operator-6fbfc8dc8f-nm8fj\" (UID: \"b3eea925-73b3-4693-8f0e-6dd26107f60a\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-nm8fj" Mar 08 03:47:08.415222 master-0 kubenswrapper[4045]: I0308 03:47:08.415022 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30211469-7108-4820-a988-26fc4ced734e-config\") pod \"openshift-apiserver-operator-799b6db4d7-75682\" (UID: \"30211469-7108-4820-a988-26fc4ced734e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-75682" Mar 08 03:47:08.415222 master-0 kubenswrapper[4045]: I0308 03:47:08.415088 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1cbcb403-a424-4496-8c5c-5eb5e42dfb93-serving-cert\") pod \"kube-apiserver-operator-68bd585b-8gfmf\" (UID: \"1cbcb403-a424-4496-8c5c-5eb5e42dfb93\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-8gfmf" Mar 08 03:47:08.415222 master-0 kubenswrapper[4045]: I0308 03:47:08.415175 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-572xh\" (UID: \"69eb8ba2-7bfb-4433-8951-08f89e7bcb5f\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-572xh" Mar 08 03:47:08.415569 master-0 kubenswrapper[4045]: I0308 03:47:08.415249 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fx4fw\" (UniqueName: \"kubernetes.io/projected/232c421d-96f0-4894-b8d8-74f43d02bbd3-kube-api-access-fx4fw\") pod \"cluster-node-tuning-operator-66c7586884-qjv52\" (UID: \"232c421d-96f0-4894-b8d8-74f43d02bbd3\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-qjv52" Mar 08 03:47:08.415569 master-0 kubenswrapper[4045]: I0308 03:47:08.415363 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dd549\" (UniqueName: \"kubernetes.io/projected/52b495ac-bb28-44f3-b925-3c54f86d5ec4-kube-api-access-dd549\") pod \"csi-snapshot-controller-operator-5685fbc7d-xhbrl\" (UID: \"52b495ac-bb28-44f3-b925-3c54f86d5ec4\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-xhbrl" Mar 08 03:47:08.415569 master-0 kubenswrapper[4045]: I0308 03:47:08.415529 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e78b283b-981e-48d7-a5f2-53f8401766ea-auth-proxy-config\") pod \"machine-config-operator-fdb5c78b5-2vjh2\" (UID: \"e78b283b-981e-48d7-a5f2-53f8401766ea\") " 
pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-2vjh2" Mar 08 03:47:08.415756 master-0 kubenswrapper[4045]: I0308 03:47:08.415654 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fncng\" (UniqueName: \"kubernetes.io/projected/30211469-7108-4820-a988-26fc4ced734e-kube-api-access-fncng\") pod \"openshift-apiserver-operator-799b6db4d7-75682\" (UID: \"30211469-7108-4820-a988-26fc4ced734e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-75682" Mar 08 03:47:08.417008 master-0 kubenswrapper[4045]: I0308 03:47:08.416942 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8efdcef9-9b31-4567-b7f9-cb59a894273d-metrics-tls\") pod \"dns-operator-589895fbb7-xttlz\" (UID: \"8efdcef9-9b31-4567-b7f9-cb59a894273d\") " pod="openshift-dns-operator/dns-operator-589895fbb7-xttlz" Mar 08 03:47:08.417143 master-0 kubenswrapper[4045]: I0308 03:47:08.417031 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxxhh\" (UniqueName: \"kubernetes.io/projected/ee586416-6f56-4ea4-ad62-95de1e6df23b-kube-api-access-sxxhh\") pod \"insights-operator-8f89dfddd-4mr6p\" (UID: \"ee586416-6f56-4ea4-ad62-95de1e6df23b\") " pod="openshift-insights/insights-operator-8f89dfddd-4mr6p" Mar 08 03:47:08.417317 master-0 kubenswrapper[4045]: I0308 03:47:08.417246 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2h62\" (UniqueName: \"kubernetes.io/projected/1482d789-884b-4337-b598-f0e2b71eb9f2-kube-api-access-m2h62\") pod \"catalog-operator-7d9c49f57b-qlfgq\" (UID: \"1482d789-884b-4337-b598-f0e2b71eb9f2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-qlfgq" Mar 08 03:47:08.417401 master-0 kubenswrapper[4045]: I0308 03:47:08.417332 4045 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c9de4939-680a-4e3e-89fd-e20ecb8b10f2-bound-sa-token\") pod \"ingress-operator-677db989d6-t77qr\" (UID: \"c9de4939-680a-4e3e-89fd-e20ecb8b10f2\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-t77qr" Mar 08 03:47:08.417401 master-0 kubenswrapper[4045]: I0308 03:47:08.417380 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfvnn\" (UniqueName: \"kubernetes.io/projected/5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a-kube-api-access-cfvnn\") pod \"kube-storage-version-migrator-operator-7f65c457f5-6fhhs\" (UID: \"5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-6fhhs" Mar 08 03:47:08.417535 master-0 kubenswrapper[4045]: I0308 03:47:08.417429 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmpdd\" (UniqueName: \"kubernetes.io/projected/0418ff42-7eac-4266-97b5-4df88623d066-kube-api-access-kmpdd\") pod \"cluster-monitoring-operator-674cbfbd9d-clqwj\" (UID: \"0418ff42-7eac-4266-97b5-4df88623d066\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-clqwj" Mar 08 03:47:08.417535 master-0 kubenswrapper[4045]: I0308 03:47:08.417491 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/54ad284e-d40e-4e69-b898-f5093952a0e6-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-9sw2d\" (UID: \"54ad284e-d40e-4e69-b898-f5093952a0e6\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-9sw2d" Mar 08 03:47:08.417759 master-0 kubenswrapper[4045]: I0308 03:47:08.417557 4045 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-jghp5\" (UID: \"d831cb23-7411-4072-8273-c167d9afca28\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-jghp5" Mar 08 03:47:08.417759 master-0 kubenswrapper[4045]: I0308 03:47:08.417596 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fw7mr\" (UniqueName: \"kubernetes.io/projected/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f-kube-api-access-fw7mr\") pod \"cluster-image-registry-operator-86d6d77c7c-572xh\" (UID: \"69eb8ba2-7bfb-4433-8951-08f89e7bcb5f\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-572xh" Mar 08 03:47:08.417759 master-0 kubenswrapper[4045]: I0308 03:47:08.417637 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/0418ff42-7eac-4266-97b5-4df88623d066-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-clqwj\" (UID: \"0418ff42-7eac-4266-97b5-4df88623d066\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-clqwj" Mar 08 03:47:08.417759 master-0 kubenswrapper[4045]: I0308 03:47:08.417684 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/232c421d-96f0-4894-b8d8-74f43d02bbd3-trusted-ca\") pod \"cluster-node-tuning-operator-66c7586884-qjv52\" (UID: \"232c421d-96f0-4894-b8d8-74f43d02bbd3\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-qjv52" Mar 08 03:47:08.417759 master-0 kubenswrapper[4045]: I0308 03:47:08.417731 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/c9de4939-680a-4e3e-89fd-e20ecb8b10f2-trusted-ca\") pod \"ingress-operator-677db989d6-t77qr\" (UID: \"c9de4939-680a-4e3e-89fd-e20ecb8b10f2\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-t77qr" Mar 08 03:47:08.418221 master-0 kubenswrapper[4045]: I0308 03:47:08.417778 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d377285-0336-41b7-b48f-c44a7b563498-config\") pod \"service-ca-operator-69b6fc6b88-kg795\" (UID: \"0d377285-0336-41b7-b48f-c44a7b563498\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-kg795" Mar 08 03:47:08.418221 master-0 kubenswrapper[4045]: I0308 03:47:08.417865 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26180f77-0b1a-4d0f-9ed0-a12fdee69817-config\") pod \"kube-controller-manager-operator-86d7cdfdfb-chpl6\" (UID: \"26180f77-0b1a-4d0f-9ed0-a12fdee69817\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-chpl6" Mar 08 03:47:08.418221 master-0 kubenswrapper[4045]: I0308 03:47:08.417908 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/54ad284e-d40e-4e69-b898-f5093952a0e6-marketplace-trusted-ca\") pod \"marketplace-operator-64bf9778cb-9sw2d\" (UID: \"54ad284e-d40e-4e69-b898-f5093952a0e6\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-9sw2d" Mar 08 03:47:08.418221 master-0 kubenswrapper[4045]: I0308 03:47:08.417936 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1cbcb403-a424-4496-8c5c-5eb5e42dfb93-kube-api-access\") pod \"kube-apiserver-operator-68bd585b-8gfmf\" (UID: 
\"1cbcb403-a424-4496-8c5c-5eb5e42dfb93\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-8gfmf" Mar 08 03:47:08.418221 master-0 kubenswrapper[4045]: I0308 03:47:08.417957 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lfcj\" (UniqueName: \"kubernetes.io/projected/54ad284e-d40e-4e69-b898-f5093952a0e6-kube-api-access-9lfcj\") pod \"marketplace-operator-64bf9778cb-9sw2d\" (UID: \"54ad284e-d40e-4e69-b898-f5093952a0e6\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-9sw2d" Mar 08 03:47:08.418221 master-0 kubenswrapper[4045]: I0308 03:47:08.417982 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f-trusted-ca\") pod \"cluster-image-registry-operator-86d6d77c7c-572xh\" (UID: \"69eb8ba2-7bfb-4433-8951-08f89e7bcb5f\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-572xh" Mar 08 03:47:08.418221 master-0 kubenswrapper[4045]: I0308 03:47:08.418061 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxkm6\" (UniqueName: \"kubernetes.io/projected/a60bc804-52e7-422a-87fd-ac4c5aa90cb3-kube-api-access-zxkm6\") pod \"authentication-operator-7c6989d6c4-slm72\" (UID: \"a60bc804-52e7-422a-87fd-ac4c5aa90cb3\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-slm72" Mar 08 03:47:08.418221 master-0 kubenswrapper[4045]: I0308 03:47:08.418153 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e78b283b-981e-48d7-a5f2-53f8401766ea-images\") pod \"machine-config-operator-fdb5c78b5-2vjh2\" (UID: \"e78b283b-981e-48d7-a5f2-53f8401766ea\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-2vjh2" Mar 08 
03:47:08.418221 master-0 kubenswrapper[4045]: I0308 03:47:08.418181 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee586416-6f56-4ea4-ad62-95de1e6df23b-serving-cert\") pod \"insights-operator-8f89dfddd-4mr6p\" (UID: \"ee586416-6f56-4ea4-ad62-95de1e6df23b\") " pod="openshift-insights/insights-operator-8f89dfddd-4mr6p" Mar 08 03:47:08.418221 master-0 kubenswrapper[4045]: I0308 03:47:08.418220 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rchj5\" (UniqueName: \"kubernetes.io/projected/e78b283b-981e-48d7-a5f2-53f8401766ea-kube-api-access-rchj5\") pod \"machine-config-operator-fdb5c78b5-2vjh2\" (UID: \"e78b283b-981e-48d7-a5f2-53f8401766ea\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-2vjh2" Mar 08 03:47:08.419037 master-0 kubenswrapper[4045]: I0308 03:47:08.418276 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hccv4\" (UniqueName: \"kubernetes.io/projected/0ebf1330-e044-4ff5-8b48-2d667e0c5625-kube-api-access-hccv4\") pod \"openshift-controller-manager-operator-8565d84698-kt66j\" (UID: \"0ebf1330-e044-4ff5-8b48-2d667e0c5625\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-kt66j" Mar 08 03:47:08.419037 master-0 kubenswrapper[4045]: I0308 03:47:08.418317 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/2dd4279d-a1a9-450a-a061-9008cd1ea8e0-srv-cert\") pod \"olm-operator-d64cfc9db-qddlp\" (UID: \"2dd4279d-a1a9-450a-a061-9008cd1ea8e0\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qddlp" Mar 08 03:47:08.419037 master-0 kubenswrapper[4045]: I0308 03:47:08.418366 4045 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4541b7b-3f7f-4851-9bd9-26fcda5cab13-config\") pod \"openshift-kube-scheduler-operator-5c74bfc494-g6n58\" (UID: \"e4541b7b-3f7f-4851-9bd9-26fcda5cab13\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-g6n58" Mar 08 03:47:08.419037 master-0 kubenswrapper[4045]: I0308 03:47:08.418385 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d831cb23-7411-4072-8273-c167d9afca28-images\") pod \"cluster-baremetal-operator-5cdb4c5598-jghp5\" (UID: \"d831cb23-7411-4072-8273-c167d9afca28\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-jghp5" Mar 08 03:47:08.419037 master-0 kubenswrapper[4045]: I0308 03:47:08.418401 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-qjv52\" (UID: \"232c421d-96f0-4894-b8d8-74f43d02bbd3\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-qjv52" Mar 08 03:47:08.419037 master-0 kubenswrapper[4045]: I0308 03:47:08.418419 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cbcb403-a424-4496-8c5c-5eb5e42dfb93-config\") pod \"kube-apiserver-operator-68bd585b-8gfmf\" (UID: \"1cbcb403-a424-4496-8c5c-5eb5e42dfb93\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-8gfmf" Mar 08 03:47:08.419037 master-0 kubenswrapper[4045]: I0308 03:47:08.418437 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/30211469-7108-4820-a988-26fc4ced734e-serving-cert\") pod \"openshift-apiserver-operator-799b6db4d7-75682\" (UID: \"30211469-7108-4820-a988-26fc4ced734e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-75682" Mar 08 03:47:08.419037 master-0 kubenswrapper[4045]: I0308 03:47:08.418453 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a60bc804-52e7-422a-87fd-ac4c5aa90cb3-service-ca-bundle\") pod \"authentication-operator-7c6989d6c4-slm72\" (UID: \"a60bc804-52e7-422a-87fd-ac4c5aa90cb3\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-slm72" Mar 08 03:47:08.419037 master-0 kubenswrapper[4045]: I0308 03:47:08.418470 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/26180f77-0b1a-4d0f-9ed0-a12fdee69817-kube-api-access\") pod \"kube-controller-manager-operator-86d7cdfdfb-chpl6\" (UID: \"26180f77-0b1a-4d0f-9ed0-a12fdee69817\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-chpl6" Mar 08 03:47:08.419037 master-0 kubenswrapper[4045]: I0308 03:47:08.418487 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4541b7b-3f7f-4851-9bd9-26fcda5cab13-serving-cert\") pod \"openshift-kube-scheduler-operator-5c74bfc494-g6n58\" (UID: \"e4541b7b-3f7f-4851-9bd9-26fcda5cab13\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-g6n58" Mar 08 03:47:08.419037 master-0 kubenswrapper[4045]: I0308 03:47:08.418505 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: 
\"kubernetes.io/configmap/0418ff42-7eac-4266-97b5-4df88623d066-telemetry-config\") pod \"cluster-monitoring-operator-674cbfbd9d-clqwj\" (UID: \"0418ff42-7eac-4266-97b5-4df88623d066\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-clqwj" Mar 08 03:47:08.419037 master-0 kubenswrapper[4045]: I0308 03:47:08.418523 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/6cde5024-edf7-4fa4-8964-cabe7899578b-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-c46zz\" (UID: \"6cde5024-edf7-4fa4-8964-cabe7899578b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-c46zz" Mar 08 03:47:08.419037 master-0 kubenswrapper[4045]: I0308 03:47:08.418643 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a60bc804-52e7-422a-87fd-ac4c5aa90cb3-trusted-ca-bundle\") pod \"authentication-operator-7c6989d6c4-slm72\" (UID: \"a60bc804-52e7-422a-87fd-ac4c5aa90cb3\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-slm72" Mar 08 03:47:08.419037 master-0 kubenswrapper[4045]: I0308 03:47:08.418687 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d831cb23-7411-4072-8273-c167d9afca28-config\") pod \"cluster-baremetal-operator-5cdb4c5598-jghp5\" (UID: \"d831cb23-7411-4072-8273-c167d9afca28\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-jghp5" Mar 08 03:47:08.420133 master-0 kubenswrapper[4045]: I0308 03:47:08.418846 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e78b283b-981e-48d7-a5f2-53f8401766ea-proxy-tls\") pod 
\"machine-config-operator-fdb5c78b5-2vjh2\" (UID: \"e78b283b-981e-48d7-a5f2-53f8401766ea\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-2vjh2" Mar 08 03:47:08.420133 master-0 kubenswrapper[4045]: I0308 03:47:08.418895 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1482d789-884b-4337-b598-f0e2b71eb9f2-srv-cert\") pod \"catalog-operator-7d9c49f57b-qlfgq\" (UID: \"1482d789-884b-4337-b598-f0e2b71eb9f2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-qlfgq" Mar 08 03:47:08.420133 master-0 kubenswrapper[4045]: I0308 03:47:08.418997 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a60bc804-52e7-422a-87fd-ac4c5aa90cb3-config\") pod \"authentication-operator-7c6989d6c4-slm72\" (UID: \"a60bc804-52e7-422a-87fd-ac4c5aa90cb3\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-slm72" Mar 08 03:47:08.420133 master-0 kubenswrapper[4045]: I0308 03:47:08.419030 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a-serving-cert\") pod \"kube-storage-version-migrator-operator-7f65c457f5-6fhhs\" (UID: \"5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-6fhhs" Mar 08 03:47:08.420133 master-0 kubenswrapper[4045]: I0308 03:47:08.419061 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwkwt\" (UniqueName: \"kubernetes.io/projected/d831cb23-7411-4072-8273-c167d9afca28-kube-api-access-dwkwt\") pod \"cluster-baremetal-operator-5cdb4c5598-jghp5\" (UID: \"d831cb23-7411-4072-8273-c167d9afca28\") " 
pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-jghp5" Mar 08 03:47:08.420133 master-0 kubenswrapper[4045]: I0308 03:47:08.419093 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-qjv52\" (UID: \"232c421d-96f0-4894-b8d8-74f43d02bbd3\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-qjv52" Mar 08 03:47:08.420133 master-0 kubenswrapper[4045]: I0308 03:47:08.419124 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f-bound-sa-token\") pod \"cluster-image-registry-operator-86d6d77c7c-572xh\" (UID: \"69eb8ba2-7bfb-4433-8951-08f89e7bcb5f\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-572xh" Mar 08 03:47:08.524539 master-0 kubenswrapper[4045]: I0308 03:47:08.521260 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-jghp5\" (UID: \"d831cb23-7411-4072-8273-c167d9afca28\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-jghp5" Mar 08 03:47:08.524539 master-0 kubenswrapper[4045]: I0308 03:47:08.521338 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fw7mr\" (UniqueName: \"kubernetes.io/projected/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f-kube-api-access-fw7mr\") pod \"cluster-image-registry-operator-86d6d77c7c-572xh\" (UID: \"69eb8ba2-7bfb-4433-8951-08f89e7bcb5f\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-572xh" Mar 08 03:47:08.524539 master-0 kubenswrapper[4045]: I0308 
03:47:08.521367 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/0418ff42-7eac-4266-97b5-4df88623d066-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-clqwj\" (UID: \"0418ff42-7eac-4266-97b5-4df88623d066\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-clqwj" Mar 08 03:47:08.524539 master-0 kubenswrapper[4045]: I0308 03:47:08.521396 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/232c421d-96f0-4894-b8d8-74f43d02bbd3-trusted-ca\") pod \"cluster-node-tuning-operator-66c7586884-qjv52\" (UID: \"232c421d-96f0-4894-b8d8-74f43d02bbd3\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-qjv52" Mar 08 03:47:08.524539 master-0 kubenswrapper[4045]: I0308 03:47:08.521456 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1eb851be-f157-48ea-9a39-1361b68d2639-webhook-certs\") pod \"multus-admission-controller-8d675b596-j8pv6\" (UID: \"1eb851be-f157-48ea-9a39-1361b68d2639\") " pod="openshift-multus/multus-admission-controller-8d675b596-j8pv6" Mar 08 03:47:08.524539 master-0 kubenswrapper[4045]: I0308 03:47:08.521486 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d377285-0336-41b7-b48f-c44a7b563498-config\") pod \"service-ca-operator-69b6fc6b88-kg795\" (UID: \"0d377285-0336-41b7-b48f-c44a7b563498\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-kg795" Mar 08 03:47:08.524539 master-0 kubenswrapper[4045]: I0308 03:47:08.521511 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/c9de4939-680a-4e3e-89fd-e20ecb8b10f2-trusted-ca\") pod \"ingress-operator-677db989d6-t77qr\" (UID: \"c9de4939-680a-4e3e-89fd-e20ecb8b10f2\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-t77qr" Mar 08 03:47:08.524539 master-0 kubenswrapper[4045]: I0308 03:47:08.521535 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/54ad284e-d40e-4e69-b898-f5093952a0e6-marketplace-trusted-ca\") pod \"marketplace-operator-64bf9778cb-9sw2d\" (UID: \"54ad284e-d40e-4e69-b898-f5093952a0e6\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-9sw2d" Mar 08 03:47:08.524539 master-0 kubenswrapper[4045]: I0308 03:47:08.521558 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26180f77-0b1a-4d0f-9ed0-a12fdee69817-config\") pod \"kube-controller-manager-operator-86d7cdfdfb-chpl6\" (UID: \"26180f77-0b1a-4d0f-9ed0-a12fdee69817\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-chpl6" Mar 08 03:47:08.524539 master-0 kubenswrapper[4045]: I0308 03:47:08.521586 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1cbcb403-a424-4496-8c5c-5eb5e42dfb93-kube-api-access\") pod \"kube-apiserver-operator-68bd585b-8gfmf\" (UID: \"1cbcb403-a424-4496-8c5c-5eb5e42dfb93\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-8gfmf" Mar 08 03:47:08.524539 master-0 kubenswrapper[4045]: I0308 03:47:08.521614 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lfcj\" (UniqueName: \"kubernetes.io/projected/54ad284e-d40e-4e69-b898-f5093952a0e6-kube-api-access-9lfcj\") pod \"marketplace-operator-64bf9778cb-9sw2d\" (UID: \"54ad284e-d40e-4e69-b898-f5093952a0e6\") " 
pod="openshift-marketplace/marketplace-operator-64bf9778cb-9sw2d" Mar 08 03:47:08.524539 master-0 kubenswrapper[4045]: I0308 03:47:08.521666 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f-trusted-ca\") pod \"cluster-image-registry-operator-86d6d77c7c-572xh\" (UID: \"69eb8ba2-7bfb-4433-8951-08f89e7bcb5f\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-572xh" Mar 08 03:47:08.524539 master-0 kubenswrapper[4045]: I0308 03:47:08.521696 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxkm6\" (UniqueName: \"kubernetes.io/projected/a60bc804-52e7-422a-87fd-ac4c5aa90cb3-kube-api-access-zxkm6\") pod \"authentication-operator-7c6989d6c4-slm72\" (UID: \"a60bc804-52e7-422a-87fd-ac4c5aa90cb3\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-slm72" Mar 08 03:47:08.524539 master-0 kubenswrapper[4045]: I0308 03:47:08.521724 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e78b283b-981e-48d7-a5f2-53f8401766ea-images\") pod \"machine-config-operator-fdb5c78b5-2vjh2\" (UID: \"e78b283b-981e-48d7-a5f2-53f8401766ea\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-2vjh2" Mar 08 03:47:08.524539 master-0 kubenswrapper[4045]: I0308 03:47:08.521746 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee586416-6f56-4ea4-ad62-95de1e6df23b-serving-cert\") pod \"insights-operator-8f89dfddd-4mr6p\" (UID: \"ee586416-6f56-4ea4-ad62-95de1e6df23b\") " pod="openshift-insights/insights-operator-8f89dfddd-4mr6p" Mar 08 03:47:08.528101 master-0 kubenswrapper[4045]: I0308 03:47:08.521770 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-rchj5\" (UniqueName: \"kubernetes.io/projected/e78b283b-981e-48d7-a5f2-53f8401766ea-kube-api-access-rchj5\") pod \"machine-config-operator-fdb5c78b5-2vjh2\" (UID: \"e78b283b-981e-48d7-a5f2-53f8401766ea\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-2vjh2" Mar 08 03:47:08.528101 master-0 kubenswrapper[4045]: I0308 03:47:08.521796 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hccv4\" (UniqueName: \"kubernetes.io/projected/0ebf1330-e044-4ff5-8b48-2d667e0c5625-kube-api-access-hccv4\") pod \"openshift-controller-manager-operator-8565d84698-kt66j\" (UID: \"0ebf1330-e044-4ff5-8b48-2d667e0c5625\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-kt66j" Mar 08 03:47:08.528101 master-0 kubenswrapper[4045]: I0308 03:47:08.521842 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/2dd4279d-a1a9-450a-a061-9008cd1ea8e0-srv-cert\") pod \"olm-operator-d64cfc9db-qddlp\" (UID: \"2dd4279d-a1a9-450a-a061-9008cd1ea8e0\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qddlp" Mar 08 03:47:08.528101 master-0 kubenswrapper[4045]: I0308 03:47:08.521868 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqhzl\" (UniqueName: \"kubernetes.io/projected/1eb851be-f157-48ea-9a39-1361b68d2639-kube-api-access-nqhzl\") pod \"multus-admission-controller-8d675b596-j8pv6\" (UID: \"1eb851be-f157-48ea-9a39-1361b68d2639\") " pod="openshift-multus/multus-admission-controller-8d675b596-j8pv6" Mar 08 03:47:08.528101 master-0 kubenswrapper[4045]: I0308 03:47:08.521917 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4541b7b-3f7f-4851-9bd9-26fcda5cab13-config\") pod \"openshift-kube-scheduler-operator-5c74bfc494-g6n58\" 
(UID: \"e4541b7b-3f7f-4851-9bd9-26fcda5cab13\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-g6n58" Mar 08 03:47:08.528101 master-0 kubenswrapper[4045]: I0308 03:47:08.521942 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d831cb23-7411-4072-8273-c167d9afca28-images\") pod \"cluster-baremetal-operator-5cdb4c5598-jghp5\" (UID: \"d831cb23-7411-4072-8273-c167d9afca28\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-jghp5" Mar 08 03:47:08.528101 master-0 kubenswrapper[4045]: I0308 03:47:08.521967 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-qjv52\" (UID: \"232c421d-96f0-4894-b8d8-74f43d02bbd3\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-qjv52" Mar 08 03:47:08.528101 master-0 kubenswrapper[4045]: I0308 03:47:08.521992 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/30211469-7108-4820-a988-26fc4ced734e-serving-cert\") pod \"openshift-apiserver-operator-799b6db4d7-75682\" (UID: \"30211469-7108-4820-a988-26fc4ced734e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-75682" Mar 08 03:47:08.528101 master-0 kubenswrapper[4045]: I0308 03:47:08.522044 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cbcb403-a424-4496-8c5c-5eb5e42dfb93-config\") pod \"kube-apiserver-operator-68bd585b-8gfmf\" (UID: \"1cbcb403-a424-4496-8c5c-5eb5e42dfb93\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-8gfmf" Mar 08 03:47:08.528101 master-0 kubenswrapper[4045]: I0308 03:47:08.522072 4045 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a60bc804-52e7-422a-87fd-ac4c5aa90cb3-service-ca-bundle\") pod \"authentication-operator-7c6989d6c4-slm72\" (UID: \"a60bc804-52e7-422a-87fd-ac4c5aa90cb3\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-slm72" Mar 08 03:47:08.528101 master-0 kubenswrapper[4045]: I0308 03:47:08.522096 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/26180f77-0b1a-4d0f-9ed0-a12fdee69817-kube-api-access\") pod \"kube-controller-manager-operator-86d7cdfdfb-chpl6\" (UID: \"26180f77-0b1a-4d0f-9ed0-a12fdee69817\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-chpl6" Mar 08 03:47:08.528101 master-0 kubenswrapper[4045]: I0308 03:47:08.522141 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4541b7b-3f7f-4851-9bd9-26fcda5cab13-serving-cert\") pod \"openshift-kube-scheduler-operator-5c74bfc494-g6n58\" (UID: \"e4541b7b-3f7f-4851-9bd9-26fcda5cab13\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-g6n58" Mar 08 03:47:08.528101 master-0 kubenswrapper[4045]: I0308 03:47:08.522168 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/0418ff42-7eac-4266-97b5-4df88623d066-telemetry-config\") pod \"cluster-monitoring-operator-674cbfbd9d-clqwj\" (UID: \"0418ff42-7eac-4266-97b5-4df88623d066\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-clqwj" Mar 08 03:47:08.528101 master-0 kubenswrapper[4045]: I0308 03:47:08.522193 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/6cde5024-edf7-4fa4-8964-cabe7899578b-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-c46zz\" (UID: \"6cde5024-edf7-4fa4-8964-cabe7899578b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-c46zz" Mar 08 03:47:08.528101 master-0 kubenswrapper[4045]: I0308 03:47:08.522219 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a60bc804-52e7-422a-87fd-ac4c5aa90cb3-trusted-ca-bundle\") pod \"authentication-operator-7c6989d6c4-slm72\" (UID: \"a60bc804-52e7-422a-87fd-ac4c5aa90cb3\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-slm72" Mar 08 03:47:08.529088 master-0 kubenswrapper[4045]: I0308 03:47:08.522245 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d831cb23-7411-4072-8273-c167d9afca28-config\") pod \"cluster-baremetal-operator-5cdb4c5598-jghp5\" (UID: \"d831cb23-7411-4072-8273-c167d9afca28\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-jghp5" Mar 08 03:47:08.529088 master-0 kubenswrapper[4045]: I0308 03:47:08.522269 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e78b283b-981e-48d7-a5f2-53f8401766ea-proxy-tls\") pod \"machine-config-operator-fdb5c78b5-2vjh2\" (UID: \"e78b283b-981e-48d7-a5f2-53f8401766ea\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-2vjh2" Mar 08 03:47:08.529088 master-0 kubenswrapper[4045]: I0308 03:47:08.522292 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1482d789-884b-4337-b598-f0e2b71eb9f2-srv-cert\") pod \"catalog-operator-7d9c49f57b-qlfgq\" (UID: \"1482d789-884b-4337-b598-f0e2b71eb9f2\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-qlfgq" Mar 08 03:47:08.529088 master-0 kubenswrapper[4045]: I0308 03:47:08.522324 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a60bc804-52e7-422a-87fd-ac4c5aa90cb3-config\") pod \"authentication-operator-7c6989d6c4-slm72\" (UID: \"a60bc804-52e7-422a-87fd-ac4c5aa90cb3\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-slm72" Mar 08 03:47:08.529088 master-0 kubenswrapper[4045]: I0308 03:47:08.522348 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a-serving-cert\") pod \"kube-storage-version-migrator-operator-7f65c457f5-6fhhs\" (UID: \"5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-6fhhs" Mar 08 03:47:08.529088 master-0 kubenswrapper[4045]: I0308 03:47:08.522374 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwkwt\" (UniqueName: \"kubernetes.io/projected/d831cb23-7411-4072-8273-c167d9afca28-kube-api-access-dwkwt\") pod \"cluster-baremetal-operator-5cdb4c5598-jghp5\" (UID: \"d831cb23-7411-4072-8273-c167d9afca28\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-jghp5" Mar 08 03:47:08.529088 master-0 kubenswrapper[4045]: I0308 03:47:08.522399 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-qjv52\" (UID: \"232c421d-96f0-4894-b8d8-74f43d02bbd3\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-qjv52" Mar 08 03:47:08.529088 master-0 kubenswrapper[4045]: I0308 
03:47:08.522424 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f-bound-sa-token\") pod \"cluster-image-registry-operator-86d6d77c7c-572xh\" (UID: \"69eb8ba2-7bfb-4433-8951-08f89e7bcb5f\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-572xh" Mar 08 03:47:08.529088 master-0 kubenswrapper[4045]: I0308 03:47:08.522459 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a-config\") pod \"kube-storage-version-migrator-operator-7f65c457f5-6fhhs\" (UID: \"5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-6fhhs" Mar 08 03:47:08.529088 master-0 kubenswrapper[4045]: I0308 03:47:08.522482 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/7ff63c73-62a3-44b4-acd3-1b3df175794f-operand-assets\") pod \"cluster-olm-operator-77899cf6d-x9h9q\" (UID: \"7ff63c73-62a3-44b4-acd3-1b3df175794f\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-x9h9q" Mar 08 03:47:08.529088 master-0 kubenswrapper[4045]: I0308 03:47:08.522508 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnzt7\" (UniqueName: \"kubernetes.io/projected/2dd4279d-a1a9-450a-a061-9008cd1ea8e0-kube-api-access-pnzt7\") pod \"olm-operator-d64cfc9db-qddlp\" (UID: \"2dd4279d-a1a9-450a-a061-9008cd1ea8e0\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qddlp" Mar 08 03:47:08.529088 master-0 kubenswrapper[4045]: I0308 03:47:08.522533 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: 
\"kubernetes.io/empty-dir/0918ba32-8e55-48d0-8e50-027c0dcb4bbd-available-featuregates\") pod \"openshift-config-operator-64488f9d78-vfgfp\" (UID: \"0918ba32-8e55-48d0-8e50-027c0dcb4bbd\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-vfgfp" Mar 08 03:47:08.529088 master-0 kubenswrapper[4045]: I0308 03:47:08.522557 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mghmh\" (UniqueName: \"kubernetes.io/projected/0918ba32-8e55-48d0-8e50-027c0dcb4bbd-kube-api-access-mghmh\") pod \"openshift-config-operator-64488f9d78-vfgfp\" (UID: \"0918ba32-8e55-48d0-8e50-027c0dcb4bbd\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-vfgfp" Mar 08 03:47:08.529088 master-0 kubenswrapper[4045]: I0308 03:47:08.522583 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qn5v\" (UniqueName: \"kubernetes.io/projected/0d377285-0336-41b7-b48f-c44a7b563498-kube-api-access-7qn5v\") pod \"service-ca-operator-69b6fc6b88-kg795\" (UID: \"0d377285-0336-41b7-b48f-c44a7b563498\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-kg795" Mar 08 03:47:08.529088 master-0 kubenswrapper[4045]: I0308 03:47:08.522614 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ebf1330-e044-4ff5-8b48-2d667e0c5625-config\") pod \"openshift-controller-manager-operator-8565d84698-kt66j\" (UID: \"0ebf1330-e044-4ff5-8b48-2d667e0c5625\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-kt66j" Mar 08 03:47:08.529898 master-0 kubenswrapper[4045]: I0308 03:47:08.522640 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ee586416-6f56-4ea4-ad62-95de1e6df23b-service-ca-bundle\") pod \"insights-operator-8f89dfddd-4mr6p\" (UID: 
\"ee586416-6f56-4ea4-ad62-95de1e6df23b\") " pod="openshift-insights/insights-operator-8f89dfddd-4mr6p" Mar 08 03:47:08.529898 master-0 kubenswrapper[4045]: I0308 03:47:08.522663 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a60bc804-52e7-422a-87fd-ac4c5aa90cb3-serving-cert\") pod \"authentication-operator-7c6989d6c4-slm72\" (UID: \"a60bc804-52e7-422a-87fd-ac4c5aa90cb3\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-slm72" Mar 08 03:47:08.529898 master-0 kubenswrapper[4045]: I0308 03:47:08.522685 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d377285-0336-41b7-b48f-c44a7b563498-serving-cert\") pod \"service-ca-operator-69b6fc6b88-kg795\" (UID: \"0d377285-0336-41b7-b48f-c44a7b563498\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-kg795" Mar 08 03:47:08.529898 master-0 kubenswrapper[4045]: I0308 03:47:08.522717 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ebf1330-e044-4ff5-8b48-2d667e0c5625-serving-cert\") pod \"openshift-controller-manager-operator-8565d84698-kt66j\" (UID: \"0ebf1330-e044-4ff5-8b48-2d667e0c5625\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-kt66j" Mar 08 03:47:08.529898 master-0 kubenswrapper[4045]: I0308 03:47:08.522750 4045 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-jghp5"] Mar 08 03:47:08.529898 master-0 kubenswrapper[4045]: I0308 03:47:08.522781 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e4541b7b-3f7f-4851-9bd9-26fcda5cab13-kube-api-access\") pod \"openshift-kube-scheduler-operator-5c74bfc494-g6n58\" (UID: 
\"e4541b7b-3f7f-4851-9bd9-26fcda5cab13\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-g6n58" Mar 08 03:47:08.529898 master-0 kubenswrapper[4045]: I0308 03:47:08.522809 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/26180f77-0b1a-4d0f-9ed0-a12fdee69817-serving-cert\") pod \"kube-controller-manager-operator-86d7cdfdfb-chpl6\" (UID: \"26180f77-0b1a-4d0f-9ed0-a12fdee69817\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-chpl6" Mar 08 03:47:08.529898 master-0 kubenswrapper[4045]: I0308 03:47:08.522897 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29dpg\" (UniqueName: \"kubernetes.io/projected/c9de4939-680a-4e3e-89fd-e20ecb8b10f2-kube-api-access-29dpg\") pod \"ingress-operator-677db989d6-t77qr\" (UID: \"c9de4939-680a-4e3e-89fd-e20ecb8b10f2\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-t77qr" Mar 08 03:47:08.529898 master-0 kubenswrapper[4045]: I0308 03:47:08.522923 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfqc5\" (UniqueName: \"kubernetes.io/projected/7ff63c73-62a3-44b4-acd3-1b3df175794f-kube-api-access-vfqc5\") pod \"cluster-olm-operator-77899cf6d-x9h9q\" (UID: \"7ff63c73-62a3-44b4-acd3-1b3df175794f\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-x9h9q" Mar 08 03:47:08.529898 master-0 kubenswrapper[4045]: I0308 03:47:08.522965 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x997v\" (UniqueName: \"kubernetes.io/projected/6cde5024-edf7-4fa4-8964-cabe7899578b-kube-api-access-x997v\") pod \"package-server-manager-854648ff6d-c46zz\" (UID: \"6cde5024-edf7-4fa4-8964-cabe7899578b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-c46zz" Mar 08 03:47:08.529898 master-0 
kubenswrapper[4045]: I0308 03:47:08.522992 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/ee586416-6f56-4ea4-ad62-95de1e6df23b-snapshots\") pod \"insights-operator-8f89dfddd-4mr6p\" (UID: \"ee586416-6f56-4ea4-ad62-95de1e6df23b\") " pod="openshift-insights/insights-operator-8f89dfddd-4mr6p" Mar 08 03:47:08.529898 master-0 kubenswrapper[4045]: I0308 03:47:08.523014 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ee586416-6f56-4ea4-ad62-95de1e6df23b-trusted-ca-bundle\") pod \"insights-operator-8f89dfddd-4mr6p\" (UID: \"ee586416-6f56-4ea4-ad62-95de1e6df23b\") " pod="openshift-insights/insights-operator-8f89dfddd-4mr6p" Mar 08 03:47:08.529898 master-0 kubenswrapper[4045]: I0308 03:47:08.523038 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0918ba32-8e55-48d0-8e50-027c0dcb4bbd-serving-cert\") pod \"openshift-config-operator-64488f9d78-vfgfp\" (UID: \"0918ba32-8e55-48d0-8e50-027c0dcb4bbd\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-vfgfp" Mar 08 03:47:08.529898 master-0 kubenswrapper[4045]: I0308 03:47:08.523046 4045 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-6fhhs"] Mar 08 03:47:08.529898 master-0 kubenswrapper[4045]: I0308 03:47:08.523064 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/7ff63c73-62a3-44b4-acd3-1b3df175794f-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-77899cf6d-x9h9q\" (UID: \"7ff63c73-62a3-44b4-acd3-1b3df175794f\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-x9h9q" Mar 08 03:47:08.529898 master-0 
kubenswrapper[4045]: I0308 03:47:08.523094 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/b3eea925-73b3-4693-8f0e-6dd26107f60a-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-6fbfc8dc8f-nm8fj\" (UID: \"b3eea925-73b3-4693-8f0e-6dd26107f60a\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-nm8fj" Mar 08 03:47:08.531092 master-0 kubenswrapper[4045]: I0308 03:47:08.523127 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpsx7\" (UniqueName: \"kubernetes.io/projected/8efdcef9-9b31-4567-b7f9-cb59a894273d-kube-api-access-cpsx7\") pod \"dns-operator-589895fbb7-xttlz\" (UID: \"8efdcef9-9b31-4567-b7f9-cb59a894273d\") " pod="openshift-dns-operator/dns-operator-589895fbb7-xttlz" Mar 08 03:47:08.531092 master-0 kubenswrapper[4045]: I0308 03:47:08.523164 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c9de4939-680a-4e3e-89fd-e20ecb8b10f2-metrics-tls\") pod \"ingress-operator-677db989d6-t77qr\" (UID: \"c9de4939-680a-4e3e-89fd-e20ecb8b10f2\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-t77qr" Mar 08 03:47:08.531092 master-0 kubenswrapper[4045]: I0308 03:47:08.523194 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-jghp5\" (UID: \"d831cb23-7411-4072-8273-c167d9afca28\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-jghp5" Mar 08 03:47:08.531092 master-0 kubenswrapper[4045]: I0308 03:47:08.523221 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6sx5s\" (UniqueName: 
\"kubernetes.io/projected/b3eea925-73b3-4693-8f0e-6dd26107f60a-kube-api-access-6sx5s\") pod \"cluster-storage-operator-6fbfc8dc8f-nm8fj\" (UID: \"b3eea925-73b3-4693-8f0e-6dd26107f60a\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-nm8fj" Mar 08 03:47:08.531092 master-0 kubenswrapper[4045]: I0308 03:47:08.523254 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30211469-7108-4820-a988-26fc4ced734e-config\") pod \"openshift-apiserver-operator-799b6db4d7-75682\" (UID: \"30211469-7108-4820-a988-26fc4ced734e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-75682" Mar 08 03:47:08.531092 master-0 kubenswrapper[4045]: I0308 03:47:08.523391 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1cbcb403-a424-4496-8c5c-5eb5e42dfb93-serving-cert\") pod \"kube-apiserver-operator-68bd585b-8gfmf\" (UID: \"1cbcb403-a424-4496-8c5c-5eb5e42dfb93\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-8gfmf" Mar 08 03:47:08.531092 master-0 kubenswrapper[4045]: I0308 03:47:08.523426 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-572xh\" (UID: \"69eb8ba2-7bfb-4433-8951-08f89e7bcb5f\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-572xh" Mar 08 03:47:08.531092 master-0 kubenswrapper[4045]: I0308 03:47:08.523459 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dd549\" (UniqueName: \"kubernetes.io/projected/52b495ac-bb28-44f3-b925-3c54f86d5ec4-kube-api-access-dd549\") pod \"csi-snapshot-controller-operator-5685fbc7d-xhbrl\" (UID: 
\"52b495ac-bb28-44f3-b925-3c54f86d5ec4\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-xhbrl" Mar 08 03:47:08.531092 master-0 kubenswrapper[4045]: I0308 03:47:08.523517 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fx4fw\" (UniqueName: \"kubernetes.io/projected/232c421d-96f0-4894-b8d8-74f43d02bbd3-kube-api-access-fx4fw\") pod \"cluster-node-tuning-operator-66c7586884-qjv52\" (UID: \"232c421d-96f0-4894-b8d8-74f43d02bbd3\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-qjv52" Mar 08 03:47:08.531092 master-0 kubenswrapper[4045]: I0308 03:47:08.523547 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e78b283b-981e-48d7-a5f2-53f8401766ea-auth-proxy-config\") pod \"machine-config-operator-fdb5c78b5-2vjh2\" (UID: \"e78b283b-981e-48d7-a5f2-53f8401766ea\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-2vjh2" Mar 08 03:47:08.531092 master-0 kubenswrapper[4045]: I0308 03:47:08.523606 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fncng\" (UniqueName: \"kubernetes.io/projected/30211469-7108-4820-a988-26fc4ced734e-kube-api-access-fncng\") pod \"openshift-apiserver-operator-799b6db4d7-75682\" (UID: \"30211469-7108-4820-a988-26fc4ced734e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-75682" Mar 08 03:47:08.531092 master-0 kubenswrapper[4045]: I0308 03:47:08.523662 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8efdcef9-9b31-4567-b7f9-cb59a894273d-metrics-tls\") pod \"dns-operator-589895fbb7-xttlz\" (UID: \"8efdcef9-9b31-4567-b7f9-cb59a894273d\") " pod="openshift-dns-operator/dns-operator-589895fbb7-xttlz" Mar 08 03:47:08.531092 master-0 kubenswrapper[4045]: I0308 
03:47:08.523693 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxxhh\" (UniqueName: \"kubernetes.io/projected/ee586416-6f56-4ea4-ad62-95de1e6df23b-kube-api-access-sxxhh\") pod \"insights-operator-8f89dfddd-4mr6p\" (UID: \"ee586416-6f56-4ea4-ad62-95de1e6df23b\") " pod="openshift-insights/insights-operator-8f89dfddd-4mr6p" Mar 08 03:47:08.531092 master-0 kubenswrapper[4045]: I0308 03:47:08.523751 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2h62\" (UniqueName: \"kubernetes.io/projected/1482d789-884b-4337-b598-f0e2b71eb9f2-kube-api-access-m2h62\") pod \"catalog-operator-7d9c49f57b-qlfgq\" (UID: \"1482d789-884b-4337-b598-f0e2b71eb9f2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-qlfgq" Mar 08 03:47:08.531092 master-0 kubenswrapper[4045]: I0308 03:47:08.523778 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfvnn\" (UniqueName: \"kubernetes.io/projected/5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a-kube-api-access-cfvnn\") pod \"kube-storage-version-migrator-operator-7f65c457f5-6fhhs\" (UID: \"5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-6fhhs" Mar 08 03:47:08.532131 master-0 kubenswrapper[4045]: I0308 03:47:08.524047 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmpdd\" (UniqueName: \"kubernetes.io/projected/0418ff42-7eac-4266-97b5-4df88623d066-kube-api-access-kmpdd\") pod \"cluster-monitoring-operator-674cbfbd9d-clqwj\" (UID: \"0418ff42-7eac-4266-97b5-4df88623d066\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-clqwj" Mar 08 03:47:08.532131 master-0 kubenswrapper[4045]: I0308 03:47:08.524165 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/54ad284e-d40e-4e69-b898-f5093952a0e6-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-9sw2d\" (UID: \"54ad284e-d40e-4e69-b898-f5093952a0e6\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-9sw2d" Mar 08 03:47:08.532131 master-0 kubenswrapper[4045]: I0308 03:47:08.524239 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c9de4939-680a-4e3e-89fd-e20ecb8b10f2-bound-sa-token\") pod \"ingress-operator-677db989d6-t77qr\" (UID: \"c9de4939-680a-4e3e-89fd-e20ecb8b10f2\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-t77qr" Mar 08 03:47:08.532131 master-0 kubenswrapper[4045]: E0308 03:47:08.524957 4045 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 08 03:47:08.532131 master-0 kubenswrapper[4045]: E0308 03:47:08.525015 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6cde5024-edf7-4fa4-8964-cabe7899578b-package-server-manager-serving-cert podName:6cde5024-edf7-4fa4-8964-cabe7899578b nodeName:}" failed. No retries permitted until 2026-03-08 03:47:09.02499697 +0000 UTC m=+128.635697948 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/6cde5024-edf7-4fa4-8964-cabe7899578b-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-c46zz" (UID: "6cde5024-edf7-4fa4-8964-cabe7899578b") : secret "package-server-manager-serving-cert" not found Mar 08 03:47:08.532131 master-0 kubenswrapper[4045]: I0308 03:47:08.526132 4045 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-nm8fj"] Mar 08 03:47:08.575245 master-0 kubenswrapper[4045]: I0308 03:47:08.565632 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/0418ff42-7eac-4266-97b5-4df88623d066-telemetry-config\") pod \"cluster-monitoring-operator-674cbfbd9d-clqwj\" (UID: \"0418ff42-7eac-4266-97b5-4df88623d066\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-clqwj" Mar 08 03:47:08.575245 master-0 kubenswrapper[4045]: E0308 03:47:08.565717 4045 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Mar 08 03:47:08.575245 master-0 kubenswrapper[4045]: E0308 03:47:08.565921 4045 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-webhook-server-cert: secret "cluster-baremetal-webhook-server-cert" not found Mar 08 03:47:08.575245 master-0 kubenswrapper[4045]: I0308 03:47:08.566057 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/ee586416-6f56-4ea4-ad62-95de1e6df23b-snapshots\") pod \"insights-operator-8f89dfddd-4mr6p\" (UID: \"ee586416-6f56-4ea4-ad62-95de1e6df23b\") " pod="openshift-insights/insights-operator-8f89dfddd-4mr6p" Mar 08 03:47:08.575245 master-0 kubenswrapper[4045]: E0308 03:47:08.566007 4045 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f-image-registry-operator-tls podName:69eb8ba2-7bfb-4433-8951-08f89e7bcb5f nodeName:}" failed. No retries permitted until 2026-03-08 03:47:09.065880063 +0000 UTC m=+128.676581021 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f-image-registry-operator-tls") pod "cluster-image-registry-operator-86d6d77c7c-572xh" (UID: "69eb8ba2-7bfb-4433-8951-08f89e7bcb5f") : secret "image-registry-operator-tls" not found Mar 08 03:47:08.575245 master-0 kubenswrapper[4045]: E0308 03:47:08.566593 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cert podName:d831cb23-7411-4072-8273-c167d9afca28 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:09.066515638 +0000 UTC m=+128.677216606 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cert") pod "cluster-baremetal-operator-5cdb4c5598-jghp5" (UID: "d831cb23-7411-4072-8273-c167d9afca28") : secret "cluster-baremetal-webhook-server-cert" not found Mar 08 03:47:08.575245 master-0 kubenswrapper[4045]: I0308 03:47:08.566846 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a60bc804-52e7-422a-87fd-ac4c5aa90cb3-trusted-ca-bundle\") pod \"authentication-operator-7c6989d6c4-slm72\" (UID: \"a60bc804-52e7-422a-87fd-ac4c5aa90cb3\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-slm72" Mar 08 03:47:08.575245 master-0 kubenswrapper[4045]: E0308 03:47:08.567059 4045 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 08 03:47:08.575245 master-0 kubenswrapper[4045]: E0308 
03:47:08.567096 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0418ff42-7eac-4266-97b5-4df88623d066-cluster-monitoring-operator-tls podName:0418ff42-7eac-4266-97b5-4df88623d066 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:09.067080661 +0000 UTC m=+128.677781619 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/0418ff42-7eac-4266-97b5-4df88623d066-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-clqwj" (UID: "0418ff42-7eac-4266-97b5-4df88623d066") : secret "cluster-monitoring-operator-tls" not found Mar 08 03:47:08.575245 master-0 kubenswrapper[4045]: I0308 03:47:08.567448 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d831cb23-7411-4072-8273-c167d9afca28-config\") pod \"cluster-baremetal-operator-5cdb4c5598-jghp5\" (UID: \"d831cb23-7411-4072-8273-c167d9afca28\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-jghp5" Mar 08 03:47:08.575245 master-0 kubenswrapper[4045]: E0308 03:47:08.567668 4045 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 08 03:47:08.575245 master-0 kubenswrapper[4045]: E0308 03:47:08.567774 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2dd4279d-a1a9-450a-a061-9008cd1ea8e0-srv-cert podName:2dd4279d-a1a9-450a-a061-9008cd1ea8e0 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:09.067739878 +0000 UTC m=+128.678440846 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/2dd4279d-a1a9-450a-a061-9008cd1ea8e0-srv-cert") pod "olm-operator-d64cfc9db-qddlp" (UID: "2dd4279d-a1a9-450a-a061-9008cd1ea8e0") : secret "olm-operator-serving-cert" not found Mar 08 03:47:08.575245 master-0 kubenswrapper[4045]: E0308 03:47:08.567938 4045 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-operator-tls: secret "cluster-baremetal-operator-tls" not found Mar 08 03:47:08.575245 master-0 kubenswrapper[4045]: E0308 03:47:08.568041 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cluster-baremetal-operator-tls podName:d831cb23-7411-4072-8273-c167d9afca28 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:09.068023494 +0000 UTC m=+128.678724452 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-baremetal-operator-tls" (UniqueName: "kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cluster-baremetal-operator-tls") pod "cluster-baremetal-operator-5cdb4c5598-jghp5" (UID: "d831cb23-7411-4072-8273-c167d9afca28") : secret "cluster-baremetal-operator-tls" not found Mar 08 03:47:08.575245 master-0 kubenswrapper[4045]: I0308 03:47:08.568640 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/232c421d-96f0-4894-b8d8-74f43d02bbd3-trusted-ca\") pod \"cluster-node-tuning-operator-66c7586884-qjv52\" (UID: \"232c421d-96f0-4894-b8d8-74f43d02bbd3\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-qjv52" Mar 08 03:47:08.580702 master-0 kubenswrapper[4045]: E0308 03:47:08.570204 4045 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/kube-rbac-proxy: configmap "kube-rbac-proxy" not found Mar 08 03:47:08.580702 master-0 kubenswrapper[4045]: E0308 03:47:08.570343 4045 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/configmap/e78b283b-981e-48d7-a5f2-53f8401766ea-auth-proxy-config podName:e78b283b-981e-48d7-a5f2-53f8401766ea nodeName:}" failed. No retries permitted until 2026-03-08 03:47:09.070315268 +0000 UTC m=+128.681016236 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/e78b283b-981e-48d7-a5f2-53f8401766ea-auth-proxy-config") pod "machine-config-operator-fdb5c78b5-2vjh2" (UID: "e78b283b-981e-48d7-a5f2-53f8401766ea") : configmap "kube-rbac-proxy" not found Mar 08 03:47:08.580702 master-0 kubenswrapper[4045]: E0308 03:47:08.570417 4045 secret.go:189] Couldn't get secret openshift-machine-config-operator/mco-proxy-tls: secret "mco-proxy-tls" not found Mar 08 03:47:08.580702 master-0 kubenswrapper[4045]: E0308 03:47:08.570489 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e78b283b-981e-48d7-a5f2-53f8401766ea-proxy-tls podName:e78b283b-981e-48d7-a5f2-53f8401766ea nodeName:}" failed. No retries permitted until 2026-03-08 03:47:09.070471782 +0000 UTC m=+128.681172760 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/e78b283b-981e-48d7-a5f2-53f8401766ea-proxy-tls") pod "machine-config-operator-fdb5c78b5-2vjh2" (UID: "e78b283b-981e-48d7-a5f2-53f8401766ea") : secret "mco-proxy-tls" not found Mar 08 03:47:08.580702 master-0 kubenswrapper[4045]: I0308 03:47:08.571209 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a60bc804-52e7-422a-87fd-ac4c5aa90cb3-config\") pod \"authentication-operator-7c6989d6c4-slm72\" (UID: \"a60bc804-52e7-422a-87fd-ac4c5aa90cb3\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-slm72" Mar 08 03:47:08.580702 master-0 kubenswrapper[4045]: E0308 03:47:08.571214 4045 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 08 03:47:08.580702 master-0 kubenswrapper[4045]: E0308 03:47:08.571406 4045 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 08 03:47:08.580702 master-0 kubenswrapper[4045]: E0308 03:47:08.571590 4045 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 08 03:47:08.580702 master-0 kubenswrapper[4045]: E0308 03:47:08.571650 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c9de4939-680a-4e3e-89fd-e20ecb8b10f2-metrics-tls podName:c9de4939-680a-4e3e-89fd-e20ecb8b10f2 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:09.071630259 +0000 UTC m=+128.682331227 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c9de4939-680a-4e3e-89fd-e20ecb8b10f2-metrics-tls") pod "ingress-operator-677db989d6-t77qr" (UID: "c9de4939-680a-4e3e-89fd-e20ecb8b10f2") : secret "metrics-tls" not found Mar 08 03:47:08.580702 master-0 kubenswrapper[4045]: I0308 03:47:08.572209 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a-config\") pod \"kube-storage-version-migrator-operator-7f65c457f5-6fhhs\" (UID: \"5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-6fhhs" Mar 08 03:47:08.580702 master-0 kubenswrapper[4045]: I0308 03:47:08.572232 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30211469-7108-4820-a988-26fc4ced734e-config\") pod \"openshift-apiserver-operator-799b6db4d7-75682\" (UID: \"30211469-7108-4820-a988-26fc4ced734e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-75682" Mar 08 03:47:08.580702 master-0 kubenswrapper[4045]: I0308 03:47:08.572782 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ee586416-6f56-4ea4-ad62-95de1e6df23b-service-ca-bundle\") pod \"insights-operator-8f89dfddd-4mr6p\" (UID: \"ee586416-6f56-4ea4-ad62-95de1e6df23b\") " pod="openshift-insights/insights-operator-8f89dfddd-4mr6p" Mar 08 03:47:08.580702 master-0 kubenswrapper[4045]: I0308 03:47:08.575198 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d831cb23-7411-4072-8273-c167d9afca28-images\") pod \"cluster-baremetal-operator-5cdb4c5598-jghp5\" (UID: \"d831cb23-7411-4072-8273-c167d9afca28\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-jghp5" 
Mar 08 03:47:08.580702 master-0 kubenswrapper[4045]: E0308 03:47:08.575765 4045 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 08 03:47:08.580702 master-0 kubenswrapper[4045]: I0308 03:47:08.576077 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a-serving-cert\") pod \"kube-storage-version-migrator-operator-7f65c457f5-6fhhs\" (UID: \"5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-6fhhs" Mar 08 03:47:08.580702 master-0 kubenswrapper[4045]: I0308 03:47:08.576137 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ee586416-6f56-4ea4-ad62-95de1e6df23b-trusted-ca-bundle\") pod \"insights-operator-8f89dfddd-4mr6p\" (UID: \"ee586416-6f56-4ea4-ad62-95de1e6df23b\") " pod="openshift-insights/insights-operator-8f89dfddd-4mr6p" Mar 08 03:47:08.580702 master-0 kubenswrapper[4045]: I0308 03:47:08.576435 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a60bc804-52e7-422a-87fd-ac4c5aa90cb3-service-ca-bundle\") pod \"authentication-operator-7c6989d6c4-slm72\" (UID: \"a60bc804-52e7-422a-87fd-ac4c5aa90cb3\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-slm72" Mar 08 03:47:08.580702 master-0 kubenswrapper[4045]: I0308 03:47:08.577291 4045 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-qlfgq"] Mar 08 03:47:08.581168 master-0 kubenswrapper[4045]: E0308 03:47:08.577915 4045 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found 
Mar 08 03:47:08.581168 master-0 kubenswrapper[4045]: E0308 03:47:08.578043 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/54ad284e-d40e-4e69-b898-f5093952a0e6-marketplace-operator-metrics podName:54ad284e-d40e-4e69-b898-f5093952a0e6 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:09.078008979 +0000 UTC m=+128.688709947 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/54ad284e-d40e-4e69-b898-f5093952a0e6-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-9sw2d" (UID: "54ad284e-d40e-4e69-b898-f5093952a0e6") : secret "marketplace-operator-metrics" not found Mar 08 03:47:08.581168 master-0 kubenswrapper[4045]: I0308 03:47:08.578339 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d377285-0336-41b7-b48f-c44a7b563498-config\") pod \"service-ca-operator-69b6fc6b88-kg795\" (UID: \"0d377285-0336-41b7-b48f-c44a7b563498\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-kg795" Mar 08 03:47:08.581168 master-0 kubenswrapper[4045]: I0308 03:47:08.578630 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/0918ba32-8e55-48d0-8e50-027c0dcb4bbd-available-featuregates\") pod \"openshift-config-operator-64488f9d78-vfgfp\" (UID: \"0918ba32-8e55-48d0-8e50-027c0dcb4bbd\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-vfgfp" Mar 08 03:47:08.581168 master-0 kubenswrapper[4045]: I0308 03:47:08.578481 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/54ad284e-d40e-4e69-b898-f5093952a0e6-marketplace-trusted-ca\") pod \"marketplace-operator-64bf9778cb-9sw2d\" (UID: \"54ad284e-d40e-4e69-b898-f5093952a0e6\") " 
pod="openshift-marketplace/marketplace-operator-64bf9778cb-9sw2d" Mar 08 03:47:08.581168 master-0 kubenswrapper[4045]: I0308 03:47:08.578869 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cbcb403-a424-4496-8c5c-5eb5e42dfb93-config\") pod \"kube-apiserver-operator-68bd585b-8gfmf\" (UID: \"1cbcb403-a424-4496-8c5c-5eb5e42dfb93\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-8gfmf" Mar 08 03:47:08.581168 master-0 kubenswrapper[4045]: I0308 03:47:08.579333 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee586416-6f56-4ea4-ad62-95de1e6df23b-serving-cert\") pod \"insights-operator-8f89dfddd-4mr6p\" (UID: \"ee586416-6f56-4ea4-ad62-95de1e6df23b\") " pod="openshift-insights/insights-operator-8f89dfddd-4mr6p" Mar 08 03:47:08.581168 master-0 kubenswrapper[4045]: I0308 03:47:08.579728 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4541b7b-3f7f-4851-9bd9-26fcda5cab13-config\") pod \"openshift-kube-scheduler-operator-5c74bfc494-g6n58\" (UID: \"e4541b7b-3f7f-4851-9bd9-26fcda5cab13\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-g6n58" Mar 08 03:47:08.581168 master-0 kubenswrapper[4045]: I0308 03:47:08.579944 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26180f77-0b1a-4d0f-9ed0-a12fdee69817-config\") pod \"kube-controller-manager-operator-86d7cdfdfb-chpl6\" (UID: \"26180f77-0b1a-4d0f-9ed0-a12fdee69817\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-chpl6" Mar 08 03:47:08.581168 master-0 kubenswrapper[4045]: E0308 03:47:08.580291 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1482d789-884b-4337-b598-f0e2b71eb9f2-srv-cert 
podName:1482d789-884b-4337-b598-f0e2b71eb9f2 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:09.080196001 +0000 UTC m=+128.690896989 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/1482d789-884b-4337-b598-f0e2b71eb9f2-srv-cert") pod "catalog-operator-7d9c49f57b-qlfgq" (UID: "1482d789-884b-4337-b598-f0e2b71eb9f2") : secret "catalog-operator-serving-cert" not found Mar 08 03:47:08.581168 master-0 kubenswrapper[4045]: I0308 03:47:08.580913 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a60bc804-52e7-422a-87fd-ac4c5aa90cb3-serving-cert\") pod \"authentication-operator-7c6989d6c4-slm72\" (UID: \"a60bc804-52e7-422a-87fd-ac4c5aa90cb3\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-slm72" Mar 08 03:47:08.582022 master-0 kubenswrapper[4045]: I0308 03:47:08.581839 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/7ff63c73-62a3-44b4-acd3-1b3df175794f-operand-assets\") pod \"cluster-olm-operator-77899cf6d-x9h9q\" (UID: \"7ff63c73-62a3-44b4-acd3-1b3df175794f\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-x9h9q" Mar 08 03:47:08.582022 master-0 kubenswrapper[4045]: E0308 03:47:08.581817 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-node-tuning-operator-tls podName:232c421d-96f0-4894-b8d8-74f43d02bbd3 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:09.081791619 +0000 UTC m=+128.692492617 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-node-tuning-operator-tls") pod "cluster-node-tuning-operator-66c7586884-qjv52" (UID: "232c421d-96f0-4894-b8d8-74f43d02bbd3") : secret "node-tuning-operator-tls" not found Mar 08 03:47:08.582022 master-0 kubenswrapper[4045]: I0308 03:47:08.581611 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c9de4939-680a-4e3e-89fd-e20ecb8b10f2-trusted-ca\") pod \"ingress-operator-677db989d6-t77qr\" (UID: \"c9de4939-680a-4e3e-89fd-e20ecb8b10f2\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-t77qr" Mar 08 03:47:08.582189 master-0 kubenswrapper[4045]: E0308 03:47:08.582135 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-apiservice-cert podName:232c421d-96f0-4894-b8d8-74f43d02bbd3 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:09.082040024 +0000 UTC m=+128.692740992 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-apiservice-cert") pod "cluster-node-tuning-operator-66c7586884-qjv52" (UID: "232c421d-96f0-4894-b8d8-74f43d02bbd3") : secret "performance-addon-operator-webhook-cert" not found Mar 08 03:47:08.582546 master-0 kubenswrapper[4045]: I0308 03:47:08.582357 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ebf1330-e044-4ff5-8b48-2d667e0c5625-config\") pod \"openshift-controller-manager-operator-8565d84698-kt66j\" (UID: \"0ebf1330-e044-4ff5-8b48-2d667e0c5625\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-kt66j" Mar 08 03:47:08.582656 master-0 kubenswrapper[4045]: E0308 03:47:08.582622 4045 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 08 03:47:08.583281 master-0 kubenswrapper[4045]: E0308 03:47:08.583243 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8efdcef9-9b31-4567-b7f9-cb59a894273d-metrics-tls podName:8efdcef9-9b31-4567-b7f9-cb59a894273d nodeName:}" failed. No retries permitted until 2026-03-08 03:47:09.083201971 +0000 UTC m=+128.693902939 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8efdcef9-9b31-4567-b7f9-cb59a894273d-metrics-tls") pod "dns-operator-589895fbb7-xttlz" (UID: "8efdcef9-9b31-4567-b7f9-cb59a894273d") : secret "metrics-tls" not found Mar 08 03:47:08.589987 master-0 kubenswrapper[4045]: I0308 03:47:08.585518 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1cbcb403-a424-4496-8c5c-5eb5e42dfb93-serving-cert\") pod \"kube-apiserver-operator-68bd585b-8gfmf\" (UID: \"1cbcb403-a424-4496-8c5c-5eb5e42dfb93\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-8gfmf" Mar 08 03:47:08.589987 master-0 kubenswrapper[4045]: I0308 03:47:08.586678 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8b5zb\" (UniqueName: \"kubernetes.io/projected/5a7752f9-7b9a-451f-997a-e9f696d38b34-kube-api-access-8b5zb\") pod \"etcd-operator-5884b9cd56-vzms7\" (UID: \"5a7752f9-7b9a-451f-997a-e9f696d38b34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-vzms7" Mar 08 03:47:08.589987 master-0 kubenswrapper[4045]: I0308 03:47:08.588785 4045 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69b6fc6b88-kg795"] Mar 08 03:47:08.589987 master-0 kubenswrapper[4045]: I0308 03:47:08.589642 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/26180f77-0b1a-4d0f-9ed0-a12fdee69817-serving-cert\") pod \"kube-controller-manager-operator-86d7cdfdfb-chpl6\" (UID: \"26180f77-0b1a-4d0f-9ed0-a12fdee69817\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-chpl6" Mar 08 03:47:08.590622 master-0 kubenswrapper[4045]: I0308 03:47:08.590302 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/30211469-7108-4820-a988-26fc4ced734e-serving-cert\") pod \"openshift-apiserver-operator-799b6db4d7-75682\" (UID: \"30211469-7108-4820-a988-26fc4ced734e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-75682" Mar 08 03:47:08.596331 master-0 kubenswrapper[4045]: I0308 03:47:08.591534 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4541b7b-3f7f-4851-9bd9-26fcda5cab13-serving-cert\") pod \"openshift-kube-scheduler-operator-5c74bfc494-g6n58\" (UID: \"e4541b7b-3f7f-4851-9bd9-26fcda5cab13\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-g6n58" Mar 08 03:47:08.596331 master-0 kubenswrapper[4045]: I0308 03:47:08.591797 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/7ff63c73-62a3-44b4-acd3-1b3df175794f-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-77899cf6d-x9h9q\" (UID: \"7ff63c73-62a3-44b4-acd3-1b3df175794f\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-x9h9q" Mar 08 03:47:08.596331 master-0 kubenswrapper[4045]: I0308 03:47:08.592127 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e78b283b-981e-48d7-a5f2-53f8401766ea-images\") pod \"machine-config-operator-fdb5c78b5-2vjh2\" (UID: \"e78b283b-981e-48d7-a5f2-53f8401766ea\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-2vjh2" Mar 08 03:47:08.596331 master-0 kubenswrapper[4045]: I0308 03:47:08.592192 4045 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-589895fbb7-xttlz"] Mar 08 03:47:08.596331 master-0 kubenswrapper[4045]: I0308 03:47:08.592884 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0918ba32-8e55-48d0-8e50-027c0dcb4bbd-serving-cert\") pod \"openshift-config-operator-64488f9d78-vfgfp\" (UID: \"0918ba32-8e55-48d0-8e50-027c0dcb4bbd\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-vfgfp" Mar 08 03:47:08.596331 master-0 kubenswrapper[4045]: I0308 03:47:08.593681 4045 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qddlp"] Mar 08 03:47:08.596331 master-0 kubenswrapper[4045]: I0308 03:47:08.594148 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d377285-0336-41b7-b48f-c44a7b563498-serving-cert\") pod \"service-ca-operator-69b6fc6b88-kg795\" (UID: \"0d377285-0336-41b7-b48f-c44a7b563498\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-kg795" Mar 08 03:47:08.596331 master-0 kubenswrapper[4045]: I0308 03:47:08.595162 4045 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-7c28p"] Mar 08 03:47:08.596331 master-0 kubenswrapper[4045]: I0308 03:47:08.595777 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ebf1330-e044-4ff5-8b48-2d667e0c5625-serving-cert\") pod \"openshift-controller-manager-operator-8565d84698-kt66j\" (UID: \"0ebf1330-e044-4ff5-8b48-2d667e0c5625\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-kt66j" Mar 08 03:47:08.596331 master-0 kubenswrapper[4045]: I0308 03:47:08.596284 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/b3eea925-73b3-4693-8f0e-6dd26107f60a-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-6fbfc8dc8f-nm8fj\" (UID: \"b3eea925-73b3-4693-8f0e-6dd26107f60a\") " 
pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-nm8fj" Mar 08 03:47:08.596620 master-0 kubenswrapper[4045]: I0308 03:47:08.596533 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-7c28p" Mar 08 03:47:08.597018 master-0 kubenswrapper[4045]: I0308 03:47:08.596926 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f-trusted-ca\") pod \"cluster-image-registry-operator-86d6d77c7c-572xh\" (UID: \"69eb8ba2-7bfb-4433-8951-08f89e7bcb5f\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-572xh" Mar 08 03:47:08.599478 master-0 kubenswrapper[4045]: I0308 03:47:08.599444 4045 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 08 03:47:08.599712 master-0 kubenswrapper[4045]: I0308 03:47:08.599664 4045 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-fdb5c78b5-2vjh2"] Mar 08 03:47:08.600722 master-0 kubenswrapper[4045]: I0308 03:47:08.600691 4045 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-677db989d6-t77qr"] Mar 08 03:47:08.601604 master-0 kubenswrapper[4045]: I0308 03:47:08.601566 4045 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-75682"] Mar 08 03:47:08.602546 master-0 kubenswrapper[4045]: I0308 03:47:08.602506 4045 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-8d675b596-j8pv6"] Mar 08 03:47:08.603629 master-0 kubenswrapper[4045]: I0308 03:47:08.603588 4045 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-7c6989d6c4-slm72"] Mar 08 03:47:08.605832 master-0 
kubenswrapper[4045]: I0308 03:47:08.605788 4045 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-clqwj"] Mar 08 03:47:08.625060 master-0 kubenswrapper[4045]: I0308 03:47:08.625035 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqhzl\" (UniqueName: \"kubernetes.io/projected/1eb851be-f157-48ea-9a39-1361b68d2639-kube-api-access-nqhzl\") pod \"multus-admission-controller-8d675b596-j8pv6\" (UID: \"1eb851be-f157-48ea-9a39-1361b68d2639\") " pod="openshift-multus/multus-admission-controller-8d675b596-j8pv6" Mar 08 03:47:08.625598 master-0 kubenswrapper[4045]: I0308 03:47:08.625568 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1eb851be-f157-48ea-9a39-1361b68d2639-webhook-certs\") pod \"multus-admission-controller-8d675b596-j8pv6\" (UID: \"1eb851be-f157-48ea-9a39-1361b68d2639\") " pod="openshift-multus/multus-admission-controller-8d675b596-j8pv6" Mar 08 03:47:08.625753 master-0 kubenswrapper[4045]: E0308 03:47:08.625724 4045 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 08 03:47:08.625794 master-0 kubenswrapper[4045]: E0308 03:47:08.625787 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1eb851be-f157-48ea-9a39-1361b68d2639-webhook-certs podName:1eb851be-f157-48ea-9a39-1361b68d2639 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:09.125768614 +0000 UTC m=+128.736469582 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/1eb851be-f157-48ea-9a39-1361b68d2639-webhook-certs") pod "multus-admission-controller-8d675b596-j8pv6" (UID: "1eb851be-f157-48ea-9a39-1361b68d2639") : secret "multus-admission-controller-secret" not found Mar 08 03:47:08.639191 master-0 kubenswrapper[4045]: I0308 03:47:08.639166 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-vzms7" Mar 08 03:47:08.728135 master-0 kubenswrapper[4045]: I0308 03:47:08.726994 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4c5a0c1d-867a-4ce4-9570-ea66452c8db3-iptables-alerter-script\") pod \"iptables-alerter-7c28p\" (UID: \"4c5a0c1d-867a-4ce4-9570-ea66452c8db3\") " pod="openshift-network-operator/iptables-alerter-7c28p" Mar 08 03:47:08.728135 master-0 kubenswrapper[4045]: I0308 03:47:08.727108 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkzb2\" (UniqueName: \"kubernetes.io/projected/4c5a0c1d-867a-4ce4-9570-ea66452c8db3-kube-api-access-mkzb2\") pod \"iptables-alerter-7c28p\" (UID: \"4c5a0c1d-867a-4ce4-9570-ea66452c8db3\") " pod="openshift-network-operator/iptables-alerter-7c28p" Mar 08 03:47:08.728135 master-0 kubenswrapper[4045]: I0308 03:47:08.727463 4045 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4c5a0c1d-867a-4ce4-9570-ea66452c8db3-host-slash\") pod \"iptables-alerter-7c28p\" (UID: \"4c5a0c1d-867a-4ce4-9570-ea66452c8db3\") " pod="openshift-network-operator/iptables-alerter-7c28p" Mar 08 03:47:08.808845 master-0 kubenswrapper[4045]: I0308 03:47:08.807775 4045 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-qjv52"] Mar 08 03:47:08.808845 master-0 kubenswrapper[4045]: I0308 03:47:08.807889 4045 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-64488f9d78-vfgfp"] Mar 08 03:47:08.808845 master-0 kubenswrapper[4045]: I0308 03:47:08.808045 4045 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-8gfmf"] Mar 08 03:47:08.832795 master-0 kubenswrapper[4045]: I0308 03:47:08.832091 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4c5a0c1d-867a-4ce4-9570-ea66452c8db3-host-slash\") pod \"iptables-alerter-7c28p\" (UID: \"4c5a0c1d-867a-4ce4-9570-ea66452c8db3\") " pod="openshift-network-operator/iptables-alerter-7c28p" Mar 08 03:47:08.832795 master-0 kubenswrapper[4045]: I0308 03:47:08.832666 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4c5a0c1d-867a-4ce4-9570-ea66452c8db3-iptables-alerter-script\") pod \"iptables-alerter-7c28p\" (UID: \"4c5a0c1d-867a-4ce4-9570-ea66452c8db3\") " pod="openshift-network-operator/iptables-alerter-7c28p" Mar 08 03:47:08.832795 master-0 kubenswrapper[4045]: I0308 03:47:08.832713 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkzb2\" (UniqueName: \"kubernetes.io/projected/4c5a0c1d-867a-4ce4-9570-ea66452c8db3-kube-api-access-mkzb2\") pod \"iptables-alerter-7c28p\" (UID: \"4c5a0c1d-867a-4ce4-9570-ea66452c8db3\") " pod="openshift-network-operator/iptables-alerter-7c28p" Mar 08 03:47:08.833345 master-0 kubenswrapper[4045]: I0308 03:47:08.833309 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/4c5a0c1d-867a-4ce4-9570-ea66452c8db3-host-slash\") pod \"iptables-alerter-7c28p\" (UID: \"4c5a0c1d-867a-4ce4-9570-ea66452c8db3\") " pod="openshift-network-operator/iptables-alerter-7c28p" Mar 08 03:47:08.834560 master-0 kubenswrapper[4045]: I0308 03:47:08.834099 4045 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-kt66j"] Mar 08 03:47:08.834560 master-0 kubenswrapper[4045]: I0308 03:47:08.834140 4045 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-64bf9778cb-9sw2d"] Mar 08 03:47:08.834560 master-0 kubenswrapper[4045]: I0308 03:47:08.834386 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4c5a0c1d-867a-4ce4-9570-ea66452c8db3-iptables-alerter-script\") pod \"iptables-alerter-7c28p\" (UID: \"4c5a0c1d-867a-4ce4-9570-ea66452c8db3\") " pod="openshift-network-operator/iptables-alerter-7c28p" Mar 08 03:47:08.846858 master-0 kubenswrapper[4045]: I0308 03:47:08.842594 4045 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-chpl6"] Mar 08 03:47:08.853857 master-0 kubenswrapper[4045]: I0308 03:47:08.851912 4045 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-xhbrl"] Mar 08 03:47:08.853857 master-0 kubenswrapper[4045]: I0308 03:47:08.853174 4045 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-c46zz"] Mar 08 03:47:08.856669 master-0 kubenswrapper[4045]: I0308 03:47:08.856637 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fncng\" (UniqueName: \"kubernetes.io/projected/30211469-7108-4820-a988-26fc4ced734e-kube-api-access-fncng\") 
pod \"openshift-apiserver-operator-799b6db4d7-75682\" (UID: \"30211469-7108-4820-a988-26fc4ced734e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-75682" Mar 08 03:47:08.871883 master-0 kubenswrapper[4045]: I0308 03:47:08.871846 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x997v\" (UniqueName: \"kubernetes.io/projected/6cde5024-edf7-4fa4-8964-cabe7899578b-kube-api-access-x997v\") pod \"package-server-manager-854648ff6d-c46zz\" (UID: \"6cde5024-edf7-4fa4-8964-cabe7899578b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-c46zz" Mar 08 03:47:08.872709 master-0 kubenswrapper[4045]: I0308 03:47:08.872684 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwkwt\" (UniqueName: \"kubernetes.io/projected/d831cb23-7411-4072-8273-c167d9afca28-kube-api-access-dwkwt\") pod \"cluster-baremetal-operator-5cdb4c5598-jghp5\" (UID: \"d831cb23-7411-4072-8273-c167d9afca28\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-jghp5" Mar 08 03:47:08.882014 master-0 kubenswrapper[4045]: I0308 03:47:08.877136 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxkm6\" (UniqueName: \"kubernetes.io/projected/a60bc804-52e7-422a-87fd-ac4c5aa90cb3-kube-api-access-zxkm6\") pod \"authentication-operator-7c6989d6c4-slm72\" (UID: \"a60bc804-52e7-422a-87fd-ac4c5aa90cb3\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-slm72" Mar 08 03:47:08.882014 master-0 kubenswrapper[4045]: I0308 03:47:08.878191 4045 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-x9h9q"] Mar 08 03:47:08.882014 master-0 kubenswrapper[4045]: I0308 03:47:08.878218 4045 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-572xh"] Mar 08 03:47:08.882014 master-0 
kubenswrapper[4045]: I0308 03:47:08.878734 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqhzl\" (UniqueName: \"kubernetes.io/projected/1eb851be-f157-48ea-9a39-1361b68d2639-kube-api-access-nqhzl\") pod \"multus-admission-controller-8d675b596-j8pv6\" (UID: \"1eb851be-f157-48ea-9a39-1361b68d2639\") " pod="openshift-multus/multus-admission-controller-8d675b596-j8pv6" Mar 08 03:47:08.882014 master-0 kubenswrapper[4045]: I0308 03:47:08.878745 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxxhh\" (UniqueName: \"kubernetes.io/projected/ee586416-6f56-4ea4-ad62-95de1e6df23b-kube-api-access-sxxhh\") pod \"insights-operator-8f89dfddd-4mr6p\" (UID: \"ee586416-6f56-4ea4-ad62-95de1e6df23b\") " pod="openshift-insights/insights-operator-8f89dfddd-4mr6p" Mar 08 03:47:08.882014 master-0 kubenswrapper[4045]: I0308 03:47:08.880145 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qn5v\" (UniqueName: \"kubernetes.io/projected/0d377285-0336-41b7-b48f-c44a7b563498-kube-api-access-7qn5v\") pod \"service-ca-operator-69b6fc6b88-kg795\" (UID: \"0d377285-0336-41b7-b48f-c44a7b563498\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-kg795" Mar 08 03:47:08.882014 master-0 kubenswrapper[4045]: I0308 03:47:08.880306 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmpdd\" (UniqueName: \"kubernetes.io/projected/0418ff42-7eac-4266-97b5-4df88623d066-kube-api-access-kmpdd\") pod \"cluster-monitoring-operator-674cbfbd9d-clqwj\" (UID: \"0418ff42-7eac-4266-97b5-4df88623d066\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-clqwj" Mar 08 03:47:08.882014 master-0 kubenswrapper[4045]: I0308 03:47:08.880712 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnzt7\" (UniqueName: 
\"kubernetes.io/projected/2dd4279d-a1a9-450a-a061-9008cd1ea8e0-kube-api-access-pnzt7\") pod \"olm-operator-d64cfc9db-qddlp\" (UID: \"2dd4279d-a1a9-450a-a061-9008cd1ea8e0\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qddlp" Mar 08 03:47:08.882014 master-0 kubenswrapper[4045]: I0308 03:47:08.881096 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fx4fw\" (UniqueName: \"kubernetes.io/projected/232c421d-96f0-4894-b8d8-74f43d02bbd3-kube-api-access-fx4fw\") pod \"cluster-node-tuning-operator-66c7586884-qjv52\" (UID: \"232c421d-96f0-4894-b8d8-74f43d02bbd3\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-qjv52" Mar 08 03:47:08.882014 master-0 kubenswrapper[4045]: I0308 03:47:08.881209 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f-bound-sa-token\") pod \"cluster-image-registry-operator-86d6d77c7c-572xh\" (UID: \"69eb8ba2-7bfb-4433-8951-08f89e7bcb5f\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-572xh" Mar 08 03:47:08.882014 master-0 kubenswrapper[4045]: I0308 03:47:08.881607 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2h62\" (UniqueName: \"kubernetes.io/projected/1482d789-884b-4337-b598-f0e2b71eb9f2-kube-api-access-m2h62\") pod \"catalog-operator-7d9c49f57b-qlfgq\" (UID: \"1482d789-884b-4337-b598-f0e2b71eb9f2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-qlfgq" Mar 08 03:47:08.882014 master-0 kubenswrapper[4045]: I0308 03:47:08.881699 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e4541b7b-3f7f-4851-9bd9-26fcda5cab13-kube-api-access\") pod \"openshift-kube-scheduler-operator-5c74bfc494-g6n58\" (UID: \"e4541b7b-3f7f-4851-9bd9-26fcda5cab13\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-g6n58" Mar 08 03:47:08.882014 master-0 kubenswrapper[4045]: I0308 03:47:08.881727 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fw7mr\" (UniqueName: \"kubernetes.io/projected/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f-kube-api-access-fw7mr\") pod \"cluster-image-registry-operator-86d6d77c7c-572xh\" (UID: \"69eb8ba2-7bfb-4433-8951-08f89e7bcb5f\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-572xh" Mar 08 03:47:08.882014 master-0 kubenswrapper[4045]: I0308 03:47:08.881942 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1cbcb403-a424-4496-8c5c-5eb5e42dfb93-kube-api-access\") pod \"kube-apiserver-operator-68bd585b-8gfmf\" (UID: \"1cbcb403-a424-4496-8c5c-5eb5e42dfb93\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-8gfmf" Mar 08 03:47:08.882880 master-0 kubenswrapper[4045]: I0308 03:47:08.882160 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rchj5\" (UniqueName: \"kubernetes.io/projected/e78b283b-981e-48d7-a5f2-53f8401766ea-kube-api-access-rchj5\") pod \"machine-config-operator-fdb5c78b5-2vjh2\" (UID: \"e78b283b-981e-48d7-a5f2-53f8401766ea\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-2vjh2" Mar 08 03:47:08.882880 master-0 kubenswrapper[4045]: I0308 03:47:08.882292 4045 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-g6n58" Mar 08 03:47:08.882880 master-0 kubenswrapper[4045]: I0308 03:47:08.882335 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29dpg\" (UniqueName: \"kubernetes.io/projected/c9de4939-680a-4e3e-89fd-e20ecb8b10f2-kube-api-access-29dpg\") pod \"ingress-operator-677db989d6-t77qr\" (UID: \"c9de4939-680a-4e3e-89fd-e20ecb8b10f2\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-t77qr" Mar 08 03:47:08.882880 master-0 kubenswrapper[4045]: I0308 03:47:08.882339 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sx5s\" (UniqueName: \"kubernetes.io/projected/b3eea925-73b3-4693-8f0e-6dd26107f60a-kube-api-access-6sx5s\") pod \"cluster-storage-operator-6fbfc8dc8f-nm8fj\" (UID: \"b3eea925-73b3-4693-8f0e-6dd26107f60a\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-nm8fj" Mar 08 03:47:08.882880 master-0 kubenswrapper[4045]: I0308 03:47:08.882336 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dd549\" (UniqueName: \"kubernetes.io/projected/52b495ac-bb28-44f3-b925-3c54f86d5ec4-kube-api-access-dd549\") pod \"csi-snapshot-controller-operator-5685fbc7d-xhbrl\" (UID: \"52b495ac-bb28-44f3-b925-3c54f86d5ec4\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-xhbrl" Mar 08 03:47:08.882880 master-0 kubenswrapper[4045]: I0308 03:47:08.882529 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfqc5\" (UniqueName: \"kubernetes.io/projected/7ff63c73-62a3-44b4-acd3-1b3df175794f-kube-api-access-vfqc5\") pod \"cluster-olm-operator-77899cf6d-x9h9q\" (UID: \"7ff63c73-62a3-44b4-acd3-1b3df175794f\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-x9h9q" Mar 08 03:47:08.883137 master-0 kubenswrapper[4045]: I0308 03:47:08.883043 
4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lfcj\" (UniqueName: \"kubernetes.io/projected/54ad284e-d40e-4e69-b898-f5093952a0e6-kube-api-access-9lfcj\") pod \"marketplace-operator-64bf9778cb-9sw2d\" (UID: \"54ad284e-d40e-4e69-b898-f5093952a0e6\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-9sw2d" Mar 08 03:47:08.884739 master-0 kubenswrapper[4045]: I0308 03:47:08.884506 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hccv4\" (UniqueName: \"kubernetes.io/projected/0ebf1330-e044-4ff5-8b48-2d667e0c5625-kube-api-access-hccv4\") pod \"openshift-controller-manager-operator-8565d84698-kt66j\" (UID: \"0ebf1330-e044-4ff5-8b48-2d667e0c5625\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-kt66j" Mar 08 03:47:08.886531 master-0 kubenswrapper[4045]: I0308 03:47:08.886492 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mghmh\" (UniqueName: \"kubernetes.io/projected/0918ba32-8e55-48d0-8e50-027c0dcb4bbd-kube-api-access-mghmh\") pod \"openshift-config-operator-64488f9d78-vfgfp\" (UID: \"0918ba32-8e55-48d0-8e50-027c0dcb4bbd\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-vfgfp" Mar 08 03:47:08.886757 master-0 kubenswrapper[4045]: I0308 03:47:08.886723 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/26180f77-0b1a-4d0f-9ed0-a12fdee69817-kube-api-access\") pod \"kube-controller-manager-operator-86d7cdfdfb-chpl6\" (UID: \"26180f77-0b1a-4d0f-9ed0-a12fdee69817\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-chpl6" Mar 08 03:47:08.888475 master-0 kubenswrapper[4045]: I0308 03:47:08.888439 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/c9de4939-680a-4e3e-89fd-e20ecb8b10f2-bound-sa-token\") pod \"ingress-operator-677db989d6-t77qr\" (UID: \"c9de4939-680a-4e3e-89fd-e20ecb8b10f2\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-t77qr" Mar 08 03:47:08.896000 master-0 kubenswrapper[4045]: I0308 03:47:08.894535 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkzb2\" (UniqueName: \"kubernetes.io/projected/4c5a0c1d-867a-4ce4-9570-ea66452c8db3-kube-api-access-mkzb2\") pod \"iptables-alerter-7c28p\" (UID: \"4c5a0c1d-867a-4ce4-9570-ea66452c8db3\") " pod="openshift-network-operator/iptables-alerter-7c28p" Mar 08 03:47:08.896948 master-0 kubenswrapper[4045]: I0308 03:47:08.896850 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpsx7\" (UniqueName: \"kubernetes.io/projected/8efdcef9-9b31-4567-b7f9-cb59a894273d-kube-api-access-cpsx7\") pod \"dns-operator-589895fbb7-xttlz\" (UID: \"8efdcef9-9b31-4567-b7f9-cb59a894273d\") " pod="openshift-dns-operator/dns-operator-589895fbb7-xttlz" Mar 08 03:47:08.897738 master-0 kubenswrapper[4045]: I0308 03:47:08.897649 4045 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfvnn\" (UniqueName: \"kubernetes.io/projected/5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a-kube-api-access-cfvnn\") pod \"kube-storage-version-migrator-operator-7f65c457f5-6fhhs\" (UID: \"5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-6fhhs" Mar 08 03:47:08.921447 master-0 kubenswrapper[4045]: I0308 03:47:08.921405 4045 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-7c28p" Mar 08 03:47:08.932094 master-0 kubenswrapper[4045]: W0308 03:47:08.931604 4045 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c5a0c1d_867a_4ce4_9570_ea66452c8db3.slice/crio-49aec4971047b96e14ae56703fe099426b567477422c0add4be258e7ae9b7ff1 WatchSource:0}: Error finding container 49aec4971047b96e14ae56703fe099426b567477422c0add4be258e7ae9b7ff1: Status 404 returned error can't find the container with id 49aec4971047b96e14ae56703fe099426b567477422c0add4be258e7ae9b7ff1 Mar 08 03:47:08.932844 master-0 kubenswrapper[4045]: I0308 03:47:08.932796 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-x9h9q" Mar 08 03:47:08.952693 master-0 kubenswrapper[4045]: I0308 03:47:08.952388 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-chpl6" Mar 08 03:47:08.976616 master-0 kubenswrapper[4045]: I0308 03:47:08.976522 4045 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-5884b9cd56-vzms7"] Mar 08 03:47:08.978802 master-0 kubenswrapper[4045]: I0308 03:47:08.978765 4045 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-kg795" Mar 08 03:47:08.992376 master-0 kubenswrapper[4045]: W0308 03:47:08.992303 4045 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a7752f9_7b9a_451f_997a_e9f696d38b34.slice/crio-3235d3bd9c5f6c6a7e16ad74c79046e87f4d03278e4096c568a5930f544fbbf0 WatchSource:0}: Error finding container 3235d3bd9c5f6c6a7e16ad74c79046e87f4d03278e4096c568a5930f544fbbf0: Status 404 returned error can't find the container with id 3235d3bd9c5f6c6a7e16ad74c79046e87f4d03278e4096c568a5930f544fbbf0 Mar 08 03:47:08.996746 master-0 kubenswrapper[4045]: I0308 03:47:08.993289 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-6fhhs" Mar 08 03:47:09.001811 master-0 kubenswrapper[4045]: I0308 03:47:09.001771 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-slm72" Mar 08 03:47:09.010810 master-0 kubenswrapper[4045]: I0308 03:47:09.010765 4045 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-nm8fj" Mar 08 03:47:09.030570 master-0 kubenswrapper[4045]: I0308 03:47:09.030540 4045 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-g6n58"] Mar 08 03:47:09.036724 master-0 kubenswrapper[4045]: I0308 03:47:09.034962 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/6cde5024-edf7-4fa4-8964-cabe7899578b-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-c46zz\" (UID: \"6cde5024-edf7-4fa4-8964-cabe7899578b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-c46zz" Mar 08 03:47:09.036724 master-0 kubenswrapper[4045]: E0308 03:47:09.035156 4045 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 08 03:47:09.036724 master-0 kubenswrapper[4045]: E0308 03:47:09.035196 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6cde5024-edf7-4fa4-8964-cabe7899578b-package-server-manager-serving-cert podName:6cde5024-edf7-4fa4-8964-cabe7899578b nodeName:}" failed. No retries permitted until 2026-03-08 03:47:10.035183411 +0000 UTC m=+129.645884369 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/6cde5024-edf7-4fa4-8964-cabe7899578b-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-c46zz" (UID: "6cde5024-edf7-4fa4-8964-cabe7899578b") : secret "package-server-manager-serving-cert" not found Mar 08 03:47:09.044055 master-0 kubenswrapper[4045]: I0308 03:47:09.044017 4045 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-xhbrl" Mar 08 03:47:09.063116 master-0 kubenswrapper[4045]: I0308 03:47:09.063088 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vfgfp" Mar 08 03:47:09.068934 master-0 kubenswrapper[4045]: I0308 03:47:09.068057 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-8gfmf" Mar 08 03:47:09.075954 master-0 kubenswrapper[4045]: I0308 03:47:09.075867 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-kt66j" Mar 08 03:47:09.082001 master-0 kubenswrapper[4045]: I0308 03:47:09.081964 4045 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-75682" Mar 08 03:47:09.138255 master-0 kubenswrapper[4045]: I0308 03:47:09.138225 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-jghp5\" (UID: \"d831cb23-7411-4072-8273-c167d9afca28\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-jghp5" Mar 08 03:47:09.138329 master-0 kubenswrapper[4045]: I0308 03:47:09.138259 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1eb851be-f157-48ea-9a39-1361b68d2639-webhook-certs\") pod \"multus-admission-controller-8d675b596-j8pv6\" (UID: \"1eb851be-f157-48ea-9a39-1361b68d2639\") " pod="openshift-multus/multus-admission-controller-8d675b596-j8pv6" Mar 08 03:47:09.138329 master-0 kubenswrapper[4045]: I0308 03:47:09.138283 4045 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/0418ff42-7eac-4266-97b5-4df88623d066-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-clqwj\" (UID: \"0418ff42-7eac-4266-97b5-4df88623d066\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-clqwj" Mar 08 03:47:09.138329 master-0 kubenswrapper[4045]: I0308 03:47:09.138309 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/2dd4279d-a1a9-450a-a061-9008cd1ea8e0-srv-cert\") pod \"olm-operator-d64cfc9db-qddlp\" (UID: \"2dd4279d-a1a9-450a-a061-9008cd1ea8e0\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qddlp" Mar 08 03:47:09.138329 master-0 kubenswrapper[4045]: I0308 03:47:09.138324 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-qjv52\" (UID: \"232c421d-96f0-4894-b8d8-74f43d02bbd3\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-qjv52" Mar 08 03:47:09.138460 master-0 kubenswrapper[4045]: I0308 03:47:09.138358 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e78b283b-981e-48d7-a5f2-53f8401766ea-proxy-tls\") pod \"machine-config-operator-fdb5c78b5-2vjh2\" (UID: \"e78b283b-981e-48d7-a5f2-53f8401766ea\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-2vjh2" Mar 08 03:47:09.138460 master-0 kubenswrapper[4045]: I0308 03:47:09.138373 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1482d789-884b-4337-b598-f0e2b71eb9f2-srv-cert\") pod 
\"catalog-operator-7d9c49f57b-qlfgq\" (UID: \"1482d789-884b-4337-b598-f0e2b71eb9f2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-qlfgq" Mar 08 03:47:09.138460 master-0 kubenswrapper[4045]: I0308 03:47:09.138392 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-qjv52\" (UID: \"232c421d-96f0-4894-b8d8-74f43d02bbd3\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-qjv52" Mar 08 03:47:09.138460 master-0 kubenswrapper[4045]: I0308 03:47:09.138439 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-jghp5\" (UID: \"d831cb23-7411-4072-8273-c167d9afca28\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-jghp5" Mar 08 03:47:09.138460 master-0 kubenswrapper[4045]: I0308 03:47:09.138459 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c9de4939-680a-4e3e-89fd-e20ecb8b10f2-metrics-tls\") pod \"ingress-operator-677db989d6-t77qr\" (UID: \"c9de4939-680a-4e3e-89fd-e20ecb8b10f2\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-t77qr" Mar 08 03:47:09.138625 master-0 kubenswrapper[4045]: I0308 03:47:09.138476 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-572xh\" (UID: \"69eb8ba2-7bfb-4433-8951-08f89e7bcb5f\") " 
pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-572xh" Mar 08 03:47:09.138625 master-0 kubenswrapper[4045]: I0308 03:47:09.138496 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e78b283b-981e-48d7-a5f2-53f8401766ea-auth-proxy-config\") pod \"machine-config-operator-fdb5c78b5-2vjh2\" (UID: \"e78b283b-981e-48d7-a5f2-53f8401766ea\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-2vjh2" Mar 08 03:47:09.138625 master-0 kubenswrapper[4045]: I0308 03:47:09.138512 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8efdcef9-9b31-4567-b7f9-cb59a894273d-metrics-tls\") pod \"dns-operator-589895fbb7-xttlz\" (UID: \"8efdcef9-9b31-4567-b7f9-cb59a894273d\") " pod="openshift-dns-operator/dns-operator-589895fbb7-xttlz" Mar 08 03:47:09.138625 master-0 kubenswrapper[4045]: I0308 03:47:09.138530 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/54ad284e-d40e-4e69-b898-f5093952a0e6-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-9sw2d\" (UID: \"54ad284e-d40e-4e69-b898-f5093952a0e6\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-9sw2d" Mar 08 03:47:09.138625 master-0 kubenswrapper[4045]: E0308 03:47:09.138628 4045 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 08 03:47:09.138785 master-0 kubenswrapper[4045]: E0308 03:47:09.138663 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/54ad284e-d40e-4e69-b898-f5093952a0e6-marketplace-operator-metrics podName:54ad284e-d40e-4e69-b898-f5093952a0e6 nodeName:}" failed. 
No retries permitted until 2026-03-08 03:47:10.138651899 +0000 UTC m=+129.749352857 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/54ad284e-d40e-4e69-b898-f5093952a0e6-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-9sw2d" (UID: "54ad284e-d40e-4e69-b898-f5093952a0e6") : secret "marketplace-operator-metrics" not found Mar 08 03:47:09.139118 master-0 kubenswrapper[4045]: E0308 03:47:09.139083 4045 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-webhook-server-cert: secret "cluster-baremetal-webhook-server-cert" not found Mar 08 03:47:09.139172 master-0 kubenswrapper[4045]: E0308 03:47:09.139122 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cert podName:d831cb23-7411-4072-8273-c167d9afca28 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:10.139111981 +0000 UTC m=+129.749812939 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cert") pod "cluster-baremetal-operator-5cdb4c5598-jghp5" (UID: "d831cb23-7411-4072-8273-c167d9afca28") : secret "cluster-baremetal-webhook-server-cert" not found Mar 08 03:47:09.139172 master-0 kubenswrapper[4045]: E0308 03:47:09.139168 4045 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 08 03:47:09.139311 master-0 kubenswrapper[4045]: E0308 03:47:09.139196 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1eb851be-f157-48ea-9a39-1361b68d2639-webhook-certs podName:1eb851be-f157-48ea-9a39-1361b68d2639 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:10.139189633 +0000 UTC m=+129.749890591 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/1eb851be-f157-48ea-9a39-1361b68d2639-webhook-certs") pod "multus-admission-controller-8d675b596-j8pv6" (UID: "1eb851be-f157-48ea-9a39-1361b68d2639") : secret "multus-admission-controller-secret" not found Mar 08 03:47:09.139311 master-0 kubenswrapper[4045]: E0308 03:47:09.139229 4045 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 08 03:47:09.139311 master-0 kubenswrapper[4045]: E0308 03:47:09.139246 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0418ff42-7eac-4266-97b5-4df88623d066-cluster-monitoring-operator-tls podName:0418ff42-7eac-4266-97b5-4df88623d066 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:10.139240614 +0000 UTC m=+129.749941572 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/0418ff42-7eac-4266-97b5-4df88623d066-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-clqwj" (UID: "0418ff42-7eac-4266-97b5-4df88623d066") : secret "cluster-monitoring-operator-tls" not found Mar 08 03:47:09.139311 master-0 kubenswrapper[4045]: E0308 03:47:09.139276 4045 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 08 03:47:09.139311 master-0 kubenswrapper[4045]: E0308 03:47:09.139292 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2dd4279d-a1a9-450a-a061-9008cd1ea8e0-srv-cert podName:2dd4279d-a1a9-450a-a061-9008cd1ea8e0 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:10.139286395 +0000 UTC m=+129.749987353 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/2dd4279d-a1a9-450a-a061-9008cd1ea8e0-srv-cert") pod "olm-operator-d64cfc9db-qddlp" (UID: "2dd4279d-a1a9-450a-a061-9008cd1ea8e0") : secret "olm-operator-serving-cert" not found Mar 08 03:47:09.139497 master-0 kubenswrapper[4045]: E0308 03:47:09.139318 4045 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 08 03:47:09.139497 master-0 kubenswrapper[4045]: E0308 03:47:09.139335 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-apiservice-cert podName:232c421d-96f0-4894-b8d8-74f43d02bbd3 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:10.139330496 +0000 UTC m=+129.750031454 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-apiservice-cert") pod "cluster-node-tuning-operator-66c7586884-qjv52" (UID: "232c421d-96f0-4894-b8d8-74f43d02bbd3") : secret "performance-addon-operator-webhook-cert" not found Mar 08 03:47:09.139497 master-0 kubenswrapper[4045]: E0308 03:47:09.139365 4045 secret.go:189] Couldn't get secret openshift-machine-config-operator/mco-proxy-tls: secret "mco-proxy-tls" not found Mar 08 03:47:09.139497 master-0 kubenswrapper[4045]: E0308 03:47:09.139380 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e78b283b-981e-48d7-a5f2-53f8401766ea-proxy-tls podName:e78b283b-981e-48d7-a5f2-53f8401766ea nodeName:}" failed. No retries permitted until 2026-03-08 03:47:10.139375037 +0000 UTC m=+129.750075995 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/e78b283b-981e-48d7-a5f2-53f8401766ea-proxy-tls") pod "machine-config-operator-fdb5c78b5-2vjh2" (UID: "e78b283b-981e-48d7-a5f2-53f8401766ea") : secret "mco-proxy-tls" not found Mar 08 03:47:09.139497 master-0 kubenswrapper[4045]: E0308 03:47:09.139407 4045 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 08 03:47:09.139497 master-0 kubenswrapper[4045]: E0308 03:47:09.139440 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1482d789-884b-4337-b598-f0e2b71eb9f2-srv-cert podName:1482d789-884b-4337-b598-f0e2b71eb9f2 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:10.139416508 +0000 UTC m=+129.750117466 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/1482d789-884b-4337-b598-f0e2b71eb9f2-srv-cert") pod "catalog-operator-7d9c49f57b-qlfgq" (UID: "1482d789-884b-4337-b598-f0e2b71eb9f2") : secret "catalog-operator-serving-cert" not found Mar 08 03:47:09.139497 master-0 kubenswrapper[4045]: E0308 03:47:09.139467 4045 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 08 03:47:09.139497 master-0 kubenswrapper[4045]: E0308 03:47:09.139482 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-node-tuning-operator-tls podName:232c421d-96f0-4894-b8d8-74f43d02bbd3 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:10.139477279 +0000 UTC m=+129.750178237 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-node-tuning-operator-tls") pod "cluster-node-tuning-operator-66c7586884-qjv52" (UID: "232c421d-96f0-4894-b8d8-74f43d02bbd3") : secret "node-tuning-operator-tls" not found Mar 08 03:47:09.139497 master-0 kubenswrapper[4045]: E0308 03:47:09.139508 4045 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-operator-tls: secret "cluster-baremetal-operator-tls" not found Mar 08 03:47:09.139789 master-0 kubenswrapper[4045]: E0308 03:47:09.139525 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cluster-baremetal-operator-tls podName:d831cb23-7411-4072-8273-c167d9afca28 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:10.13951987 +0000 UTC m=+129.750220828 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-baremetal-operator-tls" (UniqueName: "kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cluster-baremetal-operator-tls") pod "cluster-baremetal-operator-5cdb4c5598-jghp5" (UID: "d831cb23-7411-4072-8273-c167d9afca28") : secret "cluster-baremetal-operator-tls" not found Mar 08 03:47:09.139789 master-0 kubenswrapper[4045]: E0308 03:47:09.139555 4045 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 08 03:47:09.139789 master-0 kubenswrapper[4045]: E0308 03:47:09.139569 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c9de4939-680a-4e3e-89fd-e20ecb8b10f2-metrics-tls podName:c9de4939-680a-4e3e-89fd-e20ecb8b10f2 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:10.139564471 +0000 UTC m=+129.750265429 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c9de4939-680a-4e3e-89fd-e20ecb8b10f2-metrics-tls") pod "ingress-operator-677db989d6-t77qr" (UID: "c9de4939-680a-4e3e-89fd-e20ecb8b10f2") : secret "metrics-tls" not found Mar 08 03:47:09.139789 master-0 kubenswrapper[4045]: E0308 03:47:09.139597 4045 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Mar 08 03:47:09.139789 master-0 kubenswrapper[4045]: E0308 03:47:09.139613 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f-image-registry-operator-tls podName:69eb8ba2-7bfb-4433-8951-08f89e7bcb5f nodeName:}" failed. No retries permitted until 2026-03-08 03:47:10.139608712 +0000 UTC m=+129.750309670 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f-image-registry-operator-tls") pod "cluster-image-registry-operator-86d6d77c7c-572xh" (UID: "69eb8ba2-7bfb-4433-8951-08f89e7bcb5f") : secret "image-registry-operator-tls" not found Mar 08 03:47:09.139789 master-0 kubenswrapper[4045]: E0308 03:47:09.139638 4045 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/kube-rbac-proxy: configmap "kube-rbac-proxy" not found Mar 08 03:47:09.139789 master-0 kubenswrapper[4045]: E0308 03:47:09.139655 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e78b283b-981e-48d7-a5f2-53f8401766ea-auth-proxy-config podName:e78b283b-981e-48d7-a5f2-53f8401766ea nodeName:}" failed. No retries permitted until 2026-03-08 03:47:10.139650483 +0000 UTC m=+129.750351441 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/e78b283b-981e-48d7-a5f2-53f8401766ea-auth-proxy-config") pod "machine-config-operator-fdb5c78b5-2vjh2" (UID: "e78b283b-981e-48d7-a5f2-53f8401766ea") : configmap "kube-rbac-proxy" not found Mar 08 03:47:09.139789 master-0 kubenswrapper[4045]: E0308 03:47:09.139685 4045 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 08 03:47:09.139789 master-0 kubenswrapper[4045]: E0308 03:47:09.139700 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8efdcef9-9b31-4567-b7f9-cb59a894273d-metrics-tls podName:8efdcef9-9b31-4567-b7f9-cb59a894273d nodeName:}" failed. No retries permitted until 2026-03-08 03:47:10.139695864 +0000 UTC m=+129.750396822 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8efdcef9-9b31-4567-b7f9-cb59a894273d-metrics-tls") pod "dns-operator-589895fbb7-xttlz" (UID: "8efdcef9-9b31-4567-b7f9-cb59a894273d") : secret "metrics-tls" not found Mar 08 03:47:09.151677 master-0 kubenswrapper[4045]: I0308 03:47:09.151625 4045 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-8f89dfddd-4mr6p" Mar 08 03:47:09.189878 master-0 kubenswrapper[4045]: I0308 03:47:09.189662 4045 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69b6fc6b88-kg795"] Mar 08 03:47:09.199250 master-0 kubenswrapper[4045]: W0308 03:47:09.199070 4045 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d377285_0336_41b7_b48f_c44a7b563498.slice/crio-2899b4e2a1cabd8aea96b1bf0db490c7e98f0e9564c40236186985f7b516039b WatchSource:0}: Error finding container 2899b4e2a1cabd8aea96b1bf0db490c7e98f0e9564c40236186985f7b516039b: Status 404 returned error can't find the container with id 2899b4e2a1cabd8aea96b1bf0db490c7e98f0e9564c40236186985f7b516039b Mar 08 03:47:09.313754 master-0 kubenswrapper[4045]: I0308 03:47:09.313714 4045 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-xhbrl"] Mar 08 03:47:09.314797 master-0 kubenswrapper[4045]: I0308 03:47:09.314768 4045 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-64488f9d78-vfgfp"] Mar 08 03:47:09.359121 master-0 kubenswrapper[4045]: I0308 03:47:09.356820 4045 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-8gfmf"] Mar 08 03:47:09.367261 master-0 kubenswrapper[4045]: I0308 03:47:09.366447 4045 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-75682"] Mar 08 03:47:09.370370 master-0 kubenswrapper[4045]: I0308 03:47:09.370289 4045 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-kt66j"] Mar 08 03:47:09.371241 master-0 kubenswrapper[4045]: W0308 03:47:09.370509 4045 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30211469_7108_4820_a988_26fc4ced734e.slice/crio-47433e6e63affa1ba02608e11b299ca5af00d1c85e6731e35f43a4b241522538 WatchSource:0}: Error finding container 47433e6e63affa1ba02608e11b299ca5af00d1c85e6731e35f43a4b241522538: Status 404 returned error can't find the container with id 47433e6e63affa1ba02608e11b299ca5af00d1c85e6731e35f43a4b241522538 Mar 08 03:47:09.411115 master-0 kubenswrapper[4045]: I0308 03:47:09.410926 4045 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-x9h9q"] Mar 08 03:47:09.412954 master-0 kubenswrapper[4045]: I0308 03:47:09.412926 4045 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-6fhhs"] Mar 08 03:47:09.415887 master-0 kubenswrapper[4045]: W0308 03:47:09.415858 4045 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ff63c73_62a3_44b4_acd3_1b3df175794f.slice/crio-9b135e9cc968b9e23fda104dcc5dd8cbf50632e21d670c61642446eb2eb45282 WatchSource:0}: Error finding container 9b135e9cc968b9e23fda104dcc5dd8cbf50632e21d670c61642446eb2eb45282: Status 404 returned error can't find the container with id 9b135e9cc968b9e23fda104dcc5dd8cbf50632e21d670c61642446eb2eb45282 Mar 08 03:47:09.420212 master-0 kubenswrapper[4045]: I0308 03:47:09.420171 4045 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-8f89dfddd-4mr6p"] Mar 08 03:47:09.420894 master-0 kubenswrapper[4045]: W0308 03:47:09.420851 4045 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5953ccfc_d0f9_4e24_bf59_bfa85a2b9e4a.slice/crio-14fe5cb6383f1129ecf327e882bdb7904f8ad1a8a2cc2647d9ee96534b6ccb93 WatchSource:0}: 
Error finding container 14fe5cb6383f1129ecf327e882bdb7904f8ad1a8a2cc2647d9ee96534b6ccb93: Status 404 returned error can't find the container with id 14fe5cb6383f1129ecf327e882bdb7904f8ad1a8a2cc2647d9ee96534b6ccb93 Mar 08 03:47:09.463015 master-0 kubenswrapper[4045]: I0308 03:47:09.462944 4045 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-nm8fj"] Mar 08 03:47:09.464887 master-0 kubenswrapper[4045]: I0308 03:47:09.464836 4045 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-chpl6"] Mar 08 03:47:09.472671 master-0 kubenswrapper[4045]: I0308 03:47:09.472525 4045 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-7c6989d6c4-slm72"] Mar 08 03:47:09.473730 master-0 kubenswrapper[4045]: W0308 03:47:09.473684 4045 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26180f77_0b1a_4d0f_9ed0_a12fdee69817.slice/crio-3d64c10d51d4d9009da402a9f2c51b81830f1695b7370548200097f367d254f2 WatchSource:0}: Error finding container 3d64c10d51d4d9009da402a9f2c51b81830f1695b7370548200097f367d254f2: Status 404 returned error can't find the container with id 3d64c10d51d4d9009da402a9f2c51b81830f1695b7370548200097f367d254f2 Mar 08 03:47:09.475264 master-0 kubenswrapper[4045]: W0308 03:47:09.475239 4045 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3eea925_73b3_4693_8f0e_6dd26107f60a.slice/crio-d47d14f256ba67306efe8da7bbcadc67f946b747f7e0a1d658a9687f1f0a1a37 WatchSource:0}: Error finding container d47d14f256ba67306efe8da7bbcadc67f946b747f7e0a1d658a9687f1f0a1a37: Status 404 returned error can't find the container with id d47d14f256ba67306efe8da7bbcadc67f946b747f7e0a1d658a9687f1f0a1a37 Mar 08 03:47:09.476867 
master-0 kubenswrapper[4045]: W0308 03:47:09.476769 4045 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda60bc804_52e7_422a_87fd_ac4c5aa90cb3.slice/crio-2a8e902cd252f0c879e3e1c00047d04c3e8646bfeed72f034a41537b464f6d14 WatchSource:0}: Error finding container 2a8e902cd252f0c879e3e1c00047d04c3e8646bfeed72f034a41537b464f6d14: Status 404 returned error can't find the container with id 2a8e902cd252f0c879e3e1c00047d04c3e8646bfeed72f034a41537b464f6d14 Mar 08 03:47:09.477999 master-0 kubenswrapper[4045]: E0308 03:47:09.477703 4045 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cluster-storage-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9bd0b71d620cf0acbfcd1b58797dc30050bd167cb6b7a7f62c8333dd370c76d5,Command:[cluster-storage-operator start],Args:[start -v=2 --terminate-on-files=/var/run/secrets/serving-cert/tls.crt --terminate-on-files=/var/run/secrets/serving-cert/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_IMAGE_VERSION,Value:4.18.34,ValueFrom:nil,},EnvVar{Name:OPERAND_IMAGE_VERSION,Value:4.18.34,ValueFrom:nil,},EnvVar{Name:AWS_EBS_DRIVER_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d56c7f1d581c34f2ea2f67902eb71069e48523bdb5d964a9f6f63aa99f968876,ValueFrom:nil,},EnvVar{Name:AWS_EBS_DRIVER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06ef6dcf5e3f0f83ed90209f5d3b31dab1debd1049ec97ec92f4f800abea8b78,ValueFrom:nil,},EnvVar{Name:GCP_PD_DRIVER_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3bf7ebb7d731da7f28d37117f4a38c9ee300ffd76e1f237bc1aab40390bbeb1c,ValueFrom:nil,},EnvVar{Name:GCP_PD_DRIVER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf2ee91fb20b7873c456d2a45a997ad3e9bb9f9b879027e61fe4c413ae0d6449,ValueFrom:nil,},EnvVar{Name:OPENS
TACK_CINDER_DRIVER_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fb0c6965c3634596b8bfb56605d8b3ffca300481045a3e03524c7a37f62e3875,ValueFrom:nil,},EnvVar{Name:OPENSTACK_CINDER_DRIVER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3c8eb332b349325e5adabd9c1dcce16ff8f0fdb42c6385841206b80f946192d8,ValueFrom:nil,},EnvVar{Name:OVIRT_DRIVER_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fc839f06b007c8a18ff270a4677e03bf095fe8750beceeb26fa1bc3c15063ba5,ValueFrom:nil,},EnvVar{Name:OVIRT_DRIVER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:961c2c0c561028c1b0ff7eb979519659bae2ad7ebeda8c31f9790dfff7bcf52c,ValueFrom:nil,},EnvVar{Name:MANILA_DRIVER_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dcb9a946195c27c7175b8d8edd8de889a89902b3bd07bb0ba2c6bc9f7facb87c,ValueFrom:nil,},EnvVar{Name:MANILA_DRIVER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cd655557446f01d07862cf3ec1a20ff68d6675efb7c485af5c51227444c38ffd,ValueFrom:nil,},EnvVar{Name:MANILA_NFS_DRIVER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3e30f12a259fe5a18431e487e5868c85753d7e22d44c34cfa9c47728a4ac95bc,ValueFrom:nil,},EnvVar{Name:PROVISIONER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:035e2d85907cee77c0fa3a52a6a65f07fd4175bb16072801c5dca7517d1298c9,ValueFrom:nil,},EnvVar{Name:ATTACHER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:81ea02ba3286c8748ae8607730de107584b888c0827f65569310957e6f73f7ff,ValueFrom:nil,},EnvVar{Name:RESIZER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:547a81df21302364dd9d6df89e6c1c665d02d891a2e3853f0747431605210186,ValueFrom:nil,},EnvVar{Name:SNAPSHOTTER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1eaf4783075f35c405b7f1eba3cde9bca9f707da7374315a64ccf764ecbbb47,ValueFrom:nil,},EnvVar{Name:NODE_DRIVER_REGISTRAR_IMAGE,Value:quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:1441b067af19141e16ea5589f525c1f99e6b4bcf91008fd480b517b251dd2dc1,ValueFrom:nil,},EnvVar{Name:LIVENESS_PROBE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:78078bbcd93ceab253456f3d551a382f6e8974f71f7c8d90aa1650aeb61065b7,ValueFrom:nil,},EnvVar{Name:VSPHERE_PROBLEM_DETECTOR_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5bc6341864a48b01600b126452dcceaeb95fe4cc951ac346ddc83a223e414cf3,ValueFrom:nil,},EnvVar{Name:AZURE_DISK_DRIVER_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2cc1330dd935fc183a3ded96f68f265a2f4c2e5ce3ea6838171d2c146c0e69af,ValueFrom:nil,},EnvVar{Name:AZURE_DISK_DRIVER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1757b0d6e876ded828d41fa93b19a7c739275ebfa17883654ff0442dba9bd643,ValueFrom:nil,},EnvVar{Name:AZURE_FILE_DRIVER_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10adf4af9eb4b3e9072b49f2298d0e5275604686d7b01a04c0a1bcb6fc19f291,ValueFrom:nil,},EnvVar{Name:AZURE_FILE_DRIVER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f8a21af582f1840c82bba649d5981193dd88a14595cd7fc37e5722b7178c8921,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8677f7a973553c25d282bc249fc8bc0f5aa42fb144ea0956d1f04c5a6cd80501,ValueFrom:nil,},EnvVar{Name:VMWARE_VSPHERE_DRIVER_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f604982bcdbed79ab78ca987b7a394c2376873079e8dbf6eb987880c6675c69f,ValueFrom:nil,},EnvVar{Name:VMWARE_VSPHERE_DRIVER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f9f05c5864803e2582f288838678273d03e807b8b67d036a7cf378b187acc760,ValueFrom:nil,},EnvVar{Name:VMWARE_VSPHERE_SYNCER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ba412af618963fda7eed461b3934b00e62d05c87ced8bf2ec941e62e149808ad,ValueFrom:nil,},EnvVar{Name:CLUSTER_CLOUD_CONTROLLER_MANAGER_OPERATOR_IMAGE,Value:quay.io/opens
hift-release-dev/ocp-v4.0-art-dev@sha256:d470dba32064cc62b2ab29303d6e00612304548262eaa2f4e5b40a00a26f71ce,ValueFrom:nil,},EnvVar{Name:IBM_VPC_BLOCK_DRIVER_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a5b86ea1a69ac634d7ed1282fe01b2330ee682a1c4d0fe5c4572f36b4d654ebc,ValueFrom:nil,},EnvVar{Name:IBM_VPC_BLOCK_DRIVER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4a1eb5fee11695dc15d201e6115f420d6e106e15e2e9982335ed8176b504d6e6,ValueFrom:nil,},EnvVar{Name:POWERVS_BLOCK_CSI_DRIVER_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5443f312d8a9a14766a3b82972589cb0b1623c9649be1e6df60f1aa96aa5592f,ValueFrom:nil,},EnvVar{Name:POWERVS_BLOCK_CSI_DRIVER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d36472a77899acc21925f5cf4ec07f11dbfaedf45b6f11623aa751921a5af823,ValueFrom:nil,},EnvVar{Name:TOOLS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35768a0c3eb24134dd38633e8acfc7db69ee96b2fd660e9bba3b8c996452fef7,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{20971520 0} {} 20Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cluster-storage-operator-serving-cert,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6sx5s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cluster-storage-operator-6fbfc8dc8f-nm8fj_openshift-cluster-storage-operator(b3eea925-73b3-4693-8f0e-6dd26107f60a): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 08 03:47:09.479139 master-0 kubenswrapper[4045]: E0308 03:47:09.479087 4045 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 03:47:09.479139 master-0 kubenswrapper[4045]: container &Container{Name:authentication-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9bd818e37e1f9dbe5393c557b89e81010d68171408e0e4157a3d92ae0ca1c953,Command:[/bin/bash -ec],Args:[if [ -s /var/run/configmaps/trusted-ca-bundle/ca-bundle.crt ]; then Mar 08 03:47:09.479139 master-0 kubenswrapper[4045]: echo "Copying system trust bundle" Mar 08 03:47:09.479139 master-0 kubenswrapper[4045]: cp -f /var/run/configmaps/trusted-ca-bundle/ca-bundle.crt /etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem Mar 08 03:47:09.479139 master-0 
kubenswrapper[4045]: fi Mar 08 03:47:09.479139 master-0 kubenswrapper[4045]: exec authentication-operator operator --config=/var/run/configmaps/config/operator-config.yaml --v=2 --terminate-on-files=/var/run/configmaps/trusted-ca-bundle/ca-bundle.crt --terminate-on-files=/tmp/terminate Mar 08 03:47:09.479139 master-0 kubenswrapper[4045]: ],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:IMAGE_OAUTH_SERVER,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3d3571ade02a7c61123d62c53fda6a57031a52c058c0571759dc09f96b23978f,ValueFrom:nil,},EnvVar{Name:IMAGE_OAUTH_APISERVER,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5de69354d08184ecd6144facc1461777674674e8304971216d4cf1a5025472b9,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE_VERSION,Value:4.18.34,ValueFrom:nil,},EnvVar{Name:OPERAND_OAUTH_SERVER_IMAGE_VERSION,Value:4.18.34_openshift,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{209715200 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/var/run/configmaps/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:serving-cert,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:trusted-ca-bundle,ReadOnly:true,MountPath:/var/run/configmaps/trusted-ca-bundle,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:service-ca-bundle,ReadOnly:true,MountPath:/var/run/configmaps/service-ca-bundle,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zxkm6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:healthz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:30,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod authentication-operator-7c6989d6c4-slm72_openshift-authentication-operator(a60bc804-52e7-422a-87fd-ac4c5aa90cb3): ErrImagePull: pull QPS exceeded Mar 08 03:47:09.479139 master-0 
kubenswrapper[4045]: > logger="UnhandledError" Mar 08 03:47:09.479483 master-0 kubenswrapper[4045]: E0308 03:47:09.479199 4045 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-storage-operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-nm8fj" podUID="b3eea925-73b3-4693-8f0e-6dd26107f60a" Mar 08 03:47:09.480415 master-0 kubenswrapper[4045]: E0308 03:47:09.480381 4045 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"authentication-operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-slm72" podUID="a60bc804-52e7-422a-87fd-ac4c5aa90cb3" Mar 08 03:47:09.759698 master-0 kubenswrapper[4045]: I0308 03:47:09.759578 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-xhbrl" event={"ID":"52b495ac-bb28-44f3-b925-3c54f86d5ec4","Type":"ContainerStarted","Data":"4482916b3b4b521cf75927dd45a05e0a2072a49de37c125a72612ca885ff96ce"} Mar 08 03:47:09.761439 master-0 kubenswrapper[4045]: I0308 03:47:09.761365 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-nm8fj" event={"ID":"b3eea925-73b3-4693-8f0e-6dd26107f60a","Type":"ContainerStarted","Data":"d47d14f256ba67306efe8da7bbcadc67f946b747f7e0a1d658a9687f1f0a1a37"} Mar 08 03:47:09.764047 master-0 kubenswrapper[4045]: I0308 03:47:09.762868 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-chpl6" event={"ID":"26180f77-0b1a-4d0f-9ed0-a12fdee69817","Type":"ContainerStarted","Data":"3d64c10d51d4d9009da402a9f2c51b81830f1695b7370548200097f367d254f2"} Mar 08 03:47:09.764047 master-0 kubenswrapper[4045]: E0308 03:47:09.763483 4045 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-storage-operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9bd0b71d620cf0acbfcd1b58797dc30050bd167cb6b7a7f62c8333dd370c76d5\\\"\"" pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-nm8fj" podUID="b3eea925-73b3-4693-8f0e-6dd26107f60a" Mar 08 03:47:09.765474 master-0 kubenswrapper[4045]: I0308 03:47:09.765423 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-kt66j" event={"ID":"0ebf1330-e044-4ff5-8b48-2d667e0c5625","Type":"ContainerStarted","Data":"133c0043d3a977b4007520994c1530f26391f82433e16ae8b2e991aa2092980b"} Mar 08 03:47:09.767164 master-0 kubenswrapper[4045]: I0308 03:47:09.767127 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-x9h9q" event={"ID":"7ff63c73-62a3-44b4-acd3-1b3df175794f","Type":"ContainerStarted","Data":"9b135e9cc968b9e23fda104dcc5dd8cbf50632e21d670c61642446eb2eb45282"} Mar 08 03:47:09.768615 master-0 kubenswrapper[4045]: I0308 03:47:09.768541 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-8gfmf" event={"ID":"1cbcb403-a424-4496-8c5c-5eb5e42dfb93","Type":"ContainerStarted","Data":"3b576ae60c0b63ec0db45afc74d3ab2b7a31ef872c28479883b2bca1465128e0"} Mar 08 03:47:09.768615 master-0 kubenswrapper[4045]: I0308 03:47:09.768570 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-8gfmf" event={"ID":"1cbcb403-a424-4496-8c5c-5eb5e42dfb93","Type":"ContainerStarted","Data":"adfc23f0d784d89240f88962b8f79cdf84a79077cb7581e94d0e19b479eeafaa"} Mar 08 03:47:09.770612 master-0 kubenswrapper[4045]: I0308 03:47:09.770435 4045 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-insights/insights-operator-8f89dfddd-4mr6p" event={"ID":"ee586416-6f56-4ea4-ad62-95de1e6df23b","Type":"ContainerStarted","Data":"a5455725a1362a8e870442eb2f0235fbea46c1d047d2183683f1ca346ec9c059"} Mar 08 03:47:09.771415 master-0 kubenswrapper[4045]: I0308 03:47:09.771355 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-vzms7" event={"ID":"5a7752f9-7b9a-451f-997a-e9f696d38b34","Type":"ContainerStarted","Data":"3235d3bd9c5f6c6a7e16ad74c79046e87f4d03278e4096c568a5930f544fbbf0"} Mar 08 03:47:09.772265 master-0 kubenswrapper[4045]: I0308 03:47:09.772232 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-kg795" event={"ID":"0d377285-0336-41b7-b48f-c44a7b563498","Type":"ContainerStarted","Data":"2899b4e2a1cabd8aea96b1bf0db490c7e98f0e9564c40236186985f7b516039b"} Mar 08 03:47:09.773110 master-0 kubenswrapper[4045]: I0308 03:47:09.773076 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-6fhhs" event={"ID":"5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a","Type":"ContainerStarted","Data":"14fe5cb6383f1129ecf327e882bdb7904f8ad1a8a2cc2647d9ee96534b6ccb93"} Mar 08 03:47:09.773778 master-0 kubenswrapper[4045]: I0308 03:47:09.773743 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-7c28p" event={"ID":"4c5a0c1d-867a-4ce4-9570-ea66452c8db3","Type":"ContainerStarted","Data":"49aec4971047b96e14ae56703fe099426b567477422c0add4be258e7ae9b7ff1"} Mar 08 03:47:09.774744 master-0 kubenswrapper[4045]: I0308 03:47:09.774698 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-75682" 
event={"ID":"30211469-7108-4820-a988-26fc4ced734e","Type":"ContainerStarted","Data":"47433e6e63affa1ba02608e11b299ca5af00d1c85e6731e35f43a4b241522538"} Mar 08 03:47:09.775640 master-0 kubenswrapper[4045]: I0308 03:47:09.775607 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-g6n58" event={"ID":"e4541b7b-3f7f-4851-9bd9-26fcda5cab13","Type":"ContainerStarted","Data":"d73b051671cc575452964e4ec7abae8ed2cf8ae1de2a3be5460a27e068329e94"} Mar 08 03:47:09.776832 master-0 kubenswrapper[4045]: I0308 03:47:09.776793 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-slm72" event={"ID":"a60bc804-52e7-422a-87fd-ac4c5aa90cb3","Type":"ContainerStarted","Data":"2a8e902cd252f0c879e3e1c00047d04c3e8646bfeed72f034a41537b464f6d14"} Mar 08 03:47:09.778264 master-0 kubenswrapper[4045]: E0308 03:47:09.778037 4045 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"authentication-operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9bd818e37e1f9dbe5393c557b89e81010d68171408e0e4157a3d92ae0ca1c953\\\"\"" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-slm72" podUID="a60bc804-52e7-422a-87fd-ac4c5aa90cb3" Mar 08 03:47:09.780546 master-0 kubenswrapper[4045]: I0308 03:47:09.780488 4045 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vfgfp" event={"ID":"0918ba32-8e55-48d0-8e50-027c0dcb4bbd","Type":"ContainerStarted","Data":"df9f6505570d879efae2662d6149a2ae417f35b1bed956f7339c92d857b81707"} Mar 08 03:47:09.863769 master-0 kubenswrapper[4045]: I0308 03:47:09.817736 4045 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-8gfmf" 
podStartSLOduration=92.81769986 podStartE2EDuration="1m32.81769986s" podCreationTimestamp="2026-03-08 03:45:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:47:09.810526561 +0000 UTC m=+129.421227579" watchObservedRunningTime="2026-03-08 03:47:09.81769986 +0000 UTC m=+129.428400898" Mar 08 03:47:10.078323 master-0 kubenswrapper[4045]: I0308 03:47:10.078244 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/6cde5024-edf7-4fa4-8964-cabe7899578b-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-c46zz\" (UID: \"6cde5024-edf7-4fa4-8964-cabe7899578b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-c46zz" Mar 08 03:47:10.078543 master-0 kubenswrapper[4045]: E0308 03:47:10.078515 4045 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 08 03:47:10.078997 master-0 kubenswrapper[4045]: E0308 03:47:10.078599 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6cde5024-edf7-4fa4-8964-cabe7899578b-package-server-manager-serving-cert podName:6cde5024-edf7-4fa4-8964-cabe7899578b nodeName:}" failed. No retries permitted until 2026-03-08 03:47:12.078577657 +0000 UTC m=+131.689278625 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/6cde5024-edf7-4fa4-8964-cabe7899578b-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-c46zz" (UID: "6cde5024-edf7-4fa4-8964-cabe7899578b") : secret "package-server-manager-serving-cert" not found Mar 08 03:47:10.179004 master-0 kubenswrapper[4045]: I0308 03:47:10.178875 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/2dd4279d-a1a9-450a-a061-9008cd1ea8e0-srv-cert\") pod \"olm-operator-d64cfc9db-qddlp\" (UID: \"2dd4279d-a1a9-450a-a061-9008cd1ea8e0\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qddlp" Mar 08 03:47:10.179249 master-0 kubenswrapper[4045]: E0308 03:47:10.179169 4045 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 08 03:47:10.179319 master-0 kubenswrapper[4045]: E0308 03:47:10.179298 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2dd4279d-a1a9-450a-a061-9008cd1ea8e0-srv-cert podName:2dd4279d-a1a9-450a-a061-9008cd1ea8e0 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:12.17927334 +0000 UTC m=+131.789974358 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/2dd4279d-a1a9-450a-a061-9008cd1ea8e0-srv-cert") pod "olm-operator-d64cfc9db-qddlp" (UID: "2dd4279d-a1a9-450a-a061-9008cd1ea8e0") : secret "olm-operator-serving-cert" not found Mar 08 03:47:10.180231 master-0 kubenswrapper[4045]: I0308 03:47:10.179974 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-qjv52\" (UID: \"232c421d-96f0-4894-b8d8-74f43d02bbd3\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-qjv52" Mar 08 03:47:10.180316 master-0 kubenswrapper[4045]: I0308 03:47:10.180282 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e78b283b-981e-48d7-a5f2-53f8401766ea-proxy-tls\") pod \"machine-config-operator-fdb5c78b5-2vjh2\" (UID: \"e78b283b-981e-48d7-a5f2-53f8401766ea\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-2vjh2" Mar 08 03:47:10.180399 master-0 kubenswrapper[4045]: I0308 03:47:10.180378 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1482d789-884b-4337-b598-f0e2b71eb9f2-srv-cert\") pod \"catalog-operator-7d9c49f57b-qlfgq\" (UID: \"1482d789-884b-4337-b598-f0e2b71eb9f2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-qlfgq" Mar 08 03:47:10.180508 master-0 kubenswrapper[4045]: I0308 03:47:10.180483 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-qjv52\" (UID: \"232c421d-96f0-4894-b8d8-74f43d02bbd3\") " 
pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-qjv52" Mar 08 03:47:10.180605 master-0 kubenswrapper[4045]: I0308 03:47:10.180582 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c9de4939-680a-4e3e-89fd-e20ecb8b10f2-metrics-tls\") pod \"ingress-operator-677db989d6-t77qr\" (UID: \"c9de4939-680a-4e3e-89fd-e20ecb8b10f2\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-t77qr" Mar 08 03:47:10.180661 master-0 kubenswrapper[4045]: I0308 03:47:10.180619 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-jghp5\" (UID: \"d831cb23-7411-4072-8273-c167d9afca28\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-jghp5" Mar 08 03:47:10.180661 master-0 kubenswrapper[4045]: I0308 03:47:10.180656 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-572xh\" (UID: \"69eb8ba2-7bfb-4433-8951-08f89e7bcb5f\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-572xh" Mar 08 03:47:10.180744 master-0 kubenswrapper[4045]: I0308 03:47:10.180687 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e78b283b-981e-48d7-a5f2-53f8401766ea-auth-proxy-config\") pod \"machine-config-operator-fdb5c78b5-2vjh2\" (UID: \"e78b283b-981e-48d7-a5f2-53f8401766ea\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-2vjh2" Mar 08 03:47:10.180744 master-0 kubenswrapper[4045]: I0308 03:47:10.180716 4045 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8efdcef9-9b31-4567-b7f9-cb59a894273d-metrics-tls\") pod \"dns-operator-589895fbb7-xttlz\" (UID: \"8efdcef9-9b31-4567-b7f9-cb59a894273d\") " pod="openshift-dns-operator/dns-operator-589895fbb7-xttlz"
Mar 08 03:47:10.180810 master-0 kubenswrapper[4045]: I0308 03:47:10.180746 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/54ad284e-d40e-4e69-b898-f5093952a0e6-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-9sw2d\" (UID: \"54ad284e-d40e-4e69-b898-f5093952a0e6\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-9sw2d"
Mar 08 03:47:10.180810 master-0 kubenswrapper[4045]: I0308 03:47:10.180777 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-jghp5\" (UID: \"d831cb23-7411-4072-8273-c167d9afca28\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-jghp5"
Mar 08 03:47:10.181760 master-0 kubenswrapper[4045]: I0308 03:47:10.181738 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/0418ff42-7eac-4266-97b5-4df88623d066-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-clqwj\" (UID: \"0418ff42-7eac-4266-97b5-4df88623d066\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-clqwj"
Mar 08 03:47:10.181850 master-0 kubenswrapper[4045]: I0308 03:47:10.181771 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1eb851be-f157-48ea-9a39-1361b68d2639-webhook-certs\") pod \"multus-admission-controller-8d675b596-j8pv6\" (UID: \"1eb851be-f157-48ea-9a39-1361b68d2639\") " pod="openshift-multus/multus-admission-controller-8d675b596-j8pv6"
Mar 08 03:47:10.181940 master-0 kubenswrapper[4045]: E0308 03:47:10.181917 4045 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found
Mar 08 03:47:10.182004 master-0 kubenswrapper[4045]: E0308 03:47:10.181958 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1eb851be-f157-48ea-9a39-1361b68d2639-webhook-certs podName:1eb851be-f157-48ea-9a39-1361b68d2639 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:12.181946733 +0000 UTC m=+131.792647691 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/1eb851be-f157-48ea-9a39-1361b68d2639-webhook-certs") pod "multus-admission-controller-8d675b596-j8pv6" (UID: "1eb851be-f157-48ea-9a39-1361b68d2639") : secret "multus-admission-controller-secret" not found
Mar 08 03:47:10.182054 master-0 kubenswrapper[4045]: E0308 03:47:10.182009 4045 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found
Mar 08 03:47:10.182054 master-0 kubenswrapper[4045]: E0308 03:47:10.182036 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-apiservice-cert podName:232c421d-96f0-4894-b8d8-74f43d02bbd3 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:12.182026515 +0000 UTC m=+131.792727473 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-apiservice-cert") pod "cluster-node-tuning-operator-66c7586884-qjv52" (UID: "232c421d-96f0-4894-b8d8-74f43d02bbd3") : secret "performance-addon-operator-webhook-cert" not found
Mar 08 03:47:10.182131 master-0 kubenswrapper[4045]: E0308 03:47:10.182089 4045 secret.go:189] Couldn't get secret openshift-machine-config-operator/mco-proxy-tls: secret "mco-proxy-tls" not found
Mar 08 03:47:10.182131 master-0 kubenswrapper[4045]: E0308 03:47:10.182115 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e78b283b-981e-48d7-a5f2-53f8401766ea-proxy-tls podName:e78b283b-981e-48d7-a5f2-53f8401766ea nodeName:}" failed. No retries permitted until 2026-03-08 03:47:12.182106877 +0000 UTC m=+131.792807835 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/e78b283b-981e-48d7-a5f2-53f8401766ea-proxy-tls") pod "machine-config-operator-fdb5c78b5-2vjh2" (UID: "e78b283b-981e-48d7-a5f2-53f8401766ea") : secret "mco-proxy-tls" not found
Mar 08 03:47:10.182204 master-0 kubenswrapper[4045]: E0308 03:47:10.182154 4045 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found
Mar 08 03:47:10.182204 master-0 kubenswrapper[4045]: E0308 03:47:10.182176 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1482d789-884b-4337-b598-f0e2b71eb9f2-srv-cert podName:1482d789-884b-4337-b598-f0e2b71eb9f2 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:12.182168618 +0000 UTC m=+131.792869576 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/1482d789-884b-4337-b598-f0e2b71eb9f2-srv-cert") pod "catalog-operator-7d9c49f57b-qlfgq" (UID: "1482d789-884b-4337-b598-f0e2b71eb9f2") : secret "catalog-operator-serving-cert" not found
Mar 08 03:47:10.182277 master-0 kubenswrapper[4045]: E0308 03:47:10.182215 4045 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found
Mar 08 03:47:10.182277 master-0 kubenswrapper[4045]: E0308 03:47:10.182273 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-node-tuning-operator-tls podName:232c421d-96f0-4894-b8d8-74f43d02bbd3 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:12.18226424 +0000 UTC m=+131.792965278 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-node-tuning-operator-tls") pod "cluster-node-tuning-operator-66c7586884-qjv52" (UID: "232c421d-96f0-4894-b8d8-74f43d02bbd3") : secret "node-tuning-operator-tls" not found
Mar 08 03:47:10.182371 master-0 kubenswrapper[4045]: E0308 03:47:10.182326 4045 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found
Mar 08 03:47:10.182371 master-0 kubenswrapper[4045]: E0308 03:47:10.182368 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c9de4939-680a-4e3e-89fd-e20ecb8b10f2-metrics-tls podName:c9de4939-680a-4e3e-89fd-e20ecb8b10f2 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:12.182358473 +0000 UTC m=+131.793059431 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c9de4939-680a-4e3e-89fd-e20ecb8b10f2-metrics-tls") pod "ingress-operator-677db989d6-t77qr" (UID: "c9de4939-680a-4e3e-89fd-e20ecb8b10f2") : secret "metrics-tls" not found
Mar 08 03:47:10.182454 master-0 kubenswrapper[4045]: E0308 03:47:10.182418 4045 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-operator-tls: secret "cluster-baremetal-operator-tls" not found
Mar 08 03:47:10.182454 master-0 kubenswrapper[4045]: E0308 03:47:10.182445 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cluster-baremetal-operator-tls podName:d831cb23-7411-4072-8273-c167d9afca28 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:12.182436164 +0000 UTC m=+131.793137122 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-baremetal-operator-tls" (UniqueName: "kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cluster-baremetal-operator-tls") pod "cluster-baremetal-operator-5cdb4c5598-jghp5" (UID: "d831cb23-7411-4072-8273-c167d9afca28") : secret "cluster-baremetal-operator-tls" not found
Mar 08 03:47:10.182523 master-0 kubenswrapper[4045]: E0308 03:47:10.182490 4045 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found
Mar 08 03:47:10.182523 master-0 kubenswrapper[4045]: E0308 03:47:10.182513 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f-image-registry-operator-tls podName:69eb8ba2-7bfb-4433-8951-08f89e7bcb5f nodeName:}" failed. No retries permitted until 2026-03-08 03:47:12.182506106 +0000 UTC m=+131.793207064 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f-image-registry-operator-tls") pod "cluster-image-registry-operator-86d6d77c7c-572xh" (UID: "69eb8ba2-7bfb-4433-8951-08f89e7bcb5f") : secret "image-registry-operator-tls" not found
Mar 08 03:47:10.182591 master-0 kubenswrapper[4045]: E0308 03:47:10.182564 4045 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/kube-rbac-proxy: configmap "kube-rbac-proxy" not found
Mar 08 03:47:10.182625 master-0 kubenswrapper[4045]: E0308 03:47:10.182592 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e78b283b-981e-48d7-a5f2-53f8401766ea-auth-proxy-config podName:e78b283b-981e-48d7-a5f2-53f8401766ea nodeName:}" failed. No retries permitted until 2026-03-08 03:47:12.182584188 +0000 UTC m=+131.793285146 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/e78b283b-981e-48d7-a5f2-53f8401766ea-auth-proxy-config") pod "machine-config-operator-fdb5c78b5-2vjh2" (UID: "e78b283b-981e-48d7-a5f2-53f8401766ea") : configmap "kube-rbac-proxy" not found
Mar 08 03:47:10.182666 master-0 kubenswrapper[4045]: E0308 03:47:10.182640 4045 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found
Mar 08 03:47:10.182666 master-0 kubenswrapper[4045]: E0308 03:47:10.182663 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8efdcef9-9b31-4567-b7f9-cb59a894273d-metrics-tls podName:8efdcef9-9b31-4567-b7f9-cb59a894273d nodeName:}" failed. No retries permitted until 2026-03-08 03:47:12.18265627 +0000 UTC m=+131.793357228 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8efdcef9-9b31-4567-b7f9-cb59a894273d-metrics-tls") pod "dns-operator-589895fbb7-xttlz" (UID: "8efdcef9-9b31-4567-b7f9-cb59a894273d") : secret "metrics-tls" not found
Mar 08 03:47:10.182730 master-0 kubenswrapper[4045]: E0308 03:47:10.182707 4045 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found
Mar 08 03:47:10.182730 master-0 kubenswrapper[4045]: E0308 03:47:10.182728 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/54ad284e-d40e-4e69-b898-f5093952a0e6-marketplace-operator-metrics podName:54ad284e-d40e-4e69-b898-f5093952a0e6 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:12.182720181 +0000 UTC m=+131.793421139 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/54ad284e-d40e-4e69-b898-f5093952a0e6-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-9sw2d" (UID: "54ad284e-d40e-4e69-b898-f5093952a0e6") : secret "marketplace-operator-metrics" not found
Mar 08 03:47:10.182803 master-0 kubenswrapper[4045]: E0308 03:47:10.182775 4045 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-webhook-server-cert: secret "cluster-baremetal-webhook-server-cert" not found
Mar 08 03:47:10.182803 master-0 kubenswrapper[4045]: E0308 03:47:10.182801 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cert podName:d831cb23-7411-4072-8273-c167d9afca28 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:12.182791953 +0000 UTC m=+131.793493011 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cert") pod "cluster-baremetal-operator-5cdb4c5598-jghp5" (UID: "d831cb23-7411-4072-8273-c167d9afca28") : secret "cluster-baremetal-webhook-server-cert" not found
Mar 08 03:47:10.182959 master-0 kubenswrapper[4045]: E0308 03:47:10.182938 4045 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Mar 08 03:47:10.183128 master-0 kubenswrapper[4045]: E0308 03:47:10.183071 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0418ff42-7eac-4266-97b5-4df88623d066-cluster-monitoring-operator-tls podName:0418ff42-7eac-4266-97b5-4df88623d066 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:12.183060089 +0000 UTC m=+131.793761047 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/0418ff42-7eac-4266-97b5-4df88623d066-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-clqwj" (UID: "0418ff42-7eac-4266-97b5-4df88623d066") : secret "cluster-monitoring-operator-tls" not found
Mar 08 03:47:10.790480 master-0 kubenswrapper[4045]: E0308 03:47:10.790302 4045 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"authentication-operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9bd818e37e1f9dbe5393c557b89e81010d68171408e0e4157a3d92ae0ca1c953\\\"\"" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-slm72" podUID="a60bc804-52e7-422a-87fd-ac4c5aa90cb3"
Mar 08 03:47:10.791543 master-0 kubenswrapper[4045]: E0308 03:47:10.791516 4045 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-storage-operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9bd0b71d620cf0acbfcd1b58797dc30050bd167cb6b7a7f62c8333dd370c76d5\\\"\"" pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-nm8fj" podUID="b3eea925-73b3-4693-8f0e-6dd26107f60a"
Mar 08 03:47:12.112159 master-0 kubenswrapper[4045]: I0308 03:47:12.111802 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/6cde5024-edf7-4fa4-8964-cabe7899578b-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-c46zz\" (UID: \"6cde5024-edf7-4fa4-8964-cabe7899578b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-c46zz"
Mar 08 03:47:12.112159 master-0 kubenswrapper[4045]: E0308 03:47:12.112118 4045 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found
Mar 08 03:47:12.112922 master-0 kubenswrapper[4045]: E0308 03:47:12.112302 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6cde5024-edf7-4fa4-8964-cabe7899578b-package-server-manager-serving-cert podName:6cde5024-edf7-4fa4-8964-cabe7899578b nodeName:}" failed. No retries permitted until 2026-03-08 03:47:16.112259428 +0000 UTC m=+135.722960466 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/6cde5024-edf7-4fa4-8964-cabe7899578b-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-c46zz" (UID: "6cde5024-edf7-4fa4-8964-cabe7899578b") : secret "package-server-manager-serving-cert" not found
Mar 08 03:47:12.213284 master-0 kubenswrapper[4045]: I0308 03:47:12.213225 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e78b283b-981e-48d7-a5f2-53f8401766ea-auth-proxy-config\") pod \"machine-config-operator-fdb5c78b5-2vjh2\" (UID: \"e78b283b-981e-48d7-a5f2-53f8401766ea\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-2vjh2"
Mar 08 03:47:12.213284 master-0 kubenswrapper[4045]: I0308 03:47:12.213285 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8efdcef9-9b31-4567-b7f9-cb59a894273d-metrics-tls\") pod \"dns-operator-589895fbb7-xttlz\" (UID: \"8efdcef9-9b31-4567-b7f9-cb59a894273d\") " pod="openshift-dns-operator/dns-operator-589895fbb7-xttlz"
Mar 08 03:47:12.213530 master-0 kubenswrapper[4045]: I0308 03:47:12.213315 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/54ad284e-d40e-4e69-b898-f5093952a0e6-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-9sw2d\" (UID: \"54ad284e-d40e-4e69-b898-f5093952a0e6\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-9sw2d"
Mar 08 03:47:12.213530 master-0 kubenswrapper[4045]: E0308 03:47:12.213448 4045 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/kube-rbac-proxy: configmap "kube-rbac-proxy" not found
Mar 08 03:47:12.213530 master-0 kubenswrapper[4045]: I0308 03:47:12.213512 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-jghp5\" (UID: \"d831cb23-7411-4072-8273-c167d9afca28\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-jghp5"
Mar 08 03:47:12.213639 master-0 kubenswrapper[4045]: E0308 03:47:12.213561 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e78b283b-981e-48d7-a5f2-53f8401766ea-auth-proxy-config podName:e78b283b-981e-48d7-a5f2-53f8401766ea nodeName:}" failed. No retries permitted until 2026-03-08 03:47:16.213529284 +0000 UTC m=+135.824230282 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/e78b283b-981e-48d7-a5f2-53f8401766ea-auth-proxy-config") pod "machine-config-operator-fdb5c78b5-2vjh2" (UID: "e78b283b-981e-48d7-a5f2-53f8401766ea") : configmap "kube-rbac-proxy" not found
Mar 08 03:47:12.213639 master-0 kubenswrapper[4045]: I0308 03:47:12.213607 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1eb851be-f157-48ea-9a39-1361b68d2639-webhook-certs\") pod \"multus-admission-controller-8d675b596-j8pv6\" (UID: \"1eb851be-f157-48ea-9a39-1361b68d2639\") " pod="openshift-multus/multus-admission-controller-8d675b596-j8pv6"
Mar 08 03:47:12.213724 master-0 kubenswrapper[4045]: E0308 03:47:12.213674 4045 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-webhook-server-cert: secret "cluster-baremetal-webhook-server-cert" not found
Mar 08 03:47:12.213761 master-0 kubenswrapper[4045]: I0308 03:47:12.213673 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/0418ff42-7eac-4266-97b5-4df88623d066-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-clqwj\" (UID: \"0418ff42-7eac-4266-97b5-4df88623d066\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-clqwj"
Mar 08 03:47:12.213761 master-0 kubenswrapper[4045]: E0308 03:47:12.213748 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cert podName:d831cb23-7411-4072-8273-c167d9afca28 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:16.213722688 +0000 UTC m=+135.824423756 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cert") pod "cluster-baremetal-operator-5cdb4c5598-jghp5" (UID: "d831cb23-7411-4072-8273-c167d9afca28") : secret "cluster-baremetal-webhook-server-cert" not found
Mar 08 03:47:12.213878 master-0 kubenswrapper[4045]: I0308 03:47:12.213796 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/2dd4279d-a1a9-450a-a061-9008cd1ea8e0-srv-cert\") pod \"olm-operator-d64cfc9db-qddlp\" (UID: \"2dd4279d-a1a9-450a-a061-9008cd1ea8e0\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qddlp"
Mar 08 03:47:12.213878 master-0 kubenswrapper[4045]: E0308 03:47:12.213850 4045 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found
Mar 08 03:47:12.213956 master-0 kubenswrapper[4045]: I0308 03:47:12.213867 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-qjv52\" (UID: \"232c421d-96f0-4894-b8d8-74f43d02bbd3\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-qjv52"
Mar 08 03:47:12.213956 master-0 kubenswrapper[4045]: E0308 03:47:12.213930 4045 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found
Mar 08 03:47:12.213956 master-0 kubenswrapper[4045]: E0308 03:47:12.213940 4045 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found
Mar 08 03:47:12.214058 master-0 kubenswrapper[4045]: E0308 03:47:12.213933 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1eb851be-f157-48ea-9a39-1361b68d2639-webhook-certs podName:1eb851be-f157-48ea-9a39-1361b68d2639 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:16.213910033 +0000 UTC m=+135.824610991 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/1eb851be-f157-48ea-9a39-1361b68d2639-webhook-certs") pod "multus-admission-controller-8d675b596-j8pv6" (UID: "1eb851be-f157-48ea-9a39-1361b68d2639") : secret "multus-admission-controller-secret" not found
Mar 08 03:47:12.214058 master-0 kubenswrapper[4045]: E0308 03:47:12.213976 4045 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Mar 08 03:47:12.214058 master-0 kubenswrapper[4045]: E0308 03:47:12.213996 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/54ad284e-d40e-4e69-b898-f5093952a0e6-marketplace-operator-metrics podName:54ad284e-d40e-4e69-b898-f5093952a0e6 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:16.213980964 +0000 UTC m=+135.824682072 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/54ad284e-d40e-4e69-b898-f5093952a0e6-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-9sw2d" (UID: "54ad284e-d40e-4e69-b898-f5093952a0e6") : secret "marketplace-operator-metrics" not found
Mar 08 03:47:12.214058 master-0 kubenswrapper[4045]: E0308 03:47:12.214042 4045 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found
Mar 08 03:47:12.214206 master-0 kubenswrapper[4045]: I0308 03:47:12.214123 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e78b283b-981e-48d7-a5f2-53f8401766ea-proxy-tls\") pod \"machine-config-operator-fdb5c78b5-2vjh2\" (UID: \"e78b283b-981e-48d7-a5f2-53f8401766ea\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-2vjh2"
Mar 08 03:47:12.214206 master-0 kubenswrapper[4045]: E0308 03:47:12.214159 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8efdcef9-9b31-4567-b7f9-cb59a894273d-metrics-tls podName:8efdcef9-9b31-4567-b7f9-cb59a894273d nodeName:}" failed. No retries permitted until 2026-03-08 03:47:16.214114487 +0000 UTC m=+135.824815505 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8efdcef9-9b31-4567-b7f9-cb59a894273d-metrics-tls") pod "dns-operator-589895fbb7-xttlz" (UID: "8efdcef9-9b31-4567-b7f9-cb59a894273d") : secret "metrics-tls" not found
Mar 08 03:47:12.214206 master-0 kubenswrapper[4045]: E0308 03:47:12.214038 4045 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found
Mar 08 03:47:12.214206 master-0 kubenswrapper[4045]: I0308 03:47:12.214192 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1482d789-884b-4337-b598-f0e2b71eb9f2-srv-cert\") pod \"catalog-operator-7d9c49f57b-qlfgq\" (UID: \"1482d789-884b-4337-b598-f0e2b71eb9f2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-qlfgq"
Mar 08 03:47:12.214346 master-0 kubenswrapper[4045]: E0308 03:47:12.214212 4045 secret.go:189] Couldn't get secret openshift-machine-config-operator/mco-proxy-tls: secret "mco-proxy-tls" not found
Mar 08 03:47:12.214346 master-0 kubenswrapper[4045]: E0308 03:47:12.214222 4045 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found
Mar 08 03:47:12.214346 master-0 kubenswrapper[4045]: E0308 03:47:12.214228 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-apiservice-cert podName:232c421d-96f0-4894-b8d8-74f43d02bbd3 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:16.2142091 +0000 UTC m=+135.824910088 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-apiservice-cert") pod "cluster-node-tuning-operator-66c7586884-qjv52" (UID: "232c421d-96f0-4894-b8d8-74f43d02bbd3") : secret "performance-addon-operator-webhook-cert" not found
Mar 08 03:47:12.214346 master-0 kubenswrapper[4045]: E0308 03:47:12.214260 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1482d789-884b-4337-b598-f0e2b71eb9f2-srv-cert podName:1482d789-884b-4337-b598-f0e2b71eb9f2 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:16.214252291 +0000 UTC m=+135.824953369 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/1482d789-884b-4337-b598-f0e2b71eb9f2-srv-cert") pod "catalog-operator-7d9c49f57b-qlfgq" (UID: "1482d789-884b-4337-b598-f0e2b71eb9f2") : secret "catalog-operator-serving-cert" not found
Mar 08 03:47:12.214346 master-0 kubenswrapper[4045]: I0308 03:47:12.214259 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-qjv52\" (UID: \"232c421d-96f0-4894-b8d8-74f43d02bbd3\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-qjv52"
Mar 08 03:47:12.214346 master-0 kubenswrapper[4045]: E0308 03:47:12.214276 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2dd4279d-a1a9-450a-a061-9008cd1ea8e0-srv-cert podName:2dd4279d-a1a9-450a-a061-9008cd1ea8e0 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:16.214269031 +0000 UTC m=+135.824969989 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/2dd4279d-a1a9-450a-a061-9008cd1ea8e0-srv-cert") pod "olm-operator-d64cfc9db-qddlp" (UID: "2dd4279d-a1a9-450a-a061-9008cd1ea8e0") : secret "olm-operator-serving-cert" not found
Mar 08 03:47:12.214346 master-0 kubenswrapper[4045]: E0308 03:47:12.214288 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0418ff42-7eac-4266-97b5-4df88623d066-cluster-monitoring-operator-tls podName:0418ff42-7eac-4266-97b5-4df88623d066 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:16.214282771 +0000 UTC m=+135.824983729 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/0418ff42-7eac-4266-97b5-4df88623d066-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-clqwj" (UID: "0418ff42-7eac-4266-97b5-4df88623d066") : secret "cluster-monitoring-operator-tls" not found
Mar 08 03:47:12.214346 master-0 kubenswrapper[4045]: E0308 03:47:12.214305 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e78b283b-981e-48d7-a5f2-53f8401766ea-proxy-tls podName:e78b283b-981e-48d7-a5f2-53f8401766ea nodeName:}" failed. No retries permitted until 2026-03-08 03:47:16.214298782 +0000 UTC m=+135.824999740 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/e78b283b-981e-48d7-a5f2-53f8401766ea-proxy-tls") pod "machine-config-operator-fdb5c78b5-2vjh2" (UID: "e78b283b-981e-48d7-a5f2-53f8401766ea") : secret "mco-proxy-tls" not found
Mar 08 03:47:12.214629 master-0 kubenswrapper[4045]: E0308 03:47:12.214366 4045 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found
Mar 08 03:47:12.214629 master-0 kubenswrapper[4045]: E0308 03:47:12.214396 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-node-tuning-operator-tls podName:232c421d-96f0-4894-b8d8-74f43d02bbd3 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:16.214384824 +0000 UTC m=+135.825085782 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-node-tuning-operator-tls") pod "cluster-node-tuning-operator-66c7586884-qjv52" (UID: "232c421d-96f0-4894-b8d8-74f43d02bbd3") : secret "node-tuning-operator-tls" not found
Mar 08 03:47:12.214629 master-0 kubenswrapper[4045]: I0308 03:47:12.214391 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-jghp5\" (UID: \"d831cb23-7411-4072-8273-c167d9afca28\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-jghp5"
Mar 08 03:47:12.214629 master-0 kubenswrapper[4045]: I0308 03:47:12.214429 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c9de4939-680a-4e3e-89fd-e20ecb8b10f2-metrics-tls\") pod \"ingress-operator-677db989d6-t77qr\" (UID: \"c9de4939-680a-4e3e-89fd-e20ecb8b10f2\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-t77qr"
Mar 08 03:47:12.214629 master-0 kubenswrapper[4045]: I0308 03:47:12.214453 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-572xh\" (UID: \"69eb8ba2-7bfb-4433-8951-08f89e7bcb5f\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-572xh"
Mar 08 03:47:12.214629 master-0 kubenswrapper[4045]: E0308 03:47:12.214486 4045 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-operator-tls: secret "cluster-baremetal-operator-tls" not found
Mar 08 03:47:12.214629 master-0 kubenswrapper[4045]: E0308 03:47:12.214519 4045 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found
Mar 08 03:47:12.214629 master-0 kubenswrapper[4045]: E0308 03:47:12.214531 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cluster-baremetal-operator-tls podName:d831cb23-7411-4072-8273-c167d9afca28 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:16.214515337 +0000 UTC m=+135.825216335 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-baremetal-operator-tls" (UniqueName: "kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cluster-baremetal-operator-tls") pod "cluster-baremetal-operator-5cdb4c5598-jghp5" (UID: "d831cb23-7411-4072-8273-c167d9afca28") : secret "cluster-baremetal-operator-tls" not found
Mar 08 03:47:12.214629 master-0 kubenswrapper[4045]: E0308 03:47:12.214559 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f-image-registry-operator-tls podName:69eb8ba2-7bfb-4433-8951-08f89e7bcb5f nodeName:}" failed. No retries permitted until 2026-03-08 03:47:16.214547968 +0000 UTC m=+135.825248966 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f-image-registry-operator-tls") pod "cluster-image-registry-operator-86d6d77c7c-572xh" (UID: "69eb8ba2-7bfb-4433-8951-08f89e7bcb5f") : secret "image-registry-operator-tls" not found
Mar 08 03:47:12.214629 master-0 kubenswrapper[4045]: E0308 03:47:12.214611 4045 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found
Mar 08 03:47:12.215094 master-0 kubenswrapper[4045]: E0308 03:47:12.214678 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c9de4939-680a-4e3e-89fd-e20ecb8b10f2-metrics-tls podName:c9de4939-680a-4e3e-89fd-e20ecb8b10f2 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:16.21465816 +0000 UTC m=+135.825359118 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c9de4939-680a-4e3e-89fd-e20ecb8b10f2-metrics-tls") pod "ingress-operator-677db989d6-t77qr" (UID: "c9de4939-680a-4e3e-89fd-e20ecb8b10f2") : secret "metrics-tls" not found
Mar 08 03:47:12.315751 master-0 kubenswrapper[4045]: I0308 03:47:12.315673 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d5044ffd-0686-4679-9894-e696faf33699-metrics-certs\") pod \"network-metrics-daemon-schjl\" (UID: \"d5044ffd-0686-4679-9894-e696faf33699\") " pod="openshift-multus/network-metrics-daemon-schjl"
Mar 08 03:47:12.316126 master-0 kubenswrapper[4045]: E0308 03:47:12.315876 4045 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Mar 08 03:47:12.316126 master-0 kubenswrapper[4045]: E0308 03:47:12.315949 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d5044ffd-0686-4679-9894-e696faf33699-metrics-certs podName:d5044ffd-0686-4679-9894-e696faf33699 nodeName:}" failed. No retries permitted until 2026-03-08 03:48:16.315930047 +0000 UTC m=+195.926630995 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d5044ffd-0686-4679-9894-e696faf33699-metrics-certs") pod "network-metrics-daemon-schjl" (UID: "d5044ffd-0686-4679-9894-e696faf33699") : secret "metrics-daemon-secret" not found
Mar 08 03:47:16.163303 master-0 kubenswrapper[4045]: I0308 03:47:16.163133 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/6cde5024-edf7-4fa4-8964-cabe7899578b-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-c46zz\" (UID: \"6cde5024-edf7-4fa4-8964-cabe7899578b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-c46zz"
Mar 08 03:47:16.164213 master-0 kubenswrapper[4045]: E0308 03:47:16.163350 4045 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found
Mar 08 03:47:16.164213 master-0 kubenswrapper[4045]: E0308 03:47:16.163443 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6cde5024-edf7-4fa4-8964-cabe7899578b-package-server-manager-serving-cert podName:6cde5024-edf7-4fa4-8964-cabe7899578b nodeName:}" failed. No retries permitted until 2026-03-08 03:47:24.163421256 +0000 UTC m=+143.774122214 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/6cde5024-edf7-4fa4-8964-cabe7899578b-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-c46zz" (UID: "6cde5024-edf7-4fa4-8964-cabe7899578b") : secret "package-server-manager-serving-cert" not found Mar 08 03:47:16.265865 master-0 kubenswrapper[4045]: I0308 03:47:16.265024 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1482d789-884b-4337-b598-f0e2b71eb9f2-srv-cert\") pod \"catalog-operator-7d9c49f57b-qlfgq\" (UID: \"1482d789-884b-4337-b598-f0e2b71eb9f2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-qlfgq" Mar 08 03:47:16.265865 master-0 kubenswrapper[4045]: E0308 03:47:16.265154 4045 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 08 03:47:16.265865 master-0 kubenswrapper[4045]: E0308 03:47:16.265592 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1482d789-884b-4337-b598-f0e2b71eb9f2-srv-cert podName:1482d789-884b-4337-b598-f0e2b71eb9f2 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:24.265572823 +0000 UTC m=+143.876273781 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/1482d789-884b-4337-b598-f0e2b71eb9f2-srv-cert") pod "catalog-operator-7d9c49f57b-qlfgq" (UID: "1482d789-884b-4337-b598-f0e2b71eb9f2") : secret "catalog-operator-serving-cert" not found Mar 08 03:47:16.265865 master-0 kubenswrapper[4045]: E0308 03:47:16.265652 4045 secret.go:189] Couldn't get secret openshift-machine-config-operator/mco-proxy-tls: secret "mco-proxy-tls" not found Mar 08 03:47:16.265865 master-0 kubenswrapper[4045]: E0308 03:47:16.265712 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e78b283b-981e-48d7-a5f2-53f8401766ea-proxy-tls podName:e78b283b-981e-48d7-a5f2-53f8401766ea nodeName:}" failed. No retries permitted until 2026-03-08 03:47:24.265693597 +0000 UTC m=+143.876394645 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/e78b283b-981e-48d7-a5f2-53f8401766ea-proxy-tls") pod "machine-config-operator-fdb5c78b5-2vjh2" (UID: "e78b283b-981e-48d7-a5f2-53f8401766ea") : secret "mco-proxy-tls" not found Mar 08 03:47:16.265865 master-0 kubenswrapper[4045]: I0308 03:47:16.265522 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e78b283b-981e-48d7-a5f2-53f8401766ea-proxy-tls\") pod \"machine-config-operator-fdb5c78b5-2vjh2\" (UID: \"e78b283b-981e-48d7-a5f2-53f8401766ea\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-2vjh2" Mar 08 03:47:16.265865 master-0 kubenswrapper[4045]: I0308 03:47:16.265751 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-qjv52\" (UID: \"232c421d-96f0-4894-b8d8-74f43d02bbd3\") " 
pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-qjv52" Mar 08 03:47:16.265865 master-0 kubenswrapper[4045]: I0308 03:47:16.265847 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c9de4939-680a-4e3e-89fd-e20ecb8b10f2-metrics-tls\") pod \"ingress-operator-677db989d6-t77qr\" (UID: \"c9de4939-680a-4e3e-89fd-e20ecb8b10f2\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-t77qr" Mar 08 03:47:16.266309 master-0 kubenswrapper[4045]: E0308 03:47:16.265909 4045 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 08 03:47:16.266309 master-0 kubenswrapper[4045]: E0308 03:47:16.265937 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c9de4939-680a-4e3e-89fd-e20ecb8b10f2-metrics-tls podName:c9de4939-680a-4e3e-89fd-e20ecb8b10f2 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:24.265928542 +0000 UTC m=+143.876629500 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c9de4939-680a-4e3e-89fd-e20ecb8b10f2-metrics-tls") pod "ingress-operator-677db989d6-t77qr" (UID: "c9de4939-680a-4e3e-89fd-e20ecb8b10f2") : secret "metrics-tls" not found Mar 08 03:47:16.266309 master-0 kubenswrapper[4045]: E0308 03:47:16.265965 4045 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 08 03:47:16.266309 master-0 kubenswrapper[4045]: E0308 03:47:16.265999 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-node-tuning-operator-tls podName:232c421d-96f0-4894-b8d8-74f43d02bbd3 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:24.265991433 +0000 UTC m=+143.876692501 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-node-tuning-operator-tls") pod "cluster-node-tuning-operator-66c7586884-qjv52" (UID: "232c421d-96f0-4894-b8d8-74f43d02bbd3") : secret "node-tuning-operator-tls" not found Mar 08 03:47:16.266309 master-0 kubenswrapper[4045]: I0308 03:47:16.265996 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-jghp5\" (UID: \"d831cb23-7411-4072-8273-c167d9afca28\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-jghp5" Mar 08 03:47:16.266309 master-0 kubenswrapper[4045]: I0308 03:47:16.266039 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-572xh\" (UID: \"69eb8ba2-7bfb-4433-8951-08f89e7bcb5f\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-572xh" Mar 08 03:47:16.266309 master-0 kubenswrapper[4045]: E0308 03:47:16.266104 4045 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-operator-tls: secret "cluster-baremetal-operator-tls" not found Mar 08 03:47:16.266309 master-0 kubenswrapper[4045]: E0308 03:47:16.266148 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cluster-baremetal-operator-tls podName:d831cb23-7411-4072-8273-c167d9afca28 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:24.266132957 +0000 UTC m=+143.876833935 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cluster-baremetal-operator-tls" (UniqueName: "kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cluster-baremetal-operator-tls") pod "cluster-baremetal-operator-5cdb4c5598-jghp5" (UID: "d831cb23-7411-4072-8273-c167d9afca28") : secret "cluster-baremetal-operator-tls" not found Mar 08 03:47:16.266309 master-0 kubenswrapper[4045]: I0308 03:47:16.266180 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e78b283b-981e-48d7-a5f2-53f8401766ea-auth-proxy-config\") pod \"machine-config-operator-fdb5c78b5-2vjh2\" (UID: \"e78b283b-981e-48d7-a5f2-53f8401766ea\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-2vjh2" Mar 08 03:47:16.266309 master-0 kubenswrapper[4045]: I0308 03:47:16.266213 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8efdcef9-9b31-4567-b7f9-cb59a894273d-metrics-tls\") pod \"dns-operator-589895fbb7-xttlz\" (UID: \"8efdcef9-9b31-4567-b7f9-cb59a894273d\") " pod="openshift-dns-operator/dns-operator-589895fbb7-xttlz" Mar 08 03:47:16.266309 master-0 kubenswrapper[4045]: I0308 03:47:16.266242 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/54ad284e-d40e-4e69-b898-f5093952a0e6-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-9sw2d\" (UID: \"54ad284e-d40e-4e69-b898-f5093952a0e6\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-9sw2d" Mar 08 03:47:16.266309 master-0 kubenswrapper[4045]: I0308 03:47:16.266267 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-jghp5\" (UID: \"d831cb23-7411-4072-8273-c167d9afca28\") " 
pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-jghp5" Mar 08 03:47:16.266309 master-0 kubenswrapper[4045]: I0308 03:47:16.266293 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/0418ff42-7eac-4266-97b5-4df88623d066-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-clqwj\" (UID: \"0418ff42-7eac-4266-97b5-4df88623d066\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-clqwj" Mar 08 03:47:16.266727 master-0 kubenswrapper[4045]: I0308 03:47:16.266317 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1eb851be-f157-48ea-9a39-1361b68d2639-webhook-certs\") pod \"multus-admission-controller-8d675b596-j8pv6\" (UID: \"1eb851be-f157-48ea-9a39-1361b68d2639\") " pod="openshift-multus/multus-admission-controller-8d675b596-j8pv6" Mar 08 03:47:16.266727 master-0 kubenswrapper[4045]: I0308 03:47:16.266349 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/2dd4279d-a1a9-450a-a061-9008cd1ea8e0-srv-cert\") pod \"olm-operator-d64cfc9db-qddlp\" (UID: \"2dd4279d-a1a9-450a-a061-9008cd1ea8e0\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qddlp" Mar 08 03:47:16.266727 master-0 kubenswrapper[4045]: I0308 03:47:16.266417 4045 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-qjv52\" (UID: \"232c421d-96f0-4894-b8d8-74f43d02bbd3\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-qjv52" Mar 08 03:47:16.266727 master-0 kubenswrapper[4045]: E0308 03:47:16.266528 4045 secret.go:189] Couldn't get secret 
openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 08 03:47:16.266727 master-0 kubenswrapper[4045]: E0308 03:47:16.266560 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-apiservice-cert podName:232c421d-96f0-4894-b8d8-74f43d02bbd3 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:24.266549626 +0000 UTC m=+143.877250584 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-apiservice-cert") pod "cluster-node-tuning-operator-66c7586884-qjv52" (UID: "232c421d-96f0-4894-b8d8-74f43d02bbd3") : secret "performance-addon-operator-webhook-cert" not found Mar 08 03:47:16.266727 master-0 kubenswrapper[4045]: E0308 03:47:16.266604 4045 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Mar 08 03:47:16.266727 master-0 kubenswrapper[4045]: E0308 03:47:16.266639 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f-image-registry-operator-tls podName:69eb8ba2-7bfb-4433-8951-08f89e7bcb5f nodeName:}" failed. No retries permitted until 2026-03-08 03:47:24.266629438 +0000 UTC m=+143.877330586 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f-image-registry-operator-tls") pod "cluster-image-registry-operator-86d6d77c7c-572xh" (UID: "69eb8ba2-7bfb-4433-8951-08f89e7bcb5f") : secret "image-registry-operator-tls" not found Mar 08 03:47:16.266727 master-0 kubenswrapper[4045]: E0308 03:47:16.266675 4045 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/kube-rbac-proxy: configmap "kube-rbac-proxy" not found Mar 08 03:47:16.266727 master-0 kubenswrapper[4045]: E0308 03:47:16.266697 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e78b283b-981e-48d7-a5f2-53f8401766ea-auth-proxy-config podName:e78b283b-981e-48d7-a5f2-53f8401766ea nodeName:}" failed. No retries permitted until 2026-03-08 03:47:24.26669006 +0000 UTC m=+143.877391218 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/e78b283b-981e-48d7-a5f2-53f8401766ea-auth-proxy-config") pod "machine-config-operator-fdb5c78b5-2vjh2" (UID: "e78b283b-981e-48d7-a5f2-53f8401766ea") : configmap "kube-rbac-proxy" not found Mar 08 03:47:16.267030 master-0 kubenswrapper[4045]: E0308 03:47:16.266741 4045 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 08 03:47:16.267030 master-0 kubenswrapper[4045]: E0308 03:47:16.266765 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8efdcef9-9b31-4567-b7f9-cb59a894273d-metrics-tls podName:8efdcef9-9b31-4567-b7f9-cb59a894273d nodeName:}" failed. No retries permitted until 2026-03-08 03:47:24.266756171 +0000 UTC m=+143.877457139 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8efdcef9-9b31-4567-b7f9-cb59a894273d-metrics-tls") pod "dns-operator-589895fbb7-xttlz" (UID: "8efdcef9-9b31-4567-b7f9-cb59a894273d") : secret "metrics-tls" not found Mar 08 03:47:16.267030 master-0 kubenswrapper[4045]: E0308 03:47:16.266813 4045 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 08 03:47:16.267030 master-0 kubenswrapper[4045]: E0308 03:47:16.266859 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/54ad284e-d40e-4e69-b898-f5093952a0e6-marketplace-operator-metrics podName:54ad284e-d40e-4e69-b898-f5093952a0e6 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:24.266851513 +0000 UTC m=+143.877552471 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/54ad284e-d40e-4e69-b898-f5093952a0e6-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-9sw2d" (UID: "54ad284e-d40e-4e69-b898-f5093952a0e6") : secret "marketplace-operator-metrics" not found Mar 08 03:47:16.267030 master-0 kubenswrapper[4045]: E0308 03:47:16.266923 4045 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-webhook-server-cert: secret "cluster-baremetal-webhook-server-cert" not found Mar 08 03:47:16.267030 master-0 kubenswrapper[4045]: E0308 03:47:16.266938 4045 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 08 03:47:16.267030 master-0 kubenswrapper[4045]: E0308 03:47:16.266943 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cert podName:d831cb23-7411-4072-8273-c167d9afca28 nodeName:}" failed. 
No retries permitted until 2026-03-08 03:47:24.266936855 +0000 UTC m=+143.877637933 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cert") pod "cluster-baremetal-operator-5cdb4c5598-jghp5" (UID: "d831cb23-7411-4072-8273-c167d9afca28") : secret "cluster-baremetal-webhook-server-cert" not found Mar 08 03:47:16.267030 master-0 kubenswrapper[4045]: E0308 03:47:16.266978 4045 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 08 03:47:16.267030 master-0 kubenswrapper[4045]: E0308 03:47:16.267014 4045 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 08 03:47:16.267030 master-0 kubenswrapper[4045]: E0308 03:47:16.266986 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2dd4279d-a1a9-450a-a061-9008cd1ea8e0-srv-cert podName:2dd4279d-a1a9-450a-a061-9008cd1ea8e0 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:24.266976076 +0000 UTC m=+143.877677044 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/2dd4279d-a1a9-450a-a061-9008cd1ea8e0-srv-cert") pod "olm-operator-d64cfc9db-qddlp" (UID: "2dd4279d-a1a9-450a-a061-9008cd1ea8e0") : secret "olm-operator-serving-cert" not found Mar 08 03:47:16.267308 master-0 kubenswrapper[4045]: E0308 03:47:16.267051 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0418ff42-7eac-4266-97b5-4df88623d066-cluster-monitoring-operator-tls podName:0418ff42-7eac-4266-97b5-4df88623d066 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:24.267041878 +0000 UTC m=+143.877742976 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/0418ff42-7eac-4266-97b5-4df88623d066-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-clqwj" (UID: "0418ff42-7eac-4266-97b5-4df88623d066") : secret "cluster-monitoring-operator-tls" not found Mar 08 03:47:16.267308 master-0 kubenswrapper[4045]: E0308 03:47:16.267065 4045 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1eb851be-f157-48ea-9a39-1361b68d2639-webhook-certs podName:1eb851be-f157-48ea-9a39-1361b68d2639 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:24.267058808 +0000 UTC m=+143.877759896 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/1eb851be-f157-48ea-9a39-1361b68d2639-webhook-certs") pod "multus-admission-controller-8d675b596-j8pv6" (UID: "1eb851be-f157-48ea-9a39-1361b68d2639") : secret "multus-admission-controller-secret" not found Mar 08 03:47:16.924793 master-0 systemd[1]: Stopping Kubernetes Kubelet... Mar 08 03:47:16.950788 master-0 systemd[1]: kubelet.service: Deactivated successfully. Mar 08 03:47:16.951417 master-0 systemd[1]: Stopped Kubernetes Kubelet. Mar 08 03:47:16.959447 master-0 systemd[1]: kubelet.service: Consumed 11.215s CPU time. Mar 08 03:47:16.982471 master-0 systemd[1]: Starting Kubernetes Kubelet... Mar 08 03:47:17.099267 master-0 kubenswrapper[7547]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 08 03:47:17.099267 master-0 kubenswrapper[7547]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. 
Mar 08 03:47:17.099267 master-0 kubenswrapper[7547]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 08 03:47:17.099267 master-0 kubenswrapper[7547]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 08 03:47:17.100651 master-0 kubenswrapper[7547]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Mar 08 03:47:17.100651 master-0 kubenswrapper[7547]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
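The deprecation warnings repeated at every kubelet start all point at the same remedy: move the flags into the KubeletConfiguration file referenced by `--config`. A hedged sketch of that migration, using field names from the upstream `kubelet.config.k8s.io/v1beta1` API (the values below are illustrative placeholders, not the ones this cluster actually uses):

```yaml
# Hypothetical KubeletConfiguration fragment; values are placeholders.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock   # replaces --container-runtime-endpoint
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec  # replaces --volume-plugin-dir
registerWithTaints:                                        # replaces --register-with-taints
  - key: node-role.kubernetes.io/master
    effect: NoSchedule
systemReserved:                                            # replaces --system-reserved
  cpu: 500m
  memory: 1Gi
```

On OpenShift these settings are normally managed through the Machine Config Operator rather than edited by hand, so the warnings here are expected noise rather than something to fix on the node.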
Mar 08 03:47:17.100651 master-0 kubenswrapper[7547]: I0308 03:47:17.099397 7547 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 08 03:47:17.103668 master-0 kubenswrapper[7547]: W0308 03:47:17.103471 7547 feature_gate.go:330] unrecognized feature gate: Example
Mar 08 03:47:17.103668 master-0 kubenswrapper[7547]: W0308 03:47:17.103582 7547 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 08 03:47:17.103668 master-0 kubenswrapper[7547]: W0308 03:47:17.103592 7547 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 08 03:47:17.103668 master-0 kubenswrapper[7547]: W0308 03:47:17.103598 7547 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 08 03:47:17.103668 master-0 kubenswrapper[7547]: W0308 03:47:17.103604 7547 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 08 03:47:17.103668 master-0 kubenswrapper[7547]: W0308 03:47:17.103667 7547 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 08 03:47:17.103904 master-0 kubenswrapper[7547]: W0308 03:47:17.103676 7547 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 08 03:47:17.103904 master-0 kubenswrapper[7547]: W0308 03:47:17.103682 7547 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 08 03:47:17.103904 master-0 kubenswrapper[7547]: W0308 03:47:17.103687 7547 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 08 03:47:17.103904 master-0 kubenswrapper[7547]: W0308 03:47:17.103694 7547 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 08 03:47:17.103904 master-0 kubenswrapper[7547]: W0308 03:47:17.103701 7547 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 08 03:47:17.103904 master-0 kubenswrapper[7547]: W0308 03:47:17.103709 7547 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 08 03:47:17.103904 master-0 kubenswrapper[7547]: W0308 03:47:17.103713 7547 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 08 03:47:17.103904 master-0 kubenswrapper[7547]: W0308 03:47:17.103717 7547 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 08 03:47:17.103904 master-0 kubenswrapper[7547]: W0308 03:47:17.103741 7547 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 08 03:47:17.103904 master-0 kubenswrapper[7547]: W0308 03:47:17.103745 7547 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 08 03:47:17.103904 master-0 kubenswrapper[7547]: W0308 03:47:17.103749 7547 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 08 03:47:17.103904 master-0 kubenswrapper[7547]: W0308 03:47:17.103791 7547 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 08 03:47:17.103904 master-0 kubenswrapper[7547]: W0308 03:47:17.103797 7547 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 08 03:47:17.103904 master-0 kubenswrapper[7547]: W0308 03:47:17.103835 7547 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 08 03:47:17.104542 master-0 kubenswrapper[7547]: W0308 03:47:17.103956 7547 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 08 03:47:17.104542 master-0 kubenswrapper[7547]: W0308 03:47:17.104201 7547 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 08 03:47:17.104542 master-0 kubenswrapper[7547]: W0308 03:47:17.104210 7547 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 08 03:47:17.104542 master-0 kubenswrapper[7547]: W0308 03:47:17.104238 7547 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 08 03:47:17.104542 master-0 kubenswrapper[7547]: W0308 03:47:17.104245 7547 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 08 03:47:17.104542 master-0 kubenswrapper[7547]: W0308 03:47:17.104254 7547 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 08 03:47:17.104542 master-0 kubenswrapper[7547]: W0308 03:47:17.104260 7547 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 08 03:47:17.104542 master-0 kubenswrapper[7547]: W0308 03:47:17.104265 7547 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 08 03:47:17.104542 master-0 kubenswrapper[7547]: W0308 03:47:17.104270 7547 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 08 03:47:17.104542 master-0 kubenswrapper[7547]: W0308 03:47:17.104276 7547 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 08 03:47:17.104542 master-0 kubenswrapper[7547]: W0308 03:47:17.104281 7547 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 08 03:47:17.104542 master-0 kubenswrapper[7547]: W0308 03:47:17.104286 7547 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 08 03:47:17.104542 master-0 kubenswrapper[7547]: W0308 03:47:17.104306 7547 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 08 03:47:17.104542 master-0 kubenswrapper[7547]: W0308 03:47:17.104314 7547 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 08 03:47:17.104542 master-0 kubenswrapper[7547]: W0308 03:47:17.104319 7547 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 08 03:47:17.104542 master-0 kubenswrapper[7547]: W0308 03:47:17.104324 7547 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 08 03:47:17.104542 master-0 kubenswrapper[7547]: W0308 03:47:17.104328 7547 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 08 03:47:17.104542 master-0 kubenswrapper[7547]: W0308 03:47:17.104333 7547 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 08 03:47:17.104542 master-0 kubenswrapper[7547]: W0308 03:47:17.104338 7547 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 08 03:47:17.105175 master-0 kubenswrapper[7547]: W0308 03:47:17.104342 7547 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 08 03:47:17.105175 master-0 kubenswrapper[7547]: W0308 03:47:17.104348 7547 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
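The long runs of "unrecognized feature gate" warnings arise because the kubelet validates the feature-gate map it is handed against the gates compiled into its own binary; OpenShift passes a cluster-wide gate set, much of which belongs to other components, so the kubelet warns and continues rather than failing. A minimal sketch of that validation behavior, with a hypothetical two-entry gate registry standing in for the kubelet's real table (this mimics the warnings, not the kubelet's actual implementation):

```python
# Hypothetical subset of a feature-gate registry: name -> lifecycle stage.
KNOWN_GATES = {
    "CloudDualStackNodeIPs": "GA",
    "KMSv1": "Deprecated",
}

def apply_feature_gates(requested: dict[str, bool]) -> list[str]:
    """Return the warning messages a gate map would produce:
    unknown gates are skipped with a warning, and setting GA or
    deprecated gates warns about their eventual removal."""
    messages = []
    for name, value in requested.items():
        if name not in KNOWN_GATES:
            messages.append(f"unrecognized feature gate: {name}")
            continue
        stage = KNOWN_GATES[name]
        if stage == "GA" and value:
            messages.append(
                f"Setting GA feature gate {name}=true. "
                "It will be removed in a future release.")
        elif stage == "Deprecated":
            messages.append(
                f"Setting deprecated feature gate {name}={str(value).lower()}. "
                "It will be removed in a future release.")
    return messages
```

The key design point visible in the log: unknown gates are warnings, not errors, so a shared cluster-wide gate list can be fed to every component without each one failing on gates it does not implement.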
Mar 08 03:47:17.105175 master-0 kubenswrapper[7547]: W0308 03:47:17.104354 7547 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 08 03:47:17.105175 master-0 kubenswrapper[7547]: W0308 03:47:17.104358 7547 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 08 03:47:17.105175 master-0 kubenswrapper[7547]: W0308 03:47:17.104363 7547 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 08 03:47:17.105175 master-0 kubenswrapper[7547]: W0308 03:47:17.104370 7547 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 08 03:47:17.105175 master-0 kubenswrapper[7547]: W0308 03:47:17.104375 7547 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 08 03:47:17.105175 master-0 kubenswrapper[7547]: W0308 03:47:17.104380 7547 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 08 03:47:17.105175 master-0 kubenswrapper[7547]: W0308 03:47:17.104385 7547 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 08 03:47:17.105175 master-0 kubenswrapper[7547]: W0308 03:47:17.104390 7547 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 08 03:47:17.105175 master-0 kubenswrapper[7547]: W0308 03:47:17.104434 7547 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 08 03:47:17.105175 master-0 kubenswrapper[7547]: W0308 03:47:17.104737 7547 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 08 03:47:17.105175 master-0 kubenswrapper[7547]: W0308 03:47:17.104745 7547 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 08 03:47:17.105175 master-0 kubenswrapper[7547]: W0308 03:47:17.104750 7547 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 08 03:47:17.105175 master-0 kubenswrapper[7547]: W0308 03:47:17.104755 7547 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 08 03:47:17.105175 master-0 kubenswrapper[7547]: W0308 03:47:17.104761 7547 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 08 03:47:17.105175 master-0 kubenswrapper[7547]: W0308 03:47:17.104765 7547 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 08 03:47:17.105175 master-0 kubenswrapper[7547]: W0308 03:47:17.104784 7547 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 08 03:47:17.105175 master-0 kubenswrapper[7547]: W0308 03:47:17.104788 7547 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 08 03:47:17.105175 master-0 kubenswrapper[7547]: W0308 03:47:17.104792 7547 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 08 03:47:17.105899 master-0 kubenswrapper[7547]: W0308 03:47:17.104800 7547 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 08 03:47:17.105899 master-0 kubenswrapper[7547]: W0308 03:47:17.104809 7547 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 08 03:47:17.105899 master-0 kubenswrapper[7547]: W0308 03:47:17.104813 7547 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 08 03:47:17.105899 master-0 kubenswrapper[7547]: W0308 03:47:17.104894 7547 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 08 03:47:17.105899 master-0 kubenswrapper[7547]: W0308 03:47:17.104899 7547 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 08 03:47:17.105899 master-0 kubenswrapper[7547]: W0308 03:47:17.104904 7547 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 08 03:47:17.105899 master-0 kubenswrapper[7547]: W0308 03:47:17.104908 7547 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 08 03:47:17.105899 master-0 kubenswrapper[7547]: W0308 03:47:17.104912 7547 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 08 03:47:17.105899 master-0 kubenswrapper[7547]: W0308 03:47:17.104916 7547 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 08 03:47:17.105899 master-0 kubenswrapper[7547]: W0308 03:47:17.104922 7547 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 08 03:47:17.105899 master-0 kubenswrapper[7547]: W0308 03:47:17.104946 7547 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 08 03:47:17.105899 master-0 kubenswrapper[7547]: W0308 03:47:17.104950 7547 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 08 03:47:17.105899 master-0 kubenswrapper[7547]: W0308 03:47:17.104955 7547 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 08 03:47:17.105899 master-0 kubenswrapper[7547]: I0308 03:47:17.105697 7547 flags.go:64] FLAG: --address="0.0.0.0"
Mar 08 03:47:17.105899 master-0 kubenswrapper[7547]: I0308 03:47:17.105724 7547 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 08 03:47:17.105899 master-0 kubenswrapper[7547]: I0308 03:47:17.105739 7547 flags.go:64] FLAG: --anonymous-auth="true"
Mar 08 03:47:17.105899 master-0 kubenswrapper[7547]: I0308 03:47:17.105748 7547 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 08 03:47:17.105899 master-0 kubenswrapper[7547]: I0308 03:47:17.105761 7547 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 08 03:47:17.105899 master-0 kubenswrapper[7547]: I0308 03:47:17.105767 7547 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 08 03:47:17.105899 master-0 kubenswrapper[7547]: I0308 03:47:17.105778 7547 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 08 03:47:17.105899 master-0 kubenswrapper[7547]: I0308 03:47:17.105787 7547 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 08 03:47:17.106696 master-0 kubenswrapper[7547]: I0308 03:47:17.105792 7547 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 08 03:47:17.106696 master-0 kubenswrapper[7547]: I0308 03:47:17.105798 7547 flags.go:64] FLAG:
--boot-id-file="/proc/sys/kernel/random/boot_id" Mar 08 03:47:17.106696 master-0 kubenswrapper[7547]: I0308 03:47:17.105804 7547 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Mar 08 03:47:17.106696 master-0 kubenswrapper[7547]: I0308 03:47:17.105811 7547 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Mar 08 03:47:17.106696 master-0 kubenswrapper[7547]: I0308 03:47:17.106060 7547 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Mar 08 03:47:17.106696 master-0 kubenswrapper[7547]: I0308 03:47:17.106188 7547 flags.go:64] FLAG: --cgroup-root="" Mar 08 03:47:17.106696 master-0 kubenswrapper[7547]: I0308 03:47:17.106196 7547 flags.go:64] FLAG: --cgroups-per-qos="true" Mar 08 03:47:17.106696 master-0 kubenswrapper[7547]: I0308 03:47:17.106277 7547 flags.go:64] FLAG: --client-ca-file="" Mar 08 03:47:17.106696 master-0 kubenswrapper[7547]: I0308 03:47:17.106289 7547 flags.go:64] FLAG: --cloud-config="" Mar 08 03:47:17.106696 master-0 kubenswrapper[7547]: I0308 03:47:17.106297 7547 flags.go:64] FLAG: --cloud-provider="" Mar 08 03:47:17.106696 master-0 kubenswrapper[7547]: I0308 03:47:17.106305 7547 flags.go:64] FLAG: --cluster-dns="[]" Mar 08 03:47:17.106696 master-0 kubenswrapper[7547]: I0308 03:47:17.106320 7547 flags.go:64] FLAG: --cluster-domain="" Mar 08 03:47:17.106696 master-0 kubenswrapper[7547]: I0308 03:47:17.106328 7547 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Mar 08 03:47:17.106696 master-0 kubenswrapper[7547]: I0308 03:47:17.106337 7547 flags.go:64] FLAG: --config-dir="" Mar 08 03:47:17.106696 master-0 kubenswrapper[7547]: I0308 03:47:17.106352 7547 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Mar 08 03:47:17.106696 master-0 kubenswrapper[7547]: I0308 03:47:17.106363 7547 flags.go:64] FLAG: --container-log-max-files="5" Mar 08 03:47:17.106696 master-0 kubenswrapper[7547]: I0308 03:47:17.106390 7547 flags.go:64] FLAG: --container-log-max-size="10Mi" Mar 08 03:47:17.106696 master-0 
kubenswrapper[7547]: I0308 03:47:17.106397 7547 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Mar 08 03:47:17.106696 master-0 kubenswrapper[7547]: I0308 03:47:17.106404 7547 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Mar 08 03:47:17.106696 master-0 kubenswrapper[7547]: I0308 03:47:17.106411 7547 flags.go:64] FLAG: --containerd-namespace="k8s.io" Mar 08 03:47:17.106696 master-0 kubenswrapper[7547]: I0308 03:47:17.106418 7547 flags.go:64] FLAG: --contention-profiling="false" Mar 08 03:47:17.106696 master-0 kubenswrapper[7547]: I0308 03:47:17.106426 7547 flags.go:64] FLAG: --cpu-cfs-quota="true" Mar 08 03:47:17.106696 master-0 kubenswrapper[7547]: I0308 03:47:17.106437 7547 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Mar 08 03:47:17.106696 master-0 kubenswrapper[7547]: I0308 03:47:17.106447 7547 flags.go:64] FLAG: --cpu-manager-policy="none" Mar 08 03:47:17.106696 master-0 kubenswrapper[7547]: I0308 03:47:17.106453 7547 flags.go:64] FLAG: --cpu-manager-policy-options="" Mar 08 03:47:17.107887 master-0 kubenswrapper[7547]: I0308 03:47:17.106488 7547 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Mar 08 03:47:17.107887 master-0 kubenswrapper[7547]: I0308 03:47:17.106495 7547 flags.go:64] FLAG: --enable-controller-attach-detach="true" Mar 08 03:47:17.107887 master-0 kubenswrapper[7547]: I0308 03:47:17.106501 7547 flags.go:64] FLAG: --enable-debugging-handlers="true" Mar 08 03:47:17.107887 master-0 kubenswrapper[7547]: I0308 03:47:17.106508 7547 flags.go:64] FLAG: --enable-load-reader="false" Mar 08 03:47:17.107887 master-0 kubenswrapper[7547]: I0308 03:47:17.106516 7547 flags.go:64] FLAG: --enable-server="true" Mar 08 03:47:17.107887 master-0 kubenswrapper[7547]: I0308 03:47:17.106523 7547 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Mar 08 03:47:17.107887 master-0 kubenswrapper[7547]: I0308 03:47:17.106536 7547 flags.go:64] FLAG: --event-burst="100" Mar 08 03:47:17.107887 master-0 
kubenswrapper[7547]: I0308 03:47:17.106543 7547 flags.go:64] FLAG: --event-qps="50" Mar 08 03:47:17.107887 master-0 kubenswrapper[7547]: I0308 03:47:17.106550 7547 flags.go:64] FLAG: --event-storage-age-limit="default=0" Mar 08 03:47:17.107887 master-0 kubenswrapper[7547]: I0308 03:47:17.106557 7547 flags.go:64] FLAG: --event-storage-event-limit="default=0" Mar 08 03:47:17.107887 master-0 kubenswrapper[7547]: I0308 03:47:17.106564 7547 flags.go:64] FLAG: --eviction-hard="" Mar 08 03:47:17.107887 master-0 kubenswrapper[7547]: I0308 03:47:17.106573 7547 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Mar 08 03:47:17.107887 master-0 kubenswrapper[7547]: I0308 03:47:17.106580 7547 flags.go:64] FLAG: --eviction-minimum-reclaim="" Mar 08 03:47:17.107887 master-0 kubenswrapper[7547]: I0308 03:47:17.106586 7547 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Mar 08 03:47:17.107887 master-0 kubenswrapper[7547]: I0308 03:47:17.106597 7547 flags.go:64] FLAG: --eviction-soft="" Mar 08 03:47:17.107887 master-0 kubenswrapper[7547]: I0308 03:47:17.106603 7547 flags.go:64] FLAG: --eviction-soft-grace-period="" Mar 08 03:47:17.107887 master-0 kubenswrapper[7547]: I0308 03:47:17.106610 7547 flags.go:64] FLAG: --exit-on-lock-contention="false" Mar 08 03:47:17.107887 master-0 kubenswrapper[7547]: I0308 03:47:17.106616 7547 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Mar 08 03:47:17.107887 master-0 kubenswrapper[7547]: I0308 03:47:17.106650 7547 flags.go:64] FLAG: --experimental-mounter-path="" Mar 08 03:47:17.107887 master-0 kubenswrapper[7547]: I0308 03:47:17.106658 7547 flags.go:64] FLAG: --fail-cgroupv1="false" Mar 08 03:47:17.107887 master-0 kubenswrapper[7547]: I0308 03:47:17.106665 7547 flags.go:64] FLAG: --fail-swap-on="true" Mar 08 03:47:17.107887 master-0 kubenswrapper[7547]: I0308 03:47:17.106671 7547 flags.go:64] FLAG: --feature-gates="" Mar 08 03:47:17.107887 master-0 kubenswrapper[7547]: I0308 03:47:17.106684 7547 
flags.go:64] FLAG: --file-check-frequency="20s" Mar 08 03:47:17.107887 master-0 kubenswrapper[7547]: I0308 03:47:17.106691 7547 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Mar 08 03:47:17.107887 master-0 kubenswrapper[7547]: I0308 03:47:17.106698 7547 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Mar 08 03:47:17.108621 master-0 kubenswrapper[7547]: I0308 03:47:17.106705 7547 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Mar 08 03:47:17.108621 master-0 kubenswrapper[7547]: I0308 03:47:17.106714 7547 flags.go:64] FLAG: --healthz-port="10248" Mar 08 03:47:17.108621 master-0 kubenswrapper[7547]: I0308 03:47:17.106721 7547 flags.go:64] FLAG: --help="false" Mar 08 03:47:17.108621 master-0 kubenswrapper[7547]: I0308 03:47:17.106728 7547 flags.go:64] FLAG: --hostname-override="" Mar 08 03:47:17.108621 master-0 kubenswrapper[7547]: I0308 03:47:17.106735 7547 flags.go:64] FLAG: --housekeeping-interval="10s" Mar 08 03:47:17.108621 master-0 kubenswrapper[7547]: I0308 03:47:17.106742 7547 flags.go:64] FLAG: --http-check-frequency="20s" Mar 08 03:47:17.108621 master-0 kubenswrapper[7547]: I0308 03:47:17.106752 7547 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Mar 08 03:47:17.108621 master-0 kubenswrapper[7547]: I0308 03:47:17.106759 7547 flags.go:64] FLAG: --image-credential-provider-config="" Mar 08 03:47:17.108621 master-0 kubenswrapper[7547]: I0308 03:47:17.106765 7547 flags.go:64] FLAG: --image-gc-high-threshold="85" Mar 08 03:47:17.108621 master-0 kubenswrapper[7547]: I0308 03:47:17.106773 7547 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 08 03:47:17.108621 master-0 kubenswrapper[7547]: I0308 03:47:17.106779 7547 flags.go:64] FLAG: --image-service-endpoint="" Mar 08 03:47:17.108621 master-0 kubenswrapper[7547]: I0308 03:47:17.106785 7547 flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 08 03:47:17.108621 master-0 kubenswrapper[7547]: I0308 03:47:17.106792 7547 flags.go:64] FLAG: --kube-api-burst="100" Mar 08 
03:47:17.108621 master-0 kubenswrapper[7547]: I0308 03:47:17.106799 7547 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 08 03:47:17.108621 master-0 kubenswrapper[7547]: I0308 03:47:17.106806 7547 flags.go:64] FLAG: --kube-api-qps="50" Mar 08 03:47:17.108621 master-0 kubenswrapper[7547]: I0308 03:47:17.106816 7547 flags.go:64] FLAG: --kube-reserved="" Mar 08 03:47:17.108621 master-0 kubenswrapper[7547]: I0308 03:47:17.106858 7547 flags.go:64] FLAG: --kube-reserved-cgroup="" Mar 08 03:47:17.108621 master-0 kubenswrapper[7547]: I0308 03:47:17.106866 7547 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Mar 08 03:47:17.108621 master-0 kubenswrapper[7547]: I0308 03:47:17.106872 7547 flags.go:64] FLAG: --kubelet-cgroups="" Mar 08 03:47:17.108621 master-0 kubenswrapper[7547]: I0308 03:47:17.106879 7547 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Mar 08 03:47:17.108621 master-0 kubenswrapper[7547]: I0308 03:47:17.106885 7547 flags.go:64] FLAG: --lock-file="" Mar 08 03:47:17.108621 master-0 kubenswrapper[7547]: I0308 03:47:17.106892 7547 flags.go:64] FLAG: --log-cadvisor-usage="false" Mar 08 03:47:17.108621 master-0 kubenswrapper[7547]: I0308 03:47:17.106899 7547 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 08 03:47:17.108621 master-0 kubenswrapper[7547]: I0308 03:47:17.106909 7547 flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 08 03:47:17.108621 master-0 kubenswrapper[7547]: I0308 03:47:17.106921 7547 flags.go:64] FLAG: --log-json-split-stream="false" Mar 08 03:47:17.108621 master-0 kubenswrapper[7547]: I0308 03:47:17.106927 7547 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 08 03:47:17.109424 master-0 kubenswrapper[7547]: I0308 03:47:17.106934 7547 flags.go:64] FLAG: --log-text-split-stream="false" Mar 08 03:47:17.109424 master-0 kubenswrapper[7547]: I0308 03:47:17.106940 7547 flags.go:64] FLAG: --logging-format="text" Mar 08 03:47:17.109424 master-0 kubenswrapper[7547]: I0308 
03:47:17.106947 7547 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 08 03:47:17.109424 master-0 kubenswrapper[7547]: I0308 03:47:17.106955 7547 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 08 03:47:17.109424 master-0 kubenswrapper[7547]: I0308 03:47:17.106962 7547 flags.go:64] FLAG: --manifest-url="" Mar 08 03:47:17.109424 master-0 kubenswrapper[7547]: I0308 03:47:17.106970 7547 flags.go:64] FLAG: --manifest-url-header="" Mar 08 03:47:17.109424 master-0 kubenswrapper[7547]: I0308 03:47:17.106984 7547 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 08 03:47:17.109424 master-0 kubenswrapper[7547]: I0308 03:47:17.106991 7547 flags.go:64] FLAG: --max-open-files="1000000" Mar 08 03:47:17.109424 master-0 kubenswrapper[7547]: I0308 03:47:17.107001 7547 flags.go:64] FLAG: --max-pods="110" Mar 08 03:47:17.109424 master-0 kubenswrapper[7547]: I0308 03:47:17.107008 7547 flags.go:64] FLAG: --maximum-dead-containers="-1" Mar 08 03:47:17.109424 master-0 kubenswrapper[7547]: I0308 03:47:17.107014 7547 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Mar 08 03:47:17.109424 master-0 kubenswrapper[7547]: I0308 03:47:17.107021 7547 flags.go:64] FLAG: --memory-manager-policy="None" Mar 08 03:47:17.109424 master-0 kubenswrapper[7547]: I0308 03:47:17.107028 7547 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 08 03:47:17.109424 master-0 kubenswrapper[7547]: I0308 03:47:17.107035 7547 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 08 03:47:17.109424 master-0 kubenswrapper[7547]: I0308 03:47:17.107045 7547 flags.go:64] FLAG: --node-ip="192.168.32.10" Mar 08 03:47:17.109424 master-0 kubenswrapper[7547]: I0308 03:47:17.107052 7547 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Mar 08 03:47:17.109424 master-0 kubenswrapper[7547]: I0308 03:47:17.107074 7547 flags.go:64] FLAG: --node-status-max-images="50" 
Mar 08 03:47:17.109424 master-0 kubenswrapper[7547]: I0308 03:47:17.107081 7547 flags.go:64] FLAG: --node-status-update-frequency="10s"
Mar 08 03:47:17.109424 master-0 kubenswrapper[7547]: I0308 03:47:17.107088 7547 flags.go:64] FLAG: --oom-score-adj="-999"
Mar 08 03:47:17.109424 master-0 kubenswrapper[7547]: I0308 03:47:17.107094 7547 flags.go:64] FLAG: --pod-cidr=""
Mar 08 03:47:17.109424 master-0 kubenswrapper[7547]: I0308 03:47:17.107106 7547 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1d605384f31a8085f78a96145c2c3dc51afe22721144196140a2699b7c07ebe3"
Mar 08 03:47:17.109424 master-0 kubenswrapper[7547]: I0308 03:47:17.107117 7547 flags.go:64] FLAG: --pod-manifest-path=""
Mar 08 03:47:17.109424 master-0 kubenswrapper[7547]: I0308 03:47:17.107147 7547 flags.go:64] FLAG: --pod-max-pids="-1"
Mar 08 03:47:17.110158 master-0 kubenswrapper[7547]: I0308 03:47:17.107158 7547 flags.go:64] FLAG: --pods-per-core="0"
Mar 08 03:47:17.110158 master-0 kubenswrapper[7547]: I0308 03:47:17.107166 7547 flags.go:64] FLAG: --port="10250"
Mar 08 03:47:17.110158 master-0 kubenswrapper[7547]: I0308 03:47:17.107173 7547 flags.go:64] FLAG: --protect-kernel-defaults="false"
Mar 08 03:47:17.110158 master-0 kubenswrapper[7547]: I0308 03:47:17.107180 7547 flags.go:64] FLAG: --provider-id=""
Mar 08 03:47:17.110158 master-0 kubenswrapper[7547]: I0308 03:47:17.107187 7547 flags.go:64] FLAG: --qos-reserved=""
Mar 08 03:47:17.110158 master-0 kubenswrapper[7547]: I0308 03:47:17.107193 7547 flags.go:64] FLAG: --read-only-port="10255"
Mar 08 03:47:17.110158 master-0 kubenswrapper[7547]: I0308 03:47:17.107203 7547 flags.go:64] FLAG: --register-node="true"
Mar 08 03:47:17.110158 master-0 kubenswrapper[7547]: I0308 03:47:17.107210 7547 flags.go:64] FLAG: --register-schedulable="true"
Mar 08 03:47:17.110158 master-0 kubenswrapper[7547]: I0308 03:47:17.107216 7547 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Mar 08 03:47:17.110158 master-0 kubenswrapper[7547]: I0308 03:47:17.107229 7547 flags.go:64] FLAG: --registry-burst="10"
Mar 08 03:47:17.110158 master-0 kubenswrapper[7547]: I0308 03:47:17.107236 7547 flags.go:64] FLAG: --registry-qps="5"
Mar 08 03:47:17.110158 master-0 kubenswrapper[7547]: I0308 03:47:17.107242 7547 flags.go:64] FLAG: --reserved-cpus=""
Mar 08 03:47:17.110158 master-0 kubenswrapper[7547]: I0308 03:47:17.107248 7547 flags.go:64] FLAG: --reserved-memory=""
Mar 08 03:47:17.110158 master-0 kubenswrapper[7547]: I0308 03:47:17.107262 7547 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Mar 08 03:47:17.110158 master-0 kubenswrapper[7547]: I0308 03:47:17.107268 7547 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Mar 08 03:47:17.110158 master-0 kubenswrapper[7547]: I0308 03:47:17.107275 7547 flags.go:64] FLAG: --rotate-certificates="false"
Mar 08 03:47:17.110158 master-0 kubenswrapper[7547]: I0308 03:47:17.107283 7547 flags.go:64] FLAG: --rotate-server-certificates="false"
Mar 08 03:47:17.110158 master-0 kubenswrapper[7547]: I0308 03:47:17.107289 7547 flags.go:64] FLAG: --runonce="false"
Mar 08 03:47:17.110158 master-0 kubenswrapper[7547]: I0308 03:47:17.107295 7547 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Mar 08 03:47:17.110158 master-0 kubenswrapper[7547]: I0308 03:47:17.107302 7547 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Mar 08 03:47:17.110158 master-0 kubenswrapper[7547]: I0308 03:47:17.107309 7547 flags.go:64] FLAG: --seccomp-default="false"
Mar 08 03:47:17.110158 master-0 kubenswrapper[7547]: I0308 03:47:17.107317 7547 flags.go:64] FLAG: --serialize-image-pulls="true"
Mar 08 03:47:17.110158 master-0 kubenswrapper[7547]: I0308 03:47:17.107326 7547 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Mar 08 03:47:17.110158 master-0 kubenswrapper[7547]: I0308 03:47:17.107333 7547 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Mar 08 03:47:17.110158 master-0 kubenswrapper[7547]: I0308 03:47:17.107340 7547 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Mar 08 03:47:17.110158 master-0 kubenswrapper[7547]: I0308 03:47:17.107347 7547 flags.go:64] FLAG: --storage-driver-password="root"
Mar 08 03:47:17.110948 master-0 kubenswrapper[7547]: I0308 03:47:17.107354 7547 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 08 03:47:17.110948 master-0 kubenswrapper[7547]: I0308 03:47:17.107361 7547 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 08 03:47:17.110948 master-0 kubenswrapper[7547]: I0308 03:47:17.107367 7547 flags.go:64] FLAG: --storage-driver-user="root"
Mar 08 03:47:17.110948 master-0 kubenswrapper[7547]: I0308 03:47:17.107373 7547 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 08 03:47:17.110948 master-0 kubenswrapper[7547]: I0308 03:47:17.107380 7547 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 08 03:47:17.110948 master-0 kubenswrapper[7547]: I0308 03:47:17.107391 7547 flags.go:64] FLAG: --system-cgroups=""
Mar 08 03:47:17.110948 master-0 kubenswrapper[7547]: I0308 03:47:17.107592 7547 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Mar 08 03:47:17.110948 master-0 kubenswrapper[7547]: I0308 03:47:17.107609 7547 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 08 03:47:17.110948 master-0 kubenswrapper[7547]: I0308 03:47:17.107618 7547 flags.go:64] FLAG: --tls-cert-file=""
Mar 08 03:47:17.110948 master-0 kubenswrapper[7547]: I0308 03:47:17.107625 7547 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 08 03:47:17.110948 master-0 kubenswrapper[7547]: I0308 03:47:17.107637 7547 flags.go:64] FLAG: --tls-min-version=""
Mar 08 03:47:17.110948 master-0 kubenswrapper[7547]: I0308 03:47:17.107651 7547 flags.go:64] FLAG: --tls-private-key-file=""
Mar 08 03:47:17.110948 master-0 kubenswrapper[7547]: I0308 03:47:17.107659 7547 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 08 03:47:17.110948 master-0 kubenswrapper[7547]: I0308 03:47:17.107668 7547 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 08 03:47:17.110948 master-0 kubenswrapper[7547]: I0308 03:47:17.107676 7547 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 08 03:47:17.110948 master-0 kubenswrapper[7547]: I0308 03:47:17.107685 7547 flags.go:64] FLAG: --v="2"
Mar 08 03:47:17.110948 master-0 kubenswrapper[7547]: I0308 03:47:17.107698 7547 flags.go:64] FLAG: --version="false"
Mar 08 03:47:17.110948 master-0 kubenswrapper[7547]: I0308 03:47:17.107709 7547 flags.go:64] FLAG: --vmodule=""
Mar 08 03:47:17.110948 master-0 kubenswrapper[7547]: I0308 03:47:17.107719 7547 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 08 03:47:17.110948 master-0 kubenswrapper[7547]: I0308 03:47:17.107733 7547 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 08 03:47:17.110948 master-0 kubenswrapper[7547]: W0308 03:47:17.108135 7547 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 08 03:47:17.110948 master-0 kubenswrapper[7547]: W0308 03:47:17.108148 7547 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 08 03:47:17.110948 master-0 kubenswrapper[7547]: W0308 03:47:17.108157 7547 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 08 03:47:17.110948 master-0 kubenswrapper[7547]: W0308 03:47:17.108165 7547 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 08 03:47:17.111766 master-0 kubenswrapper[7547]: W0308 03:47:17.108171 7547 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 08 03:47:17.111766 master-0 kubenswrapper[7547]: W0308 03:47:17.108176 7547 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 08 03:47:17.111766 master-0 kubenswrapper[7547]: W0308 03:47:17.108184 7547 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 08 03:47:17.111766 master-0 kubenswrapper[7547]: W0308 03:47:17.108190 7547 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 08 03:47:17.111766 master-0 kubenswrapper[7547]: W0308 03:47:17.108195 7547 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 08 03:47:17.111766 master-0 kubenswrapper[7547]: W0308 03:47:17.108201 7547 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 08 03:47:17.111766 master-0 kubenswrapper[7547]: W0308 03:47:17.108207 7547 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 08 03:47:17.111766 master-0 kubenswrapper[7547]: W0308 03:47:17.108213 7547 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 08 03:47:17.111766 master-0 kubenswrapper[7547]: W0308 03:47:17.108219 7547 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 08 03:47:17.111766 master-0 kubenswrapper[7547]: W0308 03:47:17.108224 7547 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 08 03:47:17.111766 master-0 kubenswrapper[7547]: W0308 03:47:17.108233 7547 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 08 03:47:17.111766 master-0 kubenswrapper[7547]: W0308 03:47:17.108239 7547 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 08 03:47:17.111766 master-0 kubenswrapper[7547]: W0308 03:47:17.108244 7547 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 08 03:47:17.111766 master-0 kubenswrapper[7547]: W0308 03:47:17.108252 7547 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 08 03:47:17.111766 master-0 kubenswrapper[7547]: W0308 03:47:17.108262 7547 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 08 03:47:17.111766 master-0 kubenswrapper[7547]: W0308 03:47:17.108268 7547 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 08 03:47:17.111766 master-0 kubenswrapper[7547]: W0308 03:47:17.108274 7547 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 08 03:47:17.111766 master-0 kubenswrapper[7547]: W0308 03:47:17.108280 7547 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 08 03:47:17.111766 master-0 kubenswrapper[7547]: W0308 03:47:17.108314 7547 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 08 03:47:17.112364 master-0 kubenswrapper[7547]: W0308 03:47:17.108324 7547 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 08 03:47:17.112364 master-0 kubenswrapper[7547]: W0308 03:47:17.108331 7547 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 08 03:47:17.112364 master-0 kubenswrapper[7547]: W0308 03:47:17.108338 7547 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 08 03:47:17.112364 master-0 kubenswrapper[7547]: W0308 03:47:17.108347 7547 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 08 03:47:17.112364 master-0 kubenswrapper[7547]: W0308 03:47:17.108355 7547 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 08 03:47:17.112364 master-0 kubenswrapper[7547]: W0308 03:47:17.108362 7547 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 08 03:47:17.112364 master-0 kubenswrapper[7547]: W0308 03:47:17.108368 7547 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 08 03:47:17.112364 master-0 kubenswrapper[7547]: W0308 03:47:17.108374 7547 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 08 03:47:17.112364 master-0 kubenswrapper[7547]: W0308 03:47:17.108381 7547 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 08 03:47:17.112364 master-0 kubenswrapper[7547]: W0308 03:47:17.108389 7547 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 08 03:47:17.112364 master-0 kubenswrapper[7547]: W0308 03:47:17.108395 7547 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 08 03:47:17.112364 master-0 kubenswrapper[7547]: W0308 03:47:17.108400 7547 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 08 03:47:17.112364 master-0 kubenswrapper[7547]: W0308 03:47:17.108406 7547 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 08 03:47:17.112364 master-0 kubenswrapper[7547]: W0308 03:47:17.108411 7547 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 08 03:47:17.112364 master-0 kubenswrapper[7547]: W0308 03:47:17.108416 7547 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 08 03:47:17.112364 master-0 kubenswrapper[7547]: W0308 03:47:17.108426 7547 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 08 03:47:17.112364 master-0 kubenswrapper[7547]: W0308 03:47:17.108431 7547 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 08 03:47:17.112364 master-0 kubenswrapper[7547]: W0308 03:47:17.108437 7547 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 08 03:47:17.112364 master-0 kubenswrapper[7547]: W0308 03:47:17.108442 7547 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 08 03:47:17.113052 master-0 kubenswrapper[7547]: W0308 03:47:17.108450 7547 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 08 03:47:17.113052 master-0 kubenswrapper[7547]: W0308 03:47:17.108455 7547 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 08 03:47:17.113052 master-0 kubenswrapper[7547]: W0308 03:47:17.108461 7547 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 08 03:47:17.113052 master-0 kubenswrapper[7547]: W0308 03:47:17.108466 7547 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 08 03:47:17.113052 master-0 kubenswrapper[7547]: W0308 03:47:17.108472 7547 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 08 03:47:17.113052 master-0 kubenswrapper[7547]: W0308 03:47:17.108477 7547 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 08 03:47:17.113052 master-0 kubenswrapper[7547]: W0308 03:47:17.108483 7547 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 08 03:47:17.113052 master-0 kubenswrapper[7547]: W0308 03:47:17.108488 7547 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 08 03:47:17.113052 master-0 kubenswrapper[7547]: W0308 03:47:17.108496 7547 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 08 03:47:17.113052 master-0 kubenswrapper[7547]: W0308 03:47:17.108505 7547 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 08 03:47:17.113052 master-0 kubenswrapper[7547]: W0308 03:47:17.108512 7547 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 08 03:47:17.113052 master-0 kubenswrapper[7547]: W0308 03:47:17.108518 7547 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 08 03:47:17.113052 master-0 kubenswrapper[7547]: W0308 03:47:17.108524 7547 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 08 03:47:17.113052 master-0 kubenswrapper[7547]: W0308 03:47:17.108530 7547 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 08 03:47:17.113052 master-0 kubenswrapper[7547]: W0308 03:47:17.108536 7547 feature_gate.go:330] unrecognized feature gate: Example
Mar 08 03:47:17.113052 master-0 kubenswrapper[7547]: W0308 03:47:17.108543 7547 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 08 03:47:17.113052 master-0 kubenswrapper[7547]: W0308 03:47:17.108549 7547 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 08 03:47:17.113052 master-0 kubenswrapper[7547]: W0308 03:47:17.108555 7547 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 08 03:47:17.113052 master-0 kubenswrapper[7547]: W0308 03:47:17.108561 7547 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 08 03:47:17.113052 master-0 kubenswrapper[7547]: W0308 03:47:17.108567 7547 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 08 03:47:17.113667 master-0 kubenswrapper[7547]: W0308 03:47:17.108573 7547 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 08 03:47:17.113667 master-0 kubenswrapper[7547]: W0308 03:47:17.108582 7547 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 08 03:47:17.113667 master-0 kubenswrapper[7547]: W0308 03:47:17.108588 7547 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 08 03:47:17.113667 master-0 kubenswrapper[7547]: W0308 03:47:17.108594 7547 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 08 03:47:17.113667 master-0 kubenswrapper[7547]: W0308 03:47:17.108599 7547 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 08 03:47:17.113667 master-0 kubenswrapper[7547]: W0308 03:47:17.108604 7547 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 08 03:47:17.113667 master-0 kubenswrapper[7547]: W0308 03:47:17.108610 7547 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 08 03:47:17.113667 master-0 kubenswrapper[7547]: W0308 03:47:17.108616 7547 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 08 03:47:17.113667 master-0 kubenswrapper[7547]: W0308 03:47:17.108621 7547 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 08 03:47:17.113667 master-0 kubenswrapper[7547]: W0308 03:47:17.108627 7547 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 08 03:47:17.113667 master-0 kubenswrapper[7547]: I0308 03:47:17.108637 7547 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 08 03:47:17.118611 master-0 kubenswrapper[7547]: I0308 03:47:17.118554 7547 server.go:491] "Kubelet version" kubeletVersion="v1.31.14"
Mar 08 03:47:17.118611 master-0 kubenswrapper[7547]: I0308 03:47:17.118601 7547 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 08 03:47:17.118991 master-0 kubenswrapper[7547]: W0308 03:47:17.118840 7547 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 08 03:47:17.118991 master-0 kubenswrapper[7547]: W0308 03:47:17.118859 7547 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 08 03:47:17.118991 master-0 kubenswrapper[7547]: W0308 03:47:17.118867 7547 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 08 03:47:17.118991 master-0 kubenswrapper[7547]: W0308 03:47:17.118875 7547 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 08 03:47:17.118991 master-0 kubenswrapper[7547]: W0308 03:47:17.118883 7547 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 08 03:47:17.118991 master-0 kubenswrapper[7547]: W0308 03:47:17.118893 7547 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 08 03:47:17.118991 master-0 kubenswrapper[7547]: W0308 03:47:17.118901 7547 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 08 03:47:17.118991 master-0 kubenswrapper[7547]: W0308 03:47:17.118907 7547 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 08 03:47:17.118991 master-0 kubenswrapper[7547]: W0308 03:47:17.118914 7547 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 08 03:47:17.118991 master-0 kubenswrapper[7547]: W0308 03:47:17.118920 7547 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 08 03:47:17.118991 master-0 kubenswrapper[7547]: W0308 03:47:17.118925 7547 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 08 03:47:17.118991 master-0 kubenswrapper[7547]: W0308 03:47:17.118931 7547 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 08 03:47:17.118991 master-0 kubenswrapper[7547]: W0308 03:47:17.118936 7547 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 08 03:47:17.118991 master-0 kubenswrapper[7547]: W0308 03:47:17.118941 7547 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 08 03:47:17.118991 master-0 kubenswrapper[7547]: W0308 03:47:17.118946 7547
feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 08 03:47:17.118991 master-0 kubenswrapper[7547]: W0308 03:47:17.118951 7547 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 08 03:47:17.118991 master-0 kubenswrapper[7547]: W0308 03:47:17.118957 7547 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 08 03:47:17.118991 master-0 kubenswrapper[7547]: W0308 03:47:17.118963 7547 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 08 03:47:17.118991 master-0 kubenswrapper[7547]: W0308 03:47:17.118970 7547 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 08 03:47:17.118991 master-0 kubenswrapper[7547]: W0308 03:47:17.118979 7547 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 08 03:47:17.119738 master-0 kubenswrapper[7547]: W0308 03:47:17.118985 7547 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 08 03:47:17.119738 master-0 kubenswrapper[7547]: W0308 03:47:17.118992 7547 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 08 03:47:17.119738 master-0 kubenswrapper[7547]: W0308 03:47:17.118999 7547 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 08 03:47:17.119738 master-0 kubenswrapper[7547]: W0308 03:47:17.119005 7547 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 08 03:47:17.119738 master-0 kubenswrapper[7547]: W0308 03:47:17.119011 7547 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 08 03:47:17.119738 master-0 kubenswrapper[7547]: W0308 03:47:17.119017 7547 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 08 03:47:17.119738 master-0 kubenswrapper[7547]: W0308 03:47:17.119023 7547 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 08 03:47:17.119738 master-0 kubenswrapper[7547]: W0308 03:47:17.119029 7547 feature_gate.go:330] 
unrecognized feature gate: BuildCSIVolumes Mar 08 03:47:17.119738 master-0 kubenswrapper[7547]: W0308 03:47:17.119034 7547 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 08 03:47:17.119738 master-0 kubenswrapper[7547]: W0308 03:47:17.119040 7547 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 08 03:47:17.119738 master-0 kubenswrapper[7547]: W0308 03:47:17.119045 7547 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 08 03:47:17.119738 master-0 kubenswrapper[7547]: W0308 03:47:17.119050 7547 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 08 03:47:17.119738 master-0 kubenswrapper[7547]: W0308 03:47:17.119058 7547 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 08 03:47:17.119738 master-0 kubenswrapper[7547]: W0308 03:47:17.119065 7547 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 08 03:47:17.119738 master-0 kubenswrapper[7547]: W0308 03:47:17.119071 7547 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 08 03:47:17.119738 master-0 kubenswrapper[7547]: W0308 03:47:17.119077 7547 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 08 03:47:17.119738 master-0 kubenswrapper[7547]: W0308 03:47:17.119083 7547 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 08 03:47:17.119738 master-0 kubenswrapper[7547]: W0308 03:47:17.119089 7547 feature_gate.go:330] unrecognized feature gate: Example Mar 08 03:47:17.119738 master-0 kubenswrapper[7547]: W0308 03:47:17.119095 7547 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 08 03:47:17.119738 master-0 kubenswrapper[7547]: W0308 03:47:17.119100 7547 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 08 03:47:17.120708 master-0 kubenswrapper[7547]: W0308 03:47:17.119105 7547 
feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 08 03:47:17.120708 master-0 kubenswrapper[7547]: W0308 03:47:17.119111 7547 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 08 03:47:17.120708 master-0 kubenswrapper[7547]: W0308 03:47:17.119117 7547 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 08 03:47:17.120708 master-0 kubenswrapper[7547]: W0308 03:47:17.119122 7547 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 08 03:47:17.120708 master-0 kubenswrapper[7547]: W0308 03:47:17.119129 7547 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 08 03:47:17.120708 master-0 kubenswrapper[7547]: W0308 03:47:17.119135 7547 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 08 03:47:17.120708 master-0 kubenswrapper[7547]: W0308 03:47:17.119140 7547 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 08 03:47:17.120708 master-0 kubenswrapper[7547]: W0308 03:47:17.119146 7547 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 08 03:47:17.120708 master-0 kubenswrapper[7547]: W0308 03:47:17.119151 7547 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 08 03:47:17.120708 master-0 kubenswrapper[7547]: W0308 03:47:17.119158 7547 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 08 03:47:17.120708 master-0 kubenswrapper[7547]: W0308 03:47:17.119164 7547 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 08 03:47:17.120708 master-0 kubenswrapper[7547]: W0308 03:47:17.119170 7547 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 08 03:47:17.120708 master-0 kubenswrapper[7547]: W0308 03:47:17.119175 7547 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 08 03:47:17.120708 master-0 kubenswrapper[7547]: W0308 03:47:17.119181 7547 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 08 03:47:17.120708 master-0 kubenswrapper[7547]: W0308 03:47:17.119186 7547 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 08 03:47:17.120708 master-0 kubenswrapper[7547]: W0308 03:47:17.119191 7547 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 08 03:47:17.120708 master-0 kubenswrapper[7547]: W0308 03:47:17.119197 7547 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 08 03:47:17.120708 master-0 kubenswrapper[7547]: W0308 03:47:17.119202 7547 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 08 03:47:17.120708 master-0 kubenswrapper[7547]: W0308 03:47:17.119207 7547 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 08 03:47:17.120708 master-0 kubenswrapper[7547]: W0308 03:47:17.119213 7547 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 08 03:47:17.121510 master-0 kubenswrapper[7547]: W0308 03:47:17.119218 7547 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 08 03:47:17.121510 master-0 kubenswrapper[7547]: W0308 03:47:17.119223 7547 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 08 03:47:17.121510 master-0 kubenswrapper[7547]: W0308 03:47:17.119231 7547 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 08 03:47:17.121510 master-0 kubenswrapper[7547]: W0308 03:47:17.119238 7547 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 08 03:47:17.121510 master-0 kubenswrapper[7547]: W0308 03:47:17.119244 7547 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 08 03:47:17.121510 master-0 kubenswrapper[7547]: W0308 03:47:17.119249 7547 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 08 03:47:17.121510 master-0 kubenswrapper[7547]: W0308 03:47:17.119255 7547 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 08 03:47:17.121510 master-0 kubenswrapper[7547]: W0308 03:47:17.119260 7547 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 08 03:47:17.121510 master-0 kubenswrapper[7547]: W0308 03:47:17.119265 7547 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 08 03:47:17.121510 master-0 kubenswrapper[7547]: W0308 03:47:17.119270 7547 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 08 03:47:17.121510 master-0 kubenswrapper[7547]: W0308 03:47:17.119275 7547 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 08 03:47:17.121510 master-0 kubenswrapper[7547]: W0308 03:47:17.119281 7547 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 08 03:47:17.121510 master-0 kubenswrapper[7547]: I0308 03:47:17.119290 7547 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true 
VolumeAttributesClass:false]} Mar 08 03:47:17.121510 master-0 kubenswrapper[7547]: W0308 03:47:17.119650 7547 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 08 03:47:17.121510 master-0 kubenswrapper[7547]: W0308 03:47:17.119662 7547 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 08 03:47:17.122022 master-0 kubenswrapper[7547]: W0308 03:47:17.119667 7547 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 08 03:47:17.122022 master-0 kubenswrapper[7547]: W0308 03:47:17.119674 7547 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 08 03:47:17.122022 master-0 kubenswrapper[7547]: W0308 03:47:17.119679 7547 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 08 03:47:17.122022 master-0 kubenswrapper[7547]: W0308 03:47:17.119685 7547 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 08 03:47:17.122022 master-0 kubenswrapper[7547]: W0308 03:47:17.119691 7547 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 08 03:47:17.122022 master-0 kubenswrapper[7547]: W0308 03:47:17.119698 7547 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 08 03:47:17.122022 master-0 kubenswrapper[7547]: W0308 03:47:17.119706 7547 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 08 03:47:17.122022 master-0 kubenswrapper[7547]: W0308 03:47:17.119712 7547 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 08 03:47:17.122022 master-0 kubenswrapper[7547]: W0308 03:47:17.119717 7547 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 08 03:47:17.122022 master-0 kubenswrapper[7547]: W0308 03:47:17.119724 7547 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 08 03:47:17.122022 master-0 kubenswrapper[7547]: W0308 03:47:17.119731 7547 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 08 03:47:17.122022 master-0 kubenswrapper[7547]: W0308 03:47:17.119739 7547 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 08 03:47:17.122022 master-0 kubenswrapper[7547]: W0308 03:47:17.119745 7547 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 08 03:47:17.122022 master-0 kubenswrapper[7547]: W0308 03:47:17.119752 7547 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 08 03:47:17.122022 master-0 kubenswrapper[7547]: W0308 03:47:17.119759 7547 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 08 03:47:17.122022 master-0 kubenswrapper[7547]: W0308 03:47:17.119765 7547 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 08 03:47:17.122022 master-0 kubenswrapper[7547]: W0308 03:47:17.119770 7547 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 08 03:47:17.122022 master-0 kubenswrapper[7547]: W0308 03:47:17.119776 7547 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 08 03:47:17.122022 master-0 kubenswrapper[7547]: W0308 03:47:17.119781 7547 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 08 03:47:17.122725 master-0 kubenswrapper[7547]: W0308 03:47:17.119787 7547 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 08 03:47:17.122725 master-0 kubenswrapper[7547]: W0308 03:47:17.119792 7547 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 08 03:47:17.122725 master-0 kubenswrapper[7547]: W0308 03:47:17.119798 7547 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 08 03:47:17.122725 master-0 kubenswrapper[7547]: W0308 03:47:17.119803 7547 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 08 03:47:17.122725 
master-0 kubenswrapper[7547]: W0308 03:47:17.119899 7547 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 08 03:47:17.122725 master-0 kubenswrapper[7547]: W0308 03:47:17.119910 7547 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 08 03:47:17.122725 master-0 kubenswrapper[7547]: W0308 03:47:17.119917 7547 feature_gate.go:330] unrecognized feature gate: Example Mar 08 03:47:17.122725 master-0 kubenswrapper[7547]: W0308 03:47:17.119927 7547 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 08 03:47:17.122725 master-0 kubenswrapper[7547]: W0308 03:47:17.119933 7547 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 08 03:47:17.122725 master-0 kubenswrapper[7547]: W0308 03:47:17.119939 7547 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 08 03:47:17.122725 master-0 kubenswrapper[7547]: W0308 03:47:17.119944 7547 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 08 03:47:17.122725 master-0 kubenswrapper[7547]: W0308 03:47:17.119950 7547 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 08 03:47:17.122725 master-0 kubenswrapper[7547]: W0308 03:47:17.119955 7547 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 08 03:47:17.122725 master-0 kubenswrapper[7547]: W0308 03:47:17.119960 7547 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 08 03:47:17.122725 master-0 kubenswrapper[7547]: W0308 03:47:17.120022 7547 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 08 03:47:17.122725 master-0 kubenswrapper[7547]: W0308 03:47:17.120032 7547 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 08 03:47:17.122725 master-0 kubenswrapper[7547]: W0308 03:47:17.120039 7547 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 08 03:47:17.122725 master-0 kubenswrapper[7547]: W0308 03:47:17.120045 7547 
feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 08 03:47:17.122725 master-0 kubenswrapper[7547]: W0308 03:47:17.120051 7547 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 08 03:47:17.122725 master-0 kubenswrapper[7547]: W0308 03:47:17.120056 7547 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 08 03:47:17.123899 master-0 kubenswrapper[7547]: W0308 03:47:17.120062 7547 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 08 03:47:17.123899 master-0 kubenswrapper[7547]: W0308 03:47:17.120067 7547 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 08 03:47:17.123899 master-0 kubenswrapper[7547]: W0308 03:47:17.120074 7547 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 08 03:47:17.123899 master-0 kubenswrapper[7547]: W0308 03:47:17.120080 7547 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Mar 08 03:47:17.123899 master-0 kubenswrapper[7547]: W0308 03:47:17.120122 7547 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 08 03:47:17.123899 master-0 kubenswrapper[7547]: W0308 03:47:17.120129 7547 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 08 03:47:17.123899 master-0 kubenswrapper[7547]: W0308 03:47:17.120134 7547 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 08 03:47:17.123899 master-0 kubenswrapper[7547]: W0308 03:47:17.120140 7547 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 08 03:47:17.123899 master-0 kubenswrapper[7547]: W0308 03:47:17.120145 7547 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 08 03:47:17.123899 master-0 kubenswrapper[7547]: W0308 03:47:17.120150 7547 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 08 03:47:17.123899 master-0 kubenswrapper[7547]: W0308 03:47:17.120155 7547 feature_gate.go:330] 
unrecognized feature gate: NodeDisruptionPolicy Mar 08 03:47:17.123899 master-0 kubenswrapper[7547]: W0308 03:47:17.120161 7547 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 08 03:47:17.123899 master-0 kubenswrapper[7547]: W0308 03:47:17.120166 7547 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 08 03:47:17.123899 master-0 kubenswrapper[7547]: W0308 03:47:17.120171 7547 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 08 03:47:17.123899 master-0 kubenswrapper[7547]: W0308 03:47:17.120179 7547 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 08 03:47:17.123899 master-0 kubenswrapper[7547]: W0308 03:47:17.120186 7547 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 08 03:47:17.123899 master-0 kubenswrapper[7547]: W0308 03:47:17.120313 7547 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 08 03:47:17.123899 master-0 kubenswrapper[7547]: W0308 03:47:17.120321 7547 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 08 03:47:17.123899 master-0 kubenswrapper[7547]: W0308 03:47:17.120327 7547 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 08 03:47:17.124615 master-0 kubenswrapper[7547]: W0308 03:47:17.120334 7547 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 08 03:47:17.124615 master-0 kubenswrapper[7547]: W0308 03:47:17.120339 7547 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 08 03:47:17.124615 master-0 kubenswrapper[7547]: W0308 03:47:17.120345 7547 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 08 03:47:17.124615 master-0 kubenswrapper[7547]: W0308 03:47:17.120350 7547 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 08 03:47:17.124615 master-0 kubenswrapper[7547]: W0308 
03:47:17.120358 7547 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 08 03:47:17.124615 master-0 kubenswrapper[7547]: W0308 03:47:17.120364 7547 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 08 03:47:17.124615 master-0 kubenswrapper[7547]: W0308 03:47:17.120369 7547 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 08 03:47:17.124615 master-0 kubenswrapper[7547]: W0308 03:47:17.120439 7547 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 08 03:47:17.124615 master-0 kubenswrapper[7547]: W0308 03:47:17.120448 7547 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 08 03:47:17.124615 master-0 kubenswrapper[7547]: W0308 03:47:17.120454 7547 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 08 03:47:17.124615 master-0 kubenswrapper[7547]: W0308 03:47:17.120459 7547 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 08 03:47:17.124615 master-0 kubenswrapper[7547]: W0308 03:47:17.120465 7547 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 08 03:47:17.124615 master-0 kubenswrapper[7547]: I0308 03:47:17.120472 7547 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 08 03:47:17.124615 master-0 kubenswrapper[7547]: I0308 03:47:17.120898 7547 server.go:940] "Client rotation is on, will bootstrap in background" Mar 08 03:47:17.124615 master-0 kubenswrapper[7547]: I0308 03:47:17.124279 7547 
bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Mar 08 03:47:17.125189 master-0 kubenswrapper[7547]: I0308 03:47:17.124518 7547 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Mar 08 03:47:17.125343 master-0 kubenswrapper[7547]: I0308 03:47:17.125307 7547 server.go:997] "Starting client certificate rotation" Mar 08 03:47:17.125422 master-0 kubenswrapper[7547]: I0308 03:47:17.125354 7547 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Mar 08 03:47:17.126100 master-0 kubenswrapper[7547]: I0308 03:47:17.125643 7547 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-03-09 03:37:12 +0000 UTC, rotation deadline is 2026-03-08 21:30:19.347581905 +0000 UTC Mar 08 03:47:17.126156 master-0 kubenswrapper[7547]: I0308 03:47:17.126095 7547 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 17h43m2.221493585s for next certificate rotation Mar 08 03:47:17.127114 master-0 kubenswrapper[7547]: I0308 03:47:17.127082 7547 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 08 03:47:17.129075 master-0 kubenswrapper[7547]: I0308 03:47:17.129005 7547 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 08 03:47:17.133320 master-0 kubenswrapper[7547]: I0308 03:47:17.133035 7547 log.go:25] "Validated CRI v1 runtime API" Mar 08 03:47:17.139622 master-0 kubenswrapper[7547]: I0308 03:47:17.139594 7547 log.go:25] "Validated CRI v1 image API" Mar 08 03:47:17.141316 master-0 kubenswrapper[7547]: I0308 03:47:17.141264 7547 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 08 03:47:17.146848 master-0 kubenswrapper[7547]: I0308 03:47:17.146782 7547 fs.go:135] Filesystem UUIDs: 
map[67898fbb-3e32-465e-b6f9-207afe668b6e:/dev/vda3 7B77-95E7:/dev/vda2 910678ff-f77e-4a7d-8d53-86f2ac47a823:/dev/vda4]
Mar 08 03:47:17.147386 master-0 kubenswrapper[7547]: I0308 03:47:17.146825 7547 fs.go:136] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/0ec1fcf833bb575029f4371f595adf3e92b6ae14914f83458d311cb85210d774/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/0ec1fcf833bb575029f4371f595adf3e92b6ae14914f83458d311cb85210d774/userdata/shm major:0 minor:114 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/133c0043d3a977b4007520994c1530f26391f82433e16ae8b2e991aa2092980b/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/133c0043d3a977b4007520994c1530f26391f82433e16ae8b2e991aa2092980b/userdata/shm major:0 minor:285 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/14fe5cb6383f1129ecf327e882bdb7904f8ad1a8a2cc2647d9ee96534b6ccb93/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/14fe5cb6383f1129ecf327e882bdb7904f8ad1a8a2cc2647d9ee96534b6ccb93/userdata/shm major:0 minor:271 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/22cc81f0c9d90fe64f682c3bbb7bbcefc904c4ee2c036d7eedf6b66887f69fae/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/22cc81f0c9d90fe64f682c3bbb7bbcefc904c4ee2c036d7eedf6b66887f69fae/userdata/shm major:0 minor:100 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/2899b4e2a1cabd8aea96b1bf0db490c7e98f0e9564c40236186985f7b516039b/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/2899b4e2a1cabd8aea96b1bf0db490c7e98f0e9564c40236186985f7b516039b/userdata/shm major:0 minor:269 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/2a8e902cd252f0c879e3e1c00047d04c3e8646bfeed72f034a41537b464f6d14/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/2a8e902cd252f0c879e3e1c00047d04c3e8646bfeed72f034a41537b464f6d14/userdata/shm major:0 minor:273 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/3235d3bd9c5f6c6a7e16ad74c79046e87f4d03278e4096c568a5930f544fbbf0/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/3235d3bd9c5f6c6a7e16ad74c79046e87f4d03278e4096c568a5930f544fbbf0/userdata/shm major:0 minor:227 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/3d64c10d51d4d9009da402a9f2c51b81830f1695b7370548200097f367d254f2/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/3d64c10d51d4d9009da402a9f2c51b81830f1695b7370548200097f367d254f2/userdata/shm major:0 minor:265 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/4482916b3b4b521cf75927dd45a05e0a2072a49de37c125a72612ca885ff96ce/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/4482916b3b4b521cf75927dd45a05e0a2072a49de37c125a72612ca885ff96ce/userdata/shm major:0 minor:279 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/47433e6e63affa1ba02608e11b299ca5af00d1c85e6731e35f43a4b241522538/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/47433e6e63affa1ba02608e11b299ca5af00d1c85e6731e35f43a4b241522538/userdata/shm major:0 minor:286 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/49aec4971047b96e14ae56703fe099426b567477422c0add4be258e7ae9b7ff1/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/49aec4971047b96e14ae56703fe099426b567477422c0add4be258e7ae9b7ff1/userdata/shm major:0 minor:259 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/5b8c31076d1db49fd8c133661fbbc131a58892112131cf3118f58212505e7460/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/5b8c31076d1db49fd8c133661fbbc131a58892112131cf3118f58212505e7460/userdata/shm major:0 minor:50 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/5ffd7e8cf7a9593e9910a67c41b6e95af26b8d49eaf5fd007129fe49d1978425/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/5ffd7e8cf7a9593e9910a67c41b6e95af26b8d49eaf5fd007129fe49d1978425/userdata/shm major:0 minor:54 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/604b5e18b0f1fc95cb4cabd9d6cb088bfcead3c4cba52acd70685e03b5856c7f/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/604b5e18b0f1fc95cb4cabd9d6cb088bfcead3c4cba52acd70685e03b5856c7f/userdata/shm major:0 minor:45 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/6588c21791f0b9fd7a866ced5165aad3ddf504a15e8585434bc4836ba3395293/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/6588c21791f0b9fd7a866ced5165aad3ddf504a15e8585434bc4836ba3395293/userdata/shm major:0 minor:142 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/758a2c2e2af7455b02804a595f36886f4047114b8dbd25a8393a292e35b7254e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/758a2c2e2af7455b02804a595f36886f4047114b8dbd25a8393a292e35b7254e/userdata/shm major:0 minor:58 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/9b135e9cc968b9e23fda104dcc5dd8cbf50632e21d670c61642446eb2eb45282/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/9b135e9cc968b9e23fda104dcc5dd8cbf50632e21d670c61642446eb2eb45282/userdata/shm major:0 minor:263 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/a5455725a1362a8e870442eb2f0235fbea46c1d047d2183683f1ca346ec9c059/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/a5455725a1362a8e870442eb2f0235fbea46c1d047d2183683f1ca346ec9c059/userdata/shm major:0 minor:289 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/adfc23f0d784d89240f88962b8f79cdf84a79077cb7581e94d0e19b479eeafaa/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/adfc23f0d784d89240f88962b8f79cdf84a79077cb7581e94d0e19b479eeafaa/userdata/shm major:0 minor:283 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/c5cc16f26a63d054e0857f2a2f1278a7512a2a20bea66d9521aa218fb1539d3c/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/c5cc16f26a63d054e0857f2a2f1278a7512a2a20bea66d9521aa218fb1539d3c/userdata/shm major:0 minor:128 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/d47d14f256ba67306efe8da7bbcadc67f946b747f7e0a1d658a9687f1f0a1a37/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/d47d14f256ba67306efe8da7bbcadc67f946b747f7e0a1d658a9687f1f0a1a37/userdata/shm major:0 minor:275 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/d73b051671cc575452964e4ec7abae8ed2cf8ae1de2a3be5460a27e068329e94/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/d73b051671cc575452964e4ec7abae8ed2cf8ae1de2a3be5460a27e068329e94/userdata/shm major:0 minor:257 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/df9f6505570d879efae2662d6149a2ae417f35b1bed956f7339c92d857b81707/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/df9f6505570d879efae2662d6149a2ae417f35b1bed956f7339c92d857b81707/userdata/shm major:0 minor:281 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/e8733f46dd1d2647e586c0cc9b5a4ebea38d695f856a8c74190015b70d99a33e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/e8733f46dd1d2647e586c0cc9b5a4ebea38d695f856a8c74190015b70d99a33e/userdata/shm major:0 minor:129 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/ec83c044c04d6837d5d5f7d4c71e74473794e6ee1e718df488cf45a934fcc03a/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/ec83c044c04d6837d5d5f7d4c71e74473794e6ee1e718df488cf45a934fcc03a/userdata/shm major:0 minor:119 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/ed400b0e1b21fe5e4ef5385a05444bf39db4c2fd9c754a3d6c45427d3b29ef99/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/ed400b0e1b21fe5e4ef5385a05444bf39db4c2fd9c754a3d6c45427d3b29ef99/userdata/shm major:0 minor:41 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0418ff42-7eac-4266-97b5-4df88623d066/volumes/kubernetes.io~projected/kube-api-access-kmpdd:{mountpoint:/var/lib/kubelet/pods/0418ff42-7eac-4266-97b5-4df88623d066/volumes/kubernetes.io~projected/kube-api-access-kmpdd major:0 minor:245 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0918ba32-8e55-48d0-8e50-027c0dcb4bbd/volumes/kubernetes.io~projected/kube-api-access-mghmh:{mountpoint:/var/lib/kubelet/pods/0918ba32-8e55-48d0-8e50-027c0dcb4bbd/volumes/kubernetes.io~projected/kube-api-access-mghmh major:0 minor:252 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0918ba32-8e55-48d0-8e50-027c0dcb4bbd/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/0918ba32-8e55-48d0-8e50-027c0dcb4bbd/volumes/kubernetes.io~secret/serving-cert major:0 minor:223 fsType:tmpfs blockSize:0}
/var/lib/kubelet/pods/093f17f0-2818-4e24-b3c3-6ab4da9d21fb/volumes/kubernetes.io~projected/kube-api-access-7nk8r:{mountpoint:/var/lib/kubelet/pods/093f17f0-2818-4e24-b3c3-6ab4da9d21fb/volumes/kubernetes.io~projected/kube-api-access-7nk8r major:0 minor:105 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0d377285-0336-41b7-b48f-c44a7b563498/volumes/kubernetes.io~projected/kube-api-access-7qn5v:{mountpoint:/var/lib/kubelet/pods/0d377285-0336-41b7-b48f-c44a7b563498/volumes/kubernetes.io~projected/kube-api-access-7qn5v major:0 minor:249 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0d377285-0336-41b7-b48f-c44a7b563498/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/0d377285-0336-41b7-b48f-c44a7b563498/volumes/kubernetes.io~secret/serving-cert major:0 minor:225 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0ebf1330-e044-4ff5-8b48-2d667e0c5625/volumes/kubernetes.io~projected/kube-api-access-hccv4:{mountpoint:/var/lib/kubelet/pods/0ebf1330-e044-4ff5-8b48-2d667e0c5625/volumes/kubernetes.io~projected/kube-api-access-hccv4 major:0 minor:241 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0ebf1330-e044-4ff5-8b48-2d667e0c5625/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/0ebf1330-e044-4ff5-8b48-2d667e0c5625/volumes/kubernetes.io~secret/serving-cert major:0 minor:226 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1482d789-884b-4337-b598-f0e2b71eb9f2/volumes/kubernetes.io~projected/kube-api-access-m2h62:{mountpoint:/var/lib/kubelet/pods/1482d789-884b-4337-b598-f0e2b71eb9f2/volumes/kubernetes.io~projected/kube-api-access-m2h62 major:0 minor:243 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/164586b1-f133-4427-8ab6-eb0839b79738/volumes/kubernetes.io~projected/kube-api-access-r4stz:{mountpoint:/var/lib/kubelet/pods/164586b1-f133-4427-8ab6-eb0839b79738/volumes/kubernetes.io~projected/kube-api-access-r4stz major:0 minor:139 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/164586b1-f133-4427-8ab6-eb0839b79738/volumes/kubernetes.io~secret/webhook-cert:{mountpoint:/var/lib/kubelet/pods/164586b1-f133-4427-8ab6-eb0839b79738/volumes/kubernetes.io~secret/webhook-cert major:0 minor:138 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1cbcb403-a424-4496-8c5c-5eb5e42dfb93/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/1cbcb403-a424-4496-8c5c-5eb5e42dfb93/volumes/kubernetes.io~projected/kube-api-access major:0 minor:244 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1cbcb403-a424-4496-8c5c-5eb5e42dfb93/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/1cbcb403-a424-4496-8c5c-5eb5e42dfb93/volumes/kubernetes.io~secret/serving-cert major:0 minor:220 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1eb851be-f157-48ea-9a39-1361b68d2639/volumes/kubernetes.io~projected/kube-api-access-nqhzl:{mountpoint:/var/lib/kubelet/pods/1eb851be-f157-48ea-9a39-1361b68d2639/volumes/kubernetes.io~projected/kube-api-access-nqhzl major:0 minor:246 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/232c421d-96f0-4894-b8d8-74f43d02bbd3/volumes/kubernetes.io~projected/kube-api-access-fx4fw:{mountpoint:/var/lib/kubelet/pods/232c421d-96f0-4894-b8d8-74f43d02bbd3/volumes/kubernetes.io~projected/kube-api-access-fx4fw major:0 minor:238 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/26180f77-0b1a-4d0f-9ed0-a12fdee69817/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/26180f77-0b1a-4d0f-9ed0-a12fdee69817/volumes/kubernetes.io~projected/kube-api-access major:0 minor:253 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/26180f77-0b1a-4d0f-9ed0-a12fdee69817/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/26180f77-0b1a-4d0f-9ed0-a12fdee69817/volumes/kubernetes.io~secret/serving-cert major:0 minor:221 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/2dd4279d-a1a9-450a-a061-9008cd1ea8e0/volumes/kubernetes.io~projected/kube-api-access-pnzt7:{mountpoint:/var/lib/kubelet/pods/2dd4279d-a1a9-450a-a061-9008cd1ea8e0/volumes/kubernetes.io~projected/kube-api-access-pnzt7 major:0 minor:233 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/30211469-7108-4820-a988-26fc4ced734e/volumes/kubernetes.io~projected/kube-api-access-fncng:{mountpoint:/var/lib/kubelet/pods/30211469-7108-4820-a988-26fc4ced734e/volumes/kubernetes.io~projected/kube-api-access-fncng major:0 minor:229 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/30211469-7108-4820-a988-26fc4ced734e/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/30211469-7108-4820-a988-26fc4ced734e/volumes/kubernetes.io~secret/serving-cert major:0 minor:218 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/349d438d-d124-4d34-a172-4160e766c680/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/349d438d-d124-4d34-a172-4160e766c680/volumes/kubernetes.io~projected/kube-api-access major:0 minor:91 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3ddfd0e7-fe76-41bc-b316-94505df81002/volumes/kubernetes.io~projected/kube-api-access-bgc7c:{mountpoint:/var/lib/kubelet/pods/3ddfd0e7-fe76-41bc-b316-94505df81002/volumes/kubernetes.io~projected/kube-api-access-bgc7c major:0 minor:92 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3ddfd0e7-fe76-41bc-b316-94505df81002/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/3ddfd0e7-fe76-41bc-b316-94505df81002/volumes/kubernetes.io~secret/metrics-tls major:0 minor:43 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4a19441e-e61b-4d58-85db-813ae88e1f9b/volumes/kubernetes.io~projected/kube-api-access-dw7bx:{mountpoint:/var/lib/kubelet/pods/4a19441e-e61b-4d58-85db-813ae88e1f9b/volumes/kubernetes.io~projected/kube-api-access-dw7bx major:0 minor:118 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/4c5a0c1d-867a-4ce4-9570-ea66452c8db3/volumes/kubernetes.io~projected/kube-api-access-mkzb2:{mountpoint:/var/lib/kubelet/pods/4c5a0c1d-867a-4ce4-9570-ea66452c8db3/volumes/kubernetes.io~projected/kube-api-access-mkzb2 major:0 minor:256 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/52b495ac-bb28-44f3-b925-3c54f86d5ec4/volumes/kubernetes.io~projected/kube-api-access-dd549:{mountpoint:/var/lib/kubelet/pods/52b495ac-bb28-44f3-b925-3c54f86d5ec4/volumes/kubernetes.io~projected/kube-api-access-dd549 major:0 minor:235 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/54ad284e-d40e-4e69-b898-f5093952a0e6/volumes/kubernetes.io~projected/kube-api-access-9lfcj:{mountpoint:/var/lib/kubelet/pods/54ad284e-d40e-4e69-b898-f5093952a0e6/volumes/kubernetes.io~projected/kube-api-access-9lfcj major:0 minor:242 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a/volumes/kubernetes.io~projected/kube-api-access-cfvnn:{mountpoint:/var/lib/kubelet/pods/5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a/volumes/kubernetes.io~projected/kube-api-access-cfvnn major:0 minor:255 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a/volumes/kubernetes.io~secret/serving-cert major:0 minor:214 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5a7752f9-7b9a-451f-997a-e9f696d38b34/volumes/kubernetes.io~projected/kube-api-access-8b5zb:{mountpoint:/var/lib/kubelet/pods/5a7752f9-7b9a-451f-997a-e9f696d38b34/volumes/kubernetes.io~projected/kube-api-access-8b5zb major:0 minor:219 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5a7752f9-7b9a-451f-997a-e9f696d38b34/volumes/kubernetes.io~secret/etcd-client:{mountpoint:/var/lib/kubelet/pods/5a7752f9-7b9a-451f-997a-e9f696d38b34/volumes/kubernetes.io~secret/etcd-client major:0 minor:209 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/5a7752f9-7b9a-451f-997a-e9f696d38b34/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/5a7752f9-7b9a-451f-997a-e9f696d38b34/volumes/kubernetes.io~secret/serving-cert major:0 minor:213 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f/volumes/kubernetes.io~projected/bound-sa-token:{mountpoint:/var/lib/kubelet/pods/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f/volumes/kubernetes.io~projected/bound-sa-token major:0 minor:247 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f/volumes/kubernetes.io~projected/kube-api-access-fw7mr:{mountpoint:/var/lib/kubelet/pods/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f/volumes/kubernetes.io~projected/kube-api-access-fw7mr major:0 minor:231 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6cde5024-edf7-4fa4-8964-cabe7899578b/volumes/kubernetes.io~projected/kube-api-access-x997v:{mountpoint:/var/lib/kubelet/pods/6cde5024-edf7-4fa4-8964-cabe7899578b/volumes/kubernetes.io~projected/kube-api-access-x997v major:0 minor:237 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7e5935ea-8d95-45e3-b836-c7892953ef3d/volumes/kubernetes.io~projected/kube-api-access-c6gml:{mountpoint:/var/lib/kubelet/pods/7e5935ea-8d95-45e3-b836-c7892953ef3d/volumes/kubernetes.io~projected/kube-api-access-c6gml major:0 minor:125 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7e5935ea-8d95-45e3-b836-c7892953ef3d/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert:{mountpoint:/var/lib/kubelet/pods/7e5935ea-8d95-45e3-b836-c7892953ef3d/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert major:0 minor:124 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7ff63c73-62a3-44b4-acd3-1b3df175794f/volumes/kubernetes.io~projected/kube-api-access-vfqc5:{mountpoint:/var/lib/kubelet/pods/7ff63c73-62a3-44b4-acd3-1b3df175794f/volumes/kubernetes.io~projected/kube-api-access-vfqc5 major:0 minor:239 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/7ff63c73-62a3-44b4-acd3-1b3df175794f/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert:{mountpoint:/var/lib/kubelet/pods/7ff63c73-62a3-44b4-acd3-1b3df175794f/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert major:0 minor:222 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8efdcef9-9b31-4567-b7f9-cb59a894273d/volumes/kubernetes.io~projected/kube-api-access-cpsx7:{mountpoint:/var/lib/kubelet/pods/8efdcef9-9b31-4567-b7f9-cb59a894273d/volumes/kubernetes.io~projected/kube-api-access-cpsx7 major:0 minor:254 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a60bc804-52e7-422a-87fd-ac4c5aa90cb3/volumes/kubernetes.io~projected/kube-api-access-zxkm6:{mountpoint:/var/lib/kubelet/pods/a60bc804-52e7-422a-87fd-ac4c5aa90cb3/volumes/kubernetes.io~projected/kube-api-access-zxkm6 major:0 minor:230 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a60bc804-52e7-422a-87fd-ac4c5aa90cb3/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/a60bc804-52e7-422a-87fd-ac4c5aa90cb3/volumes/kubernetes.io~secret/serving-cert major:0 minor:217 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b3eea925-73b3-4693-8f0e-6dd26107f60a/volumes/kubernetes.io~projected/kube-api-access-6sx5s:{mountpoint:/var/lib/kubelet/pods/b3eea925-73b3-4693-8f0e-6dd26107f60a/volumes/kubernetes.io~projected/kube-api-access-6sx5s major:0 minor:234 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b3eea925-73b3-4693-8f0e-6dd26107f60a/volumes/kubernetes.io~secret/cluster-storage-operator-serving-cert:{mountpoint:/var/lib/kubelet/pods/b3eea925-73b3-4693-8f0e-6dd26107f60a/volumes/kubernetes.io~secret/cluster-storage-operator-serving-cert major:0 minor:224 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c9de4939-680a-4e3e-89fd-e20ecb8b10f2/volumes/kubernetes.io~projected/bound-sa-token:{mountpoint:/var/lib/kubelet/pods/c9de4939-680a-4e3e-89fd-e20ecb8b10f2/volumes/kubernetes.io~projected/bound-sa-token major:0 minor:250 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/c9de4939-680a-4e3e-89fd-e20ecb8b10f2/volumes/kubernetes.io~projected/kube-api-access-29dpg:{mountpoint:/var/lib/kubelet/pods/c9de4939-680a-4e3e-89fd-e20ecb8b10f2/volumes/kubernetes.io~projected/kube-api-access-29dpg major:0 minor:248 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d5044ffd-0686-4679-9894-e696faf33699/volumes/kubernetes.io~projected/kube-api-access-mmhtb:{mountpoint:/var/lib/kubelet/pods/d5044ffd-0686-4679-9894-e696faf33699/volumes/kubernetes.io~projected/kube-api-access-mmhtb major:0 minor:123 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d831cb23-7411-4072-8273-c167d9afca28/volumes/kubernetes.io~projected/kube-api-access-dwkwt:{mountpoint:/var/lib/kubelet/pods/d831cb23-7411-4072-8273-c167d9afca28/volumes/kubernetes.io~projected/kube-api-access-dwkwt major:0 minor:236 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e4541b7b-3f7f-4851-9bd9-26fcda5cab13/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/e4541b7b-3f7f-4851-9bd9-26fcda5cab13/volumes/kubernetes.io~projected/kube-api-access major:0 minor:240 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e4541b7b-3f7f-4851-9bd9-26fcda5cab13/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/e4541b7b-3f7f-4851-9bd9-26fcda5cab13/volumes/kubernetes.io~secret/serving-cert major:0 minor:216 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e78b283b-981e-48d7-a5f2-53f8401766ea/volumes/kubernetes.io~projected/kube-api-access-rchj5:{mountpoint:/var/lib/kubelet/pods/e78b283b-981e-48d7-a5f2-53f8401766ea/volumes/kubernetes.io~projected/kube-api-access-rchj5 major:0 minor:251 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ee586416-6f56-4ea4-ad62-95de1e6df23b/volumes/kubernetes.io~projected/kube-api-access-sxxhh:{mountpoint:/var/lib/kubelet/pods/ee586416-6f56-4ea4-ad62-95de1e6df23b/volumes/kubernetes.io~projected/kube-api-access-sxxhh major:0 minor:232 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/ee586416-6f56-4ea4-ad62-95de1e6df23b/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/ee586416-6f56-4ea4-ad62-95de1e6df23b/volumes/kubernetes.io~secret/serving-cert major:0 minor:215 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f0f5f3f3-0856-4da3-9157-15f65c6aba6e/volume-subpaths/run-systemd/ovnkube-controller/6:{mountpoint:/var/lib/kubelet/pods/f0f5f3f3-0856-4da3-9157-15f65c6aba6e/volume-subpaths/run-systemd/ovnkube-controller/6 major:0 minor:24 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f0f5f3f3-0856-4da3-9157-15f65c6aba6e/volumes/kubernetes.io~projected/kube-api-access-2vklx:{mountpoint:/var/lib/kubelet/pods/f0f5f3f3-0856-4da3-9157-15f65c6aba6e/volumes/kubernetes.io~projected/kube-api-access-2vklx major:0 minor:127 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f0f5f3f3-0856-4da3-9157-15f65c6aba6e/volumes/kubernetes.io~secret/ovn-node-metrics-cert:{mountpoint:/var/lib/kubelet/pods/f0f5f3f3-0856-4da3-9157-15f65c6aba6e/volumes/kubernetes.io~secret/ovn-node-metrics-cert major:0 minor:126 fsType:tmpfs blockSize:0} overlay_0-102:{mountpoint:/var/lib/containers/storage/overlay/159f70f0295d276199db67e711ddadd5fa50e846ee6c064a9f037adbe03c5960/merged major:0 minor:102 fsType:overlay blockSize:0} overlay_0-110:{mountpoint:/var/lib/containers/storage/overlay/597fed34171f657578a1d5557d8cd13d46b059079217e54d7ddeaedef3dc97a8/merged major:0 minor:110 fsType:overlay blockSize:0} overlay_0-116:{mountpoint:/var/lib/containers/storage/overlay/1b06806633b7877a750d9205fc7919bf2454ea5da4d239059bb94a650c396de9/merged major:0 minor:116 fsType:overlay blockSize:0} overlay_0-121:{mountpoint:/var/lib/containers/storage/overlay/f4bcba9ea7a2a47a2ec69ef757a6ece5bd493a6f40bb501912affa429fc33f84/merged major:0 minor:121 fsType:overlay blockSize:0} overlay_0-132:{mountpoint:/var/lib/containers/storage/overlay/5770c755fb57991d02f24c9c55b95e329ffdc777a197559deca6c85b2066f1e6/merged major:0 minor:132 fsType:overlay blockSize:0} 
overlay_0-134:{mountpoint:/var/lib/containers/storage/overlay/39c37fc286cbfdda810328109d5c626e28cbbea3f0c88e91d5056e30b6a6badf/merged major:0 minor:134 fsType:overlay blockSize:0} overlay_0-136:{mountpoint:/var/lib/containers/storage/overlay/8c2bc3b43f7c74fc829c708cfc407eb7d0becab6bcfc35aef288288139d72e16/merged major:0 minor:136 fsType:overlay blockSize:0} overlay_0-140:{mountpoint:/var/lib/containers/storage/overlay/ddb1448d1eb6712ee303b77c4ae081aff6e6d861313662abd10d8c9c6f60efcf/merged major:0 minor:140 fsType:overlay blockSize:0} overlay_0-150:{mountpoint:/var/lib/containers/storage/overlay/f89a5c80aa1c00850dfa9a29cae8433e43c2a966c88d623e77d0949c12d0b069/merged major:0 minor:150 fsType:overlay blockSize:0} overlay_0-152:{mountpoint:/var/lib/containers/storage/overlay/8a3b37e48224698e7a5aaf8e61fb6beb92dab875c519e7b8e4435d4d2f692716/merged major:0 minor:152 fsType:overlay blockSize:0} overlay_0-154:{mountpoint:/var/lib/containers/storage/overlay/02379eac95f47e8c9ea5b9e16c79376ac402c23c179884ea808d0aa19eec1cb7/merged major:0 minor:154 fsType:overlay blockSize:0} overlay_0-156:{mountpoint:/var/lib/containers/storage/overlay/31caee417ab3b8d280b1b0c7324a1d44a15dfd162a934df61bce89bfb4b1322f/merged major:0 minor:156 fsType:overlay blockSize:0} overlay_0-158:{mountpoint:/var/lib/containers/storage/overlay/126f320739db4f221784c36cb60b60aa19ababc3f76609e8a185043dcc806152/merged major:0 minor:158 fsType:overlay blockSize:0} overlay_0-159:{mountpoint:/var/lib/containers/storage/overlay/871523128e2c0557743aa69d441527d5c16812b9c373dbabf4dc8a4daf386022/merged major:0 minor:159 fsType:overlay blockSize:0} overlay_0-174:{mountpoint:/var/lib/containers/storage/overlay/cbd801d2a28c3f00fe101072390e7e2d5657c4a2fe48742e7960360869a21ca5/merged major:0 minor:174 fsType:overlay blockSize:0} overlay_0-179:{mountpoint:/var/lib/containers/storage/overlay/6eb1fbc6279a817180f1d34052d7f45b3694f592e5bc718b726fa23d8458ab4e/merged major:0 minor:179 fsType:overlay blockSize:0} 
overlay_0-184:{mountpoint:/var/lib/containers/storage/overlay/584d394e662410989c23408aee166d692041db1664ed67d7e4510dc6d136e9c8/merged major:0 minor:184 fsType:overlay blockSize:0} overlay_0-189:{mountpoint:/var/lib/containers/storage/overlay/28c79e3444b687b213817f4d6d7d0a36990b7c23e1e607c84f2b01b7cc09d655/merged major:0 minor:189 fsType:overlay blockSize:0} overlay_0-194:{mountpoint:/var/lib/containers/storage/overlay/61f59d1cb37f5b7067be0bbf8ff1a728f10ca3e8daddec97c07531154aeb42f4/merged major:0 minor:194 fsType:overlay blockSize:0} overlay_0-195:{mountpoint:/var/lib/containers/storage/overlay/78eaff81c4fedc9d73da1bcd07ee623dfc5d1416466b9577dd75c31b9479d4ed/merged major:0 minor:195 fsType:overlay blockSize:0} overlay_0-204:{mountpoint:/var/lib/containers/storage/overlay/202561aa60da5c15aaa47e5826d80ab313db28b1800b634c2baae69b4a2d7437/merged major:0 minor:204 fsType:overlay blockSize:0} overlay_0-261:{mountpoint:/var/lib/containers/storage/overlay/6291218680ab4337ece6a9e70cfaef9e0a7d7a472c5f31ad52212bcc84310ad6/merged major:0 minor:261 fsType:overlay blockSize:0} overlay_0-266:{mountpoint:/var/lib/containers/storage/overlay/f93bdc2936ca0c10c0d8056847178328412dc5e4a74a09cf60c44f11383d8691/merged major:0 minor:266 fsType:overlay blockSize:0} overlay_0-277:{mountpoint:/var/lib/containers/storage/overlay/40d944ab7563914f374aa3c8068539483fbb6b24296502a2c688d1c14199db6f/merged major:0 minor:277 fsType:overlay blockSize:0} overlay_0-291:{mountpoint:/var/lib/containers/storage/overlay/d57159928d5faa6c29b4ed8265a1460cf6a81432463a4a5bd8181f8ed5329d37/merged major:0 minor:291 fsType:overlay blockSize:0} overlay_0-293:{mountpoint:/var/lib/containers/storage/overlay/1c436f9c07c25cf14919f8e86213eb0b5e5f71577655d9feaaa9e15540b80466/merged major:0 minor:293 fsType:overlay blockSize:0} overlay_0-295:{mountpoint:/var/lib/containers/storage/overlay/d4692cd1ab9db5084d70e40c2bb93ca90f631359549979c56ce8f57dd476e6cb/merged major:0 minor:295 fsType:overlay blockSize:0} 
overlay_0-297:{mountpoint:/var/lib/containers/storage/overlay/3eccf431b3b03d19597cab5c7720e96141321facc11df4acaf9aeae5eb5cff4b/merged major:0 minor:297 fsType:overlay blockSize:0} overlay_0-299:{mountpoint:/var/lib/containers/storage/overlay/a252e23963211d9542968848cf9d48cc012fbf09b292403264d2a45234d9c4a1/merged major:0 minor:299 fsType:overlay blockSize:0} overlay_0-301:{mountpoint:/var/lib/containers/storage/overlay/1e9b0d7ae87d25c638e4b7cc94c220d55da06f12e8d29a756b350cbc4278b711/merged major:0 minor:301 fsType:overlay blockSize:0} overlay_0-303:{mountpoint:/var/lib/containers/storage/overlay/b73acd11c36d8766e21280e8b7c707c1b4428243d800d7829619af3d643b787b/merged major:0 minor:303 fsType:overlay blockSize:0} overlay_0-305:{mountpoint:/var/lib/containers/storage/overlay/72b7f9c6a6eba9b6512a84c976d33d8807fc575a3d7d5f37d198ac73e29a5925/merged major:0 minor:305 fsType:overlay blockSize:0} overlay_0-307:{mountpoint:/var/lib/containers/storage/overlay/30b8cdc9c1a2def5dc5cad19ba2593e4edb09ee01e41374fafcd612baf67b667/merged major:0 minor:307 fsType:overlay blockSize:0} overlay_0-309:{mountpoint:/var/lib/containers/storage/overlay/6ba9c7d52dc9297679ca3cfa4cb0bf600312c377c92e9fe18f05b36f29678454/merged major:0 minor:309 fsType:overlay blockSize:0} overlay_0-311:{mountpoint:/var/lib/containers/storage/overlay/9de694dedd8cf20a1e634df7e411e9d1989995a6dd528c269076d3816f28e386/merged major:0 minor:311 fsType:overlay blockSize:0} overlay_0-313:{mountpoint:/var/lib/containers/storage/overlay/1871147ec5d62518aec484a24b49ffcf6277fdced02212b869f2837411ff1a9e/merged major:0 minor:313 fsType:overlay blockSize:0} overlay_0-315:{mountpoint:/var/lib/containers/storage/overlay/fd74e22116d6063def16ea49c2763e244563c8c947965c5b40c256759d772ee9/merged major:0 minor:315 fsType:overlay blockSize:0} overlay_0-44:{mountpoint:/var/lib/containers/storage/overlay/fc618a0235caaad11133db99a4b849d974263caadaa5c2b628e24398d3b9ec59/merged major:0 minor:44 fsType:overlay blockSize:0} 
overlay_0-48:{mountpoint:/var/lib/containers/storage/overlay/98845493bc8490985915016f847103cc994a6a38f31689164a603f8a8df1f904/merged major:0 minor:48 fsType:overlay blockSize:0} overlay_0-52:{mountpoint:/var/lib/containers/storage/overlay/a890d0ffef46362decea31b67811f8a87249fe5fec0237c25fa51a883a8ab49f/merged major:0 minor:52 fsType:overlay blockSize:0} overlay_0-56:{mountpoint:/var/lib/containers/storage/overlay/a9705d96efeb0203a4da164445e0df679f76bfcd977843e6c552cd914d1a3e54/merged major:0 minor:56 fsType:overlay blockSize:0} overlay_0-60:{mountpoint:/var/lib/containers/storage/overlay/c19927ea024b3d21dedb1f69f70259228eeb02364b41205f08a4f92e75218e44/merged major:0 minor:60 fsType:overlay blockSize:0} overlay_0-62:{mountpoint:/var/lib/containers/storage/overlay/7a88c09bb3ab66cc5fb4a16cdf76155403406e3b457db6e7820e8a66ec439f82/merged major:0 minor:62 fsType:overlay blockSize:0} overlay_0-64:{mountpoint:/var/lib/containers/storage/overlay/652550bca01ef7d0804b4747a9e02d0b6dfe6d7a66fd54a4995283e1b07c1d7b/merged major:0 minor:64 fsType:overlay blockSize:0} overlay_0-75:{mountpoint:/var/lib/containers/storage/overlay/73f4a3db77996182b3ff38cee6ea653df1fa2d1a8f52c82669107e39b00ae5db/merged major:0 minor:75 fsType:overlay blockSize:0} overlay_0-76:{mountpoint:/var/lib/containers/storage/overlay/791e5f89a0bf3e623dc2f69f6366ac318e0580618e6a936dcbca9cc7acbc4b07/merged major:0 minor:76 fsType:overlay blockSize:0} overlay_0-78:{mountpoint:/var/lib/containers/storage/overlay/88ecb00b80a8a848a874fbd9b9030a1c2ebda42956f4d136c148171b5c4d027d/merged major:0 minor:78 fsType:overlay blockSize:0} overlay_0-81:{mountpoint:/var/lib/containers/storage/overlay/a73bc0ce5d537beb3388abe790e36a6a1a6f32b9beddd993f79a655c19a8521e/merged major:0 minor:81 fsType:overlay blockSize:0} overlay_0-89:{mountpoint:/var/lib/containers/storage/overlay/d79c6f845f20c535e9fe2502dc7891d859becc81980f13cfc3342c5698635df0/merged major:0 minor:89 fsType:overlay blockSize:0} 
overlay_0-99:{mountpoint:/var/lib/containers/storage/overlay/a9b1825459af37974fba5d6143a8a8f78044f6b3378513a8886388da43da8bd4/merged major:0 minor:99 fsType:overlay blockSize:0}] Mar 08 03:47:17.174621 master-0 kubenswrapper[7547]: I0308 03:47:17.173410 7547 manager.go:217] Machine: {Timestamp:2026-03-08 03:47:17.171962121 +0000 UTC m=+0.117646654 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:713fb7c44cd644b5986e15157751dddb SystemUUID:713fb7c4-4cd6-44b5-986e-15157751dddb BootID:30e60e76-0e70-41ea-99da-7a4dcafd0e32 Filesystems:[{Device:/var/lib/kubelet/pods/349d438d-d124-4d34-a172-4160e766c680/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:91 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-195 DeviceMajor:0 DeviceMinor:195 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/0d377285-0336-41b7-b48f-c44a7b563498/volumes/kubernetes.io~projected/kube-api-access-7qn5v DeviceMajor:0 DeviceMinor:249 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-266 DeviceMajor:0 DeviceMinor:266 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-313 DeviceMajor:0 DeviceMinor:313 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-102 DeviceMajor:0 DeviceMinor:102 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/6cde5024-edf7-4fa4-8964-cabe7899578b/volumes/kubernetes.io~projected/kube-api-access-x997v DeviceMajor:0 DeviceMinor:237 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f/volumes/kubernetes.io~projected/bound-sa-token 
DeviceMajor:0 DeviceMinor:247 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/b3eea925-73b3-4693-8f0e-6dd26107f60a/volumes/kubernetes.io~secret/cluster-storage-operator-serving-cert DeviceMajor:0 DeviceMinor:224 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/9b135e9cc968b9e23fda104dcc5dd8cbf50632e21d670c61642446eb2eb45282/userdata/shm DeviceMajor:0 DeviceMinor:263 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/adfc23f0d784d89240f88962b8f79cdf84a79077cb7581e94d0e19b479eeafaa/userdata/shm DeviceMajor:0 DeviceMinor:283 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-301 DeviceMajor:0 DeviceMinor:301 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/6588c21791f0b9fd7a866ced5165aad3ddf504a15e8585434bc4836ba3395293/userdata/shm DeviceMajor:0 DeviceMinor:142 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/f0f5f3f3-0856-4da3-9157-15f65c6aba6e/volumes/kubernetes.io~projected/kube-api-access-2vklx DeviceMajor:0 DeviceMinor:127 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:214 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/e4541b7b-3f7f-4851-9bd9-26fcda5cab13/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:216 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/2dd4279d-a1a9-450a-a061-9008cd1ea8e0/volumes/kubernetes.io~projected/kube-api-access-pnzt7 DeviceMajor:0 DeviceMinor:233 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/var/lib/kubelet/pods/5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a/volumes/kubernetes.io~projected/kube-api-access-cfvnn DeviceMajor:0 DeviceMinor:255 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/133c0043d3a977b4007520994c1530f26391f82433e16ae8b2e991aa2092980b/userdata/shm DeviceMajor:0 DeviceMinor:285 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/604b5e18b0f1fc95cb4cabd9d6cb088bfcead3c4cba52acd70685e03b5856c7f/userdata/shm DeviceMajor:0 DeviceMinor:45 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-64 DeviceMajor:0 DeviceMinor:64 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/3ddfd0e7-fe76-41bc-b316-94505df81002/volumes/kubernetes.io~projected/kube-api-access-bgc7c DeviceMajor:0 DeviceMinor:92 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/c9de4939-680a-4e3e-89fd-e20ecb8b10f2/volumes/kubernetes.io~projected/kube-api-access-29dpg DeviceMajor:0 DeviceMinor:248 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/c9de4939-680a-4e3e-89fd-e20ecb8b10f2/volumes/kubernetes.io~projected/bound-sa-token DeviceMajor:0 DeviceMinor:250 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-132 DeviceMajor:0 DeviceMinor:132 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/d73b051671cc575452964e4ec7abae8ed2cf8ae1de2a3be5460a27e068329e94/userdata/shm DeviceMajor:0 DeviceMinor:257 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/2899b4e2a1cabd8aea96b1bf0db490c7e98f0e9564c40236186985f7b516039b/userdata/shm DeviceMajor:0 DeviceMinor:269 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/3235d3bd9c5f6c6a7e16ad74c79046e87f4d03278e4096c568a5930f544fbbf0/userdata/shm DeviceMajor:0 DeviceMinor:227 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/232c421d-96f0-4894-b8d8-74f43d02bbd3/volumes/kubernetes.io~projected/kube-api-access-fx4fw DeviceMajor:0 DeviceMinor:238 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-295 DeviceMajor:0 DeviceMinor:295 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/5b8c31076d1db49fd8c133661fbbc131a58892112131cf3118f58212505e7460/userdata/shm DeviceMajor:0 DeviceMinor:50 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-76 DeviceMajor:0 DeviceMinor:76 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-110 DeviceMajor:0 DeviceMinor:110 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-154 DeviceMajor:0 DeviceMinor:154 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/26180f77-0b1a-4d0f-9ed0-a12fdee69817/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:221 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/7ff63c73-62a3-44b4-acd3-1b3df175794f/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert DeviceMajor:0 DeviceMinor:222 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/0918ba32-8e55-48d0-8e50-027c0dcb4bbd/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:223 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/7ff63c73-62a3-44b4-acd3-1b3df175794f/volumes/kubernetes.io~projected/kube-api-access-vfqc5 DeviceMajor:0 DeviceMinor:239 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-99 DeviceMajor:0 DeviceMinor:99 
Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/d5044ffd-0686-4679-9894-e696faf33699/volumes/kubernetes.io~projected/kube-api-access-mmhtb DeviceMajor:0 DeviceMinor:123 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/c5cc16f26a63d054e0857f2a2f1278a7512a2a20bea66d9521aa218fb1539d3c/userdata/shm DeviceMajor:0 DeviceMinor:128 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-140 DeviceMajor:0 DeviceMinor:140 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-150 DeviceMajor:0 DeviceMinor:150 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/4482916b3b4b521cf75927dd45a05e0a2072a49de37c125a72612ca885ff96ce/userdata/shm DeviceMajor:0 DeviceMinor:279 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-315 DeviceMajor:0 DeviceMinor:315 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-297 DeviceMajor:0 DeviceMinor:297 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/var/lib/kubelet/pods/7e5935ea-8d95-45e3-b836-c7892953ef3d/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert DeviceMajor:0 DeviceMinor:124 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-156 DeviceMajor:0 DeviceMinor:156 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/0418ff42-7eac-4266-97b5-4df88623d066/volumes/kubernetes.io~projected/kube-api-access-kmpdd DeviceMajor:0 DeviceMinor:245 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/df9f6505570d879efae2662d6149a2ae417f35b1bed956f7339c92d857b81707/userdata/shm DeviceMajor:0 DeviceMinor:281 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-291 DeviceMajor:0 DeviceMinor:291 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-305 DeviceMajor:0 DeviceMinor:305 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-204 DeviceMajor:0 DeviceMinor:204 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/1cbcb403-a424-4496-8c5c-5eb5e42dfb93/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:220 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/var/lib/kubelet/pods/164586b1-f133-4427-8ab6-eb0839b79738/volumes/kubernetes.io~secret/webhook-cert DeviceMajor:0 DeviceMinor:138 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-194 DeviceMajor:0 DeviceMinor:194 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-158 DeviceMajor:0 DeviceMinor:158 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-179 DeviceMajor:0 DeviceMinor:179 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/f0f5f3f3-0856-4da3-9157-15f65c6aba6e/volumes/kubernetes.io~secret/ovn-node-metrics-cert DeviceMajor:0 DeviceMinor:126 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/1cbcb403-a424-4496-8c5c-5eb5e42dfb93/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:244 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/a5455725a1362a8e870442eb2f0235fbea46c1d047d2183683f1ca346ec9c059/userdata/shm DeviceMajor:0 DeviceMinor:289 
Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-44 DeviceMajor:0 DeviceMinor:44 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-48 DeviceMajor:0 DeviceMinor:48 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-62 DeviceMajor:0 DeviceMinor:62 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-78 DeviceMajor:0 DeviceMinor:78 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/093f17f0-2818-4e24-b3c3-6ab4da9d21fb/volumes/kubernetes.io~projected/kube-api-access-7nk8r DeviceMajor:0 DeviceMinor:105 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/47433e6e63affa1ba02608e11b299ca5af00d1c85e6731e35f43a4b241522538/userdata/shm DeviceMajor:0 DeviceMinor:286 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-307 DeviceMajor:0 DeviceMinor:307 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/ec83c044c04d6837d5d5f7d4c71e74473794e6ee1e718df488cf45a934fcc03a/userdata/shm DeviceMajor:0 DeviceMinor:119 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-159 DeviceMajor:0 DeviceMinor:159 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-184 DeviceMajor:0 DeviceMinor:184 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/8efdcef9-9b31-4567-b7f9-cb59a894273d/volumes/kubernetes.io~projected/kube-api-access-cpsx7 DeviceMajor:0 DeviceMinor:254 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/4c5a0c1d-867a-4ce4-9570-ea66452c8db3/volumes/kubernetes.io~projected/kube-api-access-mkzb2 DeviceMajor:0 DeviceMinor:256 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-116 DeviceMajor:0 DeviceMinor:116 
Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-136 DeviceMajor:0 DeviceMinor:136 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/1eb851be-f157-48ea-9a39-1361b68d2639/volumes/kubernetes.io~projected/kube-api-access-nqhzl DeviceMajor:0 DeviceMinor:246 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/4a19441e-e61b-4d58-85db-813ae88e1f9b/volumes/kubernetes.io~projected/kube-api-access-dw7bx DeviceMajor:0 DeviceMinor:118 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/e8733f46dd1d2647e586c0cc9b5a4ebea38d695f856a8c74190015b70d99a33e/userdata/shm DeviceMajor:0 DeviceMinor:129 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/2a8e902cd252f0c879e3e1c00047d04c3e8646bfeed72f034a41537b464f6d14/userdata/shm DeviceMajor:0 DeviceMinor:273 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-293 DeviceMajor:0 DeviceMinor:293 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/f0f5f3f3-0856-4da3-9157-15f65c6aba6e/volume-subpaths/run-systemd/ovnkube-controller/6 DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/var/lib/kubelet/pods/0d377285-0336-41b7-b48f-c44a7b563498/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:225 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/758a2c2e2af7455b02804a595f36886f4047114b8dbd25a8393a292e35b7254e/userdata/shm DeviceMajor:0 DeviceMinor:58 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-89 DeviceMajor:0 DeviceMinor:89 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/7e5935ea-8d95-45e3-b836-c7892953ef3d/volumes/kubernetes.io~projected/kube-api-access-c6gml DeviceMajor:0 DeviceMinor:125 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/164586b1-f133-4427-8ab6-eb0839b79738/volumes/kubernetes.io~projected/kube-api-access-r4stz DeviceMajor:0 DeviceMinor:139 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-174 DeviceMajor:0 DeviceMinor:174 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/3ddfd0e7-fe76-41bc-b316-94505df81002/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:43 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/30211469-7108-4820-a988-26fc4ced734e/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:218 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f/volumes/kubernetes.io~projected/kube-api-access-fw7mr DeviceMajor:0 DeviceMinor:231 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/ee586416-6f56-4ea4-ad62-95de1e6df23b/volumes/kubernetes.io~projected/kube-api-access-sxxhh DeviceMajor:0 DeviceMinor:232 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-277 DeviceMajor:0 DeviceMinor:277 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:overlay_0-60 DeviceMajor:0 DeviceMinor:60 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/d47d14f256ba67306efe8da7bbcadc67f946b747f7e0a1d658a9687f1f0a1a37/userdata/shm DeviceMajor:0 DeviceMinor:275 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-152 DeviceMajor:0 DeviceMinor:152 Capacity:214143315968 Type:vfs 
Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/ee586416-6f56-4ea4-ad62-95de1e6df23b/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:215 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/e4541b7b-3f7f-4851-9bd9-26fcda5cab13/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:240 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/54ad284e-d40e-4e69-b898-f5093952a0e6/volumes/kubernetes.io~projected/kube-api-access-9lfcj DeviceMajor:0 DeviceMinor:242 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/26180f77-0b1a-4d0f-9ed0-a12fdee69817/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:253 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-52 DeviceMajor:0 DeviceMinor:52 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-81 DeviceMajor:0 DeviceMinor:81 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/a60bc804-52e7-422a-87fd-ac4c5aa90cb3/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:217 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/0918ba32-8e55-48d0-8e50-027c0dcb4bbd/volumes/kubernetes.io~projected/kube-api-access-mghmh DeviceMajor:0 DeviceMinor:252 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-261 DeviceMajor:0 DeviceMinor:261 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/3d64c10d51d4d9009da402a9f2c51b81830f1695b7370548200097f367d254f2/userdata/shm DeviceMajor:0 DeviceMinor:265 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/49aec4971047b96e14ae56703fe099426b567477422c0add4be258e7ae9b7ff1/userdata/shm DeviceMajor:0 DeviceMinor:259 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-311 DeviceMajor:0 DeviceMinor:311 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/1482d789-884b-4337-b598-f0e2b71eb9f2/volumes/kubernetes.io~projected/kube-api-access-m2h62 DeviceMajor:0 DeviceMinor:243 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-75 DeviceMajor:0 DeviceMinor:75 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/0ec1fcf833bb575029f4371f595adf3e92b6ae14914f83458d311cb85210d774/userdata/shm DeviceMajor:0 DeviceMinor:114 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-134 DeviceMajor:0 DeviceMinor:134 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/d831cb23-7411-4072-8273-c167d9afca28/volumes/kubernetes.io~projected/kube-api-access-dwkwt DeviceMajor:0 DeviceMinor:236 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/0ebf1330-e044-4ff5-8b48-2d667e0c5625/volumes/kubernetes.io~projected/kube-api-access-hccv4 DeviceMajor:0 DeviceMinor:241 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/5a7752f9-7b9a-451f-997a-e9f696d38b34/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:213 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/0ebf1330-e044-4ff5-8b48-2d667e0c5625/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:226 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/14fe5cb6383f1129ecf327e882bdb7904f8ad1a8a2cc2647d9ee96534b6ccb93/userdata/shm DeviceMajor:0 DeviceMinor:271 Capacity:67108864 
Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-56 DeviceMajor:0 DeviceMinor:56 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-121 DeviceMajor:0 DeviceMinor:121 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/30211469-7108-4820-a988-26fc4ced734e/volumes/kubernetes.io~projected/kube-api-access-fncng DeviceMajor:0 DeviceMinor:229 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/e78b283b-981e-48d7-a5f2-53f8401766ea/volumes/kubernetes.io~projected/kube-api-access-rchj5 DeviceMajor:0 DeviceMinor:251 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/5ffd7e8cf7a9593e9910a67c41b6e95af26b8d49eaf5fd007129fe49d1978425/userdata/shm DeviceMajor:0 DeviceMinor:54 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/22cc81f0c9d90fe64f682c3bbb7bbcefc904c4ee2c036d7eedf6b66887f69fae/userdata/shm DeviceMajor:0 DeviceMinor:100 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/5a7752f9-7b9a-451f-997a-e9f696d38b34/volumes/kubernetes.io~projected/kube-api-access-8b5zb DeviceMajor:0 DeviceMinor:219 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/b3eea925-73b3-4693-8f0e-6dd26107f60a/volumes/kubernetes.io~projected/kube-api-access-6sx5s DeviceMajor:0 DeviceMinor:234 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/5a7752f9-7b9a-451f-997a-e9f696d38b34/volumes/kubernetes.io~secret/etcd-client DeviceMajor:0 DeviceMinor:209 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/52b495ac-bb28-44f3-b925-3c54f86d5ec4/volumes/kubernetes.io~projected/kube-api-access-dd549 DeviceMajor:0 DeviceMinor:235 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-303 DeviceMajor:0 
DeviceMinor:303 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-309 DeviceMajor:0 DeviceMinor:309 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/ed400b0e1b21fe5e4ef5385a05444bf39db4c2fd9c754a3d6c45427d3b29ef99/userdata/shm DeviceMajor:0 DeviceMinor:41 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-189 DeviceMajor:0 DeviceMinor:189 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/a60bc804-52e7-422a-87fd-ac4c5aa90cb3/volumes/kubernetes.io~projected/kube-api-access-zxkm6 DeviceMajor:0 DeviceMinor:230 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-299 DeviceMajor:0 DeviceMinor:299 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none} 252:16:{Name:vdb Major:252 Minor:16 Size:21474836480 Scheduler:none} 252:32:{Name:vdc Major:252 Minor:32 Size:21474836480 Scheduler:none} 252:48:{Name:vdd Major:252 Minor:48 Size:21474836480 Scheduler:none} 252:64:{Name:vde Major:252 Minor:64 Size:21474836480 Scheduler:none}] NetworkDevices:[{Name:133c0043d3a977b MacAddress:1e:9d:27:74:5d:53 Speed:10000 Mtu:8900} {Name:14fe5cb6383f112 MacAddress:12:f3:4c:5a:09:d9 Speed:10000 Mtu:8900} {Name:2899b4e2a1cabd8 MacAddress:1a:a3:47:17:ef:9e Speed:10000 Mtu:8900} {Name:2a8e902cd252f0c MacAddress:e2:86:f7:13:51:1d Speed:10000 Mtu:8900} {Name:3235d3bd9c5f6c6 MacAddress:8a:18:be:f7:22:2c Speed:10000 Mtu:8900} {Name:3d64c10d51d4d90 MacAddress:46:60:8c:b2:fd:67 Speed:10000 Mtu:8900} {Name:4482916b3b4b521 MacAddress:5a:26:02:e7:e7:73 Speed:10000 Mtu:8900} {Name:47433e6e63affa1 MacAddress:96:97:7c:d3:d9:94 Speed:10000 Mtu:8900} {Name:9b135e9cc968b9e MacAddress:ee:eb:82:5c:16:d8 Speed:10000 Mtu:8900} {Name:a5455725a1362a8 MacAddress:26:eb:eb:0e:4a:4a Speed:10000 Mtu:8900} {Name:adfc23f0d784d89 
MacAddress:22:f3:90:39:e7:5d Speed:10000 Mtu:8900} {Name:br-ex MacAddress:fa:16:9e:81:f6:10 Speed:0 Mtu:9000} {Name:br-int MacAddress:0e:c0:e1:46:be:d0 Speed:0 Mtu:8900} {Name:d47d14f256ba673 MacAddress:f6:22:f1:06:e6:47 Speed:10000 Mtu:8900} {Name:d73b051671cc575 MacAddress:2e:11:69:61:1c:62 Speed:10000 Mtu:8900} {Name:df9f6505570d879 MacAddress:7a:62:3f:f3:9f:50 Speed:10000 Mtu:8900} {Name:eth0 MacAddress:fa:16:9e:81:f6:10 Speed:-1 Mtu:9000} {Name:eth1 MacAddress:fa:16:3e:26:03:3b Speed:-1 Mtu:9000} {Name:eth2 MacAddress:fa:16:3e:21:09:89 Speed:-1 Mtu:9000} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:80:00:02 Speed:0 Mtu:8900} {Name:ovs-system MacAddress:06:d0:49:23:c0:ac Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 
Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 08 03:47:17.174621 master-0 kubenswrapper[7547]: I0308 03:47:17.174602 7547 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Mar 08 03:47:17.175173 master-0 kubenswrapper[7547]: I0308 03:47:17.174938 7547 manager.go:233] Version: {KernelVersion:5.14.0-427.111.1.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202602172219-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 08 03:47:17.175424 master-0 kubenswrapper[7547]: I0308 03:47:17.175399 7547 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 08 03:47:17.175657 master-0 kubenswrapper[7547]: I0308 03:47:17.175542 7547 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 08 03:47:17.175927 master-0 kubenswrapper[7547]: I0308 03:47:17.175629 7547 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"master-0","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percenta
ge":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 08 03:47:17.176024 master-0 kubenswrapper[7547]: I0308 03:47:17.175947 7547 topology_manager.go:138] "Creating topology manager with none policy" Mar 08 03:47:17.176024 master-0 kubenswrapper[7547]: I0308 03:47:17.175959 7547 container_manager_linux.go:303] "Creating device plugin manager" Mar 08 03:47:17.176024 master-0 kubenswrapper[7547]: I0308 03:47:17.175968 7547 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 08 03:47:17.176024 master-0 kubenswrapper[7547]: I0308 03:47:17.175985 7547 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 08 03:47:17.176218 master-0 kubenswrapper[7547]: I0308 03:47:17.176167 7547 state_mem.go:36] "Initialized new in-memory state store" Mar 08 03:47:17.176257 master-0 kubenswrapper[7547]: I0308 03:47:17.176247 7547 server.go:1245] "Using root directory" path="/var/lib/kubelet" Mar 08 03:47:17.176317 master-0 kubenswrapper[7547]: I0308 03:47:17.176302 7547 kubelet.go:418] "Attempting to sync node with API server" Mar 08 03:47:17.176355 master-0 kubenswrapper[7547]: I0308 03:47:17.176317 7547 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 08 03:47:17.176355 master-0 kubenswrapper[7547]: I0308 03:47:17.176333 7547 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 08 03:47:17.176355 master-0 kubenswrapper[7547]: I0308 03:47:17.176345 7547 kubelet.go:324] "Adding apiserver pod source" Mar 08 03:47:17.176536 master-0 
kubenswrapper[7547]: I0308 03:47:17.176361 7547 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 08 03:47:17.178379 master-0 kubenswrapper[7547]: I0308 03:47:17.177903 7547 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.13-8.rhaos4.18.gitd78977c.el9" apiVersion="v1" Mar 08 03:47:17.178379 master-0 kubenswrapper[7547]: I0308 03:47:17.178082 7547 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Mar 08 03:47:17.178621 master-0 kubenswrapper[7547]: I0308 03:47:17.178562 7547 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 08 03:47:17.178746 master-0 kubenswrapper[7547]: I0308 03:47:17.178713 7547 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 08 03:47:17.178746 master-0 kubenswrapper[7547]: I0308 03:47:17.178740 7547 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 08 03:47:17.178903 master-0 kubenswrapper[7547]: I0308 03:47:17.178749 7547 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 08 03:47:17.178903 master-0 kubenswrapper[7547]: I0308 03:47:17.178758 7547 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 08 03:47:17.178903 master-0 kubenswrapper[7547]: I0308 03:47:17.178767 7547 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 08 03:47:17.178903 master-0 kubenswrapper[7547]: I0308 03:47:17.178775 7547 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 08 03:47:17.178903 master-0 kubenswrapper[7547]: I0308 03:47:17.178783 7547 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 08 03:47:17.178903 master-0 kubenswrapper[7547]: I0308 03:47:17.178791 7547 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 08 03:47:17.178903 master-0 
kubenswrapper[7547]: I0308 03:47:17.178802 7547 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 08 03:47:17.178903 master-0 kubenswrapper[7547]: I0308 03:47:17.178811 7547 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 08 03:47:17.178903 master-0 kubenswrapper[7547]: I0308 03:47:17.178837 7547 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 08 03:47:17.178903 master-0 kubenswrapper[7547]: I0308 03:47:17.178852 7547 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 08 03:47:17.178903 master-0 kubenswrapper[7547]: I0308 03:47:17.178878 7547 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Mar 08 03:47:17.179489 master-0 kubenswrapper[7547]: I0308 03:47:17.179236 7547 server.go:1280] "Started kubelet" Mar 08 03:47:17.180881 master-0 systemd[1]: Started Kubernetes Kubelet. Mar 08 03:47:17.182072 master-0 kubenswrapper[7547]: I0308 03:47:17.179770 7547 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 08 03:47:17.184834 master-0 kubenswrapper[7547]: I0308 03:47:17.181433 7547 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 08 03:47:17.184834 master-0 kubenswrapper[7547]: I0308 03:47:17.184661 7547 server_v1.go:47] "podresources" method="list" useActivePods=true Mar 08 03:47:17.185633 master-0 kubenswrapper[7547]: I0308 03:47:17.185599 7547 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 08 03:47:17.187866 master-0 kubenswrapper[7547]: I0308 03:47:17.187660 7547 server.go:449] "Adding debug handlers to kubelet server" Mar 08 03:47:17.189103 master-0 kubenswrapper[7547]: I0308 03:47:17.188153 7547 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 08 03:47:17.189103 master-0 kubenswrapper[7547]: I0308 03:47:17.188185 7547 
fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 08 03:47:17.189103 master-0 kubenswrapper[7547]: I0308 03:47:17.188229 7547 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-03-09 03:37:12 +0000 UTC, rotation deadline is 2026-03-09 00:36:58.234199168 +0000 UTC Mar 08 03:47:17.189103 master-0 kubenswrapper[7547]: I0308 03:47:17.188285 7547 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 20h49m41.045919383s for next certificate rotation Mar 08 03:47:17.189103 master-0 kubenswrapper[7547]: E0308 03:47:17.188362 7547 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 08 03:47:17.189103 master-0 kubenswrapper[7547]: I0308 03:47:17.188409 7547 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 08 03:47:17.189103 master-0 kubenswrapper[7547]: I0308 03:47:17.188416 7547 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 08 03:47:17.189103 master-0 kubenswrapper[7547]: I0308 03:47:17.188505 7547 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Mar 08 03:47:17.189429 master-0 kubenswrapper[7547]: I0308 03:47:17.189241 7547 factory.go:55] Registering systemd factory Mar 08 03:47:17.189429 master-0 kubenswrapper[7547]: I0308 03:47:17.189261 7547 factory.go:221] Registration of the systemd container factory successfully Mar 08 03:47:17.190641 master-0 kubenswrapper[7547]: I0308 03:47:17.189544 7547 factory.go:153] Registering CRI-O factory Mar 08 03:47:17.190641 master-0 kubenswrapper[7547]: I0308 03:47:17.189564 7547 factory.go:221] Registration of the crio container factory successfully Mar 08 03:47:17.190641 master-0 kubenswrapper[7547]: I0308 03:47:17.189639 7547 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no 
such file or directory Mar 08 03:47:17.190641 master-0 kubenswrapper[7547]: I0308 03:47:17.189667 7547 factory.go:103] Registering Raw factory Mar 08 03:47:17.190641 master-0 kubenswrapper[7547]: I0308 03:47:17.189683 7547 manager.go:1196] Started watching for new ooms in manager Mar 08 03:47:17.190641 master-0 kubenswrapper[7547]: I0308 03:47:17.190209 7547 manager.go:319] Starting recovery of all containers Mar 08 03:47:17.193272 master-0 kubenswrapper[7547]: I0308 03:47:17.193146 7547 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 08 03:47:17.193869 master-0 kubenswrapper[7547]: I0308 03:47:17.193348 7547 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 08 03:47:17.196044 master-0 kubenswrapper[7547]: I0308 03:47:17.194241 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0ebf1330-e044-4ff5-8b48-2d667e0c5625" volumeName="kubernetes.io/configmap/0ebf1330-e044-4ff5-8b48-2d667e0c5625-config" seLinuxMountContext="" Mar 08 03:47:17.196150 master-0 kubenswrapper[7547]: I0308 03:47:17.196100 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0ebf1330-e044-4ff5-8b48-2d667e0c5625" volumeName="kubernetes.io/projected/0ebf1330-e044-4ff5-8b48-2d667e0c5625-kube-api-access-hccv4" seLinuxMountContext="" Mar 08 03:47:17.196203 master-0 kubenswrapper[7547]: I0308 03:47:17.196166 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="349d438d-d124-4d34-a172-4160e766c680" volumeName="kubernetes.io/projected/349d438d-d124-4d34-a172-4160e766c680-kube-api-access" seLinuxMountContext="" Mar 08 03:47:17.196203 master-0 kubenswrapper[7547]: I0308 03:47:17.196196 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a" 
volumeName="kubernetes.io/configmap/5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a-config" seLinuxMountContext="" Mar 08 03:47:17.196283 master-0 kubenswrapper[7547]: I0308 03:47:17.196220 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a" volumeName="kubernetes.io/secret/5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a-serving-cert" seLinuxMountContext="" Mar 08 03:47:17.196283 master-0 kubenswrapper[7547]: I0308 03:47:17.196265 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5a7752f9-7b9a-451f-997a-e9f696d38b34" volumeName="kubernetes.io/projected/5a7752f9-7b9a-451f-997a-e9f696d38b34-kube-api-access-8b5zb" seLinuxMountContext="" Mar 08 03:47:17.196393 master-0 kubenswrapper[7547]: I0308 03:47:17.196296 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f0f5f3f3-0856-4da3-9157-15f65c6aba6e" volumeName="kubernetes.io/configmap/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-ovnkube-config" seLinuxMountContext="" Mar 08 03:47:17.196510 master-0 kubenswrapper[7547]: I0308 03:47:17.196395 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2dd4279d-a1a9-450a-a061-9008cd1ea8e0" volumeName="kubernetes.io/projected/2dd4279d-a1a9-450a-a061-9008cd1ea8e0-kube-api-access-pnzt7" seLinuxMountContext="" Mar 08 03:47:17.196510 master-0 kubenswrapper[7547]: I0308 03:47:17.196467 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e4541b7b-3f7f-4851-9bd9-26fcda5cab13" volumeName="kubernetes.io/projected/e4541b7b-3f7f-4851-9bd9-26fcda5cab13-kube-api-access" seLinuxMountContext="" Mar 08 03:47:17.196510 master-0 kubenswrapper[7547]: I0308 03:47:17.196493 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0918ba32-8e55-48d0-8e50-027c0dcb4bbd" 
volumeName="kubernetes.io/empty-dir/0918ba32-8e55-48d0-8e50-027c0dcb4bbd-available-featuregates" seLinuxMountContext="" Mar 08 03:47:17.196628 master-0 kubenswrapper[7547]: I0308 03:47:17.196537 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1cbcb403-a424-4496-8c5c-5eb5e42dfb93" volumeName="kubernetes.io/configmap/1cbcb403-a424-4496-8c5c-5eb5e42dfb93-config" seLinuxMountContext="" Mar 08 03:47:17.196628 master-0 kubenswrapper[7547]: I0308 03:47:17.196575 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7ff63c73-62a3-44b4-acd3-1b3df175794f" volumeName="kubernetes.io/empty-dir/7ff63c73-62a3-44b4-acd3-1b3df175794f-operand-assets" seLinuxMountContext="" Mar 08 03:47:17.196628 master-0 kubenswrapper[7547]: I0308 03:47:17.196565 7547 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 08 03:47:17.196628 master-0 kubenswrapper[7547]: I0308 03:47:17.196621 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a60bc804-52e7-422a-87fd-ac4c5aa90cb3" volumeName="kubernetes.io/configmap/a60bc804-52e7-422a-87fd-ac4c5aa90cb3-service-ca-bundle" seLinuxMountContext="" Mar 08 03:47:17.196734 master-0 kubenswrapper[7547]: I0308 03:47:17.196653 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1cbcb403-a424-4496-8c5c-5eb5e42dfb93" volumeName="kubernetes.io/projected/1cbcb403-a424-4496-8c5c-5eb5e42dfb93-kube-api-access" seLinuxMountContext="" Mar 08 03:47:17.196734 master-0 kubenswrapper[7547]: I0308 03:47:17.196684 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7ff63c73-62a3-44b4-acd3-1b3df175794f" volumeName="kubernetes.io/secret/7ff63c73-62a3-44b4-acd3-1b3df175794f-cluster-olm-operator-serving-cert" seLinuxMountContext="" Mar 08 03:47:17.196734 master-0 
kubenswrapper[7547]: I0308 03:47:17.196706 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b3eea925-73b3-4693-8f0e-6dd26107f60a" volumeName="kubernetes.io/secret/b3eea925-73b3-4693-8f0e-6dd26107f60a-cluster-storage-operator-serving-cert" seLinuxMountContext="" Mar 08 03:47:17.196812 master-0 kubenswrapper[7547]: I0308 03:47:17.196753 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0418ff42-7eac-4266-97b5-4df88623d066" volumeName="kubernetes.io/configmap/0418ff42-7eac-4266-97b5-4df88623d066-telemetry-config" seLinuxMountContext="" Mar 08 03:47:17.196812 master-0 kubenswrapper[7547]: I0308 03:47:17.196776 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0918ba32-8e55-48d0-8e50-027c0dcb4bbd" volumeName="kubernetes.io/projected/0918ba32-8e55-48d0-8e50-027c0dcb4bbd-kube-api-access-mghmh" seLinuxMountContext="" Mar 08 03:47:17.196812 master-0 kubenswrapper[7547]: I0308 03:47:17.196797 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1eb851be-f157-48ea-9a39-1361b68d2639" volumeName="kubernetes.io/projected/1eb851be-f157-48ea-9a39-1361b68d2639-kube-api-access-nqhzl" seLinuxMountContext="" Mar 08 03:47:17.196933 master-0 kubenswrapper[7547]: I0308 03:47:17.196855 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a" volumeName="kubernetes.io/projected/5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a-kube-api-access-cfvnn" seLinuxMountContext="" Mar 08 03:47:17.196969 master-0 kubenswrapper[7547]: I0308 03:47:17.196945 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="69eb8ba2-7bfb-4433-8951-08f89e7bcb5f" volumeName="kubernetes.io/configmap/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f-trusted-ca" seLinuxMountContext="" Mar 08 
03:47:17.196999 master-0 kubenswrapper[7547]: I0308 03:47:17.196980 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e4541b7b-3f7f-4851-9bd9-26fcda5cab13" volumeName="kubernetes.io/secret/e4541b7b-3f7f-4851-9bd9-26fcda5cab13-serving-cert" seLinuxMountContext="" Mar 08 03:47:17.197035 master-0 kubenswrapper[7547]: I0308 03:47:17.197004 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="232c421d-96f0-4894-b8d8-74f43d02bbd3" volumeName="kubernetes.io/configmap/232c421d-96f0-4894-b8d8-74f43d02bbd3-trusted-ca" seLinuxMountContext="" Mar 08 03:47:17.197035 master-0 kubenswrapper[7547]: I0308 03:47:17.197024 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5a7752f9-7b9a-451f-997a-e9f696d38b34" volumeName="kubernetes.io/configmap/5a7752f9-7b9a-451f-997a-e9f696d38b34-etcd-service-ca" seLinuxMountContext="" Mar 08 03:47:17.197090 master-0 kubenswrapper[7547]: I0308 03:47:17.197050 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5a7752f9-7b9a-451f-997a-e9f696d38b34" volumeName="kubernetes.io/configmap/5a7752f9-7b9a-451f-997a-e9f696d38b34-config" seLinuxMountContext="" Mar 08 03:47:17.197090 master-0 kubenswrapper[7547]: I0308 03:47:17.197073 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c9de4939-680a-4e3e-89fd-e20ecb8b10f2" volumeName="kubernetes.io/projected/c9de4939-680a-4e3e-89fd-e20ecb8b10f2-bound-sa-token" seLinuxMountContext="" Mar 08 03:47:17.197142 master-0 kubenswrapper[7547]: I0308 03:47:17.197125 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f0f5f3f3-0856-4da3-9157-15f65c6aba6e" volumeName="kubernetes.io/configmap/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-env-overrides" seLinuxMountContext="" Mar 08 03:47:17.197172 master-0 
kubenswrapper[7547]: I0308 03:47:17.197158 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f0f5f3f3-0856-4da3-9157-15f65c6aba6e" volumeName="kubernetes.io/projected/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-kube-api-access-2vklx" seLinuxMountContext="" Mar 08 03:47:17.197200 master-0 kubenswrapper[7547]: I0308 03:47:17.197180 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0918ba32-8e55-48d0-8e50-027c0dcb4bbd" volumeName="kubernetes.io/secret/0918ba32-8e55-48d0-8e50-027c0dcb4bbd-serving-cert" seLinuxMountContext="" Mar 08 03:47:17.197231 master-0 kubenswrapper[7547]: I0308 03:47:17.197203 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="164586b1-f133-4427-8ab6-eb0839b79738" volumeName="kubernetes.io/secret/164586b1-f133-4427-8ab6-eb0839b79738-webhook-cert" seLinuxMountContext="" Mar 08 03:47:17.197258 master-0 kubenswrapper[7547]: I0308 03:47:17.197231 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="26180f77-0b1a-4d0f-9ed0-a12fdee69817" volumeName="kubernetes.io/projected/26180f77-0b1a-4d0f-9ed0-a12fdee69817-kube-api-access" seLinuxMountContext="" Mar 08 03:47:17.197286 master-0 kubenswrapper[7547]: I0308 03:47:17.197254 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a60bc804-52e7-422a-87fd-ac4c5aa90cb3" volumeName="kubernetes.io/configmap/a60bc804-52e7-422a-87fd-ac4c5aa90cb3-config" seLinuxMountContext="" Mar 08 03:47:17.197286 master-0 kubenswrapper[7547]: I0308 03:47:17.197280 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a60bc804-52e7-422a-87fd-ac4c5aa90cb3" volumeName="kubernetes.io/projected/a60bc804-52e7-422a-87fd-ac4c5aa90cb3-kube-api-access-zxkm6" seLinuxMountContext="" Mar 08 03:47:17.197341 master-0 kubenswrapper[7547]: 
I0308 03:47:17.197302 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e4541b7b-3f7f-4851-9bd9-26fcda5cab13" volumeName="kubernetes.io/configmap/e4541b7b-3f7f-4851-9bd9-26fcda5cab13-config" seLinuxMountContext="" Mar 08 03:47:17.197341 master-0 kubenswrapper[7547]: I0308 03:47:17.197324 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="26180f77-0b1a-4d0f-9ed0-a12fdee69817" volumeName="kubernetes.io/secret/26180f77-0b1a-4d0f-9ed0-a12fdee69817-serving-cert" seLinuxMountContext="" Mar 08 03:47:17.197396 master-0 kubenswrapper[7547]: I0308 03:47:17.197352 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4a19441e-e61b-4d58-85db-813ae88e1f9b" volumeName="kubernetes.io/configmap/4a19441e-e61b-4d58-85db-813ae88e1f9b-whereabouts-configmap" seLinuxMountContext="" Mar 08 03:47:17.197396 master-0 kubenswrapper[7547]: I0308 03:47:17.197374 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="52b495ac-bb28-44f3-b925-3c54f86d5ec4" volumeName="kubernetes.io/projected/52b495ac-bb28-44f3-b925-3c54f86d5ec4-kube-api-access-dd549" seLinuxMountContext="" Mar 08 03:47:17.197449 master-0 kubenswrapper[7547]: I0308 03:47:17.197399 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="69eb8ba2-7bfb-4433-8951-08f89e7bcb5f" volumeName="kubernetes.io/projected/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f-kube-api-access-fw7mr" seLinuxMountContext="" Mar 08 03:47:17.197449 master-0 kubenswrapper[7547]: I0308 03:47:17.197420 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a60bc804-52e7-422a-87fd-ac4c5aa90cb3" volumeName="kubernetes.io/secret/a60bc804-52e7-422a-87fd-ac4c5aa90cb3-serving-cert" seLinuxMountContext="" Mar 08 03:47:17.197449 master-0 kubenswrapper[7547]: I0308 
03:47:17.197440 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c9de4939-680a-4e3e-89fd-e20ecb8b10f2" volumeName="kubernetes.io/projected/c9de4939-680a-4e3e-89fd-e20ecb8b10f2-kube-api-access-29dpg" seLinuxMountContext="" Mar 08 03:47:17.197527 master-0 kubenswrapper[7547]: I0308 03:47:17.197469 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d831cb23-7411-4072-8273-c167d9afca28" volumeName="kubernetes.io/configmap/d831cb23-7411-4072-8273-c167d9afca28-images" seLinuxMountContext="" Mar 08 03:47:17.197527 master-0 kubenswrapper[7547]: I0308 03:47:17.197502 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ee586416-6f56-4ea4-ad62-95de1e6df23b" volumeName="kubernetes.io/empty-dir/ee586416-6f56-4ea4-ad62-95de1e6df23b-snapshots" seLinuxMountContext="" Mar 08 03:47:17.197581 master-0 kubenswrapper[7547]: I0308 03:47:17.197541 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0d377285-0336-41b7-b48f-c44a7b563498" volumeName="kubernetes.io/projected/0d377285-0336-41b7-b48f-c44a7b563498-kube-api-access-7qn5v" seLinuxMountContext="" Mar 08 03:47:17.197613 master-0 kubenswrapper[7547]: I0308 03:47:17.197572 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0d377285-0336-41b7-b48f-c44a7b563498" volumeName="kubernetes.io/secret/0d377285-0336-41b7-b48f-c44a7b563498-serving-cert" seLinuxMountContext="" Mar 08 03:47:17.197613 master-0 kubenswrapper[7547]: I0308 03:47:17.197606 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="164586b1-f133-4427-8ab6-eb0839b79738" volumeName="kubernetes.io/configmap/164586b1-f133-4427-8ab6-eb0839b79738-ovnkube-identity-cm" seLinuxMountContext="" Mar 08 03:47:17.197666 master-0 kubenswrapper[7547]: I0308 03:47:17.197641 
7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7e5935ea-8d95-45e3-b836-c7892953ef3d" volumeName="kubernetes.io/configmap/7e5935ea-8d95-45e3-b836-c7892953ef3d-ovnkube-config" seLinuxMountContext="" Mar 08 03:47:17.197699 master-0 kubenswrapper[7547]: I0308 03:47:17.197667 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7e5935ea-8d95-45e3-b836-c7892953ef3d" volumeName="kubernetes.io/projected/7e5935ea-8d95-45e3-b836-c7892953ef3d-kube-api-access-c6gml" seLinuxMountContext="" Mar 08 03:47:17.197699 master-0 kubenswrapper[7547]: I0308 03:47:17.197688 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7ff63c73-62a3-44b4-acd3-1b3df175794f" volumeName="kubernetes.io/projected/7ff63c73-62a3-44b4-acd3-1b3df175794f-kube-api-access-vfqc5" seLinuxMountContext="" Mar 08 03:47:17.197754 master-0 kubenswrapper[7547]: I0308 03:47:17.197716 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="164586b1-f133-4427-8ab6-eb0839b79738" volumeName="kubernetes.io/configmap/164586b1-f133-4427-8ab6-eb0839b79738-env-overrides" seLinuxMountContext="" Mar 08 03:47:17.197754 master-0 kubenswrapper[7547]: I0308 03:47:17.197739 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4a19441e-e61b-4d58-85db-813ae88e1f9b" volumeName="kubernetes.io/configmap/4a19441e-e61b-4d58-85db-813ae88e1f9b-cni-sysctl-allowlist" seLinuxMountContext="" Mar 08 03:47:17.197803 master-0 kubenswrapper[7547]: I0308 03:47:17.197765 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7e5935ea-8d95-45e3-b836-c7892953ef3d" volumeName="kubernetes.io/secret/7e5935ea-8d95-45e3-b836-c7892953ef3d-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 08 03:47:17.197803 master-0 kubenswrapper[7547]: I0308 
03:47:17.197785 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a60bc804-52e7-422a-87fd-ac4c5aa90cb3" volumeName="kubernetes.io/configmap/a60bc804-52e7-422a-87fd-ac4c5aa90cb3-trusted-ca-bundle" seLinuxMountContext="" Mar 08 03:47:17.197890 master-0 kubenswrapper[7547]: I0308 03:47:17.197853 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="164586b1-f133-4427-8ab6-eb0839b79738" volumeName="kubernetes.io/projected/164586b1-f133-4427-8ab6-eb0839b79738-kube-api-access-r4stz" seLinuxMountContext="" Mar 08 03:47:17.197890 master-0 kubenswrapper[7547]: I0308 03:47:17.197885 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="30211469-7108-4820-a988-26fc4ced734e" volumeName="kubernetes.io/secret/30211469-7108-4820-a988-26fc4ced734e-serving-cert" seLinuxMountContext="" Mar 08 03:47:17.197969 master-0 kubenswrapper[7547]: I0308 03:47:17.197916 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f0f5f3f3-0856-4da3-9157-15f65c6aba6e" volumeName="kubernetes.io/configmap/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-ovnkube-script-lib" seLinuxMountContext="" Mar 08 03:47:17.197969 master-0 kubenswrapper[7547]: I0308 03:47:17.197954 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="54ad284e-d40e-4e69-b898-f5093952a0e6" volumeName="kubernetes.io/projected/54ad284e-d40e-4e69-b898-f5093952a0e6-kube-api-access-9lfcj" seLinuxMountContext="" Mar 08 03:47:17.198042 master-0 kubenswrapper[7547]: I0308 03:47:17.197978 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0418ff42-7eac-4266-97b5-4df88623d066" volumeName="kubernetes.io/projected/0418ff42-7eac-4266-97b5-4df88623d066-kube-api-access-kmpdd" seLinuxMountContext="" Mar 08 03:47:17.198042 master-0 kubenswrapper[7547]: 
I0308 03:47:17.198004 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="093f17f0-2818-4e24-b3c3-6ab4da9d21fb" volumeName="kubernetes.io/configmap/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-multus-daemon-config" seLinuxMountContext="" Mar 08 03:47:17.198042 master-0 kubenswrapper[7547]: I0308 03:47:17.198026 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="093f17f0-2818-4e24-b3c3-6ab4da9d21fb" volumeName="kubernetes.io/configmap/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-cni-binary-copy" seLinuxMountContext="" Mar 08 03:47:17.198125 master-0 kubenswrapper[7547]: I0308 03:47:17.198055 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1cbcb403-a424-4496-8c5c-5eb5e42dfb93" volumeName="kubernetes.io/secret/1cbcb403-a424-4496-8c5c-5eb5e42dfb93-serving-cert" seLinuxMountContext="" Mar 08 03:47:17.198125 master-0 kubenswrapper[7547]: I0308 03:47:17.198074 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="232c421d-96f0-4894-b8d8-74f43d02bbd3" volumeName="kubernetes.io/projected/232c421d-96f0-4894-b8d8-74f43d02bbd3-kube-api-access-fx4fw" seLinuxMountContext="" Mar 08 03:47:17.198125 master-0 kubenswrapper[7547]: I0308 03:47:17.198094 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="349d438d-d124-4d34-a172-4160e766c680" volumeName="kubernetes.io/configmap/349d438d-d124-4d34-a172-4160e766c680-service-ca" seLinuxMountContext="" Mar 08 03:47:17.198205 master-0 kubenswrapper[7547]: I0308 03:47:17.198139 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4a19441e-e61b-4d58-85db-813ae88e1f9b" volumeName="kubernetes.io/projected/4a19441e-e61b-4d58-85db-813ae88e1f9b-kube-api-access-dw7bx" seLinuxMountContext="" Mar 08 03:47:17.198205 master-0 kubenswrapper[7547]: I0308 
03:47:17.198161 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b3eea925-73b3-4693-8f0e-6dd26107f60a" volumeName="kubernetes.io/projected/b3eea925-73b3-4693-8f0e-6dd26107f60a-kube-api-access-6sx5s" seLinuxMountContext="" Mar 08 03:47:17.198295 master-0 kubenswrapper[7547]: I0308 03:47:17.198211 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d831cb23-7411-4072-8273-c167d9afca28" volumeName="kubernetes.io/projected/d831cb23-7411-4072-8273-c167d9afca28-kube-api-access-dwkwt" seLinuxMountContext="" Mar 08 03:47:17.198295 master-0 kubenswrapper[7547]: I0308 03:47:17.198270 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e78b283b-981e-48d7-a5f2-53f8401766ea" volumeName="kubernetes.io/configmap/e78b283b-981e-48d7-a5f2-53f8401766ea-images" seLinuxMountContext="" Mar 08 03:47:17.198295 master-0 kubenswrapper[7547]: I0308 03:47:17.198289 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ee586416-6f56-4ea4-ad62-95de1e6df23b" volumeName="kubernetes.io/configmap/ee586416-6f56-4ea4-ad62-95de1e6df23b-service-ca-bundle" seLinuxMountContext="" Mar 08 03:47:17.198493 master-0 kubenswrapper[7547]: I0308 03:47:17.198316 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ee586416-6f56-4ea4-ad62-95de1e6df23b" volumeName="kubernetes.io/configmap/ee586416-6f56-4ea4-ad62-95de1e6df23b-trusted-ca-bundle" seLinuxMountContext="" Mar 08 03:47:17.198493 master-0 kubenswrapper[7547]: I0308 03:47:17.198333 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0d377285-0336-41b7-b48f-c44a7b563498" volumeName="kubernetes.io/configmap/0d377285-0336-41b7-b48f-c44a7b563498-config" seLinuxMountContext="" Mar 08 03:47:17.198493 master-0 kubenswrapper[7547]: I0308 03:47:17.198374 
7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0ebf1330-e044-4ff5-8b48-2d667e0c5625" volumeName="kubernetes.io/secret/0ebf1330-e044-4ff5-8b48-2d667e0c5625-serving-cert" seLinuxMountContext="" Mar 08 03:47:17.198493 master-0 kubenswrapper[7547]: I0308 03:47:17.198389 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="30211469-7108-4820-a988-26fc4ced734e" volumeName="kubernetes.io/configmap/30211469-7108-4820-a988-26fc4ced734e-config" seLinuxMountContext="" Mar 08 03:47:17.198493 master-0 kubenswrapper[7547]: I0308 03:47:17.198404 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4c5a0c1d-867a-4ce4-9570-ea66452c8db3" volumeName="kubernetes.io/projected/4c5a0c1d-867a-4ce4-9570-ea66452c8db3-kube-api-access-mkzb2" seLinuxMountContext="" Mar 08 03:47:17.198702 master-0 kubenswrapper[7547]: I0308 03:47:17.198422 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="54ad284e-d40e-4e69-b898-f5093952a0e6" volumeName="kubernetes.io/configmap/54ad284e-d40e-4e69-b898-f5093952a0e6-marketplace-trusted-ca" seLinuxMountContext="" Mar 08 03:47:17.198702 master-0 kubenswrapper[7547]: I0308 03:47:17.198603 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e78b283b-981e-48d7-a5f2-53f8401766ea" volumeName="kubernetes.io/projected/e78b283b-981e-48d7-a5f2-53f8401766ea-kube-api-access-rchj5" seLinuxMountContext="" Mar 08 03:47:17.198702 master-0 kubenswrapper[7547]: I0308 03:47:17.198624 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6cde5024-edf7-4fa4-8964-cabe7899578b" volumeName="kubernetes.io/projected/6cde5024-edf7-4fa4-8964-cabe7899578b-kube-api-access-x997v" seLinuxMountContext="" Mar 08 03:47:17.198702 master-0 kubenswrapper[7547]: I0308 03:47:17.198649 
7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="093f17f0-2818-4e24-b3c3-6ab4da9d21fb" volumeName="kubernetes.io/projected/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-kube-api-access-7nk8r" seLinuxMountContext="" Mar 08 03:47:17.198702 master-0 kubenswrapper[7547]: I0308 03:47:17.198663 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ddfd0e7-fe76-41bc-b316-94505df81002" volumeName="kubernetes.io/projected/3ddfd0e7-fe76-41bc-b316-94505df81002-kube-api-access-bgc7c" seLinuxMountContext="" Mar 08 03:47:17.198702 master-0 kubenswrapper[7547]: I0308 03:47:17.198681 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ddfd0e7-fe76-41bc-b316-94505df81002" volumeName="kubernetes.io/secret/3ddfd0e7-fe76-41bc-b316-94505df81002-metrics-tls" seLinuxMountContext="" Mar 08 03:47:17.198702 master-0 kubenswrapper[7547]: I0308 03:47:17.198695 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4a19441e-e61b-4d58-85db-813ae88e1f9b" volumeName="kubernetes.io/configmap/4a19441e-e61b-4d58-85db-813ae88e1f9b-cni-binary-copy" seLinuxMountContext="" Mar 08 03:47:17.198702 master-0 kubenswrapper[7547]: I0308 03:47:17.198710 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4c5a0c1d-867a-4ce4-9570-ea66452c8db3" volumeName="kubernetes.io/configmap/4c5a0c1d-867a-4ce4-9570-ea66452c8db3-iptables-alerter-script" seLinuxMountContext="" Mar 08 03:47:17.199257 master-0 kubenswrapper[7547]: I0308 03:47:17.198726 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5a7752f9-7b9a-451f-997a-e9f696d38b34" volumeName="kubernetes.io/secret/5a7752f9-7b9a-451f-997a-e9f696d38b34-serving-cert" seLinuxMountContext="" Mar 08 03:47:17.199257 master-0 kubenswrapper[7547]: I0308 03:47:17.198738 7547 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="69eb8ba2-7bfb-4433-8951-08f89e7bcb5f" volumeName="kubernetes.io/projected/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f-bound-sa-token" seLinuxMountContext="" Mar 08 03:47:17.199257 master-0 kubenswrapper[7547]: I0308 03:47:17.198891 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7e5935ea-8d95-45e3-b836-c7892953ef3d" volumeName="kubernetes.io/configmap/7e5935ea-8d95-45e3-b836-c7892953ef3d-env-overrides" seLinuxMountContext="" Mar 08 03:47:17.199257 master-0 kubenswrapper[7547]: I0308 03:47:17.198912 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8efdcef9-9b31-4567-b7f9-cb59a894273d" volumeName="kubernetes.io/projected/8efdcef9-9b31-4567-b7f9-cb59a894273d-kube-api-access-cpsx7" seLinuxMountContext="" Mar 08 03:47:17.199257 master-0 kubenswrapper[7547]: I0308 03:47:17.198925 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c9de4939-680a-4e3e-89fd-e20ecb8b10f2" volumeName="kubernetes.io/configmap/c9de4939-680a-4e3e-89fd-e20ecb8b10f2-trusted-ca" seLinuxMountContext="" Mar 08 03:47:17.199257 master-0 kubenswrapper[7547]: I0308 03:47:17.198942 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d831cb23-7411-4072-8273-c167d9afca28" volumeName="kubernetes.io/configmap/d831cb23-7411-4072-8273-c167d9afca28-config" seLinuxMountContext="" Mar 08 03:47:17.199257 master-0 kubenswrapper[7547]: I0308 03:47:17.198952 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ee586416-6f56-4ea4-ad62-95de1e6df23b" volumeName="kubernetes.io/projected/ee586416-6f56-4ea4-ad62-95de1e6df23b-kube-api-access-sxxhh" seLinuxMountContext="" Mar 08 03:47:17.199257 master-0 kubenswrapper[7547]: I0308 03:47:17.198968 7547 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f0f5f3f3-0856-4da3-9157-15f65c6aba6e" volumeName="kubernetes.io/secret/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-ovn-node-metrics-cert" seLinuxMountContext=""
Mar 08 03:47:17.199257 master-0 kubenswrapper[7547]: I0308 03:47:17.198980 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1482d789-884b-4337-b598-f0e2b71eb9f2" volumeName="kubernetes.io/projected/1482d789-884b-4337-b598-f0e2b71eb9f2-kube-api-access-m2h62" seLinuxMountContext=""
Mar 08 03:47:17.199257 master-0 kubenswrapper[7547]: I0308 03:47:17.198995 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="26180f77-0b1a-4d0f-9ed0-a12fdee69817" volumeName="kubernetes.io/configmap/26180f77-0b1a-4d0f-9ed0-a12fdee69817-config" seLinuxMountContext=""
Mar 08 03:47:17.199257 master-0 kubenswrapper[7547]: I0308 03:47:17.199013 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="30211469-7108-4820-a988-26fc4ced734e" volumeName="kubernetes.io/projected/30211469-7108-4820-a988-26fc4ced734e-kube-api-access-fncng" seLinuxMountContext=""
Mar 08 03:47:17.199257 master-0 kubenswrapper[7547]: I0308 03:47:17.199025 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5a7752f9-7b9a-451f-997a-e9f696d38b34" volumeName="kubernetes.io/configmap/5a7752f9-7b9a-451f-997a-e9f696d38b34-etcd-ca" seLinuxMountContext=""
Mar 08 03:47:17.199257 master-0 kubenswrapper[7547]: I0308 03:47:17.199109 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5a7752f9-7b9a-451f-997a-e9f696d38b34" volumeName="kubernetes.io/secret/5a7752f9-7b9a-451f-997a-e9f696d38b34-etcd-client" seLinuxMountContext=""
Mar 08 03:47:17.199257 master-0 kubenswrapper[7547]: I0308 03:47:17.199123 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d5044ffd-0686-4679-9894-e696faf33699" volumeName="kubernetes.io/projected/d5044ffd-0686-4679-9894-e696faf33699-kube-api-access-mmhtb" seLinuxMountContext=""
Mar 08 03:47:17.199257 master-0 kubenswrapper[7547]: I0308 03:47:17.199193 7547 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ee586416-6f56-4ea4-ad62-95de1e6df23b" volumeName="kubernetes.io/secret/ee586416-6f56-4ea4-ad62-95de1e6df23b-serving-cert" seLinuxMountContext=""
Mar 08 03:47:17.199257 master-0 kubenswrapper[7547]: I0308 03:47:17.199206 7547 reconstruct.go:97] "Volume reconstruction finished"
Mar 08 03:47:17.199257 master-0 kubenswrapper[7547]: I0308 03:47:17.199215 7547 reconciler.go:26] "Reconciler: start to sync state"
Mar 08 03:47:17.203146 master-0 kubenswrapper[7547]: I0308 03:47:17.203117 7547 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Mar 08 03:47:17.228756 master-0 kubenswrapper[7547]: I0308 03:47:17.228684 7547 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Mar 08 03:47:17.230823 master-0 kubenswrapper[7547]: I0308 03:47:17.230803 7547 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Mar 08 03:47:17.230938 master-0 kubenswrapper[7547]: I0308 03:47:17.230901 7547 status_manager.go:217] "Starting to sync pod status with apiserver"
Mar 08 03:47:17.230986 master-0 kubenswrapper[7547]: I0308 03:47:17.230953 7547 kubelet.go:2335] "Starting kubelet main sync loop"
Mar 08 03:47:17.232343 master-0 kubenswrapper[7547]: E0308 03:47:17.231905 7547 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Mar 08 03:47:17.233019 master-0 kubenswrapper[7547]: I0308 03:47:17.232974 7547 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Mar 08 03:47:17.238739 master-0 kubenswrapper[7547]: I0308 03:47:17.238639 7547 generic.go:334] "Generic (PLEG): container finished" podID="5f77c8e18b751d90bc0dfe2d4e304050" containerID="92b1c53e472127e182a48a6a8f941f7dd97106f322656ce4711b76ad8c4fc359" exitCode=0
Mar 08 03:47:17.254047 master-0 kubenswrapper[7547]: I0308 03:47:17.253996 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_e9add8df47182fc2eaf8cd78016ebe72/kube-rbac-proxy-crio/2.log"
Mar 08 03:47:17.256503 master-0 kubenswrapper[7547]: I0308 03:47:17.256444 7547 generic.go:334] "Generic (PLEG): container finished" podID="e9add8df47182fc2eaf8cd78016ebe72" containerID="1611cfa5e175032b10c844270b1926150f7a6bf4a58e7bfa0e9ab7a757d448fe" exitCode=1
Mar 08 03:47:17.256503 master-0 kubenswrapper[7547]: I0308 03:47:17.256499 7547 generic.go:334] "Generic (PLEG): container finished" podID="e9add8df47182fc2eaf8cd78016ebe72" containerID="809953c4a3d0d1d245b0d287db991ef24c93f664dbc39c226e2a89fc2ba7da3d" exitCode=0
Mar 08 03:47:17.262874 master-0 kubenswrapper[7547]: I0308 03:47:17.262779 7547 generic.go:334] "Generic (PLEG): container finished" podID="f0f5f3f3-0856-4da3-9157-15f65c6aba6e" containerID="e04ec38e07d8783fc2ade88328995e37c561797980580532badf766ca8953982" exitCode=0
Mar 08 03:47:17.290080 master-0 kubenswrapper[7547]: I0308 03:47:17.290023 7547 generic.go:334] "Generic (PLEG): container finished" podID="689a1fe4-9189-4a55-a61a-94a155b8040d" containerID="4998ca5636ddd8d905b67c8fb24fdf903c161f9cdcca4bdb3b01719d5f1d5376" exitCode=0
Mar 08 03:47:17.294253 master-0 kubenswrapper[7547]: I0308 03:47:17.294212 7547 generic.go:334] "Generic (PLEG): container finished" podID="4a19441e-e61b-4d58-85db-813ae88e1f9b" containerID="1362430063d3256452d2e164a40d29841ede63c548ec607c9dddfdd02a33cead" exitCode=0
Mar 08 03:47:17.294253 master-0 kubenswrapper[7547]: I0308 03:47:17.294247 7547 generic.go:334] "Generic (PLEG): container finished" podID="4a19441e-e61b-4d58-85db-813ae88e1f9b" containerID="e3fd81814e45a4dba9c86317c3e1475a8abfc09eed557e2f1e1628bc58babab2" exitCode=0
Mar 08 03:47:17.294378 master-0 kubenswrapper[7547]: I0308 03:47:17.294259 7547 generic.go:334] "Generic (PLEG): container finished" podID="4a19441e-e61b-4d58-85db-813ae88e1f9b" containerID="6808f9225d491b34c3cf7a01d443c6732f48fff26280b582719d87525223329a" exitCode=0
Mar 08 03:47:17.294378 master-0 kubenswrapper[7547]: I0308 03:47:17.294272 7547 generic.go:334] "Generic (PLEG): container finished" podID="4a19441e-e61b-4d58-85db-813ae88e1f9b" containerID="5b762e908370687f296be27f837eacb773e5b2c7f10d7523e57f3d511196e87d" exitCode=0
Mar 08 03:47:17.294378 master-0 kubenswrapper[7547]: I0308 03:47:17.294284 7547 generic.go:334] "Generic (PLEG): container finished" podID="4a19441e-e61b-4d58-85db-813ae88e1f9b" containerID="3fee6f2c5c3a300e3baa43dfa5cafdfb86c438810bfee3e881783f584b000768" exitCode=0
Mar 08 03:47:17.294378 master-0 kubenswrapper[7547]: I0308 03:47:17.294293 7547 generic.go:334] "Generic (PLEG): container finished" podID="4a19441e-e61b-4d58-85db-813ae88e1f9b" containerID="46250aae369897400569e4111703b276aadaa65120ad7d4c39a342c4f39e31c8" exitCode=0
Mar 08 03:47:17.295312 master-0 kubenswrapper[7547]: I0308 03:47:17.295287 7547 generic.go:334] "Generic (PLEG): container finished" podID="7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7" containerID="d11b35d3ea3d0150cbdfe887feb70180d8c9d1802a844e12699e549dc588011a" exitCode=0
Mar 08 03:47:17.332310 master-0 kubenswrapper[7547]: E0308 03:47:17.332263 7547 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
Mar 08 03:47:17.336495 master-0 kubenswrapper[7547]: I0308 03:47:17.336461 7547 manager.go:324] Recovery completed
Mar 08 03:47:17.386605 master-0 kubenswrapper[7547]: I0308 03:47:17.386544 7547 cpu_manager.go:225] "Starting CPU manager" policy="none"
Mar 08 03:47:17.386605 master-0 kubenswrapper[7547]: I0308 03:47:17.386594 7547 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Mar 08 03:47:17.386813 master-0 kubenswrapper[7547]: I0308 03:47:17.386637 7547 state_mem.go:36] "Initialized new in-memory state store"
Mar 08 03:47:17.386997 master-0 kubenswrapper[7547]: I0308 03:47:17.386958 7547 state_mem.go:88] "Updated default CPUSet" cpuSet=""
Mar 08 03:47:17.387040 master-0 kubenswrapper[7547]: I0308 03:47:17.386994 7547 state_mem.go:96] "Updated CPUSet assignments" assignments={}
Mar 08 03:47:17.387082 master-0 kubenswrapper[7547]: I0308 03:47:17.387035 7547 state_checkpoint.go:136] "State checkpoint: restored state from checkpoint"
Mar 08 03:47:17.387082 master-0 kubenswrapper[7547]: I0308 03:47:17.387052 7547 state_checkpoint.go:137] "State checkpoint: defaultCPUSet" defaultCpuSet=""
Mar 08 03:47:17.387082 master-0 kubenswrapper[7547]: I0308 03:47:17.387067 7547 policy_none.go:49] "None policy: Start"
Mar 08 03:47:17.390166 master-0 kubenswrapper[7547]: I0308 03:47:17.389509 7547 memory_manager.go:170] "Starting memorymanager" policy="None"
Mar 08 03:47:17.390166 master-0 kubenswrapper[7547]: I0308 03:47:17.389558 7547 state_mem.go:35] "Initializing new in-memory state store"
Mar 08 03:47:17.390166 master-0 kubenswrapper[7547]: I0308 03:47:17.389885 7547 state_mem.go:75] "Updated machine memory state"
Mar 08 03:47:17.390166 master-0 kubenswrapper[7547]: I0308 03:47:17.389902 7547 state_checkpoint.go:82] "State checkpoint: restored state from checkpoint"
Mar 08 03:47:17.403906 master-0 kubenswrapper[7547]: I0308 03:47:17.401044 7547 manager.go:334] "Starting Device Plugin manager"
Mar 08 03:47:17.403906 master-0 kubenswrapper[7547]: I0308 03:47:17.403901 7547 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Mar 08 03:47:17.403906 master-0 kubenswrapper[7547]: I0308 03:47:17.403915 7547 server.go:79] "Starting device plugin registration server"
Mar 08 03:47:17.404615 master-0 kubenswrapper[7547]: I0308 03:47:17.404598 7547 eviction_manager.go:189] "Eviction manager: starting control loop"
Mar 08 03:47:17.404712 master-0 kubenswrapper[7547]: I0308 03:47:17.404618 7547 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 08 03:47:17.404924 master-0 kubenswrapper[7547]: I0308 03:47:17.404904 7547 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Mar 08 03:47:17.405012 master-0 kubenswrapper[7547]: I0308 03:47:17.404978 7547 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Mar 08 03:47:17.405012 master-0 kubenswrapper[7547]: I0308 03:47:17.404986 7547 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Mar 08 03:47:17.505768 master-0 kubenswrapper[7547]: I0308 03:47:17.505680 7547 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:47:17.507539 master-0 kubenswrapper[7547]: I0308 03:47:17.507480 7547 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:47:17.507539 master-0 kubenswrapper[7547]: I0308 03:47:17.507541 7547 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:47:17.507712 master-0 kubenswrapper[7547]: I0308 03:47:17.507559 7547 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:47:17.507712 master-0 kubenswrapper[7547]: I0308 03:47:17.507624 7547 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 08 03:47:17.521025 master-0 kubenswrapper[7547]: I0308 03:47:17.520973 7547 kubelet_node_status.go:115] "Node was previously registered" node="master-0"
Mar 08 03:47:17.521270 master-0 kubenswrapper[7547]: I0308 03:47:17.521131 7547 kubelet_node_status.go:79] "Successfully registered node" node="master-0"
Mar 08 03:47:17.532897 master-0 kubenswrapper[7547]: I0308 03:47:17.532805 7547 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-master-0","openshift-etcd/etcd-master-0-master-0","openshift-kube-apiserver/bootstrap-kube-apiserver-master-0","kube-system/bootstrap-kube-controller-manager-master-0","kube-system/bootstrap-kube-scheduler-master-0"]
Mar 08 03:47:17.533286 master-0 kubenswrapper[7547]: I0308 03:47:17.533110 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"5f77c8e18b751d90bc0dfe2d4e304050","Type":"ContainerStarted","Data":"512d196861598af69e92dd9aa3d25b53c40e97b92520ddd9df4d73c8065df7e5"}
Mar 08 03:47:17.533286 master-0 kubenswrapper[7547]: I0308 03:47:17.533165 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"5f77c8e18b751d90bc0dfe2d4e304050","Type":"ContainerStarted","Data":"a87790639e12c044cf9f716dfd6c742c89b97ffde357a755afcc44a38db6328d"}
Mar 08 03:47:17.533286 master-0 kubenswrapper[7547]: I0308 03:47:17.533178 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"5f77c8e18b751d90bc0dfe2d4e304050","Type":"ContainerDied","Data":"92b1c53e472127e182a48a6a8f941f7dd97106f322656ce4711b76ad8c4fc359"}
Mar 08 03:47:17.534283 master-0 kubenswrapper[7547]: I0308 03:47:17.534191 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"5f77c8e18b751d90bc0dfe2d4e304050","Type":"ContainerStarted","Data":"5ffd7e8cf7a9593e9910a67c41b6e95af26b8d49eaf5fd007129fe49d1978425"}
Mar 08 03:47:17.534384 master-0 kubenswrapper[7547]: I0308 03:47:17.534290 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerStarted","Data":"2c8736855304b1b6928cbfdc88bfeac2e98662a8092340731da4a5d87e7dfa39"}
Mar 08 03:47:17.534384 master-0 kubenswrapper[7547]: I0308 03:47:17.534321 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerDied","Data":"1611cfa5e175032b10c844270b1926150f7a6bf4a58e7bfa0e9ab7a757d448fe"}
Mar 08 03:47:17.534384 master-0 kubenswrapper[7547]: I0308 03:47:17.534348 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerDied","Data":"809953c4a3d0d1d245b0d287db991ef24c93f664dbc39c226e2a89fc2ba7da3d"}
Mar 08 03:47:17.534384 master-0 kubenswrapper[7547]: I0308 03:47:17.534373 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerStarted","Data":"ed400b0e1b21fe5e4ef5385a05444bf39db4c2fd9c754a3d6c45427d3b29ef99"}
Mar 08 03:47:17.535170 master-0 kubenswrapper[7547]: I0308 03:47:17.534424 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"354f29997baa583b6238f7de9108ee10","Type":"ContainerStarted","Data":"c18b9f8b4dcef22d65b7b32df1f7077ca430d4d3cab49ca6d36290193d631e27"}
Mar 08 03:47:17.535170 master-0 kubenswrapper[7547]: I0308 03:47:17.534448 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"354f29997baa583b6238f7de9108ee10","Type":"ContainerStarted","Data":"4573b175e4638284868f035fd979eb84c441b639f2cba6882ebb0bdabc7d53f1"}
Mar 08 03:47:17.535170 master-0 kubenswrapper[7547]: I0308 03:47:17.534470 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"354f29997baa583b6238f7de9108ee10","Type":"ContainerStarted","Data":"604b5e18b0f1fc95cb4cabd9d6cb088bfcead3c4cba52acd70685e03b5856c7f"}
Mar 08 03:47:17.535170 master-0 kubenswrapper[7547]: I0308 03:47:17.534500 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerStarted","Data":"b00978d6151280d243ba1f6c8276b934ba5c5276b57bc3800284f048820f905f"}
Mar 08 03:47:17.536084 master-0 kubenswrapper[7547]: I0308 03:47:17.536023 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerStarted","Data":"724178cb9f231b822e2bf919b24049f88ede4ee540e7e7751c011ef4363756c9"}
Mar 08 03:47:17.536234 master-0 kubenswrapper[7547]: I0308 03:47:17.536095 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerStarted","Data":"758a2c2e2af7455b02804a595f36886f4047114b8dbd25a8393a292e35b7254e"}
Mar 08 03:47:17.536234 master-0 kubenswrapper[7547]: I0308 03:47:17.536156 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"a1a56802af72ce1aac6b5077f1695ac0","Type":"ContainerStarted","Data":"f5a0acfb3a3f4f285f366c3abcb3f9d3bebb3626e4a976de0dab27a634745185"}
Mar 08 03:47:17.536234 master-0 kubenswrapper[7547]: I0308 03:47:17.536182 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"a1a56802af72ce1aac6b5077f1695ac0","Type":"ContainerStarted","Data":"5b8c31076d1db49fd8c133661fbbc131a58892112131cf3118f58212505e7460"}
Mar 08 03:47:17.536234 master-0 kubenswrapper[7547]: I0308 03:47:17.536218 7547 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5952924b1f9e22bfe7dad849c062189ae8d85553be131f53d8d3ab00c359665"
Mar 08 03:47:17.536538 master-0 kubenswrapper[7547]: I0308 03:47:17.536282 7547 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d29086141609fa12579213578ed2d780ee581ff60e20ceb99a14fefd44548805"
Mar 08 03:47:17.554532 master-0 kubenswrapper[7547]: E0308 03:47:17.554448 7547 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-apiserver-master-0\" already exists" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 03:47:17.554784 master-0 kubenswrapper[7547]: E0308 03:47:17.554605 7547 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-rbac-proxy-crio-master-0\" already exists" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 08 03:47:17.554784 master-0 kubenswrapper[7547]: W0308 03:47:17.554609 7547 warnings.go:70] would violate PodSecurity "restricted:latest": host namespaces (hostNetwork=true), hostPort (container "etcd" uses hostPorts 2379, 2380), privileged (containers "etcdctl", "etcd" must not set securityContext.privileged=true), allowPrivilegeEscalation != false (containers "etcdctl", "etcd" must set securityContext.allowPrivilegeEscalation=false), unrestricted capabilities (containers "etcdctl", "etcd" must set securityContext.capabilities.drop=["ALL"]), restricted volume types (volumes "certs", "data-dir" use restricted volume type "hostPath"), runAsNonRoot != true (pod or containers "etcdctl", "etcd" must set securityContext.runAsNonRoot=true), seccompProfile (pod or containers "etcdctl", "etcd" must set securityContext.seccompProfile.type to "RuntimeDefault" or "Localhost")
Mar 08 03:47:17.554784 master-0 kubenswrapper[7547]: E0308 03:47:17.554661 7547 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-scheduler-master-0\" already exists" pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 08 03:47:17.554784 master-0 kubenswrapper[7547]: E0308 03:47:17.554672 7547 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-master-0-master-0\" already exists" pod="openshift-etcd/etcd-master-0-master-0"
Mar 08 03:47:17.555114 master-0 kubenswrapper[7547]: E0308 03:47:17.554940 7547 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-controller-manager-master-0\" already exists" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 03:47:17.604374 master-0 kubenswrapper[7547]: I0308 03:47:17.604333 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 03:47:17.604374 master-0 kubenswrapper[7547]: I0308 03:47:17.604373 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 08 03:47:17.604644 master-0 kubenswrapper[7547]: I0308 03:47:17.604400 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 08 03:47:17.604644 master-0 kubenswrapper[7547]: I0308 03:47:17.604475 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 03:47:17.604644 master-0 kubenswrapper[7547]: I0308 03:47:17.604513 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 03:47:17.604644 master-0 kubenswrapper[7547]: I0308 03:47:17.604551 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 03:47:17.604644 master-0 kubenswrapper[7547]: I0308 03:47:17.604574 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 03:47:17.604644 master-0 kubenswrapper[7547]: I0308 03:47:17.604590 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 08 03:47:17.604644 master-0 kubenswrapper[7547]: I0308 03:47:17.604604 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-certs\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0"
Mar 08 03:47:17.604644 master-0 kubenswrapper[7547]: I0308 03:47:17.604621 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 03:47:17.604644 master-0 kubenswrapper[7547]: I0308 03:47:17.604638 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 03:47:17.605031 master-0 kubenswrapper[7547]: I0308 03:47:17.604655 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 03:47:17.605031 master-0 kubenswrapper[7547]: I0308 03:47:17.604674 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 03:47:17.605031 master-0 kubenswrapper[7547]: I0308 03:47:17.604691 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 08 03:47:17.605031 master-0 kubenswrapper[7547]: I0308 03:47:17.604752 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0"
Mar 08 03:47:17.605031 master-0 kubenswrapper[7547]: I0308 03:47:17.604819 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 03:47:17.605031 master-0 kubenswrapper[7547]: I0308 03:47:17.604902 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 03:47:17.706105 master-0 kubenswrapper[7547]: I0308 03:47:17.706054 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 03:47:17.706105 master-0 kubenswrapper[7547]: I0308 03:47:17.706105 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 03:47:17.706350 master-0 kubenswrapper[7547]: I0308 03:47:17.706133 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 03:47:17.706350 master-0 kubenswrapper[7547]: I0308 03:47:17.706299 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 08 03:47:17.706350 master-0 kubenswrapper[7547]: I0308 03:47:17.706291 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 03:47:17.706350 master-0 kubenswrapper[7547]: I0308 03:47:17.706322 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 08 03:47:17.706513 master-0 kubenswrapper[7547]: I0308 03:47:17.706359 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 08 03:47:17.706513 master-0 kubenswrapper[7547]: I0308 03:47:17.706393 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-certs\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0"
Mar 08 03:47:17.706513 master-0 kubenswrapper[7547]: I0308 03:47:17.706411 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 03:47:17.706513 master-0 kubenswrapper[7547]: I0308 03:47:17.706432 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 08 03:47:17.706513 master-0 kubenswrapper[7547]: I0308 03:47:17.706436 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 03:47:17.706513 master-0 kubenswrapper[7547]: I0308 03:47:17.706456 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 03:47:17.706513 master-0 kubenswrapper[7547]: I0308 03:47:17.706478 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 03:47:17.706513 master-0 kubenswrapper[7547]: I0308 03:47:17.706504 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 03:47:17.706513 master-0 kubenswrapper[7547]: I0308 03:47:17.706513 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 03:47:17.706955 master-0 kubenswrapper[7547]: I0308 03:47:17.706525 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 03:47:17.706955 master-0 kubenswrapper[7547]: I0308 03:47:17.706484 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-certs\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0"
Mar 08 03:47:17.706955 master-0 kubenswrapper[7547]: I0308 03:47:17.706577 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 03:47:17.706955 master-0 kubenswrapper[7547]: I0308 03:47:17.706597 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 03:47:17.706955 master-0 kubenswrapper[7547]: I0308 03:47:17.706581 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 03:47:17.706955 master-0 kubenswrapper[7547]: I0308 03:47:17.706618 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 08 03:47:17.706955 master-0 kubenswrapper[7547]: I0308 03:47:17.706639 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 08 03:47:17.706955 master-0 kubenswrapper[7547]: I0308 03:47:17.706649 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0"
Mar 08 03:47:17.706955 master-0 kubenswrapper[7547]: I0308 03:47:17.706680 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 03:47:17.706955 master-0 kubenswrapper[7547]: I0308 03:47:17.706692 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0"
Mar 08 03:47:17.706955 master-0 kubenswrapper[7547]: I0308 03:47:17.706709 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 03:47:17.706955 master-0 kubenswrapper[7547]: I0308 03:47:17.706717 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 03:47:17.706955 master-0 kubenswrapper[7547]: I0308 03:47:17.706742 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 03:47:17.706955 master-0 kubenswrapper[7547]: I0308 03:47:17.706750 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 03:47:17.706955 master-0 kubenswrapper[7547]: I0308 03:47:17.706777 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 03:47:17.706955 master-0 kubenswrapper[7547]: I0308 03:47:17.706796 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 03:47:17.706955 master-0 kubenswrapper[7547]: I0308 03:47:17.706833 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 08 03:47:17.706955 master-0 kubenswrapper[7547]: I0308 03:47:17.706854 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 08 03:47:17.706955 master-0 kubenswrapper[7547]: I0308 03:47:17.706874 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"logs\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:47:18.177207 master-0 kubenswrapper[7547]: I0308 03:47:18.177152 7547 apiserver.go:52] "Watching apiserver" Mar 08 03:47:18.199423 master-0 kubenswrapper[7547]: I0308 03:47:18.199335 7547 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 08 03:47:18.203085 master-0 kubenswrapper[7547]: I0308 03:47:18.203013 7547 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-677db989d6-t77qr","openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-8gfmf","openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-x9h9q","openshift-config-operator/openshift-config-operator-64488f9d78-vfgfp","openshift-etcd/etcd-master-0-master-0","openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-jghp5","openshift-multus/multus-rpppb","openshift-network-node-identity/network-node-identity-ggzm8","openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-qjv52","openshift-etcd-operator/etcd-operator-5884b9cd56-vzms7","openshift-insights/insights-operator-8f89dfddd-4mr6p","openshift-kube-apiserver/bootstrap-kube-apiserver-master-0","openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-clqwj","openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-c46zz","openshift-ovn-kubernetes/ovnkube-node-jc6rf","kube-system/bootstrap-kube-controller-manager-master-0","openshift-cluster-version/cluster-version-operator-745944c6b7-gvmnp","openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-g6n58","openshift-marketplace/marketplace-operator-64bf9778cb-9sw2d","openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qddlp","openshift-cluster-storage-operator/cluster-storage-operato
r-6fbfc8dc8f-nm8fj","openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-572xh","openshift-network-diagnostics/network-check-target-xmgpj","openshift-multus/network-metrics-daemon-schjl","openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-bjjfh","kube-system/bootstrap-kube-scheduler-master-0","openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-75682","openshift-machine-config-operator/kube-rbac-proxy-crio-master-0","openshift-machine-config-operator/machine-config-operator-fdb5c78b5-2vjh2","openshift-multus/multus-additional-cni-plugins-g564l","openshift-multus/multus-admission-controller-8d675b596-j8pv6","openshift-network-operator/iptables-alerter-7c28p","openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-qlfgq","openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-kt66j","openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-6fhhs","openshift-service-ca-operator/service-ca-operator-69b6fc6b88-kg795","openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-xhbrl","openshift-dns-operator/dns-operator-589895fbb7-xttlz","openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-chpl6","openshift-network-operator/network-operator-7c649bf6d4-99d2k","assisted-installer/assisted-installer-controller-66tqt","openshift-authentication-operator/authentication-operator-7c6989d6c4-slm72"] Mar 08 03:47:18.203705 master-0 kubenswrapper[7547]: I0308 03:47:18.203676 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-clqwj" Mar 08 03:47:18.203784 master-0 kubenswrapper[7547]: I0308 03:47:18.203382 7547 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="assisted-installer/assisted-installer-controller-66tqt" Mar 08 03:47:18.210795 master-0 kubenswrapper[7547]: I0308 03:47:18.210758 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-jghp5" Mar 08 03:47:18.210932 master-0 kubenswrapper[7547]: I0308 03:47:18.210905 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-677db989d6-t77qr" Mar 08 03:47:18.213521 master-0 kubenswrapper[7547]: I0308 03:47:18.213465 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-c46zz" Mar 08 03:47:18.213862 master-0 kubenswrapper[7547]: I0308 03:47:18.213817 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-qjv52" Mar 08 03:47:18.214110 master-0 kubenswrapper[7547]: I0308 03:47:18.214081 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-64bf9778cb-9sw2d" Mar 08 03:47:18.215986 master-0 kubenswrapper[7547]: I0308 03:47:18.215874 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qddlp" Mar 08 03:47:18.216441 master-0 kubenswrapper[7547]: I0308 03:47:18.216405 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-2vjh2" Mar 08 03:47:18.221239 master-0 kubenswrapper[7547]: I0308 03:47:18.219379 7547 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-qlfgq" Mar 08 03:47:18.221239 master-0 kubenswrapper[7547]: I0308 03:47:18.220090 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-589895fbb7-xttlz" Mar 08 03:47:18.221239 master-0 kubenswrapper[7547]: I0308 03:47:18.220330 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-745944c6b7-gvmnp" Mar 08 03:47:18.221239 master-0 kubenswrapper[7547]: I0308 03:47:18.220742 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-572xh" Mar 08 03:47:18.223337 master-0 kubenswrapper[7547]: I0308 03:47:18.223291 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-8d675b596-j8pv6" Mar 08 03:47:18.223660 master-0 kubenswrapper[7547]: I0308 03:47:18.223596 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 08 03:47:18.223818 master-0 kubenswrapper[7547]: I0308 03:47:18.223614 7547 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xmgpj" Mar 08 03:47:18.224386 master-0 kubenswrapper[7547]: I0308 03:47:18.224322 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 08 03:47:18.224386 master-0 kubenswrapper[7547]: I0308 03:47:18.224338 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 08 03:47:18.225323 master-0 kubenswrapper[7547]: I0308 03:47:18.224725 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 08 03:47:18.225323 master-0 kubenswrapper[7547]: I0308 03:47:18.224765 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 08 03:47:18.225323 master-0 kubenswrapper[7547]: I0308 03:47:18.224950 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 08 03:47:18.225323 master-0 kubenswrapper[7547]: I0308 03:47:18.224966 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls" Mar 08 03:47:18.225323 master-0 kubenswrapper[7547]: I0308 03:47:18.225012 7547 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-schjl" Mar 08 03:47:18.225323 master-0 kubenswrapper[7547]: I0308 03:47:18.225060 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt" Mar 08 03:47:18.225323 master-0 kubenswrapper[7547]: I0308 03:47:18.225063 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 08 03:47:18.225323 master-0 kubenswrapper[7547]: I0308 03:47:18.225222 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"baremetal-kube-rbac-proxy" Mar 08 03:47:18.225323 master-0 kubenswrapper[7547]: I0308 03:47:18.225224 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config" Mar 08 03:47:18.225853 master-0 kubenswrapper[7547]: I0308 03:47:18.225759 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 08 03:47:18.226240 master-0 kubenswrapper[7547]: I0308 03:47:18.226196 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 08 03:47:18.226240 master-0 kubenswrapper[7547]: I0308 03:47:18.226229 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 08 03:47:18.226330 master-0 kubenswrapper[7547]: I0308 03:47:18.226280 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt" Mar 08 03:47:18.226330 master-0 kubenswrapper[7547]: I0308 03:47:18.226300 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 08 03:47:18.226330 master-0 kubenswrapper[7547]: I0308 03:47:18.226204 7547 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-api"/"cluster-baremetal-operator-tls" Mar 08 03:47:18.226870 master-0 kubenswrapper[7547]: I0308 03:47:18.226752 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 08 03:47:18.226870 master-0 kubenswrapper[7547]: I0308 03:47:18.226773 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 08 03:47:18.227097 master-0 kubenswrapper[7547]: I0308 03:47:18.227008 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 08 03:47:18.227097 master-0 kubenswrapper[7547]: I0308 03:47:18.227030 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 08 03:47:18.227262 master-0 kubenswrapper[7547]: I0308 03:47:18.227215 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 08 03:47:18.229628 master-0 kubenswrapper[7547]: I0308 03:47:18.229584 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 08 03:47:18.229782 master-0 kubenswrapper[7547]: I0308 03:47:18.229756 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-serving-cert" Mar 08 03:47:18.240273 master-0 kubenswrapper[7547]: I0308 03:47:18.240210 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 08 03:47:18.241404 master-0 kubenswrapper[7547]: I0308 03:47:18.241172 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 08 03:47:18.241458 master-0 kubenswrapper[7547]: I0308 03:47:18.241425 7547 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 08 03:47:18.244242 master-0 kubenswrapper[7547]: I0308 03:47:18.241741 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 08 03:47:18.244242 master-0 kubenswrapper[7547]: I0308 03:47:18.241938 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 08 03:47:18.244242 master-0 kubenswrapper[7547]: I0308 03:47:18.242023 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 08 03:47:18.244242 master-0 kubenswrapper[7547]: I0308 03:47:18.242078 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 08 03:47:18.244242 master-0 kubenswrapper[7547]: I0308 03:47:18.242037 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 08 03:47:18.244242 master-0 kubenswrapper[7547]: I0308 03:47:18.242240 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 08 03:47:18.244242 master-0 kubenswrapper[7547]: I0308 03:47:18.242393 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 08 03:47:18.244242 master-0 kubenswrapper[7547]: I0308 03:47:18.242111 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 08 03:47:18.244242 master-0 kubenswrapper[7547]: I0308 03:47:18.242597 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 08 03:47:18.244242 master-0 kubenswrapper[7547]: 
I0308 03:47:18.242633 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"cluster-baremetal-operator-images" Mar 08 03:47:18.244242 master-0 kubenswrapper[7547]: I0308 03:47:18.242729 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"openshift-insights-serving-cert" Mar 08 03:47:18.244242 master-0 kubenswrapper[7547]: I0308 03:47:18.242804 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 08 03:47:18.244242 master-0 kubenswrapper[7547]: I0308 03:47:18.242814 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 08 03:47:18.244242 master-0 kubenswrapper[7547]: I0308 03:47:18.242859 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 08 03:47:18.244242 master-0 kubenswrapper[7547]: I0308 03:47:18.242874 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 08 03:47:18.244242 master-0 kubenswrapper[7547]: I0308 03:47:18.242875 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 08 03:47:18.244242 master-0 kubenswrapper[7547]: I0308 03:47:18.242911 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 08 03:47:18.244242 master-0 kubenswrapper[7547]: I0308 03:47:18.242960 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls" Mar 08 03:47:18.244242 master-0 kubenswrapper[7547]: I0308 03:47:18.242973 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 08 
03:47:18.244242 master-0 kubenswrapper[7547]: I0308 03:47:18.243010 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 08 03:47:18.244242 master-0 kubenswrapper[7547]: I0308 03:47:18.243018 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 08 03:47:18.244242 master-0 kubenswrapper[7547]: I0308 03:47:18.243178 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 08 03:47:18.244242 master-0 kubenswrapper[7547]: I0308 03:47:18.243208 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-webhook-server-cert" Mar 08 03:47:18.244242 master-0 kubenswrapper[7547]: I0308 03:47:18.243247 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 08 03:47:18.244242 master-0 kubenswrapper[7547]: I0308 03:47:18.243265 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 08 03:47:18.244242 master-0 kubenswrapper[7547]: I0308 03:47:18.243290 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 08 03:47:18.244242 master-0 kubenswrapper[7547]: I0308 03:47:18.243329 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 08 03:47:18.245166 master-0 kubenswrapper[7547]: I0308 03:47:18.244510 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 08 03:47:18.245166 master-0 kubenswrapper[7547]: I0308 03:47:18.244521 7547 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 08 03:47:18.245166 master-0 kubenswrapper[7547]: I0308 03:47:18.244697 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 08 03:47:18.245166 master-0 kubenswrapper[7547]: I0308 03:47:18.244717 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 08 03:47:18.245166 master-0 kubenswrapper[7547]: I0308 03:47:18.244900 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 08 03:47:18.245300 master-0 kubenswrapper[7547]: I0308 03:47:18.245219 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 08 03:47:18.245456 master-0 kubenswrapper[7547]: I0308 03:47:18.245430 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert" Mar 08 03:47:18.245533 master-0 kubenswrapper[7547]: I0308 03:47:18.245520 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 08 03:47:18.245626 master-0 kubenswrapper[7547]: I0308 03:47:18.245605 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt" Mar 08 03:47:18.245852 master-0 kubenswrapper[7547]: I0308 03:47:18.245657 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"openshift-service-ca.crt" Mar 08 03:47:18.245852 master-0 kubenswrapper[7547]: I0308 03:47:18.245691 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 08 03:47:18.245852 master-0 kubenswrapper[7547]: 
I0308 03:47:18.245793 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 08 03:47:18.245942 master-0 kubenswrapper[7547]: I0308 03:47:18.245877 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 08 03:47:18.245988 master-0 kubenswrapper[7547]: I0308 03:47:18.245977 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 08 03:47:18.246053 master-0 kubenswrapper[7547]: I0308 03:47:18.246031 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 08 03:47:18.246126 master-0 kubenswrapper[7547]: I0308 03:47:18.246112 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt" Mar 08 03:47:18.246276 master-0 kubenswrapper[7547]: I0308 03:47:18.246239 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 08 03:47:18.246311 master-0 kubenswrapper[7547]: I0308 03:47:18.246289 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt" Mar 08 03:47:18.246384 master-0 kubenswrapper[7547]: I0308 03:47:18.246370 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 08 03:47:18.246458 master-0 kubenswrapper[7547]: I0308 03:47:18.246436 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt" Mar 08 03:47:18.246503 master-0 kubenswrapper[7547]: I0308 03:47:18.246489 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 08 
03:47:18.246573 master-0 kubenswrapper[7547]: I0308 03:47:18.246497 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"kube-root-ca.crt" Mar 08 03:47:18.246606 master-0 kubenswrapper[7547]: I0308 03:47:18.246581 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 08 03:47:18.246606 master-0 kubenswrapper[7547]: I0308 03:47:18.246599 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 08 03:47:18.246787 master-0 kubenswrapper[7547]: I0308 03:47:18.246770 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 08 03:47:18.246947 master-0 kubenswrapper[7547]: I0308 03:47:18.246929 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 08 03:47:18.247012 master-0 kubenswrapper[7547]: I0308 03:47:18.246976 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 08 03:47:18.247476 master-0 kubenswrapper[7547]: I0308 03:47:18.246775 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt" Mar 08 03:47:18.247476 master-0 kubenswrapper[7547]: I0308 03:47:18.247115 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 08 03:47:18.247476 master-0 kubenswrapper[7547]: I0308 03:47:18.247160 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 08 03:47:18.248159 master-0 kubenswrapper[7547]: I0308 03:47:18.248121 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert" Mar 08 
03:47:18.248300 master-0 kubenswrapper[7547]: I0308 03:47:18.248284 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 08 03:47:18.248660 master-0 kubenswrapper[7547]: I0308 03:47:18.248535 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 08 03:47:18.248660 master-0 kubenswrapper[7547]: I0308 03:47:18.248534 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 08 03:47:18.248660 master-0 kubenswrapper[7547]: I0308 03:47:18.248603 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 08 03:47:18.248660 master-0 kubenswrapper[7547]: I0308 03:47:18.248642 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"service-ca-bundle" Mar 08 03:47:18.250203 master-0 kubenswrapper[7547]: I0308 03:47:18.248699 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 08 03:47:18.250203 master-0 kubenswrapper[7547]: I0308 03:47:18.248784 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 08 03:47:18.250203 master-0 kubenswrapper[7547]: I0308 03:47:18.248816 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 08 03:47:18.252327 master-0 kubenswrapper[7547]: I0308 03:47:18.252295 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt" Mar 08 03:47:18.260978 master-0 kubenswrapper[7547]: I0308 03:47:18.260957 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 08 03:47:18.262183 master-0 kubenswrapper[7547]: I0308 
03:47:18.262145 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 08 03:47:18.262634 master-0 kubenswrapper[7547]: I0308 03:47:18.262614 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"trusted-ca-bundle" Mar 08 03:47:18.263363 master-0 kubenswrapper[7547]: I0308 03:47:18.263312 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 08 03:47:18.268505 master-0 kubenswrapper[7547]: I0308 03:47:18.266178 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca" Mar 08 03:47:18.268723 master-0 kubenswrapper[7547]: I0308 03:47:18.268686 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 08 03:47:18.270902 master-0 kubenswrapper[7547]: I0308 03:47:18.270869 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 08 03:47:18.272570 master-0 kubenswrapper[7547]: I0308 03:47:18.272540 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 08 03:47:18.287258 master-0 kubenswrapper[7547]: I0308 03:47:18.287220 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 08 03:47:18.290584 master-0 kubenswrapper[7547]: I0308 03:47:18.290561 7547 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Mar 08 03:47:18.307659 master-0 kubenswrapper[7547]: I0308 03:47:18.307277 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 08 03:47:18.314211 master-0 kubenswrapper[7547]: I0308 03:47:18.314139 7547 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-multus-cni-dir\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb"
Mar 08 03:47:18.314289 master-0 kubenswrapper[7547]: I0308 03:47:18.314256 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1cbcb403-a424-4496-8c5c-5eb5e42dfb93-serving-cert\") pod \"kube-apiserver-operator-68bd585b-8gfmf\" (UID: \"1cbcb403-a424-4496-8c5c-5eb5e42dfb93\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-8gfmf"
Mar 08 03:47:18.314339 master-0 kubenswrapper[7547]: I0308 03:47:18.314315 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:47:18.314399 master-0 kubenswrapper[7547]: I0308 03:47:18.314365 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpsx7\" (UniqueName: \"kubernetes.io/projected/8efdcef9-9b31-4567-b7f9-cb59a894273d-kube-api-access-cpsx7\") pod \"dns-operator-589895fbb7-xttlz\" (UID: \"8efdcef9-9b31-4567-b7f9-cb59a894273d\") " pod="openshift-dns-operator/dns-operator-589895fbb7-xttlz"
Mar 08 03:47:18.314439 master-0 kubenswrapper[7547]: I0308 03:47:18.314413 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1482d789-884b-4337-b598-f0e2b71eb9f2-srv-cert\") pod \"catalog-operator-7d9c49f57b-qlfgq\" (UID: \"1482d789-884b-4337-b598-f0e2b71eb9f2\") "
pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-qlfgq"
Mar 08 03:47:18.314468 master-0 kubenswrapper[7547]: I0308 03:47:18.314453 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/ee586416-6f56-4ea4-ad62-95de1e6df23b-snapshots\") pod \"insights-operator-8f89dfddd-4mr6p\" (UID: \"ee586416-6f56-4ea4-ad62-95de1e6df23b\") " pod="openshift-insights/insights-operator-8f89dfddd-4mr6p"
Mar 08 03:47:18.314524 master-0 kubenswrapper[7547]: I0308 03:47:18.314495 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ee586416-6f56-4ea4-ad62-95de1e6df23b-trusted-ca-bundle\") pod \"insights-operator-8f89dfddd-4mr6p\" (UID: \"ee586416-6f56-4ea4-ad62-95de1e6df23b\") " pod="openshift-insights/insights-operator-8f89dfddd-4mr6p"
Mar 08 03:47:18.314565 master-0 kubenswrapper[7547]: I0308 03:47:18.314542 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c9de4939-680a-4e3e-89fd-e20ecb8b10f2-metrics-tls\") pod \"ingress-operator-677db989d6-t77qr\" (UID: \"c9de4939-680a-4e3e-89fd-e20ecb8b10f2\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-t77qr"
Mar 08 03:47:18.315310 master-0 kubenswrapper[7547]: I0308 03:47:18.314715 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4c5a0c1d-867a-4ce4-9570-ea66452c8db3-iptables-alerter-script\") pod \"iptables-alerter-7c28p\" (UID: \"4c5a0c1d-867a-4ce4-9570-ea66452c8db3\") " pod="openshift-network-operator/iptables-alerter-7c28p"
Mar 08 03:47:18.315310 master-0 kubenswrapper[7547]: I0308 03:47:18.314896 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName:
\"kubernetes.io/configmap/0ebf1330-e044-4ff5-8b48-2d667e0c5625-config\") pod \"openshift-controller-manager-operator-8565d84698-kt66j\" (UID: \"0ebf1330-e044-4ff5-8b48-2d667e0c5625\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-kt66j"
Mar 08 03:47:18.315310 master-0 kubenswrapper[7547]: I0308 03:47:18.314932 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/30211469-7108-4820-a988-26fc4ced734e-serving-cert\") pod \"openshift-apiserver-operator-799b6db4d7-75682\" (UID: \"30211469-7108-4820-a988-26fc4ced734e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-75682"
Mar 08 03:47:18.315310 master-0 kubenswrapper[7547]: I0308 03:47:18.314954 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d377285-0336-41b7-b48f-c44a7b563498-config\") pod \"service-ca-operator-69b6fc6b88-kg795\" (UID: \"0d377285-0336-41b7-b48f-c44a7b563498\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-kg795"
Mar 08 03:47:18.315310 master-0 kubenswrapper[7547]: I0308 03:47:18.314973 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/ee586416-6f56-4ea4-ad62-95de1e6df23b-snapshots\") pod \"insights-operator-8f89dfddd-4mr6p\" (UID: \"ee586416-6f56-4ea4-ad62-95de1e6df23b\") " pod="openshift-insights/insights-operator-8f89dfddd-4mr6p"
Mar 08 03:47:18.315310 master-0 kubenswrapper[7547]: I0308 03:47:18.314990 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfvnn\" (UniqueName: \"kubernetes.io/projected/5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a-kube-api-access-cfvnn\") pod \"kube-storage-version-migrator-operator-7f65c457f5-6fhhs\" (UID: \"5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a\") "
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-6fhhs"
Mar 08 03:47:18.315310 master-0 kubenswrapper[7547]: I0308 03:47:18.315021 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-run-ovn\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:47:18.315310 master-0 kubenswrapper[7547]: I0308 03:47:18.315040 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-log-socket\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:47:18.315310 master-0 kubenswrapper[7547]: I0308 03:47:18.315054 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ee586416-6f56-4ea4-ad62-95de1e6df23b-trusted-ca-bundle\") pod \"insights-operator-8f89dfddd-4mr6p\" (UID: \"ee586416-6f56-4ea4-ad62-95de1e6df23b\") " pod="openshift-insights/insights-operator-8f89dfddd-4mr6p"
Mar 08 03:47:18.315310 master-0 kubenswrapper[7547]: I0308 03:47:18.315262 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d377285-0336-41b7-b48f-c44a7b563498-config\") pod \"service-ca-operator-69b6fc6b88-kg795\" (UID: \"0d377285-0336-41b7-b48f-c44a7b563498\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-kg795"
Mar 08 03:47:18.315310 master-0 kubenswrapper[7547]: I0308 03:47:18.315292 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName:
\"kubernetes.io/configmap/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-ovnkube-config\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:47:18.315630 master-0 kubenswrapper[7547]: I0308 03:47:18.315333 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e4541b7b-3f7f-4851-9bd9-26fcda5cab13-kube-api-access\") pod \"openshift-kube-scheduler-operator-5c74bfc494-g6n58\" (UID: \"e4541b7b-3f7f-4851-9bd9-26fcda5cab13\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-g6n58"
Mar 08 03:47:18.315630 master-0 kubenswrapper[7547]: I0308 03:47:18.315359 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/30211469-7108-4820-a988-26fc4ced734e-serving-cert\") pod \"openshift-apiserver-operator-799b6db4d7-75682\" (UID: \"30211469-7108-4820-a988-26fc4ced734e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-75682"
Mar 08 03:47:18.315630 master-0 kubenswrapper[7547]: I0308 03:47:18.315369 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3ddfd0e7-fe76-41bc-b316-94505df81002-metrics-tls\") pod \"network-operator-7c649bf6d4-99d2k\" (UID: \"3ddfd0e7-fe76-41bc-b316-94505df81002\") " pod="openshift-network-operator/network-operator-7c649bf6d4-99d2k"
Mar 08 03:47:18.315630 master-0 kubenswrapper[7547]: I0308 03:47:18.315394 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c9de4939-680a-4e3e-89fd-e20ecb8b10f2-trusted-ca\") pod \"ingress-operator-677db989d6-t77qr\" (UID: \"c9de4939-680a-4e3e-89fd-e20ecb8b10f2\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-t77qr"
Mar 08 03:47:18.315630 master-0
kubenswrapper[7547]: I0308 03:47:18.315445 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/164586b1-f133-4427-8ab6-eb0839b79738-env-overrides\") pod \"network-node-identity-ggzm8\" (UID: \"164586b1-f133-4427-8ab6-eb0839b79738\") " pod="openshift-network-node-identity/network-node-identity-ggzm8"
Mar 08 03:47:18.315630 master-0 kubenswrapper[7547]: I0308 03:47:18.315458 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ebf1330-e044-4ff5-8b48-2d667e0c5625-config\") pod \"openshift-controller-manager-operator-8565d84698-kt66j\" (UID: \"0ebf1330-e044-4ff5-8b48-2d667e0c5625\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-kt66j"
Mar 08 03:47:18.315630 master-0 kubenswrapper[7547]: I0308 03:47:18.315471 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1cbcb403-a424-4496-8c5c-5eb5e42dfb93-kube-api-access\") pod \"kube-apiserver-operator-68bd585b-8gfmf\" (UID: \"1cbcb403-a424-4496-8c5c-5eb5e42dfb93\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-8gfmf"
Mar 08 03:47:18.315630 master-0 kubenswrapper[7547]: I0308 03:47:18.315538 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4541b7b-3f7f-4851-9bd9-26fcda5cab13-serving-cert\") pod \"openshift-kube-scheduler-operator-5c74bfc494-g6n58\" (UID: \"e4541b7b-3f7f-4851-9bd9-26fcda5cab13\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-g6n58"
Mar 08 03:47:18.315630 master-0 kubenswrapper[7547]: I0308 03:47:18.315578 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgc7c\" (UniqueName:
\"kubernetes.io/projected/3ddfd0e7-fe76-41bc-b316-94505df81002-kube-api-access-bgc7c\") pod \"network-operator-7c649bf6d4-99d2k\" (UID: \"3ddfd0e7-fe76-41bc-b316-94505df81002\") " pod="openshift-network-operator/network-operator-7c649bf6d4-99d2k"
Mar 08 03:47:18.315630 master-0 kubenswrapper[7547]: I0308 03:47:18.315618 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxxhh\" (UniqueName: \"kubernetes.io/projected/ee586416-6f56-4ea4-ad62-95de1e6df23b-kube-api-access-sxxhh\") pod \"insights-operator-8f89dfddd-4mr6p\" (UID: \"ee586416-6f56-4ea4-ad62-95de1e6df23b\") " pod="openshift-insights/insights-operator-8f89dfddd-4mr6p"
Mar 08 03:47:18.315955 master-0 kubenswrapper[7547]: I0308 03:47:18.315649 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3ddfd0e7-fe76-41bc-b316-94505df81002-metrics-tls\") pod \"network-operator-7c649bf6d4-99d2k\" (UID: \"3ddfd0e7-fe76-41bc-b316-94505df81002\") " pod="openshift-network-operator/network-operator-7c649bf6d4-99d2k"
Mar 08 03:47:18.315955 master-0 kubenswrapper[7547]: I0308 03:47:18.315668 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-multus-conf-dir\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb"
Mar 08 03:47:18.315955 master-0 kubenswrapper[7547]: I0308 03:47:18.315711 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/164586b1-f133-4427-8ab6-eb0839b79738-webhook-cert\") pod \"network-node-identity-ggzm8\" (UID: \"164586b1-f133-4427-8ab6-eb0839b79738\") " pod="openshift-network-node-identity/network-node-identity-ggzm8"
Mar 08 03:47:18.315955 master-0 kubenswrapper[7547]: I0308 03:47:18.315747 7547
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cbcb403-a424-4496-8c5c-5eb5e42dfb93-config\") pod \"kube-apiserver-operator-68bd585b-8gfmf\" (UID: \"1cbcb403-a424-4496-8c5c-5eb5e42dfb93\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-8gfmf"
Mar 08 03:47:18.315955 master-0 kubenswrapper[7547]: I0308 03:47:18.315781 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-systemd-units\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:47:18.315955 master-0 kubenswrapper[7547]: I0308 03:47:18.315806 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/164586b1-f133-4427-8ab6-eb0839b79738-env-overrides\") pod \"network-node-identity-ggzm8\" (UID: \"164586b1-f133-4427-8ab6-eb0839b79738\") " pod="openshift-network-node-identity/network-node-identity-ggzm8"
Mar 08 03:47:18.315955 master-0 kubenswrapper[7547]: I0308 03:47:18.315860 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/7ff63c73-62a3-44b4-acd3-1b3df175794f-operand-assets\") pod \"cluster-olm-operator-77899cf6d-x9h9q\" (UID: \"7ff63c73-62a3-44b4-acd3-1b3df175794f\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-x9h9q"
Mar 08 03:47:18.315955 master-0 kubenswrapper[7547]: I0308 03:47:18.315917 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0918ba32-8e55-48d0-8e50-027c0dcb4bbd-serving-cert\") pod \"openshift-config-operator-64488f9d78-vfgfp\" (UID: \"0918ba32-8e55-48d0-8e50-027c0dcb4bbd\") "
pod="openshift-config-operator/openshift-config-operator-64488f9d78-vfgfp"
Mar 08 03:47:18.316157 master-0 kubenswrapper[7547]: I0308 03:47:18.315960 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-host-run-ovn-kubernetes\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:47:18.316157 master-0 kubenswrapper[7547]: I0308 03:47:18.316002 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4541b7b-3f7f-4851-9bd9-26fcda5cab13-serving-cert\") pod \"openshift-kube-scheduler-operator-5c74bfc494-g6n58\" (UID: \"e4541b7b-3f7f-4851-9bd9-26fcda5cab13\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-g6n58"
Mar 08 03:47:18.316157 master-0 kubenswrapper[7547]: I0308 03:47:18.316012 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dw7bx\" (UniqueName: \"kubernetes.io/projected/4a19441e-e61b-4d58-85db-813ae88e1f9b-kube-api-access-dw7bx\") pod \"multus-additional-cni-plugins-g564l\" (UID: \"4a19441e-e61b-4d58-85db-813ae88e1f9b\") " pod="openshift-multus/multus-additional-cni-plugins-g564l"
Mar 08 03:47:18.316157 master-0 kubenswrapper[7547]: I0308 03:47:18.316036 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c9de4939-680a-4e3e-89fd-e20ecb8b10f2-trusted-ca\") pod \"ingress-operator-677db989d6-t77qr\" (UID: \"c9de4939-680a-4e3e-89fd-e20ecb8b10f2\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-t77qr"
Mar 08 03:47:18.316157 master-0 kubenswrapper[7547]: I0308 03:47:18.316065 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName:
\"kubernetes.io/configmap/232c421d-96f0-4894-b8d8-74f43d02bbd3-trusted-ca\") pod \"cluster-node-tuning-operator-66c7586884-qjv52\" (UID: \"232c421d-96f0-4894-b8d8-74f43d02bbd3\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-qjv52"
Mar 08 03:47:18.316157 master-0 kubenswrapper[7547]: I0308 03:47:18.316109 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-host-run-k8s-cni-cncf-io\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb"
Mar 08 03:47:18.316310 master-0 kubenswrapper[7547]: I0308 03:47:18.316155 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-etc-kubernetes\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb"
Mar 08 03:47:18.316310 master-0 kubenswrapper[7547]: I0308 03:47:18.316219 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0918ba32-8e55-48d0-8e50-027c0dcb4bbd-serving-cert\") pod \"openshift-config-operator-64488f9d78-vfgfp\" (UID: \"0918ba32-8e55-48d0-8e50-027c0dcb4bbd\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-vfgfp"
Mar 08 03:47:18.316310 master-0 kubenswrapper[7547]: I0308 03:47:18.316210 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e78b283b-981e-48d7-a5f2-53f8401766ea-images\") pod \"machine-config-operator-fdb5c78b5-2vjh2\" (UID: \"e78b283b-981e-48d7-a5f2-53f8401766ea\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-2vjh2"
Mar 08 03:47:18.316310 master-0 kubenswrapper[7547]:
I0308 03:47:18.316268 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d377285-0336-41b7-b48f-c44a7b563498-serving-cert\") pod \"service-ca-operator-69b6fc6b88-kg795\" (UID: \"0d377285-0336-41b7-b48f-c44a7b563498\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-kg795"
Mar 08 03:47:18.316310 master-0 kubenswrapper[7547]: I0308 03:47:18.316289 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/164586b1-f133-4427-8ab6-eb0839b79738-webhook-cert\") pod \"network-node-identity-ggzm8\" (UID: \"164586b1-f133-4427-8ab6-eb0839b79738\") " pod="openshift-network-node-identity/network-node-identity-ggzm8"
Mar 08 03:47:18.316310 master-0 kubenswrapper[7547]: I0308 03:47:18.316292 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a-serving-cert\") pod \"kube-storage-version-migrator-operator-7f65c457f5-6fhhs\" (UID: \"5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-6fhhs"
Mar 08 03:47:18.316458 master-0 kubenswrapper[7547]: I0308 03:47:18.316350 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5a7752f9-7b9a-451f-997a-e9f696d38b34-etcd-client\") pod \"etcd-operator-5884b9cd56-vzms7\" (UID: \"5a7752f9-7b9a-451f-997a-e9f696d38b34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-vzms7"
Mar 08 03:47:18.316458 master-0 kubenswrapper[7547]: I0308 03:47:18.316378 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4a19441e-e61b-4d58-85db-813ae88e1f9b-os-release\") pod
\"multus-additional-cni-plugins-g564l\" (UID: \"4a19441e-e61b-4d58-85db-813ae88e1f9b\") " pod="openshift-multus/multus-additional-cni-plugins-g564l"
Mar 08 03:47:18.316458 master-0 kubenswrapper[7547]: I0308 03:47:18.316399 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/2dd4279d-a1a9-450a-a061-9008cd1ea8e0-srv-cert\") pod \"olm-operator-d64cfc9db-qddlp\" (UID: \"2dd4279d-a1a9-450a-a061-9008cd1ea8e0\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qddlp"
Mar 08 03:47:18.316458 master-0 kubenswrapper[7547]: I0308 03:47:18.316418 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8efdcef9-9b31-4567-b7f9-cb59a894273d-metrics-tls\") pod \"dns-operator-589895fbb7-xttlz\" (UID: \"8efdcef9-9b31-4567-b7f9-cb59a894273d\") " pod="openshift-dns-operator/dns-operator-589895fbb7-xttlz"
Mar 08 03:47:18.316458 master-0 kubenswrapper[7547]: I0308 03:47:18.316436 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d377285-0336-41b7-b48f-c44a7b563498-serving-cert\") pod \"service-ca-operator-69b6fc6b88-kg795\" (UID: \"0d377285-0336-41b7-b48f-c44a7b563498\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-kg795"
Mar 08 03:47:18.316458 master-0 kubenswrapper[7547]: I0308 03:47:18.316441 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4a19441e-e61b-4d58-85db-813ae88e1f9b-cnibin\") pod \"multus-additional-cni-plugins-g564l\" (UID: \"4a19441e-e61b-4d58-85db-813ae88e1f9b\") " pod="openshift-multus/multus-additional-cni-plugins-g564l"
Mar 08 03:47:18.316611 master-0 kubenswrapper[7547]: I0308 03:47:18.316465 7547 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-multus-socket-dir-parent\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb"
Mar 08 03:47:18.316611 master-0 kubenswrapper[7547]: I0308 03:47:18.316492 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/b3eea925-73b3-4693-8f0e-6dd26107f60a-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-6fbfc8dc8f-nm8fj\" (UID: \"b3eea925-73b3-4693-8f0e-6dd26107f60a\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-nm8fj"
Mar 08 03:47:18.316611 master-0 kubenswrapper[7547]: I0308 03:47:18.316518 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7e5935ea-8d95-45e3-b836-c7892953ef3d-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-66b55d57d-bjjfh\" (UID: \"7e5935ea-8d95-45e3-b836-c7892953ef3d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-bjjfh"
Mar 08 03:47:18.316611 master-0 kubenswrapper[7547]: I0308 03:47:18.316544 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a-serving-cert\") pod \"kube-storage-version-migrator-operator-7f65c457f5-6fhhs\" (UID: \"5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-6fhhs"
Mar 08 03:47:18.316611 master-0 kubenswrapper[7547]: I0308 03:47:18.316578 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e78b283b-981e-48d7-a5f2-53f8401766ea-images\") pod
\"machine-config-operator-fdb5c78b5-2vjh2\" (UID: \"e78b283b-981e-48d7-a5f2-53f8401766ea\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-2vjh2"
Mar 08 03:47:18.316611 master-0 kubenswrapper[7547]: I0308 03:47:18.316595 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/7ff63c73-62a3-44b4-acd3-1b3df175794f-operand-assets\") pod \"cluster-olm-operator-77899cf6d-x9h9q\" (UID: \"7ff63c73-62a3-44b4-acd3-1b3df175794f\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-x9h9q"
Mar 08 03:47:18.316761 master-0 kubenswrapper[7547]: I0308 03:47:18.316661 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ebf1330-e044-4ff5-8b48-2d667e0c5625-serving-cert\") pod \"openshift-controller-manager-operator-8565d84698-kt66j\" (UID: \"0ebf1330-e044-4ff5-8b48-2d667e0c5625\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-kt66j"
Mar 08 03:47:18.316761 master-0 kubenswrapper[7547]: I0308 03:47:18.316543 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ebf1330-e044-4ff5-8b48-2d667e0c5625-serving-cert\") pod \"openshift-controller-manager-operator-8565d84698-kt66j\" (UID: \"0ebf1330-e044-4ff5-8b48-2d667e0c5625\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-kt66j"
Mar 08 03:47:18.317378 master-0 kubenswrapper[7547]: I0308 03:47:18.316884 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/54ad284e-d40e-4e69-b898-f5093952a0e6-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-9sw2d\" (UID: \"54ad284e-d40e-4e69-b898-f5093952a0e6\") "
pod="openshift-marketplace/marketplace-operator-64bf9778cb-9sw2d"
Mar 08 03:47:18.317378 master-0 kubenswrapper[7547]: I0308 03:47:18.316926 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5a7752f9-7b9a-451f-997a-e9f696d38b34-etcd-client\") pod \"etcd-operator-5884b9cd56-vzms7\" (UID: \"5a7752f9-7b9a-451f-997a-e9f696d38b34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-vzms7"
Mar 08 03:47:18.317378 master-0 kubenswrapper[7547]: I0308 03:47:18.316938 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/b3eea925-73b3-4693-8f0e-6dd26107f60a-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-6fbfc8dc8f-nm8fj\" (UID: \"b3eea925-73b3-4693-8f0e-6dd26107f60a\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-nm8fj"
Mar 08 03:47:18.317378 master-0 kubenswrapper[7547]: I0308 03:47:18.316956 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-qjv52\" (UID: \"232c421d-96f0-4894-b8d8-74f43d02bbd3\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-qjv52"
Mar 08 03:47:18.317378 master-0 kubenswrapper[7547]: I0308 03:47:18.317002 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-host-var-lib-kubelet\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb"
Mar 08 03:47:18.317378 master-0 kubenswrapper[7547]: I0308 03:47:18.317028 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume
\"config\" (UniqueName: \"kubernetes.io/configmap/26180f77-0b1a-4d0f-9ed0-a12fdee69817-config\") pod \"kube-controller-manager-operator-86d7cdfdfb-chpl6\" (UID: \"26180f77-0b1a-4d0f-9ed0-a12fdee69817\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-chpl6"
Mar 08 03:47:18.317378 master-0 kubenswrapper[7547]: I0308 03:47:18.317052 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxkm6\" (UniqueName: \"kubernetes.io/projected/a60bc804-52e7-422a-87fd-ac4c5aa90cb3-kube-api-access-zxkm6\") pod \"authentication-operator-7c6989d6c4-slm72\" (UID: \"a60bc804-52e7-422a-87fd-ac4c5aa90cb3\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-slm72"
Mar 08 03:47:18.317378 master-0 kubenswrapper[7547]: I0308 03:47:18.317154 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/232c421d-96f0-4894-b8d8-74f43d02bbd3-trusted-ca\") pod \"cluster-node-tuning-operator-66c7586884-qjv52\" (UID: \"232c421d-96f0-4894-b8d8-74f43d02bbd3\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-qjv52"
Mar 08 03:47:18.317378 master-0 kubenswrapper[7547]: I0308 03:47:18.317166 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1eb851be-f157-48ea-9a39-1361b68d2639-webhook-certs\") pod \"multus-admission-controller-8d675b596-j8pv6\" (UID: \"1eb851be-f157-48ea-9a39-1361b68d2639\") " pod="openshift-multus/multus-admission-controller-8d675b596-j8pv6"
Mar 08 03:47:18.317378 master-0 kubenswrapper[7547]: I0308 03:47:18.317266 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-ovnkube-script-lib\") pod \"ovnkube-node-jc6rf\" (UID:
\"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:47:18.317654 master-0 kubenswrapper[7547]: I0308 03:47:18.317396 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kc5q\" (UniqueName: \"kubernetes.io/projected/e93b5361-30e6-44fd-a59e-2bc410c59480-kube-api-access-4kc5q\") pod \"network-check-target-xmgpj\" (UID: \"e93b5361-30e6-44fd-a59e-2bc410c59480\") " pod="openshift-network-diagnostics/network-check-target-xmgpj"
Mar 08 03:47:18.317654 master-0 kubenswrapper[7547]: I0308 03:47:18.317453 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fncng\" (UniqueName: \"kubernetes.io/projected/30211469-7108-4820-a988-26fc4ced734e-kube-api-access-fncng\") pod \"openshift-apiserver-operator-799b6db4d7-75682\" (UID: \"30211469-7108-4820-a988-26fc4ced734e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-75682"
Mar 08 03:47:18.317654 master-0 kubenswrapper[7547]: I0308 03:47:18.317491 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a7752f9-7b9a-451f-997a-e9f696d38b34-serving-cert\") pod \"etcd-operator-5884b9cd56-vzms7\" (UID: \"5a7752f9-7b9a-451f-997a-e9f696d38b34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-vzms7"
Mar 08 03:47:18.317654 master-0 kubenswrapper[7547]: I0308 03:47:18.317532 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/349d438d-d124-4d34-a172-4160e766c680-etc-ssl-certs\") pod \"cluster-version-operator-745944c6b7-gvmnp\" (UID: \"349d438d-d124-4d34-a172-4160e766c680\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-gvmnp"
Mar 08 03:47:18.317654 master-0 kubenswrapper[7547]: I0308 03:47:18.317569 7547 reconciler_common.go:218]
"operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-cni-binary-copy\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:47:18.317654 master-0 kubenswrapper[7547]: I0308 03:47:18.317604 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rchj5\" (UniqueName: \"kubernetes.io/projected/e78b283b-981e-48d7-a5f2-53f8401766ea-kube-api-access-rchj5\") pod \"machine-config-operator-fdb5c78b5-2vjh2\" (UID: \"e78b283b-981e-48d7-a5f2-53f8401766ea\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-2vjh2" Mar 08 03:47:18.317654 master-0 kubenswrapper[7547]: I0308 03:47:18.317641 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqhzl\" (UniqueName: \"kubernetes.io/projected/1eb851be-f157-48ea-9a39-1361b68d2639-kube-api-access-nqhzl\") pod \"multus-admission-controller-8d675b596-j8pv6\" (UID: \"1eb851be-f157-48ea-9a39-1361b68d2639\") " pod="openshift-multus/multus-admission-controller-8d675b596-j8pv6" Mar 08 03:47:18.317844 master-0 kubenswrapper[7547]: I0308 03:47:18.317678 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-node-log\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:47:18.317844 master-0 kubenswrapper[7547]: I0308 03:47:18.317719 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dd549\" (UniqueName: \"kubernetes.io/projected/52b495ac-bb28-44f3-b925-3c54f86d5ec4-kube-api-access-dd549\") pod \"csi-snapshot-controller-operator-5685fbc7d-xhbrl\" (UID: \"52b495ac-bb28-44f3-b925-3c54f86d5ec4\") " 
pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-xhbrl" Mar 08 03:47:18.317844 master-0 kubenswrapper[7547]: I0308 03:47:18.317733 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a7752f9-7b9a-451f-997a-e9f696d38b34-serving-cert\") pod \"etcd-operator-5884b9cd56-vzms7\" (UID: \"5a7752f9-7b9a-451f-997a-e9f696d38b34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-vzms7" Mar 08 03:47:18.317844 master-0 kubenswrapper[7547]: I0308 03:47:18.317758 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkzb2\" (UniqueName: \"kubernetes.io/projected/4c5a0c1d-867a-4ce4-9570-ea66452c8db3-kube-api-access-mkzb2\") pod \"iptables-alerter-7c28p\" (UID: \"4c5a0c1d-867a-4ce4-9570-ea66452c8db3\") " pod="openshift-network-operator/iptables-alerter-7c28p" Mar 08 03:47:18.317956 master-0 kubenswrapper[7547]: I0308 03:47:18.317891 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/7ff63c73-62a3-44b4-acd3-1b3df175794f-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-77899cf6d-x9h9q\" (UID: \"7ff63c73-62a3-44b4-acd3-1b3df175794f\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-x9h9q" Mar 08 03:47:18.317956 master-0 kubenswrapper[7547]: I0308 03:47:18.317919 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee586416-6f56-4ea4-ad62-95de1e6df23b-serving-cert\") pod \"insights-operator-8f89dfddd-4mr6p\" (UID: \"ee586416-6f56-4ea4-ad62-95de1e6df23b\") " pod="openshift-insights/insights-operator-8f89dfddd-4mr6p" Mar 08 03:47:18.317956 master-0 kubenswrapper[7547]: I0308 03:47:18.317944 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4stz\" (UniqueName: 
\"kubernetes.io/projected/164586b1-f133-4427-8ab6-eb0839b79738-kube-api-access-r4stz\") pod \"network-node-identity-ggzm8\" (UID: \"164586b1-f133-4427-8ab6-eb0839b79738\") " pod="openshift-network-node-identity/network-node-identity-ggzm8" Mar 08 03:47:18.318034 master-0 kubenswrapper[7547]: I0308 03:47:18.317969 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-host-slash\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:47:18.318034 master-0 kubenswrapper[7547]: I0308 03:47:18.317994 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/4a19441e-e61b-4d58-85db-813ae88e1f9b-whereabouts-configmap\") pod \"multus-additional-cni-plugins-g564l\" (UID: \"4a19441e-e61b-4d58-85db-813ae88e1f9b\") " pod="openshift-multus/multus-additional-cni-plugins-g564l" Mar 08 03:47:18.318034 master-0 kubenswrapper[7547]: I0308 03:47:18.318018 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwkwt\" (UniqueName: \"kubernetes.io/projected/d831cb23-7411-4072-8273-c167d9afca28-kube-api-access-dwkwt\") pod \"cluster-baremetal-operator-5cdb4c5598-jghp5\" (UID: \"d831cb23-7411-4072-8273-c167d9afca28\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-jghp5" Mar 08 03:47:18.318111 master-0 kubenswrapper[7547]: I0308 03:47:18.318041 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-host-var-lib-cni-bin\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:47:18.319470 master-0 kubenswrapper[7547]: 
I0308 03:47:18.318148 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/26180f77-0b1a-4d0f-9ed0-a12fdee69817-serving-cert\") pod \"kube-controller-manager-operator-86d7cdfdfb-chpl6\" (UID: \"26180f77-0b1a-4d0f-9ed0-a12fdee69817\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-chpl6" Mar 08 03:47:18.319470 master-0 kubenswrapper[7547]: I0308 03:47:18.318179 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/7ff63c73-62a3-44b4-acd3-1b3df175794f-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-77899cf6d-x9h9q\" (UID: \"7ff63c73-62a3-44b4-acd3-1b3df175794f\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-x9h9q" Mar 08 03:47:18.319470 master-0 kubenswrapper[7547]: I0308 03:47:18.318208 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a60bc804-52e7-422a-87fd-ac4c5aa90cb3-trusted-ca-bundle\") pod \"authentication-operator-7c6989d6c4-slm72\" (UID: \"a60bc804-52e7-422a-87fd-ac4c5aa90cb3\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-slm72" Mar 08 03:47:18.319470 master-0 kubenswrapper[7547]: I0308 03:47:18.318250 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26180f77-0b1a-4d0f-9ed0-a12fdee69817-config\") pod \"kube-controller-manager-operator-86d7cdfdfb-chpl6\" (UID: \"26180f77-0b1a-4d0f-9ed0-a12fdee69817\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-chpl6" Mar 08 03:47:18.319470 master-0 kubenswrapper[7547]: I0308 03:47:18.318249 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfqc5\" (UniqueName: 
\"kubernetes.io/projected/7ff63c73-62a3-44b4-acd3-1b3df175794f-kube-api-access-vfqc5\") pod \"cluster-olm-operator-77899cf6d-x9h9q\" (UID: \"7ff63c73-62a3-44b4-acd3-1b3df175794f\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-x9h9q" Mar 08 03:47:18.319470 master-0 kubenswrapper[7547]: I0308 03:47:18.318305 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hccv4\" (UniqueName: \"kubernetes.io/projected/0ebf1330-e044-4ff5-8b48-2d667e0c5625-kube-api-access-hccv4\") pod \"openshift-controller-manager-operator-8565d84698-kt66j\" (UID: \"0ebf1330-e044-4ff5-8b48-2d667e0c5625\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-kt66j" Mar 08 03:47:18.319470 master-0 kubenswrapper[7547]: I0308 03:47:18.318334 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4a19441e-e61b-4d58-85db-813ae88e1f9b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-g564l\" (UID: \"4a19441e-e61b-4d58-85db-813ae88e1f9b\") " pod="openshift-multus/multus-additional-cni-plugins-g564l" Mar 08 03:47:18.319470 master-0 kubenswrapper[7547]: I0308 03:47:18.318355 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-run-systemd\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:47:18.319470 master-0 kubenswrapper[7547]: I0308 03:47:18.318357 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee586416-6f56-4ea4-ad62-95de1e6df23b-serving-cert\") pod \"insights-operator-8f89dfddd-4mr6p\" (UID: \"ee586416-6f56-4ea4-ad62-95de1e6df23b\") " 
pod="openshift-insights/insights-operator-8f89dfddd-4mr6p" Mar 08 03:47:18.319470 master-0 kubenswrapper[7547]: I0308 03:47:18.318530 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/0418ff42-7eac-4266-97b5-4df88623d066-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-clqwj\" (UID: \"0418ff42-7eac-4266-97b5-4df88623d066\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-clqwj" Mar 08 03:47:18.319470 master-0 kubenswrapper[7547]: I0308 03:47:18.318558 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmpdd\" (UniqueName: \"kubernetes.io/projected/0418ff42-7eac-4266-97b5-4df88623d066-kube-api-access-kmpdd\") pod \"cluster-monitoring-operator-674cbfbd9d-clqwj\" (UID: \"0418ff42-7eac-4266-97b5-4df88623d066\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-clqwj" Mar 08 03:47:18.319470 master-0 kubenswrapper[7547]: I0308 03:47:18.318573 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/26180f77-0b1a-4d0f-9ed0-a12fdee69817-serving-cert\") pod \"kube-controller-manager-operator-86d7cdfdfb-chpl6\" (UID: \"26180f77-0b1a-4d0f-9ed0-a12fdee69817\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-chpl6" Mar 08 03:47:18.319470 master-0 kubenswrapper[7547]: I0308 03:47:18.318623 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a60bc804-52e7-422a-87fd-ac4c5aa90cb3-trusted-ca-bundle\") pod \"authentication-operator-7c6989d6c4-slm72\" (UID: \"a60bc804-52e7-422a-87fd-ac4c5aa90cb3\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-slm72" Mar 08 03:47:18.319470 master-0 kubenswrapper[7547]: I0308 03:47:18.318579 7547 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8b5zb\" (UniqueName: \"kubernetes.io/projected/5a7752f9-7b9a-451f-997a-e9f696d38b34-kube-api-access-8b5zb\") pod \"etcd-operator-5884b9cd56-vzms7\" (UID: \"5a7752f9-7b9a-451f-997a-e9f696d38b34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-vzms7" Mar 08 03:47:18.319470 master-0 kubenswrapper[7547]: I0308 03:47:18.318692 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-run-openvswitch\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:47:18.319470 master-0 kubenswrapper[7547]: I0308 03:47:18.318746 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vklx\" (UniqueName: \"kubernetes.io/projected/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-kube-api-access-2vklx\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:47:18.319470 master-0 kubenswrapper[7547]: I0308 03:47:18.318769 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/0418ff42-7eac-4266-97b5-4df88623d066-telemetry-config\") pod \"cluster-monitoring-operator-674cbfbd9d-clqwj\" (UID: \"0418ff42-7eac-4266-97b5-4df88623d066\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-clqwj" Mar 08 03:47:18.319470 master-0 kubenswrapper[7547]: I0308 03:47:18.318839 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f-trusted-ca\") pod \"cluster-image-registry-operator-86d6d77c7c-572xh\" (UID: \"69eb8ba2-7bfb-4433-8951-08f89e7bcb5f\") " 
pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-572xh" Mar 08 03:47:18.319470 master-0 kubenswrapper[7547]: I0308 03:47:18.318882 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30211469-7108-4820-a988-26fc4ced734e-config\") pod \"openshift-apiserver-operator-799b6db4d7-75682\" (UID: \"30211469-7108-4820-a988-26fc4ced734e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-75682" Mar 08 03:47:18.319470 master-0 kubenswrapper[7547]: I0308 03:47:18.318907 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e78b283b-981e-48d7-a5f2-53f8401766ea-auth-proxy-config\") pod \"machine-config-operator-fdb5c78b5-2vjh2\" (UID: \"e78b283b-981e-48d7-a5f2-53f8401766ea\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-2vjh2" Mar 08 03:47:18.319470 master-0 kubenswrapper[7547]: I0308 03:47:18.318956 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4c5a0c1d-867a-4ce4-9570-ea66452c8db3-host-slash\") pod \"iptables-alerter-7c28p\" (UID: \"4c5a0c1d-867a-4ce4-9570-ea66452c8db3\") " pod="openshift-network-operator/iptables-alerter-7c28p" Mar 08 03:47:18.319470 master-0 kubenswrapper[7547]: I0308 03:47:18.318979 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mghmh\" (UniqueName: \"kubernetes.io/projected/0918ba32-8e55-48d0-8e50-027c0dcb4bbd-kube-api-access-mghmh\") pod \"openshift-config-operator-64488f9d78-vfgfp\" (UID: \"0918ba32-8e55-48d0-8e50-027c0dcb4bbd\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-vfgfp" Mar 08 03:47:18.319470 master-0 kubenswrapper[7547]: I0308 03:47:18.319001 7547 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4a19441e-e61b-4d58-85db-813ae88e1f9b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-g564l\" (UID: \"4a19441e-e61b-4d58-85db-813ae88e1f9b\") " pod="openshift-multus/multus-additional-cni-plugins-g564l" Mar 08 03:47:18.319470 master-0 kubenswrapper[7547]: I0308 03:47:18.319025 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmhtb\" (UniqueName: \"kubernetes.io/projected/d5044ffd-0686-4679-9894-e696faf33699-kube-api-access-mmhtb\") pod \"network-metrics-daemon-schjl\" (UID: \"d5044ffd-0686-4679-9894-e696faf33699\") " pod="openshift-multus/network-metrics-daemon-schjl" Mar 08 03:47:18.319470 master-0 kubenswrapper[7547]: I0308 03:47:18.319046 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-etc-openvswitch\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:47:18.319470 master-0 kubenswrapper[7547]: I0308 03:47:18.319067 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-host-cni-netd\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:47:18.319470 master-0 kubenswrapper[7547]: I0308 03:47:18.319089 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-ovn-node-metrics-cert\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 
03:47:18.319470 master-0 kubenswrapper[7547]: I0308 03:47:18.319103 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f-trusted-ca\") pod \"cluster-image-registry-operator-86d6d77c7c-572xh\" (UID: \"69eb8ba2-7bfb-4433-8951-08f89e7bcb5f\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-572xh" Mar 08 03:47:18.319470 master-0 kubenswrapper[7547]: I0308 03:47:18.319110 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7e5935ea-8d95-45e3-b836-c7892953ef3d-ovnkube-config\") pod \"ovnkube-control-plane-66b55d57d-bjjfh\" (UID: \"7e5935ea-8d95-45e3-b836-c7892953ef3d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-bjjfh" Mar 08 03:47:18.319470 master-0 kubenswrapper[7547]: I0308 03:47:18.319137 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d5044ffd-0686-4679-9894-e696faf33699-metrics-certs\") pod \"network-metrics-daemon-schjl\" (UID: \"d5044ffd-0686-4679-9894-e696faf33699\") " pod="openshift-multus/network-metrics-daemon-schjl" Mar 08 03:47:18.319470 master-0 kubenswrapper[7547]: I0308 03:47:18.319232 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/0418ff42-7eac-4266-97b5-4df88623d066-telemetry-config\") pod \"cluster-monitoring-operator-674cbfbd9d-clqwj\" (UID: \"0418ff42-7eac-4266-97b5-4df88623d066\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-clqwj" Mar 08 03:47:18.319470 master-0 kubenswrapper[7547]: I0308 03:47:18.319260 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-os-release\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:47:18.319470 master-0 kubenswrapper[7547]: I0308 03:47:18.319294 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qn5v\" (UniqueName: \"kubernetes.io/projected/0d377285-0336-41b7-b48f-c44a7b563498-kube-api-access-7qn5v\") pod \"service-ca-operator-69b6fc6b88-kg795\" (UID: \"0d377285-0336-41b7-b48f-c44a7b563498\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-kg795" Mar 08 03:47:18.319470 master-0 kubenswrapper[7547]: I0308 03:47:18.319337 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-host-kubelet\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:47:18.319470 master-0 kubenswrapper[7547]: I0308 03:47:18.319355 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30211469-7108-4820-a988-26fc4ced734e-config\") pod \"openshift-apiserver-operator-799b6db4d7-75682\" (UID: \"30211469-7108-4820-a988-26fc4ced734e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-75682" Mar 08 03:47:18.319470 master-0 kubenswrapper[7547]: I0308 03:47:18.319360 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-host-run-netns\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:47:18.319470 master-0 kubenswrapper[7547]: I0308 03:47:18.319385 7547 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a60bc804-52e7-422a-87fd-ac4c5aa90cb3-service-ca-bundle\") pod \"authentication-operator-7c6989d6c4-slm72\" (UID: \"a60bc804-52e7-422a-87fd-ac4c5aa90cb3\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-slm72" Mar 08 03:47:18.319470 master-0 kubenswrapper[7547]: I0308 03:47:18.319443 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/349d438d-d124-4d34-a172-4160e766c680-serving-cert\") pod \"cluster-version-operator-745944c6b7-gvmnp\" (UID: \"349d438d-d124-4d34-a172-4160e766c680\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-gvmnp" Mar 08 03:47:18.320417 master-0 kubenswrapper[7547]: I0308 03:47:18.319524 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4a19441e-e61b-4d58-85db-813ae88e1f9b-system-cni-dir\") pod \"multus-additional-cni-plugins-g564l\" (UID: \"4a19441e-e61b-4d58-85db-813ae88e1f9b\") " pod="openshift-multus/multus-additional-cni-plugins-g564l" Mar 08 03:47:18.320417 master-0 kubenswrapper[7547]: I0308 03:47:18.319551 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-572xh\" (UID: \"69eb8ba2-7bfb-4433-8951-08f89e7bcb5f\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-572xh" Mar 08 03:47:18.320417 master-0 kubenswrapper[7547]: I0308 03:47:18.319573 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-host-run-multus-certs\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:47:18.320417 master-0 kubenswrapper[7547]: I0308 03:47:18.319615 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x997v\" (UniqueName: \"kubernetes.io/projected/6cde5024-edf7-4fa4-8964-cabe7899578b-kube-api-access-x997v\") pod \"package-server-manager-854648ff6d-c46zz\" (UID: \"6cde5024-edf7-4fa4-8964-cabe7899578b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-c46zz" Mar 08 03:47:18.320417 master-0 kubenswrapper[7547]: I0308 03:47:18.319637 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6sx5s\" (UniqueName: \"kubernetes.io/projected/b3eea925-73b3-4693-8f0e-6dd26107f60a-kube-api-access-6sx5s\") pod \"cluster-storage-operator-6fbfc8dc8f-nm8fj\" (UID: \"b3eea925-73b3-4693-8f0e-6dd26107f60a\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-nm8fj" Mar 08 03:47:18.320417 master-0 kubenswrapper[7547]: I0308 03:47:18.319664 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-host-cni-bin\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:47:18.320417 master-0 kubenswrapper[7547]: I0308 03:47:18.319660 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a60bc804-52e7-422a-87fd-ac4c5aa90cb3-service-ca-bundle\") pod \"authentication-operator-7c6989d6c4-slm72\" (UID: \"a60bc804-52e7-422a-87fd-ac4c5aa90cb3\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-slm72" Mar 08 
03:47:18.320417 master-0 kubenswrapper[7547]: I0308 03:47:18.319684 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/3ddfd0e7-fe76-41bc-b316-94505df81002-host-etc-kube\") pod \"network-operator-7c649bf6d4-99d2k\" (UID: \"3ddfd0e7-fe76-41bc-b316-94505df81002\") " pod="openshift-network-operator/network-operator-7c649bf6d4-99d2k" Mar 08 03:47:18.320417 master-0 kubenswrapper[7547]: I0308 03:47:18.319707 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ee586416-6f56-4ea4-ad62-95de1e6df23b-service-ca-bundle\") pod \"insights-operator-8f89dfddd-4mr6p\" (UID: \"ee586416-6f56-4ea4-ad62-95de1e6df23b\") " pod="openshift-insights/insights-operator-8f89dfddd-4mr6p" Mar 08 03:47:18.320417 master-0 kubenswrapper[7547]: I0308 03:47:18.319914 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/5a7752f9-7b9a-451f-997a-e9f696d38b34-etcd-ca\") pod \"etcd-operator-5884b9cd56-vzms7\" (UID: \"5a7752f9-7b9a-451f-997a-e9f696d38b34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-vzms7" Mar 08 03:47:18.320417 master-0 kubenswrapper[7547]: I0308 03:47:18.319955 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4541b7b-3f7f-4851-9bd9-26fcda5cab13-config\") pod \"openshift-kube-scheduler-operator-5c74bfc494-g6n58\" (UID: \"e4541b7b-3f7f-4851-9bd9-26fcda5cab13\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-g6n58" Mar 08 03:47:18.320417 master-0 kubenswrapper[7547]: I0308 03:47:18.319963 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ee586416-6f56-4ea4-ad62-95de1e6df23b-service-ca-bundle\") pod 
\"insights-operator-8f89dfddd-4mr6p\" (UID: \"ee586416-6f56-4ea4-ad62-95de1e6df23b\") " pod="openshift-insights/insights-operator-8f89dfddd-4mr6p" Mar 08 03:47:18.320417 master-0 kubenswrapper[7547]: I0308 03:47:18.319991 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/349d438d-d124-4d34-a172-4160e766c680-etc-cvo-updatepayloads\") pod \"cluster-version-operator-745944c6b7-gvmnp\" (UID: \"349d438d-d124-4d34-a172-4160e766c680\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-gvmnp" Mar 08 03:47:18.320417 master-0 kubenswrapper[7547]: I0308 03:47:18.320016 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-host-var-lib-cni-multus\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:47:18.320417 master-0 kubenswrapper[7547]: I0308 03:47:18.320041 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-multus-daemon-config\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:47:18.320417 master-0 kubenswrapper[7547]: I0308 03:47:18.320241 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/5a7752f9-7b9a-451f-997a-e9f696d38b34-etcd-ca\") pod \"etcd-operator-5884b9cd56-vzms7\" (UID: \"5a7752f9-7b9a-451f-997a-e9f696d38b34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-vzms7" Mar 08 03:47:18.320417 master-0 kubenswrapper[7547]: I0308 03:47:18.320277 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e4541b7b-3f7f-4851-9bd9-26fcda5cab13-config\") pod \"openshift-kube-scheduler-operator-5c74bfc494-g6n58\" (UID: \"e4541b7b-3f7f-4851-9bd9-26fcda5cab13\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-g6n58"
Mar 08 03:47:18.320417 master-0 kubenswrapper[7547]: I0308 03:47:18.320291 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/6cde5024-edf7-4fa4-8964-cabe7899578b-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-c46zz\" (UID: \"6cde5024-edf7-4fa4-8964-cabe7899578b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-c46zz"
Mar 08 03:47:18.320417 master-0 kubenswrapper[7547]: I0308 03:47:18.320332 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a7752f9-7b9a-451f-997a-e9f696d38b34-config\") pod \"etcd-operator-5884b9cd56-vzms7\" (UID: \"5a7752f9-7b9a-451f-997a-e9f696d38b34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-vzms7"
Mar 08 03:47:18.320417 master-0 kubenswrapper[7547]: I0308 03:47:18.320390 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/54ad284e-d40e-4e69-b898-f5093952a0e6-marketplace-trusted-ca\") pod \"marketplace-operator-64bf9778cb-9sw2d\" (UID: \"54ad284e-d40e-4e69-b898-f5093952a0e6\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-9sw2d"
Mar 08 03:47:18.320936 master-0 kubenswrapper[7547]: I0308 03:47:18.320482 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/349d438d-d124-4d34-a172-4160e766c680-kube-api-access\") pod \"cluster-version-operator-745944c6b7-gvmnp\" (UID: \"349d438d-d124-4d34-a172-4160e766c680\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-gvmnp"
Mar 08 03:47:18.320936 master-0 kubenswrapper[7547]: I0308 03:47:18.320516 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a7752f9-7b9a-451f-997a-e9f696d38b34-config\") pod \"etcd-operator-5884b9cd56-vzms7\" (UID: \"5a7752f9-7b9a-451f-997a-e9f696d38b34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-vzms7"
Mar 08 03:47:18.320936 master-0 kubenswrapper[7547]: I0308 03:47:18.320549 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-jghp5\" (UID: \"d831cb23-7411-4072-8273-c167d9afca28\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-jghp5"
Mar 08 03:47:18.320936 master-0 kubenswrapper[7547]: I0308 03:47:18.320609 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-hostroot\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb"
Mar 08 03:47:18.320936 master-0 kubenswrapper[7547]: I0308 03:47:18.320657 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c9de4939-680a-4e3e-89fd-e20ecb8b10f2-bound-sa-token\") pod \"ingress-operator-677db989d6-t77qr\" (UID: \"c9de4939-680a-4e3e-89fd-e20ecb8b10f2\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-t77qr"
Mar 08 03:47:18.320936 master-0 kubenswrapper[7547]: I0308 03:47:18.320751 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nk8r\" (UniqueName: \"kubernetes.io/projected/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-kube-api-access-7nk8r\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb"
Mar 08 03:47:18.320936 master-0 kubenswrapper[7547]: I0308 03:47:18.320788 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6gml\" (UniqueName: \"kubernetes.io/projected/7e5935ea-8d95-45e3-b836-c7892953ef3d-kube-api-access-c6gml\") pod \"ovnkube-control-plane-66b55d57d-bjjfh\" (UID: \"7e5935ea-8d95-45e3-b836-c7892953ef3d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-bjjfh"
Mar 08 03:47:18.320936 master-0 kubenswrapper[7547]: I0308 03:47:18.320856 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/164586b1-f133-4427-8ab6-eb0839b79738-ovnkube-identity-cm\") pod \"network-node-identity-ggzm8\" (UID: \"164586b1-f133-4427-8ab6-eb0839b79738\") " pod="openshift-network-node-identity/network-node-identity-ggzm8"
Mar 08 03:47:18.320936 master-0 kubenswrapper[7547]: I0308 03:47:18.320922 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/54ad284e-d40e-4e69-b898-f5093952a0e6-marketplace-trusted-ca\") pod \"marketplace-operator-64bf9778cb-9sw2d\" (UID: \"54ad284e-d40e-4e69-b898-f5093952a0e6\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-9sw2d"
Mar 08 03:47:18.321160 master-0 kubenswrapper[7547]: I0308 03:47:18.320967 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-var-lib-openvswitch\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:47:18.321160 master-0 kubenswrapper[7547]: I0308 03:47:18.321028 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-env-overrides\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:47:18.321160 master-0 kubenswrapper[7547]: I0308 03:47:18.321069 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/0918ba32-8e55-48d0-8e50-027c0dcb4bbd-available-featuregates\") pod \"openshift-config-operator-64488f9d78-vfgfp\" (UID: \"0918ba32-8e55-48d0-8e50-027c0dcb4bbd\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-vfgfp"
Mar 08 03:47:18.321160 master-0 kubenswrapper[7547]: I0308 03:47:18.321126 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fw7mr\" (UniqueName: \"kubernetes.io/projected/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f-kube-api-access-fw7mr\") pod \"cluster-image-registry-operator-86d6d77c7c-572xh\" (UID: \"69eb8ba2-7bfb-4433-8951-08f89e7bcb5f\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-572xh"
Mar 08 03:47:18.321160 master-0 kubenswrapper[7547]: I0308 03:47:18.321133 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/164586b1-f133-4427-8ab6-eb0839b79738-ovnkube-identity-cm\") pod \"network-node-identity-ggzm8\" (UID: \"164586b1-f133-4427-8ab6-eb0839b79738\") " pod="openshift-network-node-identity/network-node-identity-ggzm8"
Mar 08 03:47:18.321289 master-0 kubenswrapper[7547]: I0308 03:47:18.321233 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-system-cni-dir\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb"
Mar 08 03:47:18.321289 master-0 kubenswrapper[7547]: I0308 03:47:18.321261 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/26180f77-0b1a-4d0f-9ed0-a12fdee69817-kube-api-access\") pod \"kube-controller-manager-operator-86d7cdfdfb-chpl6\" (UID: \"26180f77-0b1a-4d0f-9ed0-a12fdee69817\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-chpl6"
Mar 08 03:47:18.321342 master-0 kubenswrapper[7547]: I0308 03:47:18.321298 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lfcj\" (UniqueName: \"kubernetes.io/projected/54ad284e-d40e-4e69-b898-f5093952a0e6-kube-api-access-9lfcj\") pod \"marketplace-operator-64bf9778cb-9sw2d\" (UID: \"54ad284e-d40e-4e69-b898-f5093952a0e6\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-9sw2d"
Mar 08 03:47:18.321342 master-0 kubenswrapper[7547]: I0308 03:47:18.321322 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f-bound-sa-token\") pod \"cluster-image-registry-operator-86d6d77c7c-572xh\" (UID: \"69eb8ba2-7bfb-4433-8951-08f89e7bcb5f\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-572xh"
Mar 08 03:47:18.321394 master-0 kubenswrapper[7547]: I0308 03:47:18.321346 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a60bc804-52e7-422a-87fd-ac4c5aa90cb3-config\") pod \"authentication-operator-7c6989d6c4-slm72\" (UID: \"a60bc804-52e7-422a-87fd-ac4c5aa90cb3\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-slm72"
Mar 08 03:47:18.321394 master-0 kubenswrapper[7547]: I0308 03:47:18.321371 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-qjv52\" (UID: \"232c421d-96f0-4894-b8d8-74f43d02bbd3\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-qjv52"
Mar 08 03:47:18.321892 master-0 kubenswrapper[7547]: I0308 03:47:18.321776 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/0918ba32-8e55-48d0-8e50-027c0dcb4bbd-available-featuregates\") pod \"openshift-config-operator-64488f9d78-vfgfp\" (UID: \"0918ba32-8e55-48d0-8e50-027c0dcb4bbd\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-vfgfp"
Mar 08 03:47:18.322072 master-0 kubenswrapper[7547]: I0308 03:47:18.322046 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a60bc804-52e7-422a-87fd-ac4c5aa90cb3-config\") pod \"authentication-operator-7c6989d6c4-slm72\" (UID: \"a60bc804-52e7-422a-87fd-ac4c5aa90cb3\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-slm72"
Mar 08 03:47:18.322170 master-0 kubenswrapper[7547]: I0308 03:47:18.322136 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2h62\" (UniqueName: \"kubernetes.io/projected/1482d789-884b-4337-b598-f0e2b71eb9f2-kube-api-access-m2h62\") pod \"catalog-operator-7d9c49f57b-qlfgq\" (UID: \"1482d789-884b-4337-b598-f0e2b71eb9f2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-qlfgq"
Mar 08 03:47:18.322270 master-0 kubenswrapper[7547]: I0308 03:47:18.322237 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-jghp5\" (UID: \"d831cb23-7411-4072-8273-c167d9afca28\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-jghp5"
Mar 08 03:47:18.322395 master-0 kubenswrapper[7547]: I0308 03:47:18.322356 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7e5935ea-8d95-45e3-b836-c7892953ef3d-env-overrides\") pod \"ovnkube-control-plane-66b55d57d-bjjfh\" (UID: \"7e5935ea-8d95-45e3-b836-c7892953ef3d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-bjjfh"
Mar 08 03:47:18.322491 master-0 kubenswrapper[7547]: I0308 03:47:18.322464 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d831cb23-7411-4072-8273-c167d9afca28-config\") pod \"cluster-baremetal-operator-5cdb4c5598-jghp5\" (UID: \"d831cb23-7411-4072-8273-c167d9afca28\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-jghp5"
Mar 08 03:47:18.322542 master-0 kubenswrapper[7547]: I0308 03:47:18.322514 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e78b283b-981e-48d7-a5f2-53f8401766ea-proxy-tls\") pod \"machine-config-operator-fdb5c78b5-2vjh2\" (UID: \"e78b283b-981e-48d7-a5f2-53f8401766ea\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-2vjh2"
Mar 08 03:47:18.322570 master-0 kubenswrapper[7547]: I0308 03:47:18.322553 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a-config\") pod \"kube-storage-version-migrator-operator-7f65c457f5-6fhhs\" (UID: \"5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-6fhhs"
Mar 08 03:47:18.322619 master-0 kubenswrapper[7547]: I0308 03:47:18.322589 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a60bc804-52e7-422a-87fd-ac4c5aa90cb3-serving-cert\") pod \"authentication-operator-7c6989d6c4-slm72\" (UID: \"a60bc804-52e7-422a-87fd-ac4c5aa90cb3\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-slm72"
Mar 08 03:47:18.322652 master-0 kubenswrapper[7547]: I0308 03:47:18.322633 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/349d438d-d124-4d34-a172-4160e766c680-service-ca\") pod \"cluster-version-operator-745944c6b7-gvmnp\" (UID: \"349d438d-d124-4d34-a172-4160e766c680\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-gvmnp"
Mar 08 03:47:18.322909 master-0 kubenswrapper[7547]: I0308 03:47:18.322885 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d831cb23-7411-4072-8273-c167d9afca28-config\") pod \"cluster-baremetal-operator-5cdb4c5598-jghp5\" (UID: \"d831cb23-7411-4072-8273-c167d9afca28\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-jghp5"
Mar 08 03:47:18.323038 master-0 kubenswrapper[7547]: I0308 03:47:18.323007 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a60bc804-52e7-422a-87fd-ac4c5aa90cb3-serving-cert\") pod \"authentication-operator-7c6989d6c4-slm72\" (UID: \"a60bc804-52e7-422a-87fd-ac4c5aa90cb3\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-slm72"
Mar 08 03:47:18.323112 master-0 kubenswrapper[7547]: I0308 03:47:18.323082 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fx4fw\" (UniqueName: \"kubernetes.io/projected/232c421d-96f0-4894-b8d8-74f43d02bbd3-kube-api-access-fx4fw\") pod \"cluster-node-tuning-operator-66c7586884-qjv52\" (UID: \"232c421d-96f0-4894-b8d8-74f43d02bbd3\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-qjv52"
Mar 08 03:47:18.323173 master-0 kubenswrapper[7547]: I0308 03:47:18.323143 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnzt7\" (UniqueName: \"kubernetes.io/projected/2dd4279d-a1a9-450a-a061-9008cd1ea8e0-kube-api-access-pnzt7\") pod \"olm-operator-d64cfc9db-qddlp\" (UID: \"2dd4279d-a1a9-450a-a061-9008cd1ea8e0\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qddlp"
Mar 08 03:47:18.323173 master-0 kubenswrapper[7547]: I0308 03:47:18.323161 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/349d438d-d124-4d34-a172-4160e766c680-service-ca\") pod \"cluster-version-operator-745944c6b7-gvmnp\" (UID: \"349d438d-d124-4d34-a172-4160e766c680\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-gvmnp"
Mar 08 03:47:18.323375 master-0 kubenswrapper[7547]: I0308 03:47:18.323324 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d831cb23-7411-4072-8273-c167d9afca28-images\") pod \"cluster-baremetal-operator-5cdb4c5598-jghp5\" (UID: \"d831cb23-7411-4072-8273-c167d9afca28\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-jghp5"
Mar 08 03:47:18.323414 master-0 kubenswrapper[7547]: I0308 03:47:18.323375 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-cnibin\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb"
Mar 08 03:47:18.323414 master-0 kubenswrapper[7547]: I0308 03:47:18.323396 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-host-run-netns\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb"
Mar 08 03:47:18.323471 master-0 kubenswrapper[7547]: I0308 03:47:18.323422 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29dpg\" (UniqueName: \"kubernetes.io/projected/c9de4939-680a-4e3e-89fd-e20ecb8b10f2-kube-api-access-29dpg\") pod \"ingress-operator-677db989d6-t77qr\" (UID: \"c9de4939-680a-4e3e-89fd-e20ecb8b10f2\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-t77qr"
Mar 08 03:47:18.323471 master-0 kubenswrapper[7547]: I0308 03:47:18.323448 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/5a7752f9-7b9a-451f-997a-e9f696d38b34-etcd-service-ca\") pod \"etcd-operator-5884b9cd56-vzms7\" (UID: \"5a7752f9-7b9a-451f-997a-e9f696d38b34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-vzms7"
Mar 08 03:47:18.323530 master-0 kubenswrapper[7547]: I0308 03:47:18.323483 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4a19441e-e61b-4d58-85db-813ae88e1f9b-cni-binary-copy\") pod \"multus-additional-cni-plugins-g564l\" (UID: \"4a19441e-e61b-4d58-85db-813ae88e1f9b\") " pod="openshift-multus/multus-additional-cni-plugins-g564l"
Mar 08 03:47:18.324067 master-0 kubenswrapper[7547]: I0308 03:47:18.324035 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/5a7752f9-7b9a-451f-997a-e9f696d38b34-etcd-service-ca\") pod \"etcd-operator-5884b9cd56-vzms7\" (UID: \"5a7752f9-7b9a-451f-997a-e9f696d38b34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-vzms7"
Mar 08 03:47:18.324297 master-0 kubenswrapper[7547]: I0308 03:47:18.324220 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d831cb23-7411-4072-8273-c167d9afca28-images\") pod \"cluster-baremetal-operator-5cdb4c5598-jghp5\" (UID: \"d831cb23-7411-4072-8273-c167d9afca28\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-jghp5"
Mar 08 03:47:18.324447 master-0 kubenswrapper[7547]: I0308 03:47:18.324392 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a-config\") pod \"kube-storage-version-migrator-operator-7f65c457f5-6fhhs\" (UID: \"5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-6fhhs"
Mar 08 03:47:18.327155 master-0 kubenswrapper[7547]: I0308 03:47:18.327106 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 08 03:47:18.327974 master-0 kubenswrapper[7547]: I0308 03:47:18.327901 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7e5935ea-8d95-45e3-b836-c7892953ef3d-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-66b55d57d-bjjfh\" (UID: \"7e5935ea-8d95-45e3-b836-c7892953ef3d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-bjjfh"
Mar 08 03:47:18.346817 master-0 kubenswrapper[7547]: I0308 03:47:18.346783 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Mar 08 03:47:18.349780 master-0 kubenswrapper[7547]: I0308 03:47:18.349735 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7e5935ea-8d95-45e3-b836-c7892953ef3d-ovnkube-config\") pod \"ovnkube-control-plane-66b55d57d-bjjfh\" (UID: \"7e5935ea-8d95-45e3-b836-c7892953ef3d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-bjjfh"
Mar 08 03:47:18.357216 master-0 kubenswrapper[7547]: I0308 03:47:18.357152 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-ovnkube-config\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:47:18.372233 master-0 kubenswrapper[7547]: I0308 03:47:18.372204 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Mar 08 03:47:18.372687 master-0 kubenswrapper[7547]: I0308 03:47:18.372650 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7e5935ea-8d95-45e3-b836-c7892953ef3d-env-overrides\") pod \"ovnkube-control-plane-66b55d57d-bjjfh\" (UID: \"7e5935ea-8d95-45e3-b836-c7892953ef3d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-bjjfh"
Mar 08 03:47:18.381719 master-0 kubenswrapper[7547]: I0308 03:47:18.381687 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-env-overrides\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:47:18.387439 master-0 kubenswrapper[7547]: I0308 03:47:18.387408 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Mar 08 03:47:18.407182 master-0 kubenswrapper[7547]: I0308 03:47:18.407144 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Mar 08 03:47:18.411552 master-0 kubenswrapper[7547]: I0308 03:47:18.411501 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-multus-daemon-config\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb"
Mar 08 03:47:18.424137 master-0 kubenswrapper[7547]: I0308 03:47:18.424087 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/349d438d-d124-4d34-a172-4160e766c680-etc-cvo-updatepayloads\") pod \"cluster-version-operator-745944c6b7-gvmnp\" (UID: \"349d438d-d124-4d34-a172-4160e766c680\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-gvmnp"
Mar 08 03:47:18.424206 master-0 kubenswrapper[7547]: I0308 03:47:18.424146 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-host-var-lib-cni-multus\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb"
Mar 08 03:47:18.424316 master-0 kubenswrapper[7547]: I0308 03:47:18.424273 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/349d438d-d124-4d34-a172-4160e766c680-etc-cvo-updatepayloads\") pod \"cluster-version-operator-745944c6b7-gvmnp\" (UID: \"349d438d-d124-4d34-a172-4160e766c680\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-gvmnp"
Mar 08 03:47:18.424577 master-0 kubenswrapper[7547]: I0308 03:47:18.424488 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/6cde5024-edf7-4fa4-8964-cabe7899578b-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-c46zz\" (UID: \"6cde5024-edf7-4fa4-8964-cabe7899578b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-c46zz"
Mar 08 03:47:18.424631 master-0 kubenswrapper[7547]: E0308 03:47:18.424609 7547 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found
Mar 08 03:47:18.424694 master-0 kubenswrapper[7547]: I0308 03:47:18.424525 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-host-var-lib-cni-multus\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb"
Mar 08 03:47:18.424727 master-0 kubenswrapper[7547]: E0308 03:47:18.424696 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6cde5024-edf7-4fa4-8964-cabe7899578b-package-server-manager-serving-cert podName:6cde5024-edf7-4fa4-8964-cabe7899578b nodeName:}" failed. No retries permitted until 2026-03-08 03:47:18.924663088 +0000 UTC m=+1.870347611 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/6cde5024-edf7-4fa4-8964-cabe7899578b-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-c46zz" (UID: "6cde5024-edf7-4fa4-8964-cabe7899578b") : secret "package-server-manager-serving-cert" not found
Mar 08 03:47:18.424727 master-0 kubenswrapper[7547]: I0308 03:47:18.424661 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-jghp5\" (UID: \"d831cb23-7411-4072-8273-c167d9afca28\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-jghp5"
Mar 08 03:47:18.424791 master-0 kubenswrapper[7547]: I0308 03:47:18.424774 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-hostroot\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb"
Mar 08 03:47:18.424949 master-0 kubenswrapper[7547]: E0308 03:47:18.424916 7547 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-operator-tls: secret "cluster-baremetal-operator-tls" not found
Mar 08 03:47:18.424999 master-0 kubenswrapper[7547]: I0308 03:47:18.424973 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-var-lib-openvswitch\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:47:18.425046 master-0 kubenswrapper[7547]: I0308 03:47:18.425023 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-hostroot\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb"
Mar 08 03:47:18.425097 master-0 kubenswrapper[7547]: I0308 03:47:18.424923 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-var-lib-openvswitch\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:47:18.425130 master-0 kubenswrapper[7547]: E0308 03:47:18.425064 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cluster-baremetal-operator-tls podName:d831cb23-7411-4072-8273-c167d9afca28 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:18.925013756 +0000 UTC m=+1.870698479 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-baremetal-operator-tls" (UniqueName: "kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cluster-baremetal-operator-tls") pod "cluster-baremetal-operator-5cdb4c5598-jghp5" (UID: "d831cb23-7411-4072-8273-c167d9afca28") : secret "cluster-baremetal-operator-tls" not found
Mar 08 03:47:18.425202 master-0 kubenswrapper[7547]: I0308 03:47:18.425157 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-system-cni-dir\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb"
Mar 08 03:47:18.425236 master-0 kubenswrapper[7547]: I0308 03:47:18.425215 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-system-cni-dir\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb"
Mar 08 03:47:18.425236 master-0 kubenswrapper[7547]: I0308 03:47:18.425221 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-qjv52\" (UID: \"232c421d-96f0-4894-b8d8-74f43d02bbd3\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-qjv52"
Mar 08 03:47:18.425330 master-0 kubenswrapper[7547]: E0308 03:47:18.425308 7547 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found
Mar 08 03:47:18.425370 master-0 kubenswrapper[7547]: I0308 03:47:18.425340 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-jghp5\" (UID: \"d831cb23-7411-4072-8273-c167d9afca28\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-jghp5"
Mar 08 03:47:18.425399 master-0 kubenswrapper[7547]: E0308 03:47:18.425352 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-node-tuning-operator-tls podName:232c421d-96f0-4894-b8d8-74f43d02bbd3 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:18.925340374 +0000 UTC m=+1.871024897 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-node-tuning-operator-tls") pod "cluster-node-tuning-operator-66c7586884-qjv52" (UID: "232c421d-96f0-4894-b8d8-74f43d02bbd3") : secret "node-tuning-operator-tls" not found
Mar 08 03:47:18.425429 master-0 kubenswrapper[7547]: E0308 03:47:18.425409 7547 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-webhook-server-cert: secret "cluster-baremetal-webhook-server-cert" not found
Mar 08 03:47:18.425460 master-0 kubenswrapper[7547]: E0308 03:47:18.425435 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cert podName:d831cb23-7411-4072-8273-c167d9afca28 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:18.925427436 +0000 UTC m=+1.871111959 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cert") pod "cluster-baremetal-operator-5cdb4c5598-jghp5" (UID: "d831cb23-7411-4072-8273-c167d9afca28") : secret "cluster-baremetal-webhook-server-cert" not found
Mar 08 03:47:18.425490 master-0 kubenswrapper[7547]: I0308 03:47:18.425458 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e78b283b-981e-48d7-a5f2-53f8401766ea-proxy-tls\") pod \"machine-config-operator-fdb5c78b5-2vjh2\" (UID: \"e78b283b-981e-48d7-a5f2-53f8401766ea\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-2vjh2"
Mar 08 03:47:18.425521 master-0 kubenswrapper[7547]: I0308 03:47:18.425510 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-cnibin\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb"
Mar 08 03:47:18.425549 master-0 kubenswrapper[7547]: I0308 03:47:18.425535 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-host-run-netns\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb"
Mar 08 03:47:18.425682 master-0 kubenswrapper[7547]: I0308 03:47:18.425659 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-multus-cni-dir\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb"
Mar 08 03:47:18.425727 master-0 kubenswrapper[7547]: E0308 03:47:18.425675 7547 secret.go:189] Couldn't get secret openshift-machine-config-operator/mco-proxy-tls: secret "mco-proxy-tls" not found
Mar 08 03:47:18.425727 master-0 kubenswrapper[7547]: I0308 03:47:18.425702 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:47:18.425790 master-0 kubenswrapper[7547]: I0308 03:47:18.425726 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1482d789-884b-4337-b598-f0e2b71eb9f2-srv-cert\") pod \"catalog-operator-7d9c49f57b-qlfgq\" (UID: \"1482d789-884b-4337-b598-f0e2b71eb9f2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-qlfgq"
Mar 08 03:47:18.425790 master-0 kubenswrapper[7547]: E0308 03:47:18.425753 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e78b283b-981e-48d7-a5f2-53f8401766ea-proxy-tls podName:e78b283b-981e-48d7-a5f2-53f8401766ea nodeName:}" failed. No retries permitted until 2026-03-08 03:47:18.925724844 +0000 UTC m=+1.871409397 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/e78b283b-981e-48d7-a5f2-53f8401766ea-proxy-tls") pod "machine-config-operator-fdb5c78b5-2vjh2" (UID: "e78b283b-981e-48d7-a5f2-53f8401766ea") : secret "mco-proxy-tls" not found
Mar 08 03:47:18.425790 master-0 kubenswrapper[7547]: E0308 03:47:18.425786 7547 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found
Mar 08 03:47:18.425893 master-0 kubenswrapper[7547]: I0308 03:47:18.425802 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c9de4939-680a-4e3e-89fd-e20ecb8b10f2-metrics-tls\") pod \"ingress-operator-677db989d6-t77qr\" (UID: \"c9de4939-680a-4e3e-89fd-e20ecb8b10f2\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-t77qr"
Mar 08 03:47:18.425893 master-0 kubenswrapper[7547]: E0308 03:47:18.425841 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1482d789-884b-4337-b598-f0e2b71eb9f2-srv-cert podName:1482d789-884b-4337-b598-f0e2b71eb9f2 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:18.925808386 +0000 UTC m=+1.871492909 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/1482d789-884b-4337-b598-f0e2b71eb9f2-srv-cert") pod "catalog-operator-7d9c49f57b-qlfgq" (UID: "1482d789-884b-4337-b598-f0e2b71eb9f2") : secret "catalog-operator-serving-cert" not found
Mar 08 03:47:18.425949 master-0 kubenswrapper[7547]: I0308 03:47:18.425902 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-multus-cni-dir\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb"
Mar 08 03:47:18.425949 master-0 kubenswrapper[7547]: I0308 03:47:18.425913 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-run-ovn\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:47:18.425949 master-0 kubenswrapper[7547]: I0308 03:47:18.425935 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:47:18.426022 master-0 kubenswrapper[7547]: I0308 03:47:18.425950 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-log-socket\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:47:18.426022 master-0 kubenswrapper[7547]: I0308 03:47:18.425959 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-host-run-netns\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:47:18.426075 master-0 kubenswrapper[7547]: I0308 03:47:18.426021 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-log-socket\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:47:18.426133 master-0 kubenswrapper[7547]: I0308 03:47:18.426103 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-cnibin\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:47:18.426193 master-0 kubenswrapper[7547]: I0308 03:47:18.426169 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-run-ovn\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:47:18.426310 master-0 kubenswrapper[7547]: I0308 03:47:18.426275 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-multus-conf-dir\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:47:18.426350 master-0 kubenswrapper[7547]: I0308 03:47:18.426323 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-multus-conf-dir\") pod \"multus-rpppb\" (UID: 
\"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:47:18.426380 master-0 kubenswrapper[7547]: I0308 03:47:18.426344 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-systemd-units\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:47:18.426408 master-0 kubenswrapper[7547]: I0308 03:47:18.426382 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-host-run-ovn-kubernetes\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:47:18.426453 master-0 kubenswrapper[7547]: E0308 03:47:18.426391 7547 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 08 03:47:18.426453 master-0 kubenswrapper[7547]: I0308 03:47:18.426445 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-host-run-ovn-kubernetes\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:47:18.426536 master-0 kubenswrapper[7547]: I0308 03:47:18.426482 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-systemd-units\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:47:18.426575 master-0 kubenswrapper[7547]: I0308 03:47:18.426551 7547 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-host-run-k8s-cni-cncf-io\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:47:18.426604 master-0 kubenswrapper[7547]: I0308 03:47:18.426581 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-etc-kubernetes\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:47:18.426631 master-0 kubenswrapper[7547]: E0308 03:47:18.426609 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c9de4939-680a-4e3e-89fd-e20ecb8b10f2-metrics-tls podName:c9de4939-680a-4e3e-89fd-e20ecb8b10f2 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:18.926577914 +0000 UTC m=+1.872262427 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c9de4939-680a-4e3e-89fd-e20ecb8b10f2-metrics-tls") pod "ingress-operator-677db989d6-t77qr" (UID: "c9de4939-680a-4e3e-89fd-e20ecb8b10f2") : secret "metrics-tls" not found Mar 08 03:47:18.426631 master-0 kubenswrapper[7547]: I0308 03:47:18.426621 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-etc-kubernetes\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:47:18.426688 master-0 kubenswrapper[7547]: I0308 03:47:18.426646 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4a19441e-e61b-4d58-85db-813ae88e1f9b-os-release\") pod \"multus-additional-cni-plugins-g564l\" (UID: \"4a19441e-e61b-4d58-85db-813ae88e1f9b\") " pod="openshift-multus/multus-additional-cni-plugins-g564l" Mar 08 03:47:18.426688 master-0 kubenswrapper[7547]: I0308 03:47:18.426657 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-host-run-k8s-cni-cncf-io\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:47:18.426740 master-0 kubenswrapper[7547]: I0308 03:47:18.426720 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/2dd4279d-a1a9-450a-a061-9008cd1ea8e0-srv-cert\") pod \"olm-operator-d64cfc9db-qddlp\" (UID: \"2dd4279d-a1a9-450a-a061-9008cd1ea8e0\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qddlp" Mar 08 03:47:18.426767 master-0 kubenswrapper[7547]: I0308 03:47:18.426743 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8efdcef9-9b31-4567-b7f9-cb59a894273d-metrics-tls\") pod \"dns-operator-589895fbb7-xttlz\" (UID: \"8efdcef9-9b31-4567-b7f9-cb59a894273d\") " pod="openshift-dns-operator/dns-operator-589895fbb7-xttlz" Mar 08 03:47:18.426898 master-0 kubenswrapper[7547]: E0308 03:47:18.426868 7547 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 08 03:47:18.426958 master-0 kubenswrapper[7547]: E0308 03:47:18.426937 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2dd4279d-a1a9-450a-a061-9008cd1ea8e0-srv-cert podName:2dd4279d-a1a9-450a-a061-9008cd1ea8e0 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:18.926917911 +0000 UTC m=+1.872602454 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/2dd4279d-a1a9-450a-a061-9008cd1ea8e0-srv-cert") pod "olm-operator-d64cfc9db-qddlp" (UID: "2dd4279d-a1a9-450a-a061-9008cd1ea8e0") : secret "olm-operator-serving-cert" not found Mar 08 03:47:18.426958 master-0 kubenswrapper[7547]: I0308 03:47:18.426938 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4a19441e-e61b-4d58-85db-813ae88e1f9b-cnibin\") pod \"multus-additional-cni-plugins-g564l\" (UID: \"4a19441e-e61b-4d58-85db-813ae88e1f9b\") " pod="openshift-multus/multus-additional-cni-plugins-g564l" Mar 08 03:47:18.427082 master-0 kubenswrapper[7547]: I0308 03:47:18.426988 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4a19441e-e61b-4d58-85db-813ae88e1f9b-cnibin\") pod \"multus-additional-cni-plugins-g564l\" (UID: \"4a19441e-e61b-4d58-85db-813ae88e1f9b\") " pod="openshift-multus/multus-additional-cni-plugins-g564l" Mar 08 03:47:18.427082 master-0 kubenswrapper[7547]: I0308 
03:47:18.426996 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4a19441e-e61b-4d58-85db-813ae88e1f9b-os-release\") pod \"multus-additional-cni-plugins-g564l\" (UID: \"4a19441e-e61b-4d58-85db-813ae88e1f9b\") " pod="openshift-multus/multus-additional-cni-plugins-g564l" Mar 08 03:47:18.427082 master-0 kubenswrapper[7547]: I0308 03:47:18.427000 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-multus-socket-dir-parent\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:47:18.427082 master-0 kubenswrapper[7547]: I0308 03:47:18.427082 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/54ad284e-d40e-4e69-b898-f5093952a0e6-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-9sw2d\" (UID: \"54ad284e-d40e-4e69-b898-f5093952a0e6\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-9sw2d" Mar 08 03:47:18.428286 master-0 kubenswrapper[7547]: I0308 03:47:18.427090 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-multus-socket-dir-parent\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:47:18.428286 master-0 kubenswrapper[7547]: E0308 03:47:18.427010 7547 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 08 03:47:18.428286 master-0 kubenswrapper[7547]: E0308 03:47:18.427151 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8efdcef9-9b31-4567-b7f9-cb59a894273d-metrics-tls 
podName:8efdcef9-9b31-4567-b7f9-cb59a894273d nodeName:}" failed. No retries permitted until 2026-03-08 03:47:18.927142727 +0000 UTC m=+1.872827240 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8efdcef9-9b31-4567-b7f9-cb59a894273d-metrics-tls") pod "dns-operator-589895fbb7-xttlz" (UID: "8efdcef9-9b31-4567-b7f9-cb59a894273d") : secret "metrics-tls" not found Mar 08 03:47:18.428286 master-0 kubenswrapper[7547]: E0308 03:47:18.427178 7547 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 08 03:47:18.428286 master-0 kubenswrapper[7547]: E0308 03:47:18.427234 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/54ad284e-d40e-4e69-b898-f5093952a0e6-marketplace-operator-metrics podName:54ad284e-d40e-4e69-b898-f5093952a0e6 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:18.927218908 +0000 UTC m=+1.872903461 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/54ad284e-d40e-4e69-b898-f5093952a0e6-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-9sw2d" (UID: "54ad284e-d40e-4e69-b898-f5093952a0e6") : secret "marketplace-operator-metrics" not found Mar 08 03:47:18.428286 master-0 kubenswrapper[7547]: I0308 03:47:18.427268 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-qjv52\" (UID: \"232c421d-96f0-4894-b8d8-74f43d02bbd3\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-qjv52" Mar 08 03:47:18.428286 master-0 kubenswrapper[7547]: I0308 03:47:18.427342 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-host-var-lib-kubelet\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:47:18.428286 master-0 kubenswrapper[7547]: I0308 03:47:18.427394 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kc5q\" (UniqueName: \"kubernetes.io/projected/e93b5361-30e6-44fd-a59e-2bc410c59480-kube-api-access-4kc5q\") pod \"network-check-target-xmgpj\" (UID: \"e93b5361-30e6-44fd-a59e-2bc410c59480\") " pod="openshift-network-diagnostics/network-check-target-xmgpj" Mar 08 03:47:18.428286 master-0 kubenswrapper[7547]: E0308 03:47:18.427405 7547 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 08 03:47:18.428286 master-0 kubenswrapper[7547]: E0308 03:47:18.427431 7547 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-apiservice-cert podName:232c421d-96f0-4894-b8d8-74f43d02bbd3 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:18.927424213 +0000 UTC m=+1.873108726 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-apiservice-cert") pod "cluster-node-tuning-operator-66c7586884-qjv52" (UID: "232c421d-96f0-4894-b8d8-74f43d02bbd3") : secret "performance-addon-operator-webhook-cert" not found Mar 08 03:47:18.428286 master-0 kubenswrapper[7547]: I0308 03:47:18.427447 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1eb851be-f157-48ea-9a39-1361b68d2639-webhook-certs\") pod \"multus-admission-controller-8d675b596-j8pv6\" (UID: \"1eb851be-f157-48ea-9a39-1361b68d2639\") " pod="openshift-multus/multus-admission-controller-8d675b596-j8pv6" Mar 08 03:47:18.428286 master-0 kubenswrapper[7547]: I0308 03:47:18.427452 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-host-var-lib-kubelet\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:47:18.428286 master-0 kubenswrapper[7547]: I0308 03:47:18.427477 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/349d438d-d124-4d34-a172-4160e766c680-etc-ssl-certs\") pod \"cluster-version-operator-745944c6b7-gvmnp\" (UID: \"349d438d-d124-4d34-a172-4160e766c680\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-gvmnp" Mar 08 03:47:18.428286 master-0 kubenswrapper[7547]: I0308 03:47:18.427535 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" 
(UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-node-log\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:47:18.428286 master-0 kubenswrapper[7547]: E0308 03:47:18.427564 7547 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 08 03:47:18.428286 master-0 kubenswrapper[7547]: E0308 03:47:18.427608 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1eb851be-f157-48ea-9a39-1361b68d2639-webhook-certs podName:1eb851be-f157-48ea-9a39-1361b68d2639 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:18.927594467 +0000 UTC m=+1.873279010 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/1eb851be-f157-48ea-9a39-1361b68d2639-webhook-certs") pod "multus-admission-controller-8d675b596-j8pv6" (UID: "1eb851be-f157-48ea-9a39-1361b68d2639") : secret "multus-admission-controller-secret" not found Mar 08 03:47:18.428286 master-0 kubenswrapper[7547]: I0308 03:47:18.427606 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/349d438d-d124-4d34-a172-4160e766c680-etc-ssl-certs\") pod \"cluster-version-operator-745944c6b7-gvmnp\" (UID: \"349d438d-d124-4d34-a172-4160e766c680\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-gvmnp" Mar 08 03:47:18.428286 master-0 kubenswrapper[7547]: I0308 03:47:18.427681 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-host-slash\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:47:18.428286 master-0 kubenswrapper[7547]: I0308 
03:47:18.427644 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-host-slash\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:47:18.428286 master-0 kubenswrapper[7547]: I0308 03:47:18.427763 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-host-var-lib-cni-bin\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:47:18.428286 master-0 kubenswrapper[7547]: I0308 03:47:18.427638 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-node-log\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:47:18.428286 master-0 kubenswrapper[7547]: I0308 03:47:18.427860 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-host-var-lib-cni-bin\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:47:18.428286 master-0 kubenswrapper[7547]: I0308 03:47:18.427883 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4a19441e-e61b-4d58-85db-813ae88e1f9b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-g564l\" (UID: \"4a19441e-e61b-4d58-85db-813ae88e1f9b\") " pod="openshift-multus/multus-additional-cni-plugins-g564l" Mar 08 03:47:18.428286 master-0 kubenswrapper[7547]: I0308 03:47:18.427937 7547 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-run-systemd\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:47:18.428286 master-0 kubenswrapper[7547]: I0308 03:47:18.427959 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/0418ff42-7eac-4266-97b5-4df88623d066-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-clqwj\" (UID: \"0418ff42-7eac-4266-97b5-4df88623d066\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-clqwj" Mar 08 03:47:18.428286 master-0 kubenswrapper[7547]: I0308 03:47:18.427973 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 08 03:47:18.428286 master-0 kubenswrapper[7547]: I0308 03:47:18.428014 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-run-openvswitch\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:47:18.428286 master-0 kubenswrapper[7547]: I0308 03:47:18.428059 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4a19441e-e61b-4d58-85db-813ae88e1f9b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-g564l\" (UID: \"4a19441e-e61b-4d58-85db-813ae88e1f9b\") " pod="openshift-multus/multus-additional-cni-plugins-g564l" Mar 08 03:47:18.428286 master-0 kubenswrapper[7547]: E0308 03:47:18.428090 7547 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not 
found Mar 08 03:47:18.428286 master-0 kubenswrapper[7547]: I0308 03:47:18.428092 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-run-openvswitch\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:47:18.428286 master-0 kubenswrapper[7547]: E0308 03:47:18.428135 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0418ff42-7eac-4266-97b5-4df88623d066-cluster-monitoring-operator-tls podName:0418ff42-7eac-4266-97b5-4df88623d066 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:18.928121889 +0000 UTC m=+1.873806412 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/0418ff42-7eac-4266-97b5-4df88623d066-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-clqwj" (UID: "0418ff42-7eac-4266-97b5-4df88623d066") : secret "cluster-monitoring-operator-tls" not found Mar 08 03:47:18.428286 master-0 kubenswrapper[7547]: I0308 03:47:18.428238 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-run-systemd\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:47:18.429635 master-0 kubenswrapper[7547]: I0308 03:47:18.428478 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e78b283b-981e-48d7-a5f2-53f8401766ea-auth-proxy-config\") pod \"machine-config-operator-fdb5c78b5-2vjh2\" (UID: \"e78b283b-981e-48d7-a5f2-53f8401766ea\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-2vjh2" Mar 08 03:47:18.429635 
master-0 kubenswrapper[7547]: E0308 03:47:18.428634 7547 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/kube-rbac-proxy: configmap "kube-rbac-proxy" not found Mar 08 03:47:18.429635 master-0 kubenswrapper[7547]: I0308 03:47:18.428658 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4c5a0c1d-867a-4ce4-9570-ea66452c8db3-host-slash\") pod \"iptables-alerter-7c28p\" (UID: \"4c5a0c1d-867a-4ce4-9570-ea66452c8db3\") " pod="openshift-network-operator/iptables-alerter-7c28p" Mar 08 03:47:18.429635 master-0 kubenswrapper[7547]: I0308 03:47:18.428713 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-etc-openvswitch\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:47:18.429635 master-0 kubenswrapper[7547]: I0308 03:47:18.428749 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-etc-openvswitch\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:47:18.429635 master-0 kubenswrapper[7547]: I0308 03:47:18.428761 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4c5a0c1d-867a-4ce4-9570-ea66452c8db3-host-slash\") pod \"iptables-alerter-7c28p\" (UID: \"4c5a0c1d-867a-4ce4-9570-ea66452c8db3\") " pod="openshift-network-operator/iptables-alerter-7c28p" Mar 08 03:47:18.429635 master-0 kubenswrapper[7547]: E0308 03:47:18.428750 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e78b283b-981e-48d7-a5f2-53f8401766ea-auth-proxy-config 
podName:e78b283b-981e-48d7-a5f2-53f8401766ea nodeName:}" failed. No retries permitted until 2026-03-08 03:47:18.928727674 +0000 UTC m=+1.874412217 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/e78b283b-981e-48d7-a5f2-53f8401766ea-auth-proxy-config") pod "machine-config-operator-fdb5c78b5-2vjh2" (UID: "e78b283b-981e-48d7-a5f2-53f8401766ea") : configmap "kube-rbac-proxy" not found Mar 08 03:47:18.429635 master-0 kubenswrapper[7547]: I0308 03:47:18.428796 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-host-cni-netd\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:47:18.429635 master-0 kubenswrapper[7547]: I0308 03:47:18.428859 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/349d438d-d124-4d34-a172-4160e766c680-serving-cert\") pod \"cluster-version-operator-745944c6b7-gvmnp\" (UID: \"349d438d-d124-4d34-a172-4160e766c680\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-gvmnp" Mar 08 03:47:18.429635 master-0 kubenswrapper[7547]: I0308 03:47:18.428880 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d5044ffd-0686-4679-9894-e696faf33699-metrics-certs\") pod \"network-metrics-daemon-schjl\" (UID: \"d5044ffd-0686-4679-9894-e696faf33699\") " pod="openshift-multus/network-metrics-daemon-schjl" Mar 08 03:47:18.429635 master-0 kubenswrapper[7547]: I0308 03:47:18.428895 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-os-release\") pod \"multus-rpppb\" (UID: 
\"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:47:18.429635 master-0 kubenswrapper[7547]: I0308 03:47:18.428926 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-host-kubelet\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:47:18.429635 master-0 kubenswrapper[7547]: I0308 03:47:18.428945 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-host-cni-netd\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:47:18.429635 master-0 kubenswrapper[7547]: E0308 03:47:18.429006 7547 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 08 03:47:18.429635 master-0 kubenswrapper[7547]: E0308 03:47:18.429049 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/349d438d-d124-4d34-a172-4160e766c680-serving-cert podName:349d438d-d124-4d34-a172-4160e766c680 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:18.929032731 +0000 UTC m=+1.874717254 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/349d438d-d124-4d34-a172-4160e766c680-serving-cert") pod "cluster-version-operator-745944c6b7-gvmnp" (UID: "349d438d-d124-4d34-a172-4160e766c680") : secret "cluster-version-operator-serving-cert" not found Mar 08 03:47:18.429635 master-0 kubenswrapper[7547]: I0308 03:47:18.429076 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-host-run-netns\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:47:18.429635 master-0 kubenswrapper[7547]: I0308 03:47:18.429104 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/3ddfd0e7-fe76-41bc-b316-94505df81002-host-etc-kube\") pod \"network-operator-7c649bf6d4-99d2k\" (UID: \"3ddfd0e7-fe76-41bc-b316-94505df81002\") " pod="openshift-network-operator/network-operator-7c649bf6d4-99d2k" Mar 08 03:47:18.429635 master-0 kubenswrapper[7547]: I0308 03:47:18.429170 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-host-run-netns\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:47:18.429635 master-0 kubenswrapper[7547]: I0308 03:47:18.429216 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-host-kubelet\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:47:18.429635 master-0 kubenswrapper[7547]: I0308 03:47:18.429279 7547 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/3ddfd0e7-fe76-41bc-b316-94505df81002-host-etc-kube\") pod \"network-operator-7c649bf6d4-99d2k\" (UID: \"3ddfd0e7-fe76-41bc-b316-94505df81002\") " pod="openshift-network-operator/network-operator-7c649bf6d4-99d2k" Mar 08 03:47:18.429635 master-0 kubenswrapper[7547]: I0308 03:47:18.429339 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4a19441e-e61b-4d58-85db-813ae88e1f9b-system-cni-dir\") pod \"multus-additional-cni-plugins-g564l\" (UID: \"4a19441e-e61b-4d58-85db-813ae88e1f9b\") " pod="openshift-multus/multus-additional-cni-plugins-g564l" Mar 08 03:47:18.429635 master-0 kubenswrapper[7547]: I0308 03:47:18.429410 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-os-release\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:47:18.429635 master-0 kubenswrapper[7547]: I0308 03:47:18.429421 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-572xh\" (UID: \"69eb8ba2-7bfb-4433-8951-08f89e7bcb5f\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-572xh" Mar 08 03:47:18.429635 master-0 kubenswrapper[7547]: I0308 03:47:18.429459 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4a19441e-e61b-4d58-85db-813ae88e1f9b-system-cni-dir\") pod \"multus-additional-cni-plugins-g564l\" (UID: \"4a19441e-e61b-4d58-85db-813ae88e1f9b\") " pod="openshift-multus/multus-additional-cni-plugins-g564l" Mar 08 
03:47:18.429635 master-0 kubenswrapper[7547]: I0308 03:47:18.429483 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-host-run-multus-certs\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:47:18.429635 master-0 kubenswrapper[7547]: E0308 03:47:18.429525 7547 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Mar 08 03:47:18.429635 master-0 kubenswrapper[7547]: I0308 03:47:18.429549 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-host-cni-bin\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:47:18.429635 master-0 kubenswrapper[7547]: I0308 03:47:18.429583 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-host-run-multus-certs\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:47:18.429635 master-0 kubenswrapper[7547]: E0308 03:47:18.429557 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f-image-registry-operator-tls podName:69eb8ba2-7bfb-4433-8951-08f89e7bcb5f nodeName:}" failed. No retries permitted until 2026-03-08 03:47:18.929547403 +0000 UTC m=+1.875231926 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f-image-registry-operator-tls") pod "cluster-image-registry-operator-86d6d77c7c-572xh" (UID: "69eb8ba2-7bfb-4433-8951-08f89e7bcb5f") : secret "image-registry-operator-tls" not found Mar 08 03:47:18.430711 master-0 kubenswrapper[7547]: I0308 03:47:18.429686 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-host-cni-bin\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:47:18.435456 master-0 kubenswrapper[7547]: I0308 03:47:18.435433 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4a19441e-e61b-4d58-85db-813ae88e1f9b-cni-binary-copy\") pod \"multus-additional-cni-plugins-g564l\" (UID: \"4a19441e-e61b-4d58-85db-813ae88e1f9b\") " pod="openshift-multus/multus-additional-cni-plugins-g564l" Mar 08 03:47:18.438459 master-0 kubenswrapper[7547]: I0308 03:47:18.438398 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-cni-binary-copy\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:47:18.448862 master-0 kubenswrapper[7547]: I0308 03:47:18.448721 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 08 03:47:18.466677 master-0 kubenswrapper[7547]: I0308 03:47:18.466612 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 08 03:47:18.487857 master-0 kubenswrapper[7547]: I0308 03:47:18.487778 7547 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 08 03:47:18.495874 master-0 kubenswrapper[7547]: I0308 03:47:18.495772 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1cbcb403-a424-4496-8c5c-5eb5e42dfb93-serving-cert\") pod \"kube-apiserver-operator-68bd585b-8gfmf\" (UID: \"1cbcb403-a424-4496-8c5c-5eb5e42dfb93\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-8gfmf" Mar 08 03:47:18.511596 master-0 kubenswrapper[7547]: I0308 03:47:18.511554 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 08 03:47:18.516683 master-0 kubenswrapper[7547]: I0308 03:47:18.516645 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cbcb403-a424-4496-8c5c-5eb5e42dfb93-config\") pod \"kube-apiserver-operator-68bd585b-8gfmf\" (UID: \"1cbcb403-a424-4496-8c5c-5eb5e42dfb93\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-8gfmf" Mar 08 03:47:18.527167 master-0 kubenswrapper[7547]: I0308 03:47:18.527143 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 08 03:47:18.528081 master-0 kubenswrapper[7547]: I0308 03:47:18.528052 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-ovnkube-script-lib\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:47:18.545670 master-0 kubenswrapper[7547]: I0308 03:47:18.545644 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 08 03:47:18.550621 master-0 
kubenswrapper[7547]: I0308 03:47:18.550582 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-ovn-node-metrics-cert\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:47:18.567923 master-0 kubenswrapper[7547]: I0308 03:47:18.567897 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 08 03:47:18.576185 master-0 kubenswrapper[7547]: I0308 03:47:18.576151 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4c5a0c1d-867a-4ce4-9570-ea66452c8db3-iptables-alerter-script\") pod \"iptables-alerter-7c28p\" (UID: \"4c5a0c1d-867a-4ce4-9570-ea66452c8db3\") " pod="openshift-network-operator/iptables-alerter-7c28p" Mar 08 03:47:18.587596 master-0 kubenswrapper[7547]: I0308 03:47:18.587564 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-config" Mar 08 03:47:18.589212 master-0 kubenswrapper[7547]: I0308 03:47:18.589180 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/4a19441e-e61b-4d58-85db-813ae88e1f9b-whereabouts-configmap\") pod \"multus-additional-cni-plugins-g564l\" (UID: \"4a19441e-e61b-4d58-85db-813ae88e1f9b\") " pod="openshift-multus/multus-additional-cni-plugins-g564l" Mar 08 03:47:18.606236 master-0 kubenswrapper[7547]: I0308 03:47:18.606202 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 08 03:47:18.609706 master-0 kubenswrapper[7547]: E0308 03:47:18.609675 7547 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Mar 08 03:47:18.609776 
master-0 kubenswrapper[7547]: E0308 03:47:18.609741 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d5044ffd-0686-4679-9894-e696faf33699-metrics-certs podName:d5044ffd-0686-4679-9894-e696faf33699 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:19.109719619 +0000 UTC m=+2.055404142 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d5044ffd-0686-4679-9894-e696faf33699-metrics-certs") pod "network-metrics-daemon-schjl" (UID: "d5044ffd-0686-4679-9894-e696faf33699") : secret "metrics-daemon-secret" not found Mar 08 03:47:18.627457 master-0 kubenswrapper[7547]: I0308 03:47:18.627396 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 08 03:47:18.630016 master-0 kubenswrapper[7547]: I0308 03:47:18.629975 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4a19441e-e61b-4d58-85db-813ae88e1f9b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-g564l\" (UID: \"4a19441e-e61b-4d58-85db-813ae88e1f9b\") " pod="openshift-multus/multus-additional-cni-plugins-g564l" Mar 08 03:47:18.646386 master-0 kubenswrapper[7547]: I0308 03:47:18.646332 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 08 03:47:18.697595 master-0 kubenswrapper[7547]: E0308 03:47:18.697543 7547 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-rbac-proxy-crio-master-0\" already exists" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 08 03:47:18.715112 master-0 kubenswrapper[7547]: E0308 03:47:18.715072 7547 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-apiserver-master-0\" already exists" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 08 
03:47:18.738551 master-0 kubenswrapper[7547]: E0308 03:47:18.738509 7547 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-controller-manager-master-0\" already exists" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:47:18.756221 master-0 kubenswrapper[7547]: W0308 03:47:18.755793 7547 warnings.go:70] would violate PodSecurity "restricted:latest": host namespaces (hostNetwork=true), hostPort (container "etcd" uses hostPorts 2379, 2380), privileged (containers "etcdctl", "etcd" must not set securityContext.privileged=true), allowPrivilegeEscalation != false (containers "etcdctl", "etcd" must set securityContext.allowPrivilegeEscalation=false), unrestricted capabilities (containers "etcdctl", "etcd" must set securityContext.capabilities.drop=["ALL"]), restricted volume types (volumes "certs", "data-dir" use restricted volume type "hostPath"), runAsNonRoot != true (pod or containers "etcdctl", "etcd" must set securityContext.runAsNonRoot=true), seccompProfile (pod or containers "etcdctl", "etcd" must set securityContext.seccompProfile.type to "RuntimeDefault" or "Localhost") Mar 08 03:47:18.756221 master-0 kubenswrapper[7547]: E0308 03:47:18.755886 7547 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-master-0-master-0\" already exists" pod="openshift-etcd/etcd-master-0-master-0" Mar 08 03:47:18.775095 master-0 kubenswrapper[7547]: E0308 03:47:18.774897 7547 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-scheduler-master-0\" already exists" pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 08 03:47:18.799732 master-0 kubenswrapper[7547]: I0308 03:47:18.799687 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpsx7\" (UniqueName: \"kubernetes.io/projected/8efdcef9-9b31-4567-b7f9-cb59a894273d-kube-api-access-cpsx7\") pod \"dns-operator-589895fbb7-xttlz\" (UID: \"8efdcef9-9b31-4567-b7f9-cb59a894273d\") " 
pod="openshift-dns-operator/dns-operator-589895fbb7-xttlz" Mar 08 03:47:18.823528 master-0 kubenswrapper[7547]: I0308 03:47:18.823487 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1cbcb403-a424-4496-8c5c-5eb5e42dfb93-kube-api-access\") pod \"kube-apiserver-operator-68bd585b-8gfmf\" (UID: \"1cbcb403-a424-4496-8c5c-5eb5e42dfb93\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-8gfmf" Mar 08 03:47:18.843914 master-0 kubenswrapper[7547]: I0308 03:47:18.843879 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e4541b7b-3f7f-4851-9bd9-26fcda5cab13-kube-api-access\") pod \"openshift-kube-scheduler-operator-5c74bfc494-g6n58\" (UID: \"e4541b7b-3f7f-4851-9bd9-26fcda5cab13\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-g6n58" Mar 08 03:47:18.872642 master-0 kubenswrapper[7547]: I0308 03:47:18.872602 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxxhh\" (UniqueName: \"kubernetes.io/projected/ee586416-6f56-4ea4-ad62-95de1e6df23b-kube-api-access-sxxhh\") pod \"insights-operator-8f89dfddd-4mr6p\" (UID: \"ee586416-6f56-4ea4-ad62-95de1e6df23b\") " pod="openshift-insights/insights-operator-8f89dfddd-4mr6p" Mar 08 03:47:18.900762 master-0 kubenswrapper[7547]: I0308 03:47:18.893764 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgc7c\" (UniqueName: \"kubernetes.io/projected/3ddfd0e7-fe76-41bc-b316-94505df81002-kube-api-access-bgc7c\") pod \"network-operator-7c649bf6d4-99d2k\" (UID: \"3ddfd0e7-fe76-41bc-b316-94505df81002\") " pod="openshift-network-operator/network-operator-7c649bf6d4-99d2k" Mar 08 03:47:18.911408 master-0 kubenswrapper[7547]: I0308 03:47:18.911337 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfvnn\" 
(UniqueName: \"kubernetes.io/projected/5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a-kube-api-access-cfvnn\") pod \"kube-storage-version-migrator-operator-7f65c457f5-6fhhs\" (UID: \"5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-6fhhs" Mar 08 03:47:18.939209 master-0 kubenswrapper[7547]: I0308 03:47:18.936506 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/0418ff42-7eac-4266-97b5-4df88623d066-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-clqwj\" (UID: \"0418ff42-7eac-4266-97b5-4df88623d066\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-clqwj" Mar 08 03:47:18.939209 master-0 kubenswrapper[7547]: I0308 03:47:18.936625 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e78b283b-981e-48d7-a5f2-53f8401766ea-auth-proxy-config\") pod \"machine-config-operator-fdb5c78b5-2vjh2\" (UID: \"e78b283b-981e-48d7-a5f2-53f8401766ea\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-2vjh2" Mar 08 03:47:18.939209 master-0 kubenswrapper[7547]: I0308 03:47:18.936694 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/349d438d-d124-4d34-a172-4160e766c680-serving-cert\") pod \"cluster-version-operator-745944c6b7-gvmnp\" (UID: \"349d438d-d124-4d34-a172-4160e766c680\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-gvmnp" Mar 08 03:47:18.939209 master-0 kubenswrapper[7547]: I0308 03:47:18.936758 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f-image-registry-operator-tls\") pod 
\"cluster-image-registry-operator-86d6d77c7c-572xh\" (UID: \"69eb8ba2-7bfb-4433-8951-08f89e7bcb5f\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-572xh" Mar 08 03:47:18.939209 master-0 kubenswrapper[7547]: I0308 03:47:18.936819 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/6cde5024-edf7-4fa4-8964-cabe7899578b-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-c46zz\" (UID: \"6cde5024-edf7-4fa4-8964-cabe7899578b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-c46zz" Mar 08 03:47:18.939209 master-0 kubenswrapper[7547]: I0308 03:47:18.936895 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-jghp5\" (UID: \"d831cb23-7411-4072-8273-c167d9afca28\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-jghp5" Mar 08 03:47:18.939209 master-0 kubenswrapper[7547]: I0308 03:47:18.936998 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-qjv52\" (UID: \"232c421d-96f0-4894-b8d8-74f43d02bbd3\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-qjv52" Mar 08 03:47:18.939209 master-0 kubenswrapper[7547]: I0308 03:47:18.937055 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-jghp5\" (UID: \"d831cb23-7411-4072-8273-c167d9afca28\") " 
pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-jghp5" Mar 08 03:47:18.939209 master-0 kubenswrapper[7547]: I0308 03:47:18.937088 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e78b283b-981e-48d7-a5f2-53f8401766ea-proxy-tls\") pod \"machine-config-operator-fdb5c78b5-2vjh2\" (UID: \"e78b283b-981e-48d7-a5f2-53f8401766ea\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-2vjh2" Mar 08 03:47:18.939209 master-0 kubenswrapper[7547]: I0308 03:47:18.937154 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1482d789-884b-4337-b598-f0e2b71eb9f2-srv-cert\") pod \"catalog-operator-7d9c49f57b-qlfgq\" (UID: \"1482d789-884b-4337-b598-f0e2b71eb9f2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-qlfgq" Mar 08 03:47:18.939209 master-0 kubenswrapper[7547]: I0308 03:47:18.937187 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c9de4939-680a-4e3e-89fd-e20ecb8b10f2-metrics-tls\") pod \"ingress-operator-677db989d6-t77qr\" (UID: \"c9de4939-680a-4e3e-89fd-e20ecb8b10f2\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-t77qr" Mar 08 03:47:18.939209 master-0 kubenswrapper[7547]: I0308 03:47:18.937245 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/2dd4279d-a1a9-450a-a061-9008cd1ea8e0-srv-cert\") pod \"olm-operator-d64cfc9db-qddlp\" (UID: \"2dd4279d-a1a9-450a-a061-9008cd1ea8e0\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qddlp" Mar 08 03:47:18.939209 master-0 kubenswrapper[7547]: I0308 03:47:18.937276 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/8efdcef9-9b31-4567-b7f9-cb59a894273d-metrics-tls\") pod \"dns-operator-589895fbb7-xttlz\" (UID: \"8efdcef9-9b31-4567-b7f9-cb59a894273d\") " pod="openshift-dns-operator/dns-operator-589895fbb7-xttlz" Mar 08 03:47:18.939209 master-0 kubenswrapper[7547]: I0308 03:47:18.937312 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/54ad284e-d40e-4e69-b898-f5093952a0e6-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-9sw2d\" (UID: \"54ad284e-d40e-4e69-b898-f5093952a0e6\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-9sw2d" Mar 08 03:47:18.939209 master-0 kubenswrapper[7547]: I0308 03:47:18.937347 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-qjv52\" (UID: \"232c421d-96f0-4894-b8d8-74f43d02bbd3\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-qjv52" Mar 08 03:47:18.939209 master-0 kubenswrapper[7547]: I0308 03:47:18.937400 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1eb851be-f157-48ea-9a39-1361b68d2639-webhook-certs\") pod \"multus-admission-controller-8d675b596-j8pv6\" (UID: \"1eb851be-f157-48ea-9a39-1361b68d2639\") " pod="openshift-multus/multus-admission-controller-8d675b596-j8pv6" Mar 08 03:47:18.939209 master-0 kubenswrapper[7547]: E0308 03:47:18.937657 7547 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 08 03:47:18.939209 master-0 kubenswrapper[7547]: E0308 03:47:18.937733 7547 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/1eb851be-f157-48ea-9a39-1361b68d2639-webhook-certs podName:1eb851be-f157-48ea-9a39-1361b68d2639 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:19.937709357 +0000 UTC m=+2.883393910 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/1eb851be-f157-48ea-9a39-1361b68d2639-webhook-certs") pod "multus-admission-controller-8d675b596-j8pv6" (UID: "1eb851be-f157-48ea-9a39-1361b68d2639") : secret "multus-admission-controller-secret" not found Mar 08 03:47:18.939209 master-0 kubenswrapper[7547]: E0308 03:47:18.938532 7547 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 08 03:47:18.939209 master-0 kubenswrapper[7547]: E0308 03:47:18.938583 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0418ff42-7eac-4266-97b5-4df88623d066-cluster-monitoring-operator-tls podName:0418ff42-7eac-4266-97b5-4df88623d066 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:19.938565838 +0000 UTC m=+2.884250381 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/0418ff42-7eac-4266-97b5-4df88623d066-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-clqwj" (UID: "0418ff42-7eac-4266-97b5-4df88623d066") : secret "cluster-monitoring-operator-tls" not found Mar 08 03:47:18.939209 master-0 kubenswrapper[7547]: E0308 03:47:18.938632 7547 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/kube-rbac-proxy: configmap "kube-rbac-proxy" not found Mar 08 03:47:18.939209 master-0 kubenswrapper[7547]: E0308 03:47:18.938666 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e78b283b-981e-48d7-a5f2-53f8401766ea-auth-proxy-config podName:e78b283b-981e-48d7-a5f2-53f8401766ea nodeName:}" failed. No retries permitted until 2026-03-08 03:47:19.9386543 +0000 UTC m=+2.884338843 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/e78b283b-981e-48d7-a5f2-53f8401766ea-auth-proxy-config") pod "machine-config-operator-fdb5c78b5-2vjh2" (UID: "e78b283b-981e-48d7-a5f2-53f8401766ea") : configmap "kube-rbac-proxy" not found Mar 08 03:47:18.939209 master-0 kubenswrapper[7547]: E0308 03:47:18.938730 7547 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 08 03:47:18.939209 master-0 kubenswrapper[7547]: E0308 03:47:18.938764 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/349d438d-d124-4d34-a172-4160e766c680-serving-cert podName:349d438d-d124-4d34-a172-4160e766c680 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:19.938753092 +0000 UTC m=+2.884437645 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/349d438d-d124-4d34-a172-4160e766c680-serving-cert") pod "cluster-version-operator-745944c6b7-gvmnp" (UID: "349d438d-d124-4d34-a172-4160e766c680") : secret "cluster-version-operator-serving-cert" not found
Mar 08 03:47:18.939209 master-0 kubenswrapper[7547]: E0308 03:47:18.938858 7547 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found
Mar 08 03:47:18.939209 master-0 kubenswrapper[7547]: E0308 03:47:18.938893 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f-image-registry-operator-tls podName:69eb8ba2-7bfb-4433-8951-08f89e7bcb5f nodeName:}" failed. No retries permitted until 2026-03-08 03:47:19.938882035 +0000 UTC m=+2.884566588 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f-image-registry-operator-tls") pod "cluster-image-registry-operator-86d6d77c7c-572xh" (UID: "69eb8ba2-7bfb-4433-8951-08f89e7bcb5f") : secret "image-registry-operator-tls" not found
Mar 08 03:47:18.939209 master-0 kubenswrapper[7547]: E0308 03:47:18.938956 7547 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found
Mar 08 03:47:18.939209 master-0 kubenswrapper[7547]: E0308 03:47:18.938992 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6cde5024-edf7-4fa4-8964-cabe7899578b-package-server-manager-serving-cert podName:6cde5024-edf7-4fa4-8964-cabe7899578b nodeName:}" failed. No retries permitted until 2026-03-08 03:47:19.938978617 +0000 UTC m=+2.884663160 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/6cde5024-edf7-4fa4-8964-cabe7899578b-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-c46zz" (UID: "6cde5024-edf7-4fa4-8964-cabe7899578b") : secret "package-server-manager-serving-cert" not found
Mar 08 03:47:18.939209 master-0 kubenswrapper[7547]: E0308 03:47:18.939049 7547 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-operator-tls: secret "cluster-baremetal-operator-tls" not found
Mar 08 03:47:18.939209 master-0 kubenswrapper[7547]: E0308 03:47:18.939082 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cluster-baremetal-operator-tls podName:d831cb23-7411-4072-8273-c167d9afca28 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:19.93907142 +0000 UTC m=+2.884755963 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "cluster-baremetal-operator-tls" (UniqueName: "kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cluster-baremetal-operator-tls") pod "cluster-baremetal-operator-5cdb4c5598-jghp5" (UID: "d831cb23-7411-4072-8273-c167d9afca28") : secret "cluster-baremetal-operator-tls" not found
Mar 08 03:47:18.939209 master-0 kubenswrapper[7547]: E0308 03:47:18.939141 7547 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found
Mar 08 03:47:18.939209 master-0 kubenswrapper[7547]: E0308 03:47:18.939176 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-node-tuning-operator-tls podName:232c421d-96f0-4894-b8d8-74f43d02bbd3 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:19.939165462 +0000 UTC m=+2.884850005 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-node-tuning-operator-tls") pod "cluster-node-tuning-operator-66c7586884-qjv52" (UID: "232c421d-96f0-4894-b8d8-74f43d02bbd3") : secret "node-tuning-operator-tls" not found
Mar 08 03:47:18.939209 master-0 kubenswrapper[7547]: E0308 03:47:18.939238 7547 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-webhook-server-cert: secret "cluster-baremetal-webhook-server-cert" not found
Mar 08 03:47:18.940521 master-0 kubenswrapper[7547]: E0308 03:47:18.939271 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cert podName:d831cb23-7411-4072-8273-c167d9afca28 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:19.939259854 +0000 UTC m=+2.884944407 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cert") pod "cluster-baremetal-operator-5cdb4c5598-jghp5" (UID: "d831cb23-7411-4072-8273-c167d9afca28") : secret "cluster-baremetal-webhook-server-cert" not found
Mar 08 03:47:18.940521 master-0 kubenswrapper[7547]: E0308 03:47:18.939331 7547 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found
Mar 08 03:47:18.940521 master-0 kubenswrapper[7547]: E0308 03:47:18.939400 7547 secret.go:189] Couldn't get secret openshift-machine-config-operator/mco-proxy-tls: secret "mco-proxy-tls" not found
Mar 08 03:47:18.940521 master-0 kubenswrapper[7547]: E0308 03:47:18.939507 7547 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found
Mar 08 03:47:18.940521 master-0 kubenswrapper[7547]: E0308 03:47:18.939344 7547 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found
Mar 08 03:47:18.940521 master-0 kubenswrapper[7547]: E0308 03:47:18.939566 7547 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found
Mar 08 03:47:18.940521 master-0 kubenswrapper[7547]: E0308 03:47:18.939509 7547 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found
Mar 08 03:47:18.940521 master-0 kubenswrapper[7547]: E0308 03:47:18.939416 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8efdcef9-9b31-4567-b7f9-cb59a894273d-metrics-tls podName:8efdcef9-9b31-4567-b7f9-cb59a894273d nodeName:}" failed. No retries permitted until 2026-03-08 03:47:19.939403087 +0000 UTC m=+2.885087640 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8efdcef9-9b31-4567-b7f9-cb59a894273d-metrics-tls") pod "dns-operator-589895fbb7-xttlz" (UID: "8efdcef9-9b31-4567-b7f9-cb59a894273d") : secret "metrics-tls" not found
Mar 08 03:47:18.940521 master-0 kubenswrapper[7547]: E0308 03:47:18.939620 7547 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found
Mar 08 03:47:18.940521 master-0 kubenswrapper[7547]: E0308 03:47:18.939633 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e78b283b-981e-48d7-a5f2-53f8401766ea-proxy-tls podName:e78b283b-981e-48d7-a5f2-53f8401766ea nodeName:}" failed. No retries permitted until 2026-03-08 03:47:19.939620072 +0000 UTC m=+2.885304615 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/e78b283b-981e-48d7-a5f2-53f8401766ea-proxy-tls") pod "machine-config-operator-fdb5c78b5-2vjh2" (UID: "e78b283b-981e-48d7-a5f2-53f8401766ea") : secret "mco-proxy-tls" not found
Mar 08 03:47:18.940521 master-0 kubenswrapper[7547]: E0308 03:47:18.939657 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1482d789-884b-4337-b598-f0e2b71eb9f2-srv-cert podName:1482d789-884b-4337-b598-f0e2b71eb9f2 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:19.939644453 +0000 UTC m=+2.885328996 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/1482d789-884b-4337-b598-f0e2b71eb9f2-srv-cert") pod "catalog-operator-7d9c49f57b-qlfgq" (UID: "1482d789-884b-4337-b598-f0e2b71eb9f2") : secret "catalog-operator-serving-cert" not found
Mar 08 03:47:18.940521 master-0 kubenswrapper[7547]: E0308 03:47:18.939710 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2dd4279d-a1a9-450a-a061-9008cd1ea8e0-srv-cert podName:2dd4279d-a1a9-450a-a061-9008cd1ea8e0 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:19.939700254 +0000 UTC m=+2.885384797 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/2dd4279d-a1a9-450a-a061-9008cd1ea8e0-srv-cert") pod "olm-operator-d64cfc9db-qddlp" (UID: "2dd4279d-a1a9-450a-a061-9008cd1ea8e0") : secret "olm-operator-serving-cert" not found
Mar 08 03:47:18.940521 master-0 kubenswrapper[7547]: E0308 03:47:18.939805 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c9de4939-680a-4e3e-89fd-e20ecb8b10f2-metrics-tls podName:c9de4939-680a-4e3e-89fd-e20ecb8b10f2 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:19.939721045 +0000 UTC m=+2.885405598 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c9de4939-680a-4e3e-89fd-e20ecb8b10f2-metrics-tls") pod "ingress-operator-677db989d6-t77qr" (UID: "c9de4939-680a-4e3e-89fd-e20ecb8b10f2") : secret "metrics-tls" not found
Mar 08 03:47:18.940521 master-0 kubenswrapper[7547]: E0308 03:47:18.939852 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/54ad284e-d40e-4e69-b898-f5093952a0e6-marketplace-operator-metrics podName:54ad284e-d40e-4e69-b898-f5093952a0e6 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:19.939818627 +0000 UTC m=+2.885503170 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/54ad284e-d40e-4e69-b898-f5093952a0e6-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-9sw2d" (UID: "54ad284e-d40e-4e69-b898-f5093952a0e6") : secret "marketplace-operator-metrics" not found
Mar 08 03:47:18.940521 master-0 kubenswrapper[7547]: E0308 03:47:18.939874 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-apiservice-cert podName:232c421d-96f0-4894-b8d8-74f43d02bbd3 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:19.939864798 +0000 UTC m=+2.885549341 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-apiservice-cert") pod "cluster-node-tuning-operator-66c7586884-qjv52" (UID: "232c421d-96f0-4894-b8d8-74f43d02bbd3") : secret "performance-addon-operator-webhook-cert" not found
Mar 08 03:47:18.940521 master-0 kubenswrapper[7547]: I0308 03:47:18.940432 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dw7bx\" (UniqueName: \"kubernetes.io/projected/4a19441e-e61b-4d58-85db-813ae88e1f9b-kube-api-access-dw7bx\") pod \"multus-additional-cni-plugins-g564l\" (UID: \"4a19441e-e61b-4d58-85db-813ae88e1f9b\") " pod="openshift-multus/multus-additional-cni-plugins-g564l"
Mar 08 03:47:18.950330 master-0 kubenswrapper[7547]: I0308 03:47:18.950250 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxkm6\" (UniqueName: \"kubernetes.io/projected/a60bc804-52e7-422a-87fd-ac4c5aa90cb3-kube-api-access-zxkm6\") pod \"authentication-operator-7c6989d6c4-slm72\" (UID: \"a60bc804-52e7-422a-87fd-ac4c5aa90cb3\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-slm72"
Mar 08 03:47:18.961256 master-0 kubenswrapper[7547]: I0308 03:47:18.961206 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fncng\" (UniqueName: \"kubernetes.io/projected/30211469-7108-4820-a988-26fc4ced734e-kube-api-access-fncng\") pod \"openshift-apiserver-operator-799b6db4d7-75682\" (UID: \"30211469-7108-4820-a988-26fc4ced734e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-75682"
Mar 08 03:47:18.988556 master-0 kubenswrapper[7547]: I0308 03:47:18.988508 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkzb2\" (UniqueName: \"kubernetes.io/projected/4c5a0c1d-867a-4ce4-9570-ea66452c8db3-kube-api-access-mkzb2\") pod \"iptables-alerter-7c28p\" (UID: \"4c5a0c1d-867a-4ce4-9570-ea66452c8db3\") " pod="openshift-network-operator/iptables-alerter-7c28p"
Mar 08 03:47:18.989182 master-0 kubenswrapper[7547]: I0308 03:47:18.989081 7547 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 03:47:19.006685 master-0 kubenswrapper[7547]: I0308 03:47:19.006658 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rchj5\" (UniqueName: \"kubernetes.io/projected/e78b283b-981e-48d7-a5f2-53f8401766ea-kube-api-access-rchj5\") pod \"machine-config-operator-fdb5c78b5-2vjh2\" (UID: \"e78b283b-981e-48d7-a5f2-53f8401766ea\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-2vjh2"
Mar 08 03:47:19.028663 master-0 kubenswrapper[7547]: I0308 03:47:19.028603 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqhzl\" (UniqueName: \"kubernetes.io/projected/1eb851be-f157-48ea-9a39-1361b68d2639-kube-api-access-nqhzl\") pod \"multus-admission-controller-8d675b596-j8pv6\" (UID: \"1eb851be-f157-48ea-9a39-1361b68d2639\") " pod="openshift-multus/multus-admission-controller-8d675b596-j8pv6"
Mar 08 03:47:19.041650 master-0 kubenswrapper[7547]: I0308 03:47:19.041604 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dd549\" (UniqueName: \"kubernetes.io/projected/52b495ac-bb28-44f3-b925-3c54f86d5ec4-kube-api-access-dd549\") pod \"csi-snapshot-controller-operator-5685fbc7d-xhbrl\" (UID: \"52b495ac-bb28-44f3-b925-3c54f86d5ec4\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-xhbrl"
Mar 08 03:47:19.059074 master-0 kubenswrapper[7547]: I0308 03:47:19.058596 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfqc5\" (UniqueName: \"kubernetes.io/projected/7ff63c73-62a3-44b4-acd3-1b3df175794f-kube-api-access-vfqc5\") pod \"cluster-olm-operator-77899cf6d-x9h9q\" (UID: \"7ff63c73-62a3-44b4-acd3-1b3df175794f\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-x9h9q"
Mar 08 03:47:19.078524 master-0 kubenswrapper[7547]: I0308 03:47:19.078478 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hccv4\" (UniqueName: \"kubernetes.io/projected/0ebf1330-e044-4ff5-8b48-2d667e0c5625-kube-api-access-hccv4\") pod \"openshift-controller-manager-operator-8565d84698-kt66j\" (UID: \"0ebf1330-e044-4ff5-8b48-2d667e0c5625\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-kt66j"
Mar 08 03:47:19.102341 master-0 kubenswrapper[7547]: I0308 03:47:19.102302 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4stz\" (UniqueName: \"kubernetes.io/projected/164586b1-f133-4427-8ab6-eb0839b79738-kube-api-access-r4stz\") pod \"network-node-identity-ggzm8\" (UID: \"164586b1-f133-4427-8ab6-eb0839b79738\") " pod="openshift-network-node-identity/network-node-identity-ggzm8"
Mar 08 03:47:19.119158 master-0 kubenswrapper[7547]: I0308 03:47:19.118958 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwkwt\" (UniqueName: \"kubernetes.io/projected/d831cb23-7411-4072-8273-c167d9afca28-kube-api-access-dwkwt\") pod \"cluster-baremetal-operator-5cdb4c5598-jghp5\" (UID: \"d831cb23-7411-4072-8273-c167d9afca28\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-jghp5"
Mar 08 03:47:19.121012 master-0 kubenswrapper[7547]: I0308 03:47:19.120684 7547 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 08 03:47:19.140485 master-0 kubenswrapper[7547]: I0308 03:47:19.140433 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d5044ffd-0686-4679-9894-e696faf33699-metrics-certs\") pod \"network-metrics-daemon-schjl\" (UID: \"d5044ffd-0686-4679-9894-e696faf33699\") " pod="openshift-multus/network-metrics-daemon-schjl"
Mar 08 03:47:19.140725 master-0 kubenswrapper[7547]: E0308 03:47:19.140693 7547 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Mar 08 03:47:19.140799 master-0 kubenswrapper[7547]: E0308 03:47:19.140768 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d5044ffd-0686-4679-9894-e696faf33699-metrics-certs podName:d5044ffd-0686-4679-9894-e696faf33699 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:20.140744602 +0000 UTC m=+3.086429145 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d5044ffd-0686-4679-9894-e696faf33699-metrics-certs") pod "network-metrics-daemon-schjl" (UID: "d5044ffd-0686-4679-9894-e696faf33699") : secret "metrics-daemon-secret" not found
Mar 08 03:47:19.142629 master-0 kubenswrapper[7547]: I0308 03:47:19.142581 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8b5zb\" (UniqueName: \"kubernetes.io/projected/5a7752f9-7b9a-451f-997a-e9f696d38b34-kube-api-access-8b5zb\") pod \"etcd-operator-5884b9cd56-vzms7\" (UID: \"5a7752f9-7b9a-451f-997a-e9f696d38b34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-vzms7"
Mar 08 03:47:19.159269 master-0 kubenswrapper[7547]: I0308 03:47:19.159228 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmpdd\" (UniqueName: \"kubernetes.io/projected/0418ff42-7eac-4266-97b5-4df88623d066-kube-api-access-kmpdd\") pod \"cluster-monitoring-operator-674cbfbd9d-clqwj\" (UID: \"0418ff42-7eac-4266-97b5-4df88623d066\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-clqwj"
Mar 08 03:47:19.177470 master-0 kubenswrapper[7547]: I0308 03:47:19.177421 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vklx\" (UniqueName: \"kubernetes.io/projected/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-kube-api-access-2vklx\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:47:19.197782 master-0 kubenswrapper[7547]: I0308 03:47:19.197741 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mghmh\" (UniqueName: \"kubernetes.io/projected/0918ba32-8e55-48d0-8e50-027c0dcb4bbd-kube-api-access-mghmh\") pod \"openshift-config-operator-64488f9d78-vfgfp\" (UID: \"0918ba32-8e55-48d0-8e50-027c0dcb4bbd\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-vfgfp"
Mar 08 03:47:19.231613 master-0 kubenswrapper[7547]: I0308 03:47:19.231448 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmhtb\" (UniqueName: \"kubernetes.io/projected/d5044ffd-0686-4679-9894-e696faf33699-kube-api-access-mmhtb\") pod \"network-metrics-daemon-schjl\" (UID: \"d5044ffd-0686-4679-9894-e696faf33699\") " pod="openshift-multus/network-metrics-daemon-schjl"
Mar 08 03:47:19.239596 master-0 kubenswrapper[7547]: I0308 03:47:19.239546 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sx5s\" (UniqueName: \"kubernetes.io/projected/b3eea925-73b3-4693-8f0e-6dd26107f60a-kube-api-access-6sx5s\") pod \"cluster-storage-operator-6fbfc8dc8f-nm8fj\" (UID: \"b3eea925-73b3-4693-8f0e-6dd26107f60a\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-nm8fj"
Mar 08 03:47:19.257550 master-0 kubenswrapper[7547]: I0308 03:47:19.257501 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qn5v\" (UniqueName: \"kubernetes.io/projected/0d377285-0336-41b7-b48f-c44a7b563498-kube-api-access-7qn5v\") pod \"service-ca-operator-69b6fc6b88-kg795\" (UID: \"0d377285-0336-41b7-b48f-c44a7b563498\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-kg795"
Mar 08 03:47:19.305737 master-0 kubenswrapper[7547]: I0308 03:47:19.305692 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/349d438d-d124-4d34-a172-4160e766c680-kube-api-access\") pod \"cluster-version-operator-745944c6b7-gvmnp\" (UID: \"349d438d-d124-4d34-a172-4160e766c680\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-gvmnp"
Mar 08 03:47:19.305953 master-0 kubenswrapper[7547]: I0308 03:47:19.305904 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x997v\" (UniqueName: \"kubernetes.io/projected/6cde5024-edf7-4fa4-8964-cabe7899578b-kube-api-access-x997v\") pod \"package-server-manager-854648ff6d-c46zz\" (UID: \"6cde5024-edf7-4fa4-8964-cabe7899578b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-c46zz"
Mar 08 03:47:19.325599 master-0 kubenswrapper[7547]: I0308 03:47:19.325564 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fw7mr\" (UniqueName: \"kubernetes.io/projected/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f-kube-api-access-fw7mr\") pod \"cluster-image-registry-operator-86d6d77c7c-572xh\" (UID: \"69eb8ba2-7bfb-4433-8951-08f89e7bcb5f\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-572xh"
Mar 08 03:47:19.325770 master-0 kubenswrapper[7547]: I0308 03:47:19.325752 7547 request.go:700] Waited for 1.004197333s due to client-side throttling, not priority and fairness, request: POST:https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-multus/serviceaccounts/default/token
Mar 08 03:47:19.342630 master-0 kubenswrapper[7547]: I0308 03:47:19.342594 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nk8r\" (UniqueName: \"kubernetes.io/projected/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-kube-api-access-7nk8r\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb"
Mar 08 03:47:19.360538 master-0 kubenswrapper[7547]: I0308 03:47:19.360496 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/26180f77-0b1a-4d0f-9ed0-a12fdee69817-kube-api-access\") pod \"kube-controller-manager-operator-86d7cdfdfb-chpl6\" (UID: \"26180f77-0b1a-4d0f-9ed0-a12fdee69817\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-chpl6"
Mar 08 03:47:19.380701 master-0 kubenswrapper[7547]: I0308 03:47:19.380645 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6gml\" (UniqueName: \"kubernetes.io/projected/7e5935ea-8d95-45e3-b836-c7892953ef3d-kube-api-access-c6gml\") pod \"ovnkube-control-plane-66b55d57d-bjjfh\" (UID: \"7e5935ea-8d95-45e3-b836-c7892953ef3d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-bjjfh"
Mar 08 03:47:19.399799 master-0 kubenswrapper[7547]: I0308 03:47:19.399752 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c9de4939-680a-4e3e-89fd-e20ecb8b10f2-bound-sa-token\") pod \"ingress-operator-677db989d6-t77qr\" (UID: \"c9de4939-680a-4e3e-89fd-e20ecb8b10f2\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-t77qr"
Mar 08 03:47:19.420748 master-0 kubenswrapper[7547]: I0308 03:47:19.420709 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lfcj\" (UniqueName: \"kubernetes.io/projected/54ad284e-d40e-4e69-b898-f5093952a0e6-kube-api-access-9lfcj\") pod \"marketplace-operator-64bf9778cb-9sw2d\" (UID: \"54ad284e-d40e-4e69-b898-f5093952a0e6\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-9sw2d"
Mar 08 03:47:19.441790 master-0 kubenswrapper[7547]: I0308 03:47:19.441703 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f-bound-sa-token\") pod \"cluster-image-registry-operator-86d6d77c7c-572xh\" (UID: \"69eb8ba2-7bfb-4433-8951-08f89e7bcb5f\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-572xh"
Mar 08 03:47:19.462111 master-0 kubenswrapper[7547]: I0308 03:47:19.462021 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2h62\" (UniqueName: \"kubernetes.io/projected/1482d789-884b-4337-b598-f0e2b71eb9f2-kube-api-access-m2h62\") pod \"catalog-operator-7d9c49f57b-qlfgq\" (UID: \"1482d789-884b-4337-b598-f0e2b71eb9f2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-qlfgq"
Mar 08 03:47:19.479293 master-0 kubenswrapper[7547]: I0308 03:47:19.479220 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnzt7\" (UniqueName: \"kubernetes.io/projected/2dd4279d-a1a9-450a-a061-9008cd1ea8e0-kube-api-access-pnzt7\") pod \"olm-operator-d64cfc9db-qddlp\" (UID: \"2dd4279d-a1a9-450a-a061-9008cd1ea8e0\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qddlp"
Mar 08 03:47:19.501972 master-0 kubenswrapper[7547]: I0308 03:47:19.501798 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fx4fw\" (UniqueName: \"kubernetes.io/projected/232c421d-96f0-4894-b8d8-74f43d02bbd3-kube-api-access-fx4fw\") pod \"cluster-node-tuning-operator-66c7586884-qjv52\" (UID: \"232c421d-96f0-4894-b8d8-74f43d02bbd3\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-qjv52"
Mar 08 03:47:19.521493 master-0 kubenswrapper[7547]: I0308 03:47:19.521444 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29dpg\" (UniqueName: \"kubernetes.io/projected/c9de4939-680a-4e3e-89fd-e20ecb8b10f2-kube-api-access-29dpg\") pod \"ingress-operator-677db989d6-t77qr\" (UID: \"c9de4939-680a-4e3e-89fd-e20ecb8b10f2\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-t77qr"
Mar 08 03:47:19.541527 master-0 kubenswrapper[7547]: I0308 03:47:19.541451 7547 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Mar 08 03:47:19.548617 master-0 kubenswrapper[7547]: I0308 03:47:19.548560 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kc5q\" (UniqueName: \"kubernetes.io/projected/e93b5361-30e6-44fd-a59e-2bc410c59480-kube-api-access-4kc5q\") pod \"network-check-target-xmgpj\" (UID: \"e93b5361-30e6-44fd-a59e-2bc410c59480\") " pod="openshift-network-diagnostics/network-check-target-xmgpj"
Mar 08 03:47:19.566907 master-0 kubenswrapper[7547]: E0308 03:47:19.566785 7547 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7220d16ea511c0f0410cf45db45aaafcc64847c9cb5732ad1eff39ceb482cdba"
Mar 08 03:47:19.567396 master-0 kubenswrapper[7547]: E0308 03:47:19.567313 7547 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:service-ca-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7220d16ea511c0f0410cf45db45aaafcc64847c9cb5732ad1eff39ceb482cdba,Command:[service-ca-operator operator],Args:[--config=/var/run/configmaps/config/operator-config.yaml -v=2],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7220d16ea511c0f0410cf45db45aaafcc64847c9cb5732ad1eff39ceb482cdba,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE_VERSION,Value:4.18.34,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{83886080 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/var/run/configmaps/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:serving-cert,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7qn5v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod service-ca-operator-69b6fc6b88-kg795_openshift-service-ca-operator(0d377285-0336-41b7-b48f-c44a7b563498): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 08 03:47:19.568645 master-0 kubenswrapper[7547]: E0308 03:47:19.568586 7547 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"service-ca-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-kg795" podUID="0d377285-0336-41b7-b48f-c44a7b563498"
Mar 08 03:47:19.725705 master-0 kubenswrapper[7547]: I0308 03:47:19.725555 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xmgpj"
Mar 08 03:47:19.748749 master-0 kubenswrapper[7547]: I0308 03:47:19.748624 7547 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:47:19.834743 master-0 kubenswrapper[7547]: I0308 03:47:19.834540 7547 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:47:19.949310 master-0 kubenswrapper[7547]: I0308 03:47:19.949246 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e78b283b-981e-48d7-a5f2-53f8401766ea-proxy-tls\") pod \"machine-config-operator-fdb5c78b5-2vjh2\" (UID: \"e78b283b-981e-48d7-a5f2-53f8401766ea\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-2vjh2"
Mar 08 03:47:19.949310 master-0 kubenswrapper[7547]: I0308 03:47:19.949292 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1482d789-884b-4337-b598-f0e2b71eb9f2-srv-cert\") pod \"catalog-operator-7d9c49f57b-qlfgq\" (UID: \"1482d789-884b-4337-b598-f0e2b71eb9f2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-qlfgq"
Mar 08 03:47:19.949310 master-0 kubenswrapper[7547]: I0308 03:47:19.949314 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c9de4939-680a-4e3e-89fd-e20ecb8b10f2-metrics-tls\") pod \"ingress-operator-677db989d6-t77qr\" (UID: \"c9de4939-680a-4e3e-89fd-e20ecb8b10f2\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-t77qr"
Mar 08 03:47:19.949310 master-0 kubenswrapper[7547]: I0308 03:47:19.949341 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8efdcef9-9b31-4567-b7f9-cb59a894273d-metrics-tls\") pod \"dns-operator-589895fbb7-xttlz\" (UID: \"8efdcef9-9b31-4567-b7f9-cb59a894273d\") " pod="openshift-dns-operator/dns-operator-589895fbb7-xttlz"
Mar 08 03:47:19.949974 master-0 kubenswrapper[7547]: E0308 03:47:19.949510 7547 secret.go:189] Couldn't get secret openshift-machine-config-operator/mco-proxy-tls: secret "mco-proxy-tls" not found
Mar 08 03:47:19.949974 master-0 kubenswrapper[7547]: I0308 03:47:19.949738 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/2dd4279d-a1a9-450a-a061-9008cd1ea8e0-srv-cert\") pod \"olm-operator-d64cfc9db-qddlp\" (UID: \"2dd4279d-a1a9-450a-a061-9008cd1ea8e0\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qddlp"
Mar 08 03:47:19.949974 master-0 kubenswrapper[7547]: E0308 03:47:19.949817 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e78b283b-981e-48d7-a5f2-53f8401766ea-proxy-tls podName:e78b283b-981e-48d7-a5f2-53f8401766ea nodeName:}" failed. No retries permitted until 2026-03-08 03:47:21.949786586 +0000 UTC m=+4.895471139 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/e78b283b-981e-48d7-a5f2-53f8401766ea-proxy-tls") pod "machine-config-operator-fdb5c78b5-2vjh2" (UID: "e78b283b-981e-48d7-a5f2-53f8401766ea") : secret "mco-proxy-tls" not found
Mar 08 03:47:19.949974 master-0 kubenswrapper[7547]: I0308 03:47:19.949905 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/54ad284e-d40e-4e69-b898-f5093952a0e6-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-9sw2d\" (UID: \"54ad284e-d40e-4e69-b898-f5093952a0e6\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-9sw2d"
Mar 08 03:47:19.949974 master-0 kubenswrapper[7547]: I0308 03:47:19.949958 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-qjv52\" (UID: \"232c421d-96f0-4894-b8d8-74f43d02bbd3\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-qjv52"
Mar 08 03:47:19.950171 master-0 kubenswrapper[7547]: I0308 03:47:19.949998 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1eb851be-f157-48ea-9a39-1361b68d2639-webhook-certs\") pod \"multus-admission-controller-8d675b596-j8pv6\" (UID: \"1eb851be-f157-48ea-9a39-1361b68d2639\") " pod="openshift-multus/multus-admission-controller-8d675b596-j8pv6"
Mar 08 03:47:19.950171 master-0 kubenswrapper[7547]: I0308 03:47:19.950068 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/0418ff42-7eac-4266-97b5-4df88623d066-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-clqwj\" (UID: \"0418ff42-7eac-4266-97b5-4df88623d066\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-clqwj"
Mar 08 03:47:19.950171 master-0 kubenswrapper[7547]: I0308 03:47:19.950109 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e78b283b-981e-48d7-a5f2-53f8401766ea-auth-proxy-config\") pod \"machine-config-operator-fdb5c78b5-2vjh2\" (UID: \"e78b283b-981e-48d7-a5f2-53f8401766ea\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-2vjh2"
Mar 08 03:47:19.950171 master-0 kubenswrapper[7547]: I0308 03:47:19.950155 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/349d438d-d124-4d34-a172-4160e766c680-serving-cert\") pod \"cluster-version-operator-745944c6b7-gvmnp\" (UID: \"349d438d-d124-4d34-a172-4160e766c680\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-gvmnp"
Mar 08 03:47:19.950324 master-0 kubenswrapper[7547]: I0308 03:47:19.950207 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-572xh\" (UID: \"69eb8ba2-7bfb-4433-8951-08f89e7bcb5f\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-572xh"
Mar 08 03:47:19.950324 master-0 kubenswrapper[7547]: I0308 03:47:19.950281 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/6cde5024-edf7-4fa4-8964-cabe7899578b-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-c46zz\" (UID: \"6cde5024-edf7-4fa4-8964-cabe7899578b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-c46zz"
Mar 08 03:47:19.950324 master-0 kubenswrapper[7547]: I0308 03:47:19.950318 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-jghp5\" (UID: \"d831cb23-7411-4072-8273-c167d9afca28\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-jghp5"
Mar 08 03:47:19.950421 master-0 kubenswrapper[7547]: I0308 03:47:19.950365 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-qjv52\" (UID: \"232c421d-96f0-4894-b8d8-74f43d02bbd3\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-qjv52"
Mar 08 03:47:19.950421 master-0 kubenswrapper[7547]: I0308 03:47:19.950399 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-jghp5\" (UID: \"d831cb23-7411-4072-8273-c167d9afca28\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-jghp5"
Mar 08 03:47:19.950575 master-0 kubenswrapper[7547]: E0308 03:47:19.949963 7547 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found
Mar 08 03:47:19.950617 master-0 kubenswrapper[7547]: E0308 03:47:19.950601 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2dd4279d-a1a9-450a-a061-9008cd1ea8e0-srv-cert podName:2dd4279d-a1a9-450a-a061-9008cd1ea8e0 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:21.950586864 +0000 UTC m=+4.896271417 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/2dd4279d-a1a9-450a-a061-9008cd1ea8e0-srv-cert") pod "olm-operator-d64cfc9db-qddlp" (UID: "2dd4279d-a1a9-450a-a061-9008cd1ea8e0") : secret "olm-operator-serving-cert" not found
Mar 08 03:47:19.950707 master-0 kubenswrapper[7547]: E0308 03:47:19.950677 7547 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found
Mar 08 03:47:19.950707 master-0 kubenswrapper[7547]: E0308 03:47:19.950699 7547 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found
Mar 08 03:47:19.950847 master-0 kubenswrapper[7547]: E0308 03:47:19.950724 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1eb851be-f157-48ea-9a39-1361b68d2639-webhook-certs podName:1eb851be-f157-48ea-9a39-1361b68d2639 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:21.950710447 +0000 UTC m=+4.896395000 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/1eb851be-f157-48ea-9a39-1361b68d2639-webhook-certs") pod "multus-admission-controller-8d675b596-j8pv6" (UID: "1eb851be-f157-48ea-9a39-1361b68d2639") : secret "multus-admission-controller-secret" not found
Mar 08 03:47:19.950847 master-0 kubenswrapper[7547]: E0308 03:47:19.950754 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/54ad284e-d40e-4e69-b898-f5093952a0e6-marketplace-operator-metrics podName:54ad284e-d40e-4e69-b898-f5093952a0e6 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:21.950737178 +0000 UTC m=+4.896421691 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/54ad284e-d40e-4e69-b898-f5093952a0e6-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-9sw2d" (UID: "54ad284e-d40e-4e69-b898-f5093952a0e6") : secret "marketplace-operator-metrics" not found Mar 08 03:47:19.950847 master-0 kubenswrapper[7547]: E0308 03:47:19.950786 7547 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 08 03:47:19.950847 master-0 kubenswrapper[7547]: E0308 03:47:19.950809 7547 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 08 03:47:19.950985 master-0 kubenswrapper[7547]: E0308 03:47:19.950861 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0418ff42-7eac-4266-97b5-4df88623d066-cluster-monitoring-operator-tls podName:0418ff42-7eac-4266-97b5-4df88623d066 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:21.950813889 +0000 UTC m=+4.896498442 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/0418ff42-7eac-4266-97b5-4df88623d066-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-clqwj" (UID: "0418ff42-7eac-4266-97b5-4df88623d066") : secret "cluster-monitoring-operator-tls" not found Mar 08 03:47:19.950985 master-0 kubenswrapper[7547]: E0308 03:47:19.950898 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6cde5024-edf7-4fa4-8964-cabe7899578b-package-server-manager-serving-cert podName:6cde5024-edf7-4fa4-8964-cabe7899578b nodeName:}" failed. No retries permitted until 2026-03-08 03:47:21.950880651 +0000 UTC m=+4.896565374 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/6cde5024-edf7-4fa4-8964-cabe7899578b-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-c46zz" (UID: "6cde5024-edf7-4fa4-8964-cabe7899578b") : secret "package-server-manager-serving-cert" not found Mar 08 03:47:19.950985 master-0 kubenswrapper[7547]: E0308 03:47:19.950007 7547 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 08 03:47:19.950985 master-0 kubenswrapper[7547]: E0308 03:47:19.950933 7547 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/kube-rbac-proxy: configmap "kube-rbac-proxy" not found Mar 08 03:47:19.950985 master-0 kubenswrapper[7547]: E0308 03:47:19.950965 7547 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-operator-tls: secret "cluster-baremetal-operator-tls" not found Mar 08 03:47:19.951133 master-0 kubenswrapper[7547]: E0308 03:47:19.950972 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8efdcef9-9b31-4567-b7f9-cb59a894273d-metrics-tls podName:8efdcef9-9b31-4567-b7f9-cb59a894273d nodeName:}" failed. No retries permitted until 2026-03-08 03:47:21.950956823 +0000 UTC m=+4.896641336 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8efdcef9-9b31-4567-b7f9-cb59a894273d-metrics-tls") pod "dns-operator-589895fbb7-xttlz" (UID: "8efdcef9-9b31-4567-b7f9-cb59a894273d") : secret "metrics-tls" not found Mar 08 03:47:19.951133 master-0 kubenswrapper[7547]: E0308 03:47:19.951026 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e78b283b-981e-48d7-a5f2-53f8401766ea-auth-proxy-config podName:e78b283b-981e-48d7-a5f2-53f8401766ea nodeName:}" failed. No retries permitted until 2026-03-08 03:47:21.951013194 +0000 UTC m=+4.896697747 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/e78b283b-981e-48d7-a5f2-53f8401766ea-auth-proxy-config") pod "machine-config-operator-fdb5c78b5-2vjh2" (UID: "e78b283b-981e-48d7-a5f2-53f8401766ea") : configmap "kube-rbac-proxy" not found Mar 08 03:47:19.951133 master-0 kubenswrapper[7547]: E0308 03:47:19.951034 7547 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Mar 08 03:47:19.951133 master-0 kubenswrapper[7547]: E0308 03:47:19.951037 7547 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 08 03:47:19.951133 master-0 kubenswrapper[7547]: E0308 03:47:19.950117 7547 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 08 03:47:19.951133 master-0 kubenswrapper[7547]: E0308 03:47:19.950073 7547 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 08 03:47:19.951133 master-0 kubenswrapper[7547]: E0308 03:47:19.951082 7547 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 08 03:47:19.951133 master-0 kubenswrapper[7547]: E0308 03:47:19.950532 7547 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-webhook-server-cert: secret "cluster-baremetal-webhook-server-cert" not found Mar 08 03:47:19.951133 master-0 kubenswrapper[7547]: E0308 03:47:19.951049 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cluster-baremetal-operator-tls podName:d831cb23-7411-4072-8273-c167d9afca28 nodeName:}" failed. 
No retries permitted until 2026-03-08 03:47:21.951036665 +0000 UTC m=+4.896721208 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-baremetal-operator-tls" (UniqueName: "kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cluster-baremetal-operator-tls") pod "cluster-baremetal-operator-5cdb4c5598-jghp5" (UID: "d831cb23-7411-4072-8273-c167d9afca28") : secret "cluster-baremetal-operator-tls" not found Mar 08 03:47:19.951386 master-0 kubenswrapper[7547]: E0308 03:47:19.951160 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f-image-registry-operator-tls podName:69eb8ba2-7bfb-4433-8951-08f89e7bcb5f nodeName:}" failed. No retries permitted until 2026-03-08 03:47:21.951151277 +0000 UTC m=+4.896835790 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f-image-registry-operator-tls") pod "cluster-image-registry-operator-86d6d77c7c-572xh" (UID: "69eb8ba2-7bfb-4433-8951-08f89e7bcb5f") : secret "image-registry-operator-tls" not found Mar 08 03:47:19.951386 master-0 kubenswrapper[7547]: E0308 03:47:19.950042 7547 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 08 03:47:19.951386 master-0 kubenswrapper[7547]: E0308 03:47:19.951177 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-node-tuning-operator-tls podName:232c421d-96f0-4894-b8d8-74f43d02bbd3 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:21.951168248 +0000 UTC m=+4.896852761 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-node-tuning-operator-tls") pod "cluster-node-tuning-operator-66c7586884-qjv52" (UID: "232c421d-96f0-4894-b8d8-74f43d02bbd3") : secret "node-tuning-operator-tls" not found Mar 08 03:47:19.951386 master-0 kubenswrapper[7547]: E0308 03:47:19.951190 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-apiservice-cert podName:232c421d-96f0-4894-b8d8-74f43d02bbd3 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:21.951182978 +0000 UTC m=+4.896867491 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-apiservice-cert") pod "cluster-node-tuning-operator-66c7586884-qjv52" (UID: "232c421d-96f0-4894-b8d8-74f43d02bbd3") : secret "performance-addon-operator-webhook-cert" not found Mar 08 03:47:19.951386 master-0 kubenswrapper[7547]: E0308 03:47:19.951208 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1482d789-884b-4337-b598-f0e2b71eb9f2-srv-cert podName:1482d789-884b-4337-b598-f0e2b71eb9f2 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:21.951198518 +0000 UTC m=+4.896883261 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/1482d789-884b-4337-b598-f0e2b71eb9f2-srv-cert") pod "catalog-operator-7d9c49f57b-qlfgq" (UID: "1482d789-884b-4337-b598-f0e2b71eb9f2") : secret "catalog-operator-serving-cert" not found Mar 08 03:47:19.951386 master-0 kubenswrapper[7547]: E0308 03:47:19.951232 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/349d438d-d124-4d34-a172-4160e766c680-serving-cert podName:349d438d-d124-4d34-a172-4160e766c680 nodeName:}" failed. 
No retries permitted until 2026-03-08 03:47:21.951222079 +0000 UTC m=+4.896906812 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/349d438d-d124-4d34-a172-4160e766c680-serving-cert") pod "cluster-version-operator-745944c6b7-gvmnp" (UID: "349d438d-d124-4d34-a172-4160e766c680") : secret "cluster-version-operator-serving-cert" not found Mar 08 03:47:19.951386 master-0 kubenswrapper[7547]: E0308 03:47:19.951252 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cert podName:d831cb23-7411-4072-8273-c167d9afca28 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:21.951244209 +0000 UTC m=+4.896928932 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cert") pod "cluster-baremetal-operator-5cdb4c5598-jghp5" (UID: "d831cb23-7411-4072-8273-c167d9afca28") : secret "cluster-baremetal-webhook-server-cert" not found Mar 08 03:47:19.951386 master-0 kubenswrapper[7547]: E0308 03:47:19.951270 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c9de4939-680a-4e3e-89fd-e20ecb8b10f2-metrics-tls podName:c9de4939-680a-4e3e-89fd-e20ecb8b10f2 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:21.9512623 +0000 UTC m=+4.896947123 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c9de4939-680a-4e3e-89fd-e20ecb8b10f2-metrics-tls") pod "ingress-operator-677db989d6-t77qr" (UID: "c9de4939-680a-4e3e-89fd-e20ecb8b10f2") : secret "metrics-tls" not found Mar 08 03:47:19.978907 master-0 kubenswrapper[7547]: E0308 03:47:19.978811 7547 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8ceca1efee55b9fd5089428476bbc401fe73db7c0b0f5e16d4ad28ed0f0f9d43" Mar 08 03:47:19.979135 master-0 kubenswrapper[7547]: E0308 03:47:19.979066 7547 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:openshift-api,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8ceca1efee55b9fd5089428476bbc401fe73db7c0b0f5e16d4ad28ed0f0f9d43,Command:[write-available-featuresets --asset-output-dir=/available-featuregates --payload-version=$(OPERATOR_IMAGE_VERSION)],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:OPERATOR_IMAGE_VERSION,Value:4.18.34,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:available-featuregates,ReadOnly:false,MountPath:/available-featuregates,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mghmh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openshift-config-operator-64488f9d78-vfgfp_openshift-config-operator(0918ba32-8e55-48d0-8e50-027c0dcb4bbd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 08 03:47:19.980267 master-0 kubenswrapper[7547]: E0308 03:47:19.980204 7547 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-api\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vfgfp" podUID="0918ba32-8e55-48d0-8e50-027c0dcb4bbd" Mar 08 03:47:20.158168 master-0 kubenswrapper[7547]: I0308 03:47:20.157652 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d5044ffd-0686-4679-9894-e696faf33699-metrics-certs\") pod \"network-metrics-daemon-schjl\" (UID: 
\"d5044ffd-0686-4679-9894-e696faf33699\") " pod="openshift-multus/network-metrics-daemon-schjl" Mar 08 03:47:20.158438 master-0 kubenswrapper[7547]: E0308 03:47:20.158406 7547 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Mar 08 03:47:20.158685 master-0 kubenswrapper[7547]: E0308 03:47:20.158489 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d5044ffd-0686-4679-9894-e696faf33699-metrics-certs podName:d5044ffd-0686-4679-9894-e696faf33699 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:22.158469442 +0000 UTC m=+5.104153955 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d5044ffd-0686-4679-9894-e696faf33699-metrics-certs") pod "network-metrics-daemon-schjl" (UID: "d5044ffd-0686-4679-9894-e696faf33699") : secret "metrics-daemon-secret" not found Mar 08 03:47:20.175213 master-0 kubenswrapper[7547]: I0308 03:47:20.175158 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-xmgpj"] Mar 08 03:47:20.351902 master-0 kubenswrapper[7547]: I0308 03:47:20.347901 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-kt66j" event={"ID":"0ebf1330-e044-4ff5-8b48-2d667e0c5625","Type":"ContainerStarted","Data":"8a84af60c043e955bcc0105f0aa3f93048f54c92376777b25ef1335389f355a8"} Mar 08 03:47:20.351902 master-0 kubenswrapper[7547]: I0308 03:47:20.350332 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-75682" event={"ID":"30211469-7108-4820-a988-26fc4ced734e","Type":"ContainerStarted","Data":"47c3f7232d0f0bc4de9dbe2ca382d3e0709c3d618e0b06a088f2ef41c6b071e7"} Mar 08 03:47:20.373001 master-0 kubenswrapper[7547]: I0308 03:47:20.359368 7547 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-network-diagnostics/network-check-target-xmgpj" event={"ID":"e93b5361-30e6-44fd-a59e-2bc410c59480","Type":"ContainerStarted","Data":"dc0e0feca8d08363d7aeeb0e56e61f125a0c90431bd31ea7f3ad61c6ddd5d77c"} Mar 08 03:47:20.373001 master-0 kubenswrapper[7547]: I0308 03:47:20.361016 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-xhbrl" event={"ID":"52b495ac-bb28-44f3-b925-3c54f86d5ec4","Type":"ContainerStarted","Data":"28f9b8de30e138ff1c0f5121ef3d99218547c5cff5db5ed14816e1b1a9d2b199"} Mar 08 03:47:20.389443 master-0 kubenswrapper[7547]: I0308 03:47:20.389391 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-8f89dfddd-4mr6p" event={"ID":"ee586416-6f56-4ea4-ad62-95de1e6df23b","Type":"ContainerStarted","Data":"f0c21a56c7d12d77087ad5558ab608389fecd51a0d4bdef95c63dd3e4d27cfef"} Mar 08 03:47:20.652697 master-0 kubenswrapper[7547]: I0308 03:47:20.652354 7547 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 08 03:47:20.660456 master-0 kubenswrapper[7547]: I0308 03:47:20.660425 7547 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 08 03:47:21.039103 master-0 kubenswrapper[7547]: I0308 03:47:21.039048 7547 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:47:21.066892 master-0 kubenswrapper[7547]: I0308 03:47:21.064765 7547 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:47:21.367977 master-0 kubenswrapper[7547]: I0308 03:47:21.367650 7547 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-h4qlp"] Mar 08 03:47:21.368588 
master-0 kubenswrapper[7547]: E0308 03:47:21.368022 7547 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7" containerName="assisted-installer-controller" Mar 08 03:47:21.368588 master-0 kubenswrapper[7547]: I0308 03:47:21.368032 7547 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7" containerName="assisted-installer-controller" Mar 08 03:47:21.368588 master-0 kubenswrapper[7547]: E0308 03:47:21.368043 7547 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="689a1fe4-9189-4a55-a61a-94a155b8040d" containerName="prober" Mar 08 03:47:21.368588 master-0 kubenswrapper[7547]: I0308 03:47:21.368048 7547 state_mem.go:107] "Deleted CPUSet assignment" podUID="689a1fe4-9189-4a55-a61a-94a155b8040d" containerName="prober" Mar 08 03:47:21.368588 master-0 kubenswrapper[7547]: I0308 03:47:21.368116 7547 memory_manager.go:354] "RemoveStaleState removing state" podUID="689a1fe4-9189-4a55-a61a-94a155b8040d" containerName="prober" Mar 08 03:47:21.368588 master-0 kubenswrapper[7547]: I0308 03:47:21.368127 7547 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7" containerName="assisted-installer-controller" Mar 08 03:47:21.368588 master-0 kubenswrapper[7547]: I0308 03:47:21.368334 7547 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-h4qlp" Mar 08 03:47:21.384587 master-0 kubenswrapper[7547]: I0308 03:47:21.384555 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-h4qlp"] Mar 08 03:47:21.399952 master-0 kubenswrapper[7547]: I0308 03:47:21.399913 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xmgpj" event={"ID":"e93b5361-30e6-44fd-a59e-2bc410c59480","Type":"ContainerStarted","Data":"41a9f9cd1ae7708a16181457a2350c58bd1e5f1153bd50faa3c75f4f037b3c15"} Mar 08 03:47:21.400443 master-0 kubenswrapper[7547]: I0308 03:47:21.400426 7547 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xmgpj" Mar 08 03:47:21.403072 master-0 kubenswrapper[7547]: I0308 03:47:21.403048 7547 generic.go:334] "Generic (PLEG): container finished" podID="7ff63c73-62a3-44b4-acd3-1b3df175794f" containerID="b570308b9f2efb1190e0fe5138fb6f12ce5146071b70dc77e8c4a2c70d7d56d5" exitCode=0 Mar 08 03:47:21.403142 master-0 kubenswrapper[7547]: I0308 03:47:21.403091 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-x9h9q" event={"ID":"7ff63c73-62a3-44b4-acd3-1b3df175794f","Type":"ContainerDied","Data":"b570308b9f2efb1190e0fe5138fb6f12ce5146071b70dc77e8c4a2c70d7d56d5"} Mar 08 03:47:21.436102 master-0 kubenswrapper[7547]: I0308 03:47:21.436060 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-g6n58" event={"ID":"e4541b7b-3f7f-4851-9bd9-26fcda5cab13","Type":"ContainerStarted","Data":"2b3399b78be3045c232df9d3c4545d85577efb44cef1d2c0a18e98d67e4c7cb7"} Mar 08 03:47:21.462078 master-0 kubenswrapper[7547]: I0308 03:47:21.461981 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-6fhhs" event={"ID":"5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a","Type":"ContainerStarted","Data":"357e1b5825405be66b8754168b57e640c102e496555d0a6b7dd9834bacebf15e"} Mar 08 03:47:21.476858 master-0 kubenswrapper[7547]: I0308 03:47:21.476368 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-vzms7" event={"ID":"5a7752f9-7b9a-451f-997a-e9f696d38b34","Type":"ContainerStarted","Data":"37cbe69de0ead690fe5f97f7713b1a785d6ee472fbe38e74b1ca8bbb8ffc0b32"} Mar 08 03:47:21.481151 master-0 kubenswrapper[7547]: I0308 03:47:21.479176 7547 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 08 03:47:21.481151 master-0 kubenswrapper[7547]: I0308 03:47:21.479709 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-chpl6" event={"ID":"26180f77-0b1a-4d0f-9ed0-a12fdee69817","Type":"ContainerStarted","Data":"2b806592f345fd33c0e6baaad7d7fe21c75572bbe4983f5588e4e61c09a25b29"} Mar 08 03:47:21.487503 master-0 kubenswrapper[7547]: I0308 03:47:21.487469 7547 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 08 03:47:21.502695 master-0 kubenswrapper[7547]: I0308 03:47:21.502670 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vndvf\" (UniqueName: \"kubernetes.io/projected/9ec89e27-4360-48f2-a7ca-5d823bda4510-kube-api-access-vndvf\") pod \"csi-snapshot-controller-7577d6f48-h4qlp\" (UID: \"9ec89e27-4360-48f2-a7ca-5d823bda4510\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-h4qlp" Mar 08 03:47:21.507084 master-0 kubenswrapper[7547]: I0308 03:47:21.507053 7547 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-kube-storage-version-migrator/migrator-57ccdf9b5-wqldq"] Mar 08 03:47:21.507535 master-0 kubenswrapper[7547]: I0308 03:47:21.507514 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-wqldq" Mar 08 03:47:21.511873 master-0 kubenswrapper[7547]: I0308 03:47:21.511839 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 08 03:47:21.512090 master-0 kubenswrapper[7547]: I0308 03:47:21.511933 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 08 03:47:21.520925 master-0 kubenswrapper[7547]: I0308 03:47:21.518401 7547 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:47:21.530293 master-0 kubenswrapper[7547]: I0308 03:47:21.530231 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-57ccdf9b5-wqldq"] Mar 08 03:47:21.545957 master-0 kubenswrapper[7547]: I0308 03:47:21.545813 7547 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:47:21.613736 master-0 kubenswrapper[7547]: I0308 03:47:21.610334 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vndvf\" (UniqueName: \"kubernetes.io/projected/9ec89e27-4360-48f2-a7ca-5d823bda4510-kube-api-access-vndvf\") pod \"csi-snapshot-controller-7577d6f48-h4qlp\" (UID: \"9ec89e27-4360-48f2-a7ca-5d823bda4510\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-h4qlp" Mar 08 03:47:21.613736 master-0 kubenswrapper[7547]: I0308 03:47:21.610637 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-7smmf\" (UniqueName: \"kubernetes.io/projected/84d353ae-3992-4c17-a20e-3415edd92509-kube-api-access-7smmf\") pod \"migrator-57ccdf9b5-wqldq\" (UID: \"84d353ae-3992-4c17-a20e-3415edd92509\") " pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-wqldq"
Mar 08 03:47:21.720502 master-0 kubenswrapper[7547]: I0308 03:47:21.720445 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7smmf\" (UniqueName: \"kubernetes.io/projected/84d353ae-3992-4c17-a20e-3415edd92509-kube-api-access-7smmf\") pod \"migrator-57ccdf9b5-wqldq\" (UID: \"84d353ae-3992-4c17-a20e-3415edd92509\") " pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-wqldq"
Mar 08 03:47:21.735755 master-0 kubenswrapper[7547]: I0308 03:47:21.734141 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vndvf\" (UniqueName: \"kubernetes.io/projected/9ec89e27-4360-48f2-a7ca-5d823bda4510-kube-api-access-vndvf\") pod \"csi-snapshot-controller-7577d6f48-h4qlp\" (UID: \"9ec89e27-4360-48f2-a7ca-5d823bda4510\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-h4qlp"
Mar 08 03:47:21.737155 master-0 kubenswrapper[7547]: I0308 03:47:21.737087 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7smmf\" (UniqueName: \"kubernetes.io/projected/84d353ae-3992-4c17-a20e-3415edd92509-kube-api-access-7smmf\") pod \"migrator-57ccdf9b5-wqldq\" (UID: \"84d353ae-3992-4c17-a20e-3415edd92509\") " pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-wqldq"
Mar 08 03:47:21.866040 master-0 kubenswrapper[7547]: I0308 03:47:21.865980 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-wqldq"
Mar 08 03:47:21.995710 master-0 kubenswrapper[7547]: I0308 03:47:21.995165 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-h4qlp"
Mar 08 03:47:22.025424 master-0 kubenswrapper[7547]: I0308 03:47:22.023164 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-qjv52\" (UID: \"232c421d-96f0-4894-b8d8-74f43d02bbd3\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-qjv52"
Mar 08 03:47:22.025424 master-0 kubenswrapper[7547]: I0308 03:47:22.023590 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-jghp5\" (UID: \"d831cb23-7411-4072-8273-c167d9afca28\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-jghp5"
Mar 08 03:47:22.025424 master-0 kubenswrapper[7547]: I0308 03:47:22.023612 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e78b283b-981e-48d7-a5f2-53f8401766ea-proxy-tls\") pod \"machine-config-operator-fdb5c78b5-2vjh2\" (UID: \"e78b283b-981e-48d7-a5f2-53f8401766ea\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-2vjh2"
Mar 08 03:47:22.025424 master-0 kubenswrapper[7547]: I0308 03:47:22.023643 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1482d789-884b-4337-b598-f0e2b71eb9f2-srv-cert\") pod \"catalog-operator-7d9c49f57b-qlfgq\" (UID: \"1482d789-884b-4337-b598-f0e2b71eb9f2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-qlfgq"
Mar 08 03:47:22.025424 master-0 kubenswrapper[7547]: I0308 03:47:22.023661 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c9de4939-680a-4e3e-89fd-e20ecb8b10f2-metrics-tls\") pod \"ingress-operator-677db989d6-t77qr\" (UID: \"c9de4939-680a-4e3e-89fd-e20ecb8b10f2\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-t77qr"
Mar 08 03:47:22.025424 master-0 kubenswrapper[7547]: I0308 03:47:22.023688 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/2dd4279d-a1a9-450a-a061-9008cd1ea8e0-srv-cert\") pod \"olm-operator-d64cfc9db-qddlp\" (UID: \"2dd4279d-a1a9-450a-a061-9008cd1ea8e0\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qddlp"
Mar 08 03:47:22.025424 master-0 kubenswrapper[7547]: I0308 03:47:22.023703 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8efdcef9-9b31-4567-b7f9-cb59a894273d-metrics-tls\") pod \"dns-operator-589895fbb7-xttlz\" (UID: \"8efdcef9-9b31-4567-b7f9-cb59a894273d\") " pod="openshift-dns-operator/dns-operator-589895fbb7-xttlz"
Mar 08 03:47:22.025424 master-0 kubenswrapper[7547]: I0308 03:47:22.023723 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/54ad284e-d40e-4e69-b898-f5093952a0e6-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-9sw2d\" (UID: \"54ad284e-d40e-4e69-b898-f5093952a0e6\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-9sw2d"
Mar 08 03:47:22.025424 master-0 kubenswrapper[7547]: I0308 03:47:22.023738 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-qjv52\" (UID: \"232c421d-96f0-4894-b8d8-74f43d02bbd3\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-qjv52"
Mar 08 03:47:22.025424 master-0 kubenswrapper[7547]: I0308 03:47:22.023755 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1eb851be-f157-48ea-9a39-1361b68d2639-webhook-certs\") pod \"multus-admission-controller-8d675b596-j8pv6\" (UID: \"1eb851be-f157-48ea-9a39-1361b68d2639\") " pod="openshift-multus/multus-admission-controller-8d675b596-j8pv6"
Mar 08 03:47:22.025424 master-0 kubenswrapper[7547]: I0308 03:47:22.023777 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/0418ff42-7eac-4266-97b5-4df88623d066-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-clqwj\" (UID: \"0418ff42-7eac-4266-97b5-4df88623d066\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-clqwj"
Mar 08 03:47:22.025424 master-0 kubenswrapper[7547]: I0308 03:47:22.023796 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e78b283b-981e-48d7-a5f2-53f8401766ea-auth-proxy-config\") pod \"machine-config-operator-fdb5c78b5-2vjh2\" (UID: \"e78b283b-981e-48d7-a5f2-53f8401766ea\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-2vjh2"
Mar 08 03:47:22.025424 master-0 kubenswrapper[7547]: I0308 03:47:22.023814 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/349d438d-d124-4d34-a172-4160e766c680-serving-cert\") pod \"cluster-version-operator-745944c6b7-gvmnp\" (UID: \"349d438d-d124-4d34-a172-4160e766c680\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-gvmnp"
Mar 08 03:47:22.025424 master-0 kubenswrapper[7547]: I0308 03:47:22.023858 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-572xh\" (UID: \"69eb8ba2-7bfb-4433-8951-08f89e7bcb5f\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-572xh"
Mar 08 03:47:22.025424 master-0 kubenswrapper[7547]: I0308 03:47:22.023879 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/6cde5024-edf7-4fa4-8964-cabe7899578b-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-c46zz\" (UID: \"6cde5024-edf7-4fa4-8964-cabe7899578b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-c46zz"
Mar 08 03:47:22.025424 master-0 kubenswrapper[7547]: I0308 03:47:22.023898 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-jghp5\" (UID: \"d831cb23-7411-4072-8273-c167d9afca28\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-jghp5"
Mar 08 03:47:22.025424 master-0 kubenswrapper[7547]: E0308 03:47:22.024473 7547 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-operator-tls: secret "cluster-baremetal-operator-tls" not found
Mar 08 03:47:22.025424 master-0 kubenswrapper[7547]: E0308 03:47:22.024533 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cluster-baremetal-operator-tls podName:d831cb23-7411-4072-8273-c167d9afca28 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:26.024515782 +0000 UTC m=+8.970200335 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-baremetal-operator-tls" (UniqueName: "kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cluster-baremetal-operator-tls") pod "cluster-baremetal-operator-5cdb4c5598-jghp5" (UID: "d831cb23-7411-4072-8273-c167d9afca28") : secret "cluster-baremetal-operator-tls" not found
Mar 08 03:47:22.025424 master-0 kubenswrapper[7547]: E0308 03:47:22.024941 7547 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found
Mar 08 03:47:22.025424 master-0 kubenswrapper[7547]: E0308 03:47:22.024965 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2dd4279d-a1a9-450a-a061-9008cd1ea8e0-srv-cert podName:2dd4279d-a1a9-450a-a061-9008cd1ea8e0 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:26.024957643 +0000 UTC m=+8.970642146 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/2dd4279d-a1a9-450a-a061-9008cd1ea8e0-srv-cert") pod "olm-operator-d64cfc9db-qddlp" (UID: "2dd4279d-a1a9-450a-a061-9008cd1ea8e0") : secret "olm-operator-serving-cert" not found
Mar 08 03:47:22.025424 master-0 kubenswrapper[7547]: E0308 03:47:22.025007 7547 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found
Mar 08 03:47:22.025424 master-0 kubenswrapper[7547]: E0308 03:47:22.025025 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c9de4939-680a-4e3e-89fd-e20ecb8b10f2-metrics-tls podName:c9de4939-680a-4e3e-89fd-e20ecb8b10f2 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:26.025019414 +0000 UTC m=+8.970703927 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c9de4939-680a-4e3e-89fd-e20ecb8b10f2-metrics-tls") pod "ingress-operator-677db989d6-t77qr" (UID: "c9de4939-680a-4e3e-89fd-e20ecb8b10f2") : secret "metrics-tls" not found
Mar 08 03:47:22.025424 master-0 kubenswrapper[7547]: E0308 03:47:22.025065 7547 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found
Mar 08 03:47:22.025424 master-0 kubenswrapper[7547]: E0308 03:47:22.025075 7547 secret.go:189] Couldn't get secret openshift-machine-config-operator/mco-proxy-tls: secret "mco-proxy-tls" not found
Mar 08 03:47:22.025424 master-0 kubenswrapper[7547]: E0308 03:47:22.025098 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8efdcef9-9b31-4567-b7f9-cb59a894273d-metrics-tls podName:8efdcef9-9b31-4567-b7f9-cb59a894273d nodeName:}" failed. No retries permitted until 2026-03-08 03:47:26.025077475 +0000 UTC m=+8.970761988 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8efdcef9-9b31-4567-b7f9-cb59a894273d-metrics-tls") pod "dns-operator-589895fbb7-xttlz" (UID: "8efdcef9-9b31-4567-b7f9-cb59a894273d") : secret "metrics-tls" not found
Mar 08 03:47:22.025424 master-0 kubenswrapper[7547]: E0308 03:47:22.025114 7547 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found
Mar 08 03:47:22.025424 master-0 kubenswrapper[7547]: E0308 03:47:22.025128 7547 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/kube-rbac-proxy: configmap "kube-rbac-proxy" not found
Mar 08 03:47:22.025424 master-0 kubenswrapper[7547]: E0308 03:47:22.025144 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e78b283b-981e-48d7-a5f2-53f8401766ea-proxy-tls podName:e78b283b-981e-48d7-a5f2-53f8401766ea nodeName:}" failed. No retries permitted until 2026-03-08 03:47:26.025126086 +0000 UTC m=+8.970810599 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/e78b283b-981e-48d7-a5f2-53f8401766ea-proxy-tls") pod "machine-config-operator-fdb5c78b5-2vjh2" (UID: "e78b283b-981e-48d7-a5f2-53f8401766ea") : secret "mco-proxy-tls" not found
Mar 08 03:47:22.025424 master-0 kubenswrapper[7547]: E0308 03:47:22.025159 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e78b283b-981e-48d7-a5f2-53f8401766ea-auth-proxy-config podName:e78b283b-981e-48d7-a5f2-53f8401766ea nodeName:}" failed. No retries permitted until 2026-03-08 03:47:26.025153077 +0000 UTC m=+8.970837590 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/e78b283b-981e-48d7-a5f2-53f8401766ea-auth-proxy-config") pod "machine-config-operator-fdb5c78b5-2vjh2" (UID: "e78b283b-981e-48d7-a5f2-53f8401766ea") : configmap "kube-rbac-proxy" not found
Mar 08 03:47:22.025424 master-0 kubenswrapper[7547]: E0308 03:47:22.025162 7547 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-webhook-server-cert: secret "cluster-baremetal-webhook-server-cert" not found
Mar 08 03:47:22.025424 master-0 kubenswrapper[7547]: E0308 03:47:22.025169 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/54ad284e-d40e-4e69-b898-f5093952a0e6-marketplace-operator-metrics podName:54ad284e-d40e-4e69-b898-f5093952a0e6 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:26.025164477 +0000 UTC m=+8.970848990 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/54ad284e-d40e-4e69-b898-f5093952a0e6-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-9sw2d" (UID: "54ad284e-d40e-4e69-b898-f5093952a0e6") : secret "marketplace-operator-metrics" not found
Mar 08 03:47:22.025424 master-0 kubenswrapper[7547]: E0308 03:47:22.025181 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cert podName:d831cb23-7411-4072-8273-c167d9afca28 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:26.025175368 +0000 UTC m=+8.970859881 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cert") pod "cluster-baremetal-operator-5cdb4c5598-jghp5" (UID: "d831cb23-7411-4072-8273-c167d9afca28") : secret "cluster-baremetal-webhook-server-cert" not found
Mar 08 03:47:22.025424 master-0 kubenswrapper[7547]: E0308 03:47:22.025192 7547 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found
Mar 08 03:47:22.025424 master-0 kubenswrapper[7547]: E0308 03:47:22.025210 7547 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found
Mar 08 03:47:22.025424 master-0 kubenswrapper[7547]: E0308 03:47:22.025222 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1482d789-884b-4337-b598-f0e2b71eb9f2-srv-cert podName:1482d789-884b-4337-b598-f0e2b71eb9f2 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:26.025210258 +0000 UTC m=+8.970894831 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/1482d789-884b-4337-b598-f0e2b71eb9f2-srv-cert") pod "catalog-operator-7d9c49f57b-qlfgq" (UID: "1482d789-884b-4337-b598-f0e2b71eb9f2") : secret "catalog-operator-serving-cert" not found
Mar 08 03:47:22.025424 master-0 kubenswrapper[7547]: E0308 03:47:22.025240 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-apiservice-cert podName:232c421d-96f0-4894-b8d8-74f43d02bbd3 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:26.025232249 +0000 UTC m=+8.970916872 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-apiservice-cert") pod "cluster-node-tuning-operator-66c7586884-qjv52" (UID: "232c421d-96f0-4894-b8d8-74f43d02bbd3") : secret "performance-addon-operator-webhook-cert" not found
Mar 08 03:47:22.025424 master-0 kubenswrapper[7547]: E0308 03:47:22.025249 7547 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Mar 08 03:47:22.025424 master-0 kubenswrapper[7547]: E0308 03:47:22.025266 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0418ff42-7eac-4266-97b5-4df88623d066-cluster-monitoring-operator-tls podName:0418ff42-7eac-4266-97b5-4df88623d066 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:26.025260821 +0000 UTC m=+8.970945334 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/0418ff42-7eac-4266-97b5-4df88623d066-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-clqwj" (UID: "0418ff42-7eac-4266-97b5-4df88623d066") : secret "cluster-monitoring-operator-tls" not found
Mar 08 03:47:22.025424 master-0 kubenswrapper[7547]: E0308 03:47:22.025284 7547 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found
Mar 08 03:47:22.025424 master-0 kubenswrapper[7547]: E0308 03:47:22.025295 7547 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found
Mar 08 03:47:22.025424 master-0 kubenswrapper[7547]: E0308 03:47:22.025308 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1eb851be-f157-48ea-9a39-1361b68d2639-webhook-certs podName:1eb851be-f157-48ea-9a39-1361b68d2639 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:26.025301402 +0000 UTC m=+8.970986025 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/1eb851be-f157-48ea-9a39-1361b68d2639-webhook-certs") pod "multus-admission-controller-8d675b596-j8pv6" (UID: "1eb851be-f157-48ea-9a39-1361b68d2639") : secret "multus-admission-controller-secret" not found
Mar 08 03:47:22.025424 master-0 kubenswrapper[7547]: E0308 03:47:22.025323 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f-image-registry-operator-tls podName:69eb8ba2-7bfb-4433-8951-08f89e7bcb5f nodeName:}" failed. No retries permitted until 2026-03-08 03:47:26.025315912 +0000 UTC m=+8.971000545 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f-image-registry-operator-tls") pod "cluster-image-registry-operator-86d6d77c7c-572xh" (UID: "69eb8ba2-7bfb-4433-8951-08f89e7bcb5f") : secret "image-registry-operator-tls" not found
Mar 08 03:47:22.025424 master-0 kubenswrapper[7547]: E0308 03:47:22.025335 7547 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found
Mar 08 03:47:22.025424 master-0 kubenswrapper[7547]: E0308 03:47:22.025355 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-node-tuning-operator-tls podName:232c421d-96f0-4894-b8d8-74f43d02bbd3 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:26.025348823 +0000 UTC m=+8.971033336 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-node-tuning-operator-tls") pod "cluster-node-tuning-operator-66c7586884-qjv52" (UID: "232c421d-96f0-4894-b8d8-74f43d02bbd3") : secret "node-tuning-operator-tls" not found
Mar 08 03:47:22.025424 master-0 kubenswrapper[7547]: E0308 03:47:22.025364 7547 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found
Mar 08 03:47:22.025424 master-0 kubenswrapper[7547]: E0308 03:47:22.025410 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/349d438d-d124-4d34-a172-4160e766c680-serving-cert podName:349d438d-d124-4d34-a172-4160e766c680 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:26.025401904 +0000 UTC m=+8.971086417 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/349d438d-d124-4d34-a172-4160e766c680-serving-cert") pod "cluster-version-operator-745944c6b7-gvmnp" (UID: "349d438d-d124-4d34-a172-4160e766c680") : secret "cluster-version-operator-serving-cert" not found
Mar 08 03:47:22.025424 master-0 kubenswrapper[7547]: E0308 03:47:22.025416 7547 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found
Mar 08 03:47:22.025424 master-0 kubenswrapper[7547]: E0308 03:47:22.025440 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6cde5024-edf7-4fa4-8964-cabe7899578b-package-server-manager-serving-cert podName:6cde5024-edf7-4fa4-8964-cabe7899578b nodeName:}" failed. No retries permitted until 2026-03-08 03:47:26.025433235 +0000 UTC m=+8.971117868 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/6cde5024-edf7-4fa4-8964-cabe7899578b-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-c46zz" (UID: "6cde5024-edf7-4fa4-8964-cabe7899578b") : secret "package-server-manager-serving-cert" not found
Mar 08 03:47:22.096888 master-0 kubenswrapper[7547]: I0308 03:47:22.096771 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-57ccdf9b5-wqldq"]
Mar 08 03:47:22.172536 master-0 kubenswrapper[7547]: I0308 03:47:22.172469 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-h4qlp"]
Mar 08 03:47:22.180308 master-0 kubenswrapper[7547]: W0308 03:47:22.180264 7547 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ec89e27_4360_48f2_a7ca_5d823bda4510.slice/crio-f8dce45144c680f78255eabd0603f42f2f97c7b82b1aee0c5c17224722da19a3 WatchSource:0}: Error finding container f8dce45144c680f78255eabd0603f42f2f97c7b82b1aee0c5c17224722da19a3: Status 404 returned error can't find the container with id f8dce45144c680f78255eabd0603f42f2f97c7b82b1aee0c5c17224722da19a3
Mar 08 03:47:22.226232 master-0 kubenswrapper[7547]: I0308 03:47:22.226115 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d5044ffd-0686-4679-9894-e696faf33699-metrics-certs\") pod \"network-metrics-daemon-schjl\" (UID: \"d5044ffd-0686-4679-9894-e696faf33699\") " pod="openshift-multus/network-metrics-daemon-schjl"
Mar 08 03:47:22.226446 master-0 kubenswrapper[7547]: E0308 03:47:22.226334 7547 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Mar 08 03:47:22.226481 master-0 kubenswrapper[7547]: E0308 03:47:22.226444 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d5044ffd-0686-4679-9894-e696faf33699-metrics-certs podName:d5044ffd-0686-4679-9894-e696faf33699 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:26.22642593 +0000 UTC m=+9.172110443 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d5044ffd-0686-4679-9894-e696faf33699-metrics-certs") pod "network-metrics-daemon-schjl" (UID: "d5044ffd-0686-4679-9894-e696faf33699") : secret "metrics-daemon-secret" not found
Mar 08 03:47:22.356590 master-0 kubenswrapper[7547]: I0308 03:47:22.356479 7547 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 03:47:22.360617 master-0 kubenswrapper[7547]: I0308 03:47:22.360580 7547 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 03:47:22.484648 master-0 kubenswrapper[7547]: I0308 03:47:22.484605 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-h4qlp" event={"ID":"9ec89e27-4360-48f2-a7ca-5d823bda4510","Type":"ContainerStarted","Data":"f8dce45144c680f78255eabd0603f42f2f97c7b82b1aee0c5c17224722da19a3"}
Mar 08 03:47:22.487502 master-0 kubenswrapper[7547]: I0308 03:47:22.487445 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-wqldq" event={"ID":"84d353ae-3992-4c17-a20e-3415edd92509","Type":"ContainerStarted","Data":"8b3d4b079b2ebb85b87310aca0e7ee26b306a1f0013e66da4d3495d792aa5402"}
Mar 08 03:47:22.487502 master-0 kubenswrapper[7547]: I0308 03:47:22.487471 7547 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 08 03:47:22.591702 master-0 kubenswrapper[7547]: I0308 03:47:22.589136 7547 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6f7fd6c796-twx7s"]
Mar 08 03:47:22.591702 master-0 kubenswrapper[7547]: I0308 03:47:22.589714 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6f7fd6c796-twx7s"
Mar 08 03:47:22.593334 master-0 kubenswrapper[7547]: I0308 03:47:22.593185 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 08 03:47:22.593896 master-0 kubenswrapper[7547]: I0308 03:47:22.593770 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6f7fd6c796-twx7s"]
Mar 08 03:47:22.593950 master-0 kubenswrapper[7547]: I0308 03:47:22.593936 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 08 03:47:22.594070 master-0 kubenswrapper[7547]: I0308 03:47:22.594041 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 08 03:47:22.594242 master-0 kubenswrapper[7547]: I0308 03:47:22.594218 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 08 03:47:22.594346 master-0 kubenswrapper[7547]: I0308 03:47:22.594315 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 08 03:47:22.594398 master-0 kubenswrapper[7547]: I0308 03:47:22.594352 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 08 03:47:22.733944 master-0 kubenswrapper[7547]: I0308 03:47:22.733880 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f4df75b-f2f1-4107-99df-60b5a528c0a9-serving-cert\") pod \"controller-manager-6f7fd6c796-twx7s\" (UID: \"6f4df75b-f2f1-4107-99df-60b5a528c0a9\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-twx7s"
Mar 08 03:47:22.734218 master-0 kubenswrapper[7547]: I0308 03:47:22.733988 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6f4df75b-f2f1-4107-99df-60b5a528c0a9-proxy-ca-bundles\") pod \"controller-manager-6f7fd6c796-twx7s\" (UID: \"6f4df75b-f2f1-4107-99df-60b5a528c0a9\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-twx7s"
Mar 08 03:47:22.734218 master-0 kubenswrapper[7547]: I0308 03:47:22.734019 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6f4df75b-f2f1-4107-99df-60b5a528c0a9-client-ca\") pod \"controller-manager-6f7fd6c796-twx7s\" (UID: \"6f4df75b-f2f1-4107-99df-60b5a528c0a9\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-twx7s"
Mar 08 03:47:22.734218 master-0 kubenswrapper[7547]: I0308 03:47:22.734042 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f4df75b-f2f1-4107-99df-60b5a528c0a9-config\") pod \"controller-manager-6f7fd6c796-twx7s\" (UID: \"6f4df75b-f2f1-4107-99df-60b5a528c0a9\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-twx7s"
Mar 08 03:47:22.734389 master-0 kubenswrapper[7547]: I0308 03:47:22.734288 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzr4c\" (UniqueName: \"kubernetes.io/projected/6f4df75b-f2f1-4107-99df-60b5a528c0a9-kube-api-access-rzr4c\") pod \"controller-manager-6f7fd6c796-twx7s\" (UID: \"6f4df75b-f2f1-4107-99df-60b5a528c0a9\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-twx7s"
Mar 08 03:47:22.835355 master-0 kubenswrapper[7547]: I0308 03:47:22.835257 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzr4c\" (UniqueName: \"kubernetes.io/projected/6f4df75b-f2f1-4107-99df-60b5a528c0a9-kube-api-access-rzr4c\") pod \"controller-manager-6f7fd6c796-twx7s\" (UID: \"6f4df75b-f2f1-4107-99df-60b5a528c0a9\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-twx7s"
Mar 08 03:47:22.835766 master-0 kubenswrapper[7547]: I0308 03:47:22.835679 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f4df75b-f2f1-4107-99df-60b5a528c0a9-serving-cert\") pod \"controller-manager-6f7fd6c796-twx7s\" (UID: \"6f4df75b-f2f1-4107-99df-60b5a528c0a9\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-twx7s"
Mar 08 03:47:22.835969 master-0 kubenswrapper[7547]: E0308 03:47:22.835767 7547 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found
Mar 08 03:47:22.835969 master-0 kubenswrapper[7547]: E0308 03:47:22.835852 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f4df75b-f2f1-4107-99df-60b5a528c0a9-serving-cert podName:6f4df75b-f2f1-4107-99df-60b5a528c0a9 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:23.33581782 +0000 UTC m=+6.281502333 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/6f4df75b-f2f1-4107-99df-60b5a528c0a9-serving-cert") pod "controller-manager-6f7fd6c796-twx7s" (UID: "6f4df75b-f2f1-4107-99df-60b5a528c0a9") : secret "serving-cert" not found
Mar 08 03:47:22.835969 master-0 kubenswrapper[7547]: I0308 03:47:22.835888 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6f4df75b-f2f1-4107-99df-60b5a528c0a9-proxy-ca-bundles\") pod \"controller-manager-6f7fd6c796-twx7s\" (UID: \"6f4df75b-f2f1-4107-99df-60b5a528c0a9\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-twx7s"
Mar 08 03:47:22.835969 master-0 kubenswrapper[7547]: I0308 03:47:22.835920 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6f4df75b-f2f1-4107-99df-60b5a528c0a9-client-ca\") pod \"controller-manager-6f7fd6c796-twx7s\" (UID: \"6f4df75b-f2f1-4107-99df-60b5a528c0a9\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-twx7s"
Mar 08 03:47:22.835969 master-0 kubenswrapper[7547]: I0308 03:47:22.835945 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f4df75b-f2f1-4107-99df-60b5a528c0a9-config\") pod \"controller-manager-6f7fd6c796-twx7s\" (UID: \"6f4df75b-f2f1-4107-99df-60b5a528c0a9\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-twx7s"
Mar 08 03:47:22.836385 master-0 kubenswrapper[7547]: E0308 03:47:22.836094 7547 configmap.go:193] Couldn't get configMap openshift-controller-manager/config: configmap "config" not found
Mar 08 03:47:22.836385 master-0 kubenswrapper[7547]: E0308 03:47:22.836126 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6f4df75b-f2f1-4107-99df-60b5a528c0a9-config podName:6f4df75b-f2f1-4107-99df-60b5a528c0a9 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:23.336116227 +0000 UTC m=+6.281800740 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/6f4df75b-f2f1-4107-99df-60b5a528c0a9-config") pod "controller-manager-6f7fd6c796-twx7s" (UID: "6f4df75b-f2f1-4107-99df-60b5a528c0a9") : configmap "config" not found
Mar 08 03:47:22.836385 master-0 kubenswrapper[7547]: E0308 03:47:22.836175 7547 configmap.go:193] Couldn't get configMap openshift-controller-manager/openshift-global-ca: configmap "openshift-global-ca" not found
Mar 08 03:47:22.836385 master-0 kubenswrapper[7547]: E0308 03:47:22.836212 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6f4df75b-f2f1-4107-99df-60b5a528c0a9-proxy-ca-bundles podName:6f4df75b-f2f1-4107-99df-60b5a528c0a9 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:23.336204049 +0000 UTC m=+6.281888562 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-ca-bundles" (UniqueName: "kubernetes.io/configmap/6f4df75b-f2f1-4107-99df-60b5a528c0a9-proxy-ca-bundles") pod "controller-manager-6f7fd6c796-twx7s" (UID: "6f4df75b-f2f1-4107-99df-60b5a528c0a9") : configmap "openshift-global-ca" not found
Mar 08 03:47:22.836385 master-0 kubenswrapper[7547]: E0308 03:47:22.836245 7547 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found
Mar 08 03:47:22.836385 master-0 kubenswrapper[7547]: E0308 03:47:22.836266 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6f4df75b-f2f1-4107-99df-60b5a528c0a9-client-ca podName:6f4df75b-f2f1-4107-99df-60b5a528c0a9 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:23.33625903 +0000 UTC m=+6.281943543 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/6f4df75b-f2f1-4107-99df-60b5a528c0a9-client-ca") pod "controller-manager-6f7fd6c796-twx7s" (UID: "6f4df75b-f2f1-4107-99df-60b5a528c0a9") : configmap "client-ca" not found
Mar 08 03:47:22.859335 master-0 kubenswrapper[7547]: I0308 03:47:22.859270 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzr4c\" (UniqueName: \"kubernetes.io/projected/6f4df75b-f2f1-4107-99df-60b5a528c0a9-kube-api-access-rzr4c\") pod \"controller-manager-6f7fd6c796-twx7s\" (UID: \"6f4df75b-f2f1-4107-99df-60b5a528c0a9\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-twx7s"
Mar 08 03:47:23.341374 master-0 kubenswrapper[7547]: I0308 03:47:23.341282 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f4df75b-f2f1-4107-99df-60b5a528c0a9-serving-cert\") pod \"controller-manager-6f7fd6c796-twx7s\" (UID: \"6f4df75b-f2f1-4107-99df-60b5a528c0a9\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-twx7s"
Mar 08 03:47:23.341724 master-0 kubenswrapper[7547]: E0308 03:47:23.341663 7547 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found
Mar 08 03:47:23.341919 master-0 kubenswrapper[7547]: E0308 03:47:23.341869 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f4df75b-f2f1-4107-99df-60b5a528c0a9-serving-cert podName:6f4df75b-f2f1-4107-99df-60b5a528c0a9 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:24.341785781 +0000 UTC m=+7.287470294 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/6f4df75b-f2f1-4107-99df-60b5a528c0a9-serving-cert") pod "controller-manager-6f7fd6c796-twx7s" (UID: "6f4df75b-f2f1-4107-99df-60b5a528c0a9") : secret "serving-cert" not found
Mar 08 03:47:23.342673 master-0 kubenswrapper[7547]: I0308 03:47:23.342635 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6f4df75b-f2f1-4107-99df-60b5a528c0a9-proxy-ca-bundles\") pod \"controller-manager-6f7fd6c796-twx7s\" (UID: \"6f4df75b-f2f1-4107-99df-60b5a528c0a9\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-twx7s"
Mar 08 03:47:23.342751 master-0 kubenswrapper[7547]: I0308 03:47:23.342718 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6f4df75b-f2f1-4107-99df-60b5a528c0a9-client-ca\") pod \"controller-manager-6f7fd6c796-twx7s\" (UID: \"6f4df75b-f2f1-4107-99df-60b5a528c0a9\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-twx7s"
Mar 08 03:47:23.342804 master-0 kubenswrapper[7547]: I0308 03:47:23.342782 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f4df75b-f2f1-4107-99df-60b5a528c0a9-config\") pod \"controller-manager-6f7fd6c796-twx7s\" (UID: \"6f4df75b-f2f1-4107-99df-60b5a528c0a9\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-twx7s"
Mar 08 03:47:23.343246 master-0 kubenswrapper[7547]: E0308 03:47:23.343187 7547 configmap.go:193] Couldn't get configMap openshift-controller-manager/config: configmap "config" not found
Mar 08 03:47:23.346734 master-0 kubenswrapper[7547]: E0308 03:47:23.343368 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6f4df75b-f2f1-4107-99df-60b5a528c0a9-config podName:6f4df75b-f2f1-4107-99df-60b5a528c0a9 nodeName:}"
failed. No retries permitted until 2026-03-08 03:47:24.343357319 +0000 UTC m=+7.289041832 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/6f4df75b-f2f1-4107-99df-60b5a528c0a9-config") pod "controller-manager-6f7fd6c796-twx7s" (UID: "6f4df75b-f2f1-4107-99df-60b5a528c0a9") : configmap "config" not found Mar 08 03:47:23.346734 master-0 kubenswrapper[7547]: E0308 03:47:23.343433 7547 configmap.go:193] Couldn't get configMap openshift-controller-manager/openshift-global-ca: configmap "openshift-global-ca" not found Mar 08 03:47:23.346734 master-0 kubenswrapper[7547]: E0308 03:47:23.343455 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6f4df75b-f2f1-4107-99df-60b5a528c0a9-proxy-ca-bundles podName:6f4df75b-f2f1-4107-99df-60b5a528c0a9 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:24.343448341 +0000 UTC m=+7.289132854 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "proxy-ca-bundles" (UniqueName: "kubernetes.io/configmap/6f4df75b-f2f1-4107-99df-60b5a528c0a9-proxy-ca-bundles") pod "controller-manager-6f7fd6c796-twx7s" (UID: "6f4df75b-f2f1-4107-99df-60b5a528c0a9") : configmap "openshift-global-ca" not found Mar 08 03:47:23.346734 master-0 kubenswrapper[7547]: E0308 03:47:23.343510 7547 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Mar 08 03:47:23.346734 master-0 kubenswrapper[7547]: E0308 03:47:23.343530 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6f4df75b-f2f1-4107-99df-60b5a528c0a9-client-ca podName:6f4df75b-f2f1-4107-99df-60b5a528c0a9 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:24.343524103 +0000 UTC m=+7.289208836 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/6f4df75b-f2f1-4107-99df-60b5a528c0a9-client-ca") pod "controller-manager-6f7fd6c796-twx7s" (UID: "6f4df75b-f2f1-4107-99df-60b5a528c0a9") : configmap "client-ca" not found Mar 08 03:47:23.500801 master-0 kubenswrapper[7547]: I0308 03:47:23.500344 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-7c28p" event={"ID":"4c5a0c1d-867a-4ce4-9570-ea66452c8db3","Type":"ContainerStarted","Data":"40de018f1b76ab3578709bb7baf8823de1472d2ce674193b796771783f47d5df"} Mar 08 03:47:23.674670 master-0 kubenswrapper[7547]: I0308 03:47:23.674547 7547 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:47:23.683144 master-0 kubenswrapper[7547]: I0308 03:47:23.683082 7547 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:47:23.723514 master-0 kubenswrapper[7547]: I0308 03:47:23.722997 7547 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6f7fd6c796-twx7s"] Mar 08 03:47:23.723514 master-0 kubenswrapper[7547]: E0308 03:47:23.723291 7547 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca config proxy-ca-bundles serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-controller-manager/controller-manager-6f7fd6c796-twx7s" podUID="6f4df75b-f2f1-4107-99df-60b5a528c0a9" Mar 08 03:47:23.733930 master-0 kubenswrapper[7547]: I0308 03:47:23.733873 7547 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d8669fddc-zn4lq"] Mar 08 03:47:23.734567 master-0 kubenswrapper[7547]: I0308 03:47:23.734526 7547 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d8669fddc-zn4lq" Mar 08 03:47:23.751914 master-0 kubenswrapper[7547]: I0308 03:47:23.749154 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 08 03:47:23.751914 master-0 kubenswrapper[7547]: I0308 03:47:23.749358 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 08 03:47:23.751914 master-0 kubenswrapper[7547]: I0308 03:47:23.749476 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 08 03:47:23.751914 master-0 kubenswrapper[7547]: I0308 03:47:23.749611 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 08 03:47:23.751914 master-0 kubenswrapper[7547]: I0308 03:47:23.750025 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 08 03:47:23.770851 master-0 kubenswrapper[7547]: I0308 03:47:23.764269 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d8669fddc-zn4lq"] Mar 08 03:47:23.851690 master-0 kubenswrapper[7547]: I0308 03:47:23.851623 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/22ebd67b-43b2-4f9d-955b-eb848d9d55d4-client-ca\") pod \"route-controller-manager-6d8669fddc-zn4lq\" (UID: \"22ebd67b-43b2-4f9d-955b-eb848d9d55d4\") " pod="openshift-route-controller-manager/route-controller-manager-6d8669fddc-zn4lq" Mar 08 03:47:23.852456 master-0 kubenswrapper[7547]: I0308 03:47:23.851712 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/22ebd67b-43b2-4f9d-955b-eb848d9d55d4-config\") pod \"route-controller-manager-6d8669fddc-zn4lq\" (UID: \"22ebd67b-43b2-4f9d-955b-eb848d9d55d4\") " pod="openshift-route-controller-manager/route-controller-manager-6d8669fddc-zn4lq" Mar 08 03:47:23.852456 master-0 kubenswrapper[7547]: I0308 03:47:23.851782 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22ebd67b-43b2-4f9d-955b-eb848d9d55d4-serving-cert\") pod \"route-controller-manager-6d8669fddc-zn4lq\" (UID: \"22ebd67b-43b2-4f9d-955b-eb848d9d55d4\") " pod="openshift-route-controller-manager/route-controller-manager-6d8669fddc-zn4lq" Mar 08 03:47:23.852456 master-0 kubenswrapper[7547]: I0308 03:47:23.851876 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sccx\" (UniqueName: \"kubernetes.io/projected/22ebd67b-43b2-4f9d-955b-eb848d9d55d4-kube-api-access-9sccx\") pod \"route-controller-manager-6d8669fddc-zn4lq\" (UID: \"22ebd67b-43b2-4f9d-955b-eb848d9d55d4\") " pod="openshift-route-controller-manager/route-controller-manager-6d8669fddc-zn4lq" Mar 08 03:47:23.952943 master-0 kubenswrapper[7547]: I0308 03:47:23.952904 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22ebd67b-43b2-4f9d-955b-eb848d9d55d4-config\") pod \"route-controller-manager-6d8669fddc-zn4lq\" (UID: \"22ebd67b-43b2-4f9d-955b-eb848d9d55d4\") " pod="openshift-route-controller-manager/route-controller-manager-6d8669fddc-zn4lq" Mar 08 03:47:23.953235 master-0 kubenswrapper[7547]: I0308 03:47:23.952949 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22ebd67b-43b2-4f9d-955b-eb848d9d55d4-serving-cert\") pod \"route-controller-manager-6d8669fddc-zn4lq\" (UID: 
\"22ebd67b-43b2-4f9d-955b-eb848d9d55d4\") " pod="openshift-route-controller-manager/route-controller-manager-6d8669fddc-zn4lq" Mar 08 03:47:23.953235 master-0 kubenswrapper[7547]: I0308 03:47:23.952992 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sccx\" (UniqueName: \"kubernetes.io/projected/22ebd67b-43b2-4f9d-955b-eb848d9d55d4-kube-api-access-9sccx\") pod \"route-controller-manager-6d8669fddc-zn4lq\" (UID: \"22ebd67b-43b2-4f9d-955b-eb848d9d55d4\") " pod="openshift-route-controller-manager/route-controller-manager-6d8669fddc-zn4lq" Mar 08 03:47:23.953235 master-0 kubenswrapper[7547]: E0308 03:47:23.953185 7547 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Mar 08 03:47:23.953323 master-0 kubenswrapper[7547]: E0308 03:47:23.953268 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/22ebd67b-43b2-4f9d-955b-eb848d9d55d4-serving-cert podName:22ebd67b-43b2-4f9d-955b-eb848d9d55d4 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:24.45324672 +0000 UTC m=+7.398931233 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/22ebd67b-43b2-4f9d-955b-eb848d9d55d4-serving-cert") pod "route-controller-manager-6d8669fddc-zn4lq" (UID: "22ebd67b-43b2-4f9d-955b-eb848d9d55d4") : secret "serving-cert" not found Mar 08 03:47:23.953449 master-0 kubenswrapper[7547]: I0308 03:47:23.953418 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/22ebd67b-43b2-4f9d-955b-eb848d9d55d4-client-ca\") pod \"route-controller-manager-6d8669fddc-zn4lq\" (UID: \"22ebd67b-43b2-4f9d-955b-eb848d9d55d4\") " pod="openshift-route-controller-manager/route-controller-manager-6d8669fddc-zn4lq" Mar 08 03:47:23.953676 master-0 kubenswrapper[7547]: E0308 03:47:23.953647 7547 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Mar 08 03:47:23.953719 master-0 kubenswrapper[7547]: E0308 03:47:23.953692 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/22ebd67b-43b2-4f9d-955b-eb848d9d55d4-client-ca podName:22ebd67b-43b2-4f9d-955b-eb848d9d55d4 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:24.45367921 +0000 UTC m=+7.399363723 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/22ebd67b-43b2-4f9d-955b-eb848d9d55d4-client-ca") pod "route-controller-manager-6d8669fddc-zn4lq" (UID: "22ebd67b-43b2-4f9d-955b-eb848d9d55d4") : configmap "client-ca" not found Mar 08 03:47:23.955935 master-0 kubenswrapper[7547]: I0308 03:47:23.955908 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22ebd67b-43b2-4f9d-955b-eb848d9d55d4-config\") pod \"route-controller-manager-6d8669fddc-zn4lq\" (UID: \"22ebd67b-43b2-4f9d-955b-eb848d9d55d4\") " pod="openshift-route-controller-manager/route-controller-manager-6d8669fddc-zn4lq" Mar 08 03:47:23.971439 master-0 kubenswrapper[7547]: I0308 03:47:23.971392 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sccx\" (UniqueName: \"kubernetes.io/projected/22ebd67b-43b2-4f9d-955b-eb848d9d55d4-kube-api-access-9sccx\") pod \"route-controller-manager-6d8669fddc-zn4lq\" (UID: \"22ebd67b-43b2-4f9d-955b-eb848d9d55d4\") " pod="openshift-route-controller-manager/route-controller-manager-6d8669fddc-zn4lq" Mar 08 03:47:24.361550 master-0 kubenswrapper[7547]: I0308 03:47:24.361301 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f4df75b-f2f1-4107-99df-60b5a528c0a9-serving-cert\") pod \"controller-manager-6f7fd6c796-twx7s\" (UID: \"6f4df75b-f2f1-4107-99df-60b5a528c0a9\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-twx7s" Mar 08 03:47:24.361550 master-0 kubenswrapper[7547]: I0308 03:47:24.361392 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6f4df75b-f2f1-4107-99df-60b5a528c0a9-proxy-ca-bundles\") pod \"controller-manager-6f7fd6c796-twx7s\" (UID: \"6f4df75b-f2f1-4107-99df-60b5a528c0a9\") " 
pod="openshift-controller-manager/controller-manager-6f7fd6c796-twx7s" Mar 08 03:47:24.361550 master-0 kubenswrapper[7547]: I0308 03:47:24.361414 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6f4df75b-f2f1-4107-99df-60b5a528c0a9-client-ca\") pod \"controller-manager-6f7fd6c796-twx7s\" (UID: \"6f4df75b-f2f1-4107-99df-60b5a528c0a9\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-twx7s" Mar 08 03:47:24.361550 master-0 kubenswrapper[7547]: I0308 03:47:24.361437 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f4df75b-f2f1-4107-99df-60b5a528c0a9-config\") pod \"controller-manager-6f7fd6c796-twx7s\" (UID: \"6f4df75b-f2f1-4107-99df-60b5a528c0a9\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-twx7s" Mar 08 03:47:24.362049 master-0 kubenswrapper[7547]: E0308 03:47:24.361718 7547 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Mar 08 03:47:24.362049 master-0 kubenswrapper[7547]: E0308 03:47:24.361775 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f4df75b-f2f1-4107-99df-60b5a528c0a9-serving-cert podName:6f4df75b-f2f1-4107-99df-60b5a528c0a9 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:26.361761886 +0000 UTC m=+9.307446399 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/6f4df75b-f2f1-4107-99df-60b5a528c0a9-serving-cert") pod "controller-manager-6f7fd6c796-twx7s" (UID: "6f4df75b-f2f1-4107-99df-60b5a528c0a9") : secret "serving-cert" not found Mar 08 03:47:24.362651 master-0 kubenswrapper[7547]: I0308 03:47:24.362608 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f4df75b-f2f1-4107-99df-60b5a528c0a9-config\") pod \"controller-manager-6f7fd6c796-twx7s\" (UID: \"6f4df75b-f2f1-4107-99df-60b5a528c0a9\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-twx7s" Mar 08 03:47:24.363860 master-0 kubenswrapper[7547]: I0308 03:47:24.363786 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6f4df75b-f2f1-4107-99df-60b5a528c0a9-proxy-ca-bundles\") pod \"controller-manager-6f7fd6c796-twx7s\" (UID: \"6f4df75b-f2f1-4107-99df-60b5a528c0a9\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-twx7s" Mar 08 03:47:24.363994 master-0 kubenswrapper[7547]: E0308 03:47:24.363880 7547 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Mar 08 03:47:24.363994 master-0 kubenswrapper[7547]: E0308 03:47:24.363937 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6f4df75b-f2f1-4107-99df-60b5a528c0a9-client-ca podName:6f4df75b-f2f1-4107-99df-60b5a528c0a9 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:26.363926496 +0000 UTC m=+9.309611009 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/6f4df75b-f2f1-4107-99df-60b5a528c0a9-client-ca") pod "controller-manager-6f7fd6c796-twx7s" (UID: "6f4df75b-f2f1-4107-99df-60b5a528c0a9") : configmap "client-ca" not found Mar 08 03:47:24.463053 master-0 kubenswrapper[7547]: I0308 03:47:24.462960 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/22ebd67b-43b2-4f9d-955b-eb848d9d55d4-client-ca\") pod \"route-controller-manager-6d8669fddc-zn4lq\" (UID: \"22ebd67b-43b2-4f9d-955b-eb848d9d55d4\") " pod="openshift-route-controller-manager/route-controller-manager-6d8669fddc-zn4lq" Mar 08 03:47:24.463359 master-0 kubenswrapper[7547]: I0308 03:47:24.463144 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22ebd67b-43b2-4f9d-955b-eb848d9d55d4-serving-cert\") pod \"route-controller-manager-6d8669fddc-zn4lq\" (UID: \"22ebd67b-43b2-4f9d-955b-eb848d9d55d4\") " pod="openshift-route-controller-manager/route-controller-manager-6d8669fddc-zn4lq" Mar 08 03:47:24.463440 master-0 kubenswrapper[7547]: E0308 03:47:24.463363 7547 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Mar 08 03:47:24.463440 master-0 kubenswrapper[7547]: E0308 03:47:24.463435 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/22ebd67b-43b2-4f9d-955b-eb848d9d55d4-serving-cert podName:22ebd67b-43b2-4f9d-955b-eb848d9d55d4 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:25.463412961 +0000 UTC m=+8.409097514 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/22ebd67b-43b2-4f9d-955b-eb848d9d55d4-serving-cert") pod "route-controller-manager-6d8669fddc-zn4lq" (UID: "22ebd67b-43b2-4f9d-955b-eb848d9d55d4") : secret "serving-cert" not found Mar 08 03:47:24.464975 master-0 kubenswrapper[7547]: E0308 03:47:24.464007 7547 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Mar 08 03:47:24.464975 master-0 kubenswrapper[7547]: E0308 03:47:24.464069 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/22ebd67b-43b2-4f9d-955b-eb848d9d55d4-client-ca podName:22ebd67b-43b2-4f9d-955b-eb848d9d55d4 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:25.464051946 +0000 UTC m=+8.409736499 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/22ebd67b-43b2-4f9d-955b-eb848d9d55d4-client-ca") pod "route-controller-manager-6d8669fddc-zn4lq" (UID: "22ebd67b-43b2-4f9d-955b-eb848d9d55d4") : configmap "client-ca" not found Mar 08 03:47:24.505006 master-0 kubenswrapper[7547]: I0308 03:47:24.503854 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6f7fd6c796-twx7s" Mar 08 03:47:24.514908 master-0 kubenswrapper[7547]: I0308 03:47:24.514481 7547 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6f7fd6c796-twx7s" Mar 08 03:47:24.666471 master-0 kubenswrapper[7547]: I0308 03:47:24.666163 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f4df75b-f2f1-4107-99df-60b5a528c0a9-config\") pod \"6f4df75b-f2f1-4107-99df-60b5a528c0a9\" (UID: \"6f4df75b-f2f1-4107-99df-60b5a528c0a9\") " Mar 08 03:47:24.666471 master-0 kubenswrapper[7547]: I0308 03:47:24.666300 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6f4df75b-f2f1-4107-99df-60b5a528c0a9-proxy-ca-bundles\") pod \"6f4df75b-f2f1-4107-99df-60b5a528c0a9\" (UID: \"6f4df75b-f2f1-4107-99df-60b5a528c0a9\") " Mar 08 03:47:24.666471 master-0 kubenswrapper[7547]: I0308 03:47:24.666338 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzr4c\" (UniqueName: \"kubernetes.io/projected/6f4df75b-f2f1-4107-99df-60b5a528c0a9-kube-api-access-rzr4c\") pod \"6f4df75b-f2f1-4107-99df-60b5a528c0a9\" (UID: \"6f4df75b-f2f1-4107-99df-60b5a528c0a9\") " Mar 08 03:47:24.666951 master-0 kubenswrapper[7547]: I0308 03:47:24.666900 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f4df75b-f2f1-4107-99df-60b5a528c0a9-config" (OuterVolumeSpecName: "config") pod "6f4df75b-f2f1-4107-99df-60b5a528c0a9" (UID: "6f4df75b-f2f1-4107-99df-60b5a528c0a9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:47:24.667140 master-0 kubenswrapper[7547]: I0308 03:47:24.667072 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f4df75b-f2f1-4107-99df-60b5a528c0a9-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "6f4df75b-f2f1-4107-99df-60b5a528c0a9" (UID: "6f4df75b-f2f1-4107-99df-60b5a528c0a9"). 
InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:47:24.677813 master-0 kubenswrapper[7547]: I0308 03:47:24.677765 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f4df75b-f2f1-4107-99df-60b5a528c0a9-kube-api-access-rzr4c" (OuterVolumeSpecName: "kube-api-access-rzr4c") pod "6f4df75b-f2f1-4107-99df-60b5a528c0a9" (UID: "6f4df75b-f2f1-4107-99df-60b5a528c0a9"). InnerVolumeSpecName "kube-api-access-rzr4c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:47:24.768298 master-0 kubenswrapper[7547]: I0308 03:47:24.768255 7547 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f4df75b-f2f1-4107-99df-60b5a528c0a9-config\") on node \"master-0\" DevicePath \"\"" Mar 08 03:47:24.768364 master-0 kubenswrapper[7547]: I0308 03:47:24.768302 7547 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6f4df75b-f2f1-4107-99df-60b5a528c0a9-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\"" Mar 08 03:47:24.768364 master-0 kubenswrapper[7547]: I0308 03:47:24.768318 7547 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzr4c\" (UniqueName: \"kubernetes.io/projected/6f4df75b-f2f1-4107-99df-60b5a528c0a9-kube-api-access-rzr4c\") on node \"master-0\" DevicePath \"\"" Mar 08 03:47:25.478878 master-0 kubenswrapper[7547]: I0308 03:47:25.478789 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/22ebd67b-43b2-4f9d-955b-eb848d9d55d4-client-ca\") pod \"route-controller-manager-6d8669fddc-zn4lq\" (UID: \"22ebd67b-43b2-4f9d-955b-eb848d9d55d4\") " pod="openshift-route-controller-manager/route-controller-manager-6d8669fddc-zn4lq" Mar 08 03:47:25.479089 master-0 kubenswrapper[7547]: E0308 03:47:25.478972 7547 configmap.go:193] Couldn't get configMap 
openshift-route-controller-manager/client-ca: configmap "client-ca" not found Mar 08 03:47:25.479457 master-0 kubenswrapper[7547]: E0308 03:47:25.479247 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/22ebd67b-43b2-4f9d-955b-eb848d9d55d4-client-ca podName:22ebd67b-43b2-4f9d-955b-eb848d9d55d4 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:27.479226127 +0000 UTC m=+10.424910640 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/22ebd67b-43b2-4f9d-955b-eb848d9d55d4-client-ca") pod "route-controller-manager-6d8669fddc-zn4lq" (UID: "22ebd67b-43b2-4f9d-955b-eb848d9d55d4") : configmap "client-ca" not found Mar 08 03:47:25.479457 master-0 kubenswrapper[7547]: I0308 03:47:25.479278 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22ebd67b-43b2-4f9d-955b-eb848d9d55d4-serving-cert\") pod \"route-controller-manager-6d8669fddc-zn4lq\" (UID: \"22ebd67b-43b2-4f9d-955b-eb848d9d55d4\") " pod="openshift-route-controller-manager/route-controller-manager-6d8669fddc-zn4lq" Mar 08 03:47:25.479457 master-0 kubenswrapper[7547]: E0308 03:47:25.479399 7547 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Mar 08 03:47:25.479457 master-0 kubenswrapper[7547]: E0308 03:47:25.479441 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/22ebd67b-43b2-4f9d-955b-eb848d9d55d4-serving-cert podName:22ebd67b-43b2-4f9d-955b-eb848d9d55d4 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:27.479432872 +0000 UTC m=+10.425117385 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/22ebd67b-43b2-4f9d-955b-eb848d9d55d4-serving-cert") pod "route-controller-manager-6d8669fddc-zn4lq" (UID: "22ebd67b-43b2-4f9d-955b-eb848d9d55d4") : secret "serving-cert" not found Mar 08 03:47:25.508140 master-0 kubenswrapper[7547]: I0308 03:47:25.508019 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6f7fd6c796-twx7s" Mar 08 03:47:25.510356 master-0 kubenswrapper[7547]: I0308 03:47:25.509145 7547 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 08 03:47:25.556038 master-0 kubenswrapper[7547]: I0308 03:47:25.555971 7547 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6f7fd6c796-twx7s"] Mar 08 03:47:25.567765 master-0 kubenswrapper[7547]: I0308 03:47:25.567142 7547 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6f7fd6c796-twx7s"] Mar 08 03:47:25.682808 master-0 kubenswrapper[7547]: I0308 03:47:25.682716 7547 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f4df75b-f2f1-4107-99df-60b5a528c0a9-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 08 03:47:25.682808 master-0 kubenswrapper[7547]: I0308 03:47:25.682778 7547 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6f4df75b-f2f1-4107-99df-60b5a528c0a9-client-ca\") on node \"master-0\" DevicePath \"\"" Mar 08 03:47:26.087286 master-0 kubenswrapper[7547]: I0308 03:47:26.086991 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/0418ff42-7eac-4266-97b5-4df88623d066-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-clqwj\" (UID: 
\"0418ff42-7eac-4266-97b5-4df88623d066\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-clqwj" Mar 08 03:47:26.087430 master-0 kubenswrapper[7547]: I0308 03:47:26.087304 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e78b283b-981e-48d7-a5f2-53f8401766ea-auth-proxy-config\") pod \"machine-config-operator-fdb5c78b5-2vjh2\" (UID: \"e78b283b-981e-48d7-a5f2-53f8401766ea\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-2vjh2" Mar 08 03:47:26.087430 master-0 kubenswrapper[7547]: I0308 03:47:26.087346 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/349d438d-d124-4d34-a172-4160e766c680-serving-cert\") pod \"cluster-version-operator-745944c6b7-gvmnp\" (UID: \"349d438d-d124-4d34-a172-4160e766c680\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-gvmnp" Mar 08 03:47:26.087430 master-0 kubenswrapper[7547]: E0308 03:47:26.087195 7547 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 08 03:47:26.087430 master-0 kubenswrapper[7547]: I0308 03:47:26.087396 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-572xh\" (UID: \"69eb8ba2-7bfb-4433-8951-08f89e7bcb5f\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-572xh" Mar 08 03:47:26.087430 master-0 kubenswrapper[7547]: E0308 03:47:26.087435 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0418ff42-7eac-4266-97b5-4df88623d066-cluster-monitoring-operator-tls podName:0418ff42-7eac-4266-97b5-4df88623d066 nodeName:}" failed. 
No retries permitted until 2026-03-08 03:47:34.087412098 +0000 UTC m=+17.033096621 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/0418ff42-7eac-4266-97b5-4df88623d066-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-clqwj" (UID: "0418ff42-7eac-4266-97b5-4df88623d066") : secret "cluster-monitoring-operator-tls" not found Mar 08 03:47:26.087789 master-0 kubenswrapper[7547]: E0308 03:47:26.087531 7547 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Mar 08 03:47:26.087789 master-0 kubenswrapper[7547]: I0308 03:47:26.087550 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/6cde5024-edf7-4fa4-8964-cabe7899578b-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-c46zz\" (UID: \"6cde5024-edf7-4fa4-8964-cabe7899578b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-c46zz" Mar 08 03:47:26.087789 master-0 kubenswrapper[7547]: E0308 03:47:26.087588 7547 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 08 03:47:26.087789 master-0 kubenswrapper[7547]: E0308 03:47:26.087624 7547 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/kube-rbac-proxy: configmap "kube-rbac-proxy" not found Mar 08 03:47:26.087789 master-0 kubenswrapper[7547]: E0308 03:47:26.087597 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f-image-registry-operator-tls podName:69eb8ba2-7bfb-4433-8951-08f89e7bcb5f nodeName:}" failed. No retries permitted until 2026-03-08 03:47:34.087576172 +0000 UTC m=+17.033260725 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f-image-registry-operator-tls") pod "cluster-image-registry-operator-86d6d77c7c-572xh" (UID: "69eb8ba2-7bfb-4433-8951-08f89e7bcb5f") : secret "image-registry-operator-tls" not found Mar 08 03:47:26.087789 master-0 kubenswrapper[7547]: E0308 03:47:26.087635 7547 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 08 03:47:26.087789 master-0 kubenswrapper[7547]: E0308 03:47:26.087661 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e78b283b-981e-48d7-a5f2-53f8401766ea-auth-proxy-config podName:e78b283b-981e-48d7-a5f2-53f8401766ea nodeName:}" failed. No retries permitted until 2026-03-08 03:47:34.087649734 +0000 UTC m=+17.033334287 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/e78b283b-981e-48d7-a5f2-53f8401766ea-auth-proxy-config") pod "machine-config-operator-fdb5c78b5-2vjh2" (UID: "e78b283b-981e-48d7-a5f2-53f8401766ea") : configmap "kube-rbac-proxy" not found Mar 08 03:47:26.087789 master-0 kubenswrapper[7547]: I0308 03:47:26.087662 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-jghp5\" (UID: \"d831cb23-7411-4072-8273-c167d9afca28\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-jghp5" Mar 08 03:47:26.087789 master-0 kubenswrapper[7547]: E0308 03:47:26.087678 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6cde5024-edf7-4fa4-8964-cabe7899578b-package-server-manager-serving-cert 
podName:6cde5024-edf7-4fa4-8964-cabe7899578b nodeName:}" failed. No retries permitted until 2026-03-08 03:47:34.087669874 +0000 UTC m=+17.033354397 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/6cde5024-edf7-4fa4-8964-cabe7899578b-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-c46zz" (UID: "6cde5024-edf7-4fa4-8964-cabe7899578b") : secret "package-server-manager-serving-cert" not found Mar 08 03:47:26.087789 master-0 kubenswrapper[7547]: I0308 03:47:26.087699 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-qjv52\" (UID: \"232c421d-96f0-4894-b8d8-74f43d02bbd3\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-qjv52" Mar 08 03:47:26.087789 master-0 kubenswrapper[7547]: E0308 03:47:26.087730 7547 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-operator-tls: secret "cluster-baremetal-operator-tls" not found Mar 08 03:47:26.087789 master-0 kubenswrapper[7547]: I0308 03:47:26.087738 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-jghp5\" (UID: \"d831cb23-7411-4072-8273-c167d9afca28\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-jghp5" Mar 08 03:47:26.087789 master-0 kubenswrapper[7547]: E0308 03:47:26.087765 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cluster-baremetal-operator-tls podName:d831cb23-7411-4072-8273-c167d9afca28 nodeName:}" failed. 
No retries permitted until 2026-03-08 03:47:34.087753546 +0000 UTC m=+17.033438089 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-baremetal-operator-tls" (UniqueName: "kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cluster-baremetal-operator-tls") pod "cluster-baremetal-operator-5cdb4c5598-jghp5" (UID: "d831cb23-7411-4072-8273-c167d9afca28") : secret "cluster-baremetal-operator-tls" not found Mar 08 03:47:26.087789 master-0 kubenswrapper[7547]: E0308 03:47:26.087785 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/349d438d-d124-4d34-a172-4160e766c680-serving-cert podName:349d438d-d124-4d34-a172-4160e766c680 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:34.087775307 +0000 UTC m=+17.033459860 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/349d438d-d124-4d34-a172-4160e766c680-serving-cert") pod "cluster-version-operator-745944c6b7-gvmnp" (UID: "349d438d-d124-4d34-a172-4160e766c680") : secret "cluster-version-operator-serving-cert" not found Mar 08 03:47:26.087789 master-0 kubenswrapper[7547]: E0308 03:47:26.087800 7547 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-webhook-server-cert: secret "cluster-baremetal-webhook-server-cert" not found Mar 08 03:47:26.087789 master-0 kubenswrapper[7547]: E0308 03:47:26.087805 7547 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 08 03:47:26.087789 master-0 kubenswrapper[7547]: I0308 03:47:26.087809 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e78b283b-981e-48d7-a5f2-53f8401766ea-proxy-tls\") pod \"machine-config-operator-fdb5c78b5-2vjh2\" (UID: \"e78b283b-981e-48d7-a5f2-53f8401766ea\") " 
pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-2vjh2" Mar 08 03:47:26.088811 master-0 kubenswrapper[7547]: E0308 03:47:26.087855 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cert podName:d831cb23-7411-4072-8273-c167d9afca28 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:34.087817788 +0000 UTC m=+17.033502311 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cert") pod "cluster-baremetal-operator-5cdb4c5598-jghp5" (UID: "d831cb23-7411-4072-8273-c167d9afca28") : secret "cluster-baremetal-webhook-server-cert" not found Mar 08 03:47:26.088811 master-0 kubenswrapper[7547]: E0308 03:47:26.087873 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-node-tuning-operator-tls podName:232c421d-96f0-4894-b8d8-74f43d02bbd3 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:34.087864079 +0000 UTC m=+17.033548602 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-node-tuning-operator-tls") pod "cluster-node-tuning-operator-66c7586884-qjv52" (UID: "232c421d-96f0-4894-b8d8-74f43d02bbd3") : secret "node-tuning-operator-tls" not found Mar 08 03:47:26.088811 master-0 kubenswrapper[7547]: I0308 03:47:26.087891 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1482d789-884b-4337-b598-f0e2b71eb9f2-srv-cert\") pod \"catalog-operator-7d9c49f57b-qlfgq\" (UID: \"1482d789-884b-4337-b598-f0e2b71eb9f2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-qlfgq" Mar 08 03:47:26.088811 master-0 kubenswrapper[7547]: I0308 03:47:26.087915 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c9de4939-680a-4e3e-89fd-e20ecb8b10f2-metrics-tls\") pod \"ingress-operator-677db989d6-t77qr\" (UID: \"c9de4939-680a-4e3e-89fd-e20ecb8b10f2\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-t77qr" Mar 08 03:47:26.088811 master-0 kubenswrapper[7547]: I0308 03:47:26.087950 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/2dd4279d-a1a9-450a-a061-9008cd1ea8e0-srv-cert\") pod \"olm-operator-d64cfc9db-qddlp\" (UID: \"2dd4279d-a1a9-450a-a061-9008cd1ea8e0\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qddlp" Mar 08 03:47:26.088811 master-0 kubenswrapper[7547]: I0308 03:47:26.087971 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8efdcef9-9b31-4567-b7f9-cb59a894273d-metrics-tls\") pod \"dns-operator-589895fbb7-xttlz\" (UID: \"8efdcef9-9b31-4567-b7f9-cb59a894273d\") " pod="openshift-dns-operator/dns-operator-589895fbb7-xttlz" Mar 08 03:47:26.088811 
master-0 kubenswrapper[7547]: I0308 03:47:26.088003 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/54ad284e-d40e-4e69-b898-f5093952a0e6-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-9sw2d\" (UID: \"54ad284e-d40e-4e69-b898-f5093952a0e6\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-9sw2d" Mar 08 03:47:26.088811 master-0 kubenswrapper[7547]: I0308 03:47:26.088026 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-qjv52\" (UID: \"232c421d-96f0-4894-b8d8-74f43d02bbd3\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-qjv52" Mar 08 03:47:26.088811 master-0 kubenswrapper[7547]: E0308 03:47:26.087919 7547 secret.go:189] Couldn't get secret openshift-machine-config-operator/mco-proxy-tls: secret "mco-proxy-tls" not found Mar 08 03:47:26.088811 master-0 kubenswrapper[7547]: E0308 03:47:26.088066 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e78b283b-981e-48d7-a5f2-53f8401766ea-proxy-tls podName:e78b283b-981e-48d7-a5f2-53f8401766ea nodeName:}" failed. No retries permitted until 2026-03-08 03:47:34.088057483 +0000 UTC m=+17.033742016 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/e78b283b-981e-48d7-a5f2-53f8401766ea-proxy-tls") pod "machine-config-operator-fdb5c78b5-2vjh2" (UID: "e78b283b-981e-48d7-a5f2-53f8401766ea") : secret "mco-proxy-tls" not found Mar 08 03:47:26.088811 master-0 kubenswrapper[7547]: E0308 03:47:26.087964 7547 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 08 03:47:26.088811 master-0 kubenswrapper[7547]: E0308 03:47:26.088083 7547 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 08 03:47:26.088811 master-0 kubenswrapper[7547]: E0308 03:47:26.088123 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8efdcef9-9b31-4567-b7f9-cb59a894273d-metrics-tls podName:8efdcef9-9b31-4567-b7f9-cb59a894273d nodeName:}" failed. No retries permitted until 2026-03-08 03:47:34.088111355 +0000 UTC m=+17.033795898 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8efdcef9-9b31-4567-b7f9-cb59a894273d-metrics-tls") pod "dns-operator-589895fbb7-xttlz" (UID: "8efdcef9-9b31-4567-b7f9-cb59a894273d") : secret "metrics-tls" not found Mar 08 03:47:26.088811 master-0 kubenswrapper[7547]: I0308 03:47:26.088164 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1eb851be-f157-48ea-9a39-1361b68d2639-webhook-certs\") pod \"multus-admission-controller-8d675b596-j8pv6\" (UID: \"1eb851be-f157-48ea-9a39-1361b68d2639\") " pod="openshift-multus/multus-admission-controller-8d675b596-j8pv6" Mar 08 03:47:26.088811 master-0 kubenswrapper[7547]: E0308 03:47:26.088184 7547 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 08 03:47:26.088811 master-0 kubenswrapper[7547]: E0308 03:47:26.088213 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2dd4279d-a1a9-450a-a061-9008cd1ea8e0-srv-cert podName:2dd4279d-a1a9-450a-a061-9008cd1ea8e0 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:34.088203107 +0000 UTC m=+17.033887640 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/2dd4279d-a1a9-450a-a061-9008cd1ea8e0-srv-cert") pod "olm-operator-d64cfc9db-qddlp" (UID: "2dd4279d-a1a9-450a-a061-9008cd1ea8e0") : secret "olm-operator-serving-cert" not found Mar 08 03:47:26.088811 master-0 kubenswrapper[7547]: E0308 03:47:26.088025 7547 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 08 03:47:26.088811 master-0 kubenswrapper[7547]: E0308 03:47:26.088227 7547 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 08 03:47:26.088811 master-0 kubenswrapper[7547]: E0308 03:47:26.088262 7547 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 08 03:47:26.088811 master-0 kubenswrapper[7547]: E0308 03:47:26.088265 7547 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 08 03:47:26.088811 master-0 kubenswrapper[7547]: E0308 03:47:26.088274 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c9de4939-680a-4e3e-89fd-e20ecb8b10f2-metrics-tls podName:c9de4939-680a-4e3e-89fd-e20ecb8b10f2 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:34.088262008 +0000 UTC m=+17.033946531 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c9de4939-680a-4e3e-89fd-e20ecb8b10f2-metrics-tls") pod "ingress-operator-677db989d6-t77qr" (UID: "c9de4939-680a-4e3e-89fd-e20ecb8b10f2") : secret "metrics-tls" not found Mar 08 03:47:26.088811 master-0 kubenswrapper[7547]: E0308 03:47:26.088313 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1482d789-884b-4337-b598-f0e2b71eb9f2-srv-cert podName:1482d789-884b-4337-b598-f0e2b71eb9f2 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:34.088305499 +0000 UTC m=+17.033990022 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/1482d789-884b-4337-b598-f0e2b71eb9f2-srv-cert") pod "catalog-operator-7d9c49f57b-qlfgq" (UID: "1482d789-884b-4337-b598-f0e2b71eb9f2") : secret "catalog-operator-serving-cert" not found Mar 08 03:47:26.088811 master-0 kubenswrapper[7547]: E0308 03:47:26.088327 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-apiservice-cert podName:232c421d-96f0-4894-b8d8-74f43d02bbd3 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:34.088320889 +0000 UTC m=+17.034005422 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-apiservice-cert") pod "cluster-node-tuning-operator-66c7586884-qjv52" (UID: "232c421d-96f0-4894-b8d8-74f43d02bbd3") : secret "performance-addon-operator-webhook-cert" not found Mar 08 03:47:26.088811 master-0 kubenswrapper[7547]: E0308 03:47:26.088356 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1eb851be-f157-48ea-9a39-1361b68d2639-webhook-certs podName:1eb851be-f157-48ea-9a39-1361b68d2639 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:34.0883354 +0000 UTC m=+17.034019933 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/1eb851be-f157-48ea-9a39-1361b68d2639-webhook-certs") pod "multus-admission-controller-8d675b596-j8pv6" (UID: "1eb851be-f157-48ea-9a39-1361b68d2639") : secret "multus-admission-controller-secret" not found Mar 08 03:47:26.088811 master-0 kubenswrapper[7547]: E0308 03:47:26.088373 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/54ad284e-d40e-4e69-b898-f5093952a0e6-marketplace-operator-metrics podName:54ad284e-d40e-4e69-b898-f5093952a0e6 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:34.08836445 +0000 UTC m=+17.034048973 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/54ad284e-d40e-4e69-b898-f5093952a0e6-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-9sw2d" (UID: "54ad284e-d40e-4e69-b898-f5093952a0e6") : secret "marketplace-operator-metrics" not found Mar 08 03:47:26.116271 master-0 kubenswrapper[7547]: I0308 03:47:26.116188 7547 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:47:26.123017 master-0 kubenswrapper[7547]: I0308 03:47:26.122971 7547 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:47:26.291085 master-0 kubenswrapper[7547]: I0308 03:47:26.290868 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d5044ffd-0686-4679-9894-e696faf33699-metrics-certs\") pod \"network-metrics-daemon-schjl\" (UID: \"d5044ffd-0686-4679-9894-e696faf33699\") " pod="openshift-multus/network-metrics-daemon-schjl" Mar 08 03:47:26.291085 master-0 kubenswrapper[7547]: E0308 03:47:26.291046 7547 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret 
"metrics-daemon-secret" not found Mar 08 03:47:26.291267 master-0 kubenswrapper[7547]: E0308 03:47:26.291096 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d5044ffd-0686-4679-9894-e696faf33699-metrics-certs podName:d5044ffd-0686-4679-9894-e696faf33699 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:34.291080257 +0000 UTC m=+17.236764770 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d5044ffd-0686-4679-9894-e696faf33699-metrics-certs") pod "network-metrics-daemon-schjl" (UID: "d5044ffd-0686-4679-9894-e696faf33699") : secret "metrics-daemon-secret" not found Mar 08 03:47:26.513908 master-0 kubenswrapper[7547]: I0308 03:47:26.513780 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-wqldq" event={"ID":"84d353ae-3992-4c17-a20e-3415edd92509","Type":"ContainerStarted","Data":"ffd40630448ab30a7e535faa3fa89869042b73f3352c517ced3ba6fdea06ca26"} Mar 08 03:47:26.513908 master-0 kubenswrapper[7547]: I0308 03:47:26.513906 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-wqldq" event={"ID":"84d353ae-3992-4c17-a20e-3415edd92509","Type":"ContainerStarted","Data":"468305ffb1ad8c7762a1bae9e272103c33bcb7f2a0044bfbf235ca4202eb6f5f"} Mar 08 03:47:26.515432 master-0 kubenswrapper[7547]: I0308 03:47:26.515369 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-nm8fj" event={"ID":"b3eea925-73b3-4693-8f0e-6dd26107f60a","Type":"ContainerStarted","Data":"9c4b058bc98e254a8a4b1a2af3561d6b7519c1e36ed6446917dcc85e6786652f"} Mar 08 03:47:26.516805 master-0 kubenswrapper[7547]: I0308 03:47:26.516721 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-slm72" 
event={"ID":"a60bc804-52e7-422a-87fd-ac4c5aa90cb3","Type":"ContainerStarted","Data":"d4bcecac644708f2f25ce7ab391ef8889989648db06b3e9db25dd3f64bfa6da8"} Mar 08 03:47:26.519150 master-0 kubenswrapper[7547]: I0308 03:47:26.519094 7547 generic.go:334] "Generic (PLEG): container finished" podID="7ff63c73-62a3-44b4-acd3-1b3df175794f" containerID="4387f662533a08651e81f74de2284f00d980e00423a2533ad7cf2a0699bd920f" exitCode=0 Mar 08 03:47:26.519150 master-0 kubenswrapper[7547]: I0308 03:47:26.519129 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-x9h9q" event={"ID":"7ff63c73-62a3-44b4-acd3-1b3df175794f","Type":"ContainerDied","Data":"4387f662533a08651e81f74de2284f00d980e00423a2533ad7cf2a0699bd920f"} Mar 08 03:47:26.520454 master-0 kubenswrapper[7547]: I0308 03:47:26.520380 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-h4qlp" event={"ID":"9ec89e27-4360-48f2-a7ca-5d823bda4510","Type":"ContainerStarted","Data":"e1cf094994e913e66c5a9e6e155292c3e34468235cb173dcf1919a0eed0dd4ca"} Mar 08 03:47:26.555082 master-0 kubenswrapper[7547]: I0308 03:47:26.554906 7547 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-h4qlp" podStartSLOduration=1.760077754 podStartE2EDuration="5.554878142s" podCreationTimestamp="2026-03-08 03:47:21 +0000 UTC" firstStartedPulling="2026-03-08 03:47:22.182953706 +0000 UTC m=+5.128638219" lastFinishedPulling="2026-03-08 03:47:25.977754104 +0000 UTC m=+8.923438607" observedRunningTime="2026-03-08 03:47:26.554428202 +0000 UTC m=+9.500112745" watchObservedRunningTime="2026-03-08 03:47:26.554878142 +0000 UTC m=+9.500562685" Mar 08 03:47:26.566548 master-0 kubenswrapper[7547]: I0308 03:47:26.556790 7547 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-wqldq" podStartSLOduration=1.718514415 podStartE2EDuration="5.556774067s" podCreationTimestamp="2026-03-08 03:47:21 +0000 UTC" firstStartedPulling="2026-03-08 03:47:22.123914545 +0000 UTC m=+5.069599058" lastFinishedPulling="2026-03-08 03:47:25.962174167 +0000 UTC m=+8.907858710" observedRunningTime="2026-03-08 03:47:26.535114737 +0000 UTC m=+9.480799260" watchObservedRunningTime="2026-03-08 03:47:26.556774067 +0000 UTC m=+9.502458620" Mar 08 03:47:27.158044 master-0 kubenswrapper[7547]: I0308 03:47:27.155878 7547 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7765bbc5bf-72v5f"] Mar 08 03:47:27.158322 master-0 kubenswrapper[7547]: I0308 03:47:27.158172 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7765bbc5bf-72v5f" Mar 08 03:47:27.164508 master-0 kubenswrapper[7547]: I0308 03:47:27.160228 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 08 03:47:27.164508 master-0 kubenswrapper[7547]: I0308 03:47:27.161134 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 08 03:47:27.164508 master-0 kubenswrapper[7547]: I0308 03:47:27.164155 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 08 03:47:27.164508 master-0 kubenswrapper[7547]: I0308 03:47:27.164480 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 08 03:47:27.168914 master-0 kubenswrapper[7547]: I0308 03:47:27.168602 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7765bbc5bf-72v5f"] Mar 08 03:47:27.168914 master-0 kubenswrapper[7547]: I0308 03:47:27.168838 7547 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 08 03:47:27.176034 master-0 kubenswrapper[7547]: I0308 03:47:27.176012 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 08 03:47:27.242223 master-0 kubenswrapper[7547]: I0308 03:47:27.242160 7547 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f4df75b-f2f1-4107-99df-60b5a528c0a9" path="/var/lib/kubelet/pods/6f4df75b-f2f1-4107-99df-60b5a528c0a9/volumes" Mar 08 03:47:27.308862 master-0 kubenswrapper[7547]: I0308 03:47:27.306468 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgxsp\" (UniqueName: \"kubernetes.io/projected/4710894b-9971-464b-ae78-0d8520542328-kube-api-access-lgxsp\") pod \"controller-manager-7765bbc5bf-72v5f\" (UID: \"4710894b-9971-464b-ae78-0d8520542328\") " pod="openshift-controller-manager/controller-manager-7765bbc5bf-72v5f" Mar 08 03:47:27.308862 master-0 kubenswrapper[7547]: I0308 03:47:27.306655 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4710894b-9971-464b-ae78-0d8520542328-config\") pod \"controller-manager-7765bbc5bf-72v5f\" (UID: \"4710894b-9971-464b-ae78-0d8520542328\") " pod="openshift-controller-manager/controller-manager-7765bbc5bf-72v5f" Mar 08 03:47:27.308862 master-0 kubenswrapper[7547]: I0308 03:47:27.306711 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4710894b-9971-464b-ae78-0d8520542328-serving-cert\") pod \"controller-manager-7765bbc5bf-72v5f\" (UID: \"4710894b-9971-464b-ae78-0d8520542328\") " pod="openshift-controller-manager/controller-manager-7765bbc5bf-72v5f" Mar 08 03:47:27.308862 master-0 kubenswrapper[7547]: I0308 03:47:27.306816 7547 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4710894b-9971-464b-ae78-0d8520542328-proxy-ca-bundles\") pod \"controller-manager-7765bbc5bf-72v5f\" (UID: \"4710894b-9971-464b-ae78-0d8520542328\") " pod="openshift-controller-manager/controller-manager-7765bbc5bf-72v5f" Mar 08 03:47:27.308862 master-0 kubenswrapper[7547]: I0308 03:47:27.306911 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4710894b-9971-464b-ae78-0d8520542328-client-ca\") pod \"controller-manager-7765bbc5bf-72v5f\" (UID: \"4710894b-9971-464b-ae78-0d8520542328\") " pod="openshift-controller-manager/controller-manager-7765bbc5bf-72v5f" Mar 08 03:47:27.407543 master-0 kubenswrapper[7547]: I0308 03:47:27.407495 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4710894b-9971-464b-ae78-0d8520542328-config\") pod \"controller-manager-7765bbc5bf-72v5f\" (UID: \"4710894b-9971-464b-ae78-0d8520542328\") " pod="openshift-controller-manager/controller-manager-7765bbc5bf-72v5f" Mar 08 03:47:27.407543 master-0 kubenswrapper[7547]: I0308 03:47:27.407536 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4710894b-9971-464b-ae78-0d8520542328-serving-cert\") pod \"controller-manager-7765bbc5bf-72v5f\" (UID: \"4710894b-9971-464b-ae78-0d8520542328\") " pod="openshift-controller-manager/controller-manager-7765bbc5bf-72v5f" Mar 08 03:47:27.407757 master-0 kubenswrapper[7547]: I0308 03:47:27.407717 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4710894b-9971-464b-ae78-0d8520542328-proxy-ca-bundles\") pod \"controller-manager-7765bbc5bf-72v5f\" (UID: 
\"4710894b-9971-464b-ae78-0d8520542328\") " pod="openshift-controller-manager/controller-manager-7765bbc5bf-72v5f" Mar 08 03:47:27.407882 master-0 kubenswrapper[7547]: I0308 03:47:27.407848 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4710894b-9971-464b-ae78-0d8520542328-client-ca\") pod \"controller-manager-7765bbc5bf-72v5f\" (UID: \"4710894b-9971-464b-ae78-0d8520542328\") " pod="openshift-controller-manager/controller-manager-7765bbc5bf-72v5f" Mar 08 03:47:27.407992 master-0 kubenswrapper[7547]: E0308 03:47:27.407965 7547 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Mar 08 03:47:27.408169 master-0 kubenswrapper[7547]: I0308 03:47:27.408110 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgxsp\" (UniqueName: \"kubernetes.io/projected/4710894b-9971-464b-ae78-0d8520542328-kube-api-access-lgxsp\") pod \"controller-manager-7765bbc5bf-72v5f\" (UID: \"4710894b-9971-464b-ae78-0d8520542328\") " pod="openshift-controller-manager/controller-manager-7765bbc5bf-72v5f" Mar 08 03:47:27.408261 master-0 kubenswrapper[7547]: E0308 03:47:27.408200 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4710894b-9971-464b-ae78-0d8520542328-client-ca podName:4710894b-9971-464b-ae78-0d8520542328 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:27.908121198 +0000 UTC m=+10.853805711 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/4710894b-9971-464b-ae78-0d8520542328-client-ca") pod "controller-manager-7765bbc5bf-72v5f" (UID: "4710894b-9971-464b-ae78-0d8520542328") : configmap "client-ca" not found
Mar 08 03:47:27.408261 master-0 kubenswrapper[7547]: E0308 03:47:27.408245 7547 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found
Mar 08 03:47:27.408335 master-0 kubenswrapper[7547]: E0308 03:47:27.408321 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4710894b-9971-464b-ae78-0d8520542328-serving-cert podName:4710894b-9971-464b-ae78-0d8520542328 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:27.908301562 +0000 UTC m=+10.853986075 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/4710894b-9971-464b-ae78-0d8520542328-serving-cert") pod "controller-manager-7765bbc5bf-72v5f" (UID: "4710894b-9971-464b-ae78-0d8520542328") : secret "serving-cert" not found
Mar 08 03:47:27.409004 master-0 kubenswrapper[7547]: I0308 03:47:27.408972 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4710894b-9971-464b-ae78-0d8520542328-config\") pod \"controller-manager-7765bbc5bf-72v5f\" (UID: \"4710894b-9971-464b-ae78-0d8520542328\") " pod="openshift-controller-manager/controller-manager-7765bbc5bf-72v5f"
Mar 08 03:47:27.409200 master-0 kubenswrapper[7547]: I0308 03:47:27.409169 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4710894b-9971-464b-ae78-0d8520542328-proxy-ca-bundles\") pod \"controller-manager-7765bbc5bf-72v5f\" (UID: \"4710894b-9971-464b-ae78-0d8520542328\") " pod="openshift-controller-manager/controller-manager-7765bbc5bf-72v5f"
Mar 08 03:47:27.439956 master-0 kubenswrapper[7547]: I0308 03:47:27.439888 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgxsp\" (UniqueName: \"kubernetes.io/projected/4710894b-9971-464b-ae78-0d8520542328-kube-api-access-lgxsp\") pod \"controller-manager-7765bbc5bf-72v5f\" (UID: \"4710894b-9971-464b-ae78-0d8520542328\") " pod="openshift-controller-manager/controller-manager-7765bbc5bf-72v5f"
Mar 08 03:47:27.509841 master-0 kubenswrapper[7547]: I0308 03:47:27.509752 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22ebd67b-43b2-4f9d-955b-eb848d9d55d4-serving-cert\") pod \"route-controller-manager-6d8669fddc-zn4lq\" (UID: \"22ebd67b-43b2-4f9d-955b-eb848d9d55d4\") " pod="openshift-route-controller-manager/route-controller-manager-6d8669fddc-zn4lq"
Mar 08 03:47:27.510089 master-0 kubenswrapper[7547]: E0308 03:47:27.509988 7547 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found
Mar 08 03:47:27.510089 master-0 kubenswrapper[7547]: E0308 03:47:27.510087 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/22ebd67b-43b2-4f9d-955b-eb848d9d55d4-serving-cert podName:22ebd67b-43b2-4f9d-955b-eb848d9d55d4 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:31.51006606 +0000 UTC m=+14.455750583 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/22ebd67b-43b2-4f9d-955b-eb848d9d55d4-serving-cert") pod "route-controller-manager-6d8669fddc-zn4lq" (UID: "22ebd67b-43b2-4f9d-955b-eb848d9d55d4") : secret "serving-cert" not found
Mar 08 03:47:27.510253 master-0 kubenswrapper[7547]: I0308 03:47:27.510208 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/22ebd67b-43b2-4f9d-955b-eb848d9d55d4-client-ca\") pod \"route-controller-manager-6d8669fddc-zn4lq\" (UID: \"22ebd67b-43b2-4f9d-955b-eb848d9d55d4\") " pod="openshift-route-controller-manager/route-controller-manager-6d8669fddc-zn4lq"
Mar 08 03:47:27.510440 master-0 kubenswrapper[7547]: E0308 03:47:27.510393 7547 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found
Mar 08 03:47:27.510518 master-0 kubenswrapper[7547]: E0308 03:47:27.510499 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/22ebd67b-43b2-4f9d-955b-eb848d9d55d4-client-ca podName:22ebd67b-43b2-4f9d-955b-eb848d9d55d4 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:31.51046538 +0000 UTC m=+14.456149903 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/22ebd67b-43b2-4f9d-955b-eb848d9d55d4-client-ca") pod "route-controller-manager-6d8669fddc-zn4lq" (UID: "22ebd67b-43b2-4f9d-955b-eb848d9d55d4") : configmap "client-ca" not found
Mar 08 03:47:27.925805 master-0 kubenswrapper[7547]: I0308 03:47:27.925707 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4710894b-9971-464b-ae78-0d8520542328-serving-cert\") pod \"controller-manager-7765bbc5bf-72v5f\" (UID: \"4710894b-9971-464b-ae78-0d8520542328\") " pod="openshift-controller-manager/controller-manager-7765bbc5bf-72v5f"
Mar 08 03:47:27.926755 master-0 kubenswrapper[7547]: E0308 03:47:27.925937 7547 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found
Mar 08 03:47:27.926755 master-0 kubenswrapper[7547]: E0308 03:47:27.926017 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4710894b-9971-464b-ae78-0d8520542328-serving-cert podName:4710894b-9971-464b-ae78-0d8520542328 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:28.925997771 +0000 UTC m=+11.871682284 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/4710894b-9971-464b-ae78-0d8520542328-serving-cert") pod "controller-manager-7765bbc5bf-72v5f" (UID: "4710894b-9971-464b-ae78-0d8520542328") : secret "serving-cert" not found
Mar 08 03:47:27.926755 master-0 kubenswrapper[7547]: I0308 03:47:27.926117 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4710894b-9971-464b-ae78-0d8520542328-client-ca\") pod \"controller-manager-7765bbc5bf-72v5f\" (UID: \"4710894b-9971-464b-ae78-0d8520542328\") " pod="openshift-controller-manager/controller-manager-7765bbc5bf-72v5f"
Mar 08 03:47:27.926755 master-0 kubenswrapper[7547]: E0308 03:47:27.926304 7547 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found
Mar 08 03:47:27.926755 master-0 kubenswrapper[7547]: E0308 03:47:27.926449 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4710894b-9971-464b-ae78-0d8520542328-client-ca podName:4710894b-9971-464b-ae78-0d8520542328 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:28.926425041 +0000 UTC m=+11.872109604 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/4710894b-9971-464b-ae78-0d8520542328-client-ca") pod "controller-manager-7765bbc5bf-72v5f" (UID: "4710894b-9971-464b-ae78-0d8520542328") : configmap "client-ca" not found
Mar 08 03:47:28.939489 master-0 kubenswrapper[7547]: I0308 03:47:28.939406 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4710894b-9971-464b-ae78-0d8520542328-client-ca\") pod \"controller-manager-7765bbc5bf-72v5f\" (UID: \"4710894b-9971-464b-ae78-0d8520542328\") " pod="openshift-controller-manager/controller-manager-7765bbc5bf-72v5f"
Mar 08 03:47:28.940265 master-0 kubenswrapper[7547]: E0308 03:47:28.939548 7547 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found
Mar 08 03:47:28.940265 master-0 kubenswrapper[7547]: E0308 03:47:28.939617 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4710894b-9971-464b-ae78-0d8520542328-client-ca podName:4710894b-9971-464b-ae78-0d8520542328 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:30.939595985 +0000 UTC m=+13.885280508 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/4710894b-9971-464b-ae78-0d8520542328-client-ca") pod "controller-manager-7765bbc5bf-72v5f" (UID: "4710894b-9971-464b-ae78-0d8520542328") : configmap "client-ca" not found
Mar 08 03:47:28.940265 master-0 kubenswrapper[7547]: I0308 03:47:28.940043 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4710894b-9971-464b-ae78-0d8520542328-serving-cert\") pod \"controller-manager-7765bbc5bf-72v5f\" (UID: \"4710894b-9971-464b-ae78-0d8520542328\") " pod="openshift-controller-manager/controller-manager-7765bbc5bf-72v5f"
Mar 08 03:47:28.940435 master-0 kubenswrapper[7547]: E0308 03:47:28.940349 7547 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found
Mar 08 03:47:28.940480 master-0 kubenswrapper[7547]: E0308 03:47:28.940434 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4710894b-9971-464b-ae78-0d8520542328-serving-cert podName:4710894b-9971-464b-ae78-0d8520542328 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:30.940411894 +0000 UTC m=+13.886096437 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/4710894b-9971-464b-ae78-0d8520542328-serving-cert") pod "controller-manager-7765bbc5bf-72v5f" (UID: "4710894b-9971-464b-ae78-0d8520542328") : secret "serving-cert" not found
Mar 08 03:47:30.540515 master-0 kubenswrapper[7547]: I0308 03:47:30.540137 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-x9h9q" event={"ID":"7ff63c73-62a3-44b4-acd3-1b3df175794f","Type":"ContainerStarted","Data":"0d9517a4dbfbd842f9c484f0081150c86a4b5af486a5ddb8461a1d470f81112c"}
Mar 08 03:47:30.973846 master-0 kubenswrapper[7547]: I0308 03:47:30.972332 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4710894b-9971-464b-ae78-0d8520542328-serving-cert\") pod \"controller-manager-7765bbc5bf-72v5f\" (UID: \"4710894b-9971-464b-ae78-0d8520542328\") " pod="openshift-controller-manager/controller-manager-7765bbc5bf-72v5f"
Mar 08 03:47:30.973846 master-0 kubenswrapper[7547]: I0308 03:47:30.972475 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4710894b-9971-464b-ae78-0d8520542328-client-ca\") pod \"controller-manager-7765bbc5bf-72v5f\" (UID: \"4710894b-9971-464b-ae78-0d8520542328\") " pod="openshift-controller-manager/controller-manager-7765bbc5bf-72v5f"
Mar 08 03:47:30.973846 master-0 kubenswrapper[7547]: E0308 03:47:30.972646 7547 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found
Mar 08 03:47:30.973846 master-0 kubenswrapper[7547]: E0308 03:47:30.972706 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4710894b-9971-464b-ae78-0d8520542328-client-ca podName:4710894b-9971-464b-ae78-0d8520542328 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:34.97268665 +0000 UTC m=+17.918371193 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/4710894b-9971-464b-ae78-0d8520542328-client-ca") pod "controller-manager-7765bbc5bf-72v5f" (UID: "4710894b-9971-464b-ae78-0d8520542328") : configmap "client-ca" not found
Mar 08 03:47:30.973846 master-0 kubenswrapper[7547]: E0308 03:47:30.973136 7547 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found
Mar 08 03:47:30.973846 master-0 kubenswrapper[7547]: E0308 03:47:30.973169 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4710894b-9971-464b-ae78-0d8520542328-serving-cert podName:4710894b-9971-464b-ae78-0d8520542328 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:34.973159302 +0000 UTC m=+17.918843835 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/4710894b-9971-464b-ae78-0d8520542328-serving-cert") pod "controller-manager-7765bbc5bf-72v5f" (UID: "4710894b-9971-464b-ae78-0d8520542328") : secret "serving-cert" not found
Mar 08 03:47:31.546250 master-0 kubenswrapper[7547]: I0308 03:47:31.546188 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-kg795" event={"ID":"0d377285-0336-41b7-b48f-c44a7b563498","Type":"ContainerStarted","Data":"db7380d3fe7301944a7a66ec837b1d91caad2bb5d7122a498e8b10a38c9f552b"}
Mar 08 03:47:31.580041 master-0 kubenswrapper[7547]: E0308 03:47:31.579973 7547 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found
Mar 08 03:47:31.580314 master-0 kubenswrapper[7547]: E0308 03:47:31.580106 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/22ebd67b-43b2-4f9d-955b-eb848d9d55d4-client-ca podName:22ebd67b-43b2-4f9d-955b-eb848d9d55d4 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:39.580074982 +0000 UTC m=+22.525759545 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/22ebd67b-43b2-4f9d-955b-eb848d9d55d4-client-ca") pod "route-controller-manager-6d8669fddc-zn4lq" (UID: "22ebd67b-43b2-4f9d-955b-eb848d9d55d4") : configmap "client-ca" not found
Mar 08 03:47:31.580719 master-0 kubenswrapper[7547]: I0308 03:47:31.579798 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/22ebd67b-43b2-4f9d-955b-eb848d9d55d4-client-ca\") pod \"route-controller-manager-6d8669fddc-zn4lq\" (UID: \"22ebd67b-43b2-4f9d-955b-eb848d9d55d4\") " pod="openshift-route-controller-manager/route-controller-manager-6d8669fddc-zn4lq"
Mar 08 03:47:31.581262 master-0 kubenswrapper[7547]: I0308 03:47:31.581016 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22ebd67b-43b2-4f9d-955b-eb848d9d55d4-serving-cert\") pod \"route-controller-manager-6d8669fddc-zn4lq\" (UID: \"22ebd67b-43b2-4f9d-955b-eb848d9d55d4\") " pod="openshift-route-controller-manager/route-controller-manager-6d8669fddc-zn4lq"
Mar 08 03:47:31.581390 master-0 kubenswrapper[7547]: E0308 03:47:31.581184 7547 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found
Mar 08 03:47:31.581476 master-0 kubenswrapper[7547]: E0308 03:47:31.581425 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/22ebd67b-43b2-4f9d-955b-eb848d9d55d4-serving-cert podName:22ebd67b-43b2-4f9d-955b-eb848d9d55d4 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:39.581400624 +0000 UTC m=+22.527085177 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/22ebd67b-43b2-4f9d-955b-eb848d9d55d4-serving-cert") pod "route-controller-manager-6d8669fddc-zn4lq" (UID: "22ebd67b-43b2-4f9d-955b-eb848d9d55d4") : secret "serving-cert" not found
Mar 08 03:47:32.802786 master-0 kubenswrapper[7547]: I0308 03:47:32.802680 7547 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-7f5cb457d9-m95rp"]
Mar 08 03:47:32.804322 master-0 kubenswrapper[7547]: I0308 03:47:32.804279 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-7f5cb457d9-m95rp"
Mar 08 03:47:32.806414 master-0 kubenswrapper[7547]: I0308 03:47:32.806317 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 08 03:47:32.807012 master-0 kubenswrapper[7547]: I0308 03:47:32.806950 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 08 03:47:32.807012 master-0 kubenswrapper[7547]: I0308 03:47:32.807003 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 08 03:47:32.807236 master-0 kubenswrapper[7547]: I0308 03:47:32.807176 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-0"
Mar 08 03:47:32.808522 master-0 kubenswrapper[7547]: I0308 03:47:32.808482 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-0"
Mar 08 03:47:32.808708 master-0 kubenswrapper[7547]: I0308 03:47:32.808678 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 08 03:47:32.809780 master-0 kubenswrapper[7547]: I0308 03:47:32.809733 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 08 03:47:32.809780 master-0 kubenswrapper[7547]: I0308 03:47:32.809779 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 08 03:47:32.810037 master-0 kubenswrapper[7547]: I0308 03:47:32.809945 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 08 03:47:32.818554 master-0 kubenswrapper[7547]: I0308 03:47:32.818497 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-7f5cb457d9-m95rp"]
Mar 08 03:47:32.822629 master-0 kubenswrapper[7547]: I0308 03:47:32.822576 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 08 03:47:32.903940 master-0 kubenswrapper[7547]: I0308 03:47:32.903864 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3641f81d-1901-4b8b-a553-75fdb739919f-etcd-client\") pod \"apiserver-7f5cb457d9-m95rp\" (UID: \"3641f81d-1901-4b8b-a553-75fdb739919f\") " pod="openshift-apiserver/apiserver-7f5cb457d9-m95rp"
Mar 08 03:47:32.903940 master-0 kubenswrapper[7547]: I0308 03:47:32.903925 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3641f81d-1901-4b8b-a553-75fdb739919f-node-pullsecrets\") pod \"apiserver-7f5cb457d9-m95rp\" (UID: \"3641f81d-1901-4b8b-a553-75fdb739919f\") " pod="openshift-apiserver/apiserver-7f5cb457d9-m95rp"
Mar 08 03:47:32.904179 master-0 kubenswrapper[7547]: I0308 03:47:32.903982 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3641f81d-1901-4b8b-a553-75fdb739919f-trusted-ca-bundle\") pod \"apiserver-7f5cb457d9-m95rp\" (UID: \"3641f81d-1901-4b8b-a553-75fdb739919f\") " pod="openshift-apiserver/apiserver-7f5cb457d9-m95rp"
Mar 08 03:47:32.904179 master-0 kubenswrapper[7547]: I0308 03:47:32.904010 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3641f81d-1901-4b8b-a553-75fdb739919f-encryption-config\") pod \"apiserver-7f5cb457d9-m95rp\" (UID: \"3641f81d-1901-4b8b-a553-75fdb739919f\") " pod="openshift-apiserver/apiserver-7f5cb457d9-m95rp"
Mar 08 03:47:32.904179 master-0 kubenswrapper[7547]: I0308 03:47:32.904087 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3641f81d-1901-4b8b-a553-75fdb739919f-etcd-serving-ca\") pod \"apiserver-7f5cb457d9-m95rp\" (UID: \"3641f81d-1901-4b8b-a553-75fdb739919f\") " pod="openshift-apiserver/apiserver-7f5cb457d9-m95rp"
Mar 08 03:47:32.904179 master-0 kubenswrapper[7547]: I0308 03:47:32.904134 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3641f81d-1901-4b8b-a553-75fdb739919f-config\") pod \"apiserver-7f5cb457d9-m95rp\" (UID: \"3641f81d-1901-4b8b-a553-75fdb739919f\") " pod="openshift-apiserver/apiserver-7f5cb457d9-m95rp"
Mar 08 03:47:32.904179 master-0 kubenswrapper[7547]: I0308 03:47:32.904154 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/3641f81d-1901-4b8b-a553-75fdb739919f-image-import-ca\") pod \"apiserver-7f5cb457d9-m95rp\" (UID: \"3641f81d-1901-4b8b-a553-75fdb739919f\") " pod="openshift-apiserver/apiserver-7f5cb457d9-m95rp"
Mar 08 03:47:32.904353 master-0 kubenswrapper[7547]: I0308 03:47:32.904192 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/3641f81d-1901-4b8b-a553-75fdb739919f-audit\") pod \"apiserver-7f5cb457d9-m95rp\" (UID: \"3641f81d-1901-4b8b-a553-75fdb739919f\") " pod="openshift-apiserver/apiserver-7f5cb457d9-m95rp"
Mar 08 03:47:32.904353 master-0 kubenswrapper[7547]: I0308 03:47:32.904222 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5z8q\" (UniqueName: \"kubernetes.io/projected/3641f81d-1901-4b8b-a553-75fdb739919f-kube-api-access-j5z8q\") pod \"apiserver-7f5cb457d9-m95rp\" (UID: \"3641f81d-1901-4b8b-a553-75fdb739919f\") " pod="openshift-apiserver/apiserver-7f5cb457d9-m95rp"
Mar 08 03:47:32.904353 master-0 kubenswrapper[7547]: I0308 03:47:32.904270 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3641f81d-1901-4b8b-a553-75fdb739919f-serving-cert\") pod \"apiserver-7f5cb457d9-m95rp\" (UID: \"3641f81d-1901-4b8b-a553-75fdb739919f\") " pod="openshift-apiserver/apiserver-7f5cb457d9-m95rp"
Mar 08 03:47:32.904353 master-0 kubenswrapper[7547]: I0308 03:47:32.904292 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3641f81d-1901-4b8b-a553-75fdb739919f-audit-dir\") pod \"apiserver-7f5cb457d9-m95rp\" (UID: \"3641f81d-1901-4b8b-a553-75fdb739919f\") " pod="openshift-apiserver/apiserver-7f5cb457d9-m95rp"
Mar 08 03:47:33.005090 master-0 kubenswrapper[7547]: I0308 03:47:33.005008 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3641f81d-1901-4b8b-a553-75fdb739919f-config\") pod \"apiserver-7f5cb457d9-m95rp\" (UID: \"3641f81d-1901-4b8b-a553-75fdb739919f\") " pod="openshift-apiserver/apiserver-7f5cb457d9-m95rp"
Mar 08 03:47:33.005090 master-0 kubenswrapper[7547]: I0308 03:47:33.005084 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/3641f81d-1901-4b8b-a553-75fdb739919f-image-import-ca\") pod \"apiserver-7f5cb457d9-m95rp\" (UID: \"3641f81d-1901-4b8b-a553-75fdb739919f\") " pod="openshift-apiserver/apiserver-7f5cb457d9-m95rp"
Mar 08 03:47:33.005577 master-0 kubenswrapper[7547]: I0308 03:47:33.005425 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/3641f81d-1901-4b8b-a553-75fdb739919f-audit\") pod \"apiserver-7f5cb457d9-m95rp\" (UID: \"3641f81d-1901-4b8b-a553-75fdb739919f\") " pod="openshift-apiserver/apiserver-7f5cb457d9-m95rp"
Mar 08 03:47:33.005577 master-0 kubenswrapper[7547]: I0308 03:47:33.005492 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5z8q\" (UniqueName: \"kubernetes.io/projected/3641f81d-1901-4b8b-a553-75fdb739919f-kube-api-access-j5z8q\") pod \"apiserver-7f5cb457d9-m95rp\" (UID: \"3641f81d-1901-4b8b-a553-75fdb739919f\") " pod="openshift-apiserver/apiserver-7f5cb457d9-m95rp"
Mar 08 03:47:33.005746 master-0 kubenswrapper[7547]: E0308 03:47:33.005653 7547 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-0: configmap "audit-0" not found
Mar 08 03:47:33.006276 master-0 kubenswrapper[7547]: E0308 03:47:33.005817 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3641f81d-1901-4b8b-a553-75fdb739919f-audit podName:3641f81d-1901-4b8b-a553-75fdb739919f nodeName:}" failed. No retries permitted until 2026-03-08 03:47:33.505784457 +0000 UTC m=+16.451469000 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/3641f81d-1901-4b8b-a553-75fdb739919f-audit") pod "apiserver-7f5cb457d9-m95rp" (UID: "3641f81d-1901-4b8b-a553-75fdb739919f") : configmap "audit-0" not found
Mar 08 03:47:33.006388 master-0 kubenswrapper[7547]: I0308 03:47:33.006190 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/3641f81d-1901-4b8b-a553-75fdb739919f-image-import-ca\") pod \"apiserver-7f5cb457d9-m95rp\" (UID: \"3641f81d-1901-4b8b-a553-75fdb739919f\") " pod="openshift-apiserver/apiserver-7f5cb457d9-m95rp"
Mar 08 03:47:33.006388 master-0 kubenswrapper[7547]: I0308 03:47:33.006222 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3641f81d-1901-4b8b-a553-75fdb739919f-config\") pod \"apiserver-7f5cb457d9-m95rp\" (UID: \"3641f81d-1901-4b8b-a553-75fdb739919f\") " pod="openshift-apiserver/apiserver-7f5cb457d9-m95rp"
Mar 08 03:47:33.006388 master-0 kubenswrapper[7547]: I0308 03:47:33.006360 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3641f81d-1901-4b8b-a553-75fdb739919f-serving-cert\") pod \"apiserver-7f5cb457d9-m95rp\" (UID: \"3641f81d-1901-4b8b-a553-75fdb739919f\") " pod="openshift-apiserver/apiserver-7f5cb457d9-m95rp"
Mar 08 03:47:33.006596 master-0 kubenswrapper[7547]: I0308 03:47:33.006476 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3641f81d-1901-4b8b-a553-75fdb739919f-audit-dir\") pod \"apiserver-7f5cb457d9-m95rp\" (UID: \"3641f81d-1901-4b8b-a553-75fdb739919f\") " pod="openshift-apiserver/apiserver-7f5cb457d9-m95rp"
Mar 08 03:47:33.006659 master-0 kubenswrapper[7547]: E0308 03:47:33.006625 7547 secret.go:189] Couldn't get secret openshift-apiserver/serving-cert: secret "serving-cert" not found
Mar 08 03:47:33.006720 master-0 kubenswrapper[7547]: I0308 03:47:33.006657 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3641f81d-1901-4b8b-a553-75fdb739919f-audit-dir\") pod \"apiserver-7f5cb457d9-m95rp\" (UID: \"3641f81d-1901-4b8b-a553-75fdb739919f\") " pod="openshift-apiserver/apiserver-7f5cb457d9-m95rp"
Mar 08 03:47:33.006720 master-0 kubenswrapper[7547]: E0308 03:47:33.006682 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3641f81d-1901-4b8b-a553-75fdb739919f-serving-cert podName:3641f81d-1901-4b8b-a553-75fdb739919f nodeName:}" failed. No retries permitted until 2026-03-08 03:47:33.506661978 +0000 UTC m=+16.452346521 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/3641f81d-1901-4b8b-a553-75fdb739919f-serving-cert") pod "apiserver-7f5cb457d9-m95rp" (UID: "3641f81d-1901-4b8b-a553-75fdb739919f") : secret "serving-cert" not found
Mar 08 03:47:33.006901 master-0 kubenswrapper[7547]: I0308 03:47:33.006851 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3641f81d-1901-4b8b-a553-75fdb739919f-etcd-client\") pod \"apiserver-7f5cb457d9-m95rp\" (UID: \"3641f81d-1901-4b8b-a553-75fdb739919f\") " pod="openshift-apiserver/apiserver-7f5cb457d9-m95rp"
Mar 08 03:47:33.007050 master-0 kubenswrapper[7547]: I0308 03:47:33.007011 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3641f81d-1901-4b8b-a553-75fdb739919f-node-pullsecrets\") pod \"apiserver-7f5cb457d9-m95rp\" (UID: \"3641f81d-1901-4b8b-a553-75fdb739919f\") " pod="openshift-apiserver/apiserver-7f5cb457d9-m95rp"
Mar 08 03:47:33.007138 master-0 kubenswrapper[7547]: I0308 03:47:33.007106 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3641f81d-1901-4b8b-a553-75fdb739919f-node-pullsecrets\") pod \"apiserver-7f5cb457d9-m95rp\" (UID: \"3641f81d-1901-4b8b-a553-75fdb739919f\") " pod="openshift-apiserver/apiserver-7f5cb457d9-m95rp"
Mar 08 03:47:33.007204 master-0 kubenswrapper[7547]: I0308 03:47:33.007158 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3641f81d-1901-4b8b-a553-75fdb739919f-trusted-ca-bundle\") pod \"apiserver-7f5cb457d9-m95rp\" (UID: \"3641f81d-1901-4b8b-a553-75fdb739919f\") " pod="openshift-apiserver/apiserver-7f5cb457d9-m95rp"
Mar 08 03:47:33.007266 master-0 kubenswrapper[7547]: I0308 03:47:33.007243 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3641f81d-1901-4b8b-a553-75fdb739919f-encryption-config\") pod \"apiserver-7f5cb457d9-m95rp\" (UID: \"3641f81d-1901-4b8b-a553-75fdb739919f\") " pod="openshift-apiserver/apiserver-7f5cb457d9-m95rp"
Mar 08 03:47:33.008906 master-0 kubenswrapper[7547]: I0308 03:47:33.008346 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3641f81d-1901-4b8b-a553-75fdb739919f-etcd-serving-ca\") pod \"apiserver-7f5cb457d9-m95rp\" (UID: \"3641f81d-1901-4b8b-a553-75fdb739919f\") " pod="openshift-apiserver/apiserver-7f5cb457d9-m95rp"
Mar 08 03:47:33.008906 master-0 kubenswrapper[7547]: I0308 03:47:33.008450 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3641f81d-1901-4b8b-a553-75fdb739919f-trusted-ca-bundle\") pod \"apiserver-7f5cb457d9-m95rp\" (UID: \"3641f81d-1901-4b8b-a553-75fdb739919f\") " pod="openshift-apiserver/apiserver-7f5cb457d9-m95rp"
Mar 08 03:47:33.009322 master-0 kubenswrapper[7547]: I0308 03:47:33.009026 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3641f81d-1901-4b8b-a553-75fdb739919f-etcd-serving-ca\") pod \"apiserver-7f5cb457d9-m95rp\" (UID: \"3641f81d-1901-4b8b-a553-75fdb739919f\") " pod="openshift-apiserver/apiserver-7f5cb457d9-m95rp"
Mar 08 03:47:33.013718 master-0 kubenswrapper[7547]: I0308 03:47:33.013646 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3641f81d-1901-4b8b-a553-75fdb739919f-encryption-config\") pod \"apiserver-7f5cb457d9-m95rp\" (UID: \"3641f81d-1901-4b8b-a553-75fdb739919f\") " pod="openshift-apiserver/apiserver-7f5cb457d9-m95rp"
Mar 08 03:47:33.014000 master-0 kubenswrapper[7547]: I0308 03:47:33.013927 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3641f81d-1901-4b8b-a553-75fdb739919f-etcd-client\") pod \"apiserver-7f5cb457d9-m95rp\" (UID: \"3641f81d-1901-4b8b-a553-75fdb739919f\") " pod="openshift-apiserver/apiserver-7f5cb457d9-m95rp"
Mar 08 03:47:33.026221 master-0 kubenswrapper[7547]: I0308 03:47:33.026149 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5z8q\" (UniqueName: \"kubernetes.io/projected/3641f81d-1901-4b8b-a553-75fdb739919f-kube-api-access-j5z8q\") pod \"apiserver-7f5cb457d9-m95rp\" (UID: \"3641f81d-1901-4b8b-a553-75fdb739919f\") " pod="openshift-apiserver/apiserver-7f5cb457d9-m95rp"
Mar 08 03:47:33.516504 master-0 kubenswrapper[7547]: I0308 03:47:33.516367 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/3641f81d-1901-4b8b-a553-75fdb739919f-audit\") pod \"apiserver-7f5cb457d9-m95rp\" (UID: \"3641f81d-1901-4b8b-a553-75fdb739919f\") " pod="openshift-apiserver/apiserver-7f5cb457d9-m95rp"
Mar 08 03:47:33.516504 master-0 kubenswrapper[7547]: I0308 03:47:33.516477 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3641f81d-1901-4b8b-a553-75fdb739919f-serving-cert\") pod \"apiserver-7f5cb457d9-m95rp\" (UID: \"3641f81d-1901-4b8b-a553-75fdb739919f\") " pod="openshift-apiserver/apiserver-7f5cb457d9-m95rp"
Mar 08 03:47:33.516753 master-0 kubenswrapper[7547]: E0308 03:47:33.516526 7547 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-0: configmap "audit-0" not found
Mar 08 03:47:33.516753 master-0 kubenswrapper[7547]: E0308 03:47:33.516586 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3641f81d-1901-4b8b-a553-75fdb739919f-audit podName:3641f81d-1901-4b8b-a553-75fdb739919f nodeName:}" failed. No retries permitted until 2026-03-08 03:47:34.516569233 +0000 UTC m=+17.462253746 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/3641f81d-1901-4b8b-a553-75fdb739919f-audit") pod "apiserver-7f5cb457d9-m95rp" (UID: "3641f81d-1901-4b8b-a553-75fdb739919f") : configmap "audit-0" not found
Mar 08 03:47:33.516753 master-0 kubenswrapper[7547]: E0308 03:47:33.516622 7547 secret.go:189] Couldn't get secret openshift-apiserver/serving-cert: secret "serving-cert" not found
Mar 08 03:47:33.516753 master-0 kubenswrapper[7547]: E0308 03:47:33.516655 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3641f81d-1901-4b8b-a553-75fdb739919f-serving-cert podName:3641f81d-1901-4b8b-a553-75fdb739919f nodeName:}" failed. No retries permitted until 2026-03-08 03:47:34.516645675 +0000 UTC m=+17.462330188 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/3641f81d-1901-4b8b-a553-75fdb739919f-serving-cert") pod "apiserver-7f5cb457d9-m95rp" (UID: "3641f81d-1901-4b8b-a553-75fdb739919f") : secret "serving-cert" not found
Mar 08 03:47:34.123767 master-0 kubenswrapper[7547]: I0308 03:47:34.123242 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/0418ff42-7eac-4266-97b5-4df88623d066-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-clqwj\" (UID: \"0418ff42-7eac-4266-97b5-4df88623d066\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-clqwj"
Mar 08 03:47:34.123767 master-0 kubenswrapper[7547]: E0308 03:47:34.123574 7547 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Mar 08 03:47:34.126336 master-0 kubenswrapper[7547]: E0308 03:47:34.123806 7547 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/kube-rbac-proxy: configmap "kube-rbac-proxy" not found
Mar 08 03:47:34.126336 master-0 kubenswrapper[7547]: I0308 03:47:34.123705 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e78b283b-981e-48d7-a5f2-53f8401766ea-auth-proxy-config\") pod \"machine-config-operator-fdb5c78b5-2vjh2\" (UID: \"e78b283b-981e-48d7-a5f2-53f8401766ea\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-2vjh2"
Mar 08 03:47:34.126336 master-0 kubenswrapper[7547]: E0308 03:47:34.123870 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0418ff42-7eac-4266-97b5-4df88623d066-cluster-monitoring-operator-tls podName:0418ff42-7eac-4266-97b5-4df88623d066 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:50.123787411 +0000 UTC m=+33.069471954 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/0418ff42-7eac-4266-97b5-4df88623d066-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-clqwj" (UID: "0418ff42-7eac-4266-97b5-4df88623d066") : secret "cluster-monitoring-operator-tls" not found
Mar 08 03:47:34.126336 master-0 kubenswrapper[7547]: E0308 03:47:34.123988 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e78b283b-981e-48d7-a5f2-53f8401766ea-auth-proxy-config podName:e78b283b-981e-48d7-a5f2-53f8401766ea nodeName:}" failed. No retries permitted until 2026-03-08 03:47:50.123950394 +0000 UTC m=+33.069635017 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/e78b283b-981e-48d7-a5f2-53f8401766ea-auth-proxy-config") pod "machine-config-operator-fdb5c78b5-2vjh2" (UID: "e78b283b-981e-48d7-a5f2-53f8401766ea") : configmap "kube-rbac-proxy" not found
Mar 08 03:47:34.126336 master-0 kubenswrapper[7547]: I0308 03:47:34.124059 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/349d438d-d124-4d34-a172-4160e766c680-serving-cert\") pod \"cluster-version-operator-745944c6b7-gvmnp\" (UID: \"349d438d-d124-4d34-a172-4160e766c680\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-gvmnp"
Mar 08 03:47:34.126336 master-0 kubenswrapper[7547]: I0308 03:47:34.124137 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-572xh\" (UID: \"69eb8ba2-7bfb-4433-8951-08f89e7bcb5f\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-572xh"
Mar 08 03:47:34.126336 master-0 kubenswrapper[7547]: I0308 03:47:34.124199 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/6cde5024-edf7-4fa4-8964-cabe7899578b-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-c46zz\" (UID: \"6cde5024-edf7-4fa4-8964-cabe7899578b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-c46zz"
Mar 08 03:47:34.126336 master-0 kubenswrapper[7547]: E0308 03:47:34.124231 7547 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found
Mar 08 03:47:34.126336 master-0 kubenswrapper[7547]: E0308 03:47:34.124298 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/349d438d-d124-4d34-a172-4160e766c680-serving-cert podName:349d438d-d124-4d34-a172-4160e766c680 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:50.124279452 +0000 UTC m=+33.069964005 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/349d438d-d124-4d34-a172-4160e766c680-serving-cert") pod "cluster-version-operator-745944c6b7-gvmnp" (UID: "349d438d-d124-4d34-a172-4160e766c680") : secret "cluster-version-operator-serving-cert" not found Mar 08 03:47:34.126336 master-0 kubenswrapper[7547]: E0308 03:47:34.124367 7547 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 08 03:47:34.126336 master-0 kubenswrapper[7547]: E0308 03:47:34.124370 7547 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Mar 08 03:47:34.126336 master-0 kubenswrapper[7547]: E0308 03:47:34.124440 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6cde5024-edf7-4fa4-8964-cabe7899578b-package-server-manager-serving-cert podName:6cde5024-edf7-4fa4-8964-cabe7899578b nodeName:}" failed. No retries permitted until 2026-03-08 03:47:50.124417366 +0000 UTC m=+33.070101909 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/6cde5024-edf7-4fa4-8964-cabe7899578b-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-c46zz" (UID: "6cde5024-edf7-4fa4-8964-cabe7899578b") : secret "package-server-manager-serving-cert" not found Mar 08 03:47:34.126336 master-0 kubenswrapper[7547]: I0308 03:47:34.124505 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-jghp5\" (UID: \"d831cb23-7411-4072-8273-c167d9afca28\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-jghp5" Mar 08 03:47:34.126336 master-0 kubenswrapper[7547]: E0308 03:47:34.124537 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f-image-registry-operator-tls podName:69eb8ba2-7bfb-4433-8951-08f89e7bcb5f nodeName:}" failed. No retries permitted until 2026-03-08 03:47:50.124521699 +0000 UTC m=+33.070206242 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f-image-registry-operator-tls") pod "cluster-image-registry-operator-86d6d77c7c-572xh" (UID: "69eb8ba2-7bfb-4433-8951-08f89e7bcb5f") : secret "image-registry-operator-tls" not found Mar 08 03:47:34.126336 master-0 kubenswrapper[7547]: I0308 03:47:34.124571 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-qjv52\" (UID: \"232c421d-96f0-4894-b8d8-74f43d02bbd3\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-qjv52" Mar 08 03:47:34.126336 master-0 kubenswrapper[7547]: I0308 03:47:34.124623 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-jghp5\" (UID: \"d831cb23-7411-4072-8273-c167d9afca28\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-jghp5" Mar 08 03:47:34.126336 master-0 kubenswrapper[7547]: E0308 03:47:34.124664 7547 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-operator-tls: secret "cluster-baremetal-operator-tls" not found Mar 08 03:47:34.126336 master-0 kubenswrapper[7547]: I0308 03:47:34.124696 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e78b283b-981e-48d7-a5f2-53f8401766ea-proxy-tls\") pod \"machine-config-operator-fdb5c78b5-2vjh2\" (UID: \"e78b283b-981e-48d7-a5f2-53f8401766ea\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-2vjh2" Mar 08 03:47:34.126336 master-0 kubenswrapper[7547]: E0308 03:47:34.124718 7547 secret.go:189] Couldn't get secret 
openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 08 03:47:34.126336 master-0 kubenswrapper[7547]: E0308 03:47:34.124744 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cluster-baremetal-operator-tls podName:d831cb23-7411-4072-8273-c167d9afca28 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:50.124718993 +0000 UTC m=+33.070403546 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-baremetal-operator-tls" (UniqueName: "kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cluster-baremetal-operator-tls") pod "cluster-baremetal-operator-5cdb4c5598-jghp5" (UID: "d831cb23-7411-4072-8273-c167d9afca28") : secret "cluster-baremetal-operator-tls" not found Mar 08 03:47:34.126336 master-0 kubenswrapper[7547]: E0308 03:47:34.124792 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-node-tuning-operator-tls podName:232c421d-96f0-4894-b8d8-74f43d02bbd3 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:50.124771454 +0000 UTC m=+33.070455997 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-node-tuning-operator-tls") pod "cluster-node-tuning-operator-66c7586884-qjv52" (UID: "232c421d-96f0-4894-b8d8-74f43d02bbd3") : secret "node-tuning-operator-tls" not found Mar 08 03:47:34.126336 master-0 kubenswrapper[7547]: I0308 03:47:34.124787 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1482d789-884b-4337-b598-f0e2b71eb9f2-srv-cert\") pod \"catalog-operator-7d9c49f57b-qlfgq\" (UID: \"1482d789-884b-4337-b598-f0e2b71eb9f2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-qlfgq" Mar 08 03:47:34.126336 master-0 kubenswrapper[7547]: E0308 03:47:34.124870 7547 secret.go:189] Couldn't get secret openshift-machine-config-operator/mco-proxy-tls: secret "mco-proxy-tls" not found Mar 08 03:47:34.126336 master-0 kubenswrapper[7547]: E0308 03:47:34.124892 7547 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-webhook-server-cert: secret "cluster-baremetal-webhook-server-cert" not found Mar 08 03:47:34.126336 master-0 kubenswrapper[7547]: E0308 03:47:34.124929 7547 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 08 03:47:34.126336 master-0 kubenswrapper[7547]: I0308 03:47:34.124941 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c9de4939-680a-4e3e-89fd-e20ecb8b10f2-metrics-tls\") pod \"ingress-operator-677db989d6-t77qr\" (UID: \"c9de4939-680a-4e3e-89fd-e20ecb8b10f2\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-t77qr" Mar 08 03:47:34.126336 master-0 kubenswrapper[7547]: E0308 03:47:34.124941 7547 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/e78b283b-981e-48d7-a5f2-53f8401766ea-proxy-tls podName:e78b283b-981e-48d7-a5f2-53f8401766ea nodeName:}" failed. No retries permitted until 2026-03-08 03:47:50.124919768 +0000 UTC m=+33.070604411 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/e78b283b-981e-48d7-a5f2-53f8401766ea-proxy-tls") pod "machine-config-operator-fdb5c78b5-2vjh2" (UID: "e78b283b-981e-48d7-a5f2-53f8401766ea") : secret "mco-proxy-tls" not found Mar 08 03:47:34.126336 master-0 kubenswrapper[7547]: E0308 03:47:34.125025 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cert podName:d831cb23-7411-4072-8273-c167d9afca28 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:50.12500296 +0000 UTC m=+33.070687513 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cert") pod "cluster-baremetal-operator-5cdb4c5598-jghp5" (UID: "d831cb23-7411-4072-8273-c167d9afca28") : secret "cluster-baremetal-webhook-server-cert" not found Mar 08 03:47:34.126336 master-0 kubenswrapper[7547]: E0308 03:47:34.125046 7547 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 08 03:47:34.126336 master-0 kubenswrapper[7547]: E0308 03:47:34.125051 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1482d789-884b-4337-b598-f0e2b71eb9f2-srv-cert podName:1482d789-884b-4337-b598-f0e2b71eb9f2 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:50.125039121 +0000 UTC m=+33.070723664 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/1482d789-884b-4337-b598-f0e2b71eb9f2-srv-cert") pod "catalog-operator-7d9c49f57b-qlfgq" (UID: "1482d789-884b-4337-b598-f0e2b71eb9f2") : secret "catalog-operator-serving-cert" not found Mar 08 03:47:34.126336 master-0 kubenswrapper[7547]: I0308 03:47:34.125186 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/2dd4279d-a1a9-450a-a061-9008cd1ea8e0-srv-cert\") pod \"olm-operator-d64cfc9db-qddlp\" (UID: \"2dd4279d-a1a9-450a-a061-9008cd1ea8e0\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qddlp" Mar 08 03:47:34.126336 master-0 kubenswrapper[7547]: I0308 03:47:34.125239 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8efdcef9-9b31-4567-b7f9-cb59a894273d-metrics-tls\") pod \"dns-operator-589895fbb7-xttlz\" (UID: \"8efdcef9-9b31-4567-b7f9-cb59a894273d\") " pod="openshift-dns-operator/dns-operator-589895fbb7-xttlz" Mar 08 03:47:34.126336 master-0 kubenswrapper[7547]: E0308 03:47:34.125297 7547 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 08 03:47:34.126336 master-0 kubenswrapper[7547]: I0308 03:47:34.125317 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/54ad284e-d40e-4e69-b898-f5093952a0e6-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-9sw2d\" (UID: \"54ad284e-d40e-4e69-b898-f5093952a0e6\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-9sw2d" Mar 08 03:47:34.126336 master-0 kubenswrapper[7547]: E0308 03:47:34.125374 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c9de4939-680a-4e3e-89fd-e20ecb8b10f2-metrics-tls 
podName:c9de4939-680a-4e3e-89fd-e20ecb8b10f2 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:50.125311727 +0000 UTC m=+33.070996290 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c9de4939-680a-4e3e-89fd-e20ecb8b10f2-metrics-tls") pod "ingress-operator-677db989d6-t77qr" (UID: "c9de4939-680a-4e3e-89fd-e20ecb8b10f2") : secret "metrics-tls" not found Mar 08 03:47:34.126336 master-0 kubenswrapper[7547]: E0308 03:47:34.125409 7547 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 08 03:47:34.126336 master-0 kubenswrapper[7547]: E0308 03:47:34.125420 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2dd4279d-a1a9-450a-a061-9008cd1ea8e0-srv-cert podName:2dd4279d-a1a9-450a-a061-9008cd1ea8e0 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:50.125401619 +0000 UTC m=+33.071086292 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/2dd4279d-a1a9-450a-a061-9008cd1ea8e0-srv-cert") pod "olm-operator-d64cfc9db-qddlp" (UID: "2dd4279d-a1a9-450a-a061-9008cd1ea8e0") : secret "olm-operator-serving-cert" not found Mar 08 03:47:34.126336 master-0 kubenswrapper[7547]: E0308 03:47:34.125444 7547 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 08 03:47:34.126336 master-0 kubenswrapper[7547]: I0308 03:47:34.125466 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-qjv52\" (UID: \"232c421d-96f0-4894-b8d8-74f43d02bbd3\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-qjv52" Mar 08 03:47:34.126336 master-0 kubenswrapper[7547]: 
E0308 03:47:34.125508 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/54ad284e-d40e-4e69-b898-f5093952a0e6-marketplace-operator-metrics podName:54ad284e-d40e-4e69-b898-f5093952a0e6 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:50.125487391 +0000 UTC m=+33.071171944 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/54ad284e-d40e-4e69-b898-f5093952a0e6-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-9sw2d" (UID: "54ad284e-d40e-4e69-b898-f5093952a0e6") : secret "marketplace-operator-metrics" not found Mar 08 03:47:34.126336 master-0 kubenswrapper[7547]: E0308 03:47:34.125535 7547 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 08 03:47:34.126336 master-0 kubenswrapper[7547]: I0308 03:47:34.125544 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1eb851be-f157-48ea-9a39-1361b68d2639-webhook-certs\") pod \"multus-admission-controller-8d675b596-j8pv6\" (UID: \"1eb851be-f157-48ea-9a39-1361b68d2639\") " pod="openshift-multus/multus-admission-controller-8d675b596-j8pv6" Mar 08 03:47:34.126336 master-0 kubenswrapper[7547]: E0308 03:47:34.125597 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-apiservice-cert podName:232c421d-96f0-4894-b8d8-74f43d02bbd3 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:50.125579423 +0000 UTC m=+33.071263966 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-apiservice-cert") pod "cluster-node-tuning-operator-66c7586884-qjv52" (UID: "232c421d-96f0-4894-b8d8-74f43d02bbd3") : secret "performance-addon-operator-webhook-cert" not found Mar 08 03:47:34.126336 master-0 kubenswrapper[7547]: E0308 03:47:34.125612 7547 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 08 03:47:34.126336 master-0 kubenswrapper[7547]: E0308 03:47:34.125641 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8efdcef9-9b31-4567-b7f9-cb59a894273d-metrics-tls podName:8efdcef9-9b31-4567-b7f9-cb59a894273d nodeName:}" failed. No retries permitted until 2026-03-08 03:47:50.125621114 +0000 UTC m=+33.071305657 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8efdcef9-9b31-4567-b7f9-cb59a894273d-metrics-tls") pod "dns-operator-589895fbb7-xttlz" (UID: "8efdcef9-9b31-4567-b7f9-cb59a894273d") : secret "metrics-tls" not found Mar 08 03:47:34.126336 master-0 kubenswrapper[7547]: E0308 03:47:34.125686 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1eb851be-f157-48ea-9a39-1361b68d2639-webhook-certs podName:1eb851be-f157-48ea-9a39-1361b68d2639 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:50.125667835 +0000 UTC m=+33.071352488 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/1eb851be-f157-48ea-9a39-1361b68d2639-webhook-certs") pod "multus-admission-controller-8d675b596-j8pv6" (UID: "1eb851be-f157-48ea-9a39-1361b68d2639") : secret "multus-admission-controller-secret" not found Mar 08 03:47:34.328837 master-0 kubenswrapper[7547]: I0308 03:47:34.328746 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d5044ffd-0686-4679-9894-e696faf33699-metrics-certs\") pod \"network-metrics-daemon-schjl\" (UID: \"d5044ffd-0686-4679-9894-e696faf33699\") " pod="openshift-multus/network-metrics-daemon-schjl" Mar 08 03:47:34.329017 master-0 kubenswrapper[7547]: E0308 03:47:34.328989 7547 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Mar 08 03:47:34.329097 master-0 kubenswrapper[7547]: E0308 03:47:34.329079 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d5044ffd-0686-4679-9894-e696faf33699-metrics-certs podName:d5044ffd-0686-4679-9894-e696faf33699 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:50.329058428 +0000 UTC m=+33.274742951 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d5044ffd-0686-4679-9894-e696faf33699-metrics-certs") pod "network-metrics-daemon-schjl" (UID: "d5044ffd-0686-4679-9894-e696faf33699") : secret "metrics-daemon-secret" not found Mar 08 03:47:34.336630 master-0 kubenswrapper[7547]: I0308 03:47:34.336592 7547 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-84bfdbbb7f-gj69x"] Mar 08 03:47:34.337220 master-0 kubenswrapper[7547]: I0308 03:47:34.337186 7547 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-84bfdbbb7f-gj69x" Mar 08 03:47:34.341699 master-0 kubenswrapper[7547]: I0308 03:47:34.341039 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 08 03:47:34.341699 master-0 kubenswrapper[7547]: I0308 03:47:34.341103 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 08 03:47:34.341699 master-0 kubenswrapper[7547]: I0308 03:47:34.341175 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 08 03:47:34.341699 master-0 kubenswrapper[7547]: I0308 03:47:34.341323 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 08 03:47:34.429894 master-0 kubenswrapper[7547]: I0308 03:47:34.429735 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/49ec083d-dc74-457e-b10f-3bde04e9e75e-signing-key\") pod \"service-ca-84bfdbbb7f-gj69x\" (UID: \"49ec083d-dc74-457e-b10f-3bde04e9e75e\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-gj69x" Mar 08 03:47:34.430110 master-0 kubenswrapper[7547]: I0308 03:47:34.430071 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/49ec083d-dc74-457e-b10f-3bde04e9e75e-signing-cabundle\") pod \"service-ca-84bfdbbb7f-gj69x\" (UID: \"49ec083d-dc74-457e-b10f-3bde04e9e75e\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-gj69x" Mar 08 03:47:34.430267 master-0 kubenswrapper[7547]: I0308 03:47:34.430231 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcjr9\" (UniqueName: \"kubernetes.io/projected/49ec083d-dc74-457e-b10f-3bde04e9e75e-kube-api-access-zcjr9\") pod 
\"service-ca-84bfdbbb7f-gj69x\" (UID: \"49ec083d-dc74-457e-b10f-3bde04e9e75e\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-gj69x" Mar 08 03:47:34.448422 master-0 kubenswrapper[7547]: I0308 03:47:34.448343 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-84bfdbbb7f-gj69x"] Mar 08 03:47:34.531745 master-0 kubenswrapper[7547]: I0308 03:47:34.531679 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/3641f81d-1901-4b8b-a553-75fdb739919f-audit\") pod \"apiserver-7f5cb457d9-m95rp\" (UID: \"3641f81d-1901-4b8b-a553-75fdb739919f\") " pod="openshift-apiserver/apiserver-7f5cb457d9-m95rp" Mar 08 03:47:34.531946 master-0 kubenswrapper[7547]: I0308 03:47:34.531784 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/49ec083d-dc74-457e-b10f-3bde04e9e75e-signing-cabundle\") pod \"service-ca-84bfdbbb7f-gj69x\" (UID: \"49ec083d-dc74-457e-b10f-3bde04e9e75e\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-gj69x" Mar 08 03:47:34.532195 master-0 kubenswrapper[7547]: E0308 03:47:34.532143 7547 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-0: configmap "audit-0" not found Mar 08 03:47:34.532406 master-0 kubenswrapper[7547]: E0308 03:47:34.532372 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3641f81d-1901-4b8b-a553-75fdb739919f-audit podName:3641f81d-1901-4b8b-a553-75fdb739919f nodeName:}" failed. No retries permitted until 2026-03-08 03:47:36.532341128 +0000 UTC m=+19.478025681 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/3641f81d-1901-4b8b-a553-75fdb739919f-audit") pod "apiserver-7f5cb457d9-m95rp" (UID: "3641f81d-1901-4b8b-a553-75fdb739919f") : configmap "audit-0" not found Mar 08 03:47:34.532484 master-0 kubenswrapper[7547]: I0308 03:47:34.532445 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3641f81d-1901-4b8b-a553-75fdb739919f-serving-cert\") pod \"apiserver-7f5cb457d9-m95rp\" (UID: \"3641f81d-1901-4b8b-a553-75fdb739919f\") " pod="openshift-apiserver/apiserver-7f5cb457d9-m95rp" Mar 08 03:47:34.532766 master-0 kubenswrapper[7547]: E0308 03:47:34.532720 7547 secret.go:189] Couldn't get secret openshift-apiserver/serving-cert: secret "serving-cert" not found Mar 08 03:47:34.532866 master-0 kubenswrapper[7547]: I0308 03:47:34.532787 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcjr9\" (UniqueName: \"kubernetes.io/projected/49ec083d-dc74-457e-b10f-3bde04e9e75e-kube-api-access-zcjr9\") pod \"service-ca-84bfdbbb7f-gj69x\" (UID: \"49ec083d-dc74-457e-b10f-3bde04e9e75e\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-gj69x" Mar 08 03:47:34.532929 master-0 kubenswrapper[7547]: E0308 03:47:34.532864 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3641f81d-1901-4b8b-a553-75fdb739919f-serving-cert podName:3641f81d-1901-4b8b-a553-75fdb739919f nodeName:}" failed. No retries permitted until 2026-03-08 03:47:36.532802598 +0000 UTC m=+19.478487151 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/3641f81d-1901-4b8b-a553-75fdb739919f-serving-cert") pod "apiserver-7f5cb457d9-m95rp" (UID: "3641f81d-1901-4b8b-a553-75fdb739919f") : secret "serving-cert" not found Mar 08 03:47:34.533096 master-0 kubenswrapper[7547]: I0308 03:47:34.533057 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/49ec083d-dc74-457e-b10f-3bde04e9e75e-signing-key\") pod \"service-ca-84bfdbbb7f-gj69x\" (UID: \"49ec083d-dc74-457e-b10f-3bde04e9e75e\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-gj69x" Mar 08 03:47:34.533246 master-0 kubenswrapper[7547]: I0308 03:47:34.533202 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/49ec083d-dc74-457e-b10f-3bde04e9e75e-signing-cabundle\") pod \"service-ca-84bfdbbb7f-gj69x\" (UID: \"49ec083d-dc74-457e-b10f-3bde04e9e75e\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-gj69x" Mar 08 03:47:34.538812 master-0 kubenswrapper[7547]: I0308 03:47:34.538776 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/49ec083d-dc74-457e-b10f-3bde04e9e75e-signing-key\") pod \"service-ca-84bfdbbb7f-gj69x\" (UID: \"49ec083d-dc74-457e-b10f-3bde04e9e75e\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-gj69x" Mar 08 03:47:34.550423 master-0 kubenswrapper[7547]: I0308 03:47:34.550379 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcjr9\" (UniqueName: \"kubernetes.io/projected/49ec083d-dc74-457e-b10f-3bde04e9e75e-kube-api-access-zcjr9\") pod \"service-ca-84bfdbbb7f-gj69x\" (UID: \"49ec083d-dc74-457e-b10f-3bde04e9e75e\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-gj69x" Mar 08 03:47:34.559119 master-0 kubenswrapper[7547]: I0308 03:47:34.559044 7547 generic.go:334] "Generic (PLEG): container 
finished" podID="0918ba32-8e55-48d0-8e50-027c0dcb4bbd" containerID="c9cf4c65bcca879489d6d583c27aa9216a027640b05dfba4d536ce4f8192a79a" exitCode=0 Mar 08 03:47:34.559215 master-0 kubenswrapper[7547]: I0308 03:47:34.559131 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vfgfp" event={"ID":"0918ba32-8e55-48d0-8e50-027c0dcb4bbd","Type":"ContainerDied","Data":"c9cf4c65bcca879489d6d583c27aa9216a027640b05dfba4d536ce4f8192a79a"} Mar 08 03:47:34.671394 master-0 kubenswrapper[7547]: I0308 03:47:34.671298 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-84bfdbbb7f-gj69x" Mar 08 03:47:34.939628 master-0 kubenswrapper[7547]: I0308 03:47:34.939337 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-84bfdbbb7f-gj69x"] Mar 08 03:47:35.041249 master-0 kubenswrapper[7547]: I0308 03:47:35.041122 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4710894b-9971-464b-ae78-0d8520542328-serving-cert\") pod \"controller-manager-7765bbc5bf-72v5f\" (UID: \"4710894b-9971-464b-ae78-0d8520542328\") " pod="openshift-controller-manager/controller-manager-7765bbc5bf-72v5f" Mar 08 03:47:35.041435 master-0 kubenswrapper[7547]: E0308 03:47:35.041374 7547 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Mar 08 03:47:35.041490 master-0 kubenswrapper[7547]: E0308 03:47:35.041480 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4710894b-9971-464b-ae78-0d8520542328-serving-cert podName:4710894b-9971-464b-ae78-0d8520542328 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:43.041447634 +0000 UTC m=+25.987132207 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/4710894b-9971-464b-ae78-0d8520542328-serving-cert") pod "controller-manager-7765bbc5bf-72v5f" (UID: "4710894b-9971-464b-ae78-0d8520542328") : secret "serving-cert" not found Mar 08 03:47:35.041631 master-0 kubenswrapper[7547]: I0308 03:47:35.041574 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4710894b-9971-464b-ae78-0d8520542328-client-ca\") pod \"controller-manager-7765bbc5bf-72v5f\" (UID: \"4710894b-9971-464b-ae78-0d8520542328\") " pod="openshift-controller-manager/controller-manager-7765bbc5bf-72v5f" Mar 08 03:47:35.041732 master-0 kubenswrapper[7547]: E0308 03:47:35.041701 7547 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Mar 08 03:47:35.041806 master-0 kubenswrapper[7547]: E0308 03:47:35.041774 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4710894b-9971-464b-ae78-0d8520542328-client-ca podName:4710894b-9971-464b-ae78-0d8520542328 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:43.041755691 +0000 UTC m=+25.987440194 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/4710894b-9971-464b-ae78-0d8520542328-client-ca") pod "controller-manager-7765bbc5bf-72v5f" (UID: "4710894b-9971-464b-ae78-0d8520542328") : configmap "client-ca" not found Mar 08 03:47:35.564536 master-0 kubenswrapper[7547]: I0308 03:47:35.564485 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-84bfdbbb7f-gj69x" event={"ID":"49ec083d-dc74-457e-b10f-3bde04e9e75e","Type":"ContainerStarted","Data":"7c5a574b00cddc6ae858097a251b08c51507a0a632d0ee57830702564612db3b"} Mar 08 03:47:35.564536 master-0 kubenswrapper[7547]: I0308 03:47:35.564529 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-84bfdbbb7f-gj69x" event={"ID":"49ec083d-dc74-457e-b10f-3bde04e9e75e","Type":"ContainerStarted","Data":"946c23d9475281a0fd499a7ff53f910e9f2c222a2716d7d2886b8590024362cc"} Mar 08 03:47:35.582385 master-0 kubenswrapper[7547]: I0308 03:47:35.582238 7547 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-84bfdbbb7f-gj69x" podStartSLOduration=1.582224796 podStartE2EDuration="1.582224796s" podCreationTimestamp="2026-03-08 03:47:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:47:35.580055805 +0000 UTC m=+18.525740318" watchObservedRunningTime="2026-03-08 03:47:35.582224796 +0000 UTC m=+18.527909299" Mar 08 03:47:36.568372 master-0 kubenswrapper[7547]: I0308 03:47:36.568326 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/3641f81d-1901-4b8b-a553-75fdb739919f-audit\") pod \"apiserver-7f5cb457d9-m95rp\" (UID: \"3641f81d-1901-4b8b-a553-75fdb739919f\") " pod="openshift-apiserver/apiserver-7f5cb457d9-m95rp" Mar 08 03:47:36.568876 master-0 kubenswrapper[7547]: I0308 03:47:36.568400 7547 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3641f81d-1901-4b8b-a553-75fdb739919f-serving-cert\") pod \"apiserver-7f5cb457d9-m95rp\" (UID: \"3641f81d-1901-4b8b-a553-75fdb739919f\") " pod="openshift-apiserver/apiserver-7f5cb457d9-m95rp" Mar 08 03:47:36.568876 master-0 kubenswrapper[7547]: E0308 03:47:36.568787 7547 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-0: configmap "audit-0" not found Mar 08 03:47:36.568990 master-0 kubenswrapper[7547]: E0308 03:47:36.568875 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3641f81d-1901-4b8b-a553-75fdb739919f-audit podName:3641f81d-1901-4b8b-a553-75fdb739919f nodeName:}" failed. No retries permitted until 2026-03-08 03:47:40.568857545 +0000 UTC m=+23.514542068 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/3641f81d-1901-4b8b-a553-75fdb739919f-audit") pod "apiserver-7f5cb457d9-m95rp" (UID: "3641f81d-1901-4b8b-a553-75fdb739919f") : configmap "audit-0" not found Mar 08 03:47:36.577930 master-0 kubenswrapper[7547]: I0308 03:47:36.573020 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3641f81d-1901-4b8b-a553-75fdb739919f-serving-cert\") pod \"apiserver-7f5cb457d9-m95rp\" (UID: \"3641f81d-1901-4b8b-a553-75fdb739919f\") " pod="openshift-apiserver/apiserver-7f5cb457d9-m95rp" Mar 08 03:47:36.794403 master-0 kubenswrapper[7547]: I0308 03:47:36.794328 7547 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-apiserver/apiserver-7f5cb457d9-m95rp"] Mar 08 03:47:36.794719 master-0 kubenswrapper[7547]: E0308 03:47:36.794673 7547 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[audit], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-apiserver/apiserver-7f5cb457d9-m95rp" 
podUID="3641f81d-1901-4b8b-a553-75fdb739919f" Mar 08 03:47:37.573562 master-0 kubenswrapper[7547]: I0308 03:47:37.573502 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-7f5cb457d9-m95rp" Mar 08 03:47:37.575027 master-0 kubenswrapper[7547]: I0308 03:47:37.574115 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vfgfp" event={"ID":"0918ba32-8e55-48d0-8e50-027c0dcb4bbd","Type":"ContainerStarted","Data":"1523789c3f1ce2ac99a20cd9e6a22cf2201e9f542fc114c065ce9962d3d4debb"} Mar 08 03:47:37.575027 master-0 kubenswrapper[7547]: I0308 03:47:37.574942 7547 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vfgfp" Mar 08 03:47:37.581562 master-0 kubenswrapper[7547]: I0308 03:47:37.581490 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-7f5cb457d9-m95rp" Mar 08 03:47:37.690903 master-0 kubenswrapper[7547]: I0308 03:47:37.690850 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3641f81d-1901-4b8b-a553-75fdb739919f-trusted-ca-bundle\") pod \"3641f81d-1901-4b8b-a553-75fdb739919f\" (UID: \"3641f81d-1901-4b8b-a553-75fdb739919f\") " Mar 08 03:47:37.690903 master-0 kubenswrapper[7547]: I0308 03:47:37.690897 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3641f81d-1901-4b8b-a553-75fdb739919f-audit-dir\") pod \"3641f81d-1901-4b8b-a553-75fdb739919f\" (UID: \"3641f81d-1901-4b8b-a553-75fdb739919f\") " Mar 08 03:47:37.691097 master-0 kubenswrapper[7547]: I0308 03:47:37.690918 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/3641f81d-1901-4b8b-a553-75fdb739919f-config\") pod \"3641f81d-1901-4b8b-a553-75fdb739919f\" (UID: \"3641f81d-1901-4b8b-a553-75fdb739919f\") " Mar 08 03:47:37.691097 master-0 kubenswrapper[7547]: I0308 03:47:37.690958 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3641f81d-1901-4b8b-a553-75fdb739919f-node-pullsecrets\") pod \"3641f81d-1901-4b8b-a553-75fdb739919f\" (UID: \"3641f81d-1901-4b8b-a553-75fdb739919f\") " Mar 08 03:47:37.691097 master-0 kubenswrapper[7547]: I0308 03:47:37.690998 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3641f81d-1901-4b8b-a553-75fdb739919f-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "3641f81d-1901-4b8b-a553-75fdb739919f" (UID: "3641f81d-1901-4b8b-a553-75fdb739919f"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:47:37.691097 master-0 kubenswrapper[7547]: I0308 03:47:37.691032 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3641f81d-1901-4b8b-a553-75fdb739919f-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "3641f81d-1901-4b8b-a553-75fdb739919f" (UID: "3641f81d-1901-4b8b-a553-75fdb739919f"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:47:37.691313 master-0 kubenswrapper[7547]: I0308 03:47:37.691290 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3641f81d-1901-4b8b-a553-75fdb739919f-serving-cert\") pod \"3641f81d-1901-4b8b-a553-75fdb739919f\" (UID: \"3641f81d-1901-4b8b-a553-75fdb739919f\") " Mar 08 03:47:37.691416 master-0 kubenswrapper[7547]: I0308 03:47:37.691378 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3641f81d-1901-4b8b-a553-75fdb739919f-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "3641f81d-1901-4b8b-a553-75fdb739919f" (UID: "3641f81d-1901-4b8b-a553-75fdb739919f"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:47:37.691455 master-0 kubenswrapper[7547]: I0308 03:47:37.691393 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3641f81d-1901-4b8b-a553-75fdb739919f-config" (OuterVolumeSpecName: "config") pod "3641f81d-1901-4b8b-a553-75fdb739919f" (UID: "3641f81d-1901-4b8b-a553-75fdb739919f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:47:37.691517 master-0 kubenswrapper[7547]: I0308 03:47:37.691503 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/3641f81d-1901-4b8b-a553-75fdb739919f-image-import-ca\") pod \"3641f81d-1901-4b8b-a553-75fdb739919f\" (UID: \"3641f81d-1901-4b8b-a553-75fdb739919f\") " Mar 08 03:47:37.691626 master-0 kubenswrapper[7547]: I0308 03:47:37.691612 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5z8q\" (UniqueName: \"kubernetes.io/projected/3641f81d-1901-4b8b-a553-75fdb739919f-kube-api-access-j5z8q\") pod \"3641f81d-1901-4b8b-a553-75fdb739919f\" (UID: \"3641f81d-1901-4b8b-a553-75fdb739919f\") " Mar 08 03:47:37.692183 master-0 kubenswrapper[7547]: I0308 03:47:37.692169 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3641f81d-1901-4b8b-a553-75fdb739919f-etcd-client\") pod \"3641f81d-1901-4b8b-a553-75fdb739919f\" (UID: \"3641f81d-1901-4b8b-a553-75fdb739919f\") " Mar 08 03:47:37.692326 master-0 kubenswrapper[7547]: I0308 03:47:37.692315 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3641f81d-1901-4b8b-a553-75fdb739919f-etcd-serving-ca\") pod \"3641f81d-1901-4b8b-a553-75fdb739919f\" (UID: \"3641f81d-1901-4b8b-a553-75fdb739919f\") " Mar 08 03:47:37.692399 master-0 kubenswrapper[7547]: I0308 03:47:37.692388 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3641f81d-1901-4b8b-a553-75fdb739919f-encryption-config\") pod \"3641f81d-1901-4b8b-a553-75fdb739919f\" (UID: \"3641f81d-1901-4b8b-a553-75fdb739919f\") " Mar 08 03:47:37.692514 master-0 kubenswrapper[7547]: I0308 03:47:37.691726 7547 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3641f81d-1901-4b8b-a553-75fdb739919f-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "3641f81d-1901-4b8b-a553-75fdb739919f" (UID: "3641f81d-1901-4b8b-a553-75fdb739919f"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:47:37.692660 master-0 kubenswrapper[7547]: I0308 03:47:37.692633 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3641f81d-1901-4b8b-a553-75fdb739919f-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "3641f81d-1901-4b8b-a553-75fdb739919f" (UID: "3641f81d-1901-4b8b-a553-75fdb739919f"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:47:37.693000 master-0 kubenswrapper[7547]: I0308 03:47:37.692948 7547 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3641f81d-1901-4b8b-a553-75fdb739919f-audit-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 03:47:37.693077 master-0 kubenswrapper[7547]: I0308 03:47:37.693064 7547 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3641f81d-1901-4b8b-a553-75fdb739919f-config\") on node \"master-0\" DevicePath \"\"" Mar 08 03:47:37.693140 master-0 kubenswrapper[7547]: I0308 03:47:37.693131 7547 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3641f81d-1901-4b8b-a553-75fdb739919f-node-pullsecrets\") on node \"master-0\" DevicePath \"\"" Mar 08 03:47:37.693200 master-0 kubenswrapper[7547]: I0308 03:47:37.693190 7547 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/3641f81d-1901-4b8b-a553-75fdb739919f-image-import-ca\") on node \"master-0\" DevicePath \"\"" Mar 08 03:47:37.693253 master-0 
kubenswrapper[7547]: I0308 03:47:37.693244 7547 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3641f81d-1901-4b8b-a553-75fdb739919f-etcd-serving-ca\") on node \"master-0\" DevicePath \"\"" Mar 08 03:47:37.693305 master-0 kubenswrapper[7547]: I0308 03:47:37.693296 7547 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3641f81d-1901-4b8b-a553-75fdb739919f-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 03:47:37.694665 master-0 kubenswrapper[7547]: I0308 03:47:37.694645 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3641f81d-1901-4b8b-a553-75fdb739919f-kube-api-access-j5z8q" (OuterVolumeSpecName: "kube-api-access-j5z8q") pod "3641f81d-1901-4b8b-a553-75fdb739919f" (UID: "3641f81d-1901-4b8b-a553-75fdb739919f"). InnerVolumeSpecName "kube-api-access-j5z8q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:47:37.694818 master-0 kubenswrapper[7547]: I0308 03:47:37.694784 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3641f81d-1901-4b8b-a553-75fdb739919f-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "3641f81d-1901-4b8b-a553-75fdb739919f" (UID: "3641f81d-1901-4b8b-a553-75fdb739919f"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:47:37.694915 master-0 kubenswrapper[7547]: I0308 03:47:37.694882 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3641f81d-1901-4b8b-a553-75fdb739919f-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "3641f81d-1901-4b8b-a553-75fdb739919f" (UID: "3641f81d-1901-4b8b-a553-75fdb739919f"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:47:37.707951 master-0 kubenswrapper[7547]: I0308 03:47:37.707902 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3641f81d-1901-4b8b-a553-75fdb739919f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3641f81d-1901-4b8b-a553-75fdb739919f" (UID: "3641f81d-1901-4b8b-a553-75fdb739919f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:47:37.794915 master-0 kubenswrapper[7547]: I0308 03:47:37.794879 7547 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3641f81d-1901-4b8b-a553-75fdb739919f-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 08 03:47:37.795111 master-0 kubenswrapper[7547]: I0308 03:47:37.795095 7547 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5z8q\" (UniqueName: \"kubernetes.io/projected/3641f81d-1901-4b8b-a553-75fdb739919f-kube-api-access-j5z8q\") on node \"master-0\" DevicePath \"\"" Mar 08 03:47:37.795192 master-0 kubenswrapper[7547]: I0308 03:47:37.795182 7547 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3641f81d-1901-4b8b-a553-75fdb739919f-etcd-client\") on node \"master-0\" DevicePath \"\"" Mar 08 03:47:37.795258 master-0 kubenswrapper[7547]: I0308 03:47:37.795248 7547 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3641f81d-1901-4b8b-a553-75fdb739919f-encryption-config\") on node \"master-0\" DevicePath \"\"" Mar 08 03:47:38.580047 master-0 kubenswrapper[7547]: I0308 03:47:38.579592 7547 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-7f5cb457d9-m95rp" Mar 08 03:47:38.660738 master-0 kubenswrapper[7547]: I0308 03:47:38.660642 7547 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-apiserver/apiserver-7f5cb457d9-m95rp"] Mar 08 03:47:38.667960 master-0 kubenswrapper[7547]: I0308 03:47:38.667704 7547 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-6b779d99b8-7kmck"] Mar 08 03:47:38.671860 master-0 kubenswrapper[7547]: I0308 03:47:38.671779 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-6b779d99b8-7kmck" Mar 08 03:47:38.681013 master-0 kubenswrapper[7547]: I0308 03:47:38.674694 7547 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-apiserver/apiserver-7f5cb457d9-m95rp"] Mar 08 03:47:38.681013 master-0 kubenswrapper[7547]: I0308 03:47:38.679783 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 08 03:47:38.681013 master-0 kubenswrapper[7547]: I0308 03:47:38.680552 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 08 03:47:38.681013 master-0 kubenswrapper[7547]: I0308 03:47:38.680718 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 08 03:47:38.681013 master-0 kubenswrapper[7547]: I0308 03:47:38.680246 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 08 03:47:38.681499 master-0 kubenswrapper[7547]: I0308 03:47:38.681201 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 08 03:47:38.687732 master-0 kubenswrapper[7547]: I0308 03:47:38.686689 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 08 03:47:38.687732 master-0 kubenswrapper[7547]: I0308 
03:47:38.686753 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 08 03:47:38.687732 master-0 kubenswrapper[7547]: I0308 03:47:38.687087 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 08 03:47:38.697214 master-0 kubenswrapper[7547]: I0308 03:47:38.687806 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 08 03:47:38.697214 master-0 kubenswrapper[7547]: I0308 03:47:38.688527 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-6b779d99b8-7kmck"] Mar 08 03:47:38.704910 master-0 kubenswrapper[7547]: I0308 03:47:38.703711 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 08 03:47:38.706435 master-0 kubenswrapper[7547]: I0308 03:47:38.705920 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f-image-import-ca\") pod \"apiserver-6b779d99b8-7kmck\" (UID: \"c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f\") " pod="openshift-apiserver/apiserver-6b779d99b8-7kmck" Mar 08 03:47:38.706435 master-0 kubenswrapper[7547]: I0308 03:47:38.705984 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f-audit\") pod \"apiserver-6b779d99b8-7kmck\" (UID: \"c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f\") " pod="openshift-apiserver/apiserver-6b779d99b8-7kmck" Mar 08 03:47:38.706435 master-0 kubenswrapper[7547]: I0308 03:47:38.706009 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f-config\") pod 
\"apiserver-6b779d99b8-7kmck\" (UID: \"c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f\") " pod="openshift-apiserver/apiserver-6b779d99b8-7kmck" Mar 08 03:47:38.706435 master-0 kubenswrapper[7547]: I0308 03:47:38.706028 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f-serving-cert\") pod \"apiserver-6b779d99b8-7kmck\" (UID: \"c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f\") " pod="openshift-apiserver/apiserver-6b779d99b8-7kmck" Mar 08 03:47:38.706435 master-0 kubenswrapper[7547]: I0308 03:47:38.706096 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f-audit-dir\") pod \"apiserver-6b779d99b8-7kmck\" (UID: \"c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f\") " pod="openshift-apiserver/apiserver-6b779d99b8-7kmck" Mar 08 03:47:38.706435 master-0 kubenswrapper[7547]: I0308 03:47:38.706116 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f-encryption-config\") pod \"apiserver-6b779d99b8-7kmck\" (UID: \"c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f\") " pod="openshift-apiserver/apiserver-6b779d99b8-7kmck" Mar 08 03:47:38.706435 master-0 kubenswrapper[7547]: I0308 03:47:38.706149 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f-trusted-ca-bundle\") pod \"apiserver-6b779d99b8-7kmck\" (UID: \"c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f\") " pod="openshift-apiserver/apiserver-6b779d99b8-7kmck" Mar 08 03:47:38.706435 master-0 kubenswrapper[7547]: I0308 03:47:38.706176 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f-etcd-client\") pod \"apiserver-6b779d99b8-7kmck\" (UID: \"c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f\") " pod="openshift-apiserver/apiserver-6b779d99b8-7kmck" Mar 08 03:47:38.706435 master-0 kubenswrapper[7547]: I0308 03:47:38.706220 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f-node-pullsecrets\") pod \"apiserver-6b779d99b8-7kmck\" (UID: \"c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f\") " pod="openshift-apiserver/apiserver-6b779d99b8-7kmck" Mar 08 03:47:38.706435 master-0 kubenswrapper[7547]: I0308 03:47:38.706256 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f-etcd-serving-ca\") pod \"apiserver-6b779d99b8-7kmck\" (UID: \"c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f\") " pod="openshift-apiserver/apiserver-6b779d99b8-7kmck" Mar 08 03:47:38.706435 master-0 kubenswrapper[7547]: I0308 03:47:38.706286 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4v45k\" (UniqueName: \"kubernetes.io/projected/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f-kube-api-access-4v45k\") pod \"apiserver-6b779d99b8-7kmck\" (UID: \"c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f\") " pod="openshift-apiserver/apiserver-6b779d99b8-7kmck" Mar 08 03:47:38.706435 master-0 kubenswrapper[7547]: I0308 03:47:38.706323 7547 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/3641f81d-1901-4b8b-a553-75fdb739919f-audit\") on node \"master-0\" DevicePath \"\"" Mar 08 03:47:38.807678 master-0 kubenswrapper[7547]: I0308 03:47:38.807615 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4v45k\" 
(UniqueName: \"kubernetes.io/projected/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f-kube-api-access-4v45k\") pod \"apiserver-6b779d99b8-7kmck\" (UID: \"c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f\") " pod="openshift-apiserver/apiserver-6b779d99b8-7kmck" Mar 08 03:47:38.807894 master-0 kubenswrapper[7547]: I0308 03:47:38.807737 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f-image-import-ca\") pod \"apiserver-6b779d99b8-7kmck\" (UID: \"c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f\") " pod="openshift-apiserver/apiserver-6b779d99b8-7kmck" Mar 08 03:47:38.808155 master-0 kubenswrapper[7547]: I0308 03:47:38.808124 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f-audit\") pod \"apiserver-6b779d99b8-7kmck\" (UID: \"c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f\") " pod="openshift-apiserver/apiserver-6b779d99b8-7kmck" Mar 08 03:47:38.808265 master-0 kubenswrapper[7547]: I0308 03:47:38.808244 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f-config\") pod \"apiserver-6b779d99b8-7kmck\" (UID: \"c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f\") " pod="openshift-apiserver/apiserver-6b779d99b8-7kmck" Mar 08 03:47:38.808383 master-0 kubenswrapper[7547]: I0308 03:47:38.808363 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f-serving-cert\") pod \"apiserver-6b779d99b8-7kmck\" (UID: \"c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f\") " pod="openshift-apiserver/apiserver-6b779d99b8-7kmck" Mar 08 03:47:38.808534 master-0 kubenswrapper[7547]: I0308 03:47:38.808516 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" 
(UniqueName: \"kubernetes.io/host-path/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f-audit-dir\") pod \"apiserver-6b779d99b8-7kmck\" (UID: \"c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f\") " pod="openshift-apiserver/apiserver-6b779d99b8-7kmck" Mar 08 03:47:38.808621 master-0 kubenswrapper[7547]: I0308 03:47:38.808606 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f-encryption-config\") pod \"apiserver-6b779d99b8-7kmck\" (UID: \"c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f\") " pod="openshift-apiserver/apiserver-6b779d99b8-7kmck" Mar 08 03:47:38.808732 master-0 kubenswrapper[7547]: I0308 03:47:38.808717 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f-trusted-ca-bundle\") pod \"apiserver-6b779d99b8-7kmck\" (UID: \"c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f\") " pod="openshift-apiserver/apiserver-6b779d99b8-7kmck" Mar 08 03:47:38.808878 master-0 kubenswrapper[7547]: I0308 03:47:38.808860 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f-etcd-client\") pod \"apiserver-6b779d99b8-7kmck\" (UID: \"c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f\") " pod="openshift-apiserver/apiserver-6b779d99b8-7kmck" Mar 08 03:47:38.809020 master-0 kubenswrapper[7547]: I0308 03:47:38.809004 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f-node-pullsecrets\") pod \"apiserver-6b779d99b8-7kmck\" (UID: \"c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f\") " pod="openshift-apiserver/apiserver-6b779d99b8-7kmck" Mar 08 03:47:38.809167 master-0 kubenswrapper[7547]: I0308 03:47:38.809150 7547 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f-etcd-serving-ca\") pod \"apiserver-6b779d99b8-7kmck\" (UID: \"c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f\") " pod="openshift-apiserver/apiserver-6b779d99b8-7kmck" Mar 08 03:47:38.809679 master-0 kubenswrapper[7547]: I0308 03:47:38.809624 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f-audit\") pod \"apiserver-6b779d99b8-7kmck\" (UID: \"c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f\") " pod="openshift-apiserver/apiserver-6b779d99b8-7kmck" Mar 08 03:47:38.809679 master-0 kubenswrapper[7547]: I0308 03:47:38.809659 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f-config\") pod \"apiserver-6b779d99b8-7kmck\" (UID: \"c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f\") " pod="openshift-apiserver/apiserver-6b779d99b8-7kmck" Mar 08 03:47:38.809848 master-0 kubenswrapper[7547]: I0308 03:47:38.809788 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f-audit-dir\") pod \"apiserver-6b779d99b8-7kmck\" (UID: \"c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f\") " pod="openshift-apiserver/apiserver-6b779d99b8-7kmck" Mar 08 03:47:38.811178 master-0 kubenswrapper[7547]: I0308 03:47:38.810640 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f-etcd-serving-ca\") pod \"apiserver-6b779d99b8-7kmck\" (UID: \"c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f\") " pod="openshift-apiserver/apiserver-6b779d99b8-7kmck" Mar 08 03:47:38.811410 master-0 kubenswrapper[7547]: I0308 03:47:38.811330 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f-trusted-ca-bundle\") pod \"apiserver-6b779d99b8-7kmck\" (UID: \"c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f\") " pod="openshift-apiserver/apiserver-6b779d99b8-7kmck" Mar 08 03:47:38.811507 master-0 kubenswrapper[7547]: I0308 03:47:38.809037 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f-image-import-ca\") pod \"apiserver-6b779d99b8-7kmck\" (UID: \"c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f\") " pod="openshift-apiserver/apiserver-6b779d99b8-7kmck" Mar 08 03:47:38.811574 master-0 kubenswrapper[7547]: I0308 03:47:38.811554 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f-node-pullsecrets\") pod \"apiserver-6b779d99b8-7kmck\" (UID: \"c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f\") " pod="openshift-apiserver/apiserver-6b779d99b8-7kmck" Mar 08 03:47:38.814171 master-0 kubenswrapper[7547]: I0308 03:47:38.813950 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f-serving-cert\") pod \"apiserver-6b779d99b8-7kmck\" (UID: \"c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f\") " pod="openshift-apiserver/apiserver-6b779d99b8-7kmck" Mar 08 03:47:38.814744 master-0 kubenswrapper[7547]: I0308 03:47:38.814447 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f-encryption-config\") pod \"apiserver-6b779d99b8-7kmck\" (UID: \"c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f\") " pod="openshift-apiserver/apiserver-6b779d99b8-7kmck" Mar 08 03:47:38.814744 master-0 kubenswrapper[7547]: I0308 03:47:38.814702 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" 
(UniqueName: \"kubernetes.io/secret/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f-etcd-client\") pod \"apiserver-6b779d99b8-7kmck\" (UID: \"c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f\") " pod="openshift-apiserver/apiserver-6b779d99b8-7kmck" Mar 08 03:47:38.836497 master-0 kubenswrapper[7547]: I0308 03:47:38.836369 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4v45k\" (UniqueName: \"kubernetes.io/projected/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f-kube-api-access-4v45k\") pod \"apiserver-6b779d99b8-7kmck\" (UID: \"c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f\") " pod="openshift-apiserver/apiserver-6b779d99b8-7kmck" Mar 08 03:47:39.012684 master-0 kubenswrapper[7547]: I0308 03:47:39.012615 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-6b779d99b8-7kmck" Mar 08 03:47:39.237797 master-0 kubenswrapper[7547]: I0308 03:47:39.237752 7547 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3641f81d-1901-4b8b-a553-75fdb739919f" path="/var/lib/kubelet/pods/3641f81d-1901-4b8b-a553-75fdb739919f/volumes" Mar 08 03:47:39.348243 master-0 kubenswrapper[7547]: I0308 03:47:39.348173 7547 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:47:39.348449 master-0 kubenswrapper[7547]: I0308 03:47:39.348397 7547 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 08 03:47:39.373527 master-0 kubenswrapper[7547]: I0308 03:47:39.372793 7547 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:47:39.621005 master-0 kubenswrapper[7547]: I0308 03:47:39.620906 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/22ebd67b-43b2-4f9d-955b-eb848d9d55d4-client-ca\") pod \"route-controller-manager-6d8669fddc-zn4lq\" (UID: 
\"22ebd67b-43b2-4f9d-955b-eb848d9d55d4\") " pod="openshift-route-controller-manager/route-controller-manager-6d8669fddc-zn4lq" Mar 08 03:47:39.622071 master-0 kubenswrapper[7547]: E0308 03:47:39.621149 7547 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Mar 08 03:47:39.622071 master-0 kubenswrapper[7547]: E0308 03:47:39.621221 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/22ebd67b-43b2-4f9d-955b-eb848d9d55d4-client-ca podName:22ebd67b-43b2-4f9d-955b-eb848d9d55d4 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:55.621198298 +0000 UTC m=+38.566882841 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/22ebd67b-43b2-4f9d-955b-eb848d9d55d4-client-ca") pod "route-controller-manager-6d8669fddc-zn4lq" (UID: "22ebd67b-43b2-4f9d-955b-eb848d9d55d4") : configmap "client-ca" not found Mar 08 03:47:39.622071 master-0 kubenswrapper[7547]: I0308 03:47:39.621305 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22ebd67b-43b2-4f9d-955b-eb848d9d55d4-serving-cert\") pod \"route-controller-manager-6d8669fddc-zn4lq\" (UID: \"22ebd67b-43b2-4f9d-955b-eb848d9d55d4\") " pod="openshift-route-controller-manager/route-controller-manager-6d8669fddc-zn4lq" Mar 08 03:47:39.622071 master-0 kubenswrapper[7547]: E0308 03:47:39.621772 7547 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Mar 08 03:47:39.622071 master-0 kubenswrapper[7547]: E0308 03:47:39.621878 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/22ebd67b-43b2-4f9d-955b-eb848d9d55d4-serving-cert podName:22ebd67b-43b2-4f9d-955b-eb848d9d55d4 nodeName:}" failed. 
No retries permitted until 2026-03-08 03:47:55.621808542 +0000 UTC m=+38.567493085 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/22ebd67b-43b2-4f9d-955b-eb848d9d55d4-serving-cert") pod "route-controller-manager-6d8669fddc-zn4lq" (UID: "22ebd67b-43b2-4f9d-955b-eb848d9d55d4") : secret "serving-cert" not found Mar 08 03:47:39.778945 master-0 kubenswrapper[7547]: I0308 03:47:39.778813 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-6b779d99b8-7kmck"] Mar 08 03:47:40.589906 master-0 kubenswrapper[7547]: I0308 03:47:40.589776 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-6b779d99b8-7kmck" event={"ID":"c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f","Type":"ContainerStarted","Data":"7f9b714050ae3b09bb8faf754e7059ffdbc5afd8e225d14b9c0ab424f1262da7"} Mar 08 03:47:41.463172 master-0 kubenswrapper[7547]: I0308 03:47:41.463085 7547 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Mar 08 03:47:41.464248 master-0 kubenswrapper[7547]: I0308 03:47:41.463882 7547 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-1-master-0" Mar 08 03:47:41.465886 master-0 kubenswrapper[7547]: I0308 03:47:41.465837 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler"/"kube-root-ca.crt" Mar 08 03:47:41.550323 master-0 kubenswrapper[7547]: I0308 03:47:41.550254 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/70c6db8e-1612-4da7-84ad-0750258e310e-var-lock\") pod \"installer-1-master-0\" (UID: \"70c6db8e-1612-4da7-84ad-0750258e310e\") " pod="openshift-kube-scheduler/installer-1-master-0" Mar 08 03:47:41.550550 master-0 kubenswrapper[7547]: I0308 03:47:41.550410 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/70c6db8e-1612-4da7-84ad-0750258e310e-kube-api-access\") pod \"installer-1-master-0\" (UID: \"70c6db8e-1612-4da7-84ad-0750258e310e\") " pod="openshift-kube-scheduler/installer-1-master-0" Mar 08 03:47:41.550550 master-0 kubenswrapper[7547]: I0308 03:47:41.550476 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/70c6db8e-1612-4da7-84ad-0750258e310e-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"70c6db8e-1612-4da7-84ad-0750258e310e\") " pod="openshift-kube-scheduler/installer-1-master-0" Mar 08 03:47:41.651126 master-0 kubenswrapper[7547]: I0308 03:47:41.651045 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/70c6db8e-1612-4da7-84ad-0750258e310e-kube-api-access\") pod \"installer-1-master-0\" (UID: \"70c6db8e-1612-4da7-84ad-0750258e310e\") " pod="openshift-kube-scheduler/installer-1-master-0" Mar 08 03:47:41.651363 master-0 kubenswrapper[7547]: I0308 03:47:41.651164 7547 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/70c6db8e-1612-4da7-84ad-0750258e310e-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"70c6db8e-1612-4da7-84ad-0750258e310e\") " pod="openshift-kube-scheduler/installer-1-master-0" Mar 08 03:47:41.651448 master-0 kubenswrapper[7547]: I0308 03:47:41.651405 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/70c6db8e-1612-4da7-84ad-0750258e310e-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"70c6db8e-1612-4da7-84ad-0750258e310e\") " pod="openshift-kube-scheduler/installer-1-master-0" Mar 08 03:47:41.651652 master-0 kubenswrapper[7547]: I0308 03:47:41.651526 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/70c6db8e-1612-4da7-84ad-0750258e310e-var-lock\") pod \"installer-1-master-0\" (UID: \"70c6db8e-1612-4da7-84ad-0750258e310e\") " pod="openshift-kube-scheduler/installer-1-master-0" Mar 08 03:47:41.651762 master-0 kubenswrapper[7547]: I0308 03:47:41.651734 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/70c6db8e-1612-4da7-84ad-0750258e310e-var-lock\") pod \"installer-1-master-0\" (UID: \"70c6db8e-1612-4da7-84ad-0750258e310e\") " pod="openshift-kube-scheduler/installer-1-master-0" Mar 08 03:47:41.774031 master-0 kubenswrapper[7547]: I0308 03:47:41.770387 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Mar 08 03:47:41.816879 master-0 kubenswrapper[7547]: I0308 03:47:41.816354 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/70c6db8e-1612-4da7-84ad-0750258e310e-kube-api-access\") pod \"installer-1-master-0\" (UID: \"70c6db8e-1612-4da7-84ad-0750258e310e\") " 
pod="openshift-kube-scheduler/installer-1-master-0" Mar 08 03:47:42.080415 master-0 kubenswrapper[7547]: I0308 03:47:42.080298 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-1-master-0" Mar 08 03:47:42.657609 master-0 kubenswrapper[7547]: I0308 03:47:42.656863 7547 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-vfgfp container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.22:8443/healthz\": dial tcp 10.128.0.22:8443: connect: connection refused" start-of-body= Mar 08 03:47:42.657609 master-0 kubenswrapper[7547]: I0308 03:47:42.657201 7547 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vfgfp" podUID="0918ba32-8e55-48d0-8e50-027c0dcb4bbd" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.22:8443/healthz\": dial tcp 10.128.0.22:8443: connect: connection refused" Mar 08 03:47:42.820885 master-0 kubenswrapper[7547]: I0308 03:47:42.813830 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Mar 08 03:47:43.088315 master-0 kubenswrapper[7547]: I0308 03:47:43.088170 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4710894b-9971-464b-ae78-0d8520542328-client-ca\") pod \"controller-manager-7765bbc5bf-72v5f\" (UID: \"4710894b-9971-464b-ae78-0d8520542328\") " pod="openshift-controller-manager/controller-manager-7765bbc5bf-72v5f" Mar 08 03:47:43.088598 master-0 kubenswrapper[7547]: I0308 03:47:43.088468 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4710894b-9971-464b-ae78-0d8520542328-serving-cert\") pod \"controller-manager-7765bbc5bf-72v5f\" (UID: 
\"4710894b-9971-464b-ae78-0d8520542328\") " pod="openshift-controller-manager/controller-manager-7765bbc5bf-72v5f" Mar 08 03:47:43.089179 master-0 kubenswrapper[7547]: E0308 03:47:43.089118 7547 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Mar 08 03:47:43.089305 master-0 kubenswrapper[7547]: E0308 03:47:43.089206 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4710894b-9971-464b-ae78-0d8520542328-client-ca podName:4710894b-9971-464b-ae78-0d8520542328 nodeName:}" failed. No retries permitted until 2026-03-08 03:47:59.089185935 +0000 UTC m=+42.034870448 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/4710894b-9971-464b-ae78-0d8520542328-client-ca") pod "controller-manager-7765bbc5bf-72v5f" (UID: "4710894b-9971-464b-ae78-0d8520542328") : configmap "client-ca" not found Mar 08 03:47:43.096672 master-0 kubenswrapper[7547]: I0308 03:47:43.096610 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4710894b-9971-464b-ae78-0d8520542328-serving-cert\") pod \"controller-manager-7765bbc5bf-72v5f\" (UID: \"4710894b-9971-464b-ae78-0d8520542328\") " pod="openshift-controller-manager/controller-manager-7765bbc5bf-72v5f" Mar 08 03:47:43.405703 master-0 kubenswrapper[7547]: I0308 03:47:43.405137 7547 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-vfgfp container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.22:8443/healthz\": dial tcp 10.128.0.22:8443: connect: connection refused" start-of-body= Mar 08 03:47:43.405703 master-0 kubenswrapper[7547]: I0308 03:47:43.405638 7547 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vfgfp" 
podUID="0918ba32-8e55-48d0-8e50-027c0dcb4bbd" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.22:8443/healthz\": dial tcp 10.128.0.22:8443: connect: connection refused" Mar 08 03:47:43.635432 master-0 kubenswrapper[7547]: I0308 03:47:43.635329 7547 generic.go:334] "Generic (PLEG): container finished" podID="0918ba32-8e55-48d0-8e50-027c0dcb4bbd" containerID="1523789c3f1ce2ac99a20cd9e6a22cf2201e9f542fc114c065ce9962d3d4debb" exitCode=0 Mar 08 03:47:43.635432 master-0 kubenswrapper[7547]: I0308 03:47:43.635407 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vfgfp" event={"ID":"0918ba32-8e55-48d0-8e50-027c0dcb4bbd","Type":"ContainerDied","Data":"1523789c3f1ce2ac99a20cd9e6a22cf2201e9f542fc114c065ce9962d3d4debb"} Mar 08 03:47:43.635925 master-0 kubenswrapper[7547]: I0308 03:47:43.635894 7547 scope.go:117] "RemoveContainer" containerID="1523789c3f1ce2ac99a20cd9e6a22cf2201e9f542fc114c065ce9962d3d4debb" Mar 08 03:47:43.641736 master-0 kubenswrapper[7547]: I0308 03:47:43.641646 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-1-master-0" event={"ID":"70c6db8e-1612-4da7-84ad-0750258e310e","Type":"ContainerStarted","Data":"cef32873cbe3d6b97478c715fa10b9fc2b7f6472cbc949089347f2bceb34bb4a"} Mar 08 03:47:43.641736 master-0 kubenswrapper[7547]: I0308 03:47:43.641707 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-1-master-0" event={"ID":"70c6db8e-1612-4da7-84ad-0750258e310e","Type":"ContainerStarted","Data":"7d660df37cf1eac55715dfa6930009397ac7a30a1e9f2938482021ef45c4c112"} Mar 08 03:47:44.228171 master-0 kubenswrapper[7547]: I0308 03:47:44.228115 7547 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-75qmb"] Mar 08 03:47:44.229035 master-0 kubenswrapper[7547]: I0308 03:47:44.228710 7547 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-75qmb" Mar 08 03:47:44.232696 master-0 kubenswrapper[7547]: I0308 03:47:44.231126 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"openshift-service-ca.crt" Mar 08 03:47:44.232696 master-0 kubenswrapper[7547]: I0308 03:47:44.231315 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"kube-root-ca.crt" Mar 08 03:47:44.238676 master-0 kubenswrapper[7547]: I0308 03:47:44.238470 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"operator-controller-trusted-ca-bundle" Mar 08 03:47:44.244406 master-0 kubenswrapper[7547]: I0308 03:47:44.244317 7547 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-8h6fj"] Mar 08 03:47:44.248999 master-0 kubenswrapper[7547]: I0308 03:47:44.245271 7547 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-8h6fj" Mar 08 03:47:44.250011 master-0 kubenswrapper[7547]: I0308 03:47:44.249759 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-75qmb"] Mar 08 03:47:44.250342 master-0 kubenswrapper[7547]: I0308 03:47:44.250213 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-catalogd"/"catalogserver-cert" Mar 08 03:47:44.253897 master-0 kubenswrapper[7547]: I0308 03:47:44.250610 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"openshift-service-ca.crt" Mar 08 03:47:44.253897 master-0 kubenswrapper[7547]: I0308 03:47:44.251201 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"kube-root-ca.crt" Mar 08 03:47:44.274299 master-0 kubenswrapper[7547]: I0308 03:47:44.271443 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"catalogd-trusted-ca-bundle" Mar 08 03:47:44.275025 master-0 kubenswrapper[7547]: I0308 03:47:44.274969 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-8h6fj"] Mar 08 03:47:44.306370 master-0 kubenswrapper[7547]: I0308 03:47:44.306317 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/1b69fbf6-1ca5-413e-bffd-965730bcec1b-cache\") pod \"catalogd-controller-manager-7f8b8b6f4c-8h6fj\" (UID: \"1b69fbf6-1ca5-413e-bffd-965730bcec1b\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-8h6fj" Mar 08 03:47:44.306370 master-0 kubenswrapper[7547]: I0308 03:47:44.306370 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/1b69fbf6-1ca5-413e-bffd-965730bcec1b-ca-certs\") pod 
\"catalogd-controller-manager-7f8b8b6f4c-8h6fj\" (UID: \"1b69fbf6-1ca5-413e-bffd-965730bcec1b\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-8h6fj" Mar 08 03:47:44.306622 master-0 kubenswrapper[7547]: I0308 03:47:44.306419 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/2f59fe81-deee-4ced-ae9d-f17752c82c4b-cache\") pod \"operator-controller-controller-manager-6598bfb6c4-75qmb\" (UID: \"2f59fe81-deee-4ced-ae9d-f17752c82c4b\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-75qmb" Mar 08 03:47:44.306622 master-0 kubenswrapper[7547]: I0308 03:47:44.306531 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/1b69fbf6-1ca5-413e-bffd-965730bcec1b-etc-docker\") pod \"catalogd-controller-manager-7f8b8b6f4c-8h6fj\" (UID: \"1b69fbf6-1ca5-413e-bffd-965730bcec1b\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-8h6fj" Mar 08 03:47:44.306705 master-0 kubenswrapper[7547]: I0308 03:47:44.306658 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/2f59fe81-deee-4ced-ae9d-f17752c82c4b-ca-certs\") pod \"operator-controller-controller-manager-6598bfb6c4-75qmb\" (UID: \"2f59fe81-deee-4ced-ae9d-f17752c82c4b\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-75qmb" Mar 08 03:47:44.306705 master-0 kubenswrapper[7547]: I0308 03:47:44.306681 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bm7bw\" (UniqueName: \"kubernetes.io/projected/2f59fe81-deee-4ced-ae9d-f17752c82c4b-kube-api-access-bm7bw\") pod \"operator-controller-controller-manager-6598bfb6c4-75qmb\" (UID: \"2f59fe81-deee-4ced-ae9d-f17752c82c4b\") " 
pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-75qmb" Mar 08 03:47:44.306794 master-0 kubenswrapper[7547]: I0308 03:47:44.306734 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/2f59fe81-deee-4ced-ae9d-f17752c82c4b-etc-docker\") pod \"operator-controller-controller-manager-6598bfb6c4-75qmb\" (UID: \"2f59fe81-deee-4ced-ae9d-f17752c82c4b\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-75qmb" Mar 08 03:47:44.306794 master-0 kubenswrapper[7547]: I0308 03:47:44.306763 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/1b69fbf6-1ca5-413e-bffd-965730bcec1b-etc-containers\") pod \"catalogd-controller-manager-7f8b8b6f4c-8h6fj\" (UID: \"1b69fbf6-1ca5-413e-bffd-965730bcec1b\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-8h6fj" Mar 08 03:47:44.306909 master-0 kubenswrapper[7547]: I0308 03:47:44.306837 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfz27\" (UniqueName: \"kubernetes.io/projected/1b69fbf6-1ca5-413e-bffd-965730bcec1b-kube-api-access-nfz27\") pod \"catalogd-controller-manager-7f8b8b6f4c-8h6fj\" (UID: \"1b69fbf6-1ca5-413e-bffd-965730bcec1b\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-8h6fj" Mar 08 03:47:44.306955 master-0 kubenswrapper[7547]: I0308 03:47:44.306933 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/1b69fbf6-1ca5-413e-bffd-965730bcec1b-catalogserver-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-8h6fj\" (UID: \"1b69fbf6-1ca5-413e-bffd-965730bcec1b\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-8h6fj" Mar 08 
03:47:44.307041 master-0 kubenswrapper[7547]: I0308 03:47:44.306996 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/2f59fe81-deee-4ced-ae9d-f17752c82c4b-etc-containers\") pod \"operator-controller-controller-manager-6598bfb6c4-75qmb\" (UID: \"2f59fe81-deee-4ced-ae9d-f17752c82c4b\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-75qmb" Mar 08 03:47:44.360889 master-0 kubenswrapper[7547]: I0308 03:47:44.358305 7547 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7765bbc5bf-72v5f"] Mar 08 03:47:44.360889 master-0 kubenswrapper[7547]: E0308 03:47:44.358554 7547 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-controller-manager/controller-manager-7765bbc5bf-72v5f" podUID="4710894b-9971-464b-ae78-0d8520542328" Mar 08 03:47:44.408535 master-0 kubenswrapper[7547]: I0308 03:47:44.407396 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/2f59fe81-deee-4ced-ae9d-f17752c82c4b-etc-docker\") pod \"operator-controller-controller-manager-6598bfb6c4-75qmb\" (UID: \"2f59fe81-deee-4ced-ae9d-f17752c82c4b\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-75qmb" Mar 08 03:47:44.408535 master-0 kubenswrapper[7547]: I0308 03:47:44.407443 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/1b69fbf6-1ca5-413e-bffd-965730bcec1b-etc-containers\") pod \"catalogd-controller-manager-7f8b8b6f4c-8h6fj\" (UID: \"1b69fbf6-1ca5-413e-bffd-965730bcec1b\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-8h6fj" Mar 08 03:47:44.408535 master-0 
kubenswrapper[7547]: I0308 03:47:44.407477 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfz27\" (UniqueName: \"kubernetes.io/projected/1b69fbf6-1ca5-413e-bffd-965730bcec1b-kube-api-access-nfz27\") pod \"catalogd-controller-manager-7f8b8b6f4c-8h6fj\" (UID: \"1b69fbf6-1ca5-413e-bffd-965730bcec1b\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-8h6fj" Mar 08 03:47:44.408535 master-0 kubenswrapper[7547]: I0308 03:47:44.407510 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/1b69fbf6-1ca5-413e-bffd-965730bcec1b-catalogserver-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-8h6fj\" (UID: \"1b69fbf6-1ca5-413e-bffd-965730bcec1b\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-8h6fj" Mar 08 03:47:44.408535 master-0 kubenswrapper[7547]: I0308 03:47:44.407526 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/2f59fe81-deee-4ced-ae9d-f17752c82c4b-etc-containers\") pod \"operator-controller-controller-manager-6598bfb6c4-75qmb\" (UID: \"2f59fe81-deee-4ced-ae9d-f17752c82c4b\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-75qmb" Mar 08 03:47:44.408535 master-0 kubenswrapper[7547]: I0308 03:47:44.407591 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/1b69fbf6-1ca5-413e-bffd-965730bcec1b-cache\") pod \"catalogd-controller-manager-7f8b8b6f4c-8h6fj\" (UID: \"1b69fbf6-1ca5-413e-bffd-965730bcec1b\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-8h6fj" Mar 08 03:47:44.408535 master-0 kubenswrapper[7547]: I0308 03:47:44.407611 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: 
\"kubernetes.io/projected/1b69fbf6-1ca5-413e-bffd-965730bcec1b-ca-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-8h6fj\" (UID: \"1b69fbf6-1ca5-413e-bffd-965730bcec1b\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-8h6fj" Mar 08 03:47:44.408535 master-0 kubenswrapper[7547]: I0308 03:47:44.407637 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/2f59fe81-deee-4ced-ae9d-f17752c82c4b-cache\") pod \"operator-controller-controller-manager-6598bfb6c4-75qmb\" (UID: \"2f59fe81-deee-4ced-ae9d-f17752c82c4b\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-75qmb" Mar 08 03:47:44.408535 master-0 kubenswrapper[7547]: I0308 03:47:44.407673 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/1b69fbf6-1ca5-413e-bffd-965730bcec1b-etc-docker\") pod \"catalogd-controller-manager-7f8b8b6f4c-8h6fj\" (UID: \"1b69fbf6-1ca5-413e-bffd-965730bcec1b\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-8h6fj" Mar 08 03:47:44.408535 master-0 kubenswrapper[7547]: I0308 03:47:44.407704 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/2f59fe81-deee-4ced-ae9d-f17752c82c4b-ca-certs\") pod \"operator-controller-controller-manager-6598bfb6c4-75qmb\" (UID: \"2f59fe81-deee-4ced-ae9d-f17752c82c4b\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-75qmb" Mar 08 03:47:44.408535 master-0 kubenswrapper[7547]: I0308 03:47:44.407722 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bm7bw\" (UniqueName: \"kubernetes.io/projected/2f59fe81-deee-4ced-ae9d-f17752c82c4b-kube-api-access-bm7bw\") pod \"operator-controller-controller-manager-6598bfb6c4-75qmb\" (UID: \"2f59fe81-deee-4ced-ae9d-f17752c82c4b\") " 
pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-75qmb" Mar 08 03:47:44.408535 master-0 kubenswrapper[7547]: I0308 03:47:44.408126 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/2f59fe81-deee-4ced-ae9d-f17752c82c4b-etc-docker\") pod \"operator-controller-controller-manager-6598bfb6c4-75qmb\" (UID: \"2f59fe81-deee-4ced-ae9d-f17752c82c4b\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-75qmb" Mar 08 03:47:44.408535 master-0 kubenswrapper[7547]: I0308 03:47:44.408163 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/1b69fbf6-1ca5-413e-bffd-965730bcec1b-etc-containers\") pod \"catalogd-controller-manager-7f8b8b6f4c-8h6fj\" (UID: \"1b69fbf6-1ca5-413e-bffd-965730bcec1b\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-8h6fj" Mar 08 03:47:44.408535 master-0 kubenswrapper[7547]: E0308 03:47:44.408317 7547 secret.go:189] Couldn't get secret openshift-catalogd/catalogserver-cert: secret "catalogserver-cert" not found Mar 08 03:47:44.408535 master-0 kubenswrapper[7547]: E0308 03:47:44.408353 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b69fbf6-1ca5-413e-bffd-965730bcec1b-catalogserver-certs podName:1b69fbf6-1ca5-413e-bffd-965730bcec1b nodeName:}" failed. No retries permitted until 2026-03-08 03:47:44.908340088 +0000 UTC m=+27.854024601 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "catalogserver-certs" (UniqueName: "kubernetes.io/secret/1b69fbf6-1ca5-413e-bffd-965730bcec1b-catalogserver-certs") pod "catalogd-controller-manager-7f8b8b6f4c-8h6fj" (UID: "1b69fbf6-1ca5-413e-bffd-965730bcec1b") : secret "catalogserver-cert" not found
Mar 08 03:47:44.408535 master-0 kubenswrapper[7547]: I0308 03:47:44.408485 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/2f59fe81-deee-4ced-ae9d-f17752c82c4b-etc-containers\") pod \"operator-controller-controller-manager-6598bfb6c4-75qmb\" (UID: \"2f59fe81-deee-4ced-ae9d-f17752c82c4b\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-75qmb"
Mar 08 03:47:44.409753 master-0 kubenswrapper[7547]: I0308 03:47:44.409245 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/1b69fbf6-1ca5-413e-bffd-965730bcec1b-etc-docker\") pod \"catalogd-controller-manager-7f8b8b6f4c-8h6fj\" (UID: \"1b69fbf6-1ca5-413e-bffd-965730bcec1b\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-8h6fj"
Mar 08 03:47:44.409753 master-0 kubenswrapper[7547]: I0308 03:47:44.409267 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/2f59fe81-deee-4ced-ae9d-f17752c82c4b-cache\") pod \"operator-controller-controller-manager-6598bfb6c4-75qmb\" (UID: \"2f59fe81-deee-4ced-ae9d-f17752c82c4b\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-75qmb"
Mar 08 03:47:44.409753 master-0 kubenswrapper[7547]: I0308 03:47:44.409541 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/1b69fbf6-1ca5-413e-bffd-965730bcec1b-cache\") pod \"catalogd-controller-manager-7f8b8b6f4c-8h6fj\" (UID: \"1b69fbf6-1ca5-413e-bffd-965730bcec1b\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-8h6fj"
Mar 08 03:47:44.420552 master-0 kubenswrapper[7547]: I0308 03:47:44.420508 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/2f59fe81-deee-4ced-ae9d-f17752c82c4b-ca-certs\") pod \"operator-controller-controller-manager-6598bfb6c4-75qmb\" (UID: \"2f59fe81-deee-4ced-ae9d-f17752c82c4b\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-75qmb"
Mar 08 03:47:44.423881 master-0 kubenswrapper[7547]: I0308 03:47:44.423384 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/1b69fbf6-1ca5-413e-bffd-965730bcec1b-ca-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-8h6fj\" (UID: \"1b69fbf6-1ca5-413e-bffd-965730bcec1b\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-8h6fj"
Mar 08 03:47:44.447841 master-0 kubenswrapper[7547]: I0308 03:47:44.447800 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bm7bw\" (UniqueName: \"kubernetes.io/projected/2f59fe81-deee-4ced-ae9d-f17752c82c4b-kube-api-access-bm7bw\") pod \"operator-controller-controller-manager-6598bfb6c4-75qmb\" (UID: \"2f59fe81-deee-4ced-ae9d-f17752c82c4b\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-75qmb"
Mar 08 03:47:44.451952 master-0 kubenswrapper[7547]: I0308 03:47:44.448434 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfz27\" (UniqueName: \"kubernetes.io/projected/1b69fbf6-1ca5-413e-bffd-965730bcec1b-kube-api-access-nfz27\") pod \"catalogd-controller-manager-7f8b8b6f4c-8h6fj\" (UID: \"1b69fbf6-1ca5-413e-bffd-965730bcec1b\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-8h6fj"
Mar 08 03:47:44.570970 master-0 kubenswrapper[7547]: I0308 03:47:44.570868 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-75qmb"
Mar 08 03:47:44.644781 master-0 kubenswrapper[7547]: I0308 03:47:44.644738 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7765bbc5bf-72v5f"
Mar 08 03:47:44.655450 master-0 kubenswrapper[7547]: I0308 03:47:44.655406 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7765bbc5bf-72v5f"
Mar 08 03:47:44.674147 master-0 kubenswrapper[7547]: I0308 03:47:44.674073 7547 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-1-master-0" podStartSLOduration=3.674055129 podStartE2EDuration="3.674055129s" podCreationTimestamp="2026-03-08 03:47:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:47:44.669191275 +0000 UTC m=+27.614875788" watchObservedRunningTime="2026-03-08 03:47:44.674055129 +0000 UTC m=+27.619739632"
Mar 08 03:47:44.710107 master-0 kubenswrapper[7547]: I0308 03:47:44.710065 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lgxsp\" (UniqueName: \"kubernetes.io/projected/4710894b-9971-464b-ae78-0d8520542328-kube-api-access-lgxsp\") pod \"4710894b-9971-464b-ae78-0d8520542328\" (UID: \"4710894b-9971-464b-ae78-0d8520542328\") "
Mar 08 03:47:44.710212 master-0 kubenswrapper[7547]: I0308 03:47:44.710111 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4710894b-9971-464b-ae78-0d8520542328-serving-cert\") pod \"4710894b-9971-464b-ae78-0d8520542328\" (UID: \"4710894b-9971-464b-ae78-0d8520542328\") "
Mar 08 03:47:44.710212 master-0 kubenswrapper[7547]: I0308 03:47:44.710131 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4710894b-9971-464b-ae78-0d8520542328-config\") pod \"4710894b-9971-464b-ae78-0d8520542328\" (UID: \"4710894b-9971-464b-ae78-0d8520542328\") "
Mar 08 03:47:44.710212 master-0 kubenswrapper[7547]: I0308 03:47:44.710155 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4710894b-9971-464b-ae78-0d8520542328-proxy-ca-bundles\") pod \"4710894b-9971-464b-ae78-0d8520542328\" (UID: \"4710894b-9971-464b-ae78-0d8520542328\") "
Mar 08 03:47:44.710918 master-0 kubenswrapper[7547]: I0308 03:47:44.710860 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4710894b-9971-464b-ae78-0d8520542328-config" (OuterVolumeSpecName: "config") pod "4710894b-9971-464b-ae78-0d8520542328" (UID: "4710894b-9971-464b-ae78-0d8520542328"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 03:47:44.711328 master-0 kubenswrapper[7547]: I0308 03:47:44.711292 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4710894b-9971-464b-ae78-0d8520542328-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "4710894b-9971-464b-ae78-0d8520542328" (UID: "4710894b-9971-464b-ae78-0d8520542328"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 03:47:44.713858 master-0 kubenswrapper[7547]: I0308 03:47:44.713799 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4710894b-9971-464b-ae78-0d8520542328-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4710894b-9971-464b-ae78-0d8520542328" (UID: "4710894b-9971-464b-ae78-0d8520542328"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 03:47:44.728427 master-0 kubenswrapper[7547]: I0308 03:47:44.728390 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4710894b-9971-464b-ae78-0d8520542328-kube-api-access-lgxsp" (OuterVolumeSpecName: "kube-api-access-lgxsp") pod "4710894b-9971-464b-ae78-0d8520542328" (UID: "4710894b-9971-464b-ae78-0d8520542328"). InnerVolumeSpecName "kube-api-access-lgxsp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 03:47:44.811440 master-0 kubenswrapper[7547]: I0308 03:47:44.811357 7547 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lgxsp\" (UniqueName: \"kubernetes.io/projected/4710894b-9971-464b-ae78-0d8520542328-kube-api-access-lgxsp\") on node \"master-0\" DevicePath \"\""
Mar 08 03:47:44.811440 master-0 kubenswrapper[7547]: I0308 03:47:44.811390 7547 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4710894b-9971-464b-ae78-0d8520542328-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 08 03:47:44.811440 master-0 kubenswrapper[7547]: I0308 03:47:44.811399 7547 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4710894b-9971-464b-ae78-0d8520542328-config\") on node \"master-0\" DevicePath \"\""
Mar 08 03:47:44.811440 master-0 kubenswrapper[7547]: I0308 03:47:44.811411 7547 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4710894b-9971-464b-ae78-0d8520542328-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\""
Mar 08 03:47:44.912030 master-0 kubenswrapper[7547]: I0308 03:47:44.911800 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/1b69fbf6-1ca5-413e-bffd-965730bcec1b-catalogserver-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-8h6fj\" (UID: \"1b69fbf6-1ca5-413e-bffd-965730bcec1b\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-8h6fj"
Mar 08 03:47:44.912030 master-0 kubenswrapper[7547]: E0308 03:47:44.911969 7547 secret.go:189] Couldn't get secret openshift-catalogd/catalogserver-cert: secret "catalogserver-cert" not found
Mar 08 03:47:44.912340 master-0 kubenswrapper[7547]: E0308 03:47:44.912233 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b69fbf6-1ca5-413e-bffd-965730bcec1b-catalogserver-certs podName:1b69fbf6-1ca5-413e-bffd-965730bcec1b nodeName:}" failed. No retries permitted until 2026-03-08 03:47:45.91220817 +0000 UTC m=+28.857892703 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "catalogserver-certs" (UniqueName: "kubernetes.io/secret/1b69fbf6-1ca5-413e-bffd-965730bcec1b-catalogserver-certs") pod "catalogd-controller-manager-7f8b8b6f4c-8h6fj" (UID: "1b69fbf6-1ca5-413e-bffd-965730bcec1b") : secret "catalogserver-cert" not found
Mar 08 03:47:45.653305 master-0 kubenswrapper[7547]: I0308 03:47:45.652993 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7765bbc5bf-72v5f"
Mar 08 03:47:45.653985 master-0 kubenswrapper[7547]: I0308 03:47:45.653167 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vfgfp" event={"ID":"0918ba32-8e55-48d0-8e50-027c0dcb4bbd","Type":"ContainerStarted","Data":"b15fbce1f89ff17f10430a35787c2a500a7f16d50b837842a7504088df7e7273"}
Mar 08 03:47:45.654866 master-0 kubenswrapper[7547]: I0308 03:47:45.654017 7547 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vfgfp"
Mar 08 03:47:45.678733 master-0 kubenswrapper[7547]: I0308 03:47:45.678473 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-75qmb"]
Mar 08 03:47:45.683901 master-0 kubenswrapper[7547]: W0308 03:47:45.683787 7547 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f59fe81_deee_4ced_ae9d_f17752c82c4b.slice/crio-40effd583e5274c7bd2d572652f2adc6b94ce532187f776e02a024e30b5ff7e5 WatchSource:0}: Error finding container 40effd583e5274c7bd2d572652f2adc6b94ce532187f776e02a024e30b5ff7e5: Status 404 returned error can't find the container with id 40effd583e5274c7bd2d572652f2adc6b94ce532187f776e02a024e30b5ff7e5
Mar 08 03:47:45.697403 master-0 kubenswrapper[7547]: I0308 03:47:45.697361 7547 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-79784cf87c-jfjm4"]
Mar 08 03:47:45.697875 master-0 kubenswrapper[7547]: I0308 03:47:45.697828 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-79784cf87c-jfjm4"
Mar 08 03:47:45.700753 master-0 kubenswrapper[7547]: I0308 03:47:45.700714 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 08 03:47:45.703932 master-0 kubenswrapper[7547]: I0308 03:47:45.703906 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 08 03:47:45.704197 master-0 kubenswrapper[7547]: I0308 03:47:45.704181 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 08 03:47:45.707254 master-0 kubenswrapper[7547]: I0308 03:47:45.707223 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 08 03:47:45.707473 master-0 kubenswrapper[7547]: I0308 03:47:45.707442 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 08 03:47:45.735422 master-0 kubenswrapper[7547]: I0308 03:47:45.725242 7547 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7765bbc5bf-72v5f"]
Mar 08 03:47:45.735422 master-0 kubenswrapper[7547]: I0308 03:47:45.728346 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 08 03:47:45.735422 master-0 kubenswrapper[7547]: I0308 03:47:45.729287 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-79784cf87c-jfjm4"]
Mar 08 03:47:45.736874 master-0 kubenswrapper[7547]: I0308 03:47:45.736309 7547 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7765bbc5bf-72v5f"]
Mar 08 03:47:45.828395 master-0 kubenswrapper[7547]: I0308 03:47:45.828036 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d4a450c-dac8-445d-abfc-75e4920b676b-serving-cert\") pod \"controller-manager-79784cf87c-jfjm4\" (UID: \"0d4a450c-dac8-445d-abfc-75e4920b676b\") " pod="openshift-controller-manager/controller-manager-79784cf87c-jfjm4"
Mar 08 03:47:45.828479 master-0 kubenswrapper[7547]: I0308 03:47:45.828465 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gnhj\" (UniqueName: \"kubernetes.io/projected/0d4a450c-dac8-445d-abfc-75e4920b676b-kube-api-access-5gnhj\") pod \"controller-manager-79784cf87c-jfjm4\" (UID: \"0d4a450c-dac8-445d-abfc-75e4920b676b\") " pod="openshift-controller-manager/controller-manager-79784cf87c-jfjm4"
Mar 08 03:47:45.829241 master-0 kubenswrapper[7547]: I0308 03:47:45.829201 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0d4a450c-dac8-445d-abfc-75e4920b676b-client-ca\") pod \"controller-manager-79784cf87c-jfjm4\" (UID: \"0d4a450c-dac8-445d-abfc-75e4920b676b\") " pod="openshift-controller-manager/controller-manager-79784cf87c-jfjm4"
Mar 08 03:47:45.829468 master-0 kubenswrapper[7547]: I0308 03:47:45.829330 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d4a450c-dac8-445d-abfc-75e4920b676b-config\") pod \"controller-manager-79784cf87c-jfjm4\" (UID: \"0d4a450c-dac8-445d-abfc-75e4920b676b\") " pod="openshift-controller-manager/controller-manager-79784cf87c-jfjm4"
Mar 08 03:47:45.829468 master-0 kubenswrapper[7547]: I0308 03:47:45.829367 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0d4a450c-dac8-445d-abfc-75e4920b676b-proxy-ca-bundles\") pod \"controller-manager-79784cf87c-jfjm4\" (UID: \"0d4a450c-dac8-445d-abfc-75e4920b676b\") " pod="openshift-controller-manager/controller-manager-79784cf87c-jfjm4"
Mar 08 03:47:45.930537 master-0 kubenswrapper[7547]: I0308 03:47:45.930378 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d4a450c-dac8-445d-abfc-75e4920b676b-config\") pod \"controller-manager-79784cf87c-jfjm4\" (UID: \"0d4a450c-dac8-445d-abfc-75e4920b676b\") " pod="openshift-controller-manager/controller-manager-79784cf87c-jfjm4"
Mar 08 03:47:45.930859 master-0 kubenswrapper[7547]: E0308 03:47:45.930762 7547 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found
Mar 08 03:47:45.931036 master-0 kubenswrapper[7547]: I0308 03:47:45.930798 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0d4a450c-dac8-445d-abfc-75e4920b676b-client-ca\") pod \"controller-manager-79784cf87c-jfjm4\" (UID: \"0d4a450c-dac8-445d-abfc-75e4920b676b\") " pod="openshift-controller-manager/controller-manager-79784cf87c-jfjm4"
Mar 08 03:47:45.931230 master-0 kubenswrapper[7547]: E0308 03:47:45.930920 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0d4a450c-dac8-445d-abfc-75e4920b676b-client-ca podName:0d4a450c-dac8-445d-abfc-75e4920b676b nodeName:}" failed. No retries permitted until 2026-03-08 03:47:46.430888384 +0000 UTC m=+29.376572937 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/0d4a450c-dac8-445d-abfc-75e4920b676b-client-ca") pod "controller-manager-79784cf87c-jfjm4" (UID: "0d4a450c-dac8-445d-abfc-75e4920b676b") : configmap "client-ca" not found
Mar 08 03:47:45.931563 master-0 kubenswrapper[7547]: I0308 03:47:45.931425 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0d4a450c-dac8-445d-abfc-75e4920b676b-proxy-ca-bundles\") pod \"controller-manager-79784cf87c-jfjm4\" (UID: \"0d4a450c-dac8-445d-abfc-75e4920b676b\") " pod="openshift-controller-manager/controller-manager-79784cf87c-jfjm4"
Mar 08 03:47:45.931956 master-0 kubenswrapper[7547]: I0308 03:47:45.931866 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d4a450c-dac8-445d-abfc-75e4920b676b-serving-cert\") pod \"controller-manager-79784cf87c-jfjm4\" (UID: \"0d4a450c-dac8-445d-abfc-75e4920b676b\") " pod="openshift-controller-manager/controller-manager-79784cf87c-jfjm4"
Mar 08 03:47:45.932215 master-0 kubenswrapper[7547]: I0308 03:47:45.932143 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/1b69fbf6-1ca5-413e-bffd-965730bcec1b-catalogserver-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-8h6fj\" (UID: \"1b69fbf6-1ca5-413e-bffd-965730bcec1b\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-8h6fj"
Mar 08 03:47:45.932656 master-0 kubenswrapper[7547]: E0308 03:47:45.932478 7547 secret.go:189] Couldn't get secret openshift-catalogd/catalogserver-cert: secret "catalogserver-cert" not found
Mar 08 03:47:45.932656 master-0 kubenswrapper[7547]: E0308 03:47:45.932584 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b69fbf6-1ca5-413e-bffd-965730bcec1b-catalogserver-certs podName:1b69fbf6-1ca5-413e-bffd-965730bcec1b nodeName:}" failed. No retries permitted until 2026-03-08 03:47:47.932563734 +0000 UTC m=+30.878248257 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "catalogserver-certs" (UniqueName: "kubernetes.io/secret/1b69fbf6-1ca5-413e-bffd-965730bcec1b-catalogserver-certs") pod "catalogd-controller-manager-7f8b8b6f4c-8h6fj" (UID: "1b69fbf6-1ca5-413e-bffd-965730bcec1b") : secret "catalogserver-cert" not found
Mar 08 03:47:45.933343 master-0 kubenswrapper[7547]: I0308 03:47:45.932889 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gnhj\" (UniqueName: \"kubernetes.io/projected/0d4a450c-dac8-445d-abfc-75e4920b676b-kube-api-access-5gnhj\") pod \"controller-manager-79784cf87c-jfjm4\" (UID: \"0d4a450c-dac8-445d-abfc-75e4920b676b\") " pod="openshift-controller-manager/controller-manager-79784cf87c-jfjm4"
Mar 08 03:47:45.933343 master-0 kubenswrapper[7547]: I0308 03:47:45.933036 7547 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4710894b-9971-464b-ae78-0d8520542328-client-ca\") on node \"master-0\" DevicePath \"\""
Mar 08 03:47:45.933494 master-0 kubenswrapper[7547]: I0308 03:47:45.933464 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0d4a450c-dac8-445d-abfc-75e4920b676b-proxy-ca-bundles\") pod \"controller-manager-79784cf87c-jfjm4\" (UID: \"0d4a450c-dac8-445d-abfc-75e4920b676b\") " pod="openshift-controller-manager/controller-manager-79784cf87c-jfjm4"
Mar 08 03:47:45.934736 master-0 kubenswrapper[7547]: I0308 03:47:45.934693 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d4a450c-dac8-445d-abfc-75e4920b676b-config\") pod \"controller-manager-79784cf87c-jfjm4\" (UID: \"0d4a450c-dac8-445d-abfc-75e4920b676b\") " pod="openshift-controller-manager/controller-manager-79784cf87c-jfjm4"
Mar 08 03:47:45.939539 master-0 kubenswrapper[7547]: I0308 03:47:45.939484 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d4a450c-dac8-445d-abfc-75e4920b676b-serving-cert\") pod \"controller-manager-79784cf87c-jfjm4\" (UID: \"0d4a450c-dac8-445d-abfc-75e4920b676b\") " pod="openshift-controller-manager/controller-manager-79784cf87c-jfjm4"
Mar 08 03:47:45.953359 master-0 kubenswrapper[7547]: I0308 03:47:45.953326 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gnhj\" (UniqueName: \"kubernetes.io/projected/0d4a450c-dac8-445d-abfc-75e4920b676b-kube-api-access-5gnhj\") pod \"controller-manager-79784cf87c-jfjm4\" (UID: \"0d4a450c-dac8-445d-abfc-75e4920b676b\") " pod="openshift-controller-manager/controller-manager-79784cf87c-jfjm4"
Mar 08 03:47:46.438007 master-0 kubenswrapper[7547]: I0308 03:47:46.437828 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0d4a450c-dac8-445d-abfc-75e4920b676b-client-ca\") pod \"controller-manager-79784cf87c-jfjm4\" (UID: \"0d4a450c-dac8-445d-abfc-75e4920b676b\") " pod="openshift-controller-manager/controller-manager-79784cf87c-jfjm4"
Mar 08 03:47:46.438007 master-0 kubenswrapper[7547]: E0308 03:47:46.437913 7547 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found
Mar 08 03:47:46.438007 master-0 kubenswrapper[7547]: E0308 03:47:46.437987 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0d4a450c-dac8-445d-abfc-75e4920b676b-client-ca podName:0d4a450c-dac8-445d-abfc-75e4920b676b nodeName:}" failed. No retries permitted until 2026-03-08 03:47:47.437966183 +0000 UTC m=+30.383650696 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/0d4a450c-dac8-445d-abfc-75e4920b676b-client-ca") pod "controller-manager-79784cf87c-jfjm4" (UID: "0d4a450c-dac8-445d-abfc-75e4920b676b") : configmap "client-ca" not found
Mar 08 03:47:46.660607 master-0 kubenswrapper[7547]: I0308 03:47:46.660498 7547 generic.go:334] "Generic (PLEG): container finished" podID="c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f" containerID="1b384f11136941d514d3d61afdf401bff1b3b2c0f0bf1870bb3feb8e7a8ab041" exitCode=0
Mar 08 03:47:46.660607 master-0 kubenswrapper[7547]: I0308 03:47:46.660553 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-6b779d99b8-7kmck" event={"ID":"c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f","Type":"ContainerDied","Data":"1b384f11136941d514d3d61afdf401bff1b3b2c0f0bf1870bb3feb8e7a8ab041"}
Mar 08 03:47:46.666671 master-0 kubenswrapper[7547]: I0308 03:47:46.666588 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-75qmb" event={"ID":"2f59fe81-deee-4ced-ae9d-f17752c82c4b","Type":"ContainerStarted","Data":"22b9b448ce2f560959561dc1c667d216e6b3f2c9bc3d5e516b6886760df474ff"}
Mar 08 03:47:46.666671 master-0 kubenswrapper[7547]: I0308 03:47:46.666634 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-75qmb" event={"ID":"2f59fe81-deee-4ced-ae9d-f17752c82c4b","Type":"ContainerStarted","Data":"3059f49f388319ee646920103084d28d8b0077750e77df3225c9bad4053dd550"}
Mar 08 03:47:46.666671 master-0 kubenswrapper[7547]: I0308 03:47:46.666647 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-75qmb" event={"ID":"2f59fe81-deee-4ced-ae9d-f17752c82c4b","Type":"ContainerStarted","Data":"40effd583e5274c7bd2d572652f2adc6b94ce532187f776e02a024e30b5ff7e5"}
Mar 08 03:47:46.737951 master-0 kubenswrapper[7547]: I0308 03:47:46.737862 7547 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-75qmb" podStartSLOduration=2.737800788 podStartE2EDuration="2.737800788s" podCreationTimestamp="2026-03-08 03:47:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:47:46.733063576 +0000 UTC m=+29.678748129" watchObservedRunningTime="2026-03-08 03:47:46.737800788 +0000 UTC m=+29.683485341"
Mar 08 03:47:47.242985 master-0 kubenswrapper[7547]: I0308 03:47:47.242389 7547 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4710894b-9971-464b-ae78-0d8520542328" path="/var/lib/kubelet/pods/4710894b-9971-464b-ae78-0d8520542328/volumes"
Mar 08 03:47:47.451549 master-0 kubenswrapper[7547]: I0308 03:47:47.451506 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0d4a450c-dac8-445d-abfc-75e4920b676b-client-ca\") pod \"controller-manager-79784cf87c-jfjm4\" (UID: \"0d4a450c-dac8-445d-abfc-75e4920b676b\") " pod="openshift-controller-manager/controller-manager-79784cf87c-jfjm4"
Mar 08 03:47:47.451771 master-0 kubenswrapper[7547]: E0308 03:47:47.451640 7547 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found
Mar 08 03:47:47.451771 master-0 kubenswrapper[7547]: E0308 03:47:47.451719 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0d4a450c-dac8-445d-abfc-75e4920b676b-client-ca podName:0d4a450c-dac8-445d-abfc-75e4920b676b nodeName:}" failed. No retries permitted until 2026-03-08 03:47:49.45169719 +0000 UTC m=+32.397381723 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/0d4a450c-dac8-445d-abfc-75e4920b676b-client-ca") pod "controller-manager-79784cf87c-jfjm4" (UID: "0d4a450c-dac8-445d-abfc-75e4920b676b") : configmap "client-ca" not found
Mar 08 03:47:47.675887 master-0 kubenswrapper[7547]: I0308 03:47:47.675677 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-6b779d99b8-7kmck" event={"ID":"c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f","Type":"ContainerStarted","Data":"c464658ab15fe15d032030f5ca5268b625c496bcfcf963a484db8e77a86e0ca9"}
Mar 08 03:47:47.675887 master-0 kubenswrapper[7547]: I0308 03:47:47.675723 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-6b779d99b8-7kmck" event={"ID":"c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f","Type":"ContainerStarted","Data":"21c459242ea0ac2e10d80a8aacb088b9039eac881e1ac930e722de979037eaa0"}
Mar 08 03:47:47.677132 master-0 kubenswrapper[7547]: I0308 03:47:47.676012 7547 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-75qmb"
Mar 08 03:47:47.960680 master-0 kubenswrapper[7547]: I0308 03:47:47.960610 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/1b69fbf6-1ca5-413e-bffd-965730bcec1b-catalogserver-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-8h6fj\" (UID: \"1b69fbf6-1ca5-413e-bffd-965730bcec1b\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-8h6fj"
Mar 08 03:47:47.960990 master-0 kubenswrapper[7547]: E0308 03:47:47.960787 7547 secret.go:189] Couldn't get secret openshift-catalogd/catalogserver-cert: secret "catalogserver-cert" not found
Mar 08 03:47:47.960990 master-0 kubenswrapper[7547]: E0308 03:47:47.960866 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b69fbf6-1ca5-413e-bffd-965730bcec1b-catalogserver-certs podName:1b69fbf6-1ca5-413e-bffd-965730bcec1b nodeName:}" failed. No retries permitted until 2026-03-08 03:47:51.960849257 +0000 UTC m=+34.906533760 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "catalogserver-certs" (UniqueName: "kubernetes.io/secret/1b69fbf6-1ca5-413e-bffd-965730bcec1b-catalogserver-certs") pod "catalogd-controller-manager-7f8b8b6f4c-8h6fj" (UID: "1b69fbf6-1ca5-413e-bffd-965730bcec1b") : secret "catalogserver-cert" not found
Mar 08 03:47:48.661111 master-0 kubenswrapper[7547]: I0308 03:47:48.660751 7547 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vfgfp"
Mar 08 03:47:48.684725 master-0 kubenswrapper[7547]: I0308 03:47:48.684651 7547 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-6b779d99b8-7kmck" podStartSLOduration=6.941518926 podStartE2EDuration="12.684634232s" podCreationTimestamp="2026-03-08 03:47:36 +0000 UTC" firstStartedPulling="2026-03-08 03:47:39.799985981 +0000 UTC m=+22.745670524" lastFinishedPulling="2026-03-08 03:47:45.543101307 +0000 UTC m=+28.488785830" observedRunningTime="2026-03-08 03:47:47.717541314 +0000 UTC m=+30.663225867" watchObservedRunningTime="2026-03-08 03:47:48.684634232 +0000 UTC m=+31.630318745"
Mar 08 03:47:49.013538 master-0 kubenswrapper[7547]: I0308 03:47:49.013483 7547 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-6b779d99b8-7kmck"
Mar 08 03:47:49.013538 master-0 kubenswrapper[7547]: I0308 03:47:49.013542 7547 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-6b779d99b8-7kmck"
Mar 08 03:47:49.020801 master-0 kubenswrapper[7547]: I0308 03:47:49.020759 7547 patch_prober.go:28] interesting pod/apiserver-6b779d99b8-7kmck container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Mar 08 03:47:49.020801 master-0 kubenswrapper[7547]: [+]log ok
Mar 08 03:47:49.020801 master-0 kubenswrapper[7547]: [+]etcd ok
Mar 08 03:47:49.020801 master-0 kubenswrapper[7547]: [+]poststarthook/start-apiserver-admission-initializer ok
Mar 08 03:47:49.020801 master-0 kubenswrapper[7547]: [+]poststarthook/generic-apiserver-start-informers ok
Mar 08 03:47:49.020801 master-0 kubenswrapper[7547]: [+]poststarthook/max-in-flight-filter ok
Mar 08 03:47:49.020801 master-0 kubenswrapper[7547]: [+]poststarthook/storage-object-count-tracker-hook ok
Mar 08 03:47:49.020801 master-0 kubenswrapper[7547]: [+]poststarthook/image.openshift.io-apiserver-caches ok
Mar 08 03:47:49.020801 master-0 kubenswrapper[7547]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld
Mar 08 03:47:49.020801 master-0 kubenswrapper[7547]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok
Mar 08 03:47:49.020801 master-0 kubenswrapper[7547]: [+]poststarthook/project.openshift.io-projectcache ok
Mar 08 03:47:49.020801 master-0 kubenswrapper[7547]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok
Mar 08 03:47:49.020801 master-0 kubenswrapper[7547]: [+]poststarthook/openshift.io-startinformers ok
Mar 08 03:47:49.020801 master-0 kubenswrapper[7547]: [+]poststarthook/openshift.io-restmapperupdater ok
Mar 08 03:47:49.020801 master-0 kubenswrapper[7547]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Mar 08 03:47:49.020801 master-0 kubenswrapper[7547]: livez check failed
Mar 08 03:47:49.021369 master-0 kubenswrapper[7547]: I0308 03:47:49.020811 7547 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-6b779d99b8-7kmck" podUID="c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 03:47:49.481211 master-0 kubenswrapper[7547]: I0308 03:47:49.481149 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0d4a450c-dac8-445d-abfc-75e4920b676b-client-ca\") pod \"controller-manager-79784cf87c-jfjm4\" (UID: \"0d4a450c-dac8-445d-abfc-75e4920b676b\") " pod="openshift-controller-manager/controller-manager-79784cf87c-jfjm4"
Mar 08 03:47:49.481674 master-0 kubenswrapper[7547]: E0308 03:47:49.481299 7547 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found
Mar 08 03:47:49.481674 master-0 kubenswrapper[7547]: E0308 03:47:49.481355 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0d4a450c-dac8-445d-abfc-75e4920b676b-client-ca podName:0d4a450c-dac8-445d-abfc-75e4920b676b nodeName:}" failed. No retries permitted until 2026-03-08 03:47:53.481339614 +0000 UTC m=+36.427024117 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/0d4a450c-dac8-445d-abfc-75e4920b676b-client-ca") pod "controller-manager-79784cf87c-jfjm4" (UID: "0d4a450c-dac8-445d-abfc-75e4920b676b") : configmap "client-ca" not found
Mar 08 03:47:50.189221 master-0 kubenswrapper[7547]: I0308 03:47:50.189146 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/0418ff42-7eac-4266-97b5-4df88623d066-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-clqwj\" (UID: \"0418ff42-7eac-4266-97b5-4df88623d066\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-clqwj"
Mar 08 03:47:50.189221 master-0 kubenswrapper[7547]: I0308 03:47:50.189229 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e78b283b-981e-48d7-a5f2-53f8401766ea-auth-proxy-config\") pod \"machine-config-operator-fdb5c78b5-2vjh2\" (UID: \"e78b283b-981e-48d7-a5f2-53f8401766ea\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-2vjh2"
Mar 08 03:47:50.190247 master-0 kubenswrapper[7547]: I0308 03:47:50.189288 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/349d438d-d124-4d34-a172-4160e766c680-serving-cert\") pod \"cluster-version-operator-745944c6b7-gvmnp\" (UID: \"349d438d-d124-4d34-a172-4160e766c680\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-gvmnp"
Mar 08 03:47:50.190247 master-0 kubenswrapper[7547]: I0308 03:47:50.189340 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-572xh\" (UID: \"69eb8ba2-7bfb-4433-8951-08f89e7bcb5f\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-572xh"
Mar 08 03:47:50.190247 master-0 kubenswrapper[7547]: I0308 03:47:50.189389 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/6cde5024-edf7-4fa4-8964-cabe7899578b-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-c46zz\" (UID: \"6cde5024-edf7-4fa4-8964-cabe7899578b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-c46zz"
Mar 08 03:47:50.190247 master-0 kubenswrapper[7547]: E0308 03:47:50.189394 7547 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/kube-rbac-proxy: configmap "kube-rbac-proxy" not found
Mar 08 03:47:50.190247 master-0 kubenswrapper[7547]: E0308 03:47:50.189469 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e78b283b-981e-48d7-a5f2-53f8401766ea-auth-proxy-config podName:e78b283b-981e-48d7-a5f2-53f8401766ea nodeName:}" failed. No retries permitted until 2026-03-08 03:48:22.18944256 +0000 UTC m=+65.135127083 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/e78b283b-981e-48d7-a5f2-53f8401766ea-auth-proxy-config") pod "machine-config-operator-fdb5c78b5-2vjh2" (UID: "e78b283b-981e-48d7-a5f2-53f8401766ea") : configmap "kube-rbac-proxy" not found
Mar 08 03:47:50.192907 master-0 kubenswrapper[7547]: I0308 03:47:50.190781 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-jghp5\" (UID: \"d831cb23-7411-4072-8273-c167d9afca28\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-jghp5"
Mar 08 03:47:50.192907 master-0 kubenswrapper[7547]: I0308 03:47:50.190949 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-qjv52\" (UID: \"232c421d-96f0-4894-b8d8-74f43d02bbd3\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-qjv52"
Mar 08 03:47:50.192907 master-0 kubenswrapper[7547]: I0308 03:47:50.191035 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-jghp5\" (UID: \"d831cb23-7411-4072-8273-c167d9afca28\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-jghp5"
Mar 08 03:47:50.192907 master-0 kubenswrapper[7547]: I0308 03:47:50.191179 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e78b283b-981e-48d7-a5f2-53f8401766ea-proxy-tls\") pod \"machine-config-operator-fdb5c78b5-2vjh2\" (UID:
\"e78b283b-981e-48d7-a5f2-53f8401766ea\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-2vjh2" Mar 08 03:47:50.192907 master-0 kubenswrapper[7547]: I0308 03:47:50.191452 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1482d789-884b-4337-b598-f0e2b71eb9f2-srv-cert\") pod \"catalog-operator-7d9c49f57b-qlfgq\" (UID: \"1482d789-884b-4337-b598-f0e2b71eb9f2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-qlfgq" Mar 08 03:47:50.192907 master-0 kubenswrapper[7547]: I0308 03:47:50.191510 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c9de4939-680a-4e3e-89fd-e20ecb8b10f2-metrics-tls\") pod \"ingress-operator-677db989d6-t77qr\" (UID: \"c9de4939-680a-4e3e-89fd-e20ecb8b10f2\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-t77qr" Mar 08 03:47:50.192907 master-0 kubenswrapper[7547]: I0308 03:47:50.191598 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/2dd4279d-a1a9-450a-a061-9008cd1ea8e0-srv-cert\") pod \"olm-operator-d64cfc9db-qddlp\" (UID: \"2dd4279d-a1a9-450a-a061-9008cd1ea8e0\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qddlp" Mar 08 03:47:50.192907 master-0 kubenswrapper[7547]: I0308 03:47:50.191704 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8efdcef9-9b31-4567-b7f9-cb59a894273d-metrics-tls\") pod \"dns-operator-589895fbb7-xttlz\" (UID: \"8efdcef9-9b31-4567-b7f9-cb59a894273d\") " pod="openshift-dns-operator/dns-operator-589895fbb7-xttlz" Mar 08 03:47:50.192907 master-0 kubenswrapper[7547]: I0308 03:47:50.191773 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-qjv52\" (UID: \"232c421d-96f0-4894-b8d8-74f43d02bbd3\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-qjv52" Mar 08 03:47:50.192907 master-0 kubenswrapper[7547]: I0308 03:47:50.191883 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/54ad284e-d40e-4e69-b898-f5093952a0e6-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-9sw2d\" (UID: \"54ad284e-d40e-4e69-b898-f5093952a0e6\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-9sw2d" Mar 08 03:47:50.192907 master-0 kubenswrapper[7547]: I0308 03:47:50.191931 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1eb851be-f157-48ea-9a39-1361b68d2639-webhook-certs\") pod \"multus-admission-controller-8d675b596-j8pv6\" (UID: \"1eb851be-f157-48ea-9a39-1361b68d2639\") " pod="openshift-multus/multus-admission-controller-8d675b596-j8pv6" Mar 08 03:47:50.199533 master-0 kubenswrapper[7547]: I0308 03:47:50.198721 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/0418ff42-7eac-4266-97b5-4df88623d066-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-clqwj\" (UID: \"0418ff42-7eac-4266-97b5-4df88623d066\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-clqwj" Mar 08 03:47:50.199533 master-0 kubenswrapper[7547]: I0308 03:47:50.199436 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8efdcef9-9b31-4567-b7f9-cb59a894273d-metrics-tls\") pod \"dns-operator-589895fbb7-xttlz\" (UID: \"8efdcef9-9b31-4567-b7f9-cb59a894273d\") " 
pod="openshift-dns-operator/dns-operator-589895fbb7-xttlz" Mar 08 03:47:50.199790 master-0 kubenswrapper[7547]: I0308 03:47:50.199493 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/6cde5024-edf7-4fa4-8964-cabe7899578b-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-c46zz\" (UID: \"6cde5024-edf7-4fa4-8964-cabe7899578b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-c46zz" Mar 08 03:47:50.199790 master-0 kubenswrapper[7547]: I0308 03:47:50.199608 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e78b283b-981e-48d7-a5f2-53f8401766ea-proxy-tls\") pod \"machine-config-operator-fdb5c78b5-2vjh2\" (UID: \"e78b283b-981e-48d7-a5f2-53f8401766ea\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-2vjh2" Mar 08 03:47:50.199790 master-0 kubenswrapper[7547]: I0308 03:47:50.199544 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1eb851be-f157-48ea-9a39-1361b68d2639-webhook-certs\") pod \"multus-admission-controller-8d675b596-j8pv6\" (UID: \"1eb851be-f157-48ea-9a39-1361b68d2639\") " pod="openshift-multus/multus-admission-controller-8d675b596-j8pv6" Mar 08 03:47:50.200043 master-0 kubenswrapper[7547]: I0308 03:47:50.199802 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-jghp5\" (UID: \"d831cb23-7411-4072-8273-c167d9afca28\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-jghp5" Mar 08 03:47:50.200792 master-0 kubenswrapper[7547]: I0308 03:47:50.200513 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"cert\" (UniqueName: \"kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-jghp5\" (UID: \"d831cb23-7411-4072-8273-c167d9afca28\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-jghp5" Mar 08 03:47:50.205265 master-0 kubenswrapper[7547]: I0308 03:47:50.201606 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/349d438d-d124-4d34-a172-4160e766c680-serving-cert\") pod \"cluster-version-operator-745944c6b7-gvmnp\" (UID: \"349d438d-d124-4d34-a172-4160e766c680\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-gvmnp" Mar 08 03:47:50.205265 master-0 kubenswrapper[7547]: I0308 03:47:50.201647 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1482d789-884b-4337-b598-f0e2b71eb9f2-srv-cert\") pod \"catalog-operator-7d9c49f57b-qlfgq\" (UID: \"1482d789-884b-4337-b598-f0e2b71eb9f2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-qlfgq" Mar 08 03:47:50.205265 master-0 kubenswrapper[7547]: I0308 03:47:50.201992 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-qjv52\" (UID: \"232c421d-96f0-4894-b8d8-74f43d02bbd3\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-qjv52" Mar 08 03:47:50.205265 master-0 kubenswrapper[7547]: I0308 03:47:50.203159 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c9de4939-680a-4e3e-89fd-e20ecb8b10f2-metrics-tls\") pod \"ingress-operator-677db989d6-t77qr\" (UID: \"c9de4939-680a-4e3e-89fd-e20ecb8b10f2\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-t77qr" Mar 08 
03:47:50.205265 master-0 kubenswrapper[7547]: I0308 03:47:50.203498 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-qjv52\" (UID: \"232c421d-96f0-4894-b8d8-74f43d02bbd3\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-qjv52" Mar 08 03:47:50.205265 master-0 kubenswrapper[7547]: I0308 03:47:50.203559 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-572xh\" (UID: \"69eb8ba2-7bfb-4433-8951-08f89e7bcb5f\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-572xh" Mar 08 03:47:50.205265 master-0 kubenswrapper[7547]: I0308 03:47:50.203857 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/2dd4279d-a1a9-450a-a061-9008cd1ea8e0-srv-cert\") pod \"olm-operator-d64cfc9db-qddlp\" (UID: \"2dd4279d-a1a9-450a-a061-9008cd1ea8e0\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qddlp" Mar 08 03:47:50.205809 master-0 kubenswrapper[7547]: I0308 03:47:50.205367 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/54ad284e-d40e-4e69-b898-f5093952a0e6-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-9sw2d\" (UID: \"54ad284e-d40e-4e69-b898-f5093952a0e6\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-9sw2d" Mar 08 03:47:50.305793 master-0 kubenswrapper[7547]: I0308 03:47:50.305711 7547 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-clqwj" Mar 08 03:47:50.312209 master-0 kubenswrapper[7547]: I0308 03:47:50.312158 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-jghp5" Mar 08 03:47:50.312209 master-0 kubenswrapper[7547]: I0308 03:47:50.312182 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-677db989d6-t77qr" Mar 08 03:47:50.314524 master-0 kubenswrapper[7547]: I0308 03:47:50.314460 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-c46zz" Mar 08 03:47:50.314645 master-0 kubenswrapper[7547]: I0308 03:47:50.314601 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-64bf9778cb-9sw2d" Mar 08 03:47:50.316639 master-0 kubenswrapper[7547]: I0308 03:47:50.316595 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qddlp" Mar 08 03:47:50.318692 master-0 kubenswrapper[7547]: I0308 03:47:50.318652 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-qjv52" Mar 08 03:47:50.321657 master-0 kubenswrapper[7547]: I0308 03:47:50.321603 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-qlfgq" Mar 08 03:47:50.322698 master-0 kubenswrapper[7547]: I0308 03:47:50.322359 7547 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-589895fbb7-xttlz" Mar 08 03:47:50.322698 master-0 kubenswrapper[7547]: I0308 03:47:50.322432 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-745944c6b7-gvmnp" Mar 08 03:47:50.323037 master-0 kubenswrapper[7547]: I0308 03:47:50.322721 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-572xh" Mar 08 03:47:50.324185 master-0 kubenswrapper[7547]: I0308 03:47:50.324118 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-8d675b596-j8pv6" Mar 08 03:47:50.396186 master-0 kubenswrapper[7547]: I0308 03:47:50.396071 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d5044ffd-0686-4679-9894-e696faf33699-metrics-certs\") pod \"network-metrics-daemon-schjl\" (UID: \"d5044ffd-0686-4679-9894-e696faf33699\") " pod="openshift-multus/network-metrics-daemon-schjl" Mar 08 03:47:50.408541 master-0 kubenswrapper[7547]: I0308 03:47:50.408419 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d5044ffd-0686-4679-9894-e696faf33699-metrics-certs\") pod \"network-metrics-daemon-schjl\" (UID: \"d5044ffd-0686-4679-9894-e696faf33699\") " pod="openshift-multus/network-metrics-daemon-schjl" Mar 08 03:47:50.630375 master-0 kubenswrapper[7547]: I0308 03:47:50.629909 7547 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-schjl" Mar 08 03:47:50.658646 master-0 kubenswrapper[7547]: I0308 03:47:50.658607 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-clqwj"] Mar 08 03:47:50.685649 master-0 kubenswrapper[7547]: I0308 03:47:50.685603 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-745944c6b7-gvmnp" event={"ID":"349d438d-d124-4d34-a172-4160e766c680","Type":"ContainerStarted","Data":"f8e5340277c14fbc3c84261d569623ece5bc22cf9c8aa4ec6d8e525cc14aeb64"} Mar 08 03:47:50.970264 master-0 kubenswrapper[7547]: I0308 03:47:50.970213 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-qlfgq"] Mar 08 03:47:50.975751 master-0 kubenswrapper[7547]: W0308 03:47:50.975700 7547 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1482d789_884b_4337_b598_f0e2b71eb9f2.slice/crio-95267fdf9dd0903f45fd631ce455eb67b79bbeabc1d7f2fb9fb37ed66199c9e6 WatchSource:0}: Error finding container 95267fdf9dd0903f45fd631ce455eb67b79bbeabc1d7f2fb9fb37ed66199c9e6: Status 404 returned error can't find the container with id 95267fdf9dd0903f45fd631ce455eb67b79bbeabc1d7f2fb9fb37ed66199c9e6 Mar 08 03:47:51.021178 master-0 kubenswrapper[7547]: I0308 03:47:51.021142 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-schjl"] Mar 08 03:47:51.024599 master-0 kubenswrapper[7547]: W0308 03:47:51.024558 7547 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5044ffd_0686_4679_9894_e696faf33699.slice/crio-b57da2053e178390131e07c76308479420cda5a65a12ef7fa425c01959c1b9c5 WatchSource:0}: Error finding container 
b57da2053e178390131e07c76308479420cda5a65a12ef7fa425c01959c1b9c5: Status 404 returned error can't find the container with id b57da2053e178390131e07c76308479420cda5a65a12ef7fa425c01959c1b9c5 Mar 08 03:47:51.085172 master-0 kubenswrapper[7547]: I0308 03:47:51.084711 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-589895fbb7-xttlz"] Mar 08 03:47:51.103714 master-0 kubenswrapper[7547]: I0308 03:47:51.102237 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-572xh"] Mar 08 03:47:51.105291 master-0 kubenswrapper[7547]: I0308 03:47:51.104093 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-8d675b596-j8pv6"] Mar 08 03:47:51.113768 master-0 kubenswrapper[7547]: I0308 03:47:51.107021 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qddlp"] Mar 08 03:47:51.116994 master-0 kubenswrapper[7547]: I0308 03:47:51.116433 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-677db989d6-t77qr"] Mar 08 03:47:51.126096 master-0 kubenswrapper[7547]: I0308 03:47:51.124958 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-jghp5"] Mar 08 03:47:51.127445 master-0 kubenswrapper[7547]: I0308 03:47:51.126628 7547 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Mar 08 03:47:51.127445 master-0 kubenswrapper[7547]: I0308 03:47:51.127025 7547 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/installer-1-master-0" podUID="70c6db8e-1612-4da7-84ad-0750258e310e" containerName="installer" containerID="cri-o://cef32873cbe3d6b97478c715fa10b9fc2b7f6472cbc949089347f2bceb34bb4a" gracePeriod=30 Mar 08 03:47:51.128400 master-0 
kubenswrapper[7547]: W0308 03:47:51.128053 7547 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2dd4279d_a1a9_450a_a061_9008cd1ea8e0.slice/crio-201fec6682662fad0acc461d7db4c8e108b597d14fa258495dcdfa10f6e193b5 WatchSource:0}: Error finding container 201fec6682662fad0acc461d7db4c8e108b597d14fa258495dcdfa10f6e193b5: Status 404 returned error can't find the container with id 201fec6682662fad0acc461d7db4c8e108b597d14fa258495dcdfa10f6e193b5 Mar 08 03:47:51.135173 master-0 kubenswrapper[7547]: I0308 03:47:51.135124 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-c46zz"] Mar 08 03:47:51.136721 master-0 kubenswrapper[7547]: I0308 03:47:51.136686 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-qjv52"] Mar 08 03:47:51.138988 master-0 kubenswrapper[7547]: I0308 03:47:51.138863 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-64bf9778cb-9sw2d"] Mar 08 03:47:51.163333 master-0 kubenswrapper[7547]: W0308 03:47:51.163299 7547 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd831cb23_7411_4072_8273_c167d9afca28.slice/crio-2f60507250e058dbc73dd9a2defceea722aafde0bbc43ed7857b1626b36814fe WatchSource:0}: Error finding container 2f60507250e058dbc73dd9a2defceea722aafde0bbc43ed7857b1626b36814fe: Status 404 returned error can't find the container with id 2f60507250e058dbc73dd9a2defceea722aafde0bbc43ed7857b1626b36814fe Mar 08 03:47:51.165015 master-0 kubenswrapper[7547]: W0308 03:47:51.164872 7547 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod232c421d_96f0_4894_b8d8_74f43d02bbd3.slice/crio-34c7c8e7bf6b2608a8ed06595ef49b7b5823fe04e62631a07f3cbbca5adb876a WatchSource:0}: Error finding container 34c7c8e7bf6b2608a8ed06595ef49b7b5823fe04e62631a07f3cbbca5adb876a: Status 404 returned error can't find the container with id 34c7c8e7bf6b2608a8ed06595ef49b7b5823fe04e62631a07f3cbbca5adb876a Mar 08 03:47:51.165124 master-0 kubenswrapper[7547]: W0308 03:47:51.165104 7547 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54ad284e_d40e_4e69_b898_f5093952a0e6.slice/crio-b7854fff4e6d290ba66d677fb1f4c348702f3c168d271f4daa5e0ff010a39d54 WatchSource:0}: Error finding container b7854fff4e6d290ba66d677fb1f4c348702f3c168d271f4daa5e0ff010a39d54: Status 404 returned error can't find the container with id b7854fff4e6d290ba66d677fb1f4c348702f3c168d271f4daa5e0ff010a39d54 Mar 08 03:47:51.389637 master-0 kubenswrapper[7547]: I0308 03:47:51.389583 7547 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xmgpj" Mar 08 03:47:51.548612 master-0 kubenswrapper[7547]: I0308 03:47:51.548540 7547 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7695b9f8b5-4jpgl"] Mar 08 03:47:51.549354 master-0 kubenswrapper[7547]: I0308 03:47:51.549314 7547 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7695b9f8b5-4jpgl" Mar 08 03:47:51.553243 master-0 kubenswrapper[7547]: I0308 03:47:51.552298 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 08 03:47:51.553243 master-0 kubenswrapper[7547]: I0308 03:47:51.552372 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 08 03:47:51.554258 master-0 kubenswrapper[7547]: I0308 03:47:51.554203 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 08 03:47:51.554394 master-0 kubenswrapper[7547]: I0308 03:47:51.554381 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 08 03:47:51.555293 master-0 kubenswrapper[7547]: I0308 03:47:51.554504 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 08 03:47:51.555293 master-0 kubenswrapper[7547]: I0308 03:47:51.554626 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 08 03:47:51.555293 master-0 kubenswrapper[7547]: I0308 03:47:51.554755 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 08 03:47:51.555454 master-0 kubenswrapper[7547]: I0308 03:47:51.555368 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 08 03:47:51.561069 master-0 kubenswrapper[7547]: I0308 03:47:51.561027 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7695b9f8b5-4jpgl"] Mar 08 03:47:51.690755 master-0 kubenswrapper[7547]: I0308 03:47:51.690505 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-572xh" event={"ID":"69eb8ba2-7bfb-4433-8951-08f89e7bcb5f","Type":"ContainerStarted","Data":"ff14038e05786c394b22fac9e7aff676f3eef7f98d2a8dbbbe2bd0a62e05aecf"} Mar 08 03:47:51.691654 master-0 kubenswrapper[7547]: I0308 03:47:51.691621 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-64bf9778cb-9sw2d" event={"ID":"54ad284e-d40e-4e69-b898-f5093952a0e6","Type":"ContainerStarted","Data":"b7854fff4e6d290ba66d677fb1f4c348702f3c168d271f4daa5e0ff010a39d54"} Mar 08 03:47:51.692773 master-0 kubenswrapper[7547]: I0308 03:47:51.692738 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-schjl" event={"ID":"d5044ffd-0686-4679-9894-e696faf33699","Type":"ContainerStarted","Data":"b57da2053e178390131e07c76308479420cda5a65a12ef7fa425c01959c1b9c5"} Mar 08 03:47:51.693521 master-0 kubenswrapper[7547]: I0308 03:47:51.693487 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-qjv52" event={"ID":"232c421d-96f0-4894-b8d8-74f43d02bbd3","Type":"ContainerStarted","Data":"34c7c8e7bf6b2608a8ed06595ef49b7b5823fe04e62631a07f3cbbca5adb876a"} Mar 08 03:47:51.694358 master-0 kubenswrapper[7547]: I0308 03:47:51.694315 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-8d675b596-j8pv6" event={"ID":"1eb851be-f157-48ea-9a39-1361b68d2639","Type":"ContainerStarted","Data":"48d46b7645a64ea18f3fa334445c914bbcaaadce3a50f149dedad680b9f63699"} Mar 08 03:47:51.697747 master-0 kubenswrapper[7547]: I0308 03:47:51.697702 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-589895fbb7-xttlz" event={"ID":"8efdcef9-9b31-4567-b7f9-cb59a894273d","Type":"ContainerStarted","Data":"b1a0209e0a9a4093bed7068dc639e8f6b3aa1b820bc97e5ac17eab47d3a362ec"} Mar 08 03:47:51.699714 
master-0 kubenswrapper[7547]: I0308 03:47:51.699684 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-c46zz" event={"ID":"6cde5024-edf7-4fa4-8964-cabe7899578b","Type":"ContainerStarted","Data":"ee39fc7bda15ddcb4dcdcb53f1e775ccad24404dd5a1d532e8d9c574c47c2bb1"} Mar 08 03:47:51.699714 master-0 kubenswrapper[7547]: I0308 03:47:51.699714 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-c46zz" event={"ID":"6cde5024-edf7-4fa4-8964-cabe7899578b","Type":"ContainerStarted","Data":"24f253f45cdf8fe83c1408fe0ce3848ec429687603d0f7eff71df3320c693f47"} Mar 08 03:47:51.700769 master-0 kubenswrapper[7547]: I0308 03:47:51.700744 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-jghp5" event={"ID":"d831cb23-7411-4072-8273-c167d9afca28","Type":"ContainerStarted","Data":"2f60507250e058dbc73dd9a2defceea722aafde0bbc43ed7857b1626b36814fe"} Mar 08 03:47:51.701818 master-0 kubenswrapper[7547]: I0308 03:47:51.701472 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-qlfgq" event={"ID":"1482d789-884b-4337-b598-f0e2b71eb9f2","Type":"ContainerStarted","Data":"95267fdf9dd0903f45fd631ce455eb67b79bbeabc1d7f2fb9fb37ed66199c9e6"} Mar 08 03:47:51.702466 master-0 kubenswrapper[7547]: I0308 03:47:51.702416 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qddlp" event={"ID":"2dd4279d-a1a9-450a-a061-9008cd1ea8e0","Type":"ContainerStarted","Data":"201fec6682662fad0acc461d7db4c8e108b597d14fa258495dcdfa10f6e193b5"} Mar 08 03:47:51.703410 master-0 kubenswrapper[7547]: I0308 03:47:51.703375 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-clqwj" 
event={"ID":"0418ff42-7eac-4266-97b5-4df88623d066","Type":"ContainerStarted","Data":"f7ac4af0eeac6f90547286a05d56708b5e0e75b0367c4826038733ce85075489"} Mar 08 03:47:51.704613 master-0 kubenswrapper[7547]: I0308 03:47:51.704587 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-677db989d6-t77qr" event={"ID":"c9de4939-680a-4e3e-89fd-e20ecb8b10f2","Type":"ContainerStarted","Data":"c6cc97386aa9d8bb895c877b8849fa8dc27a6fe973ccc0760f4274b321682e77"} Mar 08 03:47:51.714492 master-0 kubenswrapper[7547]: I0308 03:47:51.714450 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtts2\" (UniqueName: \"kubernetes.io/projected/76ba45a2-8945-4afe-b913-126c26725867-kube-api-access-dtts2\") pod \"apiserver-7695b9f8b5-4jpgl\" (UID: \"76ba45a2-8945-4afe-b913-126c26725867\") " pod="openshift-oauth-apiserver/apiserver-7695b9f8b5-4jpgl" Mar 08 03:47:51.714585 master-0 kubenswrapper[7547]: I0308 03:47:51.714560 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/76ba45a2-8945-4afe-b913-126c26725867-etcd-client\") pod \"apiserver-7695b9f8b5-4jpgl\" (UID: \"76ba45a2-8945-4afe-b913-126c26725867\") " pod="openshift-oauth-apiserver/apiserver-7695b9f8b5-4jpgl" Mar 08 03:47:51.714633 master-0 kubenswrapper[7547]: I0308 03:47:51.714589 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/76ba45a2-8945-4afe-b913-126c26725867-etcd-serving-ca\") pod \"apiserver-7695b9f8b5-4jpgl\" (UID: \"76ba45a2-8945-4afe-b913-126c26725867\") " pod="openshift-oauth-apiserver/apiserver-7695b9f8b5-4jpgl" Mar 08 03:47:51.714633 master-0 kubenswrapper[7547]: I0308 03:47:51.714608 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"encryption-config\" (UniqueName: \"kubernetes.io/secret/76ba45a2-8945-4afe-b913-126c26725867-encryption-config\") pod \"apiserver-7695b9f8b5-4jpgl\" (UID: \"76ba45a2-8945-4afe-b913-126c26725867\") " pod="openshift-oauth-apiserver/apiserver-7695b9f8b5-4jpgl" Mar 08 03:47:51.714633 master-0 kubenswrapper[7547]: I0308 03:47:51.714633 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/76ba45a2-8945-4afe-b913-126c26725867-audit-policies\") pod \"apiserver-7695b9f8b5-4jpgl\" (UID: \"76ba45a2-8945-4afe-b913-126c26725867\") " pod="openshift-oauth-apiserver/apiserver-7695b9f8b5-4jpgl" Mar 08 03:47:51.714742 master-0 kubenswrapper[7547]: I0308 03:47:51.714677 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76ba45a2-8945-4afe-b913-126c26725867-serving-cert\") pod \"apiserver-7695b9f8b5-4jpgl\" (UID: \"76ba45a2-8945-4afe-b913-126c26725867\") " pod="openshift-oauth-apiserver/apiserver-7695b9f8b5-4jpgl" Mar 08 03:47:51.714742 master-0 kubenswrapper[7547]: I0308 03:47:51.714695 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/76ba45a2-8945-4afe-b913-126c26725867-audit-dir\") pod \"apiserver-7695b9f8b5-4jpgl\" (UID: \"76ba45a2-8945-4afe-b913-126c26725867\") " pod="openshift-oauth-apiserver/apiserver-7695b9f8b5-4jpgl" Mar 08 03:47:51.714742 master-0 kubenswrapper[7547]: I0308 03:47:51.714716 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/76ba45a2-8945-4afe-b913-126c26725867-trusted-ca-bundle\") pod \"apiserver-7695b9f8b5-4jpgl\" (UID: \"76ba45a2-8945-4afe-b913-126c26725867\") " pod="openshift-oauth-apiserver/apiserver-7695b9f8b5-4jpgl" Mar 08 03:47:51.815600 
master-0 kubenswrapper[7547]: I0308 03:47:51.815497 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/76ba45a2-8945-4afe-b913-126c26725867-audit-policies\") pod \"apiserver-7695b9f8b5-4jpgl\" (UID: \"76ba45a2-8945-4afe-b913-126c26725867\") " pod="openshift-oauth-apiserver/apiserver-7695b9f8b5-4jpgl" Mar 08 03:47:51.815600 master-0 kubenswrapper[7547]: I0308 03:47:51.815552 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76ba45a2-8945-4afe-b913-126c26725867-serving-cert\") pod \"apiserver-7695b9f8b5-4jpgl\" (UID: \"76ba45a2-8945-4afe-b913-126c26725867\") " pod="openshift-oauth-apiserver/apiserver-7695b9f8b5-4jpgl" Mar 08 03:47:51.815600 master-0 kubenswrapper[7547]: I0308 03:47:51.815573 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/76ba45a2-8945-4afe-b913-126c26725867-audit-dir\") pod \"apiserver-7695b9f8b5-4jpgl\" (UID: \"76ba45a2-8945-4afe-b913-126c26725867\") " pod="openshift-oauth-apiserver/apiserver-7695b9f8b5-4jpgl" Mar 08 03:47:51.815873 master-0 kubenswrapper[7547]: I0308 03:47:51.815778 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/76ba45a2-8945-4afe-b913-126c26725867-trusted-ca-bundle\") pod \"apiserver-7695b9f8b5-4jpgl\" (UID: \"76ba45a2-8945-4afe-b913-126c26725867\") " pod="openshift-oauth-apiserver/apiserver-7695b9f8b5-4jpgl" Mar 08 03:47:51.815923 master-0 kubenswrapper[7547]: I0308 03:47:51.815866 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/76ba45a2-8945-4afe-b913-126c26725867-audit-dir\") pod \"apiserver-7695b9f8b5-4jpgl\" (UID: \"76ba45a2-8945-4afe-b913-126c26725867\") " 
pod="openshift-oauth-apiserver/apiserver-7695b9f8b5-4jpgl" Mar 08 03:47:51.815923 master-0 kubenswrapper[7547]: I0308 03:47:51.815911 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtts2\" (UniqueName: \"kubernetes.io/projected/76ba45a2-8945-4afe-b913-126c26725867-kube-api-access-dtts2\") pod \"apiserver-7695b9f8b5-4jpgl\" (UID: \"76ba45a2-8945-4afe-b913-126c26725867\") " pod="openshift-oauth-apiserver/apiserver-7695b9f8b5-4jpgl" Mar 08 03:47:51.815997 master-0 kubenswrapper[7547]: I0308 03:47:51.815967 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/76ba45a2-8945-4afe-b913-126c26725867-etcd-client\") pod \"apiserver-7695b9f8b5-4jpgl\" (UID: \"76ba45a2-8945-4afe-b913-126c26725867\") " pod="openshift-oauth-apiserver/apiserver-7695b9f8b5-4jpgl" Mar 08 03:47:51.816043 master-0 kubenswrapper[7547]: I0308 03:47:51.815998 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/76ba45a2-8945-4afe-b913-126c26725867-etcd-serving-ca\") pod \"apiserver-7695b9f8b5-4jpgl\" (UID: \"76ba45a2-8945-4afe-b913-126c26725867\") " pod="openshift-oauth-apiserver/apiserver-7695b9f8b5-4jpgl" Mar 08 03:47:51.816043 master-0 kubenswrapper[7547]: I0308 03:47:51.816023 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/76ba45a2-8945-4afe-b913-126c26725867-encryption-config\") pod \"apiserver-7695b9f8b5-4jpgl\" (UID: \"76ba45a2-8945-4afe-b913-126c26725867\") " pod="openshift-oauth-apiserver/apiserver-7695b9f8b5-4jpgl" Mar 08 03:47:51.816302 master-0 kubenswrapper[7547]: I0308 03:47:51.816272 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/76ba45a2-8945-4afe-b913-126c26725867-audit-policies\") pod 
\"apiserver-7695b9f8b5-4jpgl\" (UID: \"76ba45a2-8945-4afe-b913-126c26725867\") " pod="openshift-oauth-apiserver/apiserver-7695b9f8b5-4jpgl" Mar 08 03:47:51.818106 master-0 kubenswrapper[7547]: I0308 03:47:51.817334 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/76ba45a2-8945-4afe-b913-126c26725867-etcd-serving-ca\") pod \"apiserver-7695b9f8b5-4jpgl\" (UID: \"76ba45a2-8945-4afe-b913-126c26725867\") " pod="openshift-oauth-apiserver/apiserver-7695b9f8b5-4jpgl" Mar 08 03:47:51.819146 master-0 kubenswrapper[7547]: I0308 03:47:51.819102 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/76ba45a2-8945-4afe-b913-126c26725867-trusted-ca-bundle\") pod \"apiserver-7695b9f8b5-4jpgl\" (UID: \"76ba45a2-8945-4afe-b913-126c26725867\") " pod="openshift-oauth-apiserver/apiserver-7695b9f8b5-4jpgl" Mar 08 03:47:51.842868 master-0 kubenswrapper[7547]: I0308 03:47:51.841006 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/76ba45a2-8945-4afe-b913-126c26725867-etcd-client\") pod \"apiserver-7695b9f8b5-4jpgl\" (UID: \"76ba45a2-8945-4afe-b913-126c26725867\") " pod="openshift-oauth-apiserver/apiserver-7695b9f8b5-4jpgl" Mar 08 03:47:51.857879 master-0 kubenswrapper[7547]: I0308 03:47:51.846450 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/76ba45a2-8945-4afe-b913-126c26725867-encryption-config\") pod \"apiserver-7695b9f8b5-4jpgl\" (UID: \"76ba45a2-8945-4afe-b913-126c26725867\") " pod="openshift-oauth-apiserver/apiserver-7695b9f8b5-4jpgl" Mar 08 03:47:51.857879 master-0 kubenswrapper[7547]: I0308 03:47:51.855692 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtts2\" (UniqueName: 
\"kubernetes.io/projected/76ba45a2-8945-4afe-b913-126c26725867-kube-api-access-dtts2\") pod \"apiserver-7695b9f8b5-4jpgl\" (UID: \"76ba45a2-8945-4afe-b913-126c26725867\") " pod="openshift-oauth-apiserver/apiserver-7695b9f8b5-4jpgl" Mar 08 03:47:51.880877 master-0 kubenswrapper[7547]: I0308 03:47:51.877031 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76ba45a2-8945-4afe-b913-126c26725867-serving-cert\") pod \"apiserver-7695b9f8b5-4jpgl\" (UID: \"76ba45a2-8945-4afe-b913-126c26725867\") " pod="openshift-oauth-apiserver/apiserver-7695b9f8b5-4jpgl" Mar 08 03:47:51.883342 master-0 kubenswrapper[7547]: I0308 03:47:51.883241 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7695b9f8b5-4jpgl" Mar 08 03:47:52.018216 master-0 kubenswrapper[7547]: I0308 03:47:52.017613 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/1b69fbf6-1ca5-413e-bffd-965730bcec1b-catalogserver-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-8h6fj\" (UID: \"1b69fbf6-1ca5-413e-bffd-965730bcec1b\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-8h6fj" Mar 08 03:47:52.030005 master-0 kubenswrapper[7547]: I0308 03:47:52.025915 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/1b69fbf6-1ca5-413e-bffd-965730bcec1b-catalogserver-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-8h6fj\" (UID: \"1b69fbf6-1ca5-413e-bffd-965730bcec1b\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-8h6fj" Mar 08 03:47:52.093908 master-0 kubenswrapper[7547]: I0308 03:47:52.092838 7547 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-8h6fj" Mar 08 03:47:52.144641 master-0 kubenswrapper[7547]: I0308 03:47:52.144601 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7695b9f8b5-4jpgl"] Mar 08 03:47:52.151807 master-0 kubenswrapper[7547]: W0308 03:47:52.151750 7547 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76ba45a2_8945_4afe_b913_126c26725867.slice/crio-59d3785f249896d312f7b27b04e39fa314d9b06309adfc4aa055444977f4fa7e WatchSource:0}: Error finding container 59d3785f249896d312f7b27b04e39fa314d9b06309adfc4aa055444977f4fa7e: Status 404 returned error can't find the container with id 59d3785f249896d312f7b27b04e39fa314d9b06309adfc4aa055444977f4fa7e Mar 08 03:47:52.270266 master-0 kubenswrapper[7547]: I0308 03:47:52.270209 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-8h6fj"] Mar 08 03:47:52.282068 master-0 kubenswrapper[7547]: W0308 03:47:52.281919 7547 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b69fbf6_1ca5_413e_bffd_965730bcec1b.slice/crio-718b320dd5408c5dda7ee606dfc45fd377e09b1616f83b01ddd6bedbab6de149 WatchSource:0}: Error finding container 718b320dd5408c5dda7ee606dfc45fd377e09b1616f83b01ddd6bedbab6de149: Status 404 returned error can't find the container with id 718b320dd5408c5dda7ee606dfc45fd377e09b1616f83b01ddd6bedbab6de149 Mar 08 03:47:52.720621 master-0 kubenswrapper[7547]: I0308 03:47:52.719935 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7695b9f8b5-4jpgl" event={"ID":"76ba45a2-8945-4afe-b913-126c26725867","Type":"ContainerStarted","Data":"59d3785f249896d312f7b27b04e39fa314d9b06309adfc4aa055444977f4fa7e"} Mar 08 03:47:52.722362 master-0 kubenswrapper[7547]: I0308 
03:47:52.722322 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-8h6fj" event={"ID":"1b69fbf6-1ca5-413e-bffd-965730bcec1b","Type":"ContainerStarted","Data":"89ad8d8104c873f50af7002b0d518d119a93e4a92c6cc61a41456cd88a3208e5"} Mar 08 03:47:52.722362 master-0 kubenswrapper[7547]: I0308 03:47:52.722355 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-8h6fj" event={"ID":"1b69fbf6-1ca5-413e-bffd-965730bcec1b","Type":"ContainerStarted","Data":"718b320dd5408c5dda7ee606dfc45fd377e09b1616f83b01ddd6bedbab6de149"} Mar 08 03:47:53.538258 master-0 kubenswrapper[7547]: I0308 03:47:53.538172 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0d4a450c-dac8-445d-abfc-75e4920b676b-client-ca\") pod \"controller-manager-79784cf87c-jfjm4\" (UID: \"0d4a450c-dac8-445d-abfc-75e4920b676b\") " pod="openshift-controller-manager/controller-manager-79784cf87c-jfjm4" Mar 08 03:47:53.538587 master-0 kubenswrapper[7547]: E0308 03:47:53.538303 7547 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Mar 08 03:47:53.538587 master-0 kubenswrapper[7547]: E0308 03:47:53.538357 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0d4a450c-dac8-445d-abfc-75e4920b676b-client-ca podName:0d4a450c-dac8-445d-abfc-75e4920b676b nodeName:}" failed. No retries permitted until 2026-03-08 03:48:01.538339521 +0000 UTC m=+44.484024034 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/0d4a450c-dac8-445d-abfc-75e4920b676b-client-ca") pod "controller-manager-79784cf87c-jfjm4" (UID: "0d4a450c-dac8-445d-abfc-75e4920b676b") : configmap "client-ca" not found Mar 08 03:47:53.706553 master-0 kubenswrapper[7547]: I0308 03:47:53.706463 7547 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"] Mar 08 03:47:53.707347 master-0 kubenswrapper[7547]: I0308 03:47:53.707307 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-2-master-0" Mar 08 03:47:53.717982 master-0 kubenswrapper[7547]: I0308 03:47:53.717442 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"] Mar 08 03:47:53.841284 master-0 kubenswrapper[7547]: I0308 03:47:53.841165 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0e9feea5-1d1a-4a7b-a9be-6a46cedfdd3c-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"0e9feea5-1d1a-4a7b-a9be-6a46cedfdd3c\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 08 03:47:53.841284 master-0 kubenswrapper[7547]: I0308 03:47:53.841218 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0e9feea5-1d1a-4a7b-a9be-6a46cedfdd3c-var-lock\") pod \"installer-2-master-0\" (UID: \"0e9feea5-1d1a-4a7b-a9be-6a46cedfdd3c\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 08 03:47:53.842452 master-0 kubenswrapper[7547]: I0308 03:47:53.841484 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0e9feea5-1d1a-4a7b-a9be-6a46cedfdd3c-kube-api-access\") pod \"installer-2-master-0\" (UID: 
\"0e9feea5-1d1a-4a7b-a9be-6a46cedfdd3c\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 08 03:47:53.942685 master-0 kubenswrapper[7547]: I0308 03:47:53.942616 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0e9feea5-1d1a-4a7b-a9be-6a46cedfdd3c-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"0e9feea5-1d1a-4a7b-a9be-6a46cedfdd3c\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 08 03:47:53.943113 master-0 kubenswrapper[7547]: I0308 03:47:53.942734 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0e9feea5-1d1a-4a7b-a9be-6a46cedfdd3c-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"0e9feea5-1d1a-4a7b-a9be-6a46cedfdd3c\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 08 03:47:53.943113 master-0 kubenswrapper[7547]: I0308 03:47:53.942837 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0e9feea5-1d1a-4a7b-a9be-6a46cedfdd3c-var-lock\") pod \"installer-2-master-0\" (UID: \"0e9feea5-1d1a-4a7b-a9be-6a46cedfdd3c\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 08 03:47:53.943113 master-0 kubenswrapper[7547]: I0308 03:47:53.942890 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0e9feea5-1d1a-4a7b-a9be-6a46cedfdd3c-kube-api-access\") pod \"installer-2-master-0\" (UID: \"0e9feea5-1d1a-4a7b-a9be-6a46cedfdd3c\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 08 03:47:53.943113 master-0 kubenswrapper[7547]: I0308 03:47:53.942911 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0e9feea5-1d1a-4a7b-a9be-6a46cedfdd3c-var-lock\") pod \"installer-2-master-0\" (UID: \"0e9feea5-1d1a-4a7b-a9be-6a46cedfdd3c\") " 
pod="openshift-kube-scheduler/installer-2-master-0" Mar 08 03:47:53.966447 master-0 kubenswrapper[7547]: I0308 03:47:53.966357 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0e9feea5-1d1a-4a7b-a9be-6a46cedfdd3c-kube-api-access\") pod \"installer-2-master-0\" (UID: \"0e9feea5-1d1a-4a7b-a9be-6a46cedfdd3c\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 08 03:47:54.018342 master-0 kubenswrapper[7547]: I0308 03:47:54.018270 7547 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-6b779d99b8-7kmck" Mar 08 03:47:54.021938 master-0 kubenswrapper[7547]: I0308 03:47:54.021900 7547 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-6b779d99b8-7kmck" Mar 08 03:47:54.026500 master-0 kubenswrapper[7547]: I0308 03:47:54.026452 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-2-master-0" Mar 08 03:47:54.162911 master-0 kubenswrapper[7547]: I0308 03:47:54.158696 7547 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/installer-1-master-0"] Mar 08 03:47:54.162911 master-0 kubenswrapper[7547]: I0308 03:47:54.159224 7547 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-1-master-0" Mar 08 03:47:54.170201 master-0 kubenswrapper[7547]: I0308 03:47:54.169098 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd"/"kube-root-ca.crt" Mar 08 03:47:54.175847 master-0 kubenswrapper[7547]: I0308 03:47:54.170981 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-1-master-0"] Mar 08 03:47:54.254073 master-0 kubenswrapper[7547]: I0308 03:47:54.249972 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9b02b2c9-9ee3-436c-aa36-0c23e5fdf07b-kube-api-access\") pod \"installer-1-master-0\" (UID: \"9b02b2c9-9ee3-436c-aa36-0c23e5fdf07b\") " pod="openshift-etcd/installer-1-master-0" Mar 08 03:47:54.254073 master-0 kubenswrapper[7547]: I0308 03:47:54.250059 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/9b02b2c9-9ee3-436c-aa36-0c23e5fdf07b-var-lock\") pod \"installer-1-master-0\" (UID: \"9b02b2c9-9ee3-436c-aa36-0c23e5fdf07b\") " pod="openshift-etcd/installer-1-master-0" Mar 08 03:47:54.254073 master-0 kubenswrapper[7547]: I0308 03:47:54.250177 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9b02b2c9-9ee3-436c-aa36-0c23e5fdf07b-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"9b02b2c9-9ee3-436c-aa36-0c23e5fdf07b\") " pod="openshift-etcd/installer-1-master-0" Mar 08 03:47:54.351355 master-0 kubenswrapper[7547]: I0308 03:47:54.351212 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/9b02b2c9-9ee3-436c-aa36-0c23e5fdf07b-var-lock\") pod \"installer-1-master-0\" (UID: \"9b02b2c9-9ee3-436c-aa36-0c23e5fdf07b\") " 
pod="openshift-etcd/installer-1-master-0" Mar 08 03:47:54.351355 master-0 kubenswrapper[7547]: I0308 03:47:54.351277 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9b02b2c9-9ee3-436c-aa36-0c23e5fdf07b-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"9b02b2c9-9ee3-436c-aa36-0c23e5fdf07b\") " pod="openshift-etcd/installer-1-master-0" Mar 08 03:47:54.351355 master-0 kubenswrapper[7547]: I0308 03:47:54.351324 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/9b02b2c9-9ee3-436c-aa36-0c23e5fdf07b-var-lock\") pod \"installer-1-master-0\" (UID: \"9b02b2c9-9ee3-436c-aa36-0c23e5fdf07b\") " pod="openshift-etcd/installer-1-master-0" Mar 08 03:47:54.351581 master-0 kubenswrapper[7547]: I0308 03:47:54.351373 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9b02b2c9-9ee3-436c-aa36-0c23e5fdf07b-kube-api-access\") pod \"installer-1-master-0\" (UID: \"9b02b2c9-9ee3-436c-aa36-0c23e5fdf07b\") " pod="openshift-etcd/installer-1-master-0" Mar 08 03:47:54.351581 master-0 kubenswrapper[7547]: I0308 03:47:54.351377 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9b02b2c9-9ee3-436c-aa36-0c23e5fdf07b-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"9b02b2c9-9ee3-436c-aa36-0c23e5fdf07b\") " pod="openshift-etcd/installer-1-master-0" Mar 08 03:47:54.365688 master-0 kubenswrapper[7547]: I0308 03:47:54.365601 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9b02b2c9-9ee3-436c-aa36-0c23e5fdf07b-kube-api-access\") pod \"installer-1-master-0\" (UID: \"9b02b2c9-9ee3-436c-aa36-0c23e5fdf07b\") " pod="openshift-etcd/installer-1-master-0" Mar 08 03:47:54.500897 master-0 
kubenswrapper[7547]: I0308 03:47:54.500070 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-1-master-0" Mar 08 03:47:54.574313 master-0 kubenswrapper[7547]: I0308 03:47:54.574200 7547 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-75qmb" Mar 08 03:47:55.689091 master-0 kubenswrapper[7547]: I0308 03:47:55.688141 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22ebd67b-43b2-4f9d-955b-eb848d9d55d4-serving-cert\") pod \"route-controller-manager-6d8669fddc-zn4lq\" (UID: \"22ebd67b-43b2-4f9d-955b-eb848d9d55d4\") " pod="openshift-route-controller-manager/route-controller-manager-6d8669fddc-zn4lq" Mar 08 03:47:55.689091 master-0 kubenswrapper[7547]: I0308 03:47:55.688217 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/22ebd67b-43b2-4f9d-955b-eb848d9d55d4-client-ca\") pod \"route-controller-manager-6d8669fddc-zn4lq\" (UID: \"22ebd67b-43b2-4f9d-955b-eb848d9d55d4\") " pod="openshift-route-controller-manager/route-controller-manager-6d8669fddc-zn4lq" Mar 08 03:47:55.689091 master-0 kubenswrapper[7547]: E0308 03:47:55.688345 7547 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Mar 08 03:47:55.689091 master-0 kubenswrapper[7547]: E0308 03:47:55.688392 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/22ebd67b-43b2-4f9d-955b-eb848d9d55d4-client-ca podName:22ebd67b-43b2-4f9d-955b-eb848d9d55d4 nodeName:}" failed. No retries permitted until 2026-03-08 03:48:27.688378342 +0000 UTC m=+70.634062855 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/22ebd67b-43b2-4f9d-955b-eb848d9d55d4-client-ca") pod "route-controller-manager-6d8669fddc-zn4lq" (UID: "22ebd67b-43b2-4f9d-955b-eb848d9d55d4") : configmap "client-ca" not found Mar 08 03:47:55.704131 master-0 kubenswrapper[7547]: I0308 03:47:55.704098 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22ebd67b-43b2-4f9d-955b-eb848d9d55d4-serving-cert\") pod \"route-controller-manager-6d8669fddc-zn4lq\" (UID: \"22ebd67b-43b2-4f9d-955b-eb848d9d55d4\") " pod="openshift-route-controller-manager/route-controller-manager-6d8669fddc-zn4lq" Mar 08 03:48:01.331799 master-0 kubenswrapper[7547]: I0308 03:48:01.287765 7547 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"] Mar 08 03:48:01.557851 master-0 kubenswrapper[7547]: I0308 03:48:01.557786 7547 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-79784cf87c-jfjm4"] Mar 08 03:48:01.558279 master-0 kubenswrapper[7547]: E0308 03:48:01.558242 7547 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-controller-manager/controller-manager-79784cf87c-jfjm4" podUID="0d4a450c-dac8-445d-abfc-75e4920b676b" Mar 08 03:48:01.562027 master-0 kubenswrapper[7547]: I0308 03:48:01.561975 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0d4a450c-dac8-445d-abfc-75e4920b676b-client-ca\") pod \"controller-manager-79784cf87c-jfjm4\" (UID: \"0d4a450c-dac8-445d-abfc-75e4920b676b\") " pod="openshift-controller-manager/controller-manager-79784cf87c-jfjm4" Mar 08 03:48:01.565378 master-0 kubenswrapper[7547]: I0308 03:48:01.565331 7547 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0d4a450c-dac8-445d-abfc-75e4920b676b-client-ca\") pod \"controller-manager-79784cf87c-jfjm4\" (UID: \"0d4a450c-dac8-445d-abfc-75e4920b676b\") " pod="openshift-controller-manager/controller-manager-79784cf87c-jfjm4" Mar 08 03:48:01.571917 master-0 kubenswrapper[7547]: I0308 03:48:01.571841 7547 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d8669fddc-zn4lq"] Mar 08 03:48:01.572260 master-0 kubenswrapper[7547]: E0308 03:48:01.572225 7547 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-route-controller-manager/route-controller-manager-6d8669fddc-zn4lq" podUID="22ebd67b-43b2-4f9d-955b-eb848d9d55d4" Mar 08 03:48:01.775801 master-0 kubenswrapper[7547]: I0308 03:48:01.775760 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-79784cf87c-jfjm4" Mar 08 03:48:01.776026 master-0 kubenswrapper[7547]: I0308 03:48:01.775809 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d8669fddc-zn4lq" Mar 08 03:48:01.786467 master-0 kubenswrapper[7547]: I0308 03:48:01.786437 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-79784cf87c-jfjm4" Mar 08 03:48:01.791992 master-0 kubenswrapper[7547]: I0308 03:48:01.791950 7547 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d8669fddc-zn4lq" Mar 08 03:48:01.968553 master-0 kubenswrapper[7547]: I0308 03:48:01.968230 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5gnhj\" (UniqueName: \"kubernetes.io/projected/0d4a450c-dac8-445d-abfc-75e4920b676b-kube-api-access-5gnhj\") pod \"0d4a450c-dac8-445d-abfc-75e4920b676b\" (UID: \"0d4a450c-dac8-445d-abfc-75e4920b676b\") " Mar 08 03:48:01.968553 master-0 kubenswrapper[7547]: I0308 03:48:01.968268 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d4a450c-dac8-445d-abfc-75e4920b676b-config\") pod \"0d4a450c-dac8-445d-abfc-75e4920b676b\" (UID: \"0d4a450c-dac8-445d-abfc-75e4920b676b\") " Mar 08 03:48:01.968553 master-0 kubenswrapper[7547]: I0308 03:48:01.968299 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22ebd67b-43b2-4f9d-955b-eb848d9d55d4-serving-cert\") pod \"22ebd67b-43b2-4f9d-955b-eb848d9d55d4\" (UID: \"22ebd67b-43b2-4f9d-955b-eb848d9d55d4\") " Mar 08 03:48:01.968553 master-0 kubenswrapper[7547]: I0308 03:48:01.968322 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9sccx\" (UniqueName: \"kubernetes.io/projected/22ebd67b-43b2-4f9d-955b-eb848d9d55d4-kube-api-access-9sccx\") pod \"22ebd67b-43b2-4f9d-955b-eb848d9d55d4\" (UID: \"22ebd67b-43b2-4f9d-955b-eb848d9d55d4\") " Mar 08 03:48:01.968553 master-0 kubenswrapper[7547]: I0308 03:48:01.968358 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22ebd67b-43b2-4f9d-955b-eb848d9d55d4-config\") pod \"22ebd67b-43b2-4f9d-955b-eb848d9d55d4\" (UID: \"22ebd67b-43b2-4f9d-955b-eb848d9d55d4\") " Mar 08 03:48:01.968553 master-0 kubenswrapper[7547]: I0308 
03:48:01.968379 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0d4a450c-dac8-445d-abfc-75e4920b676b-client-ca\") pod \"0d4a450c-dac8-445d-abfc-75e4920b676b\" (UID: \"0d4a450c-dac8-445d-abfc-75e4920b676b\") " Mar 08 03:48:01.969035 master-0 kubenswrapper[7547]: I0308 03:48:01.968836 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0d4a450c-dac8-445d-abfc-75e4920b676b-proxy-ca-bundles\") pod \"0d4a450c-dac8-445d-abfc-75e4920b676b\" (UID: \"0d4a450c-dac8-445d-abfc-75e4920b676b\") " Mar 08 03:48:01.969035 master-0 kubenswrapper[7547]: I0308 03:48:01.968853 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d4a450c-dac8-445d-abfc-75e4920b676b-client-ca" (OuterVolumeSpecName: "client-ca") pod "0d4a450c-dac8-445d-abfc-75e4920b676b" (UID: "0d4a450c-dac8-445d-abfc-75e4920b676b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:48:01.969035 master-0 kubenswrapper[7547]: I0308 03:48:01.968871 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d4a450c-dac8-445d-abfc-75e4920b676b-serving-cert\") pod \"0d4a450c-dac8-445d-abfc-75e4920b676b\" (UID: \"0d4a450c-dac8-445d-abfc-75e4920b676b\") " Mar 08 03:48:01.969035 master-0 kubenswrapper[7547]: I0308 03:48:01.968983 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d4a450c-dac8-445d-abfc-75e4920b676b-config" (OuterVolumeSpecName: "config") pod "0d4a450c-dac8-445d-abfc-75e4920b676b" (UID: "0d4a450c-dac8-445d-abfc-75e4920b676b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 03:48:01.969173 master-0 kubenswrapper[7547]: I0308 03:48:01.969124 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22ebd67b-43b2-4f9d-955b-eb848d9d55d4-config" (OuterVolumeSpecName: "config") pod "22ebd67b-43b2-4f9d-955b-eb848d9d55d4" (UID: "22ebd67b-43b2-4f9d-955b-eb848d9d55d4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 03:48:01.969290 master-0 kubenswrapper[7547]: I0308 03:48:01.969269 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d4a450c-dac8-445d-abfc-75e4920b676b-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "0d4a450c-dac8-445d-abfc-75e4920b676b" (UID: "0d4a450c-dac8-445d-abfc-75e4920b676b"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 03:48:01.970011 master-0 kubenswrapper[7547]: I0308 03:48:01.969643 7547 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22ebd67b-43b2-4f9d-955b-eb848d9d55d4-config\") on node \"master-0\" DevicePath \"\""
Mar 08 03:48:01.970011 master-0 kubenswrapper[7547]: I0308 03:48:01.969660 7547 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0d4a450c-dac8-445d-abfc-75e4920b676b-client-ca\") on node \"master-0\" DevicePath \"\""
Mar 08 03:48:01.970011 master-0 kubenswrapper[7547]: I0308 03:48:01.969671 7547 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0d4a450c-dac8-445d-abfc-75e4920b676b-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\""
Mar 08 03:48:01.970011 master-0 kubenswrapper[7547]: I0308 03:48:01.969703 7547 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d4a450c-dac8-445d-abfc-75e4920b676b-config\") on node \"master-0\" DevicePath \"\""
Mar 08 03:48:01.973289 master-0 kubenswrapper[7547]: I0308 03:48:01.973246 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22ebd67b-43b2-4f9d-955b-eb848d9d55d4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "22ebd67b-43b2-4f9d-955b-eb848d9d55d4" (UID: "22ebd67b-43b2-4f9d-955b-eb848d9d55d4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 03:48:01.973704 master-0 kubenswrapper[7547]: I0308 03:48:01.973664 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22ebd67b-43b2-4f9d-955b-eb848d9d55d4-kube-api-access-9sccx" (OuterVolumeSpecName: "kube-api-access-9sccx") pod "22ebd67b-43b2-4f9d-955b-eb848d9d55d4" (UID: "22ebd67b-43b2-4f9d-955b-eb848d9d55d4"). InnerVolumeSpecName "kube-api-access-9sccx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 03:48:01.973955 master-0 kubenswrapper[7547]: I0308 03:48:01.973910 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d4a450c-dac8-445d-abfc-75e4920b676b-kube-api-access-5gnhj" (OuterVolumeSpecName: "kube-api-access-5gnhj") pod "0d4a450c-dac8-445d-abfc-75e4920b676b" (UID: "0d4a450c-dac8-445d-abfc-75e4920b676b"). InnerVolumeSpecName "kube-api-access-5gnhj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 03:48:01.975021 master-0 kubenswrapper[7547]: I0308 03:48:01.974537 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d4a450c-dac8-445d-abfc-75e4920b676b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0d4a450c-dac8-445d-abfc-75e4920b676b" (UID: "0d4a450c-dac8-445d-abfc-75e4920b676b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 03:48:02.071492 master-0 kubenswrapper[7547]: I0308 03:48:02.071402 7547 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d4a450c-dac8-445d-abfc-75e4920b676b-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 08 03:48:02.071492 master-0 kubenswrapper[7547]: I0308 03:48:02.071442 7547 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5gnhj\" (UniqueName: \"kubernetes.io/projected/0d4a450c-dac8-445d-abfc-75e4920b676b-kube-api-access-5gnhj\") on node \"master-0\" DevicePath \"\""
Mar 08 03:48:02.071492 master-0 kubenswrapper[7547]: I0308 03:48:02.071457 7547 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22ebd67b-43b2-4f9d-955b-eb848d9d55d4-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 08 03:48:02.071492 master-0 kubenswrapper[7547]: I0308 03:48:02.071469 7547 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9sccx\" (UniqueName: \"kubernetes.io/projected/22ebd67b-43b2-4f9d-955b-eb848d9d55d4-kube-api-access-9sccx\") on node \"master-0\" DevicePath \"\""
Mar 08 03:48:02.564086 master-0 kubenswrapper[7547]: I0308 03:48:02.563554 7547 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"]
Mar 08 03:48:02.565901 master-0 kubenswrapper[7547]: I0308 03:48:02.565617 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-1-master-0"
Mar 08 03:48:02.567951 master-0 kubenswrapper[7547]: I0308 03:48:02.567683 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt"
Mar 08 03:48:02.571683 master-0 kubenswrapper[7547]: I0308 03:48:02.571646 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"]
Mar 08 03:48:02.720066 master-0 kubenswrapper[7547]: I0308 03:48:02.718202 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0f62034a-dae9-46af-8c14-006b728b631f-kube-api-access\") pod \"installer-1-master-0\" (UID: \"0f62034a-dae9-46af-8c14-006b728b631f\") " pod="openshift-kube-controller-manager/installer-1-master-0"
Mar 08 03:48:02.720066 master-0 kubenswrapper[7547]: I0308 03:48:02.718378 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0f62034a-dae9-46af-8c14-006b728b631f-var-lock\") pod \"installer-1-master-0\" (UID: \"0f62034a-dae9-46af-8c14-006b728b631f\") " pod="openshift-kube-controller-manager/installer-1-master-0"
Mar 08 03:48:02.720066 master-0 kubenswrapper[7547]: I0308 03:48:02.718518 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0f62034a-dae9-46af-8c14-006b728b631f-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"0f62034a-dae9-46af-8c14-006b728b631f\") " pod="openshift-kube-controller-manager/installer-1-master-0"
Mar 08 03:48:02.781543 master-0 kubenswrapper[7547]: I0308 03:48:02.781498 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-79784cf87c-jfjm4"
Mar 08 03:48:02.781640 master-0 kubenswrapper[7547]: I0308 03:48:02.781543 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d8669fddc-zn4lq"
Mar 08 03:48:02.824926 master-0 kubenswrapper[7547]: I0308 03:48:02.822014 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0f62034a-dae9-46af-8c14-006b728b631f-kube-api-access\") pod \"installer-1-master-0\" (UID: \"0f62034a-dae9-46af-8c14-006b728b631f\") " pod="openshift-kube-controller-manager/installer-1-master-0"
Mar 08 03:48:02.824926 master-0 kubenswrapper[7547]: I0308 03:48:02.823138 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0f62034a-dae9-46af-8c14-006b728b631f-var-lock\") pod \"installer-1-master-0\" (UID: \"0f62034a-dae9-46af-8c14-006b728b631f\") " pod="openshift-kube-controller-manager/installer-1-master-0"
Mar 08 03:48:02.824926 master-0 kubenswrapper[7547]: I0308 03:48:02.823223 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0f62034a-dae9-46af-8c14-006b728b631f-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"0f62034a-dae9-46af-8c14-006b728b631f\") " pod="openshift-kube-controller-manager/installer-1-master-0"
Mar 08 03:48:02.824926 master-0 kubenswrapper[7547]: I0308 03:48:02.823695 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0f62034a-dae9-46af-8c14-006b728b631f-var-lock\") pod \"installer-1-master-0\" (UID: \"0f62034a-dae9-46af-8c14-006b728b631f\") " pod="openshift-kube-controller-manager/installer-1-master-0"
Mar 08 03:48:02.824926 master-0 kubenswrapper[7547]: I0308 03:48:02.824002 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0f62034a-dae9-46af-8c14-006b728b631f-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"0f62034a-dae9-46af-8c14-006b728b631f\") " pod="openshift-kube-controller-manager/installer-1-master-0"
Mar 08 03:48:02.828095 master-0 kubenswrapper[7547]: I0308 03:48:02.827000 7547 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-597dfbc64c-f55p7"]
Mar 08 03:48:02.830220 master-0 kubenswrapper[7547]: I0308 03:48:02.828254 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-597dfbc64c-f55p7"
Mar 08 03:48:02.830671 master-0 kubenswrapper[7547]: I0308 03:48:02.830312 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 08 03:48:02.830720 master-0 kubenswrapper[7547]: I0308 03:48:02.830690 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 08 03:48:02.840093 master-0 kubenswrapper[7547]: I0308 03:48:02.831570 7547 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-79784cf87c-jfjm4"]
Mar 08 03:48:02.840093 master-0 kubenswrapper[7547]: I0308 03:48:02.833254 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 08 03:48:02.840093 master-0 kubenswrapper[7547]: I0308 03:48:02.834249 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 08 03:48:02.840093 master-0 kubenswrapper[7547]: I0308 03:48:02.834966 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 08 03:48:02.842028 master-0 kubenswrapper[7547]: I0308 03:48:02.841977 7547 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-79784cf87c-jfjm4"]
Mar 08 03:48:02.843560 master-0 kubenswrapper[7547]: I0308 03:48:02.843534 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-597dfbc64c-f55p7"]
Mar 08 03:48:02.858215 master-0 kubenswrapper[7547]: I0308 03:48:02.858176 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 08 03:48:02.859164 master-0 kubenswrapper[7547]: I0308 03:48:02.859134 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0f62034a-dae9-46af-8c14-006b728b631f-kube-api-access\") pod \"installer-1-master-0\" (UID: \"0f62034a-dae9-46af-8c14-006b728b631f\") " pod="openshift-kube-controller-manager/installer-1-master-0"
Mar 08 03:48:02.863480 master-0 kubenswrapper[7547]: I0308 03:48:02.863403 7547 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d8669fddc-zn4lq"]
Mar 08 03:48:02.864710 master-0 kubenswrapper[7547]: I0308 03:48:02.864684 7547 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d8669fddc-zn4lq"]
Mar 08 03:48:02.937214 master-0 kubenswrapper[7547]: I0308 03:48:02.937167 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-1-master-0"
Mar 08 03:48:03.025275 master-0 kubenswrapper[7547]: I0308 03:48:03.025230 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/792f503e-c34b-4c30-9c9e-70bdea2f2629-proxy-ca-bundles\") pod \"controller-manager-597dfbc64c-f55p7\" (UID: \"792f503e-c34b-4c30-9c9e-70bdea2f2629\") " pod="openshift-controller-manager/controller-manager-597dfbc64c-f55p7"
Mar 08 03:48:03.025476 master-0 kubenswrapper[7547]: I0308 03:48:03.025318 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bk72v\" (UniqueName: \"kubernetes.io/projected/792f503e-c34b-4c30-9c9e-70bdea2f2629-kube-api-access-bk72v\") pod \"controller-manager-597dfbc64c-f55p7\" (UID: \"792f503e-c34b-4c30-9c9e-70bdea2f2629\") " pod="openshift-controller-manager/controller-manager-597dfbc64c-f55p7"
Mar 08 03:48:03.025476 master-0 kubenswrapper[7547]: I0308 03:48:03.025378 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/792f503e-c34b-4c30-9c9e-70bdea2f2629-client-ca\") pod \"controller-manager-597dfbc64c-f55p7\" (UID: \"792f503e-c34b-4c30-9c9e-70bdea2f2629\") " pod="openshift-controller-manager/controller-manager-597dfbc64c-f55p7"
Mar 08 03:48:03.025476 master-0 kubenswrapper[7547]: I0308 03:48:03.025411 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/792f503e-c34b-4c30-9c9e-70bdea2f2629-serving-cert\") pod \"controller-manager-597dfbc64c-f55p7\" (UID: \"792f503e-c34b-4c30-9c9e-70bdea2f2629\") " pod="openshift-controller-manager/controller-manager-597dfbc64c-f55p7"
Mar 08 03:48:03.025476 master-0 kubenswrapper[7547]: I0308 03:48:03.025441 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/792f503e-c34b-4c30-9c9e-70bdea2f2629-config\") pod \"controller-manager-597dfbc64c-f55p7\" (UID: \"792f503e-c34b-4c30-9c9e-70bdea2f2629\") " pod="openshift-controller-manager/controller-manager-597dfbc64c-f55p7"
Mar 08 03:48:03.025588 master-0 kubenswrapper[7547]: I0308 03:48:03.025480 7547 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/22ebd67b-43b2-4f9d-955b-eb848d9d55d4-client-ca\") on node \"master-0\" DevicePath \"\""
Mar 08 03:48:03.126972 master-0 kubenswrapper[7547]: I0308 03:48:03.126881 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/792f503e-c34b-4c30-9c9e-70bdea2f2629-config\") pod \"controller-manager-597dfbc64c-f55p7\" (UID: \"792f503e-c34b-4c30-9c9e-70bdea2f2629\") " pod="openshift-controller-manager/controller-manager-597dfbc64c-f55p7"
Mar 08 03:48:03.126972 master-0 kubenswrapper[7547]: I0308 03:48:03.126957 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/792f503e-c34b-4c30-9c9e-70bdea2f2629-proxy-ca-bundles\") pod \"controller-manager-597dfbc64c-f55p7\" (UID: \"792f503e-c34b-4c30-9c9e-70bdea2f2629\") " pod="openshift-controller-manager/controller-manager-597dfbc64c-f55p7"
Mar 08 03:48:03.127072 master-0 kubenswrapper[7547]: I0308 03:48:03.127048 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bk72v\" (UniqueName: \"kubernetes.io/projected/792f503e-c34b-4c30-9c9e-70bdea2f2629-kube-api-access-bk72v\") pod \"controller-manager-597dfbc64c-f55p7\" (UID: \"792f503e-c34b-4c30-9c9e-70bdea2f2629\") " pod="openshift-controller-manager/controller-manager-597dfbc64c-f55p7"
Mar 08 03:48:03.127131 master-0 kubenswrapper[7547]: I0308 03:48:03.127107 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/792f503e-c34b-4c30-9c9e-70bdea2f2629-client-ca\") pod \"controller-manager-597dfbc64c-f55p7\" (UID: \"792f503e-c34b-4c30-9c9e-70bdea2f2629\") " pod="openshift-controller-manager/controller-manager-597dfbc64c-f55p7"
Mar 08 03:48:03.127188 master-0 kubenswrapper[7547]: I0308 03:48:03.127167 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/792f503e-c34b-4c30-9c9e-70bdea2f2629-serving-cert\") pod \"controller-manager-597dfbc64c-f55p7\" (UID: \"792f503e-c34b-4c30-9c9e-70bdea2f2629\") " pod="openshift-controller-manager/controller-manager-597dfbc64c-f55p7"
Mar 08 03:48:03.128422 master-0 kubenswrapper[7547]: I0308 03:48:03.128396 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/792f503e-c34b-4c30-9c9e-70bdea2f2629-client-ca\") pod \"controller-manager-597dfbc64c-f55p7\" (UID: \"792f503e-c34b-4c30-9c9e-70bdea2f2629\") " pod="openshift-controller-manager/controller-manager-597dfbc64c-f55p7"
Mar 08 03:48:03.128934 master-0 kubenswrapper[7547]: I0308 03:48:03.128909 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/792f503e-c34b-4c30-9c9e-70bdea2f2629-config\") pod \"controller-manager-597dfbc64c-f55p7\" (UID: \"792f503e-c34b-4c30-9c9e-70bdea2f2629\") " pod="openshift-controller-manager/controller-manager-597dfbc64c-f55p7"
Mar 08 03:48:03.129610 master-0 kubenswrapper[7547]: I0308 03:48:03.129568 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/792f503e-c34b-4c30-9c9e-70bdea2f2629-proxy-ca-bundles\") pod \"controller-manager-597dfbc64c-f55p7\" (UID: \"792f503e-c34b-4c30-9c9e-70bdea2f2629\") " pod="openshift-controller-manager/controller-manager-597dfbc64c-f55p7"
Mar 08 03:48:03.130874 master-0 kubenswrapper[7547]: I0308 03:48:03.130816 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/792f503e-c34b-4c30-9c9e-70bdea2f2629-serving-cert\") pod \"controller-manager-597dfbc64c-f55p7\" (UID: \"792f503e-c34b-4c30-9c9e-70bdea2f2629\") " pod="openshift-controller-manager/controller-manager-597dfbc64c-f55p7"
Mar 08 03:48:03.145456 master-0 kubenswrapper[7547]: I0308 03:48:03.145422 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bk72v\" (UniqueName: \"kubernetes.io/projected/792f503e-c34b-4c30-9c9e-70bdea2f2629-kube-api-access-bk72v\") pod \"controller-manager-597dfbc64c-f55p7\" (UID: \"792f503e-c34b-4c30-9c9e-70bdea2f2629\") " pod="openshift-controller-manager/controller-manager-597dfbc64c-f55p7"
Mar 08 03:48:03.189318 master-0 kubenswrapper[7547]: I0308 03:48:03.189278 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-597dfbc64c-f55p7"
Mar 08 03:48:03.240508 master-0 kubenswrapper[7547]: I0308 03:48:03.240463 7547 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d4a450c-dac8-445d-abfc-75e4920b676b" path="/var/lib/kubelet/pods/0d4a450c-dac8-445d-abfc-75e4920b676b/volumes"
Mar 08 03:48:03.240888 master-0 kubenswrapper[7547]: I0308 03:48:03.240862 7547 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22ebd67b-43b2-4f9d-955b-eb848d9d55d4" path="/var/lib/kubelet/pods/22ebd67b-43b2-4f9d-955b-eb848d9d55d4/volumes"
Mar 08 03:48:03.711746 master-0 kubenswrapper[7547]: I0308 03:48:03.711647 7547 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"]
Mar 08 03:48:03.714006 master-0 kubenswrapper[7547]: I0308 03:48:03.713704 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-3-master-0"
Mar 08 03:48:03.716649 master-0 kubenswrapper[7547]: I0308 03:48:03.716608 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"]
Mar 08 03:48:03.836850 master-0 kubenswrapper[7547]: I0308 03:48:03.836769 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7bf40ef9-a79a-4f5d-933c-5276edcccb4b-kube-api-access\") pod \"installer-3-master-0\" (UID: \"7bf40ef9-a79a-4f5d-933c-5276edcccb4b\") " pod="openshift-kube-scheduler/installer-3-master-0"
Mar 08 03:48:03.836850 master-0 kubenswrapper[7547]: I0308 03:48:03.836850 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7bf40ef9-a79a-4f5d-933c-5276edcccb4b-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"7bf40ef9-a79a-4f5d-933c-5276edcccb4b\") " pod="openshift-kube-scheduler/installer-3-master-0"
Mar 08 03:48:03.837227 master-0 kubenswrapper[7547]: I0308 03:48:03.836997 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7bf40ef9-a79a-4f5d-933c-5276edcccb4b-var-lock\") pod \"installer-3-master-0\" (UID: \"7bf40ef9-a79a-4f5d-933c-5276edcccb4b\") " pod="openshift-kube-scheduler/installer-3-master-0"
Mar 08 03:48:03.938012 master-0 kubenswrapper[7547]: I0308 03:48:03.937940 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7bf40ef9-a79a-4f5d-933c-5276edcccb4b-kube-api-access\") pod \"installer-3-master-0\" (UID: \"7bf40ef9-a79a-4f5d-933c-5276edcccb4b\") " pod="openshift-kube-scheduler/installer-3-master-0"
Mar 08 03:48:03.938012 master-0 kubenswrapper[7547]: I0308 03:48:03.938011 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7bf40ef9-a79a-4f5d-933c-5276edcccb4b-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"7bf40ef9-a79a-4f5d-933c-5276edcccb4b\") " pod="openshift-kube-scheduler/installer-3-master-0"
Mar 08 03:48:03.939272 master-0 kubenswrapper[7547]: I0308 03:48:03.938571 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7bf40ef9-a79a-4f5d-933c-5276edcccb4b-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"7bf40ef9-a79a-4f5d-933c-5276edcccb4b\") " pod="openshift-kube-scheduler/installer-3-master-0"
Mar 08 03:48:03.939272 master-0 kubenswrapper[7547]: I0308 03:48:03.938495 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7bf40ef9-a79a-4f5d-933c-5276edcccb4b-var-lock\") pod \"installer-3-master-0\" (UID: \"7bf40ef9-a79a-4f5d-933c-5276edcccb4b\") " pod="openshift-kube-scheduler/installer-3-master-0"
Mar 08 03:48:03.939272 master-0 kubenswrapper[7547]: I0308 03:48:03.938697 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7bf40ef9-a79a-4f5d-933c-5276edcccb4b-var-lock\") pod \"installer-3-master-0\" (UID: \"7bf40ef9-a79a-4f5d-933c-5276edcccb4b\") " pod="openshift-kube-scheduler/installer-3-master-0"
Mar 08 03:48:03.954027 master-0 kubenswrapper[7547]: I0308 03:48:03.953976 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7bf40ef9-a79a-4f5d-933c-5276edcccb4b-kube-api-access\") pod \"installer-3-master-0\" (UID: \"7bf40ef9-a79a-4f5d-933c-5276edcccb4b\") " pod="openshift-kube-scheduler/installer-3-master-0"
Mar 08 03:48:04.069773 master-0 kubenswrapper[7547]: I0308 03:48:04.069601 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-3-master-0"
Mar 08 03:48:05.188036 master-0 kubenswrapper[7547]: I0308 03:48:05.187723 7547 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6994fc9dc8-9w74l"]
Mar 08 03:48:05.188680 master-0 kubenswrapper[7547]: I0308 03:48:05.188517 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6994fc9dc8-9w74l"
Mar 08 03:48:05.190480 master-0 kubenswrapper[7547]: I0308 03:48:05.190429 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 08 03:48:05.190975 master-0 kubenswrapper[7547]: I0308 03:48:05.190946 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 08 03:48:05.191382 master-0 kubenswrapper[7547]: I0308 03:48:05.191341 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 08 03:48:05.193796 master-0 kubenswrapper[7547]: I0308 03:48:05.193709 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 08 03:48:05.193924 master-0 kubenswrapper[7547]: I0308 03:48:05.193860 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 08 03:48:05.203302 master-0 kubenswrapper[7547]: I0308 03:48:05.203254 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6994fc9dc8-9w74l"]
Mar 08 03:48:05.358537 master-0 kubenswrapper[7547]: I0308 03:48:05.358491 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6cf484f-7125-47ee-9e67-a064d044f43d-config\") pod \"route-controller-manager-6994fc9dc8-9w74l\" (UID: \"d6cf484f-7125-47ee-9e67-a064d044f43d\") " pod="openshift-route-controller-manager/route-controller-manager-6994fc9dc8-9w74l"
Mar 08 03:48:05.358745 master-0 kubenswrapper[7547]: I0308 03:48:05.358545 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6cf484f-7125-47ee-9e67-a064d044f43d-serving-cert\") pod \"route-controller-manager-6994fc9dc8-9w74l\" (UID: \"d6cf484f-7125-47ee-9e67-a064d044f43d\") " pod="openshift-route-controller-manager/route-controller-manager-6994fc9dc8-9w74l"
Mar 08 03:48:05.358745 master-0 kubenswrapper[7547]: I0308 03:48:05.358640 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vkpr\" (UniqueName: \"kubernetes.io/projected/d6cf484f-7125-47ee-9e67-a064d044f43d-kube-api-access-2vkpr\") pod \"route-controller-manager-6994fc9dc8-9w74l\" (UID: \"d6cf484f-7125-47ee-9e67-a064d044f43d\") " pod="openshift-route-controller-manager/route-controller-manager-6994fc9dc8-9w74l"
Mar 08 03:48:05.358745 master-0 kubenswrapper[7547]: I0308 03:48:05.358677 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d6cf484f-7125-47ee-9e67-a064d044f43d-client-ca\") pod \"route-controller-manager-6994fc9dc8-9w74l\" (UID: \"d6cf484f-7125-47ee-9e67-a064d044f43d\") " pod="openshift-route-controller-manager/route-controller-manager-6994fc9dc8-9w74l"
Mar 08 03:48:05.460350 master-0 kubenswrapper[7547]: I0308 03:48:05.460182 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6cf484f-7125-47ee-9e67-a064d044f43d-config\") pod \"route-controller-manager-6994fc9dc8-9w74l\" (UID: \"d6cf484f-7125-47ee-9e67-a064d044f43d\") " pod="openshift-route-controller-manager/route-controller-manager-6994fc9dc8-9w74l"
Mar 08 03:48:05.460350 master-0 kubenswrapper[7547]: I0308 03:48:05.460264 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6cf484f-7125-47ee-9e67-a064d044f43d-serving-cert\") pod \"route-controller-manager-6994fc9dc8-9w74l\" (UID: \"d6cf484f-7125-47ee-9e67-a064d044f43d\") " pod="openshift-route-controller-manager/route-controller-manager-6994fc9dc8-9w74l"
Mar 08 03:48:05.460350 master-0 kubenswrapper[7547]: I0308 03:48:05.460344 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vkpr\" (UniqueName: \"kubernetes.io/projected/d6cf484f-7125-47ee-9e67-a064d044f43d-kube-api-access-2vkpr\") pod \"route-controller-manager-6994fc9dc8-9w74l\" (UID: \"d6cf484f-7125-47ee-9e67-a064d044f43d\") " pod="openshift-route-controller-manager/route-controller-manager-6994fc9dc8-9w74l"
Mar 08 03:48:05.460553 master-0 kubenswrapper[7547]: I0308 03:48:05.460394 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d6cf484f-7125-47ee-9e67-a064d044f43d-client-ca\") pod \"route-controller-manager-6994fc9dc8-9w74l\" (UID: \"d6cf484f-7125-47ee-9e67-a064d044f43d\") " pod="openshift-route-controller-manager/route-controller-manager-6994fc9dc8-9w74l"
Mar 08 03:48:05.462138 master-0 kubenswrapper[7547]: I0308 03:48:05.461286 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6cf484f-7125-47ee-9e67-a064d044f43d-config\") pod \"route-controller-manager-6994fc9dc8-9w74l\" (UID: \"d6cf484f-7125-47ee-9e67-a064d044f43d\") " pod="openshift-route-controller-manager/route-controller-manager-6994fc9dc8-9w74l"
Mar 08 03:48:05.462138 master-0 kubenswrapper[7547]: I0308 03:48:05.461802 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d6cf484f-7125-47ee-9e67-a064d044f43d-client-ca\") pod \"route-controller-manager-6994fc9dc8-9w74l\" (UID: \"d6cf484f-7125-47ee-9e67-a064d044f43d\") " pod="openshift-route-controller-manager/route-controller-manager-6994fc9dc8-9w74l"
Mar 08 03:48:05.474862 master-0 kubenswrapper[7547]: I0308 03:48:05.466124 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6cf484f-7125-47ee-9e67-a064d044f43d-serving-cert\") pod \"route-controller-manager-6994fc9dc8-9w74l\" (UID: \"d6cf484f-7125-47ee-9e67-a064d044f43d\") " pod="openshift-route-controller-manager/route-controller-manager-6994fc9dc8-9w74l"
Mar 08 03:48:05.480579 master-0 kubenswrapper[7547]: I0308 03:48:05.478765 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vkpr\" (UniqueName: \"kubernetes.io/projected/d6cf484f-7125-47ee-9e67-a064d044f43d-kube-api-access-2vkpr\") pod \"route-controller-manager-6994fc9dc8-9w74l\" (UID: \"d6cf484f-7125-47ee-9e67-a064d044f43d\") " pod="openshift-route-controller-manager/route-controller-manager-6994fc9dc8-9w74l"
Mar 08 03:48:05.512195 master-0 kubenswrapper[7547]: I0308 03:48:05.512124 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6994fc9dc8-9w74l"
Mar 08 03:48:05.930269 master-0 kubenswrapper[7547]: I0308 03:48:05.928361 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-1-master-0"]
Mar 08 03:48:06.094865 master-0 kubenswrapper[7547]: I0308 03:48:06.092599 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"]
Mar 08 03:48:06.119219 master-0 kubenswrapper[7547]: I0308 03:48:06.118761 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6994fc9dc8-9w74l"]
Mar 08 03:48:06.125437 master-0 kubenswrapper[7547]: W0308 03:48:06.125396 7547 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod0f62034a_dae9_46af_8c14_006b728b631f.slice/crio-bef040f7b636d10cc98601c076bcc8520e38eceadefefdcd82bcf0929743a68c WatchSource:0}: Error finding container bef040f7b636d10cc98601c076bcc8520e38eceadefefdcd82bcf0929743a68c: Status 404 returned error can't find the container with id bef040f7b636d10cc98601c076bcc8520e38eceadefefdcd82bcf0929743a68c
Mar 08 03:48:06.156966 master-0 kubenswrapper[7547]: I0308 03:48:06.153977 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-597dfbc64c-f55p7"]
Mar 08 03:48:06.315918 master-0 kubenswrapper[7547]: I0308 03:48:06.315106 7547 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"]
Mar 08 03:48:06.319870 master-0 kubenswrapper[7547]: I0308 03:48:06.318453 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"]
Mar 08 03:48:06.658793 master-0 kubenswrapper[7547]: I0308 03:48:06.654295 7547 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/tuned-bpwdb"]
Mar 08 03:48:06.658793 master-0 kubenswrapper[7547]: I0308 03:48:06.655653 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-bpwdb"
Mar 08 03:48:06.784010 master-0 kubenswrapper[7547]: I0308 03:48:06.783545 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e187516f-8f33-4c17-81d6-60c10b580bb0-run\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb"
Mar 08 03:48:06.784010 master-0 kubenswrapper[7547]: I0308 03:48:06.783597 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e187516f-8f33-4c17-81d6-60c10b580bb0-etc-sysconfig\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb"
Mar 08 03:48:06.784010 master-0 kubenswrapper[7547]: I0308 03:48:06.783651 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e187516f-8f33-4c17-81d6-60c10b580bb0-etc-sysctl-d\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb"
Mar 08 03:48:06.784010 master-0 kubenswrapper[7547]: I0308 03:48:06.783673 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e187516f-8f33-4c17-81d6-60c10b580bb0-etc-systemd\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb"
Mar 08 03:48:06.784010 master-0 kubenswrapper[7547]: I0308 03:48:06.783698 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e187516f-8f33-4c17-81d6-60c10b580bb0-lib-modules\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb"
Mar 08 03:48:06.784010 master-0 kubenswrapper[7547]: I0308 03:48:06.783728 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e187516f-8f33-4c17-81d6-60c10b580bb0-host\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb"
Mar 08 03:48:06.784010 master-0 kubenswrapper[7547]: I0308 03:48:06.783746 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e187516f-8f33-4c17-81d6-60c10b580bb0-sys\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb"
Mar 08 03:48:06.784010 master-0 kubenswrapper[7547]: I0308 03:48:06.783760 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e187516f-8f33-4c17-81d6-60c10b580bb0-var-lib-kubelet\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb"
Mar 08 03:48:06.784010 master-0 kubenswrapper[7547]: I0308 03:48:06.783780 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vg9kg\" (UniqueName: \"kubernetes.io/projected/e187516f-8f33-4c17-81d6-60c10b580bb0-kube-api-access-vg9kg\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb"
Mar 08 03:48:06.784010 master-0 kubenswrapper[7547]: I0308 03:48:06.783798 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e187516f-8f33-4c17-81d6-60c10b580bb0-etc-sysctl-conf\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb"
Mar 08 03:48:06.784010 master-0 kubenswrapper[7547]: I0308 03:48:06.783814 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e187516f-8f33-4c17-81d6-60c10b580bb0-etc-tuned\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb"
Mar 08 03:48:06.784010 master-0 kubenswrapper[7547]: I0308 03:48:06.783847 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e187516f-8f33-4c17-81d6-60c10b580bb0-etc-modprobe-d\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb"
Mar 08 03:48:06.784010 master-0 kubenswrapper[7547]: I0308 03:48:06.783869 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e187516f-8f33-4c17-81d6-60c10b580bb0-etc-kubernetes\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb"
Mar 08 03:48:06.784010 master-0 kubenswrapper[7547]: I0308 03:48:06.783895 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e187516f-8f33-4c17-81d6-60c10b580bb0-tmp\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb"
Mar 08 03:48:06.849863 master-0 kubenswrapper[7547]: I0308
03:48:06.846106 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-c46zz" event={"ID":"6cde5024-edf7-4fa4-8964-cabe7899578b","Type":"ContainerStarted","Data":"4289a059254f4a4a862cd6214157717de8463d23473ef07f55fa6a246d122a6b"} Mar 08 03:48:06.849863 master-0 kubenswrapper[7547]: I0308 03:48:06.846731 7547 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-c46zz" Mar 08 03:48:06.887093 master-0 kubenswrapper[7547]: I0308 03:48:06.885345 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e187516f-8f33-4c17-81d6-60c10b580bb0-sys\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb" Mar 08 03:48:06.887093 master-0 kubenswrapper[7547]: I0308 03:48:06.885385 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e187516f-8f33-4c17-81d6-60c10b580bb0-var-lib-kubelet\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb" Mar 08 03:48:06.887093 master-0 kubenswrapper[7547]: I0308 03:48:06.885422 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vg9kg\" (UniqueName: \"kubernetes.io/projected/e187516f-8f33-4c17-81d6-60c10b580bb0-kube-api-access-vg9kg\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb" Mar 08 03:48:06.887093 master-0 kubenswrapper[7547]: I0308 03:48:06.885447 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e187516f-8f33-4c17-81d6-60c10b580bb0-etc-sysctl-conf\") pod 
\"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb" Mar 08 03:48:06.887093 master-0 kubenswrapper[7547]: I0308 03:48:06.885466 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e187516f-8f33-4c17-81d6-60c10b580bb0-etc-tuned\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb" Mar 08 03:48:06.887093 master-0 kubenswrapper[7547]: I0308 03:48:06.885484 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e187516f-8f33-4c17-81d6-60c10b580bb0-etc-modprobe-d\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb" Mar 08 03:48:06.887093 master-0 kubenswrapper[7547]: I0308 03:48:06.885506 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e187516f-8f33-4c17-81d6-60c10b580bb0-etc-kubernetes\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb" Mar 08 03:48:06.887093 master-0 kubenswrapper[7547]: I0308 03:48:06.885523 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e187516f-8f33-4c17-81d6-60c10b580bb0-tmp\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb" Mar 08 03:48:06.887093 master-0 kubenswrapper[7547]: I0308 03:48:06.885542 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e187516f-8f33-4c17-81d6-60c10b580bb0-run\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " 
pod="openshift-cluster-node-tuning-operator/tuned-bpwdb" Mar 08 03:48:06.887093 master-0 kubenswrapper[7547]: I0308 03:48:06.885576 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e187516f-8f33-4c17-81d6-60c10b580bb0-etc-sysconfig\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb" Mar 08 03:48:06.887093 master-0 kubenswrapper[7547]: I0308 03:48:06.885590 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e187516f-8f33-4c17-81d6-60c10b580bb0-etc-sysctl-d\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb" Mar 08 03:48:06.887093 master-0 kubenswrapper[7547]: I0308 03:48:06.885606 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e187516f-8f33-4c17-81d6-60c10b580bb0-etc-systemd\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb" Mar 08 03:48:06.887093 master-0 kubenswrapper[7547]: I0308 03:48:06.885625 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e187516f-8f33-4c17-81d6-60c10b580bb0-lib-modules\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb" Mar 08 03:48:06.887093 master-0 kubenswrapper[7547]: I0308 03:48:06.885646 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e187516f-8f33-4c17-81d6-60c10b580bb0-host\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb" 
Mar 08 03:48:06.887093 master-0 kubenswrapper[7547]: I0308 03:48:06.885717 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e187516f-8f33-4c17-81d6-60c10b580bb0-host\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb" Mar 08 03:48:06.887093 master-0 kubenswrapper[7547]: I0308 03:48:06.885759 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e187516f-8f33-4c17-81d6-60c10b580bb0-sys\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb" Mar 08 03:48:06.887093 master-0 kubenswrapper[7547]: I0308 03:48:06.885797 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e187516f-8f33-4c17-81d6-60c10b580bb0-var-lib-kubelet\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb" Mar 08 03:48:06.887093 master-0 kubenswrapper[7547]: I0308 03:48:06.886197 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e187516f-8f33-4c17-81d6-60c10b580bb0-etc-sysctl-conf\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb" Mar 08 03:48:06.888132 master-0 kubenswrapper[7547]: I0308 03:48:06.887781 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e187516f-8f33-4c17-81d6-60c10b580bb0-run\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb" Mar 08 03:48:06.888132 master-0 kubenswrapper[7547]: I0308 03:48:06.887869 7547 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e187516f-8f33-4c17-81d6-60c10b580bb0-etc-modprobe-d\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb" Mar 08 03:48:06.888132 master-0 kubenswrapper[7547]: I0308 03:48:06.887919 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e187516f-8f33-4c17-81d6-60c10b580bb0-etc-kubernetes\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb" Mar 08 03:48:06.892953 master-0 kubenswrapper[7547]: I0308 03:48:06.890467 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e187516f-8f33-4c17-81d6-60c10b580bb0-etc-sysctl-d\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb" Mar 08 03:48:06.892953 master-0 kubenswrapper[7547]: I0308 03:48:06.890529 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e187516f-8f33-4c17-81d6-60c10b580bb0-etc-sysconfig\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb" Mar 08 03:48:06.892953 master-0 kubenswrapper[7547]: I0308 03:48:06.890565 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e187516f-8f33-4c17-81d6-60c10b580bb0-etc-systemd\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb" Mar 08 03:48:06.892953 master-0 kubenswrapper[7547]: I0308 03:48:06.890775 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/e187516f-8f33-4c17-81d6-60c10b580bb0-lib-modules\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb" Mar 08 03:48:06.892953 master-0 kubenswrapper[7547]: I0308 03:48:06.892490 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-677db989d6-t77qr" event={"ID":"c9de4939-680a-4e3e-89fd-e20ecb8b10f2","Type":"ContainerStarted","Data":"cb9b29deb9396e5cbdda3fe84303a3191fbf957a4fdd12ac5ba3d966332c03ca"} Mar 08 03:48:06.892953 master-0 kubenswrapper[7547]: I0308 03:48:06.892520 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-677db989d6-t77qr" event={"ID":"c9de4939-680a-4e3e-89fd-e20ecb8b10f2","Type":"ContainerStarted","Data":"db28e69e1ea518493719876e18b2faf675fe251b59f240840b24dd0b6d115924"} Mar 08 03:48:06.902087 master-0 kubenswrapper[7547]: I0308 03:48:06.894022 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6994fc9dc8-9w74l" event={"ID":"d6cf484f-7125-47ee-9e67-a064d044f43d","Type":"ContainerStarted","Data":"e3468db4b4de37474c23de3dc1335ad27787122b8d8693bf97799316dadf6a7f"} Mar 08 03:48:06.902087 master-0 kubenswrapper[7547]: I0308 03:48:06.899401 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qddlp" event={"ID":"2dd4279d-a1a9-450a-a061-9008cd1ea8e0","Type":"ContainerStarted","Data":"de78e21e3fd3bd535ad442d5ffc62bf9b325e4da59f203bcfb8b59015e498d7c"} Mar 08 03:48:06.902087 master-0 kubenswrapper[7547]: I0308 03:48:06.900393 7547 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qddlp" Mar 08 03:48:06.902087 master-0 kubenswrapper[7547]: I0308 03:48:06.902024 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" 
(UniqueName: \"kubernetes.io/empty-dir/e187516f-8f33-4c17-81d6-60c10b580bb0-tmp\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb" Mar 08 03:48:06.910130 master-0 kubenswrapper[7547]: I0308 03:48:06.910088 7547 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qddlp" Mar 08 03:48:06.914312 master-0 kubenswrapper[7547]: I0308 03:48:06.914266 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e187516f-8f33-4c17-81d6-60c10b580bb0-etc-tuned\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb" Mar 08 03:48:06.926077 master-0 kubenswrapper[7547]: I0308 03:48:06.925663 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-1-master-0" event={"ID":"0f62034a-dae9-46af-8c14-006b728b631f","Type":"ContainerStarted","Data":"d54a64defa77630a2cfba1757b5211284714af7095323e2acde0e62e40e90243"} Mar 08 03:48:06.926077 master-0 kubenswrapper[7547]: I0308 03:48:06.925707 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-1-master-0" event={"ID":"0f62034a-dae9-46af-8c14-006b728b631f","Type":"ContainerStarted","Data":"bef040f7b636d10cc98601c076bcc8520e38eceadefefdcd82bcf0929743a68c"} Mar 08 03:48:06.966950 master-0 kubenswrapper[7547]: I0308 03:48:06.965615 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vg9kg\" (UniqueName: \"kubernetes.io/projected/e187516f-8f33-4c17-81d6-60c10b580bb0-kube-api-access-vg9kg\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb" Mar 08 03:48:06.966950 master-0 kubenswrapper[7547]: I0308 03:48:06.965677 7547 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-clqwj" event={"ID":"0418ff42-7eac-4266-97b5-4df88623d066","Type":"ContainerStarted","Data":"6f424f234c58c111f2d58f1dceb247c74351083f1153019cd243a3c9e75815a0"} Mar 08 03:48:07.012864 master-0 kubenswrapper[7547]: I0308 03:48:07.012742 7547 generic.go:334] "Generic (PLEG): container finished" podID="76ba45a2-8945-4afe-b913-126c26725867" containerID="41eba15c47abd981b40ebf82cbf86f9a574f89d62b79b6b757b4ed7a35e235d0" exitCode=0 Mar 08 03:48:07.013010 master-0 kubenswrapper[7547]: I0308 03:48:07.012863 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7695b9f8b5-4jpgl" event={"ID":"76ba45a2-8945-4afe-b913-126c26725867","Type":"ContainerDied","Data":"41eba15c47abd981b40ebf82cbf86f9a574f89d62b79b6b757b4ed7a35e235d0"} Mar 08 03:48:07.018762 master-0 kubenswrapper[7547]: I0308 03:48:07.017531 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-qlfgq" event={"ID":"1482d789-884b-4337-b598-f0e2b71eb9f2","Type":"ContainerStarted","Data":"ed1866a5a44738d232bf246a80b33a5e6ec108ac42f1049c7f739fd0e59ccb8b"} Mar 08 03:48:07.018762 master-0 kubenswrapper[7547]: I0308 03:48:07.018132 7547 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-qlfgq" Mar 08 03:48:07.022847 master-0 kubenswrapper[7547]: I0308 03:48:07.019059 7547 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-bpwdb" Mar 08 03:48:07.033406 master-0 kubenswrapper[7547]: I0308 03:48:07.033333 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-589895fbb7-xttlz" event={"ID":"8efdcef9-9b31-4567-b7f9-cb59a894273d","Type":"ContainerStarted","Data":"0e9a7f2244e7afee708e7bbbd846eea539ac85e6cd49fbdfdeb084b608494fbc"} Mar 08 03:48:07.040857 master-0 kubenswrapper[7547]: I0308 03:48:07.038139 7547 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-qlfgq" Mar 08 03:48:07.063861 master-0 kubenswrapper[7547]: I0308 03:48:07.061587 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-2-master-0" event={"ID":"0e9feea5-1d1a-4a7b-a9be-6a46cedfdd3c","Type":"ContainerStarted","Data":"fc483f86e93eec0ff1cf269db68228377fe786ff06140957feac39bbb0a44268"} Mar 08 03:48:07.063861 master-0 kubenswrapper[7547]: I0308 03:48:07.061733 7547 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/installer-2-master-0" podUID="0e9feea5-1d1a-4a7b-a9be-6a46cedfdd3c" containerName="installer" containerID="cri-o://91b72f3b71a1862364600baae9b283064d51be7b70bb871f478d63107b7adf7f" gracePeriod=30 Mar 08 03:48:07.077933 master-0 kubenswrapper[7547]: I0308 03:48:07.077896 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-572xh" event={"ID":"69eb8ba2-7bfb-4433-8951-08f89e7bcb5f","Type":"ContainerStarted","Data":"382089094d1cac3801cb503b8c6386b9568112979a76f0ebac335186d88c682f"} Mar 08 03:48:07.087710 master-0 kubenswrapper[7547]: I0308 03:48:07.085095 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-qjv52" 
event={"ID":"232c421d-96f0-4894-b8d8-74f43d02bbd3","Type":"ContainerStarted","Data":"5e8c9b1535f9cad7551aed34744e6915c79ee203218b5581b4dba33e8c42901e"} Mar 08 03:48:07.087710 master-0 kubenswrapper[7547]: I0308 03:48:07.085277 7547 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-1-master-0" podStartSLOduration=5.085256681 podStartE2EDuration="5.085256681s" podCreationTimestamp="2026-03-08 03:48:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:48:07.05634574 +0000 UTC m=+50.002030253" watchObservedRunningTime="2026-03-08 03:48:07.085256681 +0000 UTC m=+50.030941194" Mar 08 03:48:07.100055 master-0 kubenswrapper[7547]: I0308 03:48:07.092700 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-597dfbc64c-f55p7" event={"ID":"792f503e-c34b-4c30-9c9e-70bdea2f2629","Type":"ContainerStarted","Data":"35ad5b11950f0afd864c047e90541d0fc2348ab048da8315900ff77325b59f5c"} Mar 08 03:48:07.101785 master-0 kubenswrapper[7547]: I0308 03:48:07.101754 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-8h6fj" event={"ID":"1b69fbf6-1ca5-413e-bffd-965730bcec1b","Type":"ContainerStarted","Data":"55aa7553b7b737c589cdd0270a8ec23cc64ce136f8130219ce1dabd7e976b992"} Mar 08 03:48:07.104304 master-0 kubenswrapper[7547]: I0308 03:48:07.102258 7547 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-8h6fj" Mar 08 03:48:07.116899 master-0 kubenswrapper[7547]: I0308 03:48:07.114454 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-745944c6b7-gvmnp" 
event={"ID":"349d438d-d124-4d34-a172-4160e766c680","Type":"ContainerStarted","Data":"7000ce3c427519e0de59e1941d9bb87835a2ee0fa70ad24b6c24c11e5207d4d2"} Mar 08 03:48:07.116899 master-0 kubenswrapper[7547]: I0308 03:48:07.116417 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-jghp5" event={"ID":"d831cb23-7411-4072-8273-c167d9afca28","Type":"ContainerStarted","Data":"c6d836aca437370d9d4780c05d27513e6c7ad8ace7bcbb1e511556447a43308e"} Mar 08 03:48:07.116899 master-0 kubenswrapper[7547]: I0308 03:48:07.116440 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-jghp5" event={"ID":"d831cb23-7411-4072-8273-c167d9afca28","Type":"ContainerStarted","Data":"712603a1b97b084eebc58893e05cde574b9f0f2e5360a98b0fe0e6acfea60707"} Mar 08 03:48:07.130045 master-0 kubenswrapper[7547]: I0308 03:48:07.124423 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"9b02b2c9-9ee3-436c-aa36-0c23e5fdf07b","Type":"ContainerStarted","Data":"e137d58d0275a0b444de45e72047e1d303bf2156296279cd1f222cf4c2e05cac"} Mar 08 03:48:07.130045 master-0 kubenswrapper[7547]: I0308 03:48:07.124441 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"9b02b2c9-9ee3-436c-aa36-0c23e5fdf07b","Type":"ContainerStarted","Data":"0ed8c48f565d4b8be16c6ba185f91ba3e8904463e008be6f2e6c969571e27427"} Mar 08 03:48:07.144043 master-0 kubenswrapper[7547]: I0308 03:48:07.143645 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-8d675b596-j8pv6" event={"ID":"1eb851be-f157-48ea-9a39-1361b68d2639","Type":"ContainerStarted","Data":"5b490ed7d49134874203db2b969a8e813e09c1e907556c2039144c5be0ec90cb"} Mar 08 03:48:07.156630 master-0 kubenswrapper[7547]: I0308 03:48:07.151654 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/marketplace-operator-64bf9778cb-9sw2d" event={"ID":"54ad284e-d40e-4e69-b898-f5093952a0e6","Type":"ContainerStarted","Data":"791aee9d23f28d5b9bc6bbbcd3f26705c245a61021bebb20a57835608ad72cab"} Mar 08 03:48:07.156630 master-0 kubenswrapper[7547]: I0308 03:48:07.151838 7547 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-64bf9778cb-9sw2d" Mar 08 03:48:07.156848 master-0 kubenswrapper[7547]: I0308 03:48:07.156755 7547 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-64bf9778cb-9sw2d" Mar 08 03:48:07.158190 master-0 kubenswrapper[7547]: I0308 03:48:07.157803 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-schjl" event={"ID":"d5044ffd-0686-4679-9894-e696faf33699","Type":"ContainerStarted","Data":"874c6e1ba49d27ef39dd5fc4f51fa52ffd667015c3adfc4324ec85ae7c224481"} Mar 08 03:48:07.158190 master-0 kubenswrapper[7547]: I0308 03:48:07.157846 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-schjl" event={"ID":"d5044ffd-0686-4679-9894-e696faf33699","Type":"ContainerStarted","Data":"63340a8ebe09cbc671d17171d3199699ec57e9b840c3e40776e56bf675738687"} Mar 08 03:48:07.183806 master-0 kubenswrapper[7547]: I0308 03:48:07.183746 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-3-master-0" event={"ID":"7bf40ef9-a79a-4f5d-933c-5276edcccb4b","Type":"ContainerStarted","Data":"6174c0cced28744679d07cd6bfda1e5016fe917384d58e904dd1b71ae6c4d184"} Mar 08 03:48:07.296927 master-0 kubenswrapper[7547]: I0308 03:48:07.293211 7547 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-4pjsn"] Mar 08 03:48:07.296927 master-0 kubenswrapper[7547]: I0308 03:48:07.294311 7547 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-4pjsn" Mar 08 03:48:07.296927 master-0 kubenswrapper[7547]: I0308 03:48:07.293925 7547 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-3-master-0" podStartSLOduration=4.293903838 podStartE2EDuration="4.293903838s" podCreationTimestamp="2026-03-08 03:48:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:48:07.292488114 +0000 UTC m=+50.238172627" watchObservedRunningTime="2026-03-08 03:48:07.293903838 +0000 UTC m=+50.239588351" Mar 08 03:48:07.306855 master-0 kubenswrapper[7547]: I0308 03:48:07.306098 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 08 03:48:07.306855 master-0 kubenswrapper[7547]: I0308 03:48:07.306134 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-4pjsn"] Mar 08 03:48:07.306855 master-0 kubenswrapper[7547]: I0308 03:48:07.306245 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 08 03:48:07.306855 master-0 kubenswrapper[7547]: I0308 03:48:07.306429 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 08 03:48:07.306855 master-0 kubenswrapper[7547]: I0308 03:48:07.306459 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 08 03:48:07.401740 master-0 kubenswrapper[7547]: I0308 03:48:07.401330 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7b485db9-29b5-45a1-a4fb-b4264c6bf2d6-metrics-tls\") pod \"dns-default-4pjsn\" (UID: \"7b485db9-29b5-45a1-a4fb-b4264c6bf2d6\") " pod="openshift-dns/dns-default-4pjsn" Mar 08 03:48:07.401740 master-0 kubenswrapper[7547]: I0308 
03:48:07.401378 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfz6w\" (UniqueName: \"kubernetes.io/projected/7b485db9-29b5-45a1-a4fb-b4264c6bf2d6-kube-api-access-nfz6w\") pod \"dns-default-4pjsn\" (UID: \"7b485db9-29b5-45a1-a4fb-b4264c6bf2d6\") " pod="openshift-dns/dns-default-4pjsn" Mar 08 03:48:07.401740 master-0 kubenswrapper[7547]: I0308 03:48:07.401433 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7b485db9-29b5-45a1-a4fb-b4264c6bf2d6-config-volume\") pod \"dns-default-4pjsn\" (UID: \"7b485db9-29b5-45a1-a4fb-b4264c6bf2d6\") " pod="openshift-dns/dns-default-4pjsn" Mar 08 03:48:07.419624 master-0 kubenswrapper[7547]: I0308 03:48:07.419327 7547 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-8h6fj" podStartSLOduration=23.419299503 podStartE2EDuration="23.419299503s" podCreationTimestamp="2026-03-08 03:47:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:48:07.41243811 +0000 UTC m=+50.358122623" watchObservedRunningTime="2026-03-08 03:48:07.419299503 +0000 UTC m=+50.364984016" Mar 08 03:48:07.423664 master-0 kubenswrapper[7547]: I0308 03:48:07.423627 7547 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-c88pb"] Mar 08 03:48:07.428039 master-0 kubenswrapper[7547]: I0308 03:48:07.425233 7547 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-c88pb" Mar 08 03:48:07.458276 master-0 kubenswrapper[7547]: I0308 03:48:07.456528 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-c88pb"] Mar 08 03:48:07.460810 master-0 kubenswrapper[7547]: I0308 03:48:07.460170 7547 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/installer-1-master-0" podStartSLOduration=13.460149945 podStartE2EDuration="13.460149945s" podCreationTimestamp="2026-03-08 03:47:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:48:07.445418858 +0000 UTC m=+50.391103371" watchObservedRunningTime="2026-03-08 03:48:07.460149945 +0000 UTC m=+50.405834458" Mar 08 03:48:07.504893 master-0 kubenswrapper[7547]: I0308 03:48:07.504480 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7b485db9-29b5-45a1-a4fb-b4264c6bf2d6-metrics-tls\") pod \"dns-default-4pjsn\" (UID: \"7b485db9-29b5-45a1-a4fb-b4264c6bf2d6\") " pod="openshift-dns/dns-default-4pjsn" Mar 08 03:48:07.504893 master-0 kubenswrapper[7547]: I0308 03:48:07.504532 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfz6w\" (UniqueName: \"kubernetes.io/projected/7b485db9-29b5-45a1-a4fb-b4264c6bf2d6-kube-api-access-nfz6w\") pod \"dns-default-4pjsn\" (UID: \"7b485db9-29b5-45a1-a4fb-b4264c6bf2d6\") " pod="openshift-dns/dns-default-4pjsn" Mar 08 03:48:07.504893 master-0 kubenswrapper[7547]: I0308 03:48:07.504589 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7b485db9-29b5-45a1-a4fb-b4264c6bf2d6-config-volume\") pod \"dns-default-4pjsn\" (UID: \"7b485db9-29b5-45a1-a4fb-b4264c6bf2d6\") " pod="openshift-dns/dns-default-4pjsn" Mar 08 
03:48:07.512009 master-0 kubenswrapper[7547]: I0308 03:48:07.511900 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7b485db9-29b5-45a1-a4fb-b4264c6bf2d6-config-volume\") pod \"dns-default-4pjsn\" (UID: \"7b485db9-29b5-45a1-a4fb-b4264c6bf2d6\") " pod="openshift-dns/dns-default-4pjsn" Mar 08 03:48:07.534239 master-0 kubenswrapper[7547]: I0308 03:48:07.533347 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7b485db9-29b5-45a1-a4fb-b4264c6bf2d6-metrics-tls\") pod \"dns-default-4pjsn\" (UID: \"7b485db9-29b5-45a1-a4fb-b4264c6bf2d6\") " pod="openshift-dns/dns-default-4pjsn" Mar 08 03:48:07.537670 master-0 kubenswrapper[7547]: I0308 03:48:07.537633 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfz6w\" (UniqueName: \"kubernetes.io/projected/7b485db9-29b5-45a1-a4fb-b4264c6bf2d6-kube-api-access-nfz6w\") pod \"dns-default-4pjsn\" (UID: \"7b485db9-29b5-45a1-a4fb-b4264c6bf2d6\") " pod="openshift-dns/dns-default-4pjsn" Mar 08 03:48:07.609015 master-0 kubenswrapper[7547]: I0308 03:48:07.608971 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgbvc\" (UniqueName: \"kubernetes.io/projected/47b7e26d-8fb3-4749-a544-c86c3a06e439-kube-api-access-tgbvc\") pod \"community-operators-c88pb\" (UID: \"47b7e26d-8fb3-4749-a544-c86c3a06e439\") " pod="openshift-marketplace/community-operators-c88pb" Mar 08 03:48:07.609175 master-0 kubenswrapper[7547]: I0308 03:48:07.609157 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47b7e26d-8fb3-4749-a544-c86c3a06e439-catalog-content\") pod \"community-operators-c88pb\" (UID: \"47b7e26d-8fb3-4749-a544-c86c3a06e439\") " pod="openshift-marketplace/community-operators-c88pb" Mar 08 
03:48:07.609365 master-0 kubenswrapper[7547]: I0308 03:48:07.609351 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47b7e26d-8fb3-4749-a544-c86c3a06e439-utilities\") pod \"community-operators-c88pb\" (UID: \"47b7e26d-8fb3-4749-a544-c86c3a06e439\") " pod="openshift-marketplace/community-operators-c88pb" Mar 08 03:48:07.609473 master-0 kubenswrapper[7547]: I0308 03:48:07.609173 7547 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-2-master-0" podStartSLOduration=14.609157076 podStartE2EDuration="14.609157076s" podCreationTimestamp="2026-03-08 03:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:48:07.609052343 +0000 UTC m=+50.554736856" watchObservedRunningTime="2026-03-08 03:48:07.609157076 +0000 UTC m=+50.554841589" Mar 08 03:48:07.620131 master-0 kubenswrapper[7547]: I0308 03:48:07.617754 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-2-master-0_0e9feea5-1d1a-4a7b-a9be-6a46cedfdd3c/installer/0.log" Mar 08 03:48:07.620131 master-0 kubenswrapper[7547]: I0308 03:48:07.617810 7547 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-2-master-0" Mar 08 03:48:07.637978 master-0 kubenswrapper[7547]: I0308 03:48:07.637943 7547 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-4pjsn" Mar 08 03:48:07.710474 master-0 kubenswrapper[7547]: I0308 03:48:07.710422 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0e9feea5-1d1a-4a7b-a9be-6a46cedfdd3c-kubelet-dir\") pod \"0e9feea5-1d1a-4a7b-a9be-6a46cedfdd3c\" (UID: \"0e9feea5-1d1a-4a7b-a9be-6a46cedfdd3c\") " Mar 08 03:48:07.710693 master-0 kubenswrapper[7547]: I0308 03:48:07.710512 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0e9feea5-1d1a-4a7b-a9be-6a46cedfdd3c-var-lock\") pod \"0e9feea5-1d1a-4a7b-a9be-6a46cedfdd3c\" (UID: \"0e9feea5-1d1a-4a7b-a9be-6a46cedfdd3c\") " Mar 08 03:48:07.710693 master-0 kubenswrapper[7547]: I0308 03:48:07.710552 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0e9feea5-1d1a-4a7b-a9be-6a46cedfdd3c-kube-api-access\") pod \"0e9feea5-1d1a-4a7b-a9be-6a46cedfdd3c\" (UID: \"0e9feea5-1d1a-4a7b-a9be-6a46cedfdd3c\") " Mar 08 03:48:07.710754 master-0 kubenswrapper[7547]: I0308 03:48:07.710739 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47b7e26d-8fb3-4749-a544-c86c3a06e439-utilities\") pod \"community-operators-c88pb\" (UID: \"47b7e26d-8fb3-4749-a544-c86c3a06e439\") " pod="openshift-marketplace/community-operators-c88pb" Mar 08 03:48:07.710784 master-0 kubenswrapper[7547]: I0308 03:48:07.710771 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgbvc\" (UniqueName: \"kubernetes.io/projected/47b7e26d-8fb3-4749-a544-c86c3a06e439-kube-api-access-tgbvc\") pod \"community-operators-c88pb\" (UID: \"47b7e26d-8fb3-4749-a544-c86c3a06e439\") " pod="openshift-marketplace/community-operators-c88pb" Mar 08 03:48:07.710831 master-0 
kubenswrapper[7547]: I0308 03:48:07.710792 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47b7e26d-8fb3-4749-a544-c86c3a06e439-catalog-content\") pod \"community-operators-c88pb\" (UID: \"47b7e26d-8fb3-4749-a544-c86c3a06e439\") " pod="openshift-marketplace/community-operators-c88pb" Mar 08 03:48:07.711021 master-0 kubenswrapper[7547]: I0308 03:48:07.710981 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0e9feea5-1d1a-4a7b-a9be-6a46cedfdd3c-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "0e9feea5-1d1a-4a7b-a9be-6a46cedfdd3c" (UID: "0e9feea5-1d1a-4a7b-a9be-6a46cedfdd3c"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:48:07.711133 master-0 kubenswrapper[7547]: I0308 03:48:07.711080 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0e9feea5-1d1a-4a7b-a9be-6a46cedfdd3c-var-lock" (OuterVolumeSpecName: "var-lock") pod "0e9feea5-1d1a-4a7b-a9be-6a46cedfdd3c" (UID: "0e9feea5-1d1a-4a7b-a9be-6a46cedfdd3c"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:48:07.711304 master-0 kubenswrapper[7547]: I0308 03:48:07.711273 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47b7e26d-8fb3-4749-a544-c86c3a06e439-utilities\") pod \"community-operators-c88pb\" (UID: \"47b7e26d-8fb3-4749-a544-c86c3a06e439\") " pod="openshift-marketplace/community-operators-c88pb" Mar 08 03:48:07.712686 master-0 kubenswrapper[7547]: I0308 03:48:07.712450 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47b7e26d-8fb3-4749-a544-c86c3a06e439-catalog-content\") pod \"community-operators-c88pb\" (UID: \"47b7e26d-8fb3-4749-a544-c86c3a06e439\") " pod="openshift-marketplace/community-operators-c88pb" Mar 08 03:48:07.812471 master-0 kubenswrapper[7547]: I0308 03:48:07.812413 7547 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0e9feea5-1d1a-4a7b-a9be-6a46cedfdd3c-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 03:48:07.812471 master-0 kubenswrapper[7547]: I0308 03:48:07.812447 7547 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0e9feea5-1d1a-4a7b-a9be-6a46cedfdd3c-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 08 03:48:08.006637 master-0 kubenswrapper[7547]: I0308 03:48:08.006575 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e9feea5-1d1a-4a7b-a9be-6a46cedfdd3c-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0e9feea5-1d1a-4a7b-a9be-6a46cedfdd3c" (UID: "0e9feea5-1d1a-4a7b-a9be-6a46cedfdd3c"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:48:08.035079 master-0 kubenswrapper[7547]: I0308 03:48:08.028724 7547 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0e9feea5-1d1a-4a7b-a9be-6a46cedfdd3c-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 08 03:48:08.059843 master-0 kubenswrapper[7547]: I0308 03:48:08.059761 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgbvc\" (UniqueName: \"kubernetes.io/projected/47b7e26d-8fb3-4749-a544-c86c3a06e439-kube-api-access-tgbvc\") pod \"community-operators-c88pb\" (UID: \"47b7e26d-8fb3-4749-a544-c86c3a06e439\") " pod="openshift-marketplace/community-operators-c88pb" Mar 08 03:48:08.101897 master-0 kubenswrapper[7547]: I0308 03:48:08.099335 7547 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-w2lsp"] Mar 08 03:48:08.101897 master-0 kubenswrapper[7547]: E0308 03:48:08.099593 7547 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e9feea5-1d1a-4a7b-a9be-6a46cedfdd3c" containerName="installer" Mar 08 03:48:08.101897 master-0 kubenswrapper[7547]: I0308 03:48:08.099604 7547 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e9feea5-1d1a-4a7b-a9be-6a46cedfdd3c" containerName="installer" Mar 08 03:48:08.101897 master-0 kubenswrapper[7547]: I0308 03:48:08.099720 7547 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e9feea5-1d1a-4a7b-a9be-6a46cedfdd3c" containerName="installer" Mar 08 03:48:08.101897 master-0 kubenswrapper[7547]: I0308 03:48:08.100937 7547 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w2lsp" Mar 08 03:48:08.117163 master-0 kubenswrapper[7547]: I0308 03:48:08.116505 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-w2lsp"] Mar 08 03:48:08.208161 master-0 kubenswrapper[7547]: I0308 03:48:08.207884 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-8d675b596-j8pv6" event={"ID":"1eb851be-f157-48ea-9a39-1361b68d2639","Type":"ContainerStarted","Data":"b6a9bf08942a2e11233d13bcdf99f3b818825bc97027c58eaa8bb54d09fc4200"} Mar 08 03:48:08.220992 master-0 kubenswrapper[7547]: I0308 03:48:08.220930 7547 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-wjl9v"] Mar 08 03:48:08.221857 master-0 kubenswrapper[7547]: I0308 03:48:08.221627 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-wjl9v" Mar 08 03:48:08.246870 master-0 kubenswrapper[7547]: I0308 03:48:08.246451 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4ff897a-ac47-45e0-aa7d-88c5aea50b70-catalog-content\") pod \"redhat-marketplace-w2lsp\" (UID: \"a4ff897a-ac47-45e0-aa7d-88c5aea50b70\") " pod="openshift-marketplace/redhat-marketplace-w2lsp" Mar 08 03:48:08.246870 master-0 kubenswrapper[7547]: I0308 03:48:08.246489 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwqlx\" (UniqueName: \"kubernetes.io/projected/a4ff897a-ac47-45e0-aa7d-88c5aea50b70-kube-api-access-rwqlx\") pod \"redhat-marketplace-w2lsp\" (UID: \"a4ff897a-ac47-45e0-aa7d-88c5aea50b70\") " pod="openshift-marketplace/redhat-marketplace-w2lsp" Mar 08 03:48:08.246870 master-0 kubenswrapper[7547]: I0308 03:48:08.246525 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4ff897a-ac47-45e0-aa7d-88c5aea50b70-utilities\") pod \"redhat-marketplace-w2lsp\" (UID: \"a4ff897a-ac47-45e0-aa7d-88c5aea50b70\") " pod="openshift-marketplace/redhat-marketplace-w2lsp" Mar 08 03:48:08.251842 master-0 kubenswrapper[7547]: I0308 03:48:08.248182 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7695b9f8b5-4jpgl" event={"ID":"76ba45a2-8945-4afe-b913-126c26725867","Type":"ContainerStarted","Data":"f53211a59c4b9ec045e75b214bfa158414d4ff8c34df21d3001e9c7bfb4576f1"} Mar 08 03:48:08.267960 master-0 kubenswrapper[7547]: I0308 03:48:08.267787 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-bpwdb" event={"ID":"e187516f-8f33-4c17-81d6-60c10b580bb0","Type":"ContainerStarted","Data":"de51dab7cd1c886dd631e57708e7989fc1ecc3e2aee240ab8ef3755098f85a98"} Mar 08 03:48:08.267960 master-0 kubenswrapper[7547]: I0308 03:48:08.267847 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-bpwdb" event={"ID":"e187516f-8f33-4c17-81d6-60c10b580bb0","Type":"ContainerStarted","Data":"5f430f414a185fad6a2a35137d0c5f44e5f065f4beac554aa96b4675b34ca457"} Mar 08 03:48:08.303944 master-0 kubenswrapper[7547]: I0308 03:48:08.303289 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-589895fbb7-xttlz" event={"ID":"8efdcef9-9b31-4567-b7f9-cb59a894273d","Type":"ContainerStarted","Data":"6e0424b49db60e8cb0bf25a3bcd684afa62ff0a796052679e471c42c86c40318"} Mar 08 03:48:08.304896 master-0 kubenswrapper[7547]: I0308 03:48:08.304858 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-2-master-0_0e9feea5-1d1a-4a7b-a9be-6a46cedfdd3c/installer/0.log" Mar 08 03:48:08.304951 master-0 kubenswrapper[7547]: I0308 03:48:08.304911 7547 generic.go:334] "Generic (PLEG): container finished" 
podID="0e9feea5-1d1a-4a7b-a9be-6a46cedfdd3c" containerID="91b72f3b71a1862364600baae9b283064d51be7b70bb871f478d63107b7adf7f" exitCode=2 Mar 08 03:48:08.304991 master-0 kubenswrapper[7547]: I0308 03:48:08.304961 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-2-master-0" event={"ID":"0e9feea5-1d1a-4a7b-a9be-6a46cedfdd3c","Type":"ContainerDied","Data":"fc483f86e93eec0ff1cf269db68228377fe786ff06140957feac39bbb0a44268"} Mar 08 03:48:08.304991 master-0 kubenswrapper[7547]: I0308 03:48:08.304987 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-2-master-0" event={"ID":"0e9feea5-1d1a-4a7b-a9be-6a46cedfdd3c","Type":"ContainerDied","Data":"91b72f3b71a1862364600baae9b283064d51be7b70bb871f478d63107b7adf7f"} Mar 08 03:48:08.305049 master-0 kubenswrapper[7547]: I0308 03:48:08.305015 7547 scope.go:117] "RemoveContainer" containerID="91b72f3b71a1862364600baae9b283064d51be7b70bb871f478d63107b7adf7f" Mar 08 03:48:08.305080 master-0 kubenswrapper[7547]: I0308 03:48:08.304957 7547 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-2-master-0" Mar 08 03:48:08.321933 master-0 kubenswrapper[7547]: I0308 03:48:08.320406 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-3-master-0" event={"ID":"7bf40ef9-a79a-4f5d-933c-5276edcccb4b","Type":"ContainerStarted","Data":"20197cef49bb05fb75f2e7eda65c3e92dc7a4af95343b25ff91e78b1d42be6fb"} Mar 08 03:48:08.333133 master-0 kubenswrapper[7547]: I0308 03:48:08.333089 7547 scope.go:117] "RemoveContainer" containerID="91b72f3b71a1862364600baae9b283064d51be7b70bb871f478d63107b7adf7f" Mar 08 03:48:08.339516 master-0 kubenswrapper[7547]: E0308 03:48:08.339023 7547 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91b72f3b71a1862364600baae9b283064d51be7b70bb871f478d63107b7adf7f\": container with ID starting with 91b72f3b71a1862364600baae9b283064d51be7b70bb871f478d63107b7adf7f not found: ID does not exist" containerID="91b72f3b71a1862364600baae9b283064d51be7b70bb871f478d63107b7adf7f" Mar 08 03:48:08.339516 master-0 kubenswrapper[7547]: I0308 03:48:08.339072 7547 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91b72f3b71a1862364600baae9b283064d51be7b70bb871f478d63107b7adf7f"} err="failed to get container status \"91b72f3b71a1862364600baae9b283064d51be7b70bb871f478d63107b7adf7f\": rpc error: code = NotFound desc = could not find container \"91b72f3b71a1862364600baae9b283064d51be7b70bb871f478d63107b7adf7f\": container with ID starting with 91b72f3b71a1862364600baae9b283064d51be7b70bb871f478d63107b7adf7f not found: ID does not exist" Mar 08 03:48:08.346588 master-0 kubenswrapper[7547]: I0308 03:48:08.346345 7547 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7695b9f8b5-4jpgl" podStartSLOduration=3.807022925 podStartE2EDuration="17.346327126s" podCreationTimestamp="2026-03-08 
03:47:51 +0000 UTC" firstStartedPulling="2026-03-08 03:47:52.153666483 +0000 UTC m=+35.099350986" lastFinishedPulling="2026-03-08 03:48:05.692970664 +0000 UTC m=+48.638655187" observedRunningTime="2026-03-08 03:48:08.318960561 +0000 UTC m=+51.264645074" watchObservedRunningTime="2026-03-08 03:48:08.346327126 +0000 UTC m=+51.292011639" Mar 08 03:48:08.347083 master-0 kubenswrapper[7547]: I0308 03:48:08.347055 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-4pjsn"] Mar 08 03:48:08.347969 master-0 kubenswrapper[7547]: I0308 03:48:08.347939 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jljzc\" (UniqueName: \"kubernetes.io/projected/d2cd5b23-e622-4b96-aee8-dbc942b73b4a-kube-api-access-jljzc\") pod \"node-resolver-wjl9v\" (UID: \"d2cd5b23-e622-4b96-aee8-dbc942b73b4a\") " pod="openshift-dns/node-resolver-wjl9v" Mar 08 03:48:08.348012 master-0 kubenswrapper[7547]: I0308 03:48:08.347975 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4ff897a-ac47-45e0-aa7d-88c5aea50b70-catalog-content\") pod \"redhat-marketplace-w2lsp\" (UID: \"a4ff897a-ac47-45e0-aa7d-88c5aea50b70\") " pod="openshift-marketplace/redhat-marketplace-w2lsp" Mar 08 03:48:08.348012 master-0 kubenswrapper[7547]: I0308 03:48:08.347997 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwqlx\" (UniqueName: \"kubernetes.io/projected/a4ff897a-ac47-45e0-aa7d-88c5aea50b70-kube-api-access-rwqlx\") pod \"redhat-marketplace-w2lsp\" (UID: \"a4ff897a-ac47-45e0-aa7d-88c5aea50b70\") " pod="openshift-marketplace/redhat-marketplace-w2lsp" Mar 08 03:48:08.350449 master-0 kubenswrapper[7547]: I0308 03:48:08.350414 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/d2cd5b23-e622-4b96-aee8-dbc942b73b4a-hosts-file\") pod \"node-resolver-wjl9v\" (UID: \"d2cd5b23-e622-4b96-aee8-dbc942b73b4a\") " pod="openshift-dns/node-resolver-wjl9v" Mar 08 03:48:08.350536 master-0 kubenswrapper[7547]: I0308 03:48:08.350461 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4ff897a-ac47-45e0-aa7d-88c5aea50b70-utilities\") pod \"redhat-marketplace-w2lsp\" (UID: \"a4ff897a-ac47-45e0-aa7d-88c5aea50b70\") " pod="openshift-marketplace/redhat-marketplace-w2lsp" Mar 08 03:48:08.350681 master-0 kubenswrapper[7547]: I0308 03:48:08.350644 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4ff897a-ac47-45e0-aa7d-88c5aea50b70-catalog-content\") pod \"redhat-marketplace-w2lsp\" (UID: \"a4ff897a-ac47-45e0-aa7d-88c5aea50b70\") " pod="openshift-marketplace/redhat-marketplace-w2lsp" Mar 08 03:48:08.351376 master-0 kubenswrapper[7547]: I0308 03:48:08.351345 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4ff897a-ac47-45e0-aa7d-88c5aea50b70-utilities\") pod \"redhat-marketplace-w2lsp\" (UID: \"a4ff897a-ac47-45e0-aa7d-88c5aea50b70\") " pod="openshift-marketplace/redhat-marketplace-w2lsp" Mar 08 03:48:08.352982 master-0 kubenswrapper[7547]: I0308 03:48:08.352919 7547 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-c88pb" Mar 08 03:48:08.362284 master-0 kubenswrapper[7547]: I0308 03:48:08.362211 7547 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-bpwdb" podStartSLOduration=2.36219166 podStartE2EDuration="2.36219166s" podCreationTimestamp="2026-03-08 03:48:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:48:08.360618923 +0000 UTC m=+51.306303426" watchObservedRunningTime="2026-03-08 03:48:08.36219166 +0000 UTC m=+51.307876173" Mar 08 03:48:08.369852 master-0 kubenswrapper[7547]: I0308 03:48:08.369789 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwqlx\" (UniqueName: \"kubernetes.io/projected/a4ff897a-ac47-45e0-aa7d-88c5aea50b70-kube-api-access-rwqlx\") pod \"redhat-marketplace-w2lsp\" (UID: \"a4ff897a-ac47-45e0-aa7d-88c5aea50b70\") " pod="openshift-marketplace/redhat-marketplace-w2lsp" Mar 08 03:48:08.377550 master-0 kubenswrapper[7547]: I0308 03:48:08.375630 7547 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"] Mar 08 03:48:08.383295 master-0 kubenswrapper[7547]: I0308 03:48:08.383222 7547 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"] Mar 08 03:48:08.422358 master-0 kubenswrapper[7547]: I0308 03:48:08.421363 7547 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w2lsp" Mar 08 03:48:08.462170 master-0 kubenswrapper[7547]: I0308 03:48:08.462109 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jljzc\" (UniqueName: \"kubernetes.io/projected/d2cd5b23-e622-4b96-aee8-dbc942b73b4a-kube-api-access-jljzc\") pod \"node-resolver-wjl9v\" (UID: \"d2cd5b23-e622-4b96-aee8-dbc942b73b4a\") " pod="openshift-dns/node-resolver-wjl9v" Mar 08 03:48:08.463672 master-0 kubenswrapper[7547]: I0308 03:48:08.463628 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d2cd5b23-e622-4b96-aee8-dbc942b73b4a-hosts-file\") pod \"node-resolver-wjl9v\" (UID: \"d2cd5b23-e622-4b96-aee8-dbc942b73b4a\") " pod="openshift-dns/node-resolver-wjl9v" Mar 08 03:48:08.464883 master-0 kubenswrapper[7547]: I0308 03:48:08.463999 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d2cd5b23-e622-4b96-aee8-dbc942b73b4a-hosts-file\") pod \"node-resolver-wjl9v\" (UID: \"d2cd5b23-e622-4b96-aee8-dbc942b73b4a\") " pod="openshift-dns/node-resolver-wjl9v" Mar 08 03:48:08.492311 master-0 kubenswrapper[7547]: I0308 03:48:08.490856 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jljzc\" (UniqueName: \"kubernetes.io/projected/d2cd5b23-e622-4b96-aee8-dbc942b73b4a-kube-api-access-jljzc\") pod \"node-resolver-wjl9v\" (UID: \"d2cd5b23-e622-4b96-aee8-dbc942b73b4a\") " pod="openshift-dns/node-resolver-wjl9v" Mar 08 03:48:08.588221 master-0 kubenswrapper[7547]: I0308 03:48:08.588179 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-c88pb"] Mar 08 03:48:08.603497 master-0 kubenswrapper[7547]: I0308 03:48:08.603468 7547 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-wjl9v" Mar 08 03:48:08.643389 master-0 kubenswrapper[7547]: W0308 03:48:08.643331 7547 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2cd5b23_e622_4b96_aee8_dbc942b73b4a.slice/crio-2221ffbcc38435886b634ed23b63c6c48586f85323431d02b258500a200b9a2b WatchSource:0}: Error finding container 2221ffbcc38435886b634ed23b63c6c48586f85323431d02b258500a200b9a2b: Status 404 returned error can't find the container with id 2221ffbcc38435886b634ed23b63c6c48586f85323431d02b258500a200b9a2b Mar 08 03:48:08.654105 master-0 kubenswrapper[7547]: I0308 03:48:08.650644 7547 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-67b55db9c7-4qgpb"] Mar 08 03:48:08.654105 master-0 kubenswrapper[7547]: I0308 03:48:08.652320 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-67b55db9c7-4qgpb" Mar 08 03:48:08.654673 master-0 kubenswrapper[7547]: I0308 03:48:08.654640 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 08 03:48:08.668928 master-0 kubenswrapper[7547]: I0308 03:48:08.665291 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-67b55db9c7-4qgpb"] Mar 08 03:48:08.668928 master-0 kubenswrapper[7547]: I0308 03:48:08.665397 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fzmf\" (UniqueName: \"kubernetes.io/projected/10c165f1-53b4-4e7e-9cd1-00bb4d9cbc79-kube-api-access-7fzmf\") pod \"packageserver-67b55db9c7-4qgpb\" (UID: \"10c165f1-53b4-4e7e-9cd1-00bb4d9cbc79\") " pod="openshift-operator-lifecycle-manager/packageserver-67b55db9c7-4qgpb" Mar 08 03:48:08.668928 master-0 kubenswrapper[7547]: I0308 03:48:08.665435 
7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/10c165f1-53b4-4e7e-9cd1-00bb4d9cbc79-webhook-cert\") pod \"packageserver-67b55db9c7-4qgpb\" (UID: \"10c165f1-53b4-4e7e-9cd1-00bb4d9cbc79\") " pod="openshift-operator-lifecycle-manager/packageserver-67b55db9c7-4qgpb" Mar 08 03:48:08.668928 master-0 kubenswrapper[7547]: I0308 03:48:08.665457 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/10c165f1-53b4-4e7e-9cd1-00bb4d9cbc79-apiservice-cert\") pod \"packageserver-67b55db9c7-4qgpb\" (UID: \"10c165f1-53b4-4e7e-9cd1-00bb4d9cbc79\") " pod="openshift-operator-lifecycle-manager/packageserver-67b55db9c7-4qgpb" Mar 08 03:48:08.668928 master-0 kubenswrapper[7547]: I0308 03:48:08.665504 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/10c165f1-53b4-4e7e-9cd1-00bb4d9cbc79-tmpfs\") pod \"packageserver-67b55db9c7-4qgpb\" (UID: \"10c165f1-53b4-4e7e-9cd1-00bb4d9cbc79\") " pod="openshift-operator-lifecycle-manager/packageserver-67b55db9c7-4qgpb" Mar 08 03:48:08.706732 master-0 kubenswrapper[7547]: I0308 03:48:08.705268 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-w2lsp"] Mar 08 03:48:08.722659 master-0 kubenswrapper[7547]: W0308 03:48:08.722622 7547 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4ff897a_ac47_45e0_aa7d_88c5aea50b70.slice/crio-3ef1230e8d3c1752f9176b32c87038d18542085bad33cf4cad2b423f622615a4 WatchSource:0}: Error finding container 3ef1230e8d3c1752f9176b32c87038d18542085bad33cf4cad2b423f622615a4: Status 404 returned error can't find the container with id 3ef1230e8d3c1752f9176b32c87038d18542085bad33cf4cad2b423f622615a4 Mar 08 
03:48:08.767472 master-0 kubenswrapper[7547]: I0308 03:48:08.767439 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/10c165f1-53b4-4e7e-9cd1-00bb4d9cbc79-tmpfs\") pod \"packageserver-67b55db9c7-4qgpb\" (UID: \"10c165f1-53b4-4e7e-9cd1-00bb4d9cbc79\") " pod="openshift-operator-lifecycle-manager/packageserver-67b55db9c7-4qgpb" Mar 08 03:48:08.767605 master-0 kubenswrapper[7547]: I0308 03:48:08.767513 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fzmf\" (UniqueName: \"kubernetes.io/projected/10c165f1-53b4-4e7e-9cd1-00bb4d9cbc79-kube-api-access-7fzmf\") pod \"packageserver-67b55db9c7-4qgpb\" (UID: \"10c165f1-53b4-4e7e-9cd1-00bb4d9cbc79\") " pod="openshift-operator-lifecycle-manager/packageserver-67b55db9c7-4qgpb" Mar 08 03:48:08.767605 master-0 kubenswrapper[7547]: I0308 03:48:08.767542 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/10c165f1-53b4-4e7e-9cd1-00bb4d9cbc79-webhook-cert\") pod \"packageserver-67b55db9c7-4qgpb\" (UID: \"10c165f1-53b4-4e7e-9cd1-00bb4d9cbc79\") " pod="openshift-operator-lifecycle-manager/packageserver-67b55db9c7-4qgpb" Mar 08 03:48:08.767605 master-0 kubenswrapper[7547]: I0308 03:48:08.767572 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/10c165f1-53b4-4e7e-9cd1-00bb4d9cbc79-apiservice-cert\") pod \"packageserver-67b55db9c7-4qgpb\" (UID: \"10c165f1-53b4-4e7e-9cd1-00bb4d9cbc79\") " pod="openshift-operator-lifecycle-manager/packageserver-67b55db9c7-4qgpb" Mar 08 03:48:08.777217 master-0 kubenswrapper[7547]: I0308 03:48:08.777174 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/10c165f1-53b4-4e7e-9cd1-00bb4d9cbc79-tmpfs\") pod \"packageserver-67b55db9c7-4qgpb\" (UID: 
\"10c165f1-53b4-4e7e-9cd1-00bb4d9cbc79\") " pod="openshift-operator-lifecycle-manager/packageserver-67b55db9c7-4qgpb"
Mar 08 03:48:08.786240 master-0 kubenswrapper[7547]: I0308 03:48:08.786211 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/10c165f1-53b4-4e7e-9cd1-00bb4d9cbc79-apiservice-cert\") pod \"packageserver-67b55db9c7-4qgpb\" (UID: \"10c165f1-53b4-4e7e-9cd1-00bb4d9cbc79\") " pod="openshift-operator-lifecycle-manager/packageserver-67b55db9c7-4qgpb"
Mar 08 03:48:08.790485 master-0 kubenswrapper[7547]: I0308 03:48:08.790450 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/10c165f1-53b4-4e7e-9cd1-00bb4d9cbc79-webhook-cert\") pod \"packageserver-67b55db9c7-4qgpb\" (UID: \"10c165f1-53b4-4e7e-9cd1-00bb4d9cbc79\") " pod="openshift-operator-lifecycle-manager/packageserver-67b55db9c7-4qgpb"
Mar 08 03:48:08.795787 master-0 kubenswrapper[7547]: I0308 03:48:08.795752 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fzmf\" (UniqueName: \"kubernetes.io/projected/10c165f1-53b4-4e7e-9cd1-00bb4d9cbc79-kube-api-access-7fzmf\") pod \"packageserver-67b55db9c7-4qgpb\" (UID: \"10c165f1-53b4-4e7e-9cd1-00bb4d9cbc79\") " pod="openshift-operator-lifecycle-manager/packageserver-67b55db9c7-4qgpb"
Mar 08 03:48:08.971781 master-0 kubenswrapper[7547]: I0308 03:48:08.971158 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-67b55db9c7-4qgpb"
Mar 08 03:48:09.243259 master-0 kubenswrapper[7547]: I0308 03:48:09.242693 7547 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e9feea5-1d1a-4a7b-a9be-6a46cedfdd3c" path="/var/lib/kubelet/pods/0e9feea5-1d1a-4a7b-a9be-6a46cedfdd3c/volumes"
Mar 08 03:48:09.280732 master-0 kubenswrapper[7547]: I0308 03:48:09.279059 7547 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fcdn7"]
Mar 08 03:48:09.280732 master-0 kubenswrapper[7547]: I0308 03:48:09.280685 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fcdn7"
Mar 08 03:48:09.288841 master-0 kubenswrapper[7547]: I0308 03:48:09.288364 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fcdn7"]
Mar 08 03:48:09.308495 master-0 kubenswrapper[7547]: I0308 03:48:09.308398 7547 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"]
Mar 08 03:48:09.326211 master-0 kubenswrapper[7547]: I0308 03:48:09.326129 7547 generic.go:334] "Generic (PLEG): container finished" podID="47b7e26d-8fb3-4749-a544-c86c3a06e439" containerID="6a4685509ca0ee5089a0299c404ad04268f955ec57dcda13dc166fb32adff441" exitCode=0
Mar 08 03:48:09.326461 master-0 kubenswrapper[7547]: I0308 03:48:09.326240 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c88pb" event={"ID":"47b7e26d-8fb3-4749-a544-c86c3a06e439","Type":"ContainerDied","Data":"6a4685509ca0ee5089a0299c404ad04268f955ec57dcda13dc166fb32adff441"}
Mar 08 03:48:09.326461 master-0 kubenswrapper[7547]: I0308 03:48:09.326309 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c88pb" event={"ID":"47b7e26d-8fb3-4749-a544-c86c3a06e439","Type":"ContainerStarted","Data":"2dd21a72fca6329403b8f2781feadc418ad77fca18e9f0a797d7ec6d0c5d0b5b"}
Mar 08 03:48:09.332733 master-0 kubenswrapper[7547]: I0308 03:48:09.332685 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4pjsn" event={"ID":"7b485db9-29b5-45a1-a4fb-b4264c6bf2d6","Type":"ContainerStarted","Data":"b6803c40b59bc228703c8b2ce51f78af3f050cad56ceb99d544a076dbfccb803"}
Mar 08 03:48:09.334756 master-0 kubenswrapper[7547]: I0308 03:48:09.334285 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w2lsp" event={"ID":"a4ff897a-ac47-45e0-aa7d-88c5aea50b70","Type":"ContainerStarted","Data":"f3a3a964f5dd60ec9c11603375242a49c3896a4ba1562bbbb0ef714e0b475500"}
Mar 08 03:48:09.334756 master-0 kubenswrapper[7547]: I0308 03:48:09.334336 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w2lsp" event={"ID":"a4ff897a-ac47-45e0-aa7d-88c5aea50b70","Type":"ContainerStarted","Data":"3ef1230e8d3c1752f9176b32c87038d18542085bad33cf4cad2b423f622615a4"}
Mar 08 03:48:09.336505 master-0 kubenswrapper[7547]: I0308 03:48:09.336400 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-wjl9v" event={"ID":"d2cd5b23-e622-4b96-aee8-dbc942b73b4a","Type":"ContainerStarted","Data":"4d4732ceed80f8438afc8c4863467e7b03b7842cb7d44beb277eb245e9214b07"}
Mar 08 03:48:09.336505 master-0 kubenswrapper[7547]: I0308 03:48:09.336436 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-wjl9v" event={"ID":"d2cd5b23-e622-4b96-aee8-dbc942b73b4a","Type":"ContainerStarted","Data":"2221ffbcc38435886b634ed23b63c6c48586f85323431d02b258500a200b9a2b"}
Mar 08 03:48:09.361834 master-0 kubenswrapper[7547]: I0308 03:48:09.361730 7547 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-wjl9v" podStartSLOduration=1.361705702 podStartE2EDuration="1.361705702s" podCreationTimestamp="2026-03-08 03:48:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:48:09.357714358 +0000 UTC m=+52.303398861" watchObservedRunningTime="2026-03-08 03:48:09.361705702 +0000 UTC m=+52.307390215"
Mar 08 03:48:09.375419 master-0 kubenswrapper[7547]: I0308 03:48:09.375336 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4818bf75-506a-4b39-bb9b-8067a02d4a51-utilities\") pod \"redhat-operators-fcdn7\" (UID: \"4818bf75-506a-4b39-bb9b-8067a02d4a51\") " pod="openshift-marketplace/redhat-operators-fcdn7"
Mar 08 03:48:09.375774 master-0 kubenswrapper[7547]: I0308 03:48:09.375744 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpkxw\" (UniqueName: \"kubernetes.io/projected/4818bf75-506a-4b39-bb9b-8067a02d4a51-kube-api-access-cpkxw\") pod \"redhat-operators-fcdn7\" (UID: \"4818bf75-506a-4b39-bb9b-8067a02d4a51\") " pod="openshift-marketplace/redhat-operators-fcdn7"
Mar 08 03:48:09.376033 master-0 kubenswrapper[7547]: I0308 03:48:09.376007 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4818bf75-506a-4b39-bb9b-8067a02d4a51-catalog-content\") pod \"redhat-operators-fcdn7\" (UID: \"4818bf75-506a-4b39-bb9b-8067a02d4a51\") " pod="openshift-marketplace/redhat-operators-fcdn7"
Mar 08 03:48:09.477612 master-0 kubenswrapper[7547]: I0308 03:48:09.477555 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4818bf75-506a-4b39-bb9b-8067a02d4a51-catalog-content\") pod \"redhat-operators-fcdn7\" (UID: \"4818bf75-506a-4b39-bb9b-8067a02d4a51\") " pod="openshift-marketplace/redhat-operators-fcdn7"
Mar 08 03:48:09.477612 master-0 kubenswrapper[7547]: I0308 03:48:09.477617 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4818bf75-506a-4b39-bb9b-8067a02d4a51-utilities\") pod \"redhat-operators-fcdn7\" (UID: \"4818bf75-506a-4b39-bb9b-8067a02d4a51\") " pod="openshift-marketplace/redhat-operators-fcdn7"
Mar 08 03:48:09.478980 master-0 kubenswrapper[7547]: I0308 03:48:09.477644 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpkxw\" (UniqueName: \"kubernetes.io/projected/4818bf75-506a-4b39-bb9b-8067a02d4a51-kube-api-access-cpkxw\") pod \"redhat-operators-fcdn7\" (UID: \"4818bf75-506a-4b39-bb9b-8067a02d4a51\") " pod="openshift-marketplace/redhat-operators-fcdn7"
Mar 08 03:48:09.478980 master-0 kubenswrapper[7547]: I0308 03:48:09.478286 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4818bf75-506a-4b39-bb9b-8067a02d4a51-catalog-content\") pod \"redhat-operators-fcdn7\" (UID: \"4818bf75-506a-4b39-bb9b-8067a02d4a51\") " pod="openshift-marketplace/redhat-operators-fcdn7"
Mar 08 03:48:09.478980 master-0 kubenswrapper[7547]: I0308 03:48:09.478479 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4818bf75-506a-4b39-bb9b-8067a02d4a51-utilities\") pod \"redhat-operators-fcdn7\" (UID: \"4818bf75-506a-4b39-bb9b-8067a02d4a51\") " pod="openshift-marketplace/redhat-operators-fcdn7"
Mar 08 03:48:09.512177 master-0 kubenswrapper[7547]: I0308 03:48:09.502836 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpkxw\" (UniqueName: \"kubernetes.io/projected/4818bf75-506a-4b39-bb9b-8067a02d4a51-kube-api-access-cpkxw\") pod \"redhat-operators-fcdn7\" (UID: \"4818bf75-506a-4b39-bb9b-8067a02d4a51\") " pod="openshift-marketplace/redhat-operators-fcdn7"
Mar 08 03:48:09.595725 master-0 kubenswrapper[7547]: I0308 03:48:09.595645 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fcdn7"
Mar 08 03:48:10.001812 master-0 kubenswrapper[7547]: I0308 03:48:10.001753 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fcdn7"]
Mar 08 03:48:10.343279 master-0 kubenswrapper[7547]: I0308 03:48:10.343224 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6994fc9dc8-9w74l" event={"ID":"d6cf484f-7125-47ee-9e67-a064d044f43d","Type":"ContainerStarted","Data":"9e40d2d9dedbe70637d2521ba59ce8d2051ddfcf5801584decfddb1f76e27439"}
Mar 08 03:48:10.343572 master-0 kubenswrapper[7547]: I0308 03:48:10.343546 7547 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6994fc9dc8-9w74l"
Mar 08 03:48:10.344814 master-0 kubenswrapper[7547]: I0308 03:48:10.344787 7547 generic.go:334] "Generic (PLEG): container finished" podID="a4ff897a-ac47-45e0-aa7d-88c5aea50b70" containerID="f3a3a964f5dd60ec9c11603375242a49c3896a4ba1562bbbb0ef714e0b475500" exitCode=0
Mar 08 03:48:10.344977 master-0 kubenswrapper[7547]: I0308 03:48:10.344943 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w2lsp" event={"ID":"a4ff897a-ac47-45e0-aa7d-88c5aea50b70","Type":"ContainerDied","Data":"f3a3a964f5dd60ec9c11603375242a49c3896a4ba1562bbbb0ef714e0b475500"}
Mar 08 03:48:10.344977 master-0 kubenswrapper[7547]: I0308 03:48:10.344959 7547 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/installer-3-master-0" podUID="7bf40ef9-a79a-4f5d-933c-5276edcccb4b" containerName="installer" containerID="cri-o://20197cef49bb05fb75f2e7eda65c3e92dc7a4af95343b25ff91e78b1d42be6fb" gracePeriod=30
Mar 08 03:48:10.534298 master-0 kubenswrapper[7547]: I0308 03:48:10.534179 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-67b55db9c7-4qgpb"]
Mar 08 03:48:10.678977 master-0 kubenswrapper[7547]: I0308 03:48:10.671697 7547 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6994fc9dc8-9w74l" podStartSLOduration=5.975642339 podStartE2EDuration="9.671678809s" podCreationTimestamp="2026-03-08 03:48:01 +0000 UTC" firstStartedPulling="2026-03-08 03:48:06.128404535 +0000 UTC m=+49.074089048" lastFinishedPulling="2026-03-08 03:48:09.824441005 +0000 UTC m=+52.770125518" observedRunningTime="2026-03-08 03:48:10.668347301 +0000 UTC m=+53.614031814" watchObservedRunningTime="2026-03-08 03:48:10.671678809 +0000 UTC m=+53.617363322"
Mar 08 03:48:10.678977 master-0 kubenswrapper[7547]: I0308 03:48:10.674614 7547 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9ljn9"]
Mar 08 03:48:10.678977 master-0 kubenswrapper[7547]: I0308 03:48:10.675468 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9ljn9"
Mar 08 03:48:10.691675 master-0 kubenswrapper[7547]: I0308 03:48:10.691627 7547 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6994fc9dc8-9w74l"
Mar 08 03:48:10.704255 master-0 kubenswrapper[7547]: I0308 03:48:10.704023 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b8dcd07-f245-4783-ba40-521a14e96043-utilities\") pod \"certified-operators-9ljn9\" (UID: \"8b8dcd07-f245-4783-ba40-521a14e96043\") " pod="openshift-marketplace/certified-operators-9ljn9"
Mar 08 03:48:10.704381 master-0 kubenswrapper[7547]: I0308 03:48:10.704329 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9smb9\" (UniqueName: \"kubernetes.io/projected/8b8dcd07-f245-4783-ba40-521a14e96043-kube-api-access-9smb9\") pod \"certified-operators-9ljn9\" (UID: \"8b8dcd07-f245-4783-ba40-521a14e96043\") " pod="openshift-marketplace/certified-operators-9ljn9"
Mar 08 03:48:10.704417 master-0 kubenswrapper[7547]: I0308 03:48:10.704369 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b8dcd07-f245-4783-ba40-521a14e96043-catalog-content\") pod \"certified-operators-9ljn9\" (UID: \"8b8dcd07-f245-4783-ba40-521a14e96043\") " pod="openshift-marketplace/certified-operators-9ljn9"
Mar 08 03:48:10.805085 master-0 kubenswrapper[7547]: I0308 03:48:10.805022 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b8dcd07-f245-4783-ba40-521a14e96043-catalog-content\") pod \"certified-operators-9ljn9\" (UID: \"8b8dcd07-f245-4783-ba40-521a14e96043\") " pod="openshift-marketplace/certified-operators-9ljn9"
Mar 08 03:48:10.805085 master-0 kubenswrapper[7547]: I0308 03:48:10.805095 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b8dcd07-f245-4783-ba40-521a14e96043-utilities\") pod \"certified-operators-9ljn9\" (UID: \"8b8dcd07-f245-4783-ba40-521a14e96043\") " pod="openshift-marketplace/certified-operators-9ljn9"
Mar 08 03:48:10.805297 master-0 kubenswrapper[7547]: I0308 03:48:10.805153 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9smb9\" (UniqueName: \"kubernetes.io/projected/8b8dcd07-f245-4783-ba40-521a14e96043-kube-api-access-9smb9\") pod \"certified-operators-9ljn9\" (UID: \"8b8dcd07-f245-4783-ba40-521a14e96043\") " pod="openshift-marketplace/certified-operators-9ljn9"
Mar 08 03:48:10.805543 master-0 kubenswrapper[7547]: I0308 03:48:10.805508 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b8dcd07-f245-4783-ba40-521a14e96043-catalog-content\") pod \"certified-operators-9ljn9\" (UID: \"8b8dcd07-f245-4783-ba40-521a14e96043\") " pod="openshift-marketplace/certified-operators-9ljn9"
Mar 08 03:48:10.805625 master-0 kubenswrapper[7547]: I0308 03:48:10.805515 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b8dcd07-f245-4783-ba40-521a14e96043-utilities\") pod \"certified-operators-9ljn9\" (UID: \"8b8dcd07-f245-4783-ba40-521a14e96043\") " pod="openshift-marketplace/certified-operators-9ljn9"
Mar 08 03:48:10.906086 master-0 kubenswrapper[7547]: I0308 03:48:10.873662 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9ljn9"]
Mar 08 03:48:11.349723 master-0 kubenswrapper[7547]: I0308 03:48:11.349678 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9smb9\" (UniqueName: \"kubernetes.io/projected/8b8dcd07-f245-4783-ba40-521a14e96043-kube-api-access-9smb9\") pod \"certified-operators-9ljn9\" (UID: \"8b8dcd07-f245-4783-ba40-521a14e96043\") " pod="openshift-marketplace/certified-operators-9ljn9"
Mar 08 03:48:11.590123 master-0 kubenswrapper[7547]: I0308 03:48:11.590054 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9ljn9"
Mar 08 03:48:11.883774 master-0 kubenswrapper[7547]: I0308 03:48:11.883698 7547 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7695b9f8b5-4jpgl"
Mar 08 03:48:11.884475 master-0 kubenswrapper[7547]: I0308 03:48:11.884420 7547 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7695b9f8b5-4jpgl"
Mar 08 03:48:11.950116 master-0 kubenswrapper[7547]: W0308 03:48:11.950069 7547 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10c165f1_53b4_4e7e_9cd1_00bb4d9cbc79.slice/crio-cbd8c33fbce7b1c9cf78530bc91c2ad9c46d9601ea6ef0914dea487c85e63f0d WatchSource:0}: Error finding container cbd8c33fbce7b1c9cf78530bc91c2ad9c46d9601ea6ef0914dea487c85e63f0d: Status 404 returned error can't find the container with id cbd8c33fbce7b1c9cf78530bc91c2ad9c46d9601ea6ef0914dea487c85e63f0d
Mar 08 03:48:11.963646 master-0 kubenswrapper[7547]: W0308 03:48:11.963610 7547 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4818bf75_506a_4b39_bb9b_8067a02d4a51.slice/crio-e336c6a6f7f015b88a354964bc81c85ce7b4f460b1abcd96eda1c1313b7e0178 WatchSource:0}: Error finding container e336c6a6f7f015b88a354964bc81c85ce7b4f460b1abcd96eda1c1313b7e0178: Status 404 returned error can't find the container with id e336c6a6f7f015b88a354964bc81c85ce7b4f460b1abcd96eda1c1313b7e0178
Mar 08 03:48:12.390130 master-0 kubenswrapper[7547]: I0308 03:48:12.386347 7547 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-8h6fj"
Mar 08 03:48:12.427038 master-0 kubenswrapper[7547]: I0308 03:48:12.426974 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-67b55db9c7-4qgpb" event={"ID":"10c165f1-53b4-4e7e-9cd1-00bb4d9cbc79","Type":"ContainerStarted","Data":"cbd8c33fbce7b1c9cf78530bc91c2ad9c46d9601ea6ef0914dea487c85e63f0d"}
Mar 08 03:48:12.428981 master-0 kubenswrapper[7547]: I0308 03:48:12.428930 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fcdn7" event={"ID":"4818bf75-506a-4b39-bb9b-8067a02d4a51","Type":"ContainerStarted","Data":"e336c6a6f7f015b88a354964bc81c85ce7b4f460b1abcd96eda1c1313b7e0178"}
Mar 08 03:48:12.432882 master-0 kubenswrapper[7547]: I0308 03:48:12.430738 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-3-master-0_7bf40ef9-a79a-4f5d-933c-5276edcccb4b/installer/0.log"
Mar 08 03:48:12.432882 master-0 kubenswrapper[7547]: I0308 03:48:12.430796 7547 generic.go:334] "Generic (PLEG): container finished" podID="7bf40ef9-a79a-4f5d-933c-5276edcccb4b" containerID="20197cef49bb05fb75f2e7eda65c3e92dc7a4af95343b25ff91e78b1d42be6fb" exitCode=1
Mar 08 03:48:12.432882 master-0 kubenswrapper[7547]: I0308 03:48:12.430893 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-3-master-0" event={"ID":"7bf40ef9-a79a-4f5d-933c-5276edcccb4b","Type":"ContainerDied","Data":"20197cef49bb05fb75f2e7eda65c3e92dc7a4af95343b25ff91e78b1d42be6fb"}
Mar 08 03:48:12.601345 master-0 kubenswrapper[7547]: I0308 03:48:12.601281 7547 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7695b9f8b5-4jpgl"
Mar 08 03:48:12.614401 master-0 kubenswrapper[7547]: I0308 03:48:12.614327 7547 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-4-master-0"]
Mar 08 03:48:12.615055 master-0 kubenswrapper[7547]: I0308 03:48:12.615031 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-4-master-0"
Mar 08 03:48:12.669552 master-0 kubenswrapper[7547]: I0308 03:48:12.669268 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9fbe4302-8264-4b6c-ae3f-0c6d981bc998-kube-api-access\") pod \"installer-4-master-0\" (UID: \"9fbe4302-8264-4b6c-ae3f-0c6d981bc998\") " pod="openshift-kube-scheduler/installer-4-master-0"
Mar 08 03:48:12.669552 master-0 kubenswrapper[7547]: I0308 03:48:12.669325 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/9fbe4302-8264-4b6c-ae3f-0c6d981bc998-var-lock\") pod \"installer-4-master-0\" (UID: \"9fbe4302-8264-4b6c-ae3f-0c6d981bc998\") " pod="openshift-kube-scheduler/installer-4-master-0"
Mar 08 03:48:12.669552 master-0 kubenswrapper[7547]: I0308 03:48:12.669365 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9fbe4302-8264-4b6c-ae3f-0c6d981bc998-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"9fbe4302-8264-4b6c-ae3f-0c6d981bc998\") " pod="openshift-kube-scheduler/installer-4-master-0"
Mar 08 03:48:12.770643 master-0 kubenswrapper[7547]: I0308 03:48:12.770573 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9fbe4302-8264-4b6c-ae3f-0c6d981bc998-kube-api-access\") pod \"installer-4-master-0\" (UID: \"9fbe4302-8264-4b6c-ae3f-0c6d981bc998\") " pod="openshift-kube-scheduler/installer-4-master-0"
Mar 08 03:48:12.770867 master-0 kubenswrapper[7547]: I0308 03:48:12.770733 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/9fbe4302-8264-4b6c-ae3f-0c6d981bc998-var-lock\") pod \"installer-4-master-0\" (UID: \"9fbe4302-8264-4b6c-ae3f-0c6d981bc998\") " pod="openshift-kube-scheduler/installer-4-master-0"
Mar 08 03:48:12.770867 master-0 kubenswrapper[7547]: I0308 03:48:12.770759 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9fbe4302-8264-4b6c-ae3f-0c6d981bc998-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"9fbe4302-8264-4b6c-ae3f-0c6d981bc998\") " pod="openshift-kube-scheduler/installer-4-master-0"
Mar 08 03:48:12.770867 master-0 kubenswrapper[7547]: I0308 03:48:12.770858 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9fbe4302-8264-4b6c-ae3f-0c6d981bc998-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"9fbe4302-8264-4b6c-ae3f-0c6d981bc998\") " pod="openshift-kube-scheduler/installer-4-master-0"
Mar 08 03:48:12.771068 master-0 kubenswrapper[7547]: I0308 03:48:12.770897 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/9fbe4302-8264-4b6c-ae3f-0c6d981bc998-var-lock\") pod \"installer-4-master-0\" (UID: \"9fbe4302-8264-4b6c-ae3f-0c6d981bc998\") " pod="openshift-kube-scheduler/installer-4-master-0"
Mar 08 03:48:12.881441 master-0 kubenswrapper[7547]: I0308 03:48:12.877910 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-4-master-0"]
Mar 08 03:48:12.885721 master-0 kubenswrapper[7547]: I0308 03:48:12.885675 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9fbe4302-8264-4b6c-ae3f-0c6d981bc998-kube-api-access\") pod \"installer-4-master-0\" (UID: \"9fbe4302-8264-4b6c-ae3f-0c6d981bc998\") " pod="openshift-kube-scheduler/installer-4-master-0"
Mar 08 03:48:12.927415 master-0 kubenswrapper[7547]: I0308 03:48:12.927295 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-4-master-0"
Mar 08 03:48:13.436573 master-0 kubenswrapper[7547]: I0308 03:48:13.436535 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-3-master-0_7bf40ef9-a79a-4f5d-933c-5276edcccb4b/installer/0.log"
Mar 08 03:48:13.437557 master-0 kubenswrapper[7547]: I0308 03:48:13.437526 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-3-master-0" event={"ID":"7bf40ef9-a79a-4f5d-933c-5276edcccb4b","Type":"ContainerDied","Data":"6174c0cced28744679d07cd6bfda1e5016fe917384d58e904dd1b71ae6c4d184"}
Mar 08 03:48:13.437557 master-0 kubenswrapper[7547]: I0308 03:48:13.437553 7547 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6174c0cced28744679d07cd6bfda1e5016fe917384d58e904dd1b71ae6c4d184"
Mar 08 03:48:13.441162 master-0 kubenswrapper[7547]: I0308 03:48:13.441123 7547 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7695b9f8b5-4jpgl"
Mar 08 03:48:13.464227 master-0 kubenswrapper[7547]: I0308 03:48:13.464184 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-3-master-0_7bf40ef9-a79a-4f5d-933c-5276edcccb4b/installer/0.log"
Mar 08 03:48:13.464380 master-0 kubenswrapper[7547]: I0308 03:48:13.464264 7547 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-3-master-0"
Mar 08 03:48:13.498577 master-0 kubenswrapper[7547]: I0308 03:48:13.497286 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7bf40ef9-a79a-4f5d-933c-5276edcccb4b-kubelet-dir\") pod \"7bf40ef9-a79a-4f5d-933c-5276edcccb4b\" (UID: \"7bf40ef9-a79a-4f5d-933c-5276edcccb4b\") "
Mar 08 03:48:13.498577 master-0 kubenswrapper[7547]: I0308 03:48:13.497356 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7bf40ef9-a79a-4f5d-933c-5276edcccb4b-var-lock\") pod \"7bf40ef9-a79a-4f5d-933c-5276edcccb4b\" (UID: \"7bf40ef9-a79a-4f5d-933c-5276edcccb4b\") "
Mar 08 03:48:13.498577 master-0 kubenswrapper[7547]: I0308 03:48:13.497391 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7bf40ef9-a79a-4f5d-933c-5276edcccb4b-kube-api-access\") pod \"7bf40ef9-a79a-4f5d-933c-5276edcccb4b\" (UID: \"7bf40ef9-a79a-4f5d-933c-5276edcccb4b\") "
Mar 08 03:48:13.498577 master-0 kubenswrapper[7547]: I0308 03:48:13.497909 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7bf40ef9-a79a-4f5d-933c-5276edcccb4b-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "7bf40ef9-a79a-4f5d-933c-5276edcccb4b" (UID: "7bf40ef9-a79a-4f5d-933c-5276edcccb4b"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 03:48:13.498577 master-0 kubenswrapper[7547]: I0308 03:48:13.497932 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7bf40ef9-a79a-4f5d-933c-5276edcccb4b-var-lock" (OuterVolumeSpecName: "var-lock") pod "7bf40ef9-a79a-4f5d-933c-5276edcccb4b" (UID: "7bf40ef9-a79a-4f5d-933c-5276edcccb4b"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 03:48:13.526398 master-0 kubenswrapper[7547]: I0308 03:48:13.513799 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bf40ef9-a79a-4f5d-933c-5276edcccb4b-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "7bf40ef9-a79a-4f5d-933c-5276edcccb4b" (UID: "7bf40ef9-a79a-4f5d-933c-5276edcccb4b"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 03:48:13.598053 master-0 kubenswrapper[7547]: I0308 03:48:13.598018 7547 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7bf40ef9-a79a-4f5d-933c-5276edcccb4b-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 08 03:48:13.598255 master-0 kubenswrapper[7547]: I0308 03:48:13.598241 7547 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7bf40ef9-a79a-4f5d-933c-5276edcccb4b-kube-api-access\") on node \"master-0\" DevicePath \"\""
Mar 08 03:48:13.598314 master-0 kubenswrapper[7547]: I0308 03:48:13.598304 7547 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7bf40ef9-a79a-4f5d-933c-5276edcccb4b-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Mar 08 03:48:14.416614 master-0 kubenswrapper[7547]: I0308 03:48:14.415580 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-4-master-0"]
Mar 08 03:48:14.420861 master-0 kubenswrapper[7547]: I0308 03:48:14.417949 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9ljn9"]
Mar 08 03:48:14.442612 master-0 kubenswrapper[7547]: I0308 03:48:14.442564 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fcdn7" event={"ID":"4818bf75-506a-4b39-bb9b-8067a02d4a51","Type":"ContainerStarted","Data":"7d24a93a94248357a5a5f5d3f147fa35c7c76b7e48473be4b7ed880f3c7c6c1a"}
Mar 08 03:48:14.446042 master-0 kubenswrapper[7547]: I0308 03:48:14.445980 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-67b55db9c7-4qgpb" event={"ID":"10c165f1-53b4-4e7e-9cd1-00bb4d9cbc79","Type":"ContainerStarted","Data":"2b4acaaf56ed8bdc6cdd7d0259d748d16cdb49497bc1b41e30ed64b78e49092c"}
Mar 08 03:48:14.446042 master-0 kubenswrapper[7547]: I0308 03:48:14.446018 7547 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-3-master-0"
Mar 08 03:48:14.446042 master-0 kubenswrapper[7547]: I0308 03:48:14.446049 7547 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-67b55db9c7-4qgpb"
Mar 08 03:48:14.734859 master-0 kubenswrapper[7547]: I0308 03:48:14.731104 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-7c6989d6c4-slm72_a60bc804-52e7-422a-87fd-ac4c5aa90cb3/authentication-operator/0.log"
Mar 08 03:48:14.783704 master-0 kubenswrapper[7547]: I0308 03:48:14.783268 7547 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-1-master-0"]
Mar 08 03:48:14.783704 master-0 kubenswrapper[7547]: E0308 03:48:14.783504 7547 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bf40ef9-a79a-4f5d-933c-5276edcccb4b" containerName="installer"
Mar 08 03:48:14.783704 master-0 kubenswrapper[7547]: I0308 03:48:14.783514 7547 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bf40ef9-a79a-4f5d-933c-5276edcccb4b" containerName="installer"
Mar 08 03:48:14.783704 master-0 kubenswrapper[7547]: I0308 03:48:14.783600 7547 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bf40ef9-a79a-4f5d-933c-5276edcccb4b" containerName="installer"
Mar 08 03:48:14.784049 master-0 kubenswrapper[7547]: I0308 03:48:14.783951 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0"
Mar 08 03:48:14.784670 master-0 kubenswrapper[7547]: I0308 03:48:14.784604 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-oauth-apiserver_apiserver-7695b9f8b5-4jpgl_76ba45a2-8945-4afe-b913-126c26725867/fix-audit-permissions/0.log"
Mar 08 03:48:14.798928 master-0 kubenswrapper[7547]: I0308 03:48:14.798423 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Mar 08 03:48:14.818558 master-0 kubenswrapper[7547]: I0308 03:48:14.817663 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-oauth-apiserver_apiserver-7695b9f8b5-4jpgl_76ba45a2-8945-4afe-b913-126c26725867/oauth-apiserver/0.log"
Mar 08 03:48:14.818558 master-0 kubenswrapper[7547]: I0308 03:48:14.818194 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-1-master-0"]
Mar 08 03:48:14.894850 master-0 kubenswrapper[7547]: I0308 03:48:14.886860 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-5884b9cd56-vzms7_5a7752f9-7b9a-451f-997a-e9f696d38b34/etcd-operator/0.log"
Mar 08 03:48:14.922634 master-0 kubenswrapper[7547]: I0308 03:48:14.922428 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/baab6171-046d-4fc9-b7d7-ff2fd12f185f-var-lock\") pod \"installer-1-master-0\" (UID: \"baab6171-046d-4fc9-b7d7-ff2fd12f185f\") " pod="openshift-kube-apiserver/installer-1-master-0"
Mar 08 03:48:14.922634 master-0 kubenswrapper[7547]: I0308 03:48:14.922493 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/baab6171-046d-4fc9-b7d7-ff2fd12f185f-kube-api-access\") pod \"installer-1-master-0\" (UID: \"baab6171-046d-4fc9-b7d7-ff2fd12f185f\") " pod="openshift-kube-apiserver/installer-1-master-0"
Mar 08 03:48:14.922634 master-0 kubenswrapper[7547]: I0308 03:48:14.922536 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/baab6171-046d-4fc9-b7d7-ff2fd12f185f-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"baab6171-046d-4fc9-b7d7-ff2fd12f185f\") " pod="openshift-kube-apiserver/installer-1-master-0"
Mar 08 03:48:14.933072 master-0 kubenswrapper[7547]: I0308 03:48:14.932966 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0-master-0_354f29997baa583b6238f7de9108ee10/etcdctl/0.log"
Mar 08 03:48:14.973662 master-0 kubenswrapper[7547]: I0308 03:48:14.973504 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0-master-0_354f29997baa583b6238f7de9108ee10/etcd/0.log"
Mar 08 03:48:14.973662 master-0 kubenswrapper[7547]: I0308 03:48:14.973618 7547 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-67b55db9c7-4qgpb"
Mar 08 03:48:14.997063 master-0 kubenswrapper[7547]: I0308 03:48:14.996934 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_installer-1-master-0_9b02b2c9-9ee3-436c-aa36-0c23e5fdf07b/installer/0.log"
Mar 08 03:48:15.013963 master-0 kubenswrapper[7547]: I0308 03:48:15.013913 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver-operator_kube-apiserver-operator-68bd585b-8gfmf_1cbcb403-a424-4496-8c5c-5eb5e42dfb93/kube-apiserver-operator/0.log"
Mar 08 03:48:15.022038 master-0 kubenswrapper[7547]: I0308 03:48:15.021999 7547 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"]
Mar 08 03:48:15.030215 master-0 kubenswrapper[7547]: I0308 03:48:15.027132 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/baab6171-046d-4fc9-b7d7-ff2fd12f185f-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"baab6171-046d-4fc9-b7d7-ff2fd12f185f\") " pod="openshift-kube-apiserver/installer-1-master-0"
Mar 08 03:48:15.030215 master-0 kubenswrapper[7547]: I0308 03:48:15.027242 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/baab6171-046d-4fc9-b7d7-ff2fd12f185f-var-lock\") pod \"installer-1-master-0\" (UID: \"baab6171-046d-4fc9-b7d7-ff2fd12f185f\") " pod="openshift-kube-apiserver/installer-1-master-0"
Mar 08 03:48:15.030215 master-0 kubenswrapper[7547]: I0308 03:48:15.027306 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/baab6171-046d-4fc9-b7d7-ff2fd12f185f-var-lock\") pod \"installer-1-master-0\" (UID: \"baab6171-046d-4fc9-b7d7-ff2fd12f185f\") " pod="openshift-kube-apiserver/installer-1-master-0"
Mar 08 03:48:15.030215 master-0 kubenswrapper[7547]: I0308 03:48:15.027318 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/baab6171-046d-4fc9-b7d7-ff2fd12f185f-kube-api-access\") pod \"installer-1-master-0\" (UID: \"baab6171-046d-4fc9-b7d7-ff2fd12f185f\") " pod="openshift-kube-apiserver/installer-1-master-0"
Mar 08 03:48:15.030215 master-0 kubenswrapper[7547]: I0308 03:48:15.027267 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/baab6171-046d-4fc9-b7d7-ff2fd12f185f-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"baab6171-046d-4fc9-b7d7-ff2fd12f185f\") " pod="openshift-kube-apiserver/installer-1-master-0"
Mar 08 03:48:15.060483 master-0 kubenswrapper[7547]: I0308 03:48:15.060408 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/baab6171-046d-4fc9-b7d7-ff2fd12f185f-kube-api-access\") pod \"installer-1-master-0\" (UID: \"baab6171-046d-4fc9-b7d7-ff2fd12f185f\") " pod="openshift-kube-apiserver/installer-1-master-0"
Mar 08 03:48:15.060631 master-0 kubenswrapper[7547]: I0308 03:48:15.060423 7547 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"]
Mar 08 03:48:15.066686 master-0 kubenswrapper[7547]: I0308 03:48:15.066247 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_bootstrap-kube-apiserver-master-0_5f77c8e18b751d90bc0dfe2d4e304050/setup/0.log"
Mar 08 03:48:15.075886 master-0 kubenswrapper[7547]: I0308 03:48:15.075839 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_bootstrap-kube-apiserver-master-0_5f77c8e18b751d90bc0dfe2d4e304050/kube-apiserver/0.log"
Mar 08 03:48:15.085881 master-0 kubenswrapper[7547]: I0308 03:48:15.083598 7547 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-67b55db9c7-4qgpb" podStartSLOduration=7.083554088 podStartE2EDuration="7.083554088s" podCreationTimestamp="2026-03-08 03:48:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:48:15.082144594 +0000 UTC m=+58.027829107" watchObservedRunningTime="2026-03-08 03:48:15.083554088 +0000 UTC m=+58.029238601"
Mar 08 03:48:15.095983 master-0 kubenswrapper[7547]: I0308 03:48:15.093577 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_bootstrap-kube-apiserver-master-0_5f77c8e18b751d90bc0dfe2d4e304050/kube-apiserver-insecure-readyz/0.log"
Mar 08 03:48:15.136954 master-0 kubenswrapper[7547]: I0308 03:48:15.136881 7547 log.go:25] "Finished
parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-1-master-0_0f62034a-dae9-46af-8c14-006b728b631f/installer/0.log" Mar 08 03:48:15.150730 master-0 kubenswrapper[7547]: I0308 03:48:15.149793 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-86d7cdfdfb-chpl6_26180f77-0b1a-4d0f-9ed0-a12fdee69817/kube-controller-manager-operator/0.log" Mar 08 03:48:15.183884 master-0 kubenswrapper[7547]: I0308 03:48:15.183852 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0" Mar 08 03:48:15.240445 master-0 kubenswrapper[7547]: I0308 03:48:15.240296 7547 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bf40ef9-a79a-4f5d-933c-5276edcccb4b" path="/var/lib/kubelet/pods/7bf40ef9-a79a-4f5d-933c-5276edcccb4b/volumes" Mar 08 03:48:15.253447 master-0 kubenswrapper[7547]: I0308 03:48:15.253415 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-1-master-0_70c6db8e-1612-4da7-84ad-0750258e310e/installer/0.log" Mar 08 03:48:15.253519 master-0 kubenswrapper[7547]: I0308 03:48:15.253481 7547 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-1-master-0" Mar 08 03:48:15.374105 master-0 kubenswrapper[7547]: I0308 03:48:15.374058 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_bootstrap-kube-controller-manager-master-0_f78c05e1499b533b83f091333d61f045/kube-controller-manager/0.log" Mar 08 03:48:15.432734 master-0 kubenswrapper[7547]: I0308 03:48:15.432694 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/70c6db8e-1612-4da7-84ad-0750258e310e-var-lock\") pod \"70c6db8e-1612-4da7-84ad-0750258e310e\" (UID: \"70c6db8e-1612-4da7-84ad-0750258e310e\") " Mar 08 03:48:15.433226 master-0 kubenswrapper[7547]: I0308 03:48:15.432745 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/70c6db8e-1612-4da7-84ad-0750258e310e-kubelet-dir\") pod \"70c6db8e-1612-4da7-84ad-0750258e310e\" (UID: \"70c6db8e-1612-4da7-84ad-0750258e310e\") " Mar 08 03:48:15.433226 master-0 kubenswrapper[7547]: I0308 03:48:15.432838 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/70c6db8e-1612-4da7-84ad-0750258e310e-kube-api-access\") pod \"70c6db8e-1612-4da7-84ad-0750258e310e\" (UID: \"70c6db8e-1612-4da7-84ad-0750258e310e\") " Mar 08 03:48:15.433636 master-0 kubenswrapper[7547]: I0308 03:48:15.433593 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/70c6db8e-1612-4da7-84ad-0750258e310e-var-lock" (OuterVolumeSpecName: "var-lock") pod "70c6db8e-1612-4da7-84ad-0750258e310e" (UID: "70c6db8e-1612-4da7-84ad-0750258e310e"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:48:15.433636 master-0 kubenswrapper[7547]: I0308 03:48:15.433628 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/70c6db8e-1612-4da7-84ad-0750258e310e-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "70c6db8e-1612-4da7-84ad-0750258e310e" (UID: "70c6db8e-1612-4da7-84ad-0750258e310e"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:48:15.436585 master-0 kubenswrapper[7547]: I0308 03:48:15.436534 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70c6db8e-1612-4da7-84ad-0750258e310e-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "70c6db8e-1612-4da7-84ad-0750258e310e" (UID: "70c6db8e-1612-4da7-84ad-0750258e310e"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:48:15.462061 master-0 kubenswrapper[7547]: I0308 03:48:15.461792 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-master-0" event={"ID":"9fbe4302-8264-4b6c-ae3f-0c6d981bc998","Type":"ContainerStarted","Data":"6c87b0d61a0be181b1c9aaa5d8b34b8c3ddfab52941ef9b130c9b50919085c6e"} Mar 08 03:48:15.462061 master-0 kubenswrapper[7547]: I0308 03:48:15.461848 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-master-0" event={"ID":"9fbe4302-8264-4b6c-ae3f-0c6d981bc998","Type":"ContainerStarted","Data":"c5d13c3d50d361afd8177f873e9bf70ee9d1066fdbbb27d8ba1706b4283439f3"} Mar 08 03:48:15.469148 master-0 kubenswrapper[7547]: I0308 03:48:15.469118 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-1-master-0_70c6db8e-1612-4da7-84ad-0750258e310e/installer/0.log" Mar 08 03:48:15.469266 master-0 kubenswrapper[7547]: I0308 03:48:15.469155 7547 generic.go:334] "Generic (PLEG): container finished" 
podID="70c6db8e-1612-4da7-84ad-0750258e310e" containerID="cef32873cbe3d6b97478c715fa10b9fc2b7f6472cbc949089347f2bceb34bb4a" exitCode=1 Mar 08 03:48:15.469266 master-0 kubenswrapper[7547]: I0308 03:48:15.469199 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-1-master-0" event={"ID":"70c6db8e-1612-4da7-84ad-0750258e310e","Type":"ContainerDied","Data":"cef32873cbe3d6b97478c715fa10b9fc2b7f6472cbc949089347f2bceb34bb4a"} Mar 08 03:48:15.469266 master-0 kubenswrapper[7547]: I0308 03:48:15.469218 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-1-master-0" event={"ID":"70c6db8e-1612-4da7-84ad-0750258e310e","Type":"ContainerDied","Data":"7d660df37cf1eac55715dfa6930009397ac7a30a1e9f2938482021ef45c4c112"} Mar 08 03:48:15.469266 master-0 kubenswrapper[7547]: I0308 03:48:15.469233 7547 scope.go:117] "RemoveContainer" containerID="cef32873cbe3d6b97478c715fa10b9fc2b7f6472cbc949089347f2bceb34bb4a" Mar 08 03:48:15.469386 master-0 kubenswrapper[7547]: I0308 03:48:15.469318 7547 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-1-master-0" Mar 08 03:48:15.492148 master-0 kubenswrapper[7547]: I0308 03:48:15.492107 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4pjsn" event={"ID":"7b485db9-29b5-45a1-a4fb-b4264c6bf2d6","Type":"ContainerStarted","Data":"b33caf5ea44d94ac8da239dcaa5776d88828b16dc2657eb04f31bad18c98def6"} Mar 08 03:48:15.496534 master-0 kubenswrapper[7547]: I0308 03:48:15.496389 7547 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-4-master-0" podStartSLOduration=3.496376755 podStartE2EDuration="3.496376755s" podCreationTimestamp="2026-03-08 03:48:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:48:15.495466984 +0000 UTC m=+58.441151497" watchObservedRunningTime="2026-03-08 03:48:15.496376755 +0000 UTC m=+58.442061268" Mar 08 03:48:15.502561 master-0 kubenswrapper[7547]: I0308 03:48:15.502466 7547 generic.go:334] "Generic (PLEG): container finished" podID="4818bf75-506a-4b39-bb9b-8067a02d4a51" containerID="7d24a93a94248357a5a5f5d3f147fa35c7c76b7e48473be4b7ed880f3c7c6c1a" exitCode=0 Mar 08 03:48:15.502561 master-0 kubenswrapper[7547]: I0308 03:48:15.502517 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fcdn7" event={"ID":"4818bf75-506a-4b39-bb9b-8067a02d4a51","Type":"ContainerDied","Data":"7d24a93a94248357a5a5f5d3f147fa35c7c76b7e48473be4b7ed880f3c7c6c1a"} Mar 08 03:48:15.515419 master-0 kubenswrapper[7547]: I0308 03:48:15.515377 7547 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Mar 08 03:48:15.528713 master-0 kubenswrapper[7547]: I0308 03:48:15.528411 7547 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Mar 08 03:48:15.534733 master-0 kubenswrapper[7547]: 
I0308 03:48:15.534160 7547 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/70c6db8e-1612-4da7-84ad-0750258e310e-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 08 03:48:15.534733 master-0 kubenswrapper[7547]: I0308 03:48:15.534183 7547 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/70c6db8e-1612-4da7-84ad-0750258e310e-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 03:48:15.534733 master-0 kubenswrapper[7547]: I0308 03:48:15.534193 7547 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/70c6db8e-1612-4da7-84ad-0750258e310e-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 08 03:48:15.537151 master-0 kubenswrapper[7547]: I0308 03:48:15.537073 7547 scope.go:117] "RemoveContainer" containerID="cef32873cbe3d6b97478c715fa10b9fc2b7f6472cbc949089347f2bceb34bb4a" Mar 08 03:48:15.544887 master-0 kubenswrapper[7547]: I0308 03:48:15.544837 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-597dfbc64c-f55p7" event={"ID":"792f503e-c34b-4c30-9c9e-70bdea2f2629","Type":"ContainerStarted","Data":"fd093a4b12530917708bd6b4190b1961b64cf67482f3b37bc37161fad3d593fd"} Mar 08 03:48:15.544983 master-0 kubenswrapper[7547]: E0308 03:48:15.544918 7547 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cef32873cbe3d6b97478c715fa10b9fc2b7f6472cbc949089347f2bceb34bb4a\": container with ID starting with cef32873cbe3d6b97478c715fa10b9fc2b7f6472cbc949089347f2bceb34bb4a not found: ID does not exist" containerID="cef32873cbe3d6b97478c715fa10b9fc2b7f6472cbc949089347f2bceb34bb4a" Mar 08 03:48:15.544983 master-0 kubenswrapper[7547]: I0308 03:48:15.544963 7547 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"cef32873cbe3d6b97478c715fa10b9fc2b7f6472cbc949089347f2bceb34bb4a"} err="failed to get container status \"cef32873cbe3d6b97478c715fa10b9fc2b7f6472cbc949089347f2bceb34bb4a\": rpc error: code = NotFound desc = could not find container \"cef32873cbe3d6b97478c715fa10b9fc2b7f6472cbc949089347f2bceb34bb4a\": container with ID starting with cef32873cbe3d6b97478c715fa10b9fc2b7f6472cbc949089347f2bceb34bb4a not found: ID does not exist" Mar 08 03:48:15.546597 master-0 kubenswrapper[7547]: I0308 03:48:15.545767 7547 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-597dfbc64c-f55p7" Mar 08 03:48:15.566144 master-0 kubenswrapper[7547]: I0308 03:48:15.566066 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_bootstrap-kube-controller-manager-master-0_f78c05e1499b533b83f091333d61f045/cluster-policy-controller/0.log" Mar 08 03:48:15.569047 master-0 kubenswrapper[7547]: I0308 03:48:15.569001 7547 generic.go:334] "Generic (PLEG): container finished" podID="8b8dcd07-f245-4783-ba40-521a14e96043" containerID="49114ad9d3b7bdc277da0ec77422d819922e98024efb0cc791aad1d20f5d05e5" exitCode=0 Mar 08 03:48:15.569871 master-0 kubenswrapper[7547]: I0308 03:48:15.569846 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9ljn9" event={"ID":"8b8dcd07-f245-4783-ba40-521a14e96043","Type":"ContainerDied","Data":"49114ad9d3b7bdc277da0ec77422d819922e98024efb0cc791aad1d20f5d05e5"} Mar 08 03:48:15.569924 master-0 kubenswrapper[7547]: I0308 03:48:15.569875 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9ljn9" event={"ID":"8b8dcd07-f245-4783-ba40-521a14e96043","Type":"ContainerStarted","Data":"bf04badf3d7dec96704972bbd5d79b0249130f03eebcc3996b0e1ce42c352e5e"} Mar 08 03:48:15.592929 master-0 kubenswrapper[7547]: I0308 03:48:15.592302 7547 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-597dfbc64c-f55p7" Mar 08 03:48:15.612340 master-0 kubenswrapper[7547]: I0308 03:48:15.612164 7547 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-597dfbc64c-f55p7" podStartSLOduration=5.938961804 podStartE2EDuration="14.612147973s" podCreationTimestamp="2026-03-08 03:48:01 +0000 UTC" firstStartedPulling="2026-03-08 03:48:06.231362811 +0000 UTC m=+49.177047324" lastFinishedPulling="2026-03-08 03:48:14.90454898 +0000 UTC m=+57.850233493" observedRunningTime="2026-03-08 03:48:15.585013514 +0000 UTC m=+58.530698027" watchObservedRunningTime="2026-03-08 03:48:15.612147973 +0000 UTC m=+58.557832476" Mar 08 03:48:15.684669 master-0 kubenswrapper[7547]: I0308 03:48:15.684572 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-1-master-0"] Mar 08 03:48:15.703794 master-0 kubenswrapper[7547]: W0308 03:48:15.703745 7547 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podbaab6171_046d_4fc9_b7d7_ff2fd12f185f.slice/crio-f2dd6f50bb704e85814a264a6b3647cea280a2063c980541df1082c59aa92b82 WatchSource:0}: Error finding container f2dd6f50bb704e85814a264a6b3647cea280a2063c980541df1082c59aa92b82: Status 404 returned error can't find the container with id f2dd6f50bb704e85814a264a6b3647cea280a2063c980541df1082c59aa92b82 Mar 08 03:48:15.750252 master-0 kubenswrapper[7547]: I0308 03:48:15.750229 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_bootstrap-kube-scheduler-master-0_a1a56802af72ce1aac6b5077f1695ac0/kube-scheduler/0.log" Mar 08 03:48:16.549922 master-0 kubenswrapper[7547]: I0308 03:48:16.549603 7547 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-scheduler-operator_openshift-kube-scheduler-operator-5c74bfc494-g6n58_e4541b7b-3f7f-4851-9bd9-26fcda5cab13/kube-scheduler-operator-container/0.log" Mar 08 03:48:16.569760 master-0 kubenswrapper[7547]: I0308 03:48:16.569230 7547 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"] Mar 08 03:48:16.569760 master-0 kubenswrapper[7547]: I0308 03:48:16.569427 7547 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/installer-1-master-0" podUID="0f62034a-dae9-46af-8c14-006b728b631f" containerName="installer" containerID="cri-o://d54a64defa77630a2cfba1757b5211284714af7095323e2acde0e62e40e90243" gracePeriod=30 Mar 08 03:48:16.602916 master-0 kubenswrapper[7547]: I0308 03:48:16.587478 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"baab6171-046d-4fc9-b7d7-ff2fd12f185f","Type":"ContainerStarted","Data":"5a92eed331c18522564f92e3e6e14d9dcb5be24514d5ff22fbf01a140de4cfee"} Mar 08 03:48:16.602916 master-0 kubenswrapper[7547]: I0308 03:48:16.587521 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"baab6171-046d-4fc9-b7d7-ff2fd12f185f","Type":"ContainerStarted","Data":"f2dd6f50bb704e85814a264a6b3647cea280a2063c980541df1082c59aa92b82"} Mar 08 03:48:16.617854 master-0 kubenswrapper[7547]: I0308 03:48:16.612200 7547 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-1-master-0" podStartSLOduration=2.612181397 podStartE2EDuration="2.612181397s" podCreationTimestamp="2026-03-08 03:48:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:48:16.612123355 +0000 UTC m=+59.557807868" watchObservedRunningTime="2026-03-08 03:48:16.612181397 +0000 UTC 
m=+59.557865910" Mar 08 03:48:16.619613 master-0 kubenswrapper[7547]: I0308 03:48:16.619464 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4pjsn" event={"ID":"7b485db9-29b5-45a1-a4fb-b4264c6bf2d6","Type":"ContainerStarted","Data":"35a34603fdd487a1f199c78f6e46379265a9432b5c6f10bd9d0175e7f68e2dea"} Mar 08 03:48:16.639099 master-0 kubenswrapper[7547]: I0308 03:48:16.639063 7547 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-4pjsn" Mar 08 03:48:16.646216 master-0 kubenswrapper[7547]: I0308 03:48:16.645566 7547 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-4pjsn" podStartSLOduration=3.080857797 podStartE2EDuration="9.645544303s" podCreationTimestamp="2026-03-08 03:48:07 +0000 UTC" firstStartedPulling="2026-03-08 03:48:08.352229265 +0000 UTC m=+51.297913768" lastFinishedPulling="2026-03-08 03:48:14.916915761 +0000 UTC m=+57.862600274" observedRunningTime="2026-03-08 03:48:16.641266963 +0000 UTC m=+59.586951476" watchObservedRunningTime="2026-03-08 03:48:16.645544303 +0000 UTC m=+59.591228816" Mar 08 03:48:16.749992 master-0 kubenswrapper[7547]: I0308 03:48:16.749877 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-7f8b8b6f4c-8h6fj_1b69fbf6-1ca5-413e-bffd-965730bcec1b/kube-rbac-proxy/0.log" Mar 08 03:48:16.954423 master-0 kubenswrapper[7547]: I0308 03:48:16.951361 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-7f8b8b6f4c-8h6fj_1b69fbf6-1ca5-413e-bffd-965730bcec1b/manager/0.log" Mar 08 03:48:17.152975 master-0 kubenswrapper[7547]: I0308 03:48:17.152806 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-6598bfb6c4-75qmb_2f59fe81-deee-4ced-ae9d-f17752c82c4b/manager/0.log" Mar 08 03:48:17.251030 master-0 kubenswrapper[7547]: I0308 
03:48:17.250960 7547 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70c6db8e-1612-4da7-84ad-0750258e310e" path="/var/lib/kubelet/pods/70c6db8e-1612-4da7-84ad-0750258e310e/volumes" Mar 08 03:48:17.366868 master-0 kubenswrapper[7547]: I0308 03:48:17.353702 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-6598bfb6c4-75qmb_2f59fe81-deee-4ced-ae9d-f17752c82c4b/kube-rbac-proxy/0.log" Mar 08 03:48:17.546236 master-0 kubenswrapper[7547]: I0308 03:48:17.546127 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-olm-operator_cluster-olm-operator-77899cf6d-x9h9q_7ff63c73-62a3-44b4-acd3-1b3df175794f/copy-catalogd-manifests/0.log" Mar 08 03:48:17.747563 master-0 kubenswrapper[7547]: I0308 03:48:17.747529 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-olm-operator_cluster-olm-operator-77899cf6d-x9h9q_7ff63c73-62a3-44b4-acd3-1b3df175794f/copy-operator-controller-manifests/0.log" Mar 08 03:48:17.952903 master-0 kubenswrapper[7547]: I0308 03:48:17.952734 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-olm-operator_cluster-olm-operator-77899cf6d-x9h9q_7ff63c73-62a3-44b4-acd3-1b3df175794f/cluster-olm-operator/0.log" Mar 08 03:48:18.070401 master-0 kubenswrapper[7547]: I0308 03:48:18.070358 7547 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fcdn7"] Mar 08 03:48:18.151774 master-0 kubenswrapper[7547]: I0308 03:48:18.147965 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver-operator_openshift-apiserver-operator-799b6db4d7-75682_30211469-7108-4820-a988-26fc4ced734e/openshift-apiserver-operator/0.log" Mar 08 03:48:18.348969 master-0 kubenswrapper[7547]: I0308 03:48:18.348163 7547 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-apiserver_apiserver-6b779d99b8-7kmck_c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f/fix-audit-permissions/0.log" Mar 08 03:48:18.473855 master-0 kubenswrapper[7547]: I0308 03:48:18.473792 7547 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rnz4w"] Mar 08 03:48:18.474023 master-0 kubenswrapper[7547]: E0308 03:48:18.473977 7547 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70c6db8e-1612-4da7-84ad-0750258e310e" containerName="installer" Mar 08 03:48:18.474023 master-0 kubenswrapper[7547]: I0308 03:48:18.473988 7547 state_mem.go:107] "Deleted CPUSet assignment" podUID="70c6db8e-1612-4da7-84ad-0750258e310e" containerName="installer" Mar 08 03:48:18.474239 master-0 kubenswrapper[7547]: I0308 03:48:18.474113 7547 memory_manager.go:354] "RemoveStaleState removing state" podUID="70c6db8e-1612-4da7-84ad-0750258e310e" containerName="installer" Mar 08 03:48:18.475248 master-0 kubenswrapper[7547]: I0308 03:48:18.474846 7547 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rnz4w" Mar 08 03:48:18.477005 master-0 kubenswrapper[7547]: I0308 03:48:18.476970 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-g8pwp" Mar 08 03:48:18.487350 master-0 kubenswrapper[7547]: I0308 03:48:18.487305 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rnz4w"] Mar 08 03:48:18.550240 master-0 kubenswrapper[7547]: I0308 03:48:18.550204 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver_apiserver-6b779d99b8-7kmck_c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f/openshift-apiserver/0.log" Mar 08 03:48:18.594057 master-0 kubenswrapper[7547]: I0308 03:48:18.594013 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tp98d\" (UniqueName: \"kubernetes.io/projected/6bee226a-2a66-4032-8aba-2c8b82abcb6a-kube-api-access-tp98d\") pod \"redhat-operators-rnz4w\" (UID: \"6bee226a-2a66-4032-8aba-2c8b82abcb6a\") " pod="openshift-marketplace/redhat-operators-rnz4w" Mar 08 03:48:18.594293 master-0 kubenswrapper[7547]: I0308 03:48:18.594121 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bee226a-2a66-4032-8aba-2c8b82abcb6a-utilities\") pod \"redhat-operators-rnz4w\" (UID: \"6bee226a-2a66-4032-8aba-2c8b82abcb6a\") " pod="openshift-marketplace/redhat-operators-rnz4w" Mar 08 03:48:18.594293 master-0 kubenswrapper[7547]: I0308 03:48:18.594181 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bee226a-2a66-4032-8aba-2c8b82abcb6a-catalog-content\") pod \"redhat-operators-rnz4w\" (UID: \"6bee226a-2a66-4032-8aba-2c8b82abcb6a\") " pod="openshift-marketplace/redhat-operators-rnz4w" Mar 08 03:48:18.695361 
master-0 kubenswrapper[7547]: I0308 03:48:18.695271 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bee226a-2a66-4032-8aba-2c8b82abcb6a-utilities\") pod \"redhat-operators-rnz4w\" (UID: \"6bee226a-2a66-4032-8aba-2c8b82abcb6a\") " pod="openshift-marketplace/redhat-operators-rnz4w" Mar 08 03:48:18.695517 master-0 kubenswrapper[7547]: I0308 03:48:18.695465 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bee226a-2a66-4032-8aba-2c8b82abcb6a-catalog-content\") pod \"redhat-operators-rnz4w\" (UID: \"6bee226a-2a66-4032-8aba-2c8b82abcb6a\") " pod="openshift-marketplace/redhat-operators-rnz4w" Mar 08 03:48:18.695552 master-0 kubenswrapper[7547]: I0308 03:48:18.695514 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tp98d\" (UniqueName: \"kubernetes.io/projected/6bee226a-2a66-4032-8aba-2c8b82abcb6a-kube-api-access-tp98d\") pod \"redhat-operators-rnz4w\" (UID: \"6bee226a-2a66-4032-8aba-2c8b82abcb6a\") " pod="openshift-marketplace/redhat-operators-rnz4w" Mar 08 03:48:18.695751 master-0 kubenswrapper[7547]: I0308 03:48:18.695721 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bee226a-2a66-4032-8aba-2c8b82abcb6a-utilities\") pod \"redhat-operators-rnz4w\" (UID: \"6bee226a-2a66-4032-8aba-2c8b82abcb6a\") " pod="openshift-marketplace/redhat-operators-rnz4w" Mar 08 03:48:18.696054 master-0 kubenswrapper[7547]: I0308 03:48:18.696022 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bee226a-2a66-4032-8aba-2c8b82abcb6a-catalog-content\") pod \"redhat-operators-rnz4w\" (UID: \"6bee226a-2a66-4032-8aba-2c8b82abcb6a\") " pod="openshift-marketplace/redhat-operators-rnz4w" Mar 08 03:48:18.711211 master-0 
kubenswrapper[7547]: I0308 03:48:18.711180 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tp98d\" (UniqueName: \"kubernetes.io/projected/6bee226a-2a66-4032-8aba-2c8b82abcb6a-kube-api-access-tp98d\") pod \"redhat-operators-rnz4w\" (UID: \"6bee226a-2a66-4032-8aba-2c8b82abcb6a\") " pod="openshift-marketplace/redhat-operators-rnz4w" Mar 08 03:48:18.747560 master-0 kubenswrapper[7547]: I0308 03:48:18.747455 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver_apiserver-6b779d99b8-7kmck_c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f/openshift-apiserver-check-endpoints/0.log" Mar 08 03:48:18.793429 master-0 kubenswrapper[7547]: I0308 03:48:18.793378 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rnz4w" Mar 08 03:48:18.952895 master-0 kubenswrapper[7547]: I0308 03:48:18.952787 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-5884b9cd56-vzms7_5a7752f9-7b9a-451f-997a-e9f696d38b34/etcd-operator/0.log" Mar 08 03:48:19.153623 master-0 kubenswrapper[7547]: I0308 03:48:19.153570 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-8565d84698-kt66j_0ebf1330-e044-4ff5-8b48-2d667e0c5625/openshift-controller-manager-operator/0.log" Mar 08 03:48:19.159644 master-0 kubenswrapper[7547]: I0308 03:48:19.159612 7547 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-2-master-0"] Mar 08 03:48:19.160347 master-0 kubenswrapper[7547]: I0308 03:48:19.160332 7547 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 08 03:48:19.162685 master-0 kubenswrapper[7547]: I0308 03:48:19.162648 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-7rcx9"
Mar 08 03:48:19.176926 master-0 kubenswrapper[7547]: I0308 03:48:19.174466 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-2-master-0"]
Mar 08 03:48:19.267070 master-0 kubenswrapper[7547]: I0308 03:48:19.266759 7547 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9ljn9"]
Mar 08 03:48:19.302562 master-0 kubenswrapper[7547]: I0308 03:48:19.302512 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/47643289-ac4b-425d-8ea1-913b6ca39ee0-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"47643289-ac4b-425d-8ea1-913b6ca39ee0\") " pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 08 03:48:19.302796 master-0 kubenswrapper[7547]: I0308 03:48:19.302651 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/47643289-ac4b-425d-8ea1-913b6ca39ee0-var-lock\") pod \"installer-2-master-0\" (UID: \"47643289-ac4b-425d-8ea1-913b6ca39ee0\") " pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 08 03:48:19.302796 master-0 kubenswrapper[7547]: I0308 03:48:19.302740 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/47643289-ac4b-425d-8ea1-913b6ca39ee0-kube-api-access\") pod \"installer-2-master-0\" (UID: \"47643289-ac4b-425d-8ea1-913b6ca39ee0\") " pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 08 03:48:19.348195 master-0 kubenswrapper[7547]: I0308 03:48:19.348142 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager_controller-manager-597dfbc64c-f55p7_792f503e-c34b-4c30-9c9e-70bdea2f2629/controller-manager/0.log"
Mar 08 03:48:19.413352 master-0 kubenswrapper[7547]: I0308 03:48:19.413306 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/47643289-ac4b-425d-8ea1-913b6ca39ee0-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"47643289-ac4b-425d-8ea1-913b6ca39ee0\") " pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 08 03:48:19.421445 master-0 kubenswrapper[7547]: I0308 03:48:19.413048 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/47643289-ac4b-425d-8ea1-913b6ca39ee0-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"47643289-ac4b-425d-8ea1-913b6ca39ee0\") " pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 08 03:48:19.421917 master-0 kubenswrapper[7547]: I0308 03:48:19.421900 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/47643289-ac4b-425d-8ea1-913b6ca39ee0-var-lock\") pod \"installer-2-master-0\" (UID: \"47643289-ac4b-425d-8ea1-913b6ca39ee0\") " pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 08 03:48:19.422156 master-0 kubenswrapper[7547]: I0308 03:48:19.422003 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/47643289-ac4b-425d-8ea1-913b6ca39ee0-kube-api-access\") pod \"installer-2-master-0\" (UID: \"47643289-ac4b-425d-8ea1-913b6ca39ee0\") " pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 08 03:48:19.422202 master-0 kubenswrapper[7547]: I0308 03:48:19.422135 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/47643289-ac4b-425d-8ea1-913b6ca39ee0-var-lock\") pod \"installer-2-master-0\" (UID: \"47643289-ac4b-425d-8ea1-913b6ca39ee0\") " pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 08 03:48:19.438594 master-0 kubenswrapper[7547]: I0308 03:48:19.438557 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/47643289-ac4b-425d-8ea1-913b6ca39ee0-kube-api-access\") pod \"installer-2-master-0\" (UID: \"47643289-ac4b-425d-8ea1-913b6ca39ee0\") " pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 08 03:48:19.483389 master-0 kubenswrapper[7547]: I0308 03:48:19.483329 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 08 03:48:19.572911 master-0 kubenswrapper[7547]: I0308 03:48:19.568637 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-6994fc9dc8-9w74l_d6cf484f-7125-47ee-9e67-a064d044f43d/route-controller-manager/0.log"
Mar 08 03:48:19.674078 master-0 kubenswrapper[7547]: I0308 03:48:19.674017 7547 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-p8nq8"]
Mar 08 03:48:19.674851 master-0 kubenswrapper[7547]: I0308 03:48:19.674809 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p8nq8"
Mar 08 03:48:19.677256 master-0 kubenswrapper[7547]: I0308 03:48:19.677218 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-7dkp2"
Mar 08 03:48:19.687161 master-0 kubenswrapper[7547]: I0308 03:48:19.687099 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p8nq8"]
Mar 08 03:48:19.829653 master-0 kubenswrapper[7547]: I0308 03:48:19.829554 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0031e3a9-b253-4dda-a890-bf3e4d8737e8-utilities\") pod \"certified-operators-p8nq8\" (UID: \"0031e3a9-b253-4dda-a890-bf3e4d8737e8\") " pod="openshift-marketplace/certified-operators-p8nq8"
Mar 08 03:48:19.829653 master-0 kubenswrapper[7547]: I0308 03:48:19.829607 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0031e3a9-b253-4dda-a890-bf3e4d8737e8-catalog-content\") pod \"certified-operators-p8nq8\" (UID: \"0031e3a9-b253-4dda-a890-bf3e4d8737e8\") " pod="openshift-marketplace/certified-operators-p8nq8"
Mar 08 03:48:19.829653 master-0 kubenswrapper[7547]: I0308 03:48:19.829656 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhms8\" (UniqueName: \"kubernetes.io/projected/0031e3a9-b253-4dda-a890-bf3e4d8737e8-kube-api-access-qhms8\") pod \"certified-operators-p8nq8\" (UID: \"0031e3a9-b253-4dda-a890-bf3e4d8737e8\") " pod="openshift-marketplace/certified-operators-p8nq8"
Mar 08 03:48:19.936275 master-0 kubenswrapper[7547]: I0308 03:48:19.930745 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhms8\" (UniqueName: \"kubernetes.io/projected/0031e3a9-b253-4dda-a890-bf3e4d8737e8-kube-api-access-qhms8\") pod \"certified-operators-p8nq8\" (UID: \"0031e3a9-b253-4dda-a890-bf3e4d8737e8\") " pod="openshift-marketplace/certified-operators-p8nq8"
Mar 08 03:48:19.936275 master-0 kubenswrapper[7547]: I0308 03:48:19.932808 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0031e3a9-b253-4dda-a890-bf3e4d8737e8-utilities\") pod \"certified-operators-p8nq8\" (UID: \"0031e3a9-b253-4dda-a890-bf3e4d8737e8\") " pod="openshift-marketplace/certified-operators-p8nq8"
Mar 08 03:48:19.936275 master-0 kubenswrapper[7547]: I0308 03:48:19.932865 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0031e3a9-b253-4dda-a890-bf3e4d8737e8-catalog-content\") pod \"certified-operators-p8nq8\" (UID: \"0031e3a9-b253-4dda-a890-bf3e4d8737e8\") " pod="openshift-marketplace/certified-operators-p8nq8"
Mar 08 03:48:19.936275 master-0 kubenswrapper[7547]: I0308 03:48:19.933289 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0031e3a9-b253-4dda-a890-bf3e4d8737e8-catalog-content\") pod \"certified-operators-p8nq8\" (UID: \"0031e3a9-b253-4dda-a890-bf3e4d8737e8\") " pod="openshift-marketplace/certified-operators-p8nq8"
Mar 08 03:48:19.936275 master-0 kubenswrapper[7547]: I0308 03:48:19.933787 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0031e3a9-b253-4dda-a890-bf3e4d8737e8-utilities\") pod \"certified-operators-p8nq8\" (UID: \"0031e3a9-b253-4dda-a890-bf3e4d8737e8\") " pod="openshift-marketplace/certified-operators-p8nq8"
Mar 08 03:48:19.952003 master-0 kubenswrapper[7547]: I0308 03:48:19.951383 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhms8\" (UniqueName: \"kubernetes.io/projected/0031e3a9-b253-4dda-a890-bf3e4d8737e8-kube-api-access-qhms8\") pod \"certified-operators-p8nq8\" (UID: \"0031e3a9-b253-4dda-a890-bf3e4d8737e8\") " pod="openshift-marketplace/certified-operators-p8nq8"
Mar 08 03:48:20.043654 master-0 kubenswrapper[7547]: I0308 03:48:20.043612 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p8nq8"
Mar 08 03:48:20.870217 master-0 kubenswrapper[7547]: I0308 03:48:20.869959 7547 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-c88pb"]
Mar 08 03:48:21.276758 master-0 kubenswrapper[7547]: I0308 03:48:21.276483 7547 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lwt58"]
Mar 08 03:48:21.280615 master-0 kubenswrapper[7547]: I0308 03:48:21.278045 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lwt58"
Mar 08 03:48:21.282058 master-0 kubenswrapper[7547]: I0308 03:48:21.282030 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-7cjnr"
Mar 08 03:48:21.290094 master-0 kubenswrapper[7547]: I0308 03:48:21.290041 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lwt58"]
Mar 08 03:48:21.356366 master-0 kubenswrapper[7547]: I0308 03:48:21.356270 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfgc6\" (UniqueName: \"kubernetes.io/projected/2b3d1dc7-22f9-4c0c-802a-d7314894b255-kube-api-access-zfgc6\") pod \"community-operators-lwt58\" (UID: \"2b3d1dc7-22f9-4c0c-802a-d7314894b255\") " pod="openshift-marketplace/community-operators-lwt58"
Mar 08 03:48:21.356366 master-0 kubenswrapper[7547]: I0308 03:48:21.356359 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b3d1dc7-22f9-4c0c-802a-d7314894b255-catalog-content\") pod \"community-operators-lwt58\" (UID: \"2b3d1dc7-22f9-4c0c-802a-d7314894b255\") " pod="openshift-marketplace/community-operators-lwt58"
Mar 08 03:48:21.356804 master-0 kubenswrapper[7547]: I0308 03:48:21.356438 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b3d1dc7-22f9-4c0c-802a-d7314894b255-utilities\") pod \"community-operators-lwt58\" (UID: \"2b3d1dc7-22f9-4c0c-802a-d7314894b255\") " pod="openshift-marketplace/community-operators-lwt58"
Mar 08 03:48:21.457241 master-0 kubenswrapper[7547]: I0308 03:48:21.457105 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b3d1dc7-22f9-4c0c-802a-d7314894b255-utilities\") pod \"community-operators-lwt58\" (UID: \"2b3d1dc7-22f9-4c0c-802a-d7314894b255\") " pod="openshift-marketplace/community-operators-lwt58"
Mar 08 03:48:21.458211 master-0 kubenswrapper[7547]: I0308 03:48:21.457901 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfgc6\" (UniqueName: \"kubernetes.io/projected/2b3d1dc7-22f9-4c0c-802a-d7314894b255-kube-api-access-zfgc6\") pod \"community-operators-lwt58\" (UID: \"2b3d1dc7-22f9-4c0c-802a-d7314894b255\") " pod="openshift-marketplace/community-operators-lwt58"
Mar 08 03:48:21.458211 master-0 kubenswrapper[7547]: I0308 03:48:21.457949 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b3d1dc7-22f9-4c0c-802a-d7314894b255-catalog-content\") pod \"community-operators-lwt58\" (UID: \"2b3d1dc7-22f9-4c0c-802a-d7314894b255\") " pod="openshift-marketplace/community-operators-lwt58"
Mar 08 03:48:21.458211 master-0 kubenswrapper[7547]: I0308 03:48:21.457686 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b3d1dc7-22f9-4c0c-802a-d7314894b255-utilities\") pod \"community-operators-lwt58\" (UID: \"2b3d1dc7-22f9-4c0c-802a-d7314894b255\") " pod="openshift-marketplace/community-operators-lwt58"
Mar 08 03:48:21.458575 master-0 kubenswrapper[7547]: I0308 03:48:21.458541 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b3d1dc7-22f9-4c0c-802a-d7314894b255-catalog-content\") pod \"community-operators-lwt58\" (UID: \"2b3d1dc7-22f9-4c0c-802a-d7314894b255\") " pod="openshift-marketplace/community-operators-lwt58"
Mar 08 03:48:21.486491 master-0 kubenswrapper[7547]: I0308 03:48:21.486454 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfgc6\" (UniqueName: \"kubernetes.io/projected/2b3d1dc7-22f9-4c0c-802a-d7314894b255-kube-api-access-zfgc6\") pod \"community-operators-lwt58\" (UID: \"2b3d1dc7-22f9-4c0c-802a-d7314894b255\") " pod="openshift-marketplace/community-operators-lwt58"
Mar 08 03:48:21.596217 master-0 kubenswrapper[7547]: I0308 03:48:21.596106 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lwt58"
Mar 08 03:48:22.073955 master-0 kubenswrapper[7547]: I0308 03:48:22.073897 7547 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-w2lsp"]
Mar 08 03:48:22.121550 master-0 kubenswrapper[7547]: I0308 03:48:22.117580 7547 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-597dfbc64c-f55p7"]
Mar 08 03:48:22.121550 master-0 kubenswrapper[7547]: I0308 03:48:22.117810 7547 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-597dfbc64c-f55p7" podUID="792f503e-c34b-4c30-9c9e-70bdea2f2629" containerName="controller-manager" containerID="cri-o://fd093a4b12530917708bd6b4190b1961b64cf67482f3b37bc37161fad3d593fd" gracePeriod=30
Mar 08 03:48:22.189859 master-0 kubenswrapper[7547]: I0308 03:48:22.189770 7547 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6994fc9dc8-9w74l"]
Mar 08 03:48:22.190109 master-0 kubenswrapper[7547]: I0308 03:48:22.190013 7547 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6994fc9dc8-9w74l" podUID="d6cf484f-7125-47ee-9e67-a064d044f43d" containerName="route-controller-manager" containerID="cri-o://9e40d2d9dedbe70637d2521ba59ce8d2051ddfcf5801584decfddb1f76e27439" gracePeriod=30
Mar 08 03:48:22.268646 master-0 kubenswrapper[7547]: I0308 03:48:22.268485 7547 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cluster-version/cluster-version-operator-745944c6b7-gvmnp"]
Mar 08 03:48:22.268861 master-0 kubenswrapper[7547]: I0308 03:48:22.268787 7547 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cluster-version/cluster-version-operator-745944c6b7-gvmnp" podUID="349d438d-d124-4d34-a172-4160e766c680" containerName="cluster-version-operator" containerID="cri-o://7000ce3c427519e0de59e1941d9bb87835a2ee0fa70ad24b6c24c11e5207d4d2" gracePeriod=130
Mar 08 03:48:22.274832 master-0 kubenswrapper[7547]: I0308 03:48:22.274762 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e78b283b-981e-48d7-a5f2-53f8401766ea-auth-proxy-config\") pod \"machine-config-operator-fdb5c78b5-2vjh2\" (UID: \"e78b283b-981e-48d7-a5f2-53f8401766ea\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-2vjh2"
Mar 08 03:48:22.274998 master-0 kubenswrapper[7547]: E0308 03:48:22.274942 7547 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/kube-rbac-proxy: configmap "kube-rbac-proxy" not found
Mar 08 03:48:22.274998 master-0 kubenswrapper[7547]: E0308 03:48:22.274987 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e78b283b-981e-48d7-a5f2-53f8401766ea-auth-proxy-config podName:e78b283b-981e-48d7-a5f2-53f8401766ea nodeName:}" failed. No retries permitted until 2026-03-08 03:49:26.274974408 +0000 UTC m=+129.220658921 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/e78b283b-981e-48d7-a5f2-53f8401766ea-auth-proxy-config") pod "machine-config-operator-fdb5c78b5-2vjh2" (UID: "e78b283b-981e-48d7-a5f2-53f8401766ea") : configmap "kube-rbac-proxy" not found
Mar 08 03:48:22.480942 master-0 kubenswrapper[7547]: I0308 03:48:22.480043 7547 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4h8qm"]
Mar 08 03:48:22.481161 master-0 kubenswrapper[7547]: I0308 03:48:22.481028 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4h8qm"
Mar 08 03:48:22.482883 master-0 kubenswrapper[7547]: I0308 03:48:22.482850 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-qfnls"
Mar 08 03:48:22.491716 master-0 kubenswrapper[7547]: I0308 03:48:22.490774 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4h8qm"]
Mar 08 03:48:22.681382 master-0 kubenswrapper[7547]: I0308 03:48:22.681285 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e283f49-b85d-4789-a71f-3fcb5033cdf0-catalog-content\") pod \"redhat-marketplace-4h8qm\" (UID: \"8e283f49-b85d-4789-a71f-3fcb5033cdf0\") " pod="openshift-marketplace/redhat-marketplace-4h8qm"
Mar 08 03:48:22.681583 master-0 kubenswrapper[7547]: I0308 03:48:22.681463 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e283f49-b85d-4789-a71f-3fcb5033cdf0-utilities\") pod \"redhat-marketplace-4h8qm\" (UID: \"8e283f49-b85d-4789-a71f-3fcb5033cdf0\") " pod="openshift-marketplace/redhat-marketplace-4h8qm"
Mar 08 03:48:22.681583 master-0 kubenswrapper[7547]: I0308 03:48:22.681578 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zm59c\" (UniqueName: \"kubernetes.io/projected/8e283f49-b85d-4789-a71f-3fcb5033cdf0-kube-api-access-zm59c\") pod \"redhat-marketplace-4h8qm\" (UID: \"8e283f49-b85d-4789-a71f-3fcb5033cdf0\") " pod="openshift-marketplace/redhat-marketplace-4h8qm"
Mar 08 03:48:22.782611 master-0 kubenswrapper[7547]: I0308 03:48:22.782360 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e283f49-b85d-4789-a71f-3fcb5033cdf0-utilities\") pod \"redhat-marketplace-4h8qm\" (UID: \"8e283f49-b85d-4789-a71f-3fcb5033cdf0\") " pod="openshift-marketplace/redhat-marketplace-4h8qm"
Mar 08 03:48:22.782611 master-0 kubenswrapper[7547]: I0308 03:48:22.782471 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zm59c\" (UniqueName: \"kubernetes.io/projected/8e283f49-b85d-4789-a71f-3fcb5033cdf0-kube-api-access-zm59c\") pod \"redhat-marketplace-4h8qm\" (UID: \"8e283f49-b85d-4789-a71f-3fcb5033cdf0\") " pod="openshift-marketplace/redhat-marketplace-4h8qm"
Mar 08 03:48:22.782866 master-0 kubenswrapper[7547]: I0308 03:48:22.782733 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e283f49-b85d-4789-a71f-3fcb5033cdf0-catalog-content\") pod \"redhat-marketplace-4h8qm\" (UID: \"8e283f49-b85d-4789-a71f-3fcb5033cdf0\") " pod="openshift-marketplace/redhat-marketplace-4h8qm"
Mar 08 03:48:22.783067 master-0 kubenswrapper[7547]: I0308 03:48:22.783018 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e283f49-b85d-4789-a71f-3fcb5033cdf0-utilities\") pod \"redhat-marketplace-4h8qm\" (UID: \"8e283f49-b85d-4789-a71f-3fcb5033cdf0\") " pod="openshift-marketplace/redhat-marketplace-4h8qm"
Mar 08 03:48:22.783361 master-0 kubenswrapper[7547]: I0308 03:48:22.783321 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e283f49-b85d-4789-a71f-3fcb5033cdf0-catalog-content\") pod \"redhat-marketplace-4h8qm\" (UID: \"8e283f49-b85d-4789-a71f-3fcb5033cdf0\") " pod="openshift-marketplace/redhat-marketplace-4h8qm"
Mar 08 03:48:22.800964 master-0 kubenswrapper[7547]: I0308 03:48:22.800657 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zm59c\" (UniqueName: \"kubernetes.io/projected/8e283f49-b85d-4789-a71f-3fcb5033cdf0-kube-api-access-zm59c\") pod \"redhat-marketplace-4h8qm\" (UID: \"8e283f49-b85d-4789-a71f-3fcb5033cdf0\") " pod="openshift-marketplace/redhat-marketplace-4h8qm"
Mar 08 03:48:22.811111 master-0 kubenswrapper[7547]: I0308 03:48:22.811084 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4h8qm"
Mar 08 03:48:23.190365 master-0 kubenswrapper[7547]: I0308 03:48:23.190306 7547 patch_prober.go:28] interesting pod/controller-manager-597dfbc64c-f55p7 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.47:8443/healthz\": dial tcp 10.128.0.47:8443: connect: connection refused" start-of-body=
Mar 08 03:48:23.190879 master-0 kubenswrapper[7547]: I0308 03:48:23.190378 7547 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-597dfbc64c-f55p7" podUID="792f503e-c34b-4c30-9c9e-70bdea2f2629" containerName="controller-manager" probeResult="failure" output="Get \"https://10.128.0.47:8443/healthz\": dial tcp 10.128.0.47:8443: connect: connection refused"
Mar 08 03:48:25.307441 master-0 kubenswrapper[7547]: I0308 03:48:25.307389 7547 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-4-master-0"]
Mar 08 03:48:25.307898 master-0 kubenswrapper[7547]: I0308 03:48:25.307632 7547 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/installer-4-master-0" podUID="9fbe4302-8264-4b6c-ae3f-0c6d981bc998" containerName="installer" containerID="cri-o://6c87b0d61a0be181b1c9aaa5d8b34b8c3ddfab52941ef9b130c9b50919085c6e" gracePeriod=30
Mar 08 03:48:25.641489 master-0 kubenswrapper[7547]: I0308 03:48:25.641341 7547 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-4pjsn"
Mar 08 03:48:25.698859 master-0 kubenswrapper[7547]: I0308 03:48:25.695630 7547 generic.go:334] "Generic (PLEG): container finished" podID="349d438d-d124-4d34-a172-4160e766c680" containerID="7000ce3c427519e0de59e1941d9bb87835a2ee0fa70ad24b6c24c11e5207d4d2" exitCode=0
Mar 08 03:48:25.698859 master-0 kubenswrapper[7547]: I0308 03:48:25.695706 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-745944c6b7-gvmnp" event={"ID":"349d438d-d124-4d34-a172-4160e766c680","Type":"ContainerDied","Data":"7000ce3c427519e0de59e1941d9bb87835a2ee0fa70ad24b6c24c11e5207d4d2"}
Mar 08 03:48:25.698859 master-0 kubenswrapper[7547]: I0308 03:48:25.698250 7547 generic.go:334] "Generic (PLEG): container finished" podID="792f503e-c34b-4c30-9c9e-70bdea2f2629" containerID="fd093a4b12530917708bd6b4190b1961b64cf67482f3b37bc37161fad3d593fd" exitCode=0
Mar 08 03:48:25.698859 master-0 kubenswrapper[7547]: I0308 03:48:25.698307 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-597dfbc64c-f55p7" event={"ID":"792f503e-c34b-4c30-9c9e-70bdea2f2629","Type":"ContainerDied","Data":"fd093a4b12530917708bd6b4190b1961b64cf67482f3b37bc37161fad3d593fd"}
Mar 08 03:48:25.702868 master-0 kubenswrapper[7547]: I0308 03:48:25.702778 7547 generic.go:334] "Generic (PLEG): container finished" podID="d6cf484f-7125-47ee-9e67-a064d044f43d" containerID="9e40d2d9dedbe70637d2521ba59ce8d2051ddfcf5801584decfddb1f76e27439" exitCode=0
Mar 08 03:48:25.703022 master-0 kubenswrapper[7547]: I0308 03:48:25.702866 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6994fc9dc8-9w74l" event={"ID":"d6cf484f-7125-47ee-9e67-a064d044f43d","Type":"ContainerDied","Data":"9e40d2d9dedbe70637d2521ba59ce8d2051ddfcf5801584decfddb1f76e27439"}
Mar 08 03:48:25.705488 master-0 kubenswrapper[7547]: I0308 03:48:25.704702 7547 generic.go:334] "Generic (PLEG): container finished" podID="1cbcb403-a424-4496-8c5c-5eb5e42dfb93" containerID="3b576ae60c0b63ec0db45afc74d3ab2b7a31ef872c28479883b2bca1465128e0" exitCode=0
Mar 08 03:48:25.705488 master-0 kubenswrapper[7547]: I0308 03:48:25.704729 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-8gfmf" event={"ID":"1cbcb403-a424-4496-8c5c-5eb5e42dfb93","Type":"ContainerDied","Data":"3b576ae60c0b63ec0db45afc74d3ab2b7a31ef872c28479883b2bca1465128e0"}
Mar 08 03:48:25.705488 master-0 kubenswrapper[7547]: I0308 03:48:25.705161 7547 scope.go:117] "RemoveContainer" containerID="3b576ae60c0b63ec0db45afc74d3ab2b7a31ef872c28479883b2bca1465128e0"
Mar 08 03:48:26.518913 master-0 kubenswrapper[7547]: I0308 03:48:26.514092 7547 patch_prober.go:28] interesting pod/route-controller-manager-6994fc9dc8-9w74l container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.49:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 08 03:48:26.518913 master-0 kubenswrapper[7547]: I0308 03:48:26.514190 7547 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6994fc9dc8-9w74l" podUID="d6cf484f-7125-47ee-9e67-a064d044f43d" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.128.0.49:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 08 03:48:27.392688 master-0 kubenswrapper[7547]: I0308 03:48:27.392598 7547 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-5-master-0"]
Mar 08 03:48:27.393436 master-0 kubenswrapper[7547]: I0308 03:48:27.393388 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-5-master-0"
Mar 08 03:48:27.397337 master-0 kubenswrapper[7547]: I0308 03:48:27.397287 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler"/"installer-sa-dockercfg-stjc6"
Mar 08 03:48:27.459122 master-0 kubenswrapper[7547]: I0308 03:48:27.459057 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0f865279-e751-456d-8c96-6381f8b45ce1-var-lock\") pod \"installer-5-master-0\" (UID: \"0f865279-e751-456d-8c96-6381f8b45ce1\") " pod="openshift-kube-scheduler/installer-5-master-0"
Mar 08 03:48:27.459122 master-0 kubenswrapper[7547]: I0308 03:48:27.459099 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0f865279-e751-456d-8c96-6381f8b45ce1-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"0f865279-e751-456d-8c96-6381f8b45ce1\") " pod="openshift-kube-scheduler/installer-5-master-0"
Mar 08 03:48:27.459357 master-0 kubenswrapper[7547]: I0308 03:48:27.459196 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0f865279-e751-456d-8c96-6381f8b45ce1-kube-api-access\") pod \"installer-5-master-0\" (UID: \"0f865279-e751-456d-8c96-6381f8b45ce1\") " pod="openshift-kube-scheduler/installer-5-master-0"
Mar 08 03:48:27.560102 master-0 kubenswrapper[7547]: I0308 03:48:27.559982 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0f865279-e751-456d-8c96-6381f8b45ce1-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"0f865279-e751-456d-8c96-6381f8b45ce1\") " pod="openshift-kube-scheduler/installer-5-master-0"
Mar 08 03:48:27.560903 master-0 kubenswrapper[7547]: I0308 03:48:27.560111 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0f865279-e751-456d-8c96-6381f8b45ce1-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"0f865279-e751-456d-8c96-6381f8b45ce1\") " pod="openshift-kube-scheduler/installer-5-master-0"
Mar 08 03:48:27.560903 master-0 kubenswrapper[7547]: I0308 03:48:27.560311 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0f865279-e751-456d-8c96-6381f8b45ce1-kube-api-access\") pod \"installer-5-master-0\" (UID: \"0f865279-e751-456d-8c96-6381f8b45ce1\") " pod="openshift-kube-scheduler/installer-5-master-0"
Mar 08 03:48:27.560903 master-0 kubenswrapper[7547]: I0308 03:48:27.560392 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0f865279-e751-456d-8c96-6381f8b45ce1-var-lock\") pod \"installer-5-master-0\" (UID: \"0f865279-e751-456d-8c96-6381f8b45ce1\") " pod="openshift-kube-scheduler/installer-5-master-0"
Mar 08 03:48:27.560903 master-0 kubenswrapper[7547]: I0308 03:48:27.560505 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0f865279-e751-456d-8c96-6381f8b45ce1-var-lock\") pod \"installer-5-master-0\" (UID: \"0f865279-e751-456d-8c96-6381f8b45ce1\") " pod="openshift-kube-scheduler/installer-5-master-0"
Mar 08 03:48:27.672697 master-0 kubenswrapper[7547]: I0308 03:48:27.671578 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-5-master-0"]
Mar 08 03:48:27.916903 master-0 kubenswrapper[7547]: I0308 03:48:27.916784 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0f865279-e751-456d-8c96-6381f8b45ce1-kube-api-access\") pod \"installer-5-master-0\" (UID: \"0f865279-e751-456d-8c96-6381f8b45ce1\") " pod="openshift-kube-scheduler/installer-5-master-0"
Mar 08 03:48:28.014065 master-0 kubenswrapper[7547]: I0308 03:48:28.013992 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-5-master-0"
Mar 08 03:48:34.190944 master-0 kubenswrapper[7547]: I0308 03:48:34.190809 7547 patch_prober.go:28] interesting pod/controller-manager-597dfbc64c-f55p7 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.47:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 08 03:48:34.192320 master-0 kubenswrapper[7547]: I0308 03:48:34.190941 7547 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-597dfbc64c-f55p7" podUID="792f503e-c34b-4c30-9c9e-70bdea2f2629" containerName="controller-manager" probeResult="failure" output="Get \"https://10.128.0.47:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 08 03:48:35.209773 master-0 kubenswrapper[7547]: I0308 03:48:35.203107 7547 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-745944c6b7-gvmnp"
Mar 08 03:48:35.256060 master-0 kubenswrapper[7547]: I0308 03:48:35.256011 7547 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6994fc9dc8-9w74l"
Mar 08 03:48:35.267244 master-0 kubenswrapper[7547]: I0308 03:48:35.267213 7547 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-597dfbc64c-f55p7"
Mar 08 03:48:35.271029 master-0 kubenswrapper[7547]: I0308 03:48:35.270549 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/349d438d-d124-4d34-a172-4160e766c680-etc-ssl-certs\") pod \"349d438d-d124-4d34-a172-4160e766c680\" (UID: \"349d438d-d124-4d34-a172-4160e766c680\") "
Mar 08 03:48:35.271029 master-0 kubenswrapper[7547]: I0308 03:48:35.270967 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/792f503e-c34b-4c30-9c9e-70bdea2f2629-serving-cert\") pod \"792f503e-c34b-4c30-9c9e-70bdea2f2629\" (UID: \"792f503e-c34b-4c30-9c9e-70bdea2f2629\") "
Mar 08 03:48:35.271029 master-0 kubenswrapper[7547]: I0308 03:48:35.271007 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/792f503e-c34b-4c30-9c9e-70bdea2f2629-client-ca\") pod \"792f503e-c34b-4c30-9c9e-70bdea2f2629\" (UID: \"792f503e-c34b-4c30-9c9e-70bdea2f2629\") "
Mar 08 03:48:35.271179 master-0 kubenswrapper[7547]: I0308 03:48:35.270635 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/349d438d-d124-4d34-a172-4160e766c680-etc-ssl-certs" (OuterVolumeSpecName: "etc-ssl-certs") pod "349d438d-d124-4d34-a172-4160e766c680" (UID: "349d438d-d124-4d34-a172-4160e766c680"). InnerVolumeSpecName "etc-ssl-certs". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 03:48:35.271842 master-0 kubenswrapper[7547]: I0308 03:48:35.271801 7547 reconciler_common.go:293] "Volume detached for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/349d438d-d124-4d34-a172-4160e766c680-etc-ssl-certs\") on node \"master-0\" DevicePath \"\""
Mar 08 03:48:35.271911 master-0 kubenswrapper[7547]: I0308 03:48:35.271863 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/792f503e-c34b-4c30-9c9e-70bdea2f2629-client-ca" (OuterVolumeSpecName: "client-ca") pod "792f503e-c34b-4c30-9c9e-70bdea2f2629" (UID: "792f503e-c34b-4c30-9c9e-70bdea2f2629"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 03:48:35.274072 master-0 kubenswrapper[7547]: I0308 03:48:35.274032 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/792f503e-c34b-4c30-9c9e-70bdea2f2629-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "792f503e-c34b-4c30-9c9e-70bdea2f2629" (UID: "792f503e-c34b-4c30-9c9e-70bdea2f2629"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 03:48:35.375921 master-0 kubenswrapper[7547]: I0308 03:48:35.375757 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6cf484f-7125-47ee-9e67-a064d044f43d-config\") pod \"d6cf484f-7125-47ee-9e67-a064d044f43d\" (UID: \"d6cf484f-7125-47ee-9e67-a064d044f43d\") "
Mar 08 03:48:35.375921 master-0 kubenswrapper[7547]: I0308 03:48:35.375817 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/349d438d-d124-4d34-a172-4160e766c680-etc-cvo-updatepayloads\") pod \"349d438d-d124-4d34-a172-4160e766c680\" (UID: \"349d438d-d124-4d34-a172-4160e766c680\") "
Mar 08 03:48:35.376662 master-0 kubenswrapper[7547]: I0308 03:48:35.376616 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/349d438d-d124-4d34-a172-4160e766c680-etc-cvo-updatepayloads" (OuterVolumeSpecName: "etc-cvo-updatepayloads") pod "349d438d-d124-4d34-a172-4160e766c680" (UID: "349d438d-d124-4d34-a172-4160e766c680"). InnerVolumeSpecName "etc-cvo-updatepayloads".
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:48:35.376955 master-0 kubenswrapper[7547]: I0308 03:48:35.376923 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/349d438d-d124-4d34-a172-4160e766c680-service-ca\") pod \"349d438d-d124-4d34-a172-4160e766c680\" (UID: \"349d438d-d124-4d34-a172-4160e766c680\") " Mar 08 03:48:35.377012 master-0 kubenswrapper[7547]: I0308 03:48:35.376973 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6cf484f-7125-47ee-9e67-a064d044f43d-serving-cert\") pod \"d6cf484f-7125-47ee-9e67-a064d044f43d\" (UID: \"d6cf484f-7125-47ee-9e67-a064d044f43d\") " Mar 08 03:48:35.377130 master-0 kubenswrapper[7547]: I0308 03:48:35.377010 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vkpr\" (UniqueName: \"kubernetes.io/projected/d6cf484f-7125-47ee-9e67-a064d044f43d-kube-api-access-2vkpr\") pod \"d6cf484f-7125-47ee-9e67-a064d044f43d\" (UID: \"d6cf484f-7125-47ee-9e67-a064d044f43d\") " Mar 08 03:48:35.377130 master-0 kubenswrapper[7547]: I0308 03:48:35.377005 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6cf484f-7125-47ee-9e67-a064d044f43d-config" (OuterVolumeSpecName: "config") pod "d6cf484f-7125-47ee-9e67-a064d044f43d" (UID: "d6cf484f-7125-47ee-9e67-a064d044f43d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:48:35.377130 master-0 kubenswrapper[7547]: I0308 03:48:35.377067 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d6cf484f-7125-47ee-9e67-a064d044f43d-client-ca\") pod \"d6cf484f-7125-47ee-9e67-a064d044f43d\" (UID: \"d6cf484f-7125-47ee-9e67-a064d044f43d\") " Mar 08 03:48:35.377130 master-0 kubenswrapper[7547]: I0308 03:48:35.377104 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bk72v\" (UniqueName: \"kubernetes.io/projected/792f503e-c34b-4c30-9c9e-70bdea2f2629-kube-api-access-bk72v\") pod \"792f503e-c34b-4c30-9c9e-70bdea2f2629\" (UID: \"792f503e-c34b-4c30-9c9e-70bdea2f2629\") " Mar 08 03:48:35.377494 master-0 kubenswrapper[7547]: I0308 03:48:35.377466 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/349d438d-d124-4d34-a172-4160e766c680-service-ca" (OuterVolumeSpecName: "service-ca") pod "349d438d-d124-4d34-a172-4160e766c680" (UID: "349d438d-d124-4d34-a172-4160e766c680"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:48:35.378403 master-0 kubenswrapper[7547]: I0308 03:48:35.378364 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6cf484f-7125-47ee-9e67-a064d044f43d-client-ca" (OuterVolumeSpecName: "client-ca") pod "d6cf484f-7125-47ee-9e67-a064d044f43d" (UID: "d6cf484f-7125-47ee-9e67-a064d044f43d"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:48:35.380119 master-0 kubenswrapper[7547]: I0308 03:48:35.380093 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/349d438d-d124-4d34-a172-4160e766c680-serving-cert\") pod \"349d438d-d124-4d34-a172-4160e766c680\" (UID: \"349d438d-d124-4d34-a172-4160e766c680\") " Mar 08 03:48:35.380192 master-0 kubenswrapper[7547]: I0308 03:48:35.380131 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/792f503e-c34b-4c30-9c9e-70bdea2f2629-proxy-ca-bundles\") pod \"792f503e-c34b-4c30-9c9e-70bdea2f2629\" (UID: \"792f503e-c34b-4c30-9c9e-70bdea2f2629\") " Mar 08 03:48:35.380192 master-0 kubenswrapper[7547]: I0308 03:48:35.380169 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/349d438d-d124-4d34-a172-4160e766c680-kube-api-access\") pod \"349d438d-d124-4d34-a172-4160e766c680\" (UID: \"349d438d-d124-4d34-a172-4160e766c680\") " Mar 08 03:48:35.380278 master-0 kubenswrapper[7547]: I0308 03:48:35.380215 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/792f503e-c34b-4c30-9c9e-70bdea2f2629-config\") pod \"792f503e-c34b-4c30-9c9e-70bdea2f2629\" (UID: \"792f503e-c34b-4c30-9c9e-70bdea2f2629\") " Mar 08 03:48:35.380654 master-0 kubenswrapper[7547]: I0308 03:48:35.380613 7547 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d6cf484f-7125-47ee-9e67-a064d044f43d-client-ca\") on node \"master-0\" DevicePath \"\"" Mar 08 03:48:35.380654 master-0 kubenswrapper[7547]: I0308 03:48:35.380648 7547 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/792f503e-c34b-4c30-9c9e-70bdea2f2629-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 08 03:48:35.380654 master-0 kubenswrapper[7547]: I0308 03:48:35.380661 7547 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/792f503e-c34b-4c30-9c9e-70bdea2f2629-client-ca\") on node \"master-0\" DevicePath \"\"" Mar 08 03:48:35.380853 master-0 kubenswrapper[7547]: I0308 03:48:35.380676 7547 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6cf484f-7125-47ee-9e67-a064d044f43d-config\") on node \"master-0\" DevicePath \"\"" Mar 08 03:48:35.380853 master-0 kubenswrapper[7547]: I0308 03:48:35.380689 7547 reconciler_common.go:293] "Volume detached for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/349d438d-d124-4d34-a172-4160e766c680-etc-cvo-updatepayloads\") on node \"master-0\" DevicePath \"\"" Mar 08 03:48:35.380853 master-0 kubenswrapper[7547]: I0308 03:48:35.380701 7547 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/349d438d-d124-4d34-a172-4160e766c680-service-ca\") on node \"master-0\" DevicePath \"\"" Mar 08 03:48:35.381223 master-0 kubenswrapper[7547]: I0308 03:48:35.381042 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/792f503e-c34b-4c30-9c9e-70bdea2f2629-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "792f503e-c34b-4c30-9c9e-70bdea2f2629" (UID: "792f503e-c34b-4c30-9c9e-70bdea2f2629"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:48:35.381285 master-0 kubenswrapper[7547]: I0308 03:48:35.381066 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/792f503e-c34b-4c30-9c9e-70bdea2f2629-config" (OuterVolumeSpecName: "config") pod "792f503e-c34b-4c30-9c9e-70bdea2f2629" (UID: "792f503e-c34b-4c30-9c9e-70bdea2f2629"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:48:35.381319 master-0 kubenswrapper[7547]: I0308 03:48:35.381157 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6cf484f-7125-47ee-9e67-a064d044f43d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d6cf484f-7125-47ee-9e67-a064d044f43d" (UID: "d6cf484f-7125-47ee-9e67-a064d044f43d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:48:35.382906 master-0 kubenswrapper[7547]: I0308 03:48:35.382815 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/349d438d-d124-4d34-a172-4160e766c680-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "349d438d-d124-4d34-a172-4160e766c680" (UID: "349d438d-d124-4d34-a172-4160e766c680"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:48:35.383459 master-0 kubenswrapper[7547]: I0308 03:48:35.383415 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/349d438d-d124-4d34-a172-4160e766c680-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "349d438d-d124-4d34-a172-4160e766c680" (UID: "349d438d-d124-4d34-a172-4160e766c680"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:48:35.385265 master-0 kubenswrapper[7547]: I0308 03:48:35.385147 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/792f503e-c34b-4c30-9c9e-70bdea2f2629-kube-api-access-bk72v" (OuterVolumeSpecName: "kube-api-access-bk72v") pod "792f503e-c34b-4c30-9c9e-70bdea2f2629" (UID: "792f503e-c34b-4c30-9c9e-70bdea2f2629"). InnerVolumeSpecName "kube-api-access-bk72v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:48:35.398359 master-0 kubenswrapper[7547]: I0308 03:48:35.398300 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6cf484f-7125-47ee-9e67-a064d044f43d-kube-api-access-2vkpr" (OuterVolumeSpecName: "kube-api-access-2vkpr") pod "d6cf484f-7125-47ee-9e67-a064d044f43d" (UID: "d6cf484f-7125-47ee-9e67-a064d044f43d"). InnerVolumeSpecName "kube-api-access-2vkpr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:48:35.482294 master-0 kubenswrapper[7547]: I0308 03:48:35.482230 7547 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bk72v\" (UniqueName: \"kubernetes.io/projected/792f503e-c34b-4c30-9c9e-70bdea2f2629-kube-api-access-bk72v\") on node \"master-0\" DevicePath \"\"" Mar 08 03:48:35.482294 master-0 kubenswrapper[7547]: I0308 03:48:35.482283 7547 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/349d438d-d124-4d34-a172-4160e766c680-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 08 03:48:35.482294 master-0 kubenswrapper[7547]: I0308 03:48:35.482297 7547 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/792f503e-c34b-4c30-9c9e-70bdea2f2629-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\"" Mar 08 03:48:35.482294 master-0 kubenswrapper[7547]: I0308 03:48:35.482309 7547 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/349d438d-d124-4d34-a172-4160e766c680-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 08 03:48:35.482694 master-0 kubenswrapper[7547]: I0308 03:48:35.482323 7547 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/792f503e-c34b-4c30-9c9e-70bdea2f2629-config\") on node \"master-0\" DevicePath \"\"" Mar 08 03:48:35.482694 master-0 kubenswrapper[7547]: I0308 03:48:35.482337 7547 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6cf484f-7125-47ee-9e67-a064d044f43d-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 08 03:48:35.482694 master-0 kubenswrapper[7547]: I0308 03:48:35.482349 7547 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2vkpr\" (UniqueName: \"kubernetes.io/projected/d6cf484f-7125-47ee-9e67-a064d044f43d-kube-api-access-2vkpr\") on node \"master-0\" DevicePath \"\"" Mar 08 03:48:35.760144 master-0 kubenswrapper[7547]: I0308 03:48:35.760089 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-745944c6b7-gvmnp" event={"ID":"349d438d-d124-4d34-a172-4160e766c680","Type":"ContainerDied","Data":"f8e5340277c14fbc3c84261d569623ece5bc22cf9c8aa4ec6d8e525cc14aeb64"} Mar 08 03:48:35.760144 master-0 kubenswrapper[7547]: I0308 03:48:35.760146 7547 scope.go:117] "RemoveContainer" containerID="7000ce3c427519e0de59e1941d9bb87835a2ee0fa70ad24b6c24c11e5207d4d2" Mar 08 03:48:35.760298 master-0 kubenswrapper[7547]: I0308 03:48:35.760244 7547 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-745944c6b7-gvmnp" Mar 08 03:48:35.770719 master-0 kubenswrapper[7547]: I0308 03:48:35.768022 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-597dfbc64c-f55p7" event={"ID":"792f503e-c34b-4c30-9c9e-70bdea2f2629","Type":"ContainerDied","Data":"35ad5b11950f0afd864c047e90541d0fc2348ab048da8315900ff77325b59f5c"} Mar 08 03:48:35.770719 master-0 kubenswrapper[7547]: I0308 03:48:35.768047 7547 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-597dfbc64c-f55p7" Mar 08 03:48:35.772208 master-0 kubenswrapper[7547]: I0308 03:48:35.772026 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6994fc9dc8-9w74l" event={"ID":"d6cf484f-7125-47ee-9e67-a064d044f43d","Type":"ContainerDied","Data":"e3468db4b4de37474c23de3dc1335ad27787122b8d8693bf97799316dadf6a7f"} Mar 08 03:48:35.772208 master-0 kubenswrapper[7547]: I0308 03:48:35.772092 7547 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6994fc9dc8-9w74l" Mar 08 03:48:35.795742 master-0 kubenswrapper[7547]: I0308 03:48:35.795703 7547 scope.go:117] "RemoveContainer" containerID="fd093a4b12530917708bd6b4190b1961b64cf67482f3b37bc37161fad3d593fd" Mar 08 03:48:35.825444 master-0 kubenswrapper[7547]: I0308 03:48:35.825106 7547 scope.go:117] "RemoveContainer" containerID="9e40d2d9dedbe70637d2521ba59ce8d2051ddfcf5801584decfddb1f76e27439" Mar 08 03:48:36.308141 master-0 kubenswrapper[7547]: I0308 03:48:36.305819 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rnz4w"] Mar 08 03:48:36.590877 master-0 kubenswrapper[7547]: I0308 03:48:36.589637 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-2-master-0"] Mar 08 03:48:36.593922 master-0 kubenswrapper[7547]: I0308 03:48:36.593878 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-5-master-0"] Mar 08 03:48:36.595707 master-0 kubenswrapper[7547]: I0308 03:48:36.595676 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p8nq8"] Mar 08 03:48:36.598327 master-0 kubenswrapper[7547]: I0308 03:48:36.598196 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4h8qm"] Mar 08 03:48:36.603164 master-0 kubenswrapper[7547]: I0308 03:48:36.600709 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lwt58"] Mar 08 03:48:36.632947 master-0 kubenswrapper[7547]: I0308 03:48:36.632052 7547 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-597dfbc64c-f55p7"] Mar 08 03:48:36.636340 master-0 kubenswrapper[7547]: I0308 03:48:36.636283 7547 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-controller-manager/controller-manager-597dfbc64c-f55p7"] Mar 08 03:48:36.671097 master-0 kubenswrapper[7547]: I0308 03:48:36.667976 7547 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6994fc9dc8-9w74l"] Mar 08 03:48:36.673979 master-0 kubenswrapper[7547]: I0308 03:48:36.673948 7547 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6994fc9dc8-9w74l"] Mar 08 03:48:36.692489 master-0 kubenswrapper[7547]: I0308 03:48:36.692467 7547 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cluster-version/cluster-version-operator-745944c6b7-gvmnp"] Mar 08 03:48:36.698880 master-0 kubenswrapper[7547]: I0308 03:48:36.698809 7547 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-cluster-version/cluster-version-operator-745944c6b7-gvmnp"] Mar 08 03:48:36.736416 master-0 kubenswrapper[7547]: I0308 03:48:36.736383 7547 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-8c9c967c7-zq9rp"] Mar 08 03:48:36.736715 master-0 kubenswrapper[7547]: E0308 03:48:36.736700 7547 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6cf484f-7125-47ee-9e67-a064d044f43d" containerName="route-controller-manager" Mar 08 03:48:36.736796 master-0 kubenswrapper[7547]: I0308 03:48:36.736783 7547 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6cf484f-7125-47ee-9e67-a064d044f43d" containerName="route-controller-manager" Mar 08 03:48:36.736888 master-0 kubenswrapper[7547]: E0308 03:48:36.736878 7547 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="349d438d-d124-4d34-a172-4160e766c680" containerName="cluster-version-operator" Mar 08 03:48:36.736969 master-0 kubenswrapper[7547]: I0308 03:48:36.736956 7547 state_mem.go:107] "Deleted CPUSet assignment" podUID="349d438d-d124-4d34-a172-4160e766c680" containerName="cluster-version-operator" Mar 08 
03:48:36.737048 master-0 kubenswrapper[7547]: E0308 03:48:36.737037 7547 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="792f503e-c34b-4c30-9c9e-70bdea2f2629" containerName="controller-manager" Mar 08 03:48:36.737099 master-0 kubenswrapper[7547]: I0308 03:48:36.737091 7547 state_mem.go:107] "Deleted CPUSet assignment" podUID="792f503e-c34b-4c30-9c9e-70bdea2f2629" containerName="controller-manager" Mar 08 03:48:36.737261 master-0 kubenswrapper[7547]: I0308 03:48:36.737249 7547 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6cf484f-7125-47ee-9e67-a064d044f43d" containerName="route-controller-manager" Mar 08 03:48:36.737319 master-0 kubenswrapper[7547]: I0308 03:48:36.737311 7547 memory_manager.go:354] "RemoveStaleState removing state" podUID="349d438d-d124-4d34-a172-4160e766c680" containerName="cluster-version-operator" Mar 08 03:48:36.737381 master-0 kubenswrapper[7547]: I0308 03:48:36.737371 7547 memory_manager.go:354] "RemoveStaleState removing state" podUID="792f503e-c34b-4c30-9c9e-70bdea2f2629" containerName="controller-manager" Mar 08 03:48:36.737771 master-0 kubenswrapper[7547]: I0308 03:48:36.737756 7547 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-zq9rp" Mar 08 03:48:36.748085 master-0 kubenswrapper[7547]: I0308 03:48:36.748046 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 08 03:48:36.748393 master-0 kubenswrapper[7547]: I0308 03:48:36.748096 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-hbsn4" Mar 08 03:48:36.748507 master-0 kubenswrapper[7547]: I0308 03:48:36.748480 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 08 03:48:36.753113 master-0 kubenswrapper[7547]: I0308 03:48:36.752910 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 08 03:48:36.784198 master-0 kubenswrapper[7547]: I0308 03:48:36.784144 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fcdn7" event={"ID":"4818bf75-506a-4b39-bb9b-8067a02d4a51","Type":"ContainerStarted","Data":"5f21b41a33309effcfa7d8641b2cb25646283c6b2dcecea3b71aeb78577ba200"} Mar 08 03:48:36.784787 master-0 kubenswrapper[7547]: I0308 03:48:36.784748 7547 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fcdn7" podUID="4818bf75-506a-4b39-bb9b-8067a02d4a51" containerName="extract-content" containerID="cri-o://5f21b41a33309effcfa7d8641b2cb25646283c6b2dcecea3b71aeb78577ba200" gracePeriod=2 Mar 08 03:48:36.787943 master-0 kubenswrapper[7547]: I0308 03:48:36.787495 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p8nq8" event={"ID":"0031e3a9-b253-4dda-a890-bf3e4d8737e8","Type":"ContainerStarted","Data":"15db86a68d791074ec87b05b5a5fc2f19c6862a1ebcfb5de4931251a55e195a3"} Mar 08 03:48:36.788851 master-0 kubenswrapper[7547]: 
I0308 03:48:36.788812 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-5-master-0" event={"ID":"0f865279-e751-456d-8c96-6381f8b45ce1","Type":"ContainerStarted","Data":"46d4f01a8b97928ed12b356249f2c516cd9275fd33a04ced54c1129e7817bd38"} Mar 08 03:48:36.791088 master-0 kubenswrapper[7547]: I0308 03:48:36.791052 7547 generic.go:334] "Generic (PLEG): container finished" podID="a4ff897a-ac47-45e0-aa7d-88c5aea50b70" containerID="3e2b2e09415114bbeaf9e1f998368ceae3dc08609fd38e1dc5b57c9deb896668" exitCode=0 Mar 08 03:48:36.791159 master-0 kubenswrapper[7547]: I0308 03:48:36.791106 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w2lsp" event={"ID":"a4ff897a-ac47-45e0-aa7d-88c5aea50b70","Type":"ContainerDied","Data":"3e2b2e09415114bbeaf9e1f998368ceae3dc08609fd38e1dc5b57c9deb896668"} Mar 08 03:48:36.800088 master-0 kubenswrapper[7547]: I0308 03:48:36.794104 7547 generic.go:334] "Generic (PLEG): container finished" podID="8b8dcd07-f245-4783-ba40-521a14e96043" containerID="d96fc44d30582015395e41e6921f416abacff20675fd3fc0f2542974b5f0627f" exitCode=0 Mar 08 03:48:36.800088 master-0 kubenswrapper[7547]: I0308 03:48:36.794150 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9ljn9" event={"ID":"8b8dcd07-f245-4783-ba40-521a14e96043","Type":"ContainerDied","Data":"d96fc44d30582015395e41e6921f416abacff20675fd3fc0f2542974b5f0627f"} Mar 08 03:48:36.800088 master-0 kubenswrapper[7547]: I0308 03:48:36.796423 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-8gfmf" event={"ID":"1cbcb403-a424-4496-8c5c-5eb5e42dfb93","Type":"ContainerStarted","Data":"ee6a6bf31e7af3dec358121d4a82a967c996552057556595b3fa11c8208024b4"} Mar 08 03:48:36.800088 master-0 kubenswrapper[7547]: I0308 03:48:36.799103 7547 generic.go:334] "Generic (PLEG): container finished" 
podID="6bee226a-2a66-4032-8aba-2c8b82abcb6a" containerID="e70077d0bc2f435dbefd1bd93a5bf3f06dc8fe76044ed3f37fa4f6ef147e9f4c" exitCode=0 Mar 08 03:48:36.800088 master-0 kubenswrapper[7547]: I0308 03:48:36.799181 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rnz4w" event={"ID":"6bee226a-2a66-4032-8aba-2c8b82abcb6a","Type":"ContainerDied","Data":"e70077d0bc2f435dbefd1bd93a5bf3f06dc8fe76044ed3f37fa4f6ef147e9f4c"} Mar 08 03:48:36.800088 master-0 kubenswrapper[7547]: I0308 03:48:36.799223 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rnz4w" event={"ID":"6bee226a-2a66-4032-8aba-2c8b82abcb6a","Type":"ContainerStarted","Data":"9903faf2a78afa21fe51e82e71e6a8b65942f5c695c1b737493cfec8a1911541"} Mar 08 03:48:36.842865 master-0 kubenswrapper[7547]: I0308 03:48:36.842808 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4h8qm" event={"ID":"8e283f49-b85d-4789-a71f-3fcb5033cdf0","Type":"ContainerStarted","Data":"005a8f12bd7e675962c64889aeb13228b895d026517d2c45f10276c0ab4cd89e"} Mar 08 03:48:36.844536 master-0 kubenswrapper[7547]: I0308 03:48:36.844507 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" event={"ID":"47643289-ac4b-425d-8ea1-913b6ca39ee0","Type":"ContainerStarted","Data":"68a9c31781b9210f187f31760c081b7a914e3f7c545e30283d68c9a55506f854"} Mar 08 03:48:36.858177 master-0 kubenswrapper[7547]: I0308 03:48:36.848628 7547 generic.go:334] "Generic (PLEG): container finished" podID="47b7e26d-8fb3-4749-a544-c86c3a06e439" containerID="3e7278dc6682aa0a3aac1f49856a042d60ff9bc59a90715e1a5e79961ebe2037" exitCode=0 Mar 08 03:48:36.858177 master-0 kubenswrapper[7547]: I0308 03:48:36.848689 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c88pb" 
event={"ID":"47b7e26d-8fb3-4749-a544-c86c3a06e439","Type":"ContainerDied","Data":"3e7278dc6682aa0a3aac1f49856a042d60ff9bc59a90715e1a5e79961ebe2037"} Mar 08 03:48:36.858177 master-0 kubenswrapper[7547]: I0308 03:48:36.851485 7547 generic.go:334] "Generic (PLEG): container finished" podID="26180f77-0b1a-4d0f-9ed0-a12fdee69817" containerID="2b806592f345fd33c0e6baaad7d7fe21c75572bbe4983f5588e4e61c09a25b29" exitCode=0 Mar 08 03:48:36.858177 master-0 kubenswrapper[7547]: I0308 03:48:36.851543 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-chpl6" event={"ID":"26180f77-0b1a-4d0f-9ed0-a12fdee69817","Type":"ContainerDied","Data":"2b806592f345fd33c0e6baaad7d7fe21c75572bbe4983f5588e4e61c09a25b29"} Mar 08 03:48:36.858177 master-0 kubenswrapper[7547]: I0308 03:48:36.851946 7547 scope.go:117] "RemoveContainer" containerID="2b806592f345fd33c0e6baaad7d7fe21c75572bbe4983f5588e4e61c09a25b29" Mar 08 03:48:36.909905 master-0 kubenswrapper[7547]: I0308 03:48:36.908731 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/2262647b-c315-477a-93bd-f168c1810475-etc-cvo-updatepayloads\") pod \"cluster-version-operator-8c9c967c7-zq9rp\" (UID: \"2262647b-c315-477a-93bd-f168c1810475\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-zq9rp" Mar 08 03:48:36.909905 master-0 kubenswrapper[7547]: I0308 03:48:36.908777 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2262647b-c315-477a-93bd-f168c1810475-kube-api-access\") pod \"cluster-version-operator-8c9c967c7-zq9rp\" (UID: \"2262647b-c315-477a-93bd-f168c1810475\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-zq9rp" Mar 08 03:48:36.909905 master-0 kubenswrapper[7547]: I0308 03:48:36.908804 
7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/2262647b-c315-477a-93bd-f168c1810475-etc-ssl-certs\") pod \"cluster-version-operator-8c9c967c7-zq9rp\" (UID: \"2262647b-c315-477a-93bd-f168c1810475\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-zq9rp" Mar 08 03:48:36.909905 master-0 kubenswrapper[7547]: I0308 03:48:36.908833 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2262647b-c315-477a-93bd-f168c1810475-serving-cert\") pod \"cluster-version-operator-8c9c967c7-zq9rp\" (UID: \"2262647b-c315-477a-93bd-f168c1810475\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-zq9rp" Mar 08 03:48:36.909905 master-0 kubenswrapper[7547]: I0308 03:48:36.908854 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2262647b-c315-477a-93bd-f168c1810475-service-ca\") pod \"cluster-version-operator-8c9c967c7-zq9rp\" (UID: \"2262647b-c315-477a-93bd-f168c1810475\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-zq9rp" Mar 08 03:48:37.010711 master-0 kubenswrapper[7547]: I0308 03:48:37.010041 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/2262647b-c315-477a-93bd-f168c1810475-etc-cvo-updatepayloads\") pod \"cluster-version-operator-8c9c967c7-zq9rp\" (UID: \"2262647b-c315-477a-93bd-f168c1810475\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-zq9rp" Mar 08 03:48:37.010711 master-0 kubenswrapper[7547]: I0308 03:48:37.010078 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/2262647b-c315-477a-93bd-f168c1810475-kube-api-access\") pod \"cluster-version-operator-8c9c967c7-zq9rp\" (UID: \"2262647b-c315-477a-93bd-f168c1810475\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-zq9rp" Mar 08 03:48:37.010711 master-0 kubenswrapper[7547]: I0308 03:48:37.010104 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2262647b-c315-477a-93bd-f168c1810475-serving-cert\") pod \"cluster-version-operator-8c9c967c7-zq9rp\" (UID: \"2262647b-c315-477a-93bd-f168c1810475\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-zq9rp" Mar 08 03:48:37.010711 master-0 kubenswrapper[7547]: I0308 03:48:37.010119 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/2262647b-c315-477a-93bd-f168c1810475-etc-ssl-certs\") pod \"cluster-version-operator-8c9c967c7-zq9rp\" (UID: \"2262647b-c315-477a-93bd-f168c1810475\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-zq9rp" Mar 08 03:48:37.010711 master-0 kubenswrapper[7547]: I0308 03:48:37.010135 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2262647b-c315-477a-93bd-f168c1810475-service-ca\") pod \"cluster-version-operator-8c9c967c7-zq9rp\" (UID: \"2262647b-c315-477a-93bd-f168c1810475\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-zq9rp" Mar 08 03:48:37.010711 master-0 kubenswrapper[7547]: I0308 03:48:37.010475 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/2262647b-c315-477a-93bd-f168c1810475-etc-cvo-updatepayloads\") pod \"cluster-version-operator-8c9c967c7-zq9rp\" (UID: \"2262647b-c315-477a-93bd-f168c1810475\") " 
pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-zq9rp" Mar 08 03:48:37.014574 master-0 kubenswrapper[7547]: I0308 03:48:37.012312 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/2262647b-c315-477a-93bd-f168c1810475-etc-ssl-certs\") pod \"cluster-version-operator-8c9c967c7-zq9rp\" (UID: \"2262647b-c315-477a-93bd-f168c1810475\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-zq9rp" Mar 08 03:48:37.023855 master-0 kubenswrapper[7547]: I0308 03:48:37.023803 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2262647b-c315-477a-93bd-f168c1810475-service-ca\") pod \"cluster-version-operator-8c9c967c7-zq9rp\" (UID: \"2262647b-c315-477a-93bd-f168c1810475\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-zq9rp" Mar 08 03:48:37.029788 master-0 kubenswrapper[7547]: I0308 03:48:37.029744 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2262647b-c315-477a-93bd-f168c1810475-kube-api-access\") pod \"cluster-version-operator-8c9c967c7-zq9rp\" (UID: \"2262647b-c315-477a-93bd-f168c1810475\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-zq9rp" Mar 08 03:48:37.047409 master-0 kubenswrapper[7547]: I0308 03:48:37.047373 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2262647b-c315-477a-93bd-f168c1810475-serving-cert\") pod \"cluster-version-operator-8c9c967c7-zq9rp\" (UID: \"2262647b-c315-477a-93bd-f168c1810475\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-zq9rp" Mar 08 03:48:37.257471 master-0 kubenswrapper[7547]: I0308 03:48:37.255607 7547 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="349d438d-d124-4d34-a172-4160e766c680" 
path="/var/lib/kubelet/pods/349d438d-d124-4d34-a172-4160e766c680/volumes" Mar 08 03:48:37.257471 master-0 kubenswrapper[7547]: I0308 03:48:37.256110 7547 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="792f503e-c34b-4c30-9c9e-70bdea2f2629" path="/var/lib/kubelet/pods/792f503e-c34b-4c30-9c9e-70bdea2f2629/volumes" Mar 08 03:48:37.257471 master-0 kubenswrapper[7547]: I0308 03:48:37.256551 7547 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6cf484f-7125-47ee-9e67-a064d044f43d" path="/var/lib/kubelet/pods/d6cf484f-7125-47ee-9e67-a064d044f43d/volumes" Mar 08 03:48:37.342922 master-0 kubenswrapper[7547]: I0308 03:48:37.341180 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-zq9rp" Mar 08 03:48:37.388123 master-0 kubenswrapper[7547]: W0308 03:48:37.385310 7547 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4ff897a_ac47_45e0_aa7d_88c5aea50b70.slice/crio-3e2b2e09415114bbeaf9e1f998368ceae3dc08609fd38e1dc5b57c9deb896668.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4ff897a_ac47_45e0_aa7d_88c5aea50b70.slice/crio-3e2b2e09415114bbeaf9e1f998368ceae3dc08609fd38e1dc5b57c9deb896668.scope: no such file or directory Mar 08 03:48:37.388123 master-0 kubenswrapper[7547]: W0308 03:48:37.385406 7547 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4818bf75_506a_4b39_bb9b_8067a02d4a51.slice/crio-conmon-5f21b41a33309effcfa7d8641b2cb25646283c6b2dcecea3b71aeb78577ba200.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch 
/sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4818bf75_506a_4b39_bb9b_8067a02d4a51.slice/crio-conmon-5f21b41a33309effcfa7d8641b2cb25646283c6b2dcecea3b71aeb78577ba200.scope: no such file or directory Mar 08 03:48:37.392182 master-0 kubenswrapper[7547]: W0308 03:48:37.392119 7547 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b8dcd07_f245_4783_ba40_521a14e96043.slice/crio-d96fc44d30582015395e41e6921f416abacff20675fd3fc0f2542974b5f0627f.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b8dcd07_f245_4783_ba40_521a14e96043.slice/crio-d96fc44d30582015395e41e6921f416abacff20675fd3fc0f2542974b5f0627f.scope: no such file or directory Mar 08 03:48:37.392182 master-0 kubenswrapper[7547]: W0308 03:48:37.392178 7547 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47b7e26d_8fb3_4749_a544_c86c3a06e439.slice/crio-3e7278dc6682aa0a3aac1f49856a042d60ff9bc59a90715e1a5e79961ebe2037.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47b7e26d_8fb3_4749_a544_c86c3a06e439.slice/crio-3e7278dc6682aa0a3aac1f49856a042d60ff9bc59a90715e1a5e79961ebe2037.scope: no such file or directory Mar 08 03:48:37.392275 master-0 kubenswrapper[7547]: W0308 03:48:37.392204 7547 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4818bf75_506a_4b39_bb9b_8067a02d4a51.slice/crio-5f21b41a33309effcfa7d8641b2cb25646283c6b2dcecea3b71aeb78577ba200.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch 
/sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4818bf75_506a_4b39_bb9b_8067a02d4a51.slice/crio-5f21b41a33309effcfa7d8641b2cb25646283c6b2dcecea3b71aeb78577ba200.scope: no such file or directory Mar 08 03:48:37.392637 master-0 kubenswrapper[7547]: W0308 03:48:37.392592 7547 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6bee226a_2a66_4032_8aba_2c8b82abcb6a.slice/crio-conmon-e70077d0bc2f435dbefd1bd93a5bf3f06dc8fe76044ed3f37fa4f6ef147e9f4c.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6bee226a_2a66_4032_8aba_2c8b82abcb6a.slice/crio-conmon-e70077d0bc2f435dbefd1bd93a5bf3f06dc8fe76044ed3f37fa4f6ef147e9f4c.scope: no such file or directory Mar 08 03:48:37.392691 master-0 kubenswrapper[7547]: W0308 03:48:37.392646 7547 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6bee226a_2a66_4032_8aba_2c8b82abcb6a.slice/crio-e70077d0bc2f435dbefd1bd93a5bf3f06dc8fe76044ed3f37fa4f6ef147e9f4c.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6bee226a_2a66_4032_8aba_2c8b82abcb6a.slice/crio-e70077d0bc2f435dbefd1bd93a5bf3f06dc8fe76044ed3f37fa4f6ef147e9f4c.scope: no such file or directory Mar 08 03:48:37.397457 master-0 kubenswrapper[7547]: W0308 03:48:37.397372 7547 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0031e3a9_b253_4dda_a890_bf3e4d8737e8.slice/crio-conmon-c1f98a5af9a428504d2938393c5cc9f6182f4f61d848078dae052d1d8e74b50e.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch 
/sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0031e3a9_b253_4dda_a890_bf3e4d8737e8.slice/crio-conmon-c1f98a5af9a428504d2938393c5cc9f6182f4f61d848078dae052d1d8e74b50e.scope: no such file or directory Mar 08 03:48:37.397595 master-0 kubenswrapper[7547]: W0308 03:48:37.397566 7547 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b3d1dc7_22f9_4c0c_802a_d7314894b255.slice/crio-conmon-3762e6cf17228a56d31f9c4b27fe58a04fecaafe047e1561d4d5135a98072ca1.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b3d1dc7_22f9_4c0c_802a_d7314894b255.slice/crio-conmon-3762e6cf17228a56d31f9c4b27fe58a04fecaafe047e1561d4d5135a98072ca1.scope: no such file or directory Mar 08 03:48:37.398222 master-0 kubenswrapper[7547]: W0308 03:48:37.398171 7547 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e283f49_b85d_4789_a71f_3fcb5033cdf0.slice/crio-conmon-97d43cbd349f40b6639c942720c80d6ef418f56edf2fb6db63bcde4714444dea.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e283f49_b85d_4789_a71f_3fcb5033cdf0.slice/crio-conmon-97d43cbd349f40b6639c942720c80d6ef418f56edf2fb6db63bcde4714444dea.scope: no such file or directory Mar 08 03:48:37.398275 master-0 kubenswrapper[7547]: W0308 03:48:37.398224 7547 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0031e3a9_b253_4dda_a890_bf3e4d8737e8.slice/crio-c1f98a5af9a428504d2938393c5cc9f6182f4f61d848078dae052d1d8e74b50e.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch 
/sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0031e3a9_b253_4dda_a890_bf3e4d8737e8.slice/crio-c1f98a5af9a428504d2938393c5cc9f6182f4f61d848078dae052d1d8e74b50e.scope: no such file or directory Mar 08 03:48:37.402157 master-0 kubenswrapper[7547]: W0308 03:48:37.402108 7547 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b3d1dc7_22f9_4c0c_802a_d7314894b255.slice/crio-3762e6cf17228a56d31f9c4b27fe58a04fecaafe047e1561d4d5135a98072ca1.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b3d1dc7_22f9_4c0c_802a_d7314894b255.slice/crio-3762e6cf17228a56d31f9c4b27fe58a04fecaafe047e1561d4d5135a98072ca1.scope: no such file or directory Mar 08 03:48:37.402157 master-0 kubenswrapper[7547]: W0308 03:48:37.402149 7547 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e283f49_b85d_4789_a71f_3fcb5033cdf0.slice/crio-97d43cbd349f40b6639c942720c80d6ef418f56edf2fb6db63bcde4714444dea.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e283f49_b85d_4789_a71f_3fcb5033cdf0.slice/crio-97d43cbd349f40b6639c942720c80d6ef418f56edf2fb6db63bcde4714444dea.scope: no such file or directory Mar 08 03:48:37.440241 master-0 kubenswrapper[7547]: I0308 03:48:37.440204 7547 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-etcd/etcd-master-0-master-0"] Mar 08 03:48:37.440500 master-0 kubenswrapper[7547]: I0308 03:48:37.440408 7547 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0-master-0" podUID="354f29997baa583b6238f7de9108ee10" containerName="etcdctl" containerID="cri-o://4573b175e4638284868f035fd979eb84c441b639f2cba6882ebb0bdabc7d53f1" gracePeriod=30 Mar 08 03:48:37.441168 master-0 
kubenswrapper[7547]: I0308 03:48:37.441147 7547 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0-master-0" podUID="354f29997baa583b6238f7de9108ee10" containerName="etcd" containerID="cri-o://c18b9f8b4dcef22d65b7b32df1f7077ca430d4d3cab49ca6d36290193d631e27" gracePeriod=30 Mar 08 03:48:37.446283 master-0 kubenswrapper[7547]: I0308 03:48:37.446140 7547 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-master-0"] Mar 08 03:48:37.456117 master-0 kubenswrapper[7547]: E0308 03:48:37.456052 7547 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="354f29997baa583b6238f7de9108ee10" containerName="etcdctl" Mar 08 03:48:37.456117 master-0 kubenswrapper[7547]: I0308 03:48:37.456093 7547 state_mem.go:107] "Deleted CPUSet assignment" podUID="354f29997baa583b6238f7de9108ee10" containerName="etcdctl" Mar 08 03:48:37.456219 master-0 kubenswrapper[7547]: E0308 03:48:37.456128 7547 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="354f29997baa583b6238f7de9108ee10" containerName="etcd" Mar 08 03:48:37.456219 master-0 kubenswrapper[7547]: I0308 03:48:37.456135 7547 state_mem.go:107] "Deleted CPUSet assignment" podUID="354f29997baa583b6238f7de9108ee10" containerName="etcd" Mar 08 03:48:37.456318 master-0 kubenswrapper[7547]: I0308 03:48:37.456288 7547 memory_manager.go:354] "RemoveStaleState removing state" podUID="354f29997baa583b6238f7de9108ee10" containerName="etcdctl" Mar 08 03:48:37.456318 master-0 kubenswrapper[7547]: I0308 03:48:37.456310 7547 memory_manager.go:354] "RemoveStaleState removing state" podUID="354f29997baa583b6238f7de9108ee10" containerName="etcd" Mar 08 03:48:37.459848 master-0 kubenswrapper[7547]: I0308 03:48:37.457778 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0" Mar 08 03:48:37.468979 master-0 kubenswrapper[7547]: I0308 03:48:37.468947 7547 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w2lsp" Mar 08 03:48:37.479198 master-0 kubenswrapper[7547]: I0308 03:48:37.479167 7547 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9ljn9" Mar 08 03:48:37.497508 master-0 kubenswrapper[7547]: I0308 03:48:37.497086 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-fcdn7_4818bf75-506a-4b39-bb9b-8067a02d4a51/extract-content/0.log" Mar 08 03:48:37.497907 master-0 kubenswrapper[7547]: I0308 03:48:37.497797 7547 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fcdn7" Mar 08 03:48:37.517989 master-0 kubenswrapper[7547]: E0308 03:48:37.517915 7547 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4ff897a_ac47_45e0_aa7d_88c5aea50b70.slice/crio-conmon-3e2b2e09415114bbeaf9e1f998368ceae3dc08609fd38e1dc5b57c9deb896668.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26180f77_0b1a_4d0f_9ed0_a12fdee69817.slice/crio-conmon-2b806592f345fd33c0e6baaad7d7fe21c75572bbe4983f5588e4e61c09a25b29.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47b7e26d_8fb3_4749_a544_c86c3a06e439.slice/crio-conmon-3e7278dc6682aa0a3aac1f49856a042d60ff9bc59a90715e1a5e79961ebe2037.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26180f77_0b1a_4d0f_9ed0_a12fdee69817.slice/crio-2b806592f345fd33c0e6baaad7d7fe21c75572bbe4983f5588e4e61c09a25b29.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-pod0f62034a_dae9_46af_8c14_006b728b631f.slice/crio-d54a64defa77630a2cfba1757b5211284714af7095323e2acde0e62e40e90243.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6cf484f_7125_47ee_9e67_a064d044f43d.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b8dcd07_f245_4783_ba40_521a14e96043.slice/crio-conmon-d96fc44d30582015395e41e6921f416abacff20675fd3fc0f2542974b5f0627f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod792f503e_c34b_4c30_9c9e_70bdea2f2629.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6cf484f_7125_47ee_9e67_a064d044f43d.slice/crio-e3468db4b4de37474c23de3dc1335ad27787122b8d8693bf97799316dadf6a7f\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-pod0f62034a_dae9_46af_8c14_006b728b631f.slice/crio-conmon-d54a64defa77630a2cfba1757b5211284714af7095323e2acde0e62e40e90243.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod792f503e_c34b_4c30_9c9e_70bdea2f2629.slice/crio-35ad5b11950f0afd864c047e90541d0fc2348ab048da8315900ff77325b59f5c\": RecentStats: unable to find data in memory cache]" Mar 08 03:48:37.615060 master-0 kubenswrapper[7547]: I0308 03:48:37.615015 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b8dcd07-f245-4783-ba40-521a14e96043-utilities\") pod \"8b8dcd07-f245-4783-ba40-521a14e96043\" (UID: \"8b8dcd07-f245-4783-ba40-521a14e96043\") " Mar 08 03:48:37.615215 master-0 kubenswrapper[7547]: I0308 03:48:37.615073 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwqlx\" 
(UniqueName: \"kubernetes.io/projected/a4ff897a-ac47-45e0-aa7d-88c5aea50b70-kube-api-access-rwqlx\") pod \"a4ff897a-ac47-45e0-aa7d-88c5aea50b70\" (UID: \"a4ff897a-ac47-45e0-aa7d-88c5aea50b70\") " Mar 08 03:48:37.615215 master-0 kubenswrapper[7547]: I0308 03:48:37.615119 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9smb9\" (UniqueName: \"kubernetes.io/projected/8b8dcd07-f245-4783-ba40-521a14e96043-kube-api-access-9smb9\") pod \"8b8dcd07-f245-4783-ba40-521a14e96043\" (UID: \"8b8dcd07-f245-4783-ba40-521a14e96043\") " Mar 08 03:48:37.615215 master-0 kubenswrapper[7547]: I0308 03:48:37.615137 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4ff897a-ac47-45e0-aa7d-88c5aea50b70-utilities\") pod \"a4ff897a-ac47-45e0-aa7d-88c5aea50b70\" (UID: \"a4ff897a-ac47-45e0-aa7d-88c5aea50b70\") " Mar 08 03:48:37.615215 master-0 kubenswrapper[7547]: I0308 03:48:37.615160 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4818bf75-506a-4b39-bb9b-8067a02d4a51-utilities\") pod \"4818bf75-506a-4b39-bb9b-8067a02d4a51\" (UID: \"4818bf75-506a-4b39-bb9b-8067a02d4a51\") " Mar 08 03:48:37.615215 master-0 kubenswrapper[7547]: I0308 03:48:37.615181 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4818bf75-506a-4b39-bb9b-8067a02d4a51-catalog-content\") pod \"4818bf75-506a-4b39-bb9b-8067a02d4a51\" (UID: \"4818bf75-506a-4b39-bb9b-8067a02d4a51\") " Mar 08 03:48:37.615646 master-0 kubenswrapper[7547]: I0308 03:48:37.615227 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cpkxw\" (UniqueName: \"kubernetes.io/projected/4818bf75-506a-4b39-bb9b-8067a02d4a51-kube-api-access-cpkxw\") pod \"4818bf75-506a-4b39-bb9b-8067a02d4a51\" (UID: 
\"4818bf75-506a-4b39-bb9b-8067a02d4a51\") " Mar 08 03:48:37.615646 master-0 kubenswrapper[7547]: I0308 03:48:37.615250 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4ff897a-ac47-45e0-aa7d-88c5aea50b70-catalog-content\") pod \"a4ff897a-ac47-45e0-aa7d-88c5aea50b70\" (UID: \"a4ff897a-ac47-45e0-aa7d-88c5aea50b70\") " Mar 08 03:48:37.615646 master-0 kubenswrapper[7547]: I0308 03:48:37.615275 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b8dcd07-f245-4783-ba40-521a14e96043-catalog-content\") pod \"8b8dcd07-f245-4783-ba40-521a14e96043\" (UID: \"8b8dcd07-f245-4783-ba40-521a14e96043\") " Mar 08 03:48:37.615646 master-0 kubenswrapper[7547]: I0308 03:48:37.615389 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-usr-local-bin\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 08 03:48:37.615646 master-0 kubenswrapper[7547]: I0308 03:48:37.615413 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-resource-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 08 03:48:37.615646 master-0 kubenswrapper[7547]: I0308 03:48:37.615429 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-static-pod-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 08 03:48:37.615646 master-0 kubenswrapper[7547]: I0308 
03:48:37.615467 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-log-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 08 03:48:37.615646 master-0 kubenswrapper[7547]: I0308 03:48:37.615497 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-data-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 08 03:48:37.615646 master-0 kubenswrapper[7547]: I0308 03:48:37.615516 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-cert-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 08 03:48:37.616462 master-0 kubenswrapper[7547]: I0308 03:48:37.616229 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4ff897a-ac47-45e0-aa7d-88c5aea50b70-utilities" (OuterVolumeSpecName: "utilities") pod "a4ff897a-ac47-45e0-aa7d-88c5aea50b70" (UID: "a4ff897a-ac47-45e0-aa7d-88c5aea50b70"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 03:48:37.616669 master-0 kubenswrapper[7547]: I0308 03:48:37.616578 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4818bf75-506a-4b39-bb9b-8067a02d4a51-utilities" (OuterVolumeSpecName: "utilities") pod "4818bf75-506a-4b39-bb9b-8067a02d4a51" (UID: "4818bf75-506a-4b39-bb9b-8067a02d4a51"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 03:48:37.617098 master-0 kubenswrapper[7547]: I0308 03:48:37.617082 7547 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-c88pb" Mar 08 03:48:37.619484 master-0 kubenswrapper[7547]: I0308 03:48:37.619435 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b8dcd07-f245-4783-ba40-521a14e96043-kube-api-access-9smb9" (OuterVolumeSpecName: "kube-api-access-9smb9") pod "8b8dcd07-f245-4783-ba40-521a14e96043" (UID: "8b8dcd07-f245-4783-ba40-521a14e96043"). InnerVolumeSpecName "kube-api-access-9smb9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:48:37.623294 master-0 kubenswrapper[7547]: I0308 03:48:37.622393 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4818bf75-506a-4b39-bb9b-8067a02d4a51-kube-api-access-cpkxw" (OuterVolumeSpecName: "kube-api-access-cpkxw") pod "4818bf75-506a-4b39-bb9b-8067a02d4a51" (UID: "4818bf75-506a-4b39-bb9b-8067a02d4a51"). InnerVolumeSpecName "kube-api-access-cpkxw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:48:37.624273 master-0 kubenswrapper[7547]: I0308 03:48:37.624141 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b8dcd07-f245-4783-ba40-521a14e96043-utilities" (OuterVolumeSpecName: "utilities") pod "8b8dcd07-f245-4783-ba40-521a14e96043" (UID: "8b8dcd07-f245-4783-ba40-521a14e96043"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 03:48:37.633252 master-0 kubenswrapper[7547]: I0308 03:48:37.633190 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4ff897a-ac47-45e0-aa7d-88c5aea50b70-kube-api-access-rwqlx" (OuterVolumeSpecName: "kube-api-access-rwqlx") pod "a4ff897a-ac47-45e0-aa7d-88c5aea50b70" (UID: "a4ff897a-ac47-45e0-aa7d-88c5aea50b70"). InnerVolumeSpecName "kube-api-access-rwqlx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:48:37.664178 master-0 kubenswrapper[7547]: I0308 03:48:37.664091 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4ff897a-ac47-45e0-aa7d-88c5aea50b70-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a4ff897a-ac47-45e0-aa7d-88c5aea50b70" (UID: "a4ff897a-ac47-45e0-aa7d-88c5aea50b70"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 03:48:37.668518 master-0 kubenswrapper[7547]: I0308 03:48:37.668478 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-1-master-0_0f62034a-dae9-46af-8c14-006b728b631f/installer/0.log" Mar 08 03:48:37.668616 master-0 kubenswrapper[7547]: I0308 03:48:37.668578 7547 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-1-master-0" Mar 08 03:48:37.689493 master-0 kubenswrapper[7547]: I0308 03:48:37.689446 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b8dcd07-f245-4783-ba40-521a14e96043-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8b8dcd07-f245-4783-ba40-521a14e96043" (UID: "8b8dcd07-f245-4783-ba40-521a14e96043"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 03:48:37.716527 master-0 kubenswrapper[7547]: I0308 03:48:37.716440 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgbvc\" (UniqueName: \"kubernetes.io/projected/47b7e26d-8fb3-4749-a544-c86c3a06e439-kube-api-access-tgbvc\") pod \"47b7e26d-8fb3-4749-a544-c86c3a06e439\" (UID: \"47b7e26d-8fb3-4749-a544-c86c3a06e439\") " Mar 08 03:48:37.716527 master-0 kubenswrapper[7547]: I0308 03:48:37.716485 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47b7e26d-8fb3-4749-a544-c86c3a06e439-catalog-content\") pod \"47b7e26d-8fb3-4749-a544-c86c3a06e439\" (UID: \"47b7e26d-8fb3-4749-a544-c86c3a06e439\") " Mar 08 03:48:37.716527 master-0 kubenswrapper[7547]: I0308 03:48:37.716529 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47b7e26d-8fb3-4749-a544-c86c3a06e439-utilities\") pod \"47b7e26d-8fb3-4749-a544-c86c3a06e439\" (UID: \"47b7e26d-8fb3-4749-a544-c86c3a06e439\") " Mar 08 03:48:37.716736 master-0 kubenswrapper[7547]: I0308 03:48:37.716684 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-log-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 08 03:48:37.716736 master-0 kubenswrapper[7547]: I0308 03:48:37.716718 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-data-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 08 03:48:37.716736 master-0 kubenswrapper[7547]: I0308 03:48:37.716736 7547 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-cert-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0"
Mar 08 03:48:37.716889 master-0 kubenswrapper[7547]: I0308 03:48:37.716766 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-usr-local-bin\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0"
Mar 08 03:48:37.716889 master-0 kubenswrapper[7547]: I0308 03:48:37.716785 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-resource-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0"
Mar 08 03:48:37.716889 master-0 kubenswrapper[7547]: I0308 03:48:37.716800 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-static-pod-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0"
Mar 08 03:48:37.717034 master-0 kubenswrapper[7547]: I0308 03:48:37.716930 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-resource-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0"
Mar 08 03:48:37.717034 master-0 kubenswrapper[7547]: I0308 03:48:37.716955 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-data-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0"
Mar 08 03:48:37.717034 master-0 kubenswrapper[7547]: I0308 03:48:37.716998 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-cert-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0"
Mar 08 03:48:37.717537 master-0 kubenswrapper[7547]: I0308 03:48:37.717504 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47b7e26d-8fb3-4749-a544-c86c3a06e439-utilities" (OuterVolumeSpecName: "utilities") pod "47b7e26d-8fb3-4749-a544-c86c3a06e439" (UID: "47b7e26d-8fb3-4749-a544-c86c3a06e439"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 03:48:37.717605 master-0 kubenswrapper[7547]: I0308 03:48:37.717563 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-usr-local-bin\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0"
Mar 08 03:48:37.717605 master-0 kubenswrapper[7547]: I0308 03:48:37.717585 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-log-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0"
Mar 08 03:48:37.717605 master-0 kubenswrapper[7547]: I0308 03:48:37.717600 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-static-pod-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0"
Mar 08 03:48:37.717755 master-0 kubenswrapper[7547]: I0308 03:48:37.717634 7547 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4818bf75-506a-4b39-bb9b-8067a02d4a51-utilities\") on node \"master-0\" DevicePath \"\""
Mar 08 03:48:37.717755 master-0 kubenswrapper[7547]: I0308 03:48:37.717660 7547 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cpkxw\" (UniqueName: \"kubernetes.io/projected/4818bf75-506a-4b39-bb9b-8067a02d4a51-kube-api-access-cpkxw\") on node \"master-0\" DevicePath \"\""
Mar 08 03:48:37.717755 master-0 kubenswrapper[7547]: I0308 03:48:37.717670 7547 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4ff897a-ac47-45e0-aa7d-88c5aea50b70-catalog-content\") on node \"master-0\" DevicePath \"\""
Mar 08 03:48:37.717755 master-0 kubenswrapper[7547]: I0308 03:48:37.717679 7547 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b8dcd07-f245-4783-ba40-521a14e96043-catalog-content\") on node \"master-0\" DevicePath \"\""
Mar 08 03:48:37.717755 master-0 kubenswrapper[7547]: I0308 03:48:37.717688 7547 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b8dcd07-f245-4783-ba40-521a14e96043-utilities\") on node \"master-0\" DevicePath \"\""
Mar 08 03:48:37.717755 master-0 kubenswrapper[7547]: I0308 03:48:37.717696 7547 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwqlx\" (UniqueName: \"kubernetes.io/projected/a4ff897a-ac47-45e0-aa7d-88c5aea50b70-kube-api-access-rwqlx\") on node \"master-0\" DevicePath \"\""
Mar 08 03:48:37.717755 master-0 kubenswrapper[7547]: I0308 03:48:37.717706 7547 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9smb9\" (UniqueName: \"kubernetes.io/projected/8b8dcd07-f245-4783-ba40-521a14e96043-kube-api-access-9smb9\") on node \"master-0\" DevicePath \"\""
Mar 08 03:48:37.717755 master-0 kubenswrapper[7547]: I0308 03:48:37.717715 7547 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4ff897a-ac47-45e0-aa7d-88c5aea50b70-utilities\") on node \"master-0\" DevicePath \"\""
Mar 08 03:48:37.720313 master-0 kubenswrapper[7547]: I0308 03:48:37.720259 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47b7e26d-8fb3-4749-a544-c86c3a06e439-kube-api-access-tgbvc" (OuterVolumeSpecName: "kube-api-access-tgbvc") pod "47b7e26d-8fb3-4749-a544-c86c3a06e439" (UID: "47b7e26d-8fb3-4749-a544-c86c3a06e439"). InnerVolumeSpecName "kube-api-access-tgbvc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 03:48:37.732955 master-0 kubenswrapper[7547]: I0308 03:48:37.732891 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4818bf75-506a-4b39-bb9b-8067a02d4a51-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4818bf75-506a-4b39-bb9b-8067a02d4a51" (UID: "4818bf75-506a-4b39-bb9b-8067a02d4a51"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 03:48:37.782849 master-0 kubenswrapper[7547]: I0308 03:48:37.782755 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47b7e26d-8fb3-4749-a544-c86c3a06e439-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "47b7e26d-8fb3-4749-a544-c86c3a06e439" (UID: "47b7e26d-8fb3-4749-a544-c86c3a06e439"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 03:48:37.818907 master-0 kubenswrapper[7547]: I0308 03:48:37.818842 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0f62034a-dae9-46af-8c14-006b728b631f-kubelet-dir\") pod \"0f62034a-dae9-46af-8c14-006b728b631f\" (UID: \"0f62034a-dae9-46af-8c14-006b728b631f\") "
Mar 08 03:48:37.819093 master-0 kubenswrapper[7547]: I0308 03:48:37.818926 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0f62034a-dae9-46af-8c14-006b728b631f-kube-api-access\") pod \"0f62034a-dae9-46af-8c14-006b728b631f\" (UID: \"0f62034a-dae9-46af-8c14-006b728b631f\") "
Mar 08 03:48:37.819093 master-0 kubenswrapper[7547]: I0308 03:48:37.819040 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0f62034a-dae9-46af-8c14-006b728b631f-var-lock\") pod \"0f62034a-dae9-46af-8c14-006b728b631f\" (UID: \"0f62034a-dae9-46af-8c14-006b728b631f\") "
Mar 08 03:48:37.819292 master-0 kubenswrapper[7547]: I0308 03:48:37.819248 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0f62034a-dae9-46af-8c14-006b728b631f-var-lock" (OuterVolumeSpecName: "var-lock") pod "0f62034a-dae9-46af-8c14-006b728b631f" (UID: "0f62034a-dae9-46af-8c14-006b728b631f"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 03:48:37.819385 master-0 kubenswrapper[7547]: I0308 03:48:37.819268 7547 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgbvc\" (UniqueName: \"kubernetes.io/projected/47b7e26d-8fb3-4749-a544-c86c3a06e439-kube-api-access-tgbvc\") on node \"master-0\" DevicePath \"\""
Mar 08 03:48:37.819385 master-0 kubenswrapper[7547]: I0308 03:48:37.819346 7547 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4818bf75-506a-4b39-bb9b-8067a02d4a51-catalog-content\") on node \"master-0\" DevicePath \"\""
Mar 08 03:48:37.819385 master-0 kubenswrapper[7547]: I0308 03:48:37.819371 7547 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47b7e26d-8fb3-4749-a544-c86c3a06e439-catalog-content\") on node \"master-0\" DevicePath \"\""
Mar 08 03:48:37.819643 master-0 kubenswrapper[7547]: I0308 03:48:37.819395 7547 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47b7e26d-8fb3-4749-a544-c86c3a06e439-utilities\") on node \"master-0\" DevicePath \"\""
Mar 08 03:48:37.819877 master-0 kubenswrapper[7547]: I0308 03:48:37.819764 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0f62034a-dae9-46af-8c14-006b728b631f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "0f62034a-dae9-46af-8c14-006b728b631f" (UID: "0f62034a-dae9-46af-8c14-006b728b631f"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 03:48:37.823591 master-0 kubenswrapper[7547]: I0308 03:48:37.823522 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f62034a-dae9-46af-8c14-006b728b631f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0f62034a-dae9-46af-8c14-006b728b631f" (UID: "0f62034a-dae9-46af-8c14-006b728b631f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 03:48:37.862975 master-0 kubenswrapper[7547]: I0308 03:48:37.862911 7547 generic.go:334] "Generic (PLEG): container finished" podID="0031e3a9-b253-4dda-a890-bf3e4d8737e8" containerID="c1f98a5af9a428504d2938393c5cc9f6182f4f61d848078dae052d1d8e74b50e" exitCode=0
Mar 08 03:48:37.863101 master-0 kubenswrapper[7547]: I0308 03:48:37.863023 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p8nq8" event={"ID":"0031e3a9-b253-4dda-a890-bf3e4d8737e8","Type":"ContainerDied","Data":"c1f98a5af9a428504d2938393c5cc9f6182f4f61d848078dae052d1d8e74b50e"}
Mar 08 03:48:37.868794 master-0 kubenswrapper[7547]: I0308 03:48:37.868696 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-5-master-0" event={"ID":"0f865279-e751-456d-8c96-6381f8b45ce1","Type":"ContainerStarted","Data":"9289f2928e2e95c2ade5890aeb0e93be12cf91a6e92bf8866de144086be0fb16"}
Mar 08 03:48:37.872023 master-0 kubenswrapper[7547]: I0308 03:48:37.871987 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-1-master-0_0f62034a-dae9-46af-8c14-006b728b631f/installer/0.log"
Mar 08 03:48:37.872149 master-0 kubenswrapper[7547]: I0308 03:48:37.872048 7547 generic.go:334] "Generic (PLEG): container finished" podID="0f62034a-dae9-46af-8c14-006b728b631f" containerID="d54a64defa77630a2cfba1757b5211284714af7095323e2acde0e62e40e90243" exitCode=1
Mar 08 03:48:37.872149 master-0 kubenswrapper[7547]: I0308 03:48:37.872113 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-1-master-0" event={"ID":"0f62034a-dae9-46af-8c14-006b728b631f","Type":"ContainerDied","Data":"d54a64defa77630a2cfba1757b5211284714af7095323e2acde0e62e40e90243"}
Mar 08 03:48:37.872149 master-0 kubenswrapper[7547]: I0308 03:48:37.872145 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-1-master-0" event={"ID":"0f62034a-dae9-46af-8c14-006b728b631f","Type":"ContainerDied","Data":"bef040f7b636d10cc98601c076bcc8520e38eceadefefdcd82bcf0929743a68c"}
Mar 08 03:48:37.872308 master-0 kubenswrapper[7547]: I0308 03:48:37.872172 7547 scope.go:117] "RemoveContainer" containerID="d54a64defa77630a2cfba1757b5211284714af7095323e2acde0e62e40e90243"
Mar 08 03:48:37.872308 master-0 kubenswrapper[7547]: I0308 03:48:37.872278 7547 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-1-master-0"
Mar 08 03:48:37.886754 master-0 kubenswrapper[7547]: I0308 03:48:37.886680 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rnz4w" event={"ID":"6bee226a-2a66-4032-8aba-2c8b82abcb6a","Type":"ContainerStarted","Data":"f5a03a43b579d01ff866bab8c195d17a338aa8d73cf258ca83335e51946cf09b"}
Mar 08 03:48:37.889248 master-0 kubenswrapper[7547]: I0308 03:48:37.889201 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" event={"ID":"47643289-ac4b-425d-8ea1-913b6ca39ee0","Type":"ContainerStarted","Data":"18670cb65f485400f1fdb45bed0a06f4e06d21d135459ea29b3c0fcd10f2d210"}
Mar 08 03:48:37.891076 master-0 kubenswrapper[7547]: I0308 03:48:37.891029 7547 generic.go:334] "Generic (PLEG): container finished" podID="2b3d1dc7-22f9-4c0c-802a-d7314894b255" containerID="3762e6cf17228a56d31f9c4b27fe58a04fecaafe047e1561d4d5135a98072ca1" exitCode=0
Mar 08 03:48:37.891190 master-0 kubenswrapper[7547]: I0308 03:48:37.891077 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lwt58" event={"ID":"2b3d1dc7-22f9-4c0c-802a-d7314894b255","Type":"ContainerDied","Data":"3762e6cf17228a56d31f9c4b27fe58a04fecaafe047e1561d4d5135a98072ca1"}
Mar 08 03:48:37.891190 master-0 kubenswrapper[7547]: I0308 03:48:37.891124 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lwt58" event={"ID":"2b3d1dc7-22f9-4c0c-802a-d7314894b255","Type":"ContainerStarted","Data":"c7817ab55e81e53bd9a1c875e0b10710e15527bb4f619ad1dc5011c4087c74fe"}
Mar 08 03:48:37.893733 master-0 kubenswrapper[7547]: I0308 03:48:37.893694 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c88pb" event={"ID":"47b7e26d-8fb3-4749-a544-c86c3a06e439","Type":"ContainerDied","Data":"2dd21a72fca6329403b8f2781feadc418ad77fca18e9f0a797d7ec6d0c5d0b5b"}
Mar 08 03:48:37.893897 master-0 kubenswrapper[7547]: I0308 03:48:37.893759 7547 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-c88pb"
Mar 08 03:48:37.896374 master-0 kubenswrapper[7547]: I0308 03:48:37.896295 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w2lsp" event={"ID":"a4ff897a-ac47-45e0-aa7d-88c5aea50b70","Type":"ContainerDied","Data":"3ef1230e8d3c1752f9176b32c87038d18542085bad33cf4cad2b423f622615a4"}
Mar 08 03:48:37.896374 master-0 kubenswrapper[7547]: I0308 03:48:37.896339 7547 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w2lsp"
Mar 08 03:48:37.898876 master-0 kubenswrapper[7547]: I0308 03:48:37.898605 7547 generic.go:334] "Generic (PLEG): container finished" podID="8e283f49-b85d-4789-a71f-3fcb5033cdf0" containerID="97d43cbd349f40b6639c942720c80d6ef418f56edf2fb6db63bcde4714444dea" exitCode=0
Mar 08 03:48:37.898876 master-0 kubenswrapper[7547]: I0308 03:48:37.898700 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4h8qm" event={"ID":"8e283f49-b85d-4789-a71f-3fcb5033cdf0","Type":"ContainerDied","Data":"97d43cbd349f40b6639c942720c80d6ef418f56edf2fb6db63bcde4714444dea"}
Mar 08 03:48:37.901727 master-0 kubenswrapper[7547]: I0308 03:48:37.901685 7547 scope.go:117] "RemoveContainer" containerID="d54a64defa77630a2cfba1757b5211284714af7095323e2acde0e62e40e90243"
Mar 08 03:48:37.902332 master-0 kubenswrapper[7547]: I0308 03:48:37.902282 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-fcdn7_4818bf75-506a-4b39-bb9b-8067a02d4a51/extract-content/0.log"
Mar 08 03:48:37.902450 master-0 kubenswrapper[7547]: E0308 03:48:37.902368 7547 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d54a64defa77630a2cfba1757b5211284714af7095323e2acde0e62e40e90243\": container with ID starting with d54a64defa77630a2cfba1757b5211284714af7095323e2acde0e62e40e90243 not found: ID does not exist" containerID="d54a64defa77630a2cfba1757b5211284714af7095323e2acde0e62e40e90243"
Mar 08 03:48:37.902522 master-0 kubenswrapper[7547]: I0308 03:48:37.902430 7547 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d54a64defa77630a2cfba1757b5211284714af7095323e2acde0e62e40e90243"} err="failed to get container status \"d54a64defa77630a2cfba1757b5211284714af7095323e2acde0e62e40e90243\": rpc error: code = NotFound desc = could not find container \"d54a64defa77630a2cfba1757b5211284714af7095323e2acde0e62e40e90243\": container with ID starting with d54a64defa77630a2cfba1757b5211284714af7095323e2acde0e62e40e90243 not found: ID does not exist"
Mar 08 03:48:37.902522 master-0 kubenswrapper[7547]: I0308 03:48:37.902471 7547 scope.go:117] "RemoveContainer" containerID="3e7278dc6682aa0a3aac1f49856a042d60ff9bc59a90715e1a5e79961ebe2037"
Mar 08 03:48:37.906424 master-0 kubenswrapper[7547]: I0308 03:48:37.905242 7547 generic.go:334] "Generic (PLEG): container finished" podID="4818bf75-506a-4b39-bb9b-8067a02d4a51" containerID="5f21b41a33309effcfa7d8641b2cb25646283c6b2dcecea3b71aeb78577ba200" exitCode=2
Mar 08 03:48:37.906424 master-0 kubenswrapper[7547]: I0308 03:48:37.905305 7547 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fcdn7"
Mar 08 03:48:37.906424 master-0 kubenswrapper[7547]: I0308 03:48:37.905350 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fcdn7" event={"ID":"4818bf75-506a-4b39-bb9b-8067a02d4a51","Type":"ContainerDied","Data":"5f21b41a33309effcfa7d8641b2cb25646283c6b2dcecea3b71aeb78577ba200"}
Mar 08 03:48:37.906424 master-0 kubenswrapper[7547]: I0308 03:48:37.906058 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fcdn7" event={"ID":"4818bf75-506a-4b39-bb9b-8067a02d4a51","Type":"ContainerDied","Data":"e336c6a6f7f015b88a354964bc81c85ce7b4f460b1abcd96eda1c1313b7e0178"}
Mar 08 03:48:37.918674 master-0 kubenswrapper[7547]: I0308 03:48:37.917788 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-chpl6" event={"ID":"26180f77-0b1a-4d0f-9ed0-a12fdee69817","Type":"ContainerStarted","Data":"b9a51bfe829084894104463976cade708d7a51f90ef15a899d7341f663daf1dc"}
Mar 08 03:48:37.920412 master-0 kubenswrapper[7547]: I0308 03:48:37.920333 7547 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0f62034a-dae9-46af-8c14-006b728b631f-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Mar 08 03:48:37.920412 master-0 kubenswrapper[7547]: I0308 03:48:37.920365 7547 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0f62034a-dae9-46af-8c14-006b728b631f-kube-api-access\") on node \"master-0\" DevicePath \"\""
Mar 08 03:48:37.920412 master-0 kubenswrapper[7547]: I0308 03:48:37.920384 7547 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0f62034a-dae9-46af-8c14-006b728b631f-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 08 03:48:37.924995 master-0 kubenswrapper[7547]: I0308 03:48:37.924945 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9ljn9" event={"ID":"8b8dcd07-f245-4783-ba40-521a14e96043","Type":"ContainerDied","Data":"bf04badf3d7dec96704972bbd5d79b0249130f03eebcc3996b0e1ce42c352e5e"}
Mar 08 03:48:37.925124 master-0 kubenswrapper[7547]: I0308 03:48:37.925041 7547 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9ljn9"
Mar 08 03:48:37.929222 master-0 kubenswrapper[7547]: I0308 03:48:37.929117 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-zq9rp" event={"ID":"2262647b-c315-477a-93bd-f168c1810475","Type":"ContainerStarted","Data":"ff49e653275c39e7af57a8716c460ac2db01b59edb1ee2f75e1bdf35e77e98b5"}
Mar 08 03:48:37.929222 master-0 kubenswrapper[7547]: I0308 03:48:37.929171 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-zq9rp" event={"ID":"2262647b-c315-477a-93bd-f168c1810475","Type":"ContainerStarted","Data":"3f36e6c60bfd2bd17ffa26adda31fc5cb46b7dc64ce396281d544af5a25539b6"}
Mar 08 03:48:37.938272 master-0 kubenswrapper[7547]: I0308 03:48:37.937979 7547 scope.go:117] "RemoveContainer" containerID="6a4685509ca0ee5089a0299c404ad04268f955ec57dcda13dc166fb32adff441"
Mar 08 03:48:37.967097 master-0 kubenswrapper[7547]: I0308 03:48:37.967060 7547 scope.go:117] "RemoveContainer" containerID="3e2b2e09415114bbeaf9e1f998368ceae3dc08609fd38e1dc5b57c9deb896668"
Mar 08 03:48:37.985399 master-0 kubenswrapper[7547]: I0308 03:48:37.985351 7547 scope.go:117] "RemoveContainer" containerID="f3a3a964f5dd60ec9c11603375242a49c3896a4ba1562bbbb0ef714e0b475500"
Mar 08 03:48:38.011185 master-0 kubenswrapper[7547]: I0308 03:48:38.006995 7547 scope.go:117] "RemoveContainer" containerID="5f21b41a33309effcfa7d8641b2cb25646283c6b2dcecea3b71aeb78577ba200"
Mar 08 03:48:38.029447 master-0 kubenswrapper[7547]: I0308 03:48:38.029408 7547 scope.go:117] "RemoveContainer" containerID="7d24a93a94248357a5a5f5d3f147fa35c7c76b7e48473be4b7ed880f3c7c6c1a"
Mar 08 03:48:38.046479 master-0 kubenswrapper[7547]: I0308 03:48:38.046324 7547 scope.go:117] "RemoveContainer" containerID="5f21b41a33309effcfa7d8641b2cb25646283c6b2dcecea3b71aeb78577ba200"
Mar 08 03:48:38.047323 master-0 kubenswrapper[7547]: E0308 03:48:38.047235 7547 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f21b41a33309effcfa7d8641b2cb25646283c6b2dcecea3b71aeb78577ba200\": container with ID starting with 5f21b41a33309effcfa7d8641b2cb25646283c6b2dcecea3b71aeb78577ba200 not found: ID does not exist" containerID="5f21b41a33309effcfa7d8641b2cb25646283c6b2dcecea3b71aeb78577ba200"
Mar 08 03:48:38.047408 master-0 kubenswrapper[7547]: I0308 03:48:38.047327 7547 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f21b41a33309effcfa7d8641b2cb25646283c6b2dcecea3b71aeb78577ba200"} err="failed to get container status \"5f21b41a33309effcfa7d8641b2cb25646283c6b2dcecea3b71aeb78577ba200\": rpc error: code = NotFound desc = could not find container \"5f21b41a33309effcfa7d8641b2cb25646283c6b2dcecea3b71aeb78577ba200\": container with ID starting with 5f21b41a33309effcfa7d8641b2cb25646283c6b2dcecea3b71aeb78577ba200 not found: ID does not exist"
Mar 08 03:48:38.047462 master-0 kubenswrapper[7547]: I0308 03:48:38.047406 7547 scope.go:117] "RemoveContainer" containerID="7d24a93a94248357a5a5f5d3f147fa35c7c76b7e48473be4b7ed880f3c7c6c1a"
Mar 08 03:48:38.050199 master-0 kubenswrapper[7547]: E0308 03:48:38.050120 7547 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d24a93a94248357a5a5f5d3f147fa35c7c76b7e48473be4b7ed880f3c7c6c1a\": container with ID starting with 7d24a93a94248357a5a5f5d3f147fa35c7c76b7e48473be4b7ed880f3c7c6c1a not found: ID does not exist" containerID="7d24a93a94248357a5a5f5d3f147fa35c7c76b7e48473be4b7ed880f3c7c6c1a"
Mar 08 03:48:38.050291 master-0 kubenswrapper[7547]: I0308 03:48:38.050210 7547 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d24a93a94248357a5a5f5d3f147fa35c7c76b7e48473be4b7ed880f3c7c6c1a"} err="failed to get container status \"7d24a93a94248357a5a5f5d3f147fa35c7c76b7e48473be4b7ed880f3c7c6c1a\": rpc error: code = NotFound desc = could not find container \"7d24a93a94248357a5a5f5d3f147fa35c7c76b7e48473be4b7ed880f3c7c6c1a\": container with ID starting with 7d24a93a94248357a5a5f5d3f147fa35c7c76b7e48473be4b7ed880f3c7c6c1a not found: ID does not exist"
Mar 08 03:48:38.050291 master-0 kubenswrapper[7547]: I0308 03:48:38.050255 7547 scope.go:117] "RemoveContainer" containerID="d96fc44d30582015395e41e6921f416abacff20675fd3fc0f2542974b5f0627f"
Mar 08 03:48:38.076353 master-0 kubenswrapper[7547]: I0308 03:48:38.076267 7547 scope.go:117] "RemoveContainer" containerID="49114ad9d3b7bdc277da0ec77422d819922e98024efb0cc791aad1d20f5d05e5"
Mar 08 03:48:38.939504 master-0 kubenswrapper[7547]: I0308 03:48:38.939448 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lwt58" event={"ID":"2b3d1dc7-22f9-4c0c-802a-d7314894b255","Type":"ContainerStarted","Data":"9a8e4c15154302a8bb3171ed8ab697b8ee8b484e44617249609422b38b588432"}
Mar 08 03:48:38.944942 master-0 kubenswrapper[7547]: I0308 03:48:38.944895 7547 generic.go:334] "Generic (PLEG): container finished" podID="8e283f49-b85d-4789-a71f-3fcb5033cdf0" containerID="5ff54f1371455049a6c7805b6b5fc557245d737690eacfa02101a51a25748851" exitCode=0
Mar 08 03:48:38.945079 master-0 kubenswrapper[7547]: I0308 03:48:38.944969 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4h8qm" event={"ID":"8e283f49-b85d-4789-a71f-3fcb5033cdf0","Type":"ContainerDied","Data":"5ff54f1371455049a6c7805b6b5fc557245d737690eacfa02101a51a25748851"}
Mar 08 03:48:38.950787 master-0 kubenswrapper[7547]: I0308 03:48:38.950757 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p8nq8" event={"ID":"0031e3a9-b253-4dda-a890-bf3e4d8737e8","Type":"ContainerStarted","Data":"f0b47768bbdcebe6eae54026a9de105f6f07e381bc83e55351fb7b2741b60d83"}
Mar 08 03:48:38.960256 master-0 kubenswrapper[7547]: I0308 03:48:38.960180 7547 generic.go:334] "Generic (PLEG): container finished" podID="6bee226a-2a66-4032-8aba-2c8b82abcb6a" containerID="f5a03a43b579d01ff866bab8c195d17a338aa8d73cf258ca83335e51946cf09b" exitCode=0
Mar 08 03:48:38.960594 master-0 kubenswrapper[7547]: I0308 03:48:38.960522 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rnz4w" event={"ID":"6bee226a-2a66-4032-8aba-2c8b82abcb6a","Type":"ContainerDied","Data":"f5a03a43b579d01ff866bab8c195d17a338aa8d73cf258ca83335e51946cf09b"}
Mar 08 03:48:39.969442 master-0 kubenswrapper[7547]: I0308 03:48:39.969369 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rnz4w" event={"ID":"6bee226a-2a66-4032-8aba-2c8b82abcb6a","Type":"ContainerStarted","Data":"fa31bd4530a32448189d4f9926eff80552b12132cf2092ed18948eb36236e23c"}
Mar 08 03:48:39.972404 master-0 kubenswrapper[7547]: I0308 03:48:39.972338 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4h8qm" event={"ID":"8e283f49-b85d-4789-a71f-3fcb5033cdf0","Type":"ContainerStarted","Data":"e97d9f1e998f2ef87415886c189959437a6039f6b47fb4815b9b2ece5b1723ac"}
Mar 08 03:48:39.974431 master-0 kubenswrapper[7547]: I0308 03:48:39.974354 7547 generic.go:334] "Generic (PLEG): container finished" podID="2b3d1dc7-22f9-4c0c-802a-d7314894b255" containerID="9a8e4c15154302a8bb3171ed8ab697b8ee8b484e44617249609422b38b588432" exitCode=0
Mar 08 03:48:39.974592 master-0 kubenswrapper[7547]: I0308 03:48:39.974440 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lwt58" event={"ID":"2b3d1dc7-22f9-4c0c-802a-d7314894b255","Type":"ContainerDied","Data":"9a8e4c15154302a8bb3171ed8ab697b8ee8b484e44617249609422b38b588432"}
Mar 08 03:48:39.977501 master-0 kubenswrapper[7547]: I0308 03:48:39.977442 7547 generic.go:334] "Generic (PLEG): container finished" podID="0031e3a9-b253-4dda-a890-bf3e4d8737e8" containerID="f0b47768bbdcebe6eae54026a9de105f6f07e381bc83e55351fb7b2741b60d83" exitCode=0
Mar 08 03:48:39.977711 master-0 kubenswrapper[7547]: I0308 03:48:39.977540 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p8nq8" event={"ID":"0031e3a9-b253-4dda-a890-bf3e4d8737e8","Type":"ContainerDied","Data":"f0b47768bbdcebe6eae54026a9de105f6f07e381bc83e55351fb7b2741b60d83"}
Mar 08 03:48:40.320912 master-0 kubenswrapper[7547]: I0308 03:48:40.320717 7547 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-c46zz"
Mar 08 03:48:40.993593 master-0 kubenswrapper[7547]: I0308 03:48:40.993364 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lwt58" event={"ID":"2b3d1dc7-22f9-4c0c-802a-d7314894b255","Type":"ContainerStarted","Data":"f11191c2a62b995b77ea89d9a917668a9b2c29f7371c6b088866c14ac693f295"}
Mar 08 03:48:41.003773 master-0 kubenswrapper[7547]: I0308 03:48:41.003102 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p8nq8" event={"ID":"0031e3a9-b253-4dda-a890-bf3e4d8737e8","Type":"ContainerStarted","Data":"a69e1edd744234ba21d50fcaac730ae0bd1fd9575ba6056d68507824a223337a"}
Mar 08 03:48:41.596686 master-0 kubenswrapper[7547]: I0308 03:48:41.596591 7547 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lwt58"
Mar 08 03:48:41.597015 master-0 kubenswrapper[7547]: I0308 03:48:41.596862 7547 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lwt58"
Mar 08 03:48:42.011084 master-0 kubenswrapper[7547]: I0308 03:48:42.011022 7547 generic.go:334] "Generic (PLEG): container finished" podID="ee586416-6f56-4ea4-ad62-95de1e6df23b" containerID="f0c21a56c7d12d77087ad5558ab608389fecd51a0d4bdef95c63dd3e4d27cfef" exitCode=0
Mar 08 03:48:42.012111 master-0 kubenswrapper[7547]: I0308 03:48:42.011114 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-8f89dfddd-4mr6p" event={"ID":"ee586416-6f56-4ea4-ad62-95de1e6df23b","Type":"ContainerDied","Data":"f0c21a56c7d12d77087ad5558ab608389fecd51a0d4bdef95c63dd3e4d27cfef"}
Mar 08 03:48:42.012743 master-0 kubenswrapper[7547]: I0308 03:48:42.012704 7547 scope.go:117] "RemoveContainer" containerID="f0c21a56c7d12d77087ad5558ab608389fecd51a0d4bdef95c63dd3e4d27cfef"
Mar 08 03:48:42.648291 master-0 kubenswrapper[7547]: I0308 03:48:42.648233 7547 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-lwt58" podUID="2b3d1dc7-22f9-4c0c-802a-d7314894b255" containerName="registry-server" probeResult="failure" output=<
Mar 08 03:48:42.648291 master-0 kubenswrapper[7547]: timeout: failed to connect service ":50051" within 1s
Mar 08 03:48:42.648291 master-0 kubenswrapper[7547]: >
Mar 08 03:48:42.811358 master-0 kubenswrapper[7547]: I0308 03:48:42.811257 7547 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4h8qm"
Mar 08 03:48:42.811358 master-0 kubenswrapper[7547]: I0308 03:48:42.811349 7547 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4h8qm"
Mar 08 03:48:42.865268 master-0 kubenswrapper[7547]: I0308 03:48:42.865211 7547 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4h8qm"
Mar 08 03:48:43.018490 master-0 kubenswrapper[7547]: I0308 03:48:43.018419 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-8f89dfddd-4mr6p" event={"ID":"ee586416-6f56-4ea4-ad62-95de1e6df23b","Type":"ContainerStarted","Data":"644f0c7d4552f15957ecfc56f2d37a06ec2757ddcc7c2c371f0c34b92aa63533"}
Mar 08 03:48:46.040786 master-0 kubenswrapper[7547]: I0308 03:48:46.040605 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-8565d84698-kt66j_0ebf1330-e044-4ff5-8b48-2d667e0c5625/openshift-controller-manager-operator/0.log"
Mar 08 03:48:46.040786 master-0 kubenswrapper[7547]: I0308 03:48:46.040688 7547 generic.go:334] "Generic (PLEG): container finished" podID="0ebf1330-e044-4ff5-8b48-2d667e0c5625" containerID="8a84af60c043e955bcc0105f0aa3f93048f54c92376777b25ef1335389f355a8" exitCode=1
Mar 08 03:48:46.040786 master-0 kubenswrapper[7547]: I0308 03:48:46.040750 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-kt66j" event={"ID":"0ebf1330-e044-4ff5-8b48-2d667e0c5625","Type":"ContainerDied","Data":"8a84af60c043e955bcc0105f0aa3f93048f54c92376777b25ef1335389f355a8"}
Mar 08 03:48:46.041785 master-0 kubenswrapper[7547]: I0308 03:48:46.041391 7547 scope.go:117] "RemoveContainer" containerID="8a84af60c043e955bcc0105f0aa3f93048f54c92376777b25ef1335389f355a8"
Mar 08 03:48:46.043196 master-0 kubenswrapper[7547]: I0308 03:48:46.043141 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-4-master-0_9fbe4302-8264-4b6c-ae3f-0c6d981bc998/installer/0.log"
Mar 08 03:48:46.043337 master-0 kubenswrapper[7547]: I0308 03:48:46.043224 7547 generic.go:334] "Generic (PLEG): container finished" podID="9fbe4302-8264-4b6c-ae3f-0c6d981bc998" containerID="6c87b0d61a0be181b1c9aaa5d8b34b8c3ddfab52941ef9b130c9b50919085c6e" exitCode=1
Mar 08 03:48:46.043337 master-0 kubenswrapper[7547]: I0308 03:48:46.043283 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-master-0" event={"ID":"9fbe4302-8264-4b6c-ae3f-0c6d981bc998","Type":"ContainerDied","Data":"6c87b0d61a0be181b1c9aaa5d8b34b8c3ddfab52941ef9b130c9b50919085c6e"}
Mar 08 03:48:46.720159 master-0 kubenswrapper[7547]: I0308 03:48:46.720110 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-4-master-0_9fbe4302-8264-4b6c-ae3f-0c6d981bc998/installer/0.log"
Mar 08 03:48:46.720335 master-0 kubenswrapper[7547]: I0308 03:48:46.720172 7547 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-4-master-0"
Mar 08 03:48:46.848722 master-0 kubenswrapper[7547]: I0308 03:48:46.848665 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9fbe4302-8264-4b6c-ae3f-0c6d981bc998-kube-api-access\") pod \"9fbe4302-8264-4b6c-ae3f-0c6d981bc998\" (UID: \"9fbe4302-8264-4b6c-ae3f-0c6d981bc998\") "
Mar 08 03:48:46.849918 master-0 kubenswrapper[7547]: I0308 03:48:46.849468 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9fbe4302-8264-4b6c-ae3f-0c6d981bc998-kubelet-dir\") pod \"9fbe4302-8264-4b6c-ae3f-0c6d981bc998\" (UID: \"9fbe4302-8264-4b6c-ae3f-0c6d981bc998\") "
Mar 08 03:48:46.849918 master-0 kubenswrapper[7547]: I0308 03:48:46.849675 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/9fbe4302-8264-4b6c-ae3f-0c6d981bc998-var-lock\") pod \"9fbe4302-8264-4b6c-ae3f-0c6d981bc998\" (UID: \"9fbe4302-8264-4b6c-ae3f-0c6d981bc998\") "
Mar 08 03:48:46.849918 master-0 kubenswrapper[7547]: I0308 03:48:46.849871 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9fbe4302-8264-4b6c-ae3f-0c6d981bc998-var-lock" (OuterVolumeSpecName: "var-lock") pod "9fbe4302-8264-4b6c-ae3f-0c6d981bc998" (UID: "9fbe4302-8264-4b6c-ae3f-0c6d981bc998"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 03:48:46.850083 master-0 kubenswrapper[7547]: I0308 03:48:46.850013 7547 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/9fbe4302-8264-4b6c-ae3f-0c6d981bc998-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 08 03:48:46.850241 master-0 kubenswrapper[7547]: I0308 03:48:46.850221 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9fbe4302-8264-4b6c-ae3f-0c6d981bc998-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "9fbe4302-8264-4b6c-ae3f-0c6d981bc998" (UID: "9fbe4302-8264-4b6c-ae3f-0c6d981bc998"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 03:48:46.851579 master-0 kubenswrapper[7547]: I0308 03:48:46.851542 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fbe4302-8264-4b6c-ae3f-0c6d981bc998-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "9fbe4302-8264-4b6c-ae3f-0c6d981bc998" (UID: "9fbe4302-8264-4b6c-ae3f-0c6d981bc998"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 03:48:46.950951 master-0 kubenswrapper[7547]: I0308 03:48:46.950730 7547 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9fbe4302-8264-4b6c-ae3f-0c6d981bc998-kube-api-access\") on node \"master-0\" DevicePath \"\""
Mar 08 03:48:46.950951 master-0 kubenswrapper[7547]: I0308 03:48:46.950777 7547 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9fbe4302-8264-4b6c-ae3f-0c6d981bc998-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Mar 08 03:48:47.054045 master-0 kubenswrapper[7547]: I0308 03:48:47.053978 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-8565d84698-kt66j_0ebf1330-e044-4ff5-8b48-2d667e0c5625/openshift-controller-manager-operator/0.log"
Mar 08 03:48:47.054951 master-0 kubenswrapper[7547]: I0308 03:48:47.054136 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-kt66j" event={"ID":"0ebf1330-e044-4ff5-8b48-2d667e0c5625","Type":"ContainerStarted","Data":"cae216678d94c10a368ff595527d708d87bd43ed6865eacedbf892861c47fe3a"}
Mar 08 03:48:47.057872 master-0 kubenswrapper[7547]: I0308 03:48:47.057643 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-4-master-0_9fbe4302-8264-4b6c-ae3f-0c6d981bc998/installer/0.log"
Mar 08 03:48:47.057872 master-0 kubenswrapper[7547]: I0308 03:48:47.057701 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-master-0" event={"ID":"9fbe4302-8264-4b6c-ae3f-0c6d981bc998","Type":"ContainerDied","Data":"c5d13c3d50d361afd8177f873e9bf70ee9d1066fdbbb27d8ba1706b4283439f3"}
Mar 08 03:48:47.057872 master-0 kubenswrapper[7547]: I0308 03:48:47.057742 7547 scope.go:117] "RemoveContainer" 
containerID="6c87b0d61a0be181b1c9aaa5d8b34b8c3ddfab52941ef9b130c9b50919085c6e" Mar 08 03:48:47.058161 master-0 kubenswrapper[7547]: I0308 03:48:47.057928 7547 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-4-master-0" Mar 08 03:48:48.794665 master-0 kubenswrapper[7547]: I0308 03:48:48.794556 7547 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rnz4w" Mar 08 03:48:48.795742 master-0 kubenswrapper[7547]: I0308 03:48:48.794678 7547 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rnz4w" Mar 08 03:48:48.856892 master-0 kubenswrapper[7547]: I0308 03:48:48.856800 7547 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rnz4w" Mar 08 03:48:49.099724 master-0 kubenswrapper[7547]: E0308 03:48:49.099524 7547 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 03:48:49.114460 master-0 kubenswrapper[7547]: I0308 03:48:49.114384 7547 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rnz4w" Mar 08 03:48:49.286332 master-0 kubenswrapper[7547]: E0308 03:48:49.286067 7547 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T03:48:39Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T03:48:39Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T03:48:39Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T03:48:39Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:82f121f9d021a9843b9458f9f222c40f292f2c21dcfcf00f05daacaca8a949c0\\\"],\\\"sizeBytes\\\":1637445817},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:381e96959e3c3b08a3e2715e6024697ae14af31bd0378b49f583e984b3b9a192\\\"],\\\"sizeBytes\\\":1238047254},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c9330c756dd6ab107e9a4b671bc52742c90d5be11a8380d8b710e2bd4e0ed43c\\\"],\\\"sizeBytes\\\":992610645},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\\\"],\\\"sizeBytes\\\":943837171},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e207c762b7802ee0e54507d21ed1f25b19eddc511a4b824934c16c163193be6a\\\"],\\\"sizeBytes\\\":876146500},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:41dbd66e9a886c1fd7a99752f358c6125a209e83c0dd37b35730baae58d82ee8\\\"],\\\"sizeBytes\\\":862633255},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9bfcd8017eede3fb66fa3f5b47c27508b787d38455689154461f0e6a5dc303ff\\\"],\\\"sizeBytes\\\":772939850},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9c946fdc5a4cd16ff998c17844780e7efc38f7f38b97a8a40d75cd77b318ddef\\\"],\\\"sizeBytes\\\":687947017},{\\\"names\\\":[\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0c03cb25dc6f6a865529ebc979e8d7d08492b28fd3fb93beddf30e1cb06f1245\\\"],\\\"sizeBytes\\\":683169303},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3f34dc492c80a3dee4643cc2291044750ac51e6e919b973de8723fa8b70bde70\\\"],\\\"sizeBytes\\\":677929075},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a149ed17b20a7577fceacfc5198f8b7b3edf314ee22f77bd6ab87f06a3aa17f3\\\"],\\\"sizeBytes\\\":621647686},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1ec9d3dbcc6f9817c0f6d09f64c0d98c91b03afbb1fcb3c1e1718aca900754b\\\"],\\\"sizeBytes\\\":589379637},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1575be013a898f153cbf012aeaf28ce720022f934dc05bdffbe479e30999d460\\\"],\\\"sizeBytes\\\":582153879},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:eb82e437a701ce83b70e56be8477d987da67578714dda3d9fa6628804b1b56f5\\\"],\\\"sizeBytes\\\":558210153},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:28f33d62fd0b94c5ea0ebcd7a4216848c8dd671a38d901ce98f4c399b700e1c7\\\"],\\\"sizeBytes\\\":548751793},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cc20748723f55f960cfb6328d1591880bbd1b3452155633996d4f41fc7c5f46b\\\"],\\\"sizeBytes\\\":529324693},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ac6f0695d3386e6d601f4ae507940981352fa3ad884b0fed6fb25698c5e6f916\\\"],\\\"sizeBytes\\\":528946249},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6088910bdc1583b275fab261e3234c0b63b4cc16d01bcea697b6a7f6db13bdf3\\\"],\\\"sizeBytes\\\":518384455},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-release@sha256:14bd3c04daa885009785d48f4973e2890751a7ec116cc14d17627245cda54d7b\\\"],\\\"sizeBytes\\\":517997625},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5500329ab50804678fb8a90b96bf2a469bca16b620fb6dd2f5f5
a17106e94898\\\"],\\\"sizeBytes\\\":514980169},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9bd0b71d620cf0acbfcd1b58797dc30050bd167cb6b7a7f62c8333dd370c76d5\\\"],\\\"sizeBytes\\\":513581866},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9bd818e37e1f9dbe5393c557b89e81010d68171408e0e4157a3d92ae0ca1c953\\\"],\\\"sizeBytes\\\":513220825},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d601c8437b4d8bbe2da0f3b08f1bd8693f5a4ef6d835377ec029c79d9dca5dab\\\"],\\\"sizeBytes\\\":512273539},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b47d2b146e833bc1612a652136f43afcf1ba30f32cbd0a2f06ca9fc80d969f0\\\"],\\\"sizeBytes\\\":511226810},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:834063dd26fb3d2489e193489198a0d5fbe9c775a0e30173e5fcef6994fbf0f6\\\"],\\\"sizeBytes\\\":511164376},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee46e13e26156c904e5784e2d64511021ed0974a169ccd6476b05bff1c44ec56\\\"],\\\"sizeBytes\\\":508888174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7220d16ea511c0f0410cf45db45aaafcc64847c9cb5732ad1eff39ceb482cdba\\\"],\\\"sizeBytes\\\":508544235},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:526c5c02a8fa86a2fa83a7087d4a5c4b1c4072c0f3906163494cc3b3c1295e9b\\\"],\\\"sizeBytes\\\":507967997},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4010a8f9d932615336227e2fd43325d4fa9025dca4bebe032106efea733fcfc3\\\"],\\\"sizeBytes\\\":506479655},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:76b719f5bd541eb1a8bae124d650896b533e7bc3107be536e598b3ab4e135282\\\"],\\\"sizeBytes\\\":506394574},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5de69354d08184ecd6144facc1461777674674e8304971216d4cf1a5025472b9\\\"],\\\"sizeBytes\\\":505344964},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:3a324f47cf789c0480fa4bcb0812152abc3cd844318bab193108fe4349eed609\\\"],\\\"sizeBytes\\\":505242594},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b8cb5e0caeca0fb02f3e8c72b7ddf1c49e3c602e42e119ba30c60525f1db1821\\\"],\\\"sizeBytes\\\":504658657},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d11f13e867f4df046ca6789bb7273da5d0c08895b3dea00949c8a5458f9e22f9\\\"],\\\"sizeBytes\\\":504623546},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:76bdc35338c4d0f5e5b9448fb73e3578656f908a962286692e12a0372ec721d5\\\"],\\\"sizeBytes\\\":495994161},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ff2db11ce277288befab25ddb86177e832842d2edb5607a2da8f252a030e1cfc\\\"],\\\"sizeBytes\\\":495064829},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2fe5144b1f72bdcf5d5a52130f02ed86fbec3875cc4ac108ead00eaac1659e06\\\"],\\\"sizeBytes\\\":487090672},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4a4c3e6ca0cd26f7eb5270cfafbcf423cf2986d152bf5b9fc6469d40599e104e\\\"],\\\"sizeBytes\\\":484450382},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c54c3f7cffe057ae0bdf26163d5e46744685083ae16fc97112e32beacd2d8955\\\"],\\\"sizeBytes\\\":484175664},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d74fe7cb12c554c120262683d9c4066f33ae4f60a5fad83cba419d851b98c12d\\\"],\\\"sizeBytes\\\":470822665},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9b8bc43bac294be3c7669cde049e388ad9d8751242051ba40f83e1c401eceda\\\"],\\\"sizeBytes\\\":468263999},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8677f7a973553c25d282bc249fc8bc0f5aa42fb144ea0956d1f04c5a6cd80501\\\"],\\\"sizeBytes\\\":465086330},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a85dab5856916220df6f05ce9d6aa10cd4fa0234093b55355246690bba05ad1\\\"],\\\
"sizeBytes\\\":463700811},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b714a7ada1e295b599b432f32e1fd5b74c8cdbe6fe51e95306322b25cb873914\\\"],\\\"sizeBytes\\\":458126424},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5230462066ab36e3025524e948dd33fa6f51ee29a4f91fa469bfc268568b5fd9\\\"],\\\"sizeBytes\\\":456575686},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:89cb093f319eaa04acfe9431b8697bffbc71ab670546f7ed257daa332165c626\\\"],\\\"sizeBytes\\\":448828105},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c680fcc9fd6b66099ca4c0f512521b6f8e0bc29273ddb9405730bc54bacb6783\\\"],\\\"sizeBytes\\\":448041621},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cf9670d0f269f8d49fd9ef4981999be195f6624a4146aa93d9201eb8acc81053\\\"],\\\"sizeBytes\\\":443271011},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8ceca1efee55b9fd5089428476bbc401fe73db7c0b0f5e16d4ad28ed0f0f9d43\\\"],\\\"sizeBytes\\\":438654375},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ace4dcd008420277d915fe983b07bbb50fb3ab0673f28d0166424a75bc2137e7\\\"],\\\"sizeBytes\\\":411585608}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"master-0\": the server was unable to return a response in the time allotted, but may still be processing the request (patch nodes master-0)" Mar 08 03:48:50.044416 master-0 kubenswrapper[7547]: I0308 03:48:50.044332 7547 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-p8nq8" Mar 08 03:48:50.046509 master-0 
kubenswrapper[7547]: I0308 03:48:50.046428 7547 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-p8nq8" Mar 08 03:48:50.108712 master-0 kubenswrapper[7547]: I0308 03:48:50.108650 7547 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-p8nq8" Mar 08 03:48:50.633980 master-0 kubenswrapper[7547]: E0308 03:48:50.633874 7547 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-etcd/etcd-master-0" Mar 08 03:48:50.634631 master-0 kubenswrapper[7547]: I0308 03:48:50.634562 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0" Mar 08 03:48:50.660538 master-0 kubenswrapper[7547]: W0308 03:48:50.660434 7547 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e52bef89f4b50e4590a1719bcc5d7e5.slice/crio-bc8712c641a3b3ccd887956343c178e81a448d9908293a089aa942f0944b3018 WatchSource:0}: Error finding container bc8712c641a3b3ccd887956343c178e81a448d9908293a089aa942f0944b3018: Status 404 returned error can't find the container with id bc8712c641a3b3ccd887956343c178e81a448d9908293a089aa942f0944b3018 Mar 08 03:48:51.088889 master-0 kubenswrapper[7547]: I0308 03:48:51.088715 7547 generic.go:334] "Generic (PLEG): container finished" podID="8e52bef89f4b50e4590a1719bcc5d7e5" containerID="74cf1dbcbe0d060e62a1cff77950d3cf19f4f4c11ebaceeae2f072445a583ffa" exitCode=0 Mar 08 03:48:51.088889 master-0 kubenswrapper[7547]: I0308 03:48:51.088860 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerDied","Data":"74cf1dbcbe0d060e62a1cff77950d3cf19f4f4c11ebaceeae2f072445a583ffa"} Mar 08 03:48:51.089724 master-0 
kubenswrapper[7547]: I0308 03:48:51.088946 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerStarted","Data":"bc8712c641a3b3ccd887956343c178e81a448d9908293a089aa942f0944b3018"} Mar 08 03:48:51.153080 master-0 kubenswrapper[7547]: I0308 03:48:51.152940 7547 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-p8nq8" Mar 08 03:48:51.642900 master-0 kubenswrapper[7547]: I0308 03:48:51.642806 7547 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lwt58" Mar 08 03:48:51.707811 master-0 kubenswrapper[7547]: I0308 03:48:51.707756 7547 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lwt58" Mar 08 03:48:52.098087 master-0 kubenswrapper[7547]: I0308 03:48:52.098010 7547 generic.go:334] "Generic (PLEG): container finished" podID="9b02b2c9-9ee3-436c-aa36-0c23e5fdf07b" containerID="e137d58d0275a0b444de45e72047e1d303bf2156296279cd1f222cf4c2e05cac" exitCode=0 Mar 08 03:48:52.098999 master-0 kubenswrapper[7547]: I0308 03:48:52.098119 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"9b02b2c9-9ee3-436c-aa36-0c23e5fdf07b","Type":"ContainerDied","Data":"e137d58d0275a0b444de45e72047e1d303bf2156296279cd1f222cf4c2e05cac"} Mar 08 03:48:52.100779 master-0 kubenswrapper[7547]: I0308 03:48:52.100700 7547 generic.go:334] "Generic (PLEG): container finished" podID="f78c05e1499b533b83f091333d61f045" containerID="724178cb9f231b822e2bf919b24049f88ede4ee540e7e7751c011ef4363756c9" exitCode=1 Mar 08 03:48:52.100958 master-0 kubenswrapper[7547]: I0308 03:48:52.100805 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" 
event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerDied","Data":"724178cb9f231b822e2bf919b24049f88ede4ee540e7e7751c011ef4363756c9"} Mar 08 03:48:52.101952 master-0 kubenswrapper[7547]: I0308 03:48:52.101900 7547 scope.go:117] "RemoveContainer" containerID="724178cb9f231b822e2bf919b24049f88ede4ee540e7e7751c011ef4363756c9" Mar 08 03:48:52.876780 master-0 kubenswrapper[7547]: I0308 03:48:52.876701 7547 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4h8qm" Mar 08 03:48:53.110161 master-0 kubenswrapper[7547]: I0308 03:48:53.109958 7547 generic.go:334] "Generic (PLEG): container finished" podID="a1a56802af72ce1aac6b5077f1695ac0" containerID="f5a0acfb3a3f4f285f366c3abcb3f9d3bebb3626e4a976de0dab27a634745185" exitCode=1 Mar 08 03:48:53.110161 master-0 kubenswrapper[7547]: I0308 03:48:53.110025 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"a1a56802af72ce1aac6b5077f1695ac0","Type":"ContainerDied","Data":"f5a0acfb3a3f4f285f366c3abcb3f9d3bebb3626e4a976de0dab27a634745185"} Mar 08 03:48:53.111207 master-0 kubenswrapper[7547]: I0308 03:48:53.110623 7547 scope.go:117] "RemoveContainer" containerID="f5a0acfb3a3f4f285f366c3abcb3f9d3bebb3626e4a976de0dab27a634745185" Mar 08 03:48:53.115342 master-0 kubenswrapper[7547]: I0308 03:48:53.115261 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerStarted","Data":"c67a98430dbc4d1ec5edc5c2aa37ee19dd047e853de22d326be45ea84e3430ff"} Mar 08 03:48:53.537300 master-0 kubenswrapper[7547]: I0308 03:48:53.537236 7547 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-1-master-0" Mar 08 03:48:53.540548 master-0 kubenswrapper[7547]: I0308 03:48:53.540502 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/9b02b2c9-9ee3-436c-aa36-0c23e5fdf07b-var-lock\") pod \"9b02b2c9-9ee3-436c-aa36-0c23e5fdf07b\" (UID: \"9b02b2c9-9ee3-436c-aa36-0c23e5fdf07b\") " Mar 08 03:48:53.540651 master-0 kubenswrapper[7547]: I0308 03:48:53.540617 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9b02b2c9-9ee3-436c-aa36-0c23e5fdf07b-kubelet-dir\") pod \"9b02b2c9-9ee3-436c-aa36-0c23e5fdf07b\" (UID: \"9b02b2c9-9ee3-436c-aa36-0c23e5fdf07b\") " Mar 08 03:48:53.540706 master-0 kubenswrapper[7547]: I0308 03:48:53.540672 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9b02b2c9-9ee3-436c-aa36-0c23e5fdf07b-kube-api-access\") pod \"9b02b2c9-9ee3-436c-aa36-0c23e5fdf07b\" (UID: \"9b02b2c9-9ee3-436c-aa36-0c23e5fdf07b\") " Mar 08 03:48:53.541536 master-0 kubenswrapper[7547]: I0308 03:48:53.541474 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9b02b2c9-9ee3-436c-aa36-0c23e5fdf07b-var-lock" (OuterVolumeSpecName: "var-lock") pod "9b02b2c9-9ee3-436c-aa36-0c23e5fdf07b" (UID: "9b02b2c9-9ee3-436c-aa36-0c23e5fdf07b"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:48:53.541593 master-0 kubenswrapper[7547]: I0308 03:48:53.541540 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9b02b2c9-9ee3-436c-aa36-0c23e5fdf07b-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "9b02b2c9-9ee3-436c-aa36-0c23e5fdf07b" (UID: "9b02b2c9-9ee3-436c-aa36-0c23e5fdf07b"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:48:53.544817 master-0 kubenswrapper[7547]: I0308 03:48:53.544769 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b02b2c9-9ee3-436c-aa36-0c23e5fdf07b-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "9b02b2c9-9ee3-436c-aa36-0c23e5fdf07b" (UID: "9b02b2c9-9ee3-436c-aa36-0c23e5fdf07b"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:48:53.642048 master-0 kubenswrapper[7547]: I0308 03:48:53.641901 7547 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9b02b2c9-9ee3-436c-aa36-0c23e5fdf07b-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 03:48:53.642048 master-0 kubenswrapper[7547]: I0308 03:48:53.641974 7547 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9b02b2c9-9ee3-436c-aa36-0c23e5fdf07b-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 08 03:48:53.642048 master-0 kubenswrapper[7547]: I0308 03:48:53.641996 7547 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/9b02b2c9-9ee3-436c-aa36-0c23e5fdf07b-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 08 03:48:53.675633 master-0 kubenswrapper[7547]: I0308 03:48:53.675560 7547 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:48:54.136973 master-0 kubenswrapper[7547]: I0308 03:48:54.136895 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"a1a56802af72ce1aac6b5077f1695ac0","Type":"ContainerStarted","Data":"255dd70f3aa78d8d4e9fb681404034a533a64980f735eecd5cf5d8b6ad4838a5"} Mar 08 03:48:54.138954 master-0 kubenswrapper[7547]: I0308 03:48:54.138912 7547 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"9b02b2c9-9ee3-436c-aa36-0c23e5fdf07b","Type":"ContainerDied","Data":"0ed8c48f565d4b8be16c6ba185f91ba3e8904463e008be6f2e6c969571e27427"} Mar 08 03:48:54.139079 master-0 kubenswrapper[7547]: I0308 03:48:54.138959 7547 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ed8c48f565d4b8be16c6ba185f91ba3e8904463e008be6f2e6c969571e27427" Mar 08 03:48:54.139079 master-0 kubenswrapper[7547]: I0308 03:48:54.138955 7547 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-1-master-0" Mar 08 03:48:56.116369 master-0 kubenswrapper[7547]: I0308 03:48:56.116288 7547 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:48:56.676470 master-0 kubenswrapper[7547]: I0308 03:48:56.676345 7547 prober.go:107] "Probe failed" probeType="Startup" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 08 03:48:59.101078 master-0 kubenswrapper[7547]: E0308 03:48:59.100896 7547 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 03:48:59.287345 master-0 kubenswrapper[7547]: E0308 03:48:59.287235 7547 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 
08 03:49:01.187858 master-0 kubenswrapper[7547]: I0308 03:49:01.187725 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-master-0_baab6171-046d-4fc9-b7d7-ff2fd12f185f/installer/0.log" Mar 08 03:49:01.188666 master-0 kubenswrapper[7547]: I0308 03:49:01.187876 7547 generic.go:334] "Generic (PLEG): container finished" podID="baab6171-046d-4fc9-b7d7-ff2fd12f185f" containerID="5a92eed331c18522564f92e3e6e14d9dcb5be24514d5ff22fbf01a140de4cfee" exitCode=1 Mar 08 03:49:01.188666 master-0 kubenswrapper[7547]: I0308 03:49:01.187956 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"baab6171-046d-4fc9-b7d7-ff2fd12f185f","Type":"ContainerDied","Data":"5a92eed331c18522564f92e3e6e14d9dcb5be24514d5ff22fbf01a140de4cfee"} Mar 08 03:49:02.577748 master-0 kubenswrapper[7547]: I0308 03:49:02.577691 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-master-0_baab6171-046d-4fc9-b7d7-ff2fd12f185f/installer/0.log" Mar 08 03:49:02.578417 master-0 kubenswrapper[7547]: I0308 03:49:02.577789 7547 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0" Mar 08 03:49:02.677206 master-0 kubenswrapper[7547]: I0308 03:49:02.677123 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/baab6171-046d-4fc9-b7d7-ff2fd12f185f-var-lock\") pod \"baab6171-046d-4fc9-b7d7-ff2fd12f185f\" (UID: \"baab6171-046d-4fc9-b7d7-ff2fd12f185f\") " Mar 08 03:49:02.677206 master-0 kubenswrapper[7547]: I0308 03:49:02.677193 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/baab6171-046d-4fc9-b7d7-ff2fd12f185f-kube-api-access\") pod \"baab6171-046d-4fc9-b7d7-ff2fd12f185f\" (UID: \"baab6171-046d-4fc9-b7d7-ff2fd12f185f\") " Mar 08 03:49:02.677530 master-0 kubenswrapper[7547]: I0308 03:49:02.677334 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/baab6171-046d-4fc9-b7d7-ff2fd12f185f-kubelet-dir\") pod \"baab6171-046d-4fc9-b7d7-ff2fd12f185f\" (UID: \"baab6171-046d-4fc9-b7d7-ff2fd12f185f\") " Mar 08 03:49:02.677530 master-0 kubenswrapper[7547]: I0308 03:49:02.677340 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/baab6171-046d-4fc9-b7d7-ff2fd12f185f-var-lock" (OuterVolumeSpecName: "var-lock") pod "baab6171-046d-4fc9-b7d7-ff2fd12f185f" (UID: "baab6171-046d-4fc9-b7d7-ff2fd12f185f"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:49:02.677664 master-0 kubenswrapper[7547]: I0308 03:49:02.677619 7547 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/baab6171-046d-4fc9-b7d7-ff2fd12f185f-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 08 03:49:02.677730 master-0 kubenswrapper[7547]: I0308 03:49:02.677681 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/baab6171-046d-4fc9-b7d7-ff2fd12f185f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "baab6171-046d-4fc9-b7d7-ff2fd12f185f" (UID: "baab6171-046d-4fc9-b7d7-ff2fd12f185f"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:49:02.681462 master-0 kubenswrapper[7547]: I0308 03:49:02.681404 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/baab6171-046d-4fc9-b7d7-ff2fd12f185f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "baab6171-046d-4fc9-b7d7-ff2fd12f185f" (UID: "baab6171-046d-4fc9-b7d7-ff2fd12f185f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:49:02.778334 master-0 kubenswrapper[7547]: I0308 03:49:02.778260 7547 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/baab6171-046d-4fc9-b7d7-ff2fd12f185f-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 08 03:49:02.778334 master-0 kubenswrapper[7547]: I0308 03:49:02.778313 7547 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/baab6171-046d-4fc9-b7d7-ff2fd12f185f-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 03:49:03.206101 master-0 kubenswrapper[7547]: I0308 03:49:03.206062 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-master-0_baab6171-046d-4fc9-b7d7-ff2fd12f185f/installer/0.log" Mar 08 03:49:03.206480 master-0 kubenswrapper[7547]: I0308 03:49:03.206441 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"baab6171-046d-4fc9-b7d7-ff2fd12f185f","Type":"ContainerDied","Data":"f2dd6f50bb704e85814a264a6b3647cea280a2063c980541df1082c59aa92b82"} Mar 08 03:49:03.206661 master-0 kubenswrapper[7547]: I0308 03:49:03.206598 7547 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0" Mar 08 03:49:03.206873 master-0 kubenswrapper[7547]: I0308 03:49:03.206607 7547 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2dd6f50bb704e85814a264a6b3647cea280a2063c980541df1082c59aa92b82" Mar 08 03:49:04.189688 master-0 kubenswrapper[7547]: E0308 03:49:04.189524 7547 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-etcd/etcd-master-0" Mar 08 03:49:05.055642 master-0 kubenswrapper[7547]: I0308 03:49:05.055561 7547 patch_prober.go:28] interesting pod/authentication-operator-7c6989d6c4-slm72 container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.128.0.16:8443/healthz\": dial tcp 10.128.0.16:8443: connect: connection refused" start-of-body= Mar 08 03:49:05.056046 master-0 kubenswrapper[7547]: I0308 03:49:05.055649 7547 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-slm72" podUID="a60bc804-52e7-422a-87fd-ac4c5aa90cb3" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.128.0.16:8443/healthz\": dial tcp 10.128.0.16:8443: connect: connection refused" Mar 08 03:49:05.244269 master-0 kubenswrapper[7547]: I0308 03:49:05.244155 7547 generic.go:334] "Generic (PLEG): container finished" podID="8e52bef89f4b50e4590a1719bcc5d7e5" containerID="338e07cf4149947d0b4bb7aee072ff8d4da6cb3eeb924ae9f2fa6dc0d8d523b1" exitCode=0 Mar 08 03:49:05.244269 master-0 kubenswrapper[7547]: I0308 03:49:05.244224 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerDied","Data":"338e07cf4149947d0b4bb7aee072ff8d4da6cb3eeb924ae9f2fa6dc0d8d523b1"} Mar 08 03:49:05.248457 master-0 
kubenswrapper[7547]: I0308 03:49:05.248361 7547 generic.go:334] "Generic (PLEG): container finished" podID="354f29997baa583b6238f7de9108ee10" containerID="c18b9f8b4dcef22d65b7b32df1f7077ca430d4d3cab49ca6d36290193d631e27" exitCode=0 Mar 08 03:49:06.675608 master-0 kubenswrapper[7547]: I0308 03:49:06.675469 7547 prober.go:107] "Probe failed" probeType="Startup" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 08 03:49:07.574187 master-0 kubenswrapper[7547]: I0308 03:49:07.574135 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0-master-0_354f29997baa583b6238f7de9108ee10/etcdctl/0.log" Mar 08 03:49:07.574291 master-0 kubenswrapper[7547]: I0308 03:49:07.574265 7547 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-master-0-master-0" Mar 08 03:49:07.748018 master-0 kubenswrapper[7547]: I0308 03:49:07.747944 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-certs\") pod \"354f29997baa583b6238f7de9108ee10\" (UID: \"354f29997baa583b6238f7de9108ee10\") " Mar 08 03:49:07.748661 master-0 kubenswrapper[7547]: I0308 03:49:07.748140 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-data-dir\") pod \"354f29997baa583b6238f7de9108ee10\" (UID: \"354f29997baa583b6238f7de9108ee10\") " Mar 08 03:49:07.748661 master-0 kubenswrapper[7547]: I0308 03:49:07.748270 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-certs" (OuterVolumeSpecName: "certs") pod "354f29997baa583b6238f7de9108ee10" (UID: "354f29997baa583b6238f7de9108ee10"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:49:07.748661 master-0 kubenswrapper[7547]: I0308 03:49:07.748399 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-data-dir" (OuterVolumeSpecName: "data-dir") pod "354f29997baa583b6238f7de9108ee10" (UID: "354f29997baa583b6238f7de9108ee10"). InnerVolumeSpecName "data-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:49:07.748661 master-0 kubenswrapper[7547]: I0308 03:49:07.748656 7547 reconciler_common.go:293] "Volume detached for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-data-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 03:49:07.748955 master-0 kubenswrapper[7547]: I0308 03:49:07.748685 7547 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-certs\") on node \"master-0\" DevicePath \"\"" Mar 08 03:49:08.270414 master-0 kubenswrapper[7547]: I0308 03:49:08.270345 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0-master-0_354f29997baa583b6238f7de9108ee10/etcdctl/0.log" Mar 08 03:49:08.270654 master-0 kubenswrapper[7547]: I0308 03:49:08.270437 7547 generic.go:334] "Generic (PLEG): container finished" podID="354f29997baa583b6238f7de9108ee10" containerID="4573b175e4638284868f035fd979eb84c441b639f2cba6882ebb0bdabc7d53f1" exitCode=137 Mar 08 03:49:08.270654 master-0 kubenswrapper[7547]: I0308 03:49:08.270502 7547 scope.go:117] "RemoveContainer" containerID="c18b9f8b4dcef22d65b7b32df1f7077ca430d4d3cab49ca6d36290193d631e27" Mar 08 03:49:08.270654 master-0 kubenswrapper[7547]: I0308 03:49:08.270523 7547 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-master-0-master-0" Mar 08 03:49:08.288603 master-0 kubenswrapper[7547]: I0308 03:49:08.288558 7547 scope.go:117] "RemoveContainer" containerID="4573b175e4638284868f035fd979eb84c441b639f2cba6882ebb0bdabc7d53f1" Mar 08 03:49:08.306952 master-0 kubenswrapper[7547]: I0308 03:49:08.306789 7547 scope.go:117] "RemoveContainer" containerID="c18b9f8b4dcef22d65b7b32df1f7077ca430d4d3cab49ca6d36290193d631e27" Mar 08 03:49:08.307518 master-0 kubenswrapper[7547]: E0308 03:49:08.307469 7547 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c18b9f8b4dcef22d65b7b32df1f7077ca430d4d3cab49ca6d36290193d631e27\": container with ID starting with c18b9f8b4dcef22d65b7b32df1f7077ca430d4d3cab49ca6d36290193d631e27 not found: ID does not exist" containerID="c18b9f8b4dcef22d65b7b32df1f7077ca430d4d3cab49ca6d36290193d631e27" Mar 08 03:49:08.307623 master-0 kubenswrapper[7547]: I0308 03:49:08.307518 7547 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c18b9f8b4dcef22d65b7b32df1f7077ca430d4d3cab49ca6d36290193d631e27"} err="failed to get container status \"c18b9f8b4dcef22d65b7b32df1f7077ca430d4d3cab49ca6d36290193d631e27\": rpc error: code = NotFound desc = could not find container \"c18b9f8b4dcef22d65b7b32df1f7077ca430d4d3cab49ca6d36290193d631e27\": container with ID starting with c18b9f8b4dcef22d65b7b32df1f7077ca430d4d3cab49ca6d36290193d631e27 not found: ID does not exist" Mar 08 03:49:08.307623 master-0 kubenswrapper[7547]: I0308 03:49:08.307561 7547 scope.go:117] "RemoveContainer" containerID="4573b175e4638284868f035fd979eb84c441b639f2cba6882ebb0bdabc7d53f1" Mar 08 03:49:08.308285 master-0 kubenswrapper[7547]: E0308 03:49:08.308228 7547 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4573b175e4638284868f035fd979eb84c441b639f2cba6882ebb0bdabc7d53f1\": 
container with ID starting with 4573b175e4638284868f035fd979eb84c441b639f2cba6882ebb0bdabc7d53f1 not found: ID does not exist" containerID="4573b175e4638284868f035fd979eb84c441b639f2cba6882ebb0bdabc7d53f1" Mar 08 03:49:08.308285 master-0 kubenswrapper[7547]: I0308 03:49:08.308268 7547 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4573b175e4638284868f035fd979eb84c441b639f2cba6882ebb0bdabc7d53f1"} err="failed to get container status \"4573b175e4638284868f035fd979eb84c441b639f2cba6882ebb0bdabc7d53f1\": rpc error: code = NotFound desc = could not find container \"4573b175e4638284868f035fd979eb84c441b639f2cba6882ebb0bdabc7d53f1\": container with ID starting with 4573b175e4638284868f035fd979eb84c441b639f2cba6882ebb0bdabc7d53f1 not found: ID does not exist" Mar 08 03:49:09.101799 master-0 kubenswrapper[7547]: E0308 03:49:09.101682 7547 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 03:49:09.259287 master-0 kubenswrapper[7547]: I0308 03:49:09.259174 7547 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="354f29997baa583b6238f7de9108ee10" path="/var/lib/kubelet/pods/354f29997baa583b6238f7de9108ee10/volumes" Mar 08 03:49:09.259589 master-0 kubenswrapper[7547]: I0308 03:49:09.259502 7547 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID="" Mar 08 03:49:09.288347 master-0 kubenswrapper[7547]: E0308 03:49:09.288282 7547 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 03:49:11.450266 master-0 kubenswrapper[7547]: E0308 
03:49:11.450070 7547 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{etcd-master-0-master-0.189ac1193b8ccc46 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:354f29997baa583b6238f7de9108ee10,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Killing,Message:Stopping container etcd,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:48:37.441137734 +0000 UTC m=+80.386822247,LastTimestamp:2026-03-08 03:48:37.441137734 +0000 UTC m=+80.386822247,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:49:15.055774 master-0 kubenswrapper[7547]: I0308 03:49:15.055678 7547 patch_prober.go:28] interesting pod/authentication-operator-7c6989d6c4-slm72 container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.128.0.16:8443/healthz\": dial tcp 10.128.0.16:8443: connect: connection refused" start-of-body= Mar 08 03:49:15.056621 master-0 kubenswrapper[7547]: I0308 03:49:15.055791 7547 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-slm72" podUID="a60bc804-52e7-422a-87fd-ac4c5aa90cb3" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.128.0.16:8443/healthz\": dial tcp 10.128.0.16:8443: connect: connection refused" Mar 08 03:49:16.326856 master-0 kubenswrapper[7547]: I0308 03:49:16.326759 7547 generic.go:334] "Generic (PLEG): container finished" podID="b3eea925-73b3-4693-8f0e-6dd26107f60a" containerID="9c4b058bc98e254a8a4b1a2af3561d6b7519c1e36ed6446917dcc85e6786652f" exitCode=0 Mar 08 03:49:16.529670 
master-0 kubenswrapper[7547]: I0308 03:49:16.529600 7547 patch_prober.go:28] interesting pod/etcd-operator-5884b9cd56-vzms7 container/etcd-operator namespace/openshift-etcd-operator: Liveness probe status=failure output="Get \"https://10.128.0.5:8443/healthz\": dial tcp 10.128.0.5:8443: connect: connection refused" start-of-body= Mar 08 03:49:16.530189 master-0 kubenswrapper[7547]: I0308 03:49:16.530140 7547 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-vzms7" podUID="5a7752f9-7b9a-451f-997a-e9f696d38b34" containerName="etcd-operator" probeResult="failure" output="Get \"https://10.128.0.5:8443/healthz\": dial tcp 10.128.0.5:8443: connect: connection refused" Mar 08 03:49:16.675577 master-0 kubenswrapper[7547]: I0308 03:49:16.675321 7547 prober.go:107] "Probe failed" probeType="Startup" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 08 03:49:18.253447 master-0 kubenswrapper[7547]: E0308 03:49:18.253376 7547 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-etcd/etcd-master-0" Mar 08 03:49:19.102701 master-0 kubenswrapper[7547]: E0308 03:49:19.102578 7547 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 03:49:19.289139 master-0 kubenswrapper[7547]: E0308 03:49:19.289033 7547 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get 
\"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 03:49:19.351413 master-0 kubenswrapper[7547]: I0308 03:49:19.351318 7547 generic.go:334] "Generic (PLEG): container finished" podID="8e52bef89f4b50e4590a1719bcc5d7e5" containerID="3df151c3da265182304d84afb0b3bc1e42416ef6485b53e3bd88733c8055b421" exitCode=0 Mar 08 03:49:20.361063 master-0 kubenswrapper[7547]: I0308 03:49:20.360994 7547 generic.go:334] "Generic (PLEG): container finished" podID="5a7752f9-7b9a-451f-997a-e9f696d38b34" containerID="37cbe69de0ead690fe5f97f7713b1a785d6ee472fbe38e74b1ca8bbb8ffc0b32" exitCode=0 Mar 08 03:49:20.364622 master-0 kubenswrapper[7547]: I0308 03:49:20.364557 7547 generic.go:334] "Generic (PLEG): container finished" podID="7ff63c73-62a3-44b4-acd3-1b3df175794f" containerID="0d9517a4dbfbd842f9c484f0081150c86a4b5af486a5ddb8461a1d470f81112c" exitCode=0 Mar 08 03:49:21.221690 master-0 kubenswrapper[7547]: E0308 03:49:21.221564 7547 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[auth-proxy-config], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-2vjh2" podUID="e78b283b-981e-48d7-a5f2-53f8401766ea" Mar 08 03:49:25.820715 master-0 kubenswrapper[7547]: I0308 03:49:25.820029 7547 patch_prober.go:28] interesting pod/authentication-operator-7c6989d6c4-slm72 container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.128.0.16:8443/healthz\": dial tcp 10.128.0.16:8443: connect: connection refused" start-of-body= Mar 08 03:49:25.820715 master-0 kubenswrapper[7547]: I0308 03:49:25.820116 7547 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-slm72" podUID="a60bc804-52e7-422a-87fd-ac4c5aa90cb3" 
containerName="authentication-operator" probeResult="failure" output="Get \"https://10.128.0.16:8443/healthz\": dial tcp 10.128.0.16:8443: connect: connection refused" Mar 08 03:49:26.319007 master-0 kubenswrapper[7547]: I0308 03:49:26.318921 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e78b283b-981e-48d7-a5f2-53f8401766ea-auth-proxy-config\") pod \"machine-config-operator-fdb5c78b5-2vjh2\" (UID: \"e78b283b-981e-48d7-a5f2-53f8401766ea\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-2vjh2" Mar 08 03:49:26.319252 master-0 kubenswrapper[7547]: E0308 03:49:26.319075 7547 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/kube-rbac-proxy: configmap "kube-rbac-proxy" not found Mar 08 03:49:26.319252 master-0 kubenswrapper[7547]: E0308 03:49:26.319207 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e78b283b-981e-48d7-a5f2-53f8401766ea-auth-proxy-config podName:e78b283b-981e-48d7-a5f2-53f8401766ea nodeName:}" failed. No retries permitted until 2026-03-08 03:51:28.319180787 +0000 UTC m=+251.264865330 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/e78b283b-981e-48d7-a5f2-53f8401766ea-auth-proxy-config") pod "machine-config-operator-fdb5c78b5-2vjh2" (UID: "e78b283b-981e-48d7-a5f2-53f8401766ea") : configmap "kube-rbac-proxy" not found Mar 08 03:49:26.826174 master-0 kubenswrapper[7547]: I0308 03:49:26.826122 7547 generic.go:334] "Generic (PLEG): container finished" podID="e4541b7b-3f7f-4851-9bd9-26fcda5cab13" containerID="2b3399b78be3045c232df9d3c4545d85577efb44cef1d2c0a18e98d67e4c7cb7" exitCode=0 Mar 08 03:49:26.827802 master-0 kubenswrapper[7547]: I0308 03:49:26.827757 7547 generic.go:334] "Generic (PLEG): container finished" podID="3ddfd0e7-fe76-41bc-b316-94505df81002" containerID="8f8cad46e77715e164ec2e62df8c2ee60a2b96fa8c918baa1589a3082317e15b" exitCode=0 Mar 08 03:49:26.829726 master-0 kubenswrapper[7547]: I0308 03:49:26.829684 7547 generic.go:334] "Generic (PLEG): container finished" podID="30211469-7108-4820-a988-26fc4ced734e" containerID="47c3f7232d0f0bc4de9dbe2ca382d3e0709c3d618e0b06a088f2ef41c6b071e7" exitCode=0 Mar 08 03:49:29.104985 master-0 kubenswrapper[7547]: E0308 03:49:29.103603 7547 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 03:49:29.104985 master-0 kubenswrapper[7547]: I0308 03:49:29.103685 7547 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 08 03:49:29.290049 master-0 kubenswrapper[7547]: E0308 03:49:29.289956 7547 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 
03:49:29.290049 master-0 kubenswrapper[7547]: E0308 03:49:29.290020 7547 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 08 03:49:29.852382 master-0 kubenswrapper[7547]: I0308 03:49:29.852338 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-ggzm8_164586b1-f133-4427-8ab6-eb0839b79738/approver/0.log" Mar 08 03:49:29.853403 master-0 kubenswrapper[7547]: I0308 03:49:29.853324 7547 generic.go:334] "Generic (PLEG): container finished" podID="164586b1-f133-4427-8ab6-eb0839b79738" containerID="eca6f5647fbdf9b3ef8c7044a7fb91cd16de860543c74991829e340da4a238fe" exitCode=1 Mar 08 03:49:29.855992 master-0 kubenswrapper[7547]: I0308 03:49:29.855930 7547 generic.go:334] "Generic (PLEG): container finished" podID="5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a" containerID="357e1b5825405be66b8754168b57e640c102e496555d0a6b7dd9834bacebf15e" exitCode=0 Mar 08 03:49:30.865255 master-0 kubenswrapper[7547]: I0308 03:49:30.865086 7547 generic.go:334] "Generic (PLEG): container finished" podID="a60bc804-52e7-422a-87fd-ac4c5aa90cb3" containerID="d4bcecac644708f2f25ce7ab391ef8889989648db06b3e9db25dd3f64bfa6da8" exitCode=0 Mar 08 03:49:35.900693 master-0 kubenswrapper[7547]: I0308 03:49:35.900514 7547 generic.go:334] "Generic (PLEG): container finished" podID="0d377285-0336-41b7-b48f-c44a7b563498" containerID="db7380d3fe7301944a7a66ec837b1d91caad2bb5d7122a498e8b10a38c9f552b" exitCode=0 Mar 08 03:49:37.471140 master-0 kubenswrapper[7547]: I0308 03:49:37.471071 7547 status_manager.go:851] "Failed to get status for pod" podUID="a4ff897a-ac47-45e0-aa7d-88c5aea50b70" pod="openshift-marketplace/redhat-marketplace-w2lsp" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods redhat-marketplace-w2lsp)" Mar 08 03:49:39.104416 master-0 kubenswrapper[7547]: E0308 03:49:39.104326 7547 controller.go:145] "Failed to ensure 
lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="200ms" Mar 08 03:49:43.262850 master-0 kubenswrapper[7547]: E0308 03:49:43.262726 7547 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0-master-0" Mar 08 03:49:43.263705 master-0 kubenswrapper[7547]: E0308 03:49:43.262999 7547 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="34.032s" Mar 08 03:49:43.263705 master-0 kubenswrapper[7547]: I0308 03:49:43.263043 7547 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:49:43.263705 master-0 kubenswrapper[7547]: I0308 03:49:43.263079 7547 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-slm72" Mar 08 03:49:43.263705 master-0 kubenswrapper[7547]: I0308 03:49:43.263194 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-nm8fj" event={"ID":"b3eea925-73b3-4693-8f0e-6dd26107f60a","Type":"ContainerDied","Data":"9c4b058bc98e254a8a4b1a2af3561d6b7519c1e36ed6446917dcc85e6786652f"} Mar 08 03:49:43.264046 master-0 kubenswrapper[7547]: I0308 03:49:43.263920 7547 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"c67a98430dbc4d1ec5edc5c2aa37ee19dd047e853de22d326be45ea84e3430ff"} pod="kube-system/bootstrap-kube-controller-manager-master-0" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Mar 08 03:49:43.264046 
master-0 kubenswrapper[7547]: I0308 03:49:43.264037 7547 kuberuntime_container.go:808] "Killing container with a grace period" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" containerID="cri-o://c67a98430dbc4d1ec5edc5c2aa37ee19dd047e853de22d326be45ea84e3430ff" gracePeriod=30 Mar 08 03:49:43.265285 master-0 kubenswrapper[7547]: I0308 03:49:43.265211 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-2vjh2" Mar 08 03:49:43.268278 master-0 kubenswrapper[7547]: I0308 03:49:43.268221 7547 scope.go:117] "RemoveContainer" containerID="d4bcecac644708f2f25ce7ab391ef8889989648db06b3e9db25dd3f64bfa6da8" Mar 08 03:49:43.269182 master-0 kubenswrapper[7547]: I0308 03:49:43.269124 7547 scope.go:117] "RemoveContainer" containerID="9c4b058bc98e254a8a4b1a2af3561d6b7519c1e36ed6446917dcc85e6786652f" Mar 08 03:49:43.275714 master-0 kubenswrapper[7547]: I0308 03:49:43.275658 7547 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID="" Mar 08 03:49:43.968800 master-0 kubenswrapper[7547]: I0308 03:49:43.968699 7547 generic.go:334] "Generic (PLEG): container finished" podID="f78c05e1499b533b83f091333d61f045" containerID="c67a98430dbc4d1ec5edc5c2aa37ee19dd047e853de22d326be45ea84e3430ff" exitCode=2 Mar 08 03:49:45.454089 master-0 kubenswrapper[7547]: E0308 03:49:45.453872 7547 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{cluster-version-operator-8c9c967c7-zq9rp.189ac1193d16f297 openshift-cluster-version 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-cluster-version,Name:cluster-version-operator-8c9c967c7-zq9rp,UID:2262647b-c315-477a-93bd-f168c1810475,APIVersion:v1,ResourceVersion:9555,FieldPath:spec.containers{cluster-version-operator},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-release@sha256:14bd3c04daa885009785d48f4973e2890751a7ec116cc14d17627245cda54d7b\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:48:37.466968727 +0000 UTC m=+80.412653250,LastTimestamp:2026-03-08 03:48:37.466968727 +0000 UTC m=+80.412653250,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:49:49.306955 master-0 kubenswrapper[7547]: E0308 03:49:49.306672 7547 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="400ms" Mar 08 03:49:49.377597 master-0 kubenswrapper[7547]: E0308 03:49:49.377327 7547 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T03:49:39Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T03:49:39Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T03:49:39Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T03:49:39Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:ae042a5d32eb2f18d537f2068849e665b55df7d8360daedaaeea98bd2a79e769\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:d077bbabe6cb885ed229119008480493e8364e4bfddaa00b099f68c52b016e6b\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1733328350},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:82f121f9d021a9843b9458f9f222c40f292f2c21dcfcf00f05daacaca8a949c0\\\"],\\\"sizeBytes\\\":1637445817},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:063b8972231e65eb43f6545ba37804f68138dc54d97b91a652a1c5bc7dc76aa5\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:cf682d23b2857e455609879a0867d171a221c18e2cec995dd79570b77c5a4705\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1272201949},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:381e96959e3c3b08a3e2715e6024697ae14af31bd0378b49f583e984b3b9a192\\\"],\\\"sizeBytes\\\":1238047254},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:e0c034ae18daa01af8d073f8cc24ae4af87883c664304910eab1167fdfd60c0b\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:ef0c6b9e405f7a452211e063ce07ded04ccbe38b53860bfd71b5a7cd5072830a\\\",\\\"registry.redhat.io/redhat/redhat-marketpl
ace-index:v4.18\\\"],\\\"sizeBytes\\\":1229556414},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:79984dfbdf9aeae3985c7fd7515e12328775c0e7fc4782929d0998f4dd2a87c6\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:7be89499615ec913d0fe40ca89682080a3f1181a066dbc501c877cc7ccbcc9ae\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1220167376},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c9330c756dd6ab107e9a4b671bc52742c90d5be11a8380d8b710e2bd4e0ed43c\\\"],\\\"sizeBytes\\\":992610645},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\\\"],\\\"sizeBytes\\\":943837171},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ff40e33e63d6c1f4e4393d5506e38def25ba20582d980fec8b81f81c867ceeec\\\"],\\\"sizeBytes\\\":918278686},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e207c762b7802ee0e54507d21ed1f25b19eddc511a4b824934c16c163193be6a\\\"],\\\"sizeBytes\\\":876146500},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:41dbd66e9a886c1fd7a99752f358c6125a209e83c0dd37b35730baae58d82ee8\\\"],\\\"sizeBytes\\\":862633255},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9bfcd8017eede3fb66fa3f5b47c27508b787d38455689154461f0e6a5dc303ff\\\"],\\\"sizeBytes\\\":772939850},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9c946fdc5a4cd16ff998c17844780e7efc38f7f38b97a8a40d75cd77b318ddef\\\"],\\\"sizeBytes\\\":687947017},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0c03cb25dc6f6a865529ebc979e8d7d08492b28fd3fb93beddf30e1cb06f1245\\\"],\\\"sizeBytes\\\":683169303},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3f34dc492c80a3dee4643cc2291044750ac51e6e919b973de8723fa8b70bde70\\\"],\\\"sizeBytes\\\":677929075},{\\\"names\\\":[\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a149ed17b20a7577fceacfc5198f8b7b3edf314ee22f77bd6ab87f06a3aa17f3\\\"],\\\"sizeBytes\\\":621647686},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1ec9d3dbcc6f9817c0f6d09f64c0d98c91b03afbb1fcb3c1e1718aca900754b\\\"],\\\"sizeBytes\\\":589379637},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1575be013a898f153cbf012aeaf28ce720022f934dc05bdffbe479e30999d460\\\"],\\\"sizeBytes\\\":582153879},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:eb82e437a701ce83b70e56be8477d987da67578714dda3d9fa6628804b1b56f5\\\"],\\\"sizeBytes\\\":558210153},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:28f33d62fd0b94c5ea0ebcd7a4216848c8dd671a38d901ce98f4c399b700e1c7\\\"],\\\"sizeBytes\\\":548751793},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cc20748723f55f960cfb6328d1591880bbd1b3452155633996d4f41fc7c5f46b\\\"],\\\"sizeBytes\\\":529324693},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ac6f0695d3386e6d601f4ae507940981352fa3ad884b0fed6fb25698c5e6f916\\\"],\\\"sizeBytes\\\":528946249},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6088910bdc1583b275fab261e3234c0b63b4cc16d01bcea697b6a7f6db13bdf3\\\"],\\\"sizeBytes\\\":518384455},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-release@sha256:14bd3c04daa885009785d48f4973e2890751a7ec116cc14d17627245cda54d7b\\\"],\\\"sizeBytes\\\":517997625},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5500329ab50804678fb8a90b96bf2a469bca16b620fb6dd2f5f5a17106e94898\\\"],\\\"sizeBytes\\\":514980169},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9bd0b71d620cf0acbfcd1b58797dc30050bd167cb6b7a7f62c8333dd370c76d5\\\"],\\\"sizeBytes\\\":513581866},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9bd818e37e1f9dbe5393c557b89e81010d68171408e0e4
157a3d92ae0ca1c953\\\"],\\\"sizeBytes\\\":513220825},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d601c8437b4d8bbe2da0f3b08f1bd8693f5a4ef6d835377ec029c79d9dca5dab\\\"],\\\"sizeBytes\\\":512273539},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b47d2b146e833bc1612a652136f43afcf1ba30f32cbd0a2f06ca9fc80d969f0\\\"],\\\"sizeBytes\\\":511226810},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:834063dd26fb3d2489e193489198a0d5fbe9c775a0e30173e5fcef6994fbf0f6\\\"],\\\"sizeBytes\\\":511164376},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee46e13e26156c904e5784e2d64511021ed0974a169ccd6476b05bff1c44ec56\\\"],\\\"sizeBytes\\\":508888174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7220d16ea511c0f0410cf45db45aaafcc64847c9cb5732ad1eff39ceb482cdba\\\"],\\\"sizeBytes\\\":508544235},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:526c5c02a8fa86a2fa83a7087d4a5c4b1c4072c0f3906163494cc3b3c1295e9b\\\"],\\\"sizeBytes\\\":507967997},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4010a8f9d932615336227e2fd43325d4fa9025dca4bebe032106efea733fcfc3\\\"],\\\"sizeBytes\\\":506479655},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:76b719f5bd541eb1a8bae124d650896b533e7bc3107be536e598b3ab4e135282\\\"],\\\"sizeBytes\\\":506394574},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5de69354d08184ecd6144facc1461777674674e8304971216d4cf1a5025472b9\\\"],\\\"sizeBytes\\\":505344964},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a324f47cf789c0480fa4bcb0812152abc3cd844318bab193108fe4349eed609\\\"],\\\"sizeBytes\\\":505242594},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b8cb5e0caeca0fb02f3e8c72b7ddf1c49e3c602e42e119ba30c60525f1db1821\\\"],\\\"sizeBytes\\\":504658657},{\\\"names\\\":[\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:d11f13e867f4df046ca6789bb7273da5d0c08895b3dea00949c8a5458f9e22f9\\\"],\\\"sizeBytes\\\":504623546},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:76bdc35338c4d0f5e5b9448fb73e3578656f908a962286692e12a0372ec721d5\\\"],\\\"sizeBytes\\\":495994161},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ff2db11ce277288befab25ddb86177e832842d2edb5607a2da8f252a030e1cfc\\\"],\\\"sizeBytes\\\":495064829},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2fe5144b1f72bdcf5d5a52130f02ed86fbec3875cc4ac108ead00eaac1659e06\\\"],\\\"sizeBytes\\\":487090672},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4a4c3e6ca0cd26f7eb5270cfafbcf423cf2986d152bf5b9fc6469d40599e104e\\\"],\\\"sizeBytes\\\":484450382},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c54c3f7cffe057ae0bdf26163d5e46744685083ae16fc97112e32beacd2d8955\\\"],\\\"sizeBytes\\\":484175664},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d74fe7cb12c554c120262683d9c4066f33ae4f60a5fad83cba419d851b98c12d\\\"],\\\"sizeBytes\\\":470822665},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9b8bc43bac294be3c7669cde049e388ad9d8751242051ba40f83e1c401eceda\\\"],\\\"sizeBytes\\\":468263999},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8677f7a973553c25d282bc249fc8bc0f5aa42fb144ea0956d1f04c5a6cd80501\\\"],\\\"sizeBytes\\\":465086330},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a85dab5856916220df6f05ce9d6aa10cd4fa0234093b55355246690bba05ad1\\\"],\\\"sizeBytes\\\":463700811},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b714a7ada1e295b599b432f32e1fd5b74c8cdbe6fe51e95306322b25cb873914\\\"],\\\"sizeBytes\\\":458126424},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5230462066ab36e3025524e948dd33fa6f51ee29a4f91fa469bfc268568b5fd9\\\
"],\\\"sizeBytes\\\":456575686}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 03:49:59.378596 master-0 kubenswrapper[7547]: E0308 03:49:59.378393 7547 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 03:49:59.709098 master-0 kubenswrapper[7547]: E0308 03:49:59.708980 7547 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="800ms" Mar 08 03:50:08.144446 master-0 kubenswrapper[7547]: I0308 03:50:08.144367 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-86d7cdfdfb-chpl6_26180f77-0b1a-4d0f-9ed0-a12fdee69817/kube-controller-manager-operator/1.log" Mar 08 03:50:08.145570 master-0 kubenswrapper[7547]: I0308 03:50:08.145385 7547 generic.go:334] "Generic (PLEG): container finished" podID="26180f77-0b1a-4d0f-9ed0-a12fdee69817" containerID="b9a51bfe829084894104463976cade708d7a51f90ef15a899d7341f663daf1dc" exitCode=255 Mar 08 03:50:09.379527 master-0 kubenswrapper[7547]: E0308 03:50:09.379409 7547 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 03:50:10.509999 master-0 kubenswrapper[7547]: E0308 03:50:10.509869 7547 controller.go:145] "Failed to ensure lease 
exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="1.6s" Mar 08 03:50:16.529725 master-0 kubenswrapper[7547]: I0308 03:50:16.529590 7547 patch_prober.go:28] interesting pod/etcd-operator-5884b9cd56-vzms7 container/etcd-operator namespace/openshift-etcd-operator: Liveness probe status=failure output="Get \"https://10.128.0.5:8443/healthz\": dial tcp 10.128.0.5:8443: connect: connection refused" start-of-body= Mar 08 03:50:16.530739 master-0 kubenswrapper[7547]: I0308 03:50:16.529764 7547 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-vzms7" podUID="5a7752f9-7b9a-451f-997a-e9f696d38b34" containerName="etcd-operator" probeResult="failure" output="Get \"https://10.128.0.5:8443/healthz\": dial tcp 10.128.0.5:8443: connect: connection refused" Mar 08 03:50:17.208045 master-0 kubenswrapper[7547]: I0308 03:50:17.207965 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-8565d84698-kt66j_0ebf1330-e044-4ff5-8b48-2d667e0c5625/openshift-controller-manager-operator/1.log" Mar 08 03:50:17.209362 master-0 kubenswrapper[7547]: I0308 03:50:17.209333 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-8565d84698-kt66j_0ebf1330-e044-4ff5-8b48-2d667e0c5625/openshift-controller-manager-operator/0.log" Mar 08 03:50:17.209561 master-0 kubenswrapper[7547]: I0308 03:50:17.209529 7547 generic.go:334] "Generic (PLEG): container finished" podID="0ebf1330-e044-4ff5-8b48-2d667e0c5625" containerID="cae216678d94c10a368ff595527d708d87bd43ed6865eacedbf892861c47fe3a" exitCode=255 Mar 08 03:50:17.278777 master-0 kubenswrapper[7547]: E0308 03:50:17.278673 7547 mirror_client.go:138] 
"Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0-master-0" Mar 08 03:50:17.279018 master-0 kubenswrapper[7547]: E0308 03:50:17.278996 7547 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="34.016s" Mar 08 03:50:17.280929 master-0 kubenswrapper[7547]: I0308 03:50:17.280876 7547 scope.go:117] "RemoveContainer" containerID="37cbe69de0ead690fe5f97f7713b1a785d6ee472fbe38e74b1ca8bbb8ffc0b32" Mar 08 03:50:17.281122 master-0 kubenswrapper[7547]: I0308 03:50:17.281074 7547 scope.go:117] "RemoveContainer" containerID="357e1b5825405be66b8754168b57e640c102e496555d0a6b7dd9834bacebf15e" Mar 08 03:50:17.284300 master-0 kubenswrapper[7547]: I0308 03:50:17.284128 7547 scope.go:117] "RemoveContainer" containerID="8f8cad46e77715e164ec2e62df8c2ee60a2b96fa8c918baa1589a3082317e15b" Mar 08 03:50:17.286172 master-0 kubenswrapper[7547]: I0308 03:50:17.284707 7547 scope.go:117] "RemoveContainer" containerID="b9a51bfe829084894104463976cade708d7a51f90ef15a899d7341f663daf1dc" Mar 08 03:50:17.286172 master-0 kubenswrapper[7547]: I0308 03:50:17.285143 7547 scope.go:117] "RemoveContainer" containerID="eca6f5647fbdf9b3ef8c7044a7fb91cd16de860543c74991829e340da4a238fe" Mar 08 03:50:17.286172 master-0 kubenswrapper[7547]: E0308 03:50:17.285148 7547 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-controller-manager-operator pod=kube-controller-manager-operator-86d7cdfdfb-chpl6_openshift-kube-controller-manager-operator(26180f77-0b1a-4d0f-9ed0-a12fdee69817)\"" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-chpl6" podUID="26180f77-0b1a-4d0f-9ed0-a12fdee69817" Mar 08 03:50:17.286172 master-0 kubenswrapper[7547]: I0308 03:50:17.285202 7547 
scope.go:117] "RemoveContainer" containerID="cae216678d94c10a368ff595527d708d87bd43ed6865eacedbf892861c47fe3a" Mar 08 03:50:17.286172 master-0 kubenswrapper[7547]: E0308 03:50:17.285520 7547 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-controller-manager-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=openshift-controller-manager-operator pod=openshift-controller-manager-operator-8565d84698-kt66j_openshift-controller-manager-operator(0ebf1330-e044-4ff5-8b48-2d667e0c5625)\"" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-kt66j" podUID="0ebf1330-e044-4ff5-8b48-2d667e0c5625" Mar 08 03:50:17.286172 master-0 kubenswrapper[7547]: I0308 03:50:17.285813 7547 scope.go:117] "RemoveContainer" containerID="2b3399b78be3045c232df9d3c4545d85577efb44cef1d2c0a18e98d67e4c7cb7" Mar 08 03:50:17.288610 master-0 kubenswrapper[7547]: I0308 03:50:17.286984 7547 scope.go:117] "RemoveContainer" containerID="db7380d3fe7301944a7a66ec837b1d91caad2bb5d7122a498e8b10a38c9f552b" Mar 08 03:50:17.288610 master-0 kubenswrapper[7547]: I0308 03:50:17.288050 7547 scope.go:117] "RemoveContainer" containerID="0d9517a4dbfbd842f9c484f0081150c86a4b5af486a5ddb8461a1d470f81112c" Mar 08 03:50:17.291176 master-0 kubenswrapper[7547]: I0308 03:50:17.291112 7547 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID="" Mar 08 03:50:18.237103 master-0 kubenswrapper[7547]: I0308 03:50:18.237059 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-ggzm8_164586b1-f133-4427-8ab6-eb0839b79738/approver/0.log" Mar 08 03:50:19.379855 master-0 kubenswrapper[7547]: E0308 03:50:19.379705 7547 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": context deadline 
exceeded" Mar 08 03:50:19.457975 master-0 kubenswrapper[7547]: E0308 03:50:19.457714 7547 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{redhat-operators-rnz4w.189ac1193ee586f8 openshift-marketplace 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:redhat-operators-rnz4w,UID:6bee226a-2a66-4032-8aba-2c8b82abcb6a,APIVersion:v1,ResourceVersion:9072,FieldPath:spec.initContainers{extract-content},},Reason:Pulled,Message:Successfully pulled image \"registry.redhat.io/redhat/redhat-operator-index:v4.18\" in 668ms (668ms including waiting). Image size: 1733328350 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:48:37.497284344 +0000 UTC m=+80.442968867,LastTimestamp:2026-03-08 03:48:37.497284344 +0000 UTC m=+80.442968867,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:50:22.111315 master-0 kubenswrapper[7547]: E0308 03:50:22.111013 7547 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="3.2s" Mar 08 03:50:26.307417 master-0 kubenswrapper[7547]: I0308 03:50:26.307323 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-6598bfb6c4-75qmb_2f59fe81-deee-4ced-ae9d-f17752c82c4b/manager/0.log" Mar 08 03:50:26.307417 master-0 kubenswrapper[7547]: I0308 03:50:26.307408 7547 generic.go:334] "Generic (PLEG): container finished" podID="2f59fe81-deee-4ced-ae9d-f17752c82c4b" 
containerID="3059f49f388319ee646920103084d28d8b0077750e77df3225c9bad4053dd550" exitCode=1 Mar 08 03:50:29.380773 master-0 kubenswrapper[7547]: E0308 03:50:29.380410 7547 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 03:50:29.380773 master-0 kubenswrapper[7547]: E0308 03:50:29.380487 7547 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 08 03:50:32.353497 master-0 kubenswrapper[7547]: I0308 03:50:32.353410 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-h4qlp_9ec89e27-4360-48f2-a7ca-5d823bda4510/snapshot-controller/0.log" Mar 08 03:50:32.353497 master-0 kubenswrapper[7547]: I0308 03:50:32.353481 7547 generic.go:334] "Generic (PLEG): container finished" podID="9ec89e27-4360-48f2-a7ca-5d823bda4510" containerID="e1cf094994e913e66c5a9e6e155292c3e34468235cb173dcf1919a0eed0dd4ca" exitCode=1 Mar 08 03:50:34.575898 master-0 kubenswrapper[7547]: I0308 03:50:34.575761 7547 patch_prober.go:28] interesting pod/operator-controller-controller-manager-6598bfb6c4-75qmb container/manager namespace/openshift-operator-controller: Readiness probe status=failure output="Get \"http://10.128.0.40:8081/readyz\": dial tcp 10.128.0.40:8081: connect: connection refused" start-of-body= Mar 08 03:50:34.577441 master-0 kubenswrapper[7547]: I0308 03:50:34.577368 7547 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-75qmb" podUID="2f59fe81-deee-4ced-ae9d-f17752c82c4b" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.40:8081/readyz\": dial tcp 10.128.0.40:8081: connect: connection refused" Mar 08 03:50:35.312187 master-0 
kubenswrapper[7547]: E0308 03:50:35.312046 7547 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="6.4s" Mar 08 03:50:37.473363 master-0 kubenswrapper[7547]: I0308 03:50:37.473259 7547 status_manager.go:851] "Failed to get status for pod" podUID="baab6171-046d-4fc9-b7d7-ff2fd12f185f" pod="openshift-kube-apiserver/installer-1-master-0" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods installer-1-master-0)" Mar 08 03:50:38.394599 master-0 kubenswrapper[7547]: I0308 03:50:38.394521 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-2-master-0_47643289-ac4b-425d-8ea1-913b6ca39ee0/installer/0.log" Mar 08 03:50:38.394928 master-0 kubenswrapper[7547]: I0308 03:50:38.394601 7547 generic.go:334] "Generic (PLEG): container finished" podID="47643289-ac4b-425d-8ea1-913b6ca39ee0" containerID="18670cb65f485400f1fdb45bed0a06f4e06d21d135459ea29b3c0fcd10f2d210" exitCode=1 Mar 08 03:50:38.397906 master-0 kubenswrapper[7547]: I0308 03:50:38.397748 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-677db989d6-t77qr_c9de4939-680a-4e3e-89fd-e20ecb8b10f2/ingress-operator/0.log" Mar 08 03:50:38.397906 master-0 kubenswrapper[7547]: I0308 03:50:38.397801 7547 generic.go:334] "Generic (PLEG): container finished" podID="c9de4939-680a-4e3e-89fd-e20ecb8b10f2" containerID="db28e69e1ea518493719876e18b2faf675fe251b59f240840b24dd0b6d115924" exitCode=1 Mar 08 03:50:38.400542 master-0 kubenswrapper[7547]: I0308 03:50:38.400479 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-5-master-0_0f865279-e751-456d-8c96-6381f8b45ce1/installer/0.log" 
Mar 08 03:50:38.400660 master-0 kubenswrapper[7547]: I0308 03:50:38.400567 7547 generic.go:334] "Generic (PLEG): container finished" podID="0f865279-e751-456d-8c96-6381f8b45ce1" containerID="9289f2928e2e95c2ade5890aeb0e93be12cf91a6e92bf8866de144086be0fb16" exitCode=1 Mar 08 03:50:41.423200 master-0 kubenswrapper[7547]: I0308 03:50:41.422911 7547 generic.go:334] "Generic (PLEG): container finished" podID="54ad284e-d40e-4e69-b898-f5093952a0e6" containerID="791aee9d23f28d5b9bc6bbbcd3f26705c245a61021bebb20a57835608ad72cab" exitCode=0 Mar 08 03:50:43.438651 master-0 kubenswrapper[7547]: I0308 03:50:43.438560 7547 generic.go:334] "Generic (PLEG): container finished" podID="ee586416-6f56-4ea4-ad62-95de1e6df23b" containerID="644f0c7d4552f15957ecfc56f2d37a06ec2757ddcc7c2c371f0c34b92aa63533" exitCode=0 Mar 08 03:50:44.572673 master-0 kubenswrapper[7547]: I0308 03:50:44.572551 7547 patch_prober.go:28] interesting pod/operator-controller-controller-manager-6598bfb6c4-75qmb container/manager namespace/openshift-operator-controller: Liveness probe status=failure output="Get \"http://10.128.0.40:8081/healthz\": dial tcp 10.128.0.40:8081: connect: connection refused" start-of-body= Mar 08 03:50:44.572673 master-0 kubenswrapper[7547]: I0308 03:50:44.572614 7547 patch_prober.go:28] interesting pod/operator-controller-controller-manager-6598bfb6c4-75qmb container/manager namespace/openshift-operator-controller: Readiness probe status=failure output="Get \"http://10.128.0.40:8081/readyz\": dial tcp 10.128.0.40:8081: connect: connection refused" start-of-body= Mar 08 03:50:44.572673 master-0 kubenswrapper[7547]: I0308 03:50:44.572648 7547 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-75qmb" podUID="2f59fe81-deee-4ced-ae9d-f17752c82c4b" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.40:8081/healthz\": dial tcp 10.128.0.40:8081: connect: connection refused" Mar 08 
03:50:44.573628 master-0 kubenswrapper[7547]: I0308 03:50:44.572746 7547 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-75qmb" podUID="2f59fe81-deee-4ced-ae9d-f17752c82c4b" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.40:8081/readyz\": dial tcp 10.128.0.40:8081: connect: connection refused" Mar 08 03:50:45.454406 master-0 kubenswrapper[7547]: I0308 03:50:45.454287 7547 generic.go:334] "Generic (PLEG): container finished" podID="f78c05e1499b533b83f091333d61f045" containerID="a234e673236026ace29cfb9074b221693da101dadaf715ffecb4dfd643bf0e5f" exitCode=1 Mar 08 03:50:45.457280 master-0 kubenswrapper[7547]: I0308 03:50:45.457220 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-7f8b8b6f4c-8h6fj_1b69fbf6-1ca5-413e-bffd-965730bcec1b/manager/0.log" Mar 08 03:50:45.457912 master-0 kubenswrapper[7547]: I0308 03:50:45.457801 7547 generic.go:334] "Generic (PLEG): container finished" podID="1b69fbf6-1ca5-413e-bffd-965730bcec1b" containerID="55aa7553b7b737c589cdd0270a8ec23cc64ce136f8130219ce1dabd7e976b992" exitCode=1 Mar 08 03:50:49.650896 master-0 kubenswrapper[7547]: E0308 03:50:49.650623 7547 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T03:50:39Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T03:50:39Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T03:50:39Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T03:50:39Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:ae042a5d32eb2f18d537f2068849e665b55df7d8360daedaaeea98bd2a79e769\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:d077bbabe6cb885ed229119008480493e8364e4bfddaa00b099f68c52b016e6b\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1733328350},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:82f121f9d021a9843b9458f9f222c40f292f2c21dcfcf00f05daacaca8a949c0\\\"],\\\"sizeBytes\\\":1637445817},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:063b8972231e65eb43f6545ba37804f68138dc54d97b91a652a1c5bc7dc76aa5\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:cf682d23b2857e455609879a0867d171a221c18e2cec995dd79570b77c5a4705\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1272201949},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:381e96959e3c3b08a3e2715e6024697ae14af31bd0378b49f583e984b3b9a192\\\"],\\\"sizeBytes\\\":1238047254},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:e0c034ae18daa01af8d073f8cc24ae4af87883c664304910eab1167fdfd60c0b\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:ef0c6b9e405f7a452211e063ce07ded04ccbe38b53860bfd71b5a7cd5072830a\\\",\\\"registry.redhat.io/redhat/redhat-marketpl
ace-index:v4.18\\\"],\\\"sizeBytes\\\":1229556414},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:79984dfbdf9aeae3985c7fd7515e12328775c0e7fc4782929d0998f4dd2a87c6\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:7be89499615ec913d0fe40ca89682080a3f1181a066dbc501c877cc7ccbcc9ae\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1220167376},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c9330c756dd6ab107e9a4b671bc52742c90d5be11a8380d8b710e2bd4e0ed43c\\\"],\\\"sizeBytes\\\":992610645},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\\\"],\\\"sizeBytes\\\":943837171},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ff40e33e63d6c1f4e4393d5506e38def25ba20582d980fec8b81f81c867ceeec\\\"],\\\"sizeBytes\\\":918278686},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e207c762b7802ee0e54507d21ed1f25b19eddc511a4b824934c16c163193be6a\\\"],\\\"sizeBytes\\\":876146500},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:41dbd66e9a886c1fd7a99752f358c6125a209e83c0dd37b35730baae58d82ee8\\\"],\\\"sizeBytes\\\":862633255},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9bfcd8017eede3fb66fa3f5b47c27508b787d38455689154461f0e6a5dc303ff\\\"],\\\"sizeBytes\\\":772939850},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9c946fdc5a4cd16ff998c17844780e7efc38f7f38b97a8a40d75cd77b318ddef\\\"],\\\"sizeBytes\\\":687947017},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0c03cb25dc6f6a865529ebc979e8d7d08492b28fd3fb93beddf30e1cb06f1245\\\"],\\\"sizeBytes\\\":683169303},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3f34dc492c80a3dee4643cc2291044750ac51e6e919b973de8723fa8b70bde70\\\"],\\\"sizeBytes\\\":677929075},{\\\"names\\\":[\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a149ed17b20a7577fceacfc5198f8b7b3edf314ee22f77bd6ab87f06a3aa17f3\\\"],\\\"sizeBytes\\\":621647686},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1ec9d3dbcc6f9817c0f6d09f64c0d98c91b03afbb1fcb3c1e1718aca900754b\\\"],\\\"sizeBytes\\\":589379637},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1575be013a898f153cbf012aeaf28ce720022f934dc05bdffbe479e30999d460\\\"],\\\"sizeBytes\\\":582153879},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:eb82e437a701ce83b70e56be8477d987da67578714dda3d9fa6628804b1b56f5\\\"],\\\"sizeBytes\\\":558210153},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:28f33d62fd0b94c5ea0ebcd7a4216848c8dd671a38d901ce98f4c399b700e1c7\\\"],\\\"sizeBytes\\\":548751793},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cc20748723f55f960cfb6328d1591880bbd1b3452155633996d4f41fc7c5f46b\\\"],\\\"sizeBytes\\\":529324693},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ac6f0695d3386e6d601f4ae507940981352fa3ad884b0fed6fb25698c5e6f916\\\"],\\\"sizeBytes\\\":528946249},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6088910bdc1583b275fab261e3234c0b63b4cc16d01bcea697b6a7f6db13bdf3\\\"],\\\"sizeBytes\\\":518384455},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-release@sha256:14bd3c04daa885009785d48f4973e2890751a7ec116cc14d17627245cda54d7b\\\"],\\\"sizeBytes\\\":517997625},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5500329ab50804678fb8a90b96bf2a469bca16b620fb6dd2f5f5a17106e94898\\\"],\\\"sizeBytes\\\":514980169},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9bd0b71d620cf0acbfcd1b58797dc30050bd167cb6b7a7f62c8333dd370c76d5\\\"],\\\"sizeBytes\\\":513581866},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9bd818e37e1f9dbe5393c557b89e81010d68171408e0e4
157a3d92ae0ca1c953\\\"],\\\"sizeBytes\\\":513220825},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d601c8437b4d8bbe2da0f3b08f1bd8693f5a4ef6d835377ec029c79d9dca5dab\\\"],\\\"sizeBytes\\\":512273539},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b47d2b146e833bc1612a652136f43afcf1ba30f32cbd0a2f06ca9fc80d969f0\\\"],\\\"sizeBytes\\\":511226810},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:834063dd26fb3d2489e193489198a0d5fbe9c775a0e30173e5fcef6994fbf0f6\\\"],\\\"sizeBytes\\\":511164376},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee46e13e26156c904e5784e2d64511021ed0974a169ccd6476b05bff1c44ec56\\\"],\\\"sizeBytes\\\":508888174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7220d16ea511c0f0410cf45db45aaafcc64847c9cb5732ad1eff39ceb482cdba\\\"],\\\"sizeBytes\\\":508544235},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:526c5c02a8fa86a2fa83a7087d4a5c4b1c4072c0f3906163494cc3b3c1295e9b\\\"],\\\"sizeBytes\\\":507967997},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4010a8f9d932615336227e2fd43325d4fa9025dca4bebe032106efea733fcfc3\\\"],\\\"sizeBytes\\\":506479655},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:76b719f5bd541eb1a8bae124d650896b533e7bc3107be536e598b3ab4e135282\\\"],\\\"sizeBytes\\\":506394574},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5de69354d08184ecd6144facc1461777674674e8304971216d4cf1a5025472b9\\\"],\\\"sizeBytes\\\":505344964},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a324f47cf789c0480fa4bcb0812152abc3cd844318bab193108fe4349eed609\\\"],\\\"sizeBytes\\\":505242594},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b8cb5e0caeca0fb02f3e8c72b7ddf1c49e3c602e42e119ba30c60525f1db1821\\\"],\\\"sizeBytes\\\":504658657},{\\\"names\\\":[\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:d11f13e867f4df046ca6789bb7273da5d0c08895b3dea00949c8a5458f9e22f9\\\"],\\\"sizeBytes\\\":504623546},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:76bdc35338c4d0f5e5b9448fb73e3578656f908a962286692e12a0372ec721d5\\\"],\\\"sizeBytes\\\":495994161},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ff2db11ce277288befab25ddb86177e832842d2edb5607a2da8f252a030e1cfc\\\"],\\\"sizeBytes\\\":495064829},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2fe5144b1f72bdcf5d5a52130f02ed86fbec3875cc4ac108ead00eaac1659e06\\\"],\\\"sizeBytes\\\":487090672},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4a4c3e6ca0cd26f7eb5270cfafbcf423cf2986d152bf5b9fc6469d40599e104e\\\"],\\\"sizeBytes\\\":484450382},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c54c3f7cffe057ae0bdf26163d5e46744685083ae16fc97112e32beacd2d8955\\\"],\\\"sizeBytes\\\":484175664},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d74fe7cb12c554c120262683d9c4066f33ae4f60a5fad83cba419d851b98c12d\\\"],\\\"sizeBytes\\\":470822665},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9b8bc43bac294be3c7669cde049e388ad9d8751242051ba40f83e1c401eceda\\\"],\\\"sizeBytes\\\":468263999},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8677f7a973553c25d282bc249fc8bc0f5aa42fb144ea0956d1f04c5a6cd80501\\\"],\\\"sizeBytes\\\":465086330},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a85dab5856916220df6f05ce9d6aa10cd4fa0234093b55355246690bba05ad1\\\"],\\\"sizeBytes\\\":463700811},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b714a7ada1e295b599b432f32e1fd5b74c8cdbe6fe51e95306322b25cb873914\\\"],\\\"sizeBytes\\\":458126424},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5230462066ab36e3025524e948dd33fa6f51ee29a4f91fa469bfc268568b5fd9\\\
"],\\\"sizeBytes\\\":456575686}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 03:50:50.315883 master-0 kubenswrapper[7547]: I0308 03:50:50.315721 7547 patch_prober.go:28] interesting pod/marketplace-operator-64bf9778cb-9sw2d container/marketplace-operator namespace/openshift-marketplace: Liveness probe status=failure output="Get \"http://10.128.0.24:8080/healthz\": dial tcp 10.128.0.24:8080: connect: connection refused" start-of-body= Mar 08 03:50:50.316235 master-0 kubenswrapper[7547]: I0308 03:50:50.315731 7547 patch_prober.go:28] interesting pod/marketplace-operator-64bf9778cb-9sw2d container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.128.0.24:8080/healthz\": dial tcp 10.128.0.24:8080: connect: connection refused" start-of-body= Mar 08 03:50:50.316235 master-0 kubenswrapper[7547]: I0308 03:50:50.316083 7547 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-64bf9778cb-9sw2d" podUID="54ad284e-d40e-4e69-b898-f5093952a0e6" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.24:8080/healthz\": dial tcp 10.128.0.24:8080: connect: connection refused" Mar 08 03:50:50.317141 master-0 kubenswrapper[7547]: I0308 03:50:50.317058 7547 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/marketplace-operator-64bf9778cb-9sw2d" podUID="54ad284e-d40e-4e69-b898-f5093952a0e6" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.24:8080/healthz\": dial tcp 10.128.0.24:8080: connect: connection refused" Mar 08 03:50:51.294520 master-0 kubenswrapper[7547]: E0308 03:50:51.294425 7547 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline 
exceeded" pod="openshift-etcd/etcd-master-0-master-0"
Mar 08 03:50:51.295234 master-0 kubenswrapper[7547]: E0308 03:50:51.294730 7547 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="34.016s"
Mar 08 03:50:51.295750 master-0 kubenswrapper[7547]: I0308 03:50:51.295696 7547 scope.go:117] "RemoveContainer" containerID="47c3f7232d0f0bc4de9dbe2ca382d3e0709c3d618e0b06a088f2ef41c6b071e7"
Mar 08 03:50:51.296043 master-0 kubenswrapper[7547]: I0308 03:50:51.295985 7547 scope.go:117] "RemoveContainer" containerID="b9a51bfe829084894104463976cade708d7a51f90ef15a899d7341f663daf1dc"
Mar 08 03:50:51.296419 master-0 kubenswrapper[7547]: I0308 03:50:51.296373 7547 scope.go:117] "RemoveContainer" containerID="3059f49f388319ee646920103084d28d8b0077750e77df3225c9bad4053dd550"
Mar 08 03:50:51.296637 master-0 kubenswrapper[7547]: I0308 03:50:51.296597 7547 scope.go:117] "RemoveContainer" containerID="cae216678d94c10a368ff595527d708d87bd43ed6865eacedbf892861c47fe3a"
Mar 08 03:50:51.298813 master-0 kubenswrapper[7547]: I0308 03:50:51.297966 7547 scope.go:117] "RemoveContainer" containerID="db28e69e1ea518493719876e18b2faf675fe251b59f240840b24dd0b6d115924"
Mar 08 03:50:51.313939 master-0 kubenswrapper[7547]: I0308 03:50:51.313876 7547 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID=""
Mar 08 03:50:51.713130 master-0 kubenswrapper[7547]: E0308 03:50:51.713035 7547 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s"
Mar 08 03:50:52.093810 master-0 kubenswrapper[7547]: I0308 03:50:52.093725 7547 patch_prober.go:28] interesting pod/catalogd-controller-manager-7f8b8b6f4c-8h6fj container/manager namespace/openshift-catalogd: Liveness probe status=failure output="Get \"http://10.128.0.41:8081/healthz\": dial tcp 10.128.0.41:8081: connect: connection refused" start-of-body=
Mar 08 03:50:52.093810 master-0 kubenswrapper[7547]: I0308 03:50:52.093808 7547 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-8h6fj" podUID="1b69fbf6-1ca5-413e-bffd-965730bcec1b" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.41:8081/healthz\": dial tcp 10.128.0.41:8081: connect: connection refused"
Mar 08 03:50:52.094207 master-0 kubenswrapper[7547]: I0308 03:50:52.094012 7547 patch_prober.go:28] interesting pod/catalogd-controller-manager-7f8b8b6f4c-8h6fj container/manager namespace/openshift-catalogd: Readiness probe status=failure output="Get \"http://10.128.0.41:8081/readyz\": dial tcp 10.128.0.41:8081: connect: connection refused" start-of-body=
Mar 08 03:50:52.094207 master-0 kubenswrapper[7547]: I0308 03:50:52.094106 7547 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-8h6fj" podUID="1b69fbf6-1ca5-413e-bffd-965730bcec1b" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.41:8081/readyz\": dial tcp 10.128.0.41:8081: connect: connection refused"
Mar 08 03:50:52.506220 master-0 kubenswrapper[7547]: I0308 03:50:52.506119 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-86d7cdfdfb-chpl6_26180f77-0b1a-4d0f-9ed0-a12fdee69817/kube-controller-manager-operator/1.log"
Mar 08 03:50:52.510617 master-0 kubenswrapper[7547]: I0308 03:50:52.510555 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-677db989d6-t77qr_c9de4939-680a-4e3e-89fd-e20ecb8b10f2/ingress-operator/0.log"
Mar 08 03:50:52.513633 master-0 kubenswrapper[7547]: I0308 03:50:52.513580 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-8565d84698-kt66j_0ebf1330-e044-4ff5-8b48-2d667e0c5625/openshift-controller-manager-operator/1.log"
Mar 08 03:50:52.514634 master-0 kubenswrapper[7547]: I0308 03:50:52.514585 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-8565d84698-kt66j_0ebf1330-e044-4ff5-8b48-2d667e0c5625/openshift-controller-manager-operator/0.log"
Mar 08 03:50:52.518494 master-0 kubenswrapper[7547]: I0308 03:50:52.518440 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-6598bfb6c4-75qmb_2f59fe81-deee-4ced-ae9d-f17752c82c4b/manager/0.log"
Mar 08 03:50:53.461320 master-0 kubenswrapper[7547]: E0308 03:50:53.461078 7547 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{kube-controller-manager-operator-86d7cdfdfb-chpl6.189ac1075a4197f7 openshift-kube-controller-manager-operator 4150 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager-operator,Name:kube-controller-manager-operator-86d7cdfdfb-chpl6,UID:26180f77-0b1a-4d0f-9ed0-a12fdee69817,APIVersion:v1,ResourceVersion:3788,FieldPath:spec.containers{kube-controller-manager-operator},},Reason:Created,Message:Created container: kube-controller-manager-operator,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:47:20 +0000 UTC,LastTimestamp:2026-03-08 03:48:37.506604966 +0000 UTC m=+80.452289489,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 03:50:59.651761 master-0 kubenswrapper[7547]: E0308 03:50:59.651656 7547 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 08 03:51:00.316359 master-0 kubenswrapper[7547]: I0308 03:51:00.316265 7547 patch_prober.go:28] interesting pod/marketplace-operator-64bf9778cb-9sw2d container/marketplace-operator namespace/openshift-marketplace: Liveness probe status=failure output="Get \"http://10.128.0.24:8080/healthz\": dial tcp 10.128.0.24:8080: connect: connection refused" start-of-body=
Mar 08 03:51:00.316815 master-0 kubenswrapper[7547]: I0308 03:51:00.316750 7547 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/marketplace-operator-64bf9778cb-9sw2d" podUID="54ad284e-d40e-4e69-b898-f5093952a0e6" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.24:8080/healthz\": dial tcp 10.128.0.24:8080: connect: connection refused"
Mar 08 03:51:00.317090 master-0 kubenswrapper[7547]: I0308 03:51:00.316324 7547 patch_prober.go:28] interesting pod/marketplace-operator-64bf9778cb-9sw2d container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.128.0.24:8080/healthz\": dial tcp 10.128.0.24:8080: connect: connection refused" start-of-body=
Mar 08 03:51:00.317329 master-0 kubenswrapper[7547]: I0308 03:51:00.317287 7547 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-64bf9778cb-9sw2d" podUID="54ad284e-d40e-4e69-b898-f5093952a0e6" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.24:8080/healthz\": dial tcp 10.128.0.24:8080: connect: connection refused"
Mar 08 03:51:02.094669 master-0 kubenswrapper[7547]: I0308 03:51:02.094600 7547 patch_prober.go:28] interesting pod/catalogd-controller-manager-7f8b8b6f4c-8h6fj container/manager namespace/openshift-catalogd: Readiness probe status=failure output="Get \"http://10.128.0.41:8081/readyz\": dial tcp 10.128.0.41:8081: connect: connection refused" start-of-body=
Mar 08 03:51:02.095755 master-0 kubenswrapper[7547]: I0308 03:51:02.095693 7547 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-8h6fj" podUID="1b69fbf6-1ca5-413e-bffd-965730bcec1b" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.41:8081/readyz\": dial tcp 10.128.0.41:8081: connect: connection refused"
Mar 08 03:51:04.306023 master-0 kubenswrapper[7547]: E0308 03:51:04.305947 7547 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-etcd/etcd-master-0"
Mar 08 03:51:08.714162 master-0 kubenswrapper[7547]: E0308 03:51:08.714043 7547 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s"
Mar 08 03:51:09.652619 master-0 kubenswrapper[7547]: E0308 03:51:09.652502 7547 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 08 03:51:10.316089 master-0 kubenswrapper[7547]: I0308 03:51:10.315932 7547 patch_prober.go:28] interesting pod/marketplace-operator-64bf9778cb-9sw2d container/marketplace-operator namespace/openshift-marketplace: Liveness probe status=failure output="Get \"http://10.128.0.24:8080/healthz\": dial tcp 10.128.0.24:8080: connect: connection refused" start-of-body=
Mar 08 03:51:10.319436 master-0 kubenswrapper[7547]: I0308 03:51:10.316068 7547 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/marketplace-operator-64bf9778cb-9sw2d" podUID="54ad284e-d40e-4e69-b898-f5093952a0e6" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.24:8080/healthz\": dial tcp 10.128.0.24:8080: connect: connection refused"
Mar 08 03:51:10.319436 master-0 kubenswrapper[7547]: I0308 03:51:10.316089 7547 patch_prober.go:28] interesting pod/marketplace-operator-64bf9778cb-9sw2d container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.128.0.24:8080/healthz\": dial tcp 10.128.0.24:8080: connect: connection refused" start-of-body=
Mar 08 03:51:10.319436 master-0 kubenswrapper[7547]: I0308 03:51:10.316202 7547 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-64bf9778cb-9sw2d" podUID="54ad284e-d40e-4e69-b898-f5093952a0e6" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.24:8080/healthz\": dial tcp 10.128.0.24:8080: connect: connection refused"
Mar 08 03:51:12.094440 master-0 kubenswrapper[7547]: I0308 03:51:12.094336 7547 patch_prober.go:28] interesting pod/catalogd-controller-manager-7f8b8b6f4c-8h6fj container/manager namespace/openshift-catalogd: Liveness probe status=failure output="Get \"http://10.128.0.41:8081/healthz\": dial tcp 10.128.0.41:8081: connect: connection refused" start-of-body=
Mar 08 03:51:12.095245 master-0 kubenswrapper[7547]: I0308 03:51:12.094438 7547 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-8h6fj" podUID="1b69fbf6-1ca5-413e-bffd-965730bcec1b" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.41:8081/healthz\": dial tcp 10.128.0.41:8081: connect: connection refused"
Mar 08 03:51:12.095245 master-0 kubenswrapper[7547]: I0308 03:51:12.094465 7547 patch_prober.go:28] interesting pod/catalogd-controller-manager-7f8b8b6f4c-8h6fj container/manager namespace/openshift-catalogd: Readiness probe status=failure output="Get \"http://10.128.0.41:8081/readyz\": dial tcp 10.128.0.41:8081: connect: connection refused" start-of-body=
Mar 08 03:51:12.095245 master-0 kubenswrapper[7547]: I0308 03:51:12.094528 7547 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-8h6fj" podUID="1b69fbf6-1ca5-413e-bffd-965730bcec1b" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.41:8081/readyz\": dial tcp 10.128.0.41:8081: connect: connection refused"
Mar 08 03:51:14.711178 master-0 kubenswrapper[7547]: I0308 03:51:14.711070 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_cluster-storage-operator-6fbfc8dc8f-nm8fj_b3eea925-73b3-4693-8f0e-6dd26107f60a/cluster-storage-operator/1.log"
Mar 08 03:51:14.712168 master-0 kubenswrapper[7547]: I0308 03:51:14.711774 7547 generic.go:334] "Generic (PLEG): container finished" podID="b3eea925-73b3-4693-8f0e-6dd26107f60a" containerID="11981809b9cc27f184966b17ad1925dff97bd3f4b8d6d288eb4740ef6e4ff5eb" exitCode=255
Mar 08 03:51:14.723446 master-0 kubenswrapper[7547]: I0308 03:51:14.723357 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-7c6989d6c4-slm72_a60bc804-52e7-422a-87fd-ac4c5aa90cb3/authentication-operator/1.log"
Mar 08 03:51:14.724285 master-0 kubenswrapper[7547]: I0308 03:51:14.724214 7547 generic.go:334] "Generic (PLEG): container finished" podID="a60bc804-52e7-422a-87fd-ac4c5aa90cb3" containerID="376406ceea2c5527fe4c957342f4e7bbd7c621b656b4317d1368005ebb85d7c7" exitCode=255
Mar 08 03:51:15.734051 master-0 kubenswrapper[7547]: I0308 03:51:15.733957 7547 generic.go:334] "Generic (PLEG): container finished" podID="7e5935ea-8d95-45e3-b836-c7892953ef3d" containerID="7fe9302ada8235a3afd5b8f3fc53b3d920a5fbae69778891c3722690a5eb8590" exitCode=0
Mar 08 03:51:19.653628 master-0 kubenswrapper[7547]: E0308 03:51:19.653470 7547 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 08 03:51:20.315496 master-0 kubenswrapper[7547]: I0308 03:51:20.315381 7547 patch_prober.go:28] interesting pod/marketplace-operator-64bf9778cb-9sw2d container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.128.0.24:8080/healthz\": dial tcp 10.128.0.24:8080: connect: connection refused" start-of-body=
Mar 08 03:51:20.315765 master-0 kubenswrapper[7547]: I0308 03:51:20.315483 7547 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-64bf9778cb-9sw2d" podUID="54ad284e-d40e-4e69-b898-f5093952a0e6" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.24:8080/healthz\": dial tcp 10.128.0.24:8080: connect: connection refused"
Mar 08 03:51:20.770011 master-0 kubenswrapper[7547]: I0308 03:51:20.769942 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-jghp5_d831cb23-7411-4072-8273-c167d9afca28/cluster-baremetal-operator/0.log"
Mar 08 03:51:20.770994 master-0 kubenswrapper[7547]: I0308 03:51:20.770017 7547 generic.go:334] "Generic (PLEG): container finished" podID="d831cb23-7411-4072-8273-c167d9afca28" containerID="712603a1b97b084eebc58893e05cde574b9f0f2e5360a98b0fe0e6acfea60707" exitCode=1
Mar 08 03:51:22.094290 master-0 kubenswrapper[7547]: I0308 03:51:22.094209 7547 patch_prober.go:28] interesting pod/catalogd-controller-manager-7f8b8b6f4c-8h6fj container/manager namespace/openshift-catalogd: Readiness probe status=failure output="Get \"http://10.128.0.41:8081/readyz\": dial tcp 10.128.0.41:8081: connect: connection refused" start-of-body=
Mar 08 03:51:22.095187 master-0 kubenswrapper[7547]: I0308 03:51:22.094311 7547 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-8h6fj" podUID="1b69fbf6-1ca5-413e-bffd-965730bcec1b" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.41:8081/readyz\": dial tcp 10.128.0.41:8081: connect: connection refused"
Mar 08 03:51:25.319364 master-0 kubenswrapper[7547]: E0308 03:51:25.319285 7547 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0-master-0"
Mar 08 03:51:25.320187 master-0 kubenswrapper[7547]: E0308 03:51:25.319602 7547 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="34.025s"
Mar 08 03:51:25.320187 master-0 kubenswrapper[7547]: I0308 03:51:25.319654 7547 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-75qmb"
Mar 08 03:51:25.320187 master-0 kubenswrapper[7547]: I0308 03:51:25.319734 7547 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-64bf9778cb-9sw2d"
Mar 08 03:51:25.320187 master-0 kubenswrapper[7547]: I0308 03:51:25.319758 7547 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-75qmb"
Mar 08 03:51:25.320187 master-0 kubenswrapper[7547]: I0308 03:51:25.319810 7547 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-8h6fj"
Mar 08 03:51:25.327280 master-0 kubenswrapper[7547]: I0308 03:51:25.327213 7547 scope.go:117] "RemoveContainer" containerID="791aee9d23f28d5b9bc6bbbcd3f26705c245a61021bebb20a57835608ad72cab"
Mar 08 03:51:25.327647 master-0 kubenswrapper[7547]: I0308 03:51:25.327606 7547 scope.go:117] "RemoveContainer" containerID="55aa7553b7b737c589cdd0270a8ec23cc64ce136f8130219ce1dabd7e976b992"
Mar 08 03:51:25.361471 master-0 kubenswrapper[7547]: I0308 03:51:25.361402 7547 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID=""
Mar 08 03:51:25.715066 master-0 kubenswrapper[7547]: E0308 03:51:25.714996 7547 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": context deadline exceeded" interval="7s"
Mar 08 03:51:25.807265 master-0 kubenswrapper[7547]: I0308 03:51:25.807220 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-7f8b8b6f4c-8h6fj_1b69fbf6-1ca5-413e-bffd-965730bcec1b/manager/0.log"
Mar 08 03:51:28.368005 master-0 kubenswrapper[7547]: I0308 03:51:28.367863 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e78b283b-981e-48d7-a5f2-53f8401766ea-auth-proxy-config\") pod \"machine-config-operator-fdb5c78b5-2vjh2\" (UID: \"e78b283b-981e-48d7-a5f2-53f8401766ea\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-2vjh2"
Mar 08 03:51:28.369268 master-0 kubenswrapper[7547]: E0308 03:51:28.368137 7547 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/kube-rbac-proxy: configmap "kube-rbac-proxy" not found
Mar 08 03:51:28.369268 master-0 kubenswrapper[7547]: E0308 03:51:28.368258 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e78b283b-981e-48d7-a5f2-53f8401766ea-auth-proxy-config podName:e78b283b-981e-48d7-a5f2-53f8401766ea nodeName:}" failed. No retries permitted until 2026-03-08 03:53:30.368226826 +0000 UTC m=+373.313911389 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/e78b283b-981e-48d7-a5f2-53f8401766ea-auth-proxy-config") pod "machine-config-operator-fdb5c78b5-2vjh2" (UID: "e78b283b-981e-48d7-a5f2-53f8401766ea") : configmap "kube-rbac-proxy" not found
Mar 08 03:51:29.654511 master-0 kubenswrapper[7547]: E0308 03:51:29.654336 7547 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 08 03:51:29.654511 master-0 kubenswrapper[7547]: E0308 03:51:29.654411 7547 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Mar 08 03:51:37.475591 master-0 kubenswrapper[7547]: I0308 03:51:37.475500 7547 status_manager.go:851] "Failed to get status for pod" podUID="a1a56802af72ce1aac6b5077f1695ac0" pod="kube-system/bootstrap-kube-scheduler-master-0" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods bootstrap-kube-scheduler-master-0)"
Mar 08 03:51:42.716111 master-0 kubenswrapper[7547]: E0308 03:51:42.716011 7547 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s"
Mar 08 03:51:46.270084 master-0 kubenswrapper[7547]: E0308 03:51:46.269958 7547 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[auth-proxy-config], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-2vjh2" podUID="e78b283b-981e-48d7-a5f2-53f8401766ea"
Mar 08 03:51:48.980486 master-0 kubenswrapper[7547]: I0308 03:51:48.980397 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler-operator_openshift-kube-scheduler-operator-5c74bfc494-g6n58_e4541b7b-3f7f-4851-9bd9-26fcda5cab13/kube-scheduler-operator-container/1.log"
Mar 08 03:51:48.981484 master-0 kubenswrapper[7547]: I0308 03:51:48.980983 7547 generic.go:334] "Generic (PLEG): container finished" podID="e4541b7b-3f7f-4851-9bd9-26fcda5cab13" containerID="38b4abf7d4c06fafbe1f2864c946ad3648498ee7c33fbece731408c279494ae4" exitCode=255
Mar 08 03:51:48.983309 master-0 kubenswrapper[7547]: I0308 03:51:48.983257 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-69b6fc6b88-kg795_0d377285-0336-41b7-b48f-c44a7b563498/service-ca-operator/1.log"
Mar 08 03:51:48.983756 master-0 kubenswrapper[7547]: I0308 03:51:48.983710 7547 generic.go:334] "Generic (PLEG): container finished" podID="0d377285-0336-41b7-b48f-c44a7b563498" containerID="c699bcd2d347a8b5955755f0e95a151eaeb522f32880f9765692ad3e2e0369c6" exitCode=255
Mar 08 03:51:48.986282 master-0 kubenswrapper[7547]: I0308 03:51:48.986248 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-olm-operator_cluster-olm-operator-77899cf6d-x9h9q_7ff63c73-62a3-44b4-acd3-1b3df175794f/cluster-olm-operator/1.log"
Mar 08 03:51:48.987784 master-0 kubenswrapper[7547]: I0308 03:51:48.987749 7547 generic.go:334] "Generic (PLEG): container finished" podID="7ff63c73-62a3-44b4-acd3-1b3df175794f" containerID="7b9e0618571c76237a54adfbc9471783f3afade6ddbedbe9d5d1037a9f845813" exitCode=255
Mar 08 03:51:48.991879 master-0 kubenswrapper[7547]: I0308 03:51:48.990541 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-7f65c457f5-6fhhs_5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a/kube-storage-version-migrator-operator/1.log"
Mar 08 03:51:48.991879 master-0 kubenswrapper[7547]: I0308 03:51:48.991192 7547 generic.go:334] "Generic (PLEG): container finished" podID="5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a" containerID="2f6ec83521bbab74297dfc2bb7addc122c614b4ebd158773a1f21c9c8a08aa06" exitCode=255
Mar 08 03:51:48.994301 master-0 kubenswrapper[7547]: I0308 03:51:48.993368 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-5884b9cd56-vzms7_5a7752f9-7b9a-451f-997a-e9f696d38b34/etcd-operator/1.log"
Mar 08 03:51:48.994301 master-0 kubenswrapper[7547]: I0308 03:51:48.993949 7547 generic.go:334] "Generic (PLEG): container finished" podID="5a7752f9-7b9a-451f-997a-e9f696d38b34" containerID="9cb40f8e472021b6bf28adddecb51a371c5cff426f2d0e4b345adbb4c28df1e5" exitCode=255
Mar 08 03:51:49.008596 master-0 kubenswrapper[7547]: I0308 03:51:49.008516 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-7c649bf6d4-99d2k_3ddfd0e7-fe76-41bc-b316-94505df81002/network-operator/1.log"
Mar 08 03:51:49.009337 master-0 kubenswrapper[7547]: I0308 03:51:49.009288 7547 generic.go:334] "Generic (PLEG): container finished" podID="3ddfd0e7-fe76-41bc-b316-94505df81002" containerID="e9dbfd241ad84e1bb7af7ba76075dd9557271049d4b44017afb55ac9ce7ffb9b" exitCode=255
Mar 08 03:51:52.560178 master-0 kubenswrapper[7547]: E0308 03:51:52.560119 7547 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="27.24s"
Mar 08 03:51:52.572653 master-0 kubenswrapper[7547]: I0308 03:51:52.572566 7547 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID=""
Mar 08 03:51:52.576406 master-0 kubenswrapper[7547]: I0308 03:51:52.576093 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerDied","Data":"3df151c3da265182304d84afb0b3bc1e42416ef6485b53e3bd88733c8055b421"}
Mar 08 03:51:52.576406 master-0 kubenswrapper[7547]: I0308 03:51:52.576164 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-vzms7" event={"ID":"5a7752f9-7b9a-451f-997a-e9f696d38b34","Type":"ContainerDied","Data":"37cbe69de0ead690fe5f97f7713b1a785d6ee472fbe38e74b1ca8bbb8ffc0b32"}
Mar 08 03:51:52.576406 master-0 kubenswrapper[7547]: I0308 03:51:52.576197 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-x9h9q" event={"ID":"7ff63c73-62a3-44b4-acd3-1b3df175794f","Type":"ContainerDied","Data":"0d9517a4dbfbd842f9c484f0081150c86a4b5af486a5ddb8461a1d470f81112c"}
Mar 08 03:51:52.576406 master-0 kubenswrapper[7547]: I0308 03:51:52.576224 7547 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-marketplace/marketplace-operator-64bf9778cb-9sw2d"
Mar 08 03:51:52.576406 master-0 kubenswrapper[7547]: I0308 03:51:52.576257 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-g6n58" event={"ID":"e4541b7b-3f7f-4851-9bd9-26fcda5cab13","Type":"ContainerDied","Data":"2b3399b78be3045c232df9d3c4545d85577efb44cef1d2c0a18e98d67e4c7cb7"}
Mar 08 03:51:52.576406 master-0 kubenswrapper[7547]: I0308 03:51:52.576286 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7c649bf6d4-99d2k" event={"ID":"3ddfd0e7-fe76-41bc-b316-94505df81002","Type":"ContainerDied","Data":"8f8cad46e77715e164ec2e62df8c2ee60a2b96fa8c918baa1589a3082317e15b"}
Mar 08 03:51:52.576406 master-0 kubenswrapper[7547]: I0308 03:51:52.576315 7547 scope.go:117] "RemoveContainer" containerID="37cbe69de0ead690fe5f97f7713b1a785d6ee472fbe38e74b1ca8bbb8ffc0b32"
Mar 08 03:51:52.577276 master-0 kubenswrapper[7547]: I0308 03:51:52.576336 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-75682" event={"ID":"30211469-7108-4820-a988-26fc4ced734e","Type":"ContainerDied","Data":"47c3f7232d0f0bc4de9dbe2ca382d3e0709c3d618e0b06a088f2ef41c6b071e7"}
Mar 08 03:51:52.577276 master-0 kubenswrapper[7547]: I0308 03:51:52.576874 7547 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-75qmb"
Mar 08 03:51:52.577276 master-0 kubenswrapper[7547]: I0308 03:51:52.576910 7547 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-vzms7"
Mar 08 03:51:52.577276 master-0 kubenswrapper[7547]: I0308 03:51:52.576933 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-ggzm8" event={"ID":"164586b1-f133-4427-8ab6-eb0839b79738","Type":"ContainerDied","Data":"eca6f5647fbdf9b3ef8c7044a7fb91cd16de860543c74991829e340da4a238fe"}
Mar 08 03:51:52.579258 master-0 kubenswrapper[7547]: I0308 03:51:52.579106 7547 scope.go:117] "RemoveContainer" containerID="e9dbfd241ad84e1bb7af7ba76075dd9557271049d4b44017afb55ac9ce7ffb9b"
Mar 08 03:51:52.579520 master-0 kubenswrapper[7547]: E0308 03:51:52.579455 7547 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=network-operator pod=network-operator-7c649bf6d4-99d2k_openshift-network-operator(3ddfd0e7-fe76-41bc-b316-94505df81002)\"" pod="openshift-network-operator/network-operator-7c649bf6d4-99d2k" podUID="3ddfd0e7-fe76-41bc-b316-94505df81002"
Mar 08 03:51:52.579657 master-0 kubenswrapper[7547]: I0308 03:51:52.579554 7547 scope.go:117] "RemoveContainer" containerID="9cb40f8e472021b6bf28adddecb51a371c5cff426f2d0e4b345adbb4c28df1e5"
Mar 08 03:51:52.579878 master-0 kubenswrapper[7547]: E0308 03:51:52.579798 7547 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"etcd-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=etcd-operator pod=etcd-operator-5884b9cd56-vzms7_openshift-etcd-operator(5a7752f9-7b9a-451f-997a-e9f696d38b34)\"" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-vzms7" podUID="5a7752f9-7b9a-451f-997a-e9f696d38b34"
Mar 08 03:51:52.590657 master-0 kubenswrapper[7547]: I0308 03:51:52.590589 7547 scope.go:117] "RemoveContainer" containerID="712603a1b97b084eebc58893e05cde574b9f0f2e5360a98b0fe0e6acfea60707"
Mar 08 03:51:52.594293 master-0 kubenswrapper[7547]: I0308 03:51:52.593440 7547 scope.go:117] "RemoveContainer" containerID="11981809b9cc27f184966b17ad1925dff97bd3f4b8d6d288eb4740ef6e4ff5eb"
Mar 08 03:51:52.596036 master-0 kubenswrapper[7547]: I0308 03:51:52.595291 7547 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-75qmb"
Mar 08 03:51:52.596036 master-0 kubenswrapper[7547]: I0308 03:51:52.595338 7547 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-8h6fj"
Mar 08 03:51:52.596036 master-0 kubenswrapper[7547]: I0308 03:51:52.595364 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-6fhhs" event={"ID":"5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a","Type":"ContainerDied","Data":"357e1b5825405be66b8754168b57e640c102e496555d0a6b7dd9834bacebf15e"}
Mar 08 03:51:52.596036 master-0 kubenswrapper[7547]: I0308 03:51:52.595398 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-slm72" event={"ID":"a60bc804-52e7-422a-87fd-ac4c5aa90cb3","Type":"ContainerDied","Data":"d4bcecac644708f2f25ce7ab391ef8889989648db06b3e9db25dd3f64bfa6da8"}
Mar 08 03:51:52.596036 master-0 kubenswrapper[7547]: I0308 03:51:52.595424 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-kg795" event={"ID":"0d377285-0336-41b7-b48f-c44a7b563498","Type":"ContainerDied","Data":"db7380d3fe7301944a7a66ec837b1d91caad2bb5d7122a498e8b10a38c9f552b"}
Mar 08 03:51:52.596036 master-0 kubenswrapper[7547]: I0308 03:51:52.595459 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerDied","Data":"c67a98430dbc4d1ec5edc5c2aa37ee19dd047e853de22d326be45ea84e3430ff"}
Mar 08 03:51:52.596036 master-0 kubenswrapper[7547]: I0308 03:51:52.595482 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerStarted","Data":"a234e673236026ace29cfb9074b221693da101dadaf715ffecb4dfd643bf0e5f"}
Mar 08 03:51:52.596036 master-0 kubenswrapper[7547]: I0308 03:51:52.595500 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-nm8fj" event={"ID":"b3eea925-73b3-4693-8f0e-6dd26107f60a","Type":"ContainerStarted","Data":"11981809b9cc27f184966b17ad1925dff97bd3f4b8d6d288eb4740ef6e4ff5eb"}
Mar 08 03:51:52.596036 master-0 kubenswrapper[7547]: I0308 03:51:52.595519 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-slm72" event={"ID":"a60bc804-52e7-422a-87fd-ac4c5aa90cb3","Type":"ContainerStarted","Data":"376406ceea2c5527fe4c957342f4e7bbd7c621b656b4317d1368005ebb85d7c7"}
Mar 08 03:51:52.596036 master-0 kubenswrapper[7547]: I0308 03:51:52.595541 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-chpl6" event={"ID":"26180f77-0b1a-4d0f-9ed0-a12fdee69817","Type":"ContainerDied","Data":"b9a51bfe829084894104463976cade708d7a51f90ef15a899d7341f663daf1dc"}
Mar 08 03:51:52.596036 master-0 kubenswrapper[7547]: I0308 03:51:52.595565 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-kt66j" event={"ID":"0ebf1330-e044-4ff5-8b48-2d667e0c5625","Type":"ContainerDied","Data":"cae216678d94c10a368ff595527d708d87bd43ed6865eacedbf892861c47fe3a"}
Mar 08 03:51:52.596036 master-0 kubenswrapper[7547]: I0308 03:51:52.595596 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-vzms7" event={"ID":"5a7752f9-7b9a-451f-997a-e9f696d38b34","Type":"ContainerStarted","Data":"9cb40f8e472021b6bf28adddecb51a371c5cff426f2d0e4b345adbb4c28df1e5"}
Mar 08 03:51:52.596036 master-0 kubenswrapper[7547]: I0308 03:51:52.595617 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7c649bf6d4-99d2k" event={"ID":"3ddfd0e7-fe76-41bc-b316-94505df81002","Type":"ContainerStarted","Data":"e9dbfd241ad84e1bb7af7ba76075dd9557271049d4b44017afb55ac9ce7ffb9b"}
Mar 08 03:51:52.596036 master-0 kubenswrapper[7547]: I0308 03:51:52.595640 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-kg795" event={"ID":"0d377285-0336-41b7-b48f-c44a7b563498","Type":"ContainerStarted","Data":"c699bcd2d347a8b5955755f0e95a151eaeb522f32880f9765692ad3e2e0369c6"}
Mar 08 03:51:52.596036 master-0 kubenswrapper[7547]: I0308 03:51:52.595661 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-x9h9q" event={"ID":"7ff63c73-62a3-44b4-acd3-1b3df175794f","Type":"ContainerStarted","Data":"7b9e0618571c76237a54adfbc9471783f3afade6ddbedbe9d5d1037a9f845813"}
Mar 08 03:51:52.596036 master-0 kubenswrapper[7547]: I0308 03:51:52.595684 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-g6n58" event={"ID":"e4541b7b-3f7f-4851-9bd9-26fcda5cab13","Type":"ContainerStarted","Data":"38b4abf7d4c06fafbe1f2864c946ad3648498ee7c33fbece731408c279494ae4"}
Mar 08 03:51:52.596036 master-0 kubenswrapper[7547]: I0308 03:51:52.595703 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-ggzm8" event={"ID":"164586b1-f133-4427-8ab6-eb0839b79738","Type":"ContainerStarted","Data":"8ebdb5e799974ba85edc25e5ce7cb1526623500db5c19cf0e1e303e5992b5514"}
Mar 08 03:51:52.596036 master-0 kubenswrapper[7547]: I0308 03:51:52.595731 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-6fhhs" event={"ID":"5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a","Type":"ContainerStarted","Data":"2f6ec83521bbab74297dfc2bb7addc122c614b4ebd158773a1f21c9c8a08aa06"}
Mar 08 03:51:52.596036 master-0 kubenswrapper[7547]: I0308 03:51:52.595750 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-75qmb" event={"ID":"2f59fe81-deee-4ced-ae9d-f17752c82c4b","Type":"ContainerDied","Data":"3059f49f388319ee646920103084d28d8b0077750e77df3225c9bad4053dd550"}
Mar 08 03:51:52.596036 master-0 kubenswrapper[7547]: I0308 03:51:52.595868 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-h4qlp" event={"ID":"9ec89e27-4360-48f2-a7ca-5d823bda4510","Type":"ContainerDied","Data":"e1cf094994e913e66c5a9e6e155292c3e34468235cb173dcf1919a0eed0dd4ca"}
Mar 08 03:51:52.596036 master-0 kubenswrapper[7547]: I0308 03:51:52.595934 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" event={"ID":"47643289-ac4b-425d-8ea1-913b6ca39ee0","Type":"ContainerDied","Data":"18670cb65f485400f1fdb45bed0a06f4e06d21d135459ea29b3c0fcd10f2d210"}
Mar 08 03:51:52.596036 master-0 kubenswrapper[7547]: I0308 03:51:52.595963 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-677db989d6-t77qr" event={"ID":"c9de4939-680a-4e3e-89fd-e20ecb8b10f2","Type":"ContainerDied","Data":"db28e69e1ea518493719876e18b2faf675fe251b59f240840b24dd0b6d115924"}
Mar 08 03:51:52.596036 master-0 kubenswrapper[7547]: I0308 03:51:52.595987 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-5-master-0" event={"ID":"0f865279-e751-456d-8c96-6381f8b45ce1","Type":"ContainerDied","Data":"9289f2928e2e95c2ade5890aeb0e93be12cf91a6e92bf8866de144086be0fb16"}
Mar 08 03:51:52.596036 master-0 kubenswrapper[7547]: I0308 03:51:52.596012 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-64bf9778cb-9sw2d" event={"ID":"54ad284e-d40e-4e69-b898-f5093952a0e6","Type":"ContainerDied","Data":"791aee9d23f28d5b9bc6bbbcd3f26705c245a61021bebb20a57835608ad72cab"}
Mar 08 03:51:52.596036 master-0 kubenswrapper[7547]: I0308 03:51:52.596051 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-8f89dfddd-4mr6p" event={"ID":"ee586416-6f56-4ea4-ad62-95de1e6df23b","Type":"ContainerDied","Data":"644f0c7d4552f15957ecfc56f2d37a06ec2757ddcc7c2c371f0c34b92aa63533"}
Mar 08 03:51:52.596036 master-0 kubenswrapper[7547]: I0308 03:51:52.596075 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerDied","Data":"a234e673236026ace29cfb9074b221693da101dadaf715ffecb4dfd643bf0e5f"}
Mar 08 03:51:52.596036 master-0 kubenswrapper[7547]: I0308 03:51:52.596099 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod"
pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-8h6fj" event={"ID":"1b69fbf6-1ca5-413e-bffd-965730bcec1b","Type":"ContainerDied","Data":"55aa7553b7b737c589cdd0270a8ec23cc64ce136f8130219ce1dabd7e976b992"} Mar 08 03:51:52.598385 master-0 kubenswrapper[7547]: I0308 03:51:52.596124 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-75682" event={"ID":"30211469-7108-4820-a988-26fc4ced734e","Type":"ContainerStarted","Data":"3ef77df24a6188f2df12c68f2e96fc205f0952663894e7ef47efc8b458472ef3"} Mar 08 03:51:52.598385 master-0 kubenswrapper[7547]: I0308 03:51:52.596208 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-chpl6" event={"ID":"26180f77-0b1a-4d0f-9ed0-a12fdee69817","Type":"ContainerStarted","Data":"23505cadf27654fbabe6ecbcfaae5a323aefeb8dca3ee86054a0d67533ac3988"} Mar 08 03:51:52.598385 master-0 kubenswrapper[7547]: I0308 03:51:52.596276 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-677db989d6-t77qr" event={"ID":"c9de4939-680a-4e3e-89fd-e20ecb8b10f2","Type":"ContainerStarted","Data":"6dda8620c23b75054d58040dd3196a9a4c153d3bd8a4de55108611f4453bb22c"} Mar 08 03:51:52.598385 master-0 kubenswrapper[7547]: I0308 03:51:52.596366 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-kt66j" event={"ID":"0ebf1330-e044-4ff5-8b48-2d667e0c5625","Type":"ContainerStarted","Data":"1be9d220d27c69fde01642289a0e8eb2f1052094efecbeaca301570ba5131c28"} Mar 08 03:51:52.598385 master-0 kubenswrapper[7547]: I0308 03:51:52.596470 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-75qmb" 
event={"ID":"2f59fe81-deee-4ced-ae9d-f17752c82c4b","Type":"ContainerStarted","Data":"b9cf2b977eb9896bb3772a53fc657d6a23f43414b986d9aa544cc9a2d1e44724"} Mar 08 03:51:52.598385 master-0 kubenswrapper[7547]: I0308 03:51:52.596573 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerStarted","Data":"a8391451b1644c11ad363666bf4d456fe86930894f61c4a9474dc40e3b26d78b"} Mar 08 03:51:52.599242 master-0 kubenswrapper[7547]: I0308 03:51:52.599173 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerStarted","Data":"afc2ac57ed877bb9156ca731d8cd2f853ddb9f606dc1ae3cba22d206076d25c5"} Mar 08 03:51:52.599242 master-0 kubenswrapper[7547]: I0308 03:51:52.599228 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerStarted","Data":"b968b6dde7d2ded374cd8ae315cb70a664d6c49c41163b10766b7ed997cf628a"} Mar 08 03:51:52.599418 master-0 kubenswrapper[7547]: I0308 03:51:52.599249 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerStarted","Data":"5d6ee3d775aef6ab7485b93f260e08787c53fee03078cb5743281f3a00c6731a"} Mar 08 03:51:52.599418 master-0 kubenswrapper[7547]: I0308 03:51:52.599269 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerStarted","Data":"743b6d0d3328cb1e5fd90f39085d9403830aebc1de828659a1d9c0fc9660f4a2"} Mar 08 03:51:52.599418 master-0 kubenswrapper[7547]: I0308 03:51:52.599289 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-nm8fj" 
event={"ID":"b3eea925-73b3-4693-8f0e-6dd26107f60a","Type":"ContainerDied","Data":"11981809b9cc27f184966b17ad1925dff97bd3f4b8d6d288eb4740ef6e4ff5eb"} Mar 08 03:51:52.599418 master-0 kubenswrapper[7547]: I0308 03:51:52.599314 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-slm72" event={"ID":"a60bc804-52e7-422a-87fd-ac4c5aa90cb3","Type":"ContainerDied","Data":"376406ceea2c5527fe4c957342f4e7bbd7c621b656b4317d1368005ebb85d7c7"} Mar 08 03:51:52.599418 master-0 kubenswrapper[7547]: I0308 03:51:52.599347 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-bjjfh" event={"ID":"7e5935ea-8d95-45e3-b836-c7892953ef3d","Type":"ContainerDied","Data":"7fe9302ada8235a3afd5b8f3fc53b3d920a5fbae69778891c3722690a5eb8590"} Mar 08 03:51:52.599418 master-0 kubenswrapper[7547]: I0308 03:51:52.599369 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-jghp5" event={"ID":"d831cb23-7411-4072-8273-c167d9afca28","Type":"ContainerDied","Data":"712603a1b97b084eebc58893e05cde574b9f0f2e5360a98b0fe0e6acfea60707"} Mar 08 03:51:52.599418 master-0 kubenswrapper[7547]: I0308 03:51:52.597275 7547 scope.go:117] "RemoveContainer" containerID="376406ceea2c5527fe4c957342f4e7bbd7c621b656b4317d1368005ebb85d7c7" Mar 08 03:51:52.601038 master-0 kubenswrapper[7547]: I0308 03:51:52.599405 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-64bf9778cb-9sw2d" event={"ID":"54ad284e-d40e-4e69-b898-f5093952a0e6","Type":"ContainerStarted","Data":"1a31a3069861ce06f33609b07d5ca3abb641a6a3e5a27333ce4ca305d8846e91"} Mar 08 03:51:52.601038 master-0 kubenswrapper[7547]: I0308 03:51:52.599474 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-8h6fj" 
event={"ID":"1b69fbf6-1ca5-413e-bffd-965730bcec1b","Type":"ContainerStarted","Data":"0a736e2d8048f2ceaf18442229490b10930da59bbbcadc3ca87a4ce073f730b4"} Mar 08 03:51:52.601038 master-0 kubenswrapper[7547]: I0308 03:51:52.599545 7547 scope.go:117] "RemoveContainer" containerID="a234e673236026ace29cfb9074b221693da101dadaf715ffecb4dfd643bf0e5f" Mar 08 03:51:52.603940 master-0 kubenswrapper[7547]: I0308 03:51:52.603871 7547 scope.go:117] "RemoveContainer" containerID="644f0c7d4552f15957ecfc56f2d37a06ec2757ddcc7c2c371f0c34b92aa63533" Mar 08 03:51:52.604242 master-0 kubenswrapper[7547]: I0308 03:51:52.604191 7547 scope.go:117] "RemoveContainer" containerID="7fe9302ada8235a3afd5b8f3fc53b3d920a5fbae69778891c3722690a5eb8590" Mar 08 03:51:52.604501 master-0 kubenswrapper[7547]: I0308 03:51:52.599496 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-g6n58" event={"ID":"e4541b7b-3f7f-4851-9bd9-26fcda5cab13","Type":"ContainerDied","Data":"38b4abf7d4c06fafbe1f2864c946ad3648498ee7c33fbece731408c279494ae4"} Mar 08 03:51:52.604501 master-0 kubenswrapper[7547]: I0308 03:51:52.604447 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-kg795" event={"ID":"0d377285-0336-41b7-b48f-c44a7b563498","Type":"ContainerDied","Data":"c699bcd2d347a8b5955755f0e95a151eaeb522f32880f9765692ad3e2e0369c6"} Mar 08 03:51:52.605119 master-0 kubenswrapper[7547]: I0308 03:51:52.605059 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-x9h9q" event={"ID":"7ff63c73-62a3-44b4-acd3-1b3df175794f","Type":"ContainerDied","Data":"7b9e0618571c76237a54adfbc9471783f3afade6ddbedbe9d5d1037a9f845813"} Mar 08 03:51:52.605119 master-0 kubenswrapper[7547]: I0308 03:51:52.605094 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-6fhhs" event={"ID":"5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a","Type":"ContainerDied","Data":"2f6ec83521bbab74297dfc2bb7addc122c614b4ebd158773a1f21c9c8a08aa06"} Mar 08 03:51:52.605119 master-0 kubenswrapper[7547]: I0308 03:51:52.605115 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-vzms7" event={"ID":"5a7752f9-7b9a-451f-997a-e9f696d38b34","Type":"ContainerDied","Data":"9cb40f8e472021b6bf28adddecb51a371c5cff426f2d0e4b345adbb4c28df1e5"} Mar 08 03:51:52.605374 master-0 kubenswrapper[7547]: I0308 03:51:52.605137 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7c649bf6d4-99d2k" event={"ID":"3ddfd0e7-fe76-41bc-b316-94505df81002","Type":"ContainerDied","Data":"e9dbfd241ad84e1bb7af7ba76075dd9557271049d4b44017afb55ac9ce7ffb9b"} Mar 08 03:51:52.606704 master-0 kubenswrapper[7547]: I0308 03:51:52.606656 7547 scope.go:117] "RemoveContainer" containerID="e1cf094994e913e66c5a9e6e155292c3e34468235cb173dcf1919a0eed0dd4ca" Mar 08 03:51:52.606890 master-0 kubenswrapper[7547]: I0308 03:51:52.606794 7547 scope.go:117] "RemoveContainer" containerID="2f6ec83521bbab74297dfc2bb7addc122c614b4ebd158773a1f21c9c8a08aa06" Mar 08 03:51:52.611909 master-0 kubenswrapper[7547]: I0308 03:51:52.607070 7547 scope.go:117] "RemoveContainer" containerID="c699bcd2d347a8b5955755f0e95a151eaeb522f32880f9765692ad3e2e0369c6" Mar 08 03:51:52.611909 master-0 kubenswrapper[7547]: E0308 03:51:52.607511 7547 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-storage-version-migrator-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-storage-version-migrator-operator pod=kube-storage-version-migrator-operator-7f65c457f5-6fhhs_openshift-kube-storage-version-migrator-operator(5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a)\"" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-6fhhs" podUID="5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a" Mar 08 03:51:52.611909 master-0 kubenswrapper[7547]: I0308 03:51:52.607785 7547 scope.go:117] "RemoveContainer" containerID="38b4abf7d4c06fafbe1f2864c946ad3648498ee7c33fbece731408c279494ae4" Mar 08 03:51:52.611909 master-0 kubenswrapper[7547]: I0308 03:51:52.607999 7547 scope.go:117] "RemoveContainer" containerID="7b9e0618571c76237a54adfbc9471783f3afade6ddbedbe9d5d1037a9f845813" Mar 08 03:51:52.611909 master-0 kubenswrapper[7547]: E0308 03:51:52.608107 7547 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-scheduler-operator-container\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-scheduler-operator-container pod=openshift-kube-scheduler-operator-5c74bfc494-g6n58_openshift-kube-scheduler-operator(e4541b7b-3f7f-4851-9bd9-26fcda5cab13)\"" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-g6n58" podUID="e4541b7b-3f7f-4851-9bd9-26fcda5cab13" Mar 08 03:51:52.611909 master-0 kubenswrapper[7547]: E0308 03:51:52.608198 7547 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"service-ca-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=service-ca-operator pod=service-ca-operator-69b6fc6b88-kg795_openshift-service-ca-operator(0d377285-0336-41b7-b48f-c44a7b563498)\"" pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-kg795" podUID="0d377285-0336-41b7-b48f-c44a7b563498" Mar 08 03:51:52.611909 master-0 kubenswrapper[7547]: E0308 03:51:52.608232 7547 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-olm-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cluster-olm-operator 
pod=cluster-olm-operator-77899cf6d-x9h9q_openshift-cluster-olm-operator(7ff63c73-62a3-44b4-acd3-1b3df175794f)\"" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-x9h9q" podUID="7ff63c73-62a3-44b4-acd3-1b3df175794f" Mar 08 03:51:52.622517 master-0 kubenswrapper[7547]: I0308 03:51:52.620461 7547 scope.go:117] "RemoveContainer" containerID="0d9517a4dbfbd842f9c484f0081150c86a4b5af486a5ddb8461a1d470f81112c" Mar 08 03:51:52.631437 master-0 kubenswrapper[7547]: I0308 03:51:52.626630 7547 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-etcd/etcd-master-0-master-0"] Mar 08 03:51:52.631437 master-0 kubenswrapper[7547]: I0308 03:51:52.626673 7547 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-etcd/etcd-master-0-master-0" mirrorPodUID="3a052842-9f94-48a2-b991-c898ea6973c2" Mar 08 03:51:52.631437 master-0 kubenswrapper[7547]: I0308 03:51:52.630490 7547 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-etcd/etcd-master-0-master-0"] Mar 08 03:51:52.631437 master-0 kubenswrapper[7547]: I0308 03:51:52.630544 7547 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-etcd/etcd-master-0-master-0" mirrorPodUID="3a052842-9f94-48a2-b991-c898ea6973c2" Mar 08 03:51:52.632492 master-0 kubenswrapper[7547]: I0308 03:51:52.632421 7547 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rnz4w" podStartSLOduration=211.952140945 podStartE2EDuration="3m34.632399581s" podCreationTimestamp="2026-03-08 03:48:18 +0000 UTC" firstStartedPulling="2026-03-08 03:48:36.828899341 +0000 UTC m=+79.774583854" lastFinishedPulling="2026-03-08 03:48:39.509157937 +0000 UTC m=+82.454842490" observedRunningTime="2026-03-08 03:51:52.569547251 +0000 UTC m=+275.515231764" watchObservedRunningTime="2026-03-08 03:51:52.632399581 +0000 UTC m=+275.578084134" Mar 08 03:51:52.640144 master-0 kubenswrapper[7547]: I0308 03:51:52.640078 7547 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-master-0"] Mar 08 03:51:52.650579 master-0 kubenswrapper[7547]: I0308 03:51:52.650489 7547 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4h8qm" podStartSLOduration=209.065430119 podStartE2EDuration="3m30.650446966s" podCreationTimestamp="2026-03-08 03:48:22 +0000 UTC" firstStartedPulling="2026-03-08 03:48:37.923111118 +0000 UTC m=+80.868795671" lastFinishedPulling="2026-03-08 03:48:39.508127965 +0000 UTC m=+82.453812518" observedRunningTime="2026-03-08 03:51:52.645312816 +0000 UTC m=+275.590997359" watchObservedRunningTime="2026-03-08 03:51:52.650446966 +0000 UTC m=+275.596131519" Mar 08 03:51:52.670999 master-0 kubenswrapper[7547]: I0308 03:51:52.670396 7547 scope.go:117] "RemoveContainer" containerID="2b3399b78be3045c232df9d3c4545d85577efb44cef1d2c0a18e98d67e4c7cb7" Mar 08 03:51:52.680360 master-0 kubenswrapper[7547]: I0308 03:51:52.680270 7547 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-5-master-0" podStartSLOduration=205.680250758 podStartE2EDuration="3m25.680250758s" podCreationTimestamp="2026-03-08 03:48:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:51:52.677385611 +0000 UTC m=+275.623070164" watchObservedRunningTime="2026-03-08 03:51:52.680250758 +0000 UTC m=+275.625935311" Mar 08 03:51:52.753633 master-0 kubenswrapper[7547]: I0308 03:51:52.753562 7547 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"] Mar 08 03:51:52.757781 master-0 kubenswrapper[7547]: I0308 03:51:52.757686 7547 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"] Mar 08 03:51:52.760617 master-0 kubenswrapper[7547]: I0308 03:51:52.760497 7547 scope.go:117] 
"RemoveContainer" containerID="8f8cad46e77715e164ec2e62df8c2ee60a2b96fa8c918baa1589a3082317e15b" Mar 08 03:51:52.792755 master-0 kubenswrapper[7547]: I0308 03:51:52.792679 7547 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-2-master-0" podStartSLOduration=213.792634516 podStartE2EDuration="3m33.792634516s" podCreationTimestamp="2026-03-08 03:48:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:51:52.771601341 +0000 UTC m=+275.717285864" watchObservedRunningTime="2026-03-08 03:51:52.792634516 +0000 UTC m=+275.738319049" Mar 08 03:51:52.797266 master-0 kubenswrapper[7547]: I0308 03:51:52.797101 7547 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-zq9rp" podStartSLOduration=196.797081571 podStartE2EDuration="3m16.797081571s" podCreationTimestamp="2026-03-08 03:48:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:51:52.790769033 +0000 UTC m=+275.736453566" watchObservedRunningTime="2026-03-08 03:51:52.797081571 +0000 UTC m=+275.742766084" Mar 08 03:51:52.840720 master-0 kubenswrapper[7547]: I0308 03:51:52.840671 7547 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9ljn9"] Mar 08 03:51:52.843572 master-0 kubenswrapper[7547]: I0308 03:51:52.843554 7547 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9ljn9"] Mar 08 03:51:52.874584 master-0 kubenswrapper[7547]: I0308 03:51:52.874551 7547 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-4-master-0"] Mar 08 03:51:52.884201 master-0 kubenswrapper[7547]: I0308 03:51:52.883441 7547 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-kube-scheduler/installer-4-master-0"] Mar 08 03:51:52.884298 master-0 kubenswrapper[7547]: I0308 03:51:52.884223 7547 scope.go:117] "RemoveContainer" containerID="357e1b5825405be66b8754168b57e640c102e496555d0a6b7dd9834bacebf15e" Mar 08 03:51:52.920801 master-0 kubenswrapper[7547]: I0308 03:51:52.920766 7547 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-c88pb"] Mar 08 03:51:52.932225 master-0 kubenswrapper[7547]: I0308 03:51:52.932181 7547 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-c88pb"] Mar 08 03:51:52.947231 master-0 kubenswrapper[7547]: I0308 03:51:52.945122 7547 scope.go:117] "RemoveContainer" containerID="d4bcecac644708f2f25ce7ab391ef8889989648db06b3e9db25dd3f64bfa6da8" Mar 08 03:51:52.953710 master-0 kubenswrapper[7547]: I0308 03:51:52.953623 7547 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-p8nq8" podStartSLOduration=211.460990781 podStartE2EDuration="3m33.953606709s" podCreationTimestamp="2026-03-08 03:48:19 +0000 UTC" firstStartedPulling="2026-03-08 03:48:37.867783904 +0000 UTC m=+80.813468447" lastFinishedPulling="2026-03-08 03:48:40.360399852 +0000 UTC m=+83.306084375" observedRunningTime="2026-03-08 03:51:52.952199725 +0000 UTC m=+275.897884238" watchObservedRunningTime="2026-03-08 03:51:52.953606709 +0000 UTC m=+275.899291222" Mar 08 03:51:52.970488 master-0 kubenswrapper[7547]: I0308 03:51:52.970399 7547 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lwt58" podStartSLOduration=209.281751765 podStartE2EDuration="3m31.970382774s" podCreationTimestamp="2026-03-08 03:48:21 +0000 UTC" firstStartedPulling="2026-03-08 03:48:37.90501676 +0000 UTC m=+80.850701303" lastFinishedPulling="2026-03-08 03:48:40.593647769 +0000 UTC m=+83.539332312" observedRunningTime="2026-03-08 03:51:52.969514113 +0000 UTC 
m=+275.915198626" watchObservedRunningTime="2026-03-08 03:51:52.970382774 +0000 UTC m=+275.916067287" Mar 08 03:51:52.990074 master-0 kubenswrapper[7547]: I0308 03:51:52.990024 7547 scope.go:117] "RemoveContainer" containerID="db7380d3fe7301944a7a66ec837b1d91caad2bb5d7122a498e8b10a38c9f552b" Mar 08 03:51:53.023714 master-0 kubenswrapper[7547]: I0308 03:51:53.023663 7547 scope.go:117] "RemoveContainer" containerID="c67a98430dbc4d1ec5edc5c2aa37ee19dd047e853de22d326be45ea84e3430ff" Mar 08 03:51:53.058217 master-0 kubenswrapper[7547]: I0308 03:51:53.058189 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-7f65c457f5-6fhhs_5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a/kube-storage-version-migrator-operator/1.log" Mar 08 03:51:53.058632 master-0 kubenswrapper[7547]: I0308 03:51:53.058614 7547 scope.go:117] "RemoveContainer" containerID="2f6ec83521bbab74297dfc2bb7addc122c614b4ebd158773a1f21c9c8a08aa06" Mar 08 03:51:53.058788 master-0 kubenswrapper[7547]: E0308 03:51:53.058768 7547 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-storage-version-migrator-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-storage-version-migrator-operator pod=kube-storage-version-migrator-operator-7f65c457f5-6fhhs_openshift-kube-storage-version-migrator-operator(5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a)\"" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-6fhhs" podUID="5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a" Mar 08 03:51:53.061557 master-0 kubenswrapper[7547]: I0308 03:51:53.061447 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-5884b9cd56-vzms7_5a7752f9-7b9a-451f-997a-e9f696d38b34/etcd-operator/1.log" Mar 08 03:51:53.062166 master-0 kubenswrapper[7547]: I0308 03:51:53.062148 7547 scope.go:117] "RemoveContainer" 
containerID="9cb40f8e472021b6bf28adddecb51a371c5cff426f2d0e4b345adbb4c28df1e5" Mar 08 03:51:53.062308 master-0 kubenswrapper[7547]: E0308 03:51:53.062288 7547 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"etcd-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=etcd-operator pod=etcd-operator-5884b9cd56-vzms7_openshift-etcd-operator(5a7752f9-7b9a-451f-997a-e9f696d38b34)\"" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-vzms7" podUID="5a7752f9-7b9a-451f-997a-e9f696d38b34" Mar 08 03:51:53.071263 master-0 kubenswrapper[7547]: I0308 03:51:53.071232 7547 scope.go:117] "RemoveContainer" containerID="724178cb9f231b822e2bf919b24049f88ede4ee540e7e7751c011ef4363756c9" Mar 08 03:51:53.071367 master-0 kubenswrapper[7547]: I0308 03:51:53.071345 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-bjjfh" event={"ID":"7e5935ea-8d95-45e3-b836-c7892953ef3d","Type":"ContainerStarted","Data":"2437868a78df876a1b3a4d8757e1788d355f9ffb34efa554d9cc67df46e10738"} Mar 08 03:51:53.076979 master-0 kubenswrapper[7547]: I0308 03:51:53.074303 7547 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fcdn7"] Mar 08 03:51:53.086019 master-0 kubenswrapper[7547]: I0308 03:51:53.084262 7547 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fcdn7"] Mar 08 03:51:53.088868 master-0 kubenswrapper[7547]: I0308 03:51:53.088455 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-7c649bf6d4-99d2k_3ddfd0e7-fe76-41bc-b316-94505df81002/network-operator/1.log" Mar 08 03:51:53.088988 master-0 kubenswrapper[7547]: I0308 03:51:53.088887 7547 scope.go:117] "RemoveContainer" containerID="e9dbfd241ad84e1bb7af7ba76075dd9557271049d4b44017afb55ac9ce7ffb9b" Mar 08 03:51:53.092420 master-0 kubenswrapper[7547]: E0308 03:51:53.090713 7547 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=network-operator pod=network-operator-7c649bf6d4-99d2k_openshift-network-operator(3ddfd0e7-fe76-41bc-b316-94505df81002)\"" pod="openshift-network-operator/network-operator-7c649bf6d4-99d2k" podUID="3ddfd0e7-fe76-41bc-b316-94505df81002" Mar 08 03:51:53.101009 master-0 kubenswrapper[7547]: I0308 03:51:53.100952 7547 scope.go:117] "RemoveContainer" containerID="2b806592f345fd33c0e6baaad7d7fe21c75572bbe4983f5588e4e61c09a25b29" Mar 08 03:51:53.101440 master-0 kubenswrapper[7547]: E0308 03:51:53.101380 7547 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"724178cb9f231b822e2bf919b24049f88ede4ee540e7e7751c011ef4363756c9\": container with ID starting with 724178cb9f231b822e2bf919b24049f88ede4ee540e7e7751c011ef4363756c9 not found: ID does not exist" containerID="724178cb9f231b822e2bf919b24049f88ede4ee540e7e7751c011ef4363756c9" Mar 08 03:51:53.104471 master-0 kubenswrapper[7547]: I0308 03:51:53.104451 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-jghp5_d831cb23-7411-4072-8273-c167d9afca28/cluster-baremetal-operator/0.log" Mar 08 03:51:53.104540 master-0 kubenswrapper[7547]: I0308 03:51:53.104508 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-jghp5" event={"ID":"d831cb23-7411-4072-8273-c167d9afca28","Type":"ContainerStarted","Data":"d31110b071a8e480e8de5e3d670b44efd0f87c9bdf6fe27693931341b4131afe"} Mar 08 03:51:53.109887 master-0 kubenswrapper[7547]: I0308 03:51:53.109865 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-7c6989d6c4-slm72_a60bc804-52e7-422a-87fd-ac4c5aa90cb3/authentication-operator/1.log" Mar 08 
03:51:53.109957 master-0 kubenswrapper[7547]: I0308 03:51:53.109936 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-slm72" event={"ID":"a60bc804-52e7-422a-87fd-ac4c5aa90cb3","Type":"ContainerStarted","Data":"9f8bedc6586942e8e66b1ee9c8e96a428b5e99d89a0a7931a59a00990d670518"}
Mar 08 03:51:53.114585 master-0 kubenswrapper[7547]: I0308 03:51:53.114565 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-olm-operator_cluster-olm-operator-77899cf6d-x9h9q_7ff63c73-62a3-44b4-acd3-1b3df175794f/cluster-olm-operator/1.log"
Mar 08 03:51:53.117339 master-0 kubenswrapper[7547]: I0308 03:51:53.117321 7547 scope.go:117] "RemoveContainer" containerID="7b9e0618571c76237a54adfbc9471783f3afade6ddbedbe9d5d1037a9f845813"
Mar 08 03:51:53.117477 master-0 kubenswrapper[7547]: E0308 03:51:53.117453 7547 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-olm-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cluster-olm-operator pod=cluster-olm-operator-77899cf6d-x9h9q_openshift-cluster-olm-operator(7ff63c73-62a3-44b4-acd3-1b3df175794f)\"" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-x9h9q" podUID="7ff63c73-62a3-44b4-acd3-1b3df175794f"
Mar 08 03:51:53.120091 master-0 kubenswrapper[7547]: I0308 03:51:53.120021 7547 scope.go:117] "RemoveContainer" containerID="8a84af60c043e955bcc0105f0aa3f93048f54c92376777b25ef1335389f355a8"
Mar 08 03:51:53.126005 master-0 kubenswrapper[7547]: I0308 03:51:53.125636 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_cluster-storage-operator-6fbfc8dc8f-nm8fj_b3eea925-73b3-4693-8f0e-6dd26107f60a/cluster-storage-operator/1.log"
Mar 08 03:51:53.126157 master-0 kubenswrapper[7547]: I0308 03:51:53.126090 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-nm8fj" event={"ID":"b3eea925-73b3-4693-8f0e-6dd26107f60a","Type":"ContainerStarted","Data":"e1fec486ad164fa39bf18c97bf6ae2ed95b35ced36abe0b9ec4d03fe751e8f33"}
Mar 08 03:51:53.134438 master-0 kubenswrapper[7547]: I0308 03:51:53.133574 7547 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-w2lsp"]
Mar 08 03:51:53.135768 master-0 kubenswrapper[7547]: I0308 03:51:53.135739 7547 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-w2lsp"]
Mar 08 03:51:53.136007 master-0 kubenswrapper[7547]: I0308 03:51:53.135986 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-69b6fc6b88-kg795_0d377285-0336-41b7-b48f-c44a7b563498/service-ca-operator/1.log"
Mar 08 03:51:53.136462 master-0 kubenswrapper[7547]: I0308 03:51:53.136425 7547 scope.go:117] "RemoveContainer" containerID="c699bcd2d347a8b5955755f0e95a151eaeb522f32880f9765692ad3e2e0369c6"
Mar 08 03:51:53.136666 master-0 kubenswrapper[7547]: E0308 03:51:53.136639 7547 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"service-ca-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=service-ca-operator pod=service-ca-operator-69b6fc6b88-kg795_openshift-service-ca-operator(0d377285-0336-41b7-b48f-c44a7b563498)\"" pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-kg795" podUID="0d377285-0336-41b7-b48f-c44a7b563498"
Mar 08 03:51:53.138648 master-0 kubenswrapper[7547]: I0308 03:51:53.138613 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-h4qlp_9ec89e27-4360-48f2-a7ca-5d823bda4510/snapshot-controller/0.log"
Mar 08 03:51:53.138716 master-0 kubenswrapper[7547]: I0308 03:51:53.138684 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-h4qlp" event={"ID":"9ec89e27-4360-48f2-a7ca-5d823bda4510","Type":"ContainerStarted","Data":"bc802833ef70245be653ae91aa731daa7eab1a05e8fd9b4edd25c8e6a8279edb"}
Mar 08 03:51:53.140265 master-0 kubenswrapper[7547]: I0308 03:51:53.139982 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler-operator_openshift-kube-scheduler-operator-5c74bfc494-g6n58_e4541b7b-3f7f-4851-9bd9-26fcda5cab13/kube-scheduler-operator-container/1.log"
Mar 08 03:51:53.140325 master-0 kubenswrapper[7547]: I0308 03:51:53.140294 7547 scope.go:117] "RemoveContainer" containerID="38b4abf7d4c06fafbe1f2864c946ad3648498ee7c33fbece731408c279494ae4"
Mar 08 03:51:53.140439 master-0 kubenswrapper[7547]: E0308 03:51:53.140416 7547 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-scheduler-operator-container\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-scheduler-operator-container pod=openshift-kube-scheduler-operator-5c74bfc494-g6n58_openshift-kube-scheduler-operator(e4541b7b-3f7f-4851-9bd9-26fcda5cab13)\"" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-g6n58" podUID="e4541b7b-3f7f-4851-9bd9-26fcda5cab13"
Mar 08 03:51:53.142297 master-0 kubenswrapper[7547]: I0308 03:51:53.142279 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-8f89dfddd-4mr6p" event={"ID":"ee586416-6f56-4ea4-ad62-95de1e6df23b","Type":"ContainerStarted","Data":"ee2bfb125e22b7f5901652b9d324e5701d25b6ae22870a7e30683877ccc3b4cb"}
Mar 08 03:51:53.146152 master-0 kubenswrapper[7547]: I0308 03:51:53.146120 7547 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-8h6fj"
Mar 08 03:51:53.146152 master-0 kubenswrapper[7547]: I0308 03:51:53.146154 7547 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-64bf9778cb-9sw2d"
Mar 08 03:51:53.146399 master-0 kubenswrapper[7547]: I0308 03:51:53.146372 7547 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-8h6fj"
Mar 08 03:51:53.150848 master-0 kubenswrapper[7547]: I0308 03:51:53.149506 7547 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-64bf9778cb-9sw2d"
Mar 08 03:51:53.155676 master-0 kubenswrapper[7547]: I0308 03:51:53.155627 7547 scope.go:117] "RemoveContainer" containerID="f0c21a56c7d12d77087ad5558ab608389fecd51a0d4bdef95c63dd3e4d27cfef"
Mar 08 03:51:53.156416 master-0 kubenswrapper[7547]: E0308 03:51:53.156391 7547 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-master-0\" already exists" pod="openshift-etcd/etcd-master-0"
Mar 08 03:51:53.174027 master-0 kubenswrapper[7547]: I0308 03:51:53.173986 7547 scope.go:117] "RemoveContainer" containerID="c67a98430dbc4d1ec5edc5c2aa37ee19dd047e853de22d326be45ea84e3430ff"
Mar 08 03:51:53.174456 master-0 kubenswrapper[7547]: E0308 03:51:53.174403 7547 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c67a98430dbc4d1ec5edc5c2aa37ee19dd047e853de22d326be45ea84e3430ff\": container with ID starting with c67a98430dbc4d1ec5edc5c2aa37ee19dd047e853de22d326be45ea84e3430ff not found: ID does not exist" containerID="c67a98430dbc4d1ec5edc5c2aa37ee19dd047e853de22d326be45ea84e3430ff"
Mar 08 03:51:53.174456 master-0 kubenswrapper[7547]: I0308 03:51:53.174443 7547 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c67a98430dbc4d1ec5edc5c2aa37ee19dd047e853de22d326be45ea84e3430ff"} err="failed to get container status \"c67a98430dbc4d1ec5edc5c2aa37ee19dd047e853de22d326be45ea84e3430ff\": rpc error: code = NotFound desc = could not find container \"c67a98430dbc4d1ec5edc5c2aa37ee19dd047e853de22d326be45ea84e3430ff\": container with ID starting with c67a98430dbc4d1ec5edc5c2aa37ee19dd047e853de22d326be45ea84e3430ff not found: ID does not exist"
Mar 08 03:51:53.174574 master-0 kubenswrapper[7547]: I0308 03:51:53.174464 7547 scope.go:117] "RemoveContainer" containerID="724178cb9f231b822e2bf919b24049f88ede4ee540e7e7751c011ef4363756c9"
Mar 08 03:51:53.175858 master-0 kubenswrapper[7547]: I0308 03:51:53.175807 7547 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"724178cb9f231b822e2bf919b24049f88ede4ee540e7e7751c011ef4363756c9"} err="failed to get container status \"724178cb9f231b822e2bf919b24049f88ede4ee540e7e7751c011ef4363756c9\": rpc error: code = NotFound desc = could not find container \"724178cb9f231b822e2bf919b24049f88ede4ee540e7e7751c011ef4363756c9\": container with ID starting with 724178cb9f231b822e2bf919b24049f88ede4ee540e7e7751c011ef4363756c9 not found: ID does not exist"
Mar 08 03:51:53.175858 master-0 kubenswrapper[7547]: I0308 03:51:53.175855 7547 scope.go:117] "RemoveContainer" containerID="9c4b058bc98e254a8a4b1a2af3561d6b7519c1e36ed6446917dcc85e6786652f"
Mar 08 03:51:53.191280 master-0 kubenswrapper[7547]: I0308 03:51:53.191242 7547 scope.go:117] "RemoveContainer" containerID="d4bcecac644708f2f25ce7ab391ef8889989648db06b3e9db25dd3f64bfa6da8"
Mar 08 03:51:53.191603 master-0 kubenswrapper[7547]: E0308 03:51:53.191580 7547 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4bcecac644708f2f25ce7ab391ef8889989648db06b3e9db25dd3f64bfa6da8\": container with ID starting with d4bcecac644708f2f25ce7ab391ef8889989648db06b3e9db25dd3f64bfa6da8 not found: ID does not exist" containerID="d4bcecac644708f2f25ce7ab391ef8889989648db06b3e9db25dd3f64bfa6da8"
Mar 08 03:51:53.191657 master-0 kubenswrapper[7547]: I0308 03:51:53.191608 7547 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4bcecac644708f2f25ce7ab391ef8889989648db06b3e9db25dd3f64bfa6da8"} err="failed to get container status \"d4bcecac644708f2f25ce7ab391ef8889989648db06b3e9db25dd3f64bfa6da8\": rpc error: code = NotFound desc = could not find container \"d4bcecac644708f2f25ce7ab391ef8889989648db06b3e9db25dd3f64bfa6da8\": container with ID starting with d4bcecac644708f2f25ce7ab391ef8889989648db06b3e9db25dd3f64bfa6da8 not found: ID does not exist"
Mar 08 03:51:53.191657 master-0 kubenswrapper[7547]: I0308 03:51:53.191627 7547 scope.go:117] "RemoveContainer" containerID="2b3399b78be3045c232df9d3c4545d85577efb44cef1d2c0a18e98d67e4c7cb7"
Mar 08 03:51:53.191953 master-0 kubenswrapper[7547]: E0308 03:51:53.191934 7547 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b3399b78be3045c232df9d3c4545d85577efb44cef1d2c0a18e98d67e4c7cb7\": container with ID starting with 2b3399b78be3045c232df9d3c4545d85577efb44cef1d2c0a18e98d67e4c7cb7 not found: ID does not exist" containerID="2b3399b78be3045c232df9d3c4545d85577efb44cef1d2c0a18e98d67e4c7cb7"
Mar 08 03:51:53.191998 master-0 kubenswrapper[7547]: I0308 03:51:53.191953 7547 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b3399b78be3045c232df9d3c4545d85577efb44cef1d2c0a18e98d67e4c7cb7"} err="failed to get container status \"2b3399b78be3045c232df9d3c4545d85577efb44cef1d2c0a18e98d67e4c7cb7\": rpc error: code = NotFound desc = could not find container \"2b3399b78be3045c232df9d3c4545d85577efb44cef1d2c0a18e98d67e4c7cb7\": container with ID starting with 2b3399b78be3045c232df9d3c4545d85577efb44cef1d2c0a18e98d67e4c7cb7 not found: ID does not exist"
Mar 08 03:51:53.191998 master-0 kubenswrapper[7547]: I0308 03:51:53.191966 7547 scope.go:117] "RemoveContainer" containerID="db7380d3fe7301944a7a66ec837b1d91caad2bb5d7122a498e8b10a38c9f552b"
Mar 08 03:51:53.192243 master-0 kubenswrapper[7547]: E0308 03:51:53.192224 7547 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db7380d3fe7301944a7a66ec837b1d91caad2bb5d7122a498e8b10a38c9f552b\": container with ID starting with db7380d3fe7301944a7a66ec837b1d91caad2bb5d7122a498e8b10a38c9f552b not found: ID does not exist" containerID="db7380d3fe7301944a7a66ec837b1d91caad2bb5d7122a498e8b10a38c9f552b"
Mar 08 03:51:53.192299 master-0 kubenswrapper[7547]: I0308 03:51:53.192245 7547 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db7380d3fe7301944a7a66ec837b1d91caad2bb5d7122a498e8b10a38c9f552b"} err="failed to get container status \"db7380d3fe7301944a7a66ec837b1d91caad2bb5d7122a498e8b10a38c9f552b\": rpc error: code = NotFound desc = could not find container \"db7380d3fe7301944a7a66ec837b1d91caad2bb5d7122a498e8b10a38c9f552b\": container with ID starting with db7380d3fe7301944a7a66ec837b1d91caad2bb5d7122a498e8b10a38c9f552b not found: ID does not exist"
Mar 08 03:51:53.192299 master-0 kubenswrapper[7547]: I0308 03:51:53.192258 7547 scope.go:117] "RemoveContainer" containerID="0d9517a4dbfbd842f9c484f0081150c86a4b5af486a5ddb8461a1d470f81112c"
Mar 08 03:51:53.192454 master-0 kubenswrapper[7547]: E0308 03:51:53.192429 7547 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d9517a4dbfbd842f9c484f0081150c86a4b5af486a5ddb8461a1d470f81112c\": container with ID starting with 0d9517a4dbfbd842f9c484f0081150c86a4b5af486a5ddb8461a1d470f81112c not found: ID does not exist" containerID="0d9517a4dbfbd842f9c484f0081150c86a4b5af486a5ddb8461a1d470f81112c"
Mar 08 03:51:53.192496 master-0 kubenswrapper[7547]: I0308 03:51:53.192452 7547 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d9517a4dbfbd842f9c484f0081150c86a4b5af486a5ddb8461a1d470f81112c"} err="failed to get container status \"0d9517a4dbfbd842f9c484f0081150c86a4b5af486a5ddb8461a1d470f81112c\": rpc error: code = NotFound desc = could not find container \"0d9517a4dbfbd842f9c484f0081150c86a4b5af486a5ddb8461a1d470f81112c\": container with ID starting with 0d9517a4dbfbd842f9c484f0081150c86a4b5af486a5ddb8461a1d470f81112c not found: ID does not exist"
Mar 08 03:51:53.192496 master-0 kubenswrapper[7547]: I0308 03:51:53.192465 7547 scope.go:117] "RemoveContainer" containerID="357e1b5825405be66b8754168b57e640c102e496555d0a6b7dd9834bacebf15e"
Mar 08 03:51:53.193392 master-0 kubenswrapper[7547]: E0308 03:51:53.193147 7547 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"357e1b5825405be66b8754168b57e640c102e496555d0a6b7dd9834bacebf15e\": container with ID starting with 357e1b5825405be66b8754168b57e640c102e496555d0a6b7dd9834bacebf15e not found: ID does not exist" containerID="357e1b5825405be66b8754168b57e640c102e496555d0a6b7dd9834bacebf15e"
Mar 08 03:51:53.193392 master-0 kubenswrapper[7547]: I0308 03:51:53.193167 7547 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"357e1b5825405be66b8754168b57e640c102e496555d0a6b7dd9834bacebf15e"} err="failed to get container status \"357e1b5825405be66b8754168b57e640c102e496555d0a6b7dd9834bacebf15e\": rpc error: code = NotFound desc = could not find container \"357e1b5825405be66b8754168b57e640c102e496555d0a6b7dd9834bacebf15e\": container with ID starting with 357e1b5825405be66b8754168b57e640c102e496555d0a6b7dd9834bacebf15e not found: ID does not exist"
Mar 08 03:51:53.193392 master-0 kubenswrapper[7547]: I0308 03:51:53.193180 7547 scope.go:117] "RemoveContainer" containerID="37cbe69de0ead690fe5f97f7713b1a785d6ee472fbe38e74b1ca8bbb8ffc0b32"
Mar 08 03:51:53.193392 master-0 kubenswrapper[7547]: E0308 03:51:53.193334 7547 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37cbe69de0ead690fe5f97f7713b1a785d6ee472fbe38e74b1ca8bbb8ffc0b32\": container with ID starting with 37cbe69de0ead690fe5f97f7713b1a785d6ee472fbe38e74b1ca8bbb8ffc0b32 not found: ID does not exist" containerID="37cbe69de0ead690fe5f97f7713b1a785d6ee472fbe38e74b1ca8bbb8ffc0b32"
Mar 08 03:51:53.193392 master-0 kubenswrapper[7547]: I0308 03:51:53.193348 7547 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37cbe69de0ead690fe5f97f7713b1a785d6ee472fbe38e74b1ca8bbb8ffc0b32"} err="failed to get container status \"37cbe69de0ead690fe5f97f7713b1a785d6ee472fbe38e74b1ca8bbb8ffc0b32\": rpc error: code = NotFound desc = could not find container \"37cbe69de0ead690fe5f97f7713b1a785d6ee472fbe38e74b1ca8bbb8ffc0b32\": container with ID starting with 37cbe69de0ead690fe5f97f7713b1a785d6ee472fbe38e74b1ca8bbb8ffc0b32 not found: ID does not exist"
Mar 08 03:51:53.193392 master-0 kubenswrapper[7547]: I0308 03:51:53.193360 7547 scope.go:117] "RemoveContainer" containerID="8f8cad46e77715e164ec2e62df8c2ee60a2b96fa8c918baa1589a3082317e15b"
Mar 08 03:51:53.193584 master-0 kubenswrapper[7547]: E0308 03:51:53.193532 7547 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f8cad46e77715e164ec2e62df8c2ee60a2b96fa8c918baa1589a3082317e15b\": container with ID starting with 8f8cad46e77715e164ec2e62df8c2ee60a2b96fa8c918baa1589a3082317e15b not found: ID does not exist" containerID="8f8cad46e77715e164ec2e62df8c2ee60a2b96fa8c918baa1589a3082317e15b"
Mar 08 03:51:53.193584 master-0 kubenswrapper[7547]: I0308 03:51:53.193547 7547 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f8cad46e77715e164ec2e62df8c2ee60a2b96fa8c918baa1589a3082317e15b"} err="failed to get container status \"8f8cad46e77715e164ec2e62df8c2ee60a2b96fa8c918baa1589a3082317e15b\": rpc error: code = NotFound desc = could not find container \"8f8cad46e77715e164ec2e62df8c2ee60a2b96fa8c918baa1589a3082317e15b\": container with ID starting with 8f8cad46e77715e164ec2e62df8c2ee60a2b96fa8c918baa1589a3082317e15b not found: ID does not exist"
Mar 08 03:51:53.283928 master-0 kubenswrapper[7547]: I0308 03:51:53.270904 7547 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f62034a-dae9-46af-8c14-006b728b631f" path="/var/lib/kubelet/pods/0f62034a-dae9-46af-8c14-006b728b631f/volumes"
Mar 08 03:51:53.283928 master-0 kubenswrapper[7547]: I0308 03:51:53.271370 7547 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47b7e26d-8fb3-4749-a544-c86c3a06e439" path="/var/lib/kubelet/pods/47b7e26d-8fb3-4749-a544-c86c3a06e439/volumes"
Mar 08 03:51:53.283928 master-0 kubenswrapper[7547]: I0308 03:51:53.271867 7547 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4818bf75-506a-4b39-bb9b-8067a02d4a51" path="/var/lib/kubelet/pods/4818bf75-506a-4b39-bb9b-8067a02d4a51/volumes"
Mar 08 03:51:53.285067 master-0 kubenswrapper[7547]: I0308 03:51:53.284555 7547 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b8dcd07-f245-4783-ba40-521a14e96043" path="/var/lib/kubelet/pods/8b8dcd07-f245-4783-ba40-521a14e96043/volumes"
Mar 08 03:51:53.285857 master-0 kubenswrapper[7547]: I0308 03:51:53.285227 7547 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fbe4302-8264-4b6c-ae3f-0c6d981bc998" path="/var/lib/kubelet/pods/9fbe4302-8264-4b6c-ae3f-0c6d981bc998/volumes"
Mar 08 03:51:53.285857 master-0 kubenswrapper[7547]: I0308 03:51:53.285672 7547 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4ff897a-ac47-45e0-aa7d-88c5aea50b70" path="/var/lib/kubelet/pods/a4ff897a-ac47-45e0-aa7d-88c5aea50b70/volumes"
Mar 08 03:51:53.435508 master-0 kubenswrapper[7547]: I0308 03:51:53.435477 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-5-master-0_0f865279-e751-456d-8c96-6381f8b45ce1/installer/0.log"
Mar 08 03:51:53.435695 master-0 kubenswrapper[7547]: I0308 03:51:53.435552 7547 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-5-master-0"
Mar 08 03:51:53.442062 master-0 kubenswrapper[7547]: I0308 03:51:53.442017 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0f865279-e751-456d-8c96-6381f8b45ce1-var-lock\") pod \"0f865279-e751-456d-8c96-6381f8b45ce1\" (UID: \"0f865279-e751-456d-8c96-6381f8b45ce1\") "
Mar 08 03:51:53.442131 master-0 kubenswrapper[7547]: I0308 03:51:53.442080 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0f865279-e751-456d-8c96-6381f8b45ce1-kube-api-access\") pod \"0f865279-e751-456d-8c96-6381f8b45ce1\" (UID: \"0f865279-e751-456d-8c96-6381f8b45ce1\") "
Mar 08 03:51:53.442167 master-0 kubenswrapper[7547]: I0308 03:51:53.442131 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0f865279-e751-456d-8c96-6381f8b45ce1-kubelet-dir\") pod \"0f865279-e751-456d-8c96-6381f8b45ce1\" (UID: \"0f865279-e751-456d-8c96-6381f8b45ce1\") "
Mar 08 03:51:53.442397 master-0 kubenswrapper[7547]: I0308 03:51:53.442365 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0f865279-e751-456d-8c96-6381f8b45ce1-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "0f865279-e751-456d-8c96-6381f8b45ce1" (UID: "0f865279-e751-456d-8c96-6381f8b45ce1"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 03:51:53.442459 master-0 kubenswrapper[7547]: I0308 03:51:53.442409 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0f865279-e751-456d-8c96-6381f8b45ce1-var-lock" (OuterVolumeSpecName: "var-lock") pod "0f865279-e751-456d-8c96-6381f8b45ce1" (UID: "0f865279-e751-456d-8c96-6381f8b45ce1"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 03:51:53.445062 master-0 kubenswrapper[7547]: I0308 03:51:53.445025 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f865279-e751-456d-8c96-6381f8b45ce1-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0f865279-e751-456d-8c96-6381f8b45ce1" (UID: "0f865279-e751-456d-8c96-6381f8b45ce1"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 03:51:53.451645 master-0 kubenswrapper[7547]: I0308 03:51:53.451149 7547 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-master-0" podStartSLOduration=1.451132069 podStartE2EDuration="1.451132069s" podCreationTimestamp="2026-03-08 03:51:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:51:53.449580043 +0000 UTC m=+276.395264556" watchObservedRunningTime="2026-03-08 03:51:53.451132069 +0000 UTC m=+276.396816582"
Mar 08 03:51:53.475973 master-0 kubenswrapper[7547]: I0308 03:51:53.475506 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-2-master-0_47643289-ac4b-425d-8ea1-913b6ca39ee0/installer/0.log"
Mar 08 03:51:53.475973 master-0 kubenswrapper[7547]: I0308 03:51:53.475620 7547 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 08 03:51:53.543843 master-0 kubenswrapper[7547]: I0308 03:51:53.543533 7547 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0f865279-e751-456d-8c96-6381f8b45ce1-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 08 03:51:53.543843 master-0 kubenswrapper[7547]: I0308 03:51:53.543606 7547 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0f865279-e751-456d-8c96-6381f8b45ce1-kube-api-access\") on node \"master-0\" DevicePath \"\""
Mar 08 03:51:53.543843 master-0 kubenswrapper[7547]: I0308 03:51:53.543621 7547 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0f865279-e751-456d-8c96-6381f8b45ce1-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Mar 08 03:51:53.644770 master-0 kubenswrapper[7547]: I0308 03:51:53.644724 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/47643289-ac4b-425d-8ea1-913b6ca39ee0-kube-api-access\") pod \"47643289-ac4b-425d-8ea1-913b6ca39ee0\" (UID: \"47643289-ac4b-425d-8ea1-913b6ca39ee0\") "
Mar 08 03:51:53.645328 master-0 kubenswrapper[7547]: I0308 03:51:53.644803 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/47643289-ac4b-425d-8ea1-913b6ca39ee0-kubelet-dir\") pod \"47643289-ac4b-425d-8ea1-913b6ca39ee0\" (UID: \"47643289-ac4b-425d-8ea1-913b6ca39ee0\") "
Mar 08 03:51:53.645328 master-0 kubenswrapper[7547]: I0308 03:51:53.644853 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/47643289-ac4b-425d-8ea1-913b6ca39ee0-var-lock\") pod \"47643289-ac4b-425d-8ea1-913b6ca39ee0\" (UID: \"47643289-ac4b-425d-8ea1-913b6ca39ee0\") "
Mar 08 03:51:53.645328 master-0 kubenswrapper[7547]: I0308 03:51:53.644959 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/47643289-ac4b-425d-8ea1-913b6ca39ee0-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "47643289-ac4b-425d-8ea1-913b6ca39ee0" (UID: "47643289-ac4b-425d-8ea1-913b6ca39ee0"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 03:51:53.645328 master-0 kubenswrapper[7547]: I0308 03:51:53.645063 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/47643289-ac4b-425d-8ea1-913b6ca39ee0-var-lock" (OuterVolumeSpecName: "var-lock") pod "47643289-ac4b-425d-8ea1-913b6ca39ee0" (UID: "47643289-ac4b-425d-8ea1-913b6ca39ee0"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 03:51:53.645328 master-0 kubenswrapper[7547]: I0308 03:51:53.645154 7547 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/47643289-ac4b-425d-8ea1-913b6ca39ee0-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Mar 08 03:51:53.645328 master-0 kubenswrapper[7547]: I0308 03:51:53.645166 7547 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/47643289-ac4b-425d-8ea1-913b6ca39ee0-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 08 03:51:53.647175 master-0 kubenswrapper[7547]: I0308 03:51:53.647146 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47643289-ac4b-425d-8ea1-913b6ca39ee0-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "47643289-ac4b-425d-8ea1-913b6ca39ee0" (UID: "47643289-ac4b-425d-8ea1-913b6ca39ee0"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 03:51:53.675295 master-0 kubenswrapper[7547]: I0308 03:51:53.675200 7547 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 03:51:53.748834 master-0 kubenswrapper[7547]: I0308 03:51:53.748776 7547 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/47643289-ac4b-425d-8ea1-913b6ca39ee0-kube-api-access\") on node \"master-0\" DevicePath \"\""
Mar 08 03:51:54.074440 master-0 kubenswrapper[7547]: I0308 03:51:54.074380 7547 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 03:51:54.151702 master-0 kubenswrapper[7547]: I0308 03:51:54.151623 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_cluster-storage-operator-6fbfc8dc8f-nm8fj_b3eea925-73b3-4693-8f0e-6dd26107f60a/cluster-storage-operator/1.log"
Mar 08 03:51:54.154020 master-0 kubenswrapper[7547]: I0308 03:51:54.153961 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-2-master-0_47643289-ac4b-425d-8ea1-913b6ca39ee0/installer/0.log"
Mar 08 03:51:54.154193 master-0 kubenswrapper[7547]: I0308 03:51:54.154151 7547 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 08 03:51:54.155334 master-0 kubenswrapper[7547]: I0308 03:51:54.155263 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" event={"ID":"47643289-ac4b-425d-8ea1-913b6ca39ee0","Type":"ContainerDied","Data":"68a9c31781b9210f187f31760c081b7a914e3f7c545e30283d68c9a55506f854"}
Mar 08 03:51:54.155421 master-0 kubenswrapper[7547]: I0308 03:51:54.155332 7547 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68a9c31781b9210f187f31760c081b7a914e3f7c545e30283d68c9a55506f854"
Mar 08 03:51:54.157708 master-0 kubenswrapper[7547]: I0308 03:51:54.157663 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-86d7cdfdfb-chpl6_26180f77-0b1a-4d0f-9ed0-a12fdee69817/kube-controller-manager-operator/1.log"
Mar 08 03:51:54.161756 master-0 kubenswrapper[7547]: I0308 03:51:54.161696 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-8565d84698-kt66j_0ebf1330-e044-4ff5-8b48-2d667e0c5625/openshift-controller-manager-operator/1.log"
Mar 08 03:51:54.163473 master-0 kubenswrapper[7547]: I0308 03:51:54.163420 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-5-master-0_0f865279-e751-456d-8c96-6381f8b45ce1/installer/0.log"
Mar 08 03:51:54.163624 master-0 kubenswrapper[7547]: I0308 03:51:54.163570 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-5-master-0" event={"ID":"0f865279-e751-456d-8c96-6381f8b45ce1","Type":"ContainerDied","Data":"46d4f01a8b97928ed12b356249f2c516cd9275fd33a04ced54c1129e7817bd38"}
Mar 08 03:51:54.163624 master-0 kubenswrapper[7547]: I0308 03:51:54.163608 7547 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-5-master-0"
Mar 08 03:51:54.163759 master-0 kubenswrapper[7547]: I0308 03:51:54.163625 7547 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46d4f01a8b97928ed12b356249f2c516cd9275fd33a04ced54c1129e7817bd38"
Mar 08 03:51:54.166045 master-0 kubenswrapper[7547]: I0308 03:51:54.165990 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerStarted","Data":"28bcae4d70566beaa13732bd5095c7d8d6a2ad6f8be2ed4c2e4b067a051fc9f1"}
Mar 08 03:51:54.166447 master-0 kubenswrapper[7547]: I0308 03:51:54.166397 7547 scope.go:117] "RemoveContainer" containerID="e9dbfd241ad84e1bb7af7ba76075dd9557271049d4b44017afb55ac9ce7ffb9b"
Mar 08 03:51:54.166655 master-0 kubenswrapper[7547]: E0308 03:51:54.166610 7547 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=network-operator pod=network-operator-7c649bf6d4-99d2k_openshift-network-operator(3ddfd0e7-fe76-41bc-b316-94505df81002)\"" pod="openshift-network-operator/network-operator-7c649bf6d4-99d2k" podUID="3ddfd0e7-fe76-41bc-b316-94505df81002"
Mar 08 03:51:54.167121 master-0 kubenswrapper[7547]: I0308 03:51:54.167083 7547 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 03:51:54.167790 master-0 kubenswrapper[7547]: I0308 03:51:54.167752 7547 scope.go:117] "RemoveContainer" containerID="9cb40f8e472021b6bf28adddecb51a371c5cff426f2d0e4b345adbb4c28df1e5"
Mar 08 03:51:54.167986 master-0 kubenswrapper[7547]: E0308 03:51:54.167944 7547 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"etcd-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=etcd-operator pod=etcd-operator-5884b9cd56-vzms7_openshift-etcd-operator(5a7752f9-7b9a-451f-997a-e9f696d38b34)\"" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-vzms7" podUID="5a7752f9-7b9a-451f-997a-e9f696d38b34"
Mar 08 03:51:55.635689 master-0 kubenswrapper[7547]: I0308 03:51:55.635600 7547 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-master-0"
Mar 08 03:51:57.757622 master-0 kubenswrapper[7547]: I0308 03:51:57.755276 7547 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-3-master-0"]
Mar 08 03:51:57.757622 master-0 kubenswrapper[7547]: E0308 03:51:57.755741 7547 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f865279-e751-456d-8c96-6381f8b45ce1" containerName="installer"
Mar 08 03:51:57.757622 master-0 kubenswrapper[7547]: I0308 03:51:57.755801 7547 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f865279-e751-456d-8c96-6381f8b45ce1" containerName="installer"
Mar 08 03:51:57.757622 master-0 kubenswrapper[7547]: E0308 03:51:57.755818 7547 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b8dcd07-f245-4783-ba40-521a14e96043" containerName="extract-utilities"
Mar 08 03:51:57.757622 master-0 kubenswrapper[7547]: I0308 03:51:57.755883 7547 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b8dcd07-f245-4783-ba40-521a14e96043" containerName="extract-utilities"
Mar 08 03:51:57.757622 master-0 kubenswrapper[7547]: E0308 03:51:57.755911 7547 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="baab6171-046d-4fc9-b7d7-ff2fd12f185f" containerName="installer"
Mar 08 03:51:57.757622 master-0 kubenswrapper[7547]: I0308 03:51:57.755924 7547 state_mem.go:107] "Deleted CPUSet assignment" podUID="baab6171-046d-4fc9-b7d7-ff2fd12f185f" containerName="installer"
Mar 08 03:51:57.757622 master-0 kubenswrapper[7547]: E0308 03:51:57.755980 7547 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f62034a-dae9-46af-8c14-006b728b631f" containerName="installer"
Mar 08 03:51:57.757622 master-0 kubenswrapper[7547]: I0308 03:51:57.755993 7547 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f62034a-dae9-46af-8c14-006b728b631f" containerName="installer"
Mar 08 03:51:57.757622 master-0 kubenswrapper[7547]: E0308 03:51:57.756010 7547 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47b7e26d-8fb3-4749-a544-c86c3a06e439" containerName="extract-utilities"
Mar 08 03:51:57.757622 master-0 kubenswrapper[7547]: I0308 03:51:57.756058 7547 state_mem.go:107] "Deleted CPUSet assignment" podUID="47b7e26d-8fb3-4749-a544-c86c3a06e439" containerName="extract-utilities"
Mar 08 03:51:57.757622 master-0 kubenswrapper[7547]: E0308 03:51:57.756079 7547 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47b7e26d-8fb3-4749-a544-c86c3a06e439" containerName="extract-content"
Mar 08 03:51:57.757622 master-0 kubenswrapper[7547]: I0308 03:51:57.756092 7547 state_mem.go:107] "Deleted CPUSet assignment" podUID="47b7e26d-8fb3-4749-a544-c86c3a06e439" containerName="extract-content"
Mar 08 03:51:57.757622 master-0 kubenswrapper[7547]: E0308 03:51:57.756147 7547 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4ff897a-ac47-45e0-aa7d-88c5aea50b70" containerName="extract-content"
Mar 08 03:51:57.757622 master-0 kubenswrapper[7547]: I0308 03:51:57.756162 7547 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4ff897a-ac47-45e0-aa7d-88c5aea50b70" containerName="extract-content"
Mar 08 03:51:57.757622 master-0 kubenswrapper[7547]: E0308 03:51:57.756180 7547 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4ff897a-ac47-45e0-aa7d-88c5aea50b70" containerName="extract-utilities"
Mar 08 03:51:57.757622 master-0 kubenswrapper[7547]: I0308 03:51:57.756192 7547 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4ff897a-ac47-45e0-aa7d-88c5aea50b70" containerName="extract-utilities"
Mar 08 03:51:57.757622 master-0 kubenswrapper[7547]: E0308 03:51:57.756253 7547 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4818bf75-506a-4b39-bb9b-8067a02d4a51" containerName="extract-utilities"
Mar 08 03:51:57.757622 master-0 kubenswrapper[7547]: I0308 03:51:57.756267 7547 state_mem.go:107] "Deleted CPUSet assignment" podUID="4818bf75-506a-4b39-bb9b-8067a02d4a51" containerName="extract-utilities"
Mar 08 03:51:57.757622 master-0 kubenswrapper[7547]: E0308 03:51:57.756282 7547 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47643289-ac4b-425d-8ea1-913b6ca39ee0" containerName="installer"
Mar 08 03:51:57.757622 master-0 kubenswrapper[7547]: I0308 03:51:57.756295 7547 state_mem.go:107] "Deleted CPUSet assignment" podUID="47643289-ac4b-425d-8ea1-913b6ca39ee0" containerName="installer"
Mar 08 03:51:57.757622 master-0 kubenswrapper[7547]: E0308 03:51:57.756351 7547 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b8dcd07-f245-4783-ba40-521a14e96043" containerName="extract-content"
Mar 08 03:51:57.757622 master-0 kubenswrapper[7547]: I0308 03:51:57.756364 7547 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b8dcd07-f245-4783-ba40-521a14e96043" containerName="extract-content"
Mar 08 03:51:57.757622 master-0 kubenswrapper[7547]: E0308 03:51:57.756378 7547 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fbe4302-8264-4b6c-ae3f-0c6d981bc998" containerName="installer"
Mar 08 03:51:57.757622 master-0 kubenswrapper[7547]: I0308 03:51:57.756424 7547 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fbe4302-8264-4b6c-ae3f-0c6d981bc998" containerName="installer"
Mar 08 03:51:57.757622 master-0 kubenswrapper[7547]: E0308 03:51:57.756441 7547 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4818bf75-506a-4b39-bb9b-8067a02d4a51" containerName="extract-content"
Mar 08 03:51:57.757622 master-0 kubenswrapper[7547]: I0308 03:51:57.756453 7547 state_mem.go:107] "Deleted CPUSet assignment" podUID="4818bf75-506a-4b39-bb9b-8067a02d4a51" containerName="extract-content"
Mar 08 03:51:57.757622 master-0 kubenswrapper[7547]: E0308 03:51:57.756474 7547 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b02b2c9-9ee3-436c-aa36-0c23e5fdf07b" containerName="installer"
Mar 08 03:51:57.757622 master-0 kubenswrapper[7547]: I0308 03:51:57.756527 7547 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b02b2c9-9ee3-436c-aa36-0c23e5fdf07b" containerName="installer"
Mar 08 03:51:57.757622 master-0 kubenswrapper[7547]: I0308 03:51:57.756807 7547 memory_manager.go:354] "RemoveStaleState removing state" podUID="4818bf75-506a-4b39-bb9b-8067a02d4a51" containerName="extract-content"
Mar 08 03:51:57.757622 master-0 kubenswrapper[7547]: I0308 03:51:57.756881 7547 memory_manager.go:354] "RemoveStaleState removing state" podUID="baab6171-046d-4fc9-b7d7-ff2fd12f185f" containerName="installer"
Mar 08 03:51:57.757622 master-0 kubenswrapper[7547]: I0308 03:51:57.756904 7547 memory_manager.go:354] "RemoveStaleState removing state" podUID="47b7e26d-8fb3-4749-a544-c86c3a06e439" containerName="extract-content"
Mar 08 03:51:57.757622 master-0 kubenswrapper[7547]: I0308 03:51:57.756923 7547 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b02b2c9-9ee3-436c-aa36-0c23e5fdf07b" containerName="installer"
Mar 08 03:51:57.757622 master-0 kubenswrapper[7547]: I0308 03:51:57.756973 7547 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b8dcd07-f245-4783-ba40-521a14e96043" containerName="extract-content"
Mar 08 03:51:57.757622 master-0 kubenswrapper[7547]: I0308 03:51:57.756989 7547 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4ff897a-ac47-45e0-aa7d-88c5aea50b70" containerName="extract-content"
Mar 08 03:51:57.757622 master-0 kubenswrapper[7547]: I0308 03:51:57.757009 7547 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fbe4302-8264-4b6c-ae3f-0c6d981bc998" containerName="installer"
Mar 08 03:51:57.757622
master-0 kubenswrapper[7547]: I0308 03:51:57.757022 7547 memory_manager.go:354] "RemoveStaleState removing state" podUID="47643289-ac4b-425d-8ea1-913b6ca39ee0" containerName="installer" Mar 08 03:51:57.757622 master-0 kubenswrapper[7547]: I0308 03:51:57.757040 7547 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f865279-e751-456d-8c96-6381f8b45ce1" containerName="installer" Mar 08 03:51:57.757622 master-0 kubenswrapper[7547]: I0308 03:51:57.757059 7547 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f62034a-dae9-46af-8c14-006b728b631f" containerName="installer" Mar 08 03:51:57.757622 master-0 kubenswrapper[7547]: I0308 03:51:57.757689 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-3-master-0" Mar 08 03:51:57.773719 master-0 kubenswrapper[7547]: I0308 03:51:57.761039 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-7rcx9" Mar 08 03:51:57.773719 master-0 kubenswrapper[7547]: I0308 03:51:57.761593 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 08 03:51:57.773719 master-0 kubenswrapper[7547]: I0308 03:51:57.773084 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-3-master-0"] Mar 08 03:51:57.821233 master-0 kubenswrapper[7547]: I0308 03:51:57.821145 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d191ff84-f4e4-4d99-8cbb-c10771e68baf-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"d191ff84-f4e4-4d99-8cbb-c10771e68baf\") " pod="openshift-kube-controller-manager/installer-3-master-0" Mar 08 03:51:57.821233 master-0 kubenswrapper[7547]: I0308 03:51:57.821214 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d191ff84-f4e4-4d99-8cbb-c10771e68baf-var-lock\") pod \"installer-3-master-0\" (UID: \"d191ff84-f4e4-4d99-8cbb-c10771e68baf\") " pod="openshift-kube-controller-manager/installer-3-master-0" Mar 08 03:51:57.821571 master-0 kubenswrapper[7547]: I0308 03:51:57.821260 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d191ff84-f4e4-4d99-8cbb-c10771e68baf-kube-api-access\") pod \"installer-3-master-0\" (UID: \"d191ff84-f4e4-4d99-8cbb-c10771e68baf\") " pod="openshift-kube-controller-manager/installer-3-master-0" Mar 08 03:51:57.922393 master-0 kubenswrapper[7547]: I0308 03:51:57.922305 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d191ff84-f4e4-4d99-8cbb-c10771e68baf-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"d191ff84-f4e4-4d99-8cbb-c10771e68baf\") " pod="openshift-kube-controller-manager/installer-3-master-0" Mar 08 03:51:57.922393 master-0 kubenswrapper[7547]: I0308 03:51:57.922385 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d191ff84-f4e4-4d99-8cbb-c10771e68baf-var-lock\") pod \"installer-3-master-0\" (UID: \"d191ff84-f4e4-4d99-8cbb-c10771e68baf\") " pod="openshift-kube-controller-manager/installer-3-master-0" Mar 08 03:51:57.922701 master-0 kubenswrapper[7547]: I0308 03:51:57.922496 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d191ff84-f4e4-4d99-8cbb-c10771e68baf-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"d191ff84-f4e4-4d99-8cbb-c10771e68baf\") " pod="openshift-kube-controller-manager/installer-3-master-0" Mar 08 03:51:57.922701 master-0 kubenswrapper[7547]: I0308 03:51:57.922588 7547 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d191ff84-f4e4-4d99-8cbb-c10771e68baf-kube-api-access\") pod \"installer-3-master-0\" (UID: \"d191ff84-f4e4-4d99-8cbb-c10771e68baf\") " pod="openshift-kube-controller-manager/installer-3-master-0" Mar 08 03:51:57.922878 master-0 kubenswrapper[7547]: I0308 03:51:57.922731 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d191ff84-f4e4-4d99-8cbb-c10771e68baf-var-lock\") pod \"installer-3-master-0\" (UID: \"d191ff84-f4e4-4d99-8cbb-c10771e68baf\") " pod="openshift-kube-controller-manager/installer-3-master-0" Mar 08 03:51:57.945214 master-0 kubenswrapper[7547]: I0308 03:51:57.945141 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d191ff84-f4e4-4d99-8cbb-c10771e68baf-kube-api-access\") pod \"installer-3-master-0\" (UID: \"d191ff84-f4e4-4d99-8cbb-c10771e68baf\") " pod="openshift-kube-controller-manager/installer-3-master-0" Mar 08 03:51:58.081865 master-0 kubenswrapper[7547]: I0308 03:51:58.081684 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-3-master-0" Mar 08 03:51:58.231447 master-0 kubenswrapper[7547]: I0308 03:51:58.231379 7547 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-2vjh2" Mar 08 03:51:58.593097 master-0 kubenswrapper[7547]: I0308 03:51:58.593005 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-3-master-0"] Mar 08 03:51:59.215289 master-0 kubenswrapper[7547]: I0308 03:51:59.215074 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-master-0" event={"ID":"d191ff84-f4e4-4d99-8cbb-c10771e68baf","Type":"ContainerStarted","Data":"ab476bbfa4b9ad96fb2348ef6d3d71a1b60822ddb1c07515c6d2e7af7a64fce8"} Mar 08 03:51:59.215289 master-0 kubenswrapper[7547]: I0308 03:51:59.215138 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-master-0" event={"ID":"d191ff84-f4e4-4d99-8cbb-c10771e68baf","Type":"ContainerStarted","Data":"490c7966b29451303b6f42ccaf5b249c853bf84b3cd4be6c7f5b23f3365fe971"} Mar 08 03:51:59.241694 master-0 kubenswrapper[7547]: I0308 03:51:59.241569 7547 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-3-master-0" podStartSLOduration=2.241546796 podStartE2EDuration="2.241546796s" podCreationTimestamp="2026-03-08 03:51:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:51:59.239747414 +0000 UTC m=+282.185431967" watchObservedRunningTime="2026-03-08 03:51:59.241546796 +0000 UTC m=+282.187231349" Mar 08 03:52:00.635000 master-0 kubenswrapper[7547]: I0308 03:52:00.634909 7547 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-master-0" Mar 08 03:52:00.669342 master-0 kubenswrapper[7547]: I0308 03:52:00.669231 7547 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-master-0" Mar 08 03:52:01.261318 master-0 
kubenswrapper[7547]: I0308 03:52:01.261209 7547 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-master-0" Mar 08 03:52:05.238800 master-0 kubenswrapper[7547]: I0308 03:52:05.237976 7547 scope.go:117] "RemoveContainer" containerID="9cb40f8e472021b6bf28adddecb51a371c5cff426f2d0e4b345adbb4c28df1e5" Mar 08 03:52:05.239909 master-0 kubenswrapper[7547]: I0308 03:52:05.238880 7547 scope.go:117] "RemoveContainer" containerID="38b4abf7d4c06fafbe1f2864c946ad3648498ee7c33fbece731408c279494ae4" Mar 08 03:52:05.443663 master-0 kubenswrapper[7547]: I0308 03:52:05.443603 7547 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6999cc9685-kprrt"] Mar 08 03:52:05.445359 master-0 kubenswrapper[7547]: I0308 03:52:05.445309 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6999cc9685-kprrt" Mar 08 03:52:05.452020 master-0 kubenswrapper[7547]: I0308 03:52:05.450271 7547 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cb98fbc8c-xnx9t"] Mar 08 03:52:05.452020 master-0 kubenswrapper[7547]: I0308 03:52:05.450621 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-hdmbr" Mar 08 03:52:05.452020 master-0 kubenswrapper[7547]: I0308 03:52:05.451181 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 08 03:52:05.452020 master-0 kubenswrapper[7547]: I0308 03:52:05.451213 7547 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5cb98fbc8c-xnx9t" Mar 08 03:52:05.452339 master-0 kubenswrapper[7547]: I0308 03:52:05.452114 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 08 03:52:05.452339 master-0 kubenswrapper[7547]: I0308 03:52:05.452252 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 08 03:52:05.452339 master-0 kubenswrapper[7547]: I0308 03:52:05.452340 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 08 03:52:05.452520 master-0 kubenswrapper[7547]: I0308 03:52:05.452466 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 08 03:52:05.456293 master-0 kubenswrapper[7547]: I0308 03:52:05.455522 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-27jjw" Mar 08 03:52:05.456521 master-0 kubenswrapper[7547]: I0308 03:52:05.456489 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 08 03:52:05.456657 master-0 kubenswrapper[7547]: I0308 03:52:05.456633 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 08 03:52:05.456920 master-0 kubenswrapper[7547]: I0308 03:52:05.456898 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 08 03:52:05.457091 master-0 kubenswrapper[7547]: I0308 03:52:05.457040 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 08 03:52:05.457400 master-0 kubenswrapper[7547]: I0308 03:52:05.457351 7547 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 08 03:52:05.459919 master-0 kubenswrapper[7547]: I0308 03:52:05.459586 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 08 03:52:05.465273 master-0 kubenswrapper[7547]: I0308 03:52:05.465199 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6999cc9685-kprrt"] Mar 08 03:52:05.472140 master-0 kubenswrapper[7547]: I0308 03:52:05.469389 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cb98fbc8c-xnx9t"] Mar 08 03:52:05.633177 master-0 kubenswrapper[7547]: I0308 03:52:05.633113 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a6c4695c-da78-46b6-8f92-ca93c5ebb96b-serving-cert\") pod \"controller-manager-6999cc9685-kprrt\" (UID: \"a6c4695c-da78-46b6-8f92-ca93c5ebb96b\") " pod="openshift-controller-manager/controller-manager-6999cc9685-kprrt" Mar 08 03:52:05.633300 master-0 kubenswrapper[7547]: I0308 03:52:05.633181 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pb87l\" (UniqueName: \"kubernetes.io/projected/3f7d2cef-b17b-43ba-a222-9e6e8d8352e2-kube-api-access-pb87l\") pod \"route-controller-manager-5cb98fbc8c-xnx9t\" (UID: \"3f7d2cef-b17b-43ba-a222-9e6e8d8352e2\") " pod="openshift-route-controller-manager/route-controller-manager-5cb98fbc8c-xnx9t" Mar 08 03:52:05.633300 master-0 kubenswrapper[7547]: I0308 03:52:05.633211 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a6c4695c-da78-46b6-8f92-ca93c5ebb96b-proxy-ca-bundles\") pod \"controller-manager-6999cc9685-kprrt\" (UID: 
\"a6c4695c-da78-46b6-8f92-ca93c5ebb96b\") " pod="openshift-controller-manager/controller-manager-6999cc9685-kprrt" Mar 08 03:52:05.633300 master-0 kubenswrapper[7547]: I0308 03:52:05.633236 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f7d2cef-b17b-43ba-a222-9e6e8d8352e2-serving-cert\") pod \"route-controller-manager-5cb98fbc8c-xnx9t\" (UID: \"3f7d2cef-b17b-43ba-a222-9e6e8d8352e2\") " pod="openshift-route-controller-manager/route-controller-manager-5cb98fbc8c-xnx9t" Mar 08 03:52:05.633300 master-0 kubenswrapper[7547]: I0308 03:52:05.633257 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a6c4695c-da78-46b6-8f92-ca93c5ebb96b-client-ca\") pod \"controller-manager-6999cc9685-kprrt\" (UID: \"a6c4695c-da78-46b6-8f92-ca93c5ebb96b\") " pod="openshift-controller-manager/controller-manager-6999cc9685-kprrt" Mar 08 03:52:05.633300 master-0 kubenswrapper[7547]: I0308 03:52:05.633283 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6c4695c-da78-46b6-8f92-ca93c5ebb96b-config\") pod \"controller-manager-6999cc9685-kprrt\" (UID: \"a6c4695c-da78-46b6-8f92-ca93c5ebb96b\") " pod="openshift-controller-manager/controller-manager-6999cc9685-kprrt" Mar 08 03:52:05.633494 master-0 kubenswrapper[7547]: I0308 03:52:05.633308 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bd7d5\" (UniqueName: \"kubernetes.io/projected/a6c4695c-da78-46b6-8f92-ca93c5ebb96b-kube-api-access-bd7d5\") pod \"controller-manager-6999cc9685-kprrt\" (UID: \"a6c4695c-da78-46b6-8f92-ca93c5ebb96b\") " pod="openshift-controller-manager/controller-manager-6999cc9685-kprrt" Mar 08 03:52:05.633494 master-0 kubenswrapper[7547]: I0308 
03:52:05.633368 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3f7d2cef-b17b-43ba-a222-9e6e8d8352e2-client-ca\") pod \"route-controller-manager-5cb98fbc8c-xnx9t\" (UID: \"3f7d2cef-b17b-43ba-a222-9e6e8d8352e2\") " pod="openshift-route-controller-manager/route-controller-manager-5cb98fbc8c-xnx9t" Mar 08 03:52:05.633494 master-0 kubenswrapper[7547]: I0308 03:52:05.633396 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f7d2cef-b17b-43ba-a222-9e6e8d8352e2-config\") pod \"route-controller-manager-5cb98fbc8c-xnx9t\" (UID: \"3f7d2cef-b17b-43ba-a222-9e6e8d8352e2\") " pod="openshift-route-controller-manager/route-controller-manager-5cb98fbc8c-xnx9t" Mar 08 03:52:05.734062 master-0 kubenswrapper[7547]: I0308 03:52:05.734004 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3f7d2cef-b17b-43ba-a222-9e6e8d8352e2-client-ca\") pod \"route-controller-manager-5cb98fbc8c-xnx9t\" (UID: \"3f7d2cef-b17b-43ba-a222-9e6e8d8352e2\") " pod="openshift-route-controller-manager/route-controller-manager-5cb98fbc8c-xnx9t" Mar 08 03:52:05.734062 master-0 kubenswrapper[7547]: I0308 03:52:05.734068 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f7d2cef-b17b-43ba-a222-9e6e8d8352e2-config\") pod \"route-controller-manager-5cb98fbc8c-xnx9t\" (UID: \"3f7d2cef-b17b-43ba-a222-9e6e8d8352e2\") " pod="openshift-route-controller-manager/route-controller-manager-5cb98fbc8c-xnx9t" Mar 08 03:52:05.734316 master-0 kubenswrapper[7547]: I0308 03:52:05.734099 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a6c4695c-da78-46b6-8f92-ca93c5ebb96b-serving-cert\") 
pod \"controller-manager-6999cc9685-kprrt\" (UID: \"a6c4695c-da78-46b6-8f92-ca93c5ebb96b\") " pod="openshift-controller-manager/controller-manager-6999cc9685-kprrt" Mar 08 03:52:05.734354 master-0 kubenswrapper[7547]: I0308 03:52:05.734295 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pb87l\" (UniqueName: \"kubernetes.io/projected/3f7d2cef-b17b-43ba-a222-9e6e8d8352e2-kube-api-access-pb87l\") pod \"route-controller-manager-5cb98fbc8c-xnx9t\" (UID: \"3f7d2cef-b17b-43ba-a222-9e6e8d8352e2\") " pod="openshift-route-controller-manager/route-controller-manager-5cb98fbc8c-xnx9t" Mar 08 03:52:05.734354 master-0 kubenswrapper[7547]: I0308 03:52:05.734350 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a6c4695c-da78-46b6-8f92-ca93c5ebb96b-proxy-ca-bundles\") pod \"controller-manager-6999cc9685-kprrt\" (UID: \"a6c4695c-da78-46b6-8f92-ca93c5ebb96b\") " pod="openshift-controller-manager/controller-manager-6999cc9685-kprrt" Mar 08 03:52:05.734421 master-0 kubenswrapper[7547]: I0308 03:52:05.734382 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f7d2cef-b17b-43ba-a222-9e6e8d8352e2-serving-cert\") pod \"route-controller-manager-5cb98fbc8c-xnx9t\" (UID: \"3f7d2cef-b17b-43ba-a222-9e6e8d8352e2\") " pod="openshift-route-controller-manager/route-controller-manager-5cb98fbc8c-xnx9t" Mar 08 03:52:05.734421 master-0 kubenswrapper[7547]: I0308 03:52:05.734403 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a6c4695c-da78-46b6-8f92-ca93c5ebb96b-client-ca\") pod \"controller-manager-6999cc9685-kprrt\" (UID: \"a6c4695c-da78-46b6-8f92-ca93c5ebb96b\") " pod="openshift-controller-manager/controller-manager-6999cc9685-kprrt" Mar 08 03:52:05.734485 master-0 kubenswrapper[7547]: I0308 
03:52:05.734436 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6c4695c-da78-46b6-8f92-ca93c5ebb96b-config\") pod \"controller-manager-6999cc9685-kprrt\" (UID: \"a6c4695c-da78-46b6-8f92-ca93c5ebb96b\") " pod="openshift-controller-manager/controller-manager-6999cc9685-kprrt" Mar 08 03:52:05.734485 master-0 kubenswrapper[7547]: I0308 03:52:05.734466 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bd7d5\" (UniqueName: \"kubernetes.io/projected/a6c4695c-da78-46b6-8f92-ca93c5ebb96b-kube-api-access-bd7d5\") pod \"controller-manager-6999cc9685-kprrt\" (UID: \"a6c4695c-da78-46b6-8f92-ca93c5ebb96b\") " pod="openshift-controller-manager/controller-manager-6999cc9685-kprrt" Mar 08 03:52:05.737844 master-0 kubenswrapper[7547]: I0308 03:52:05.736019 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f7d2cef-b17b-43ba-a222-9e6e8d8352e2-config\") pod \"route-controller-manager-5cb98fbc8c-xnx9t\" (UID: \"3f7d2cef-b17b-43ba-a222-9e6e8d8352e2\") " pod="openshift-route-controller-manager/route-controller-manager-5cb98fbc8c-xnx9t" Mar 08 03:52:05.737844 master-0 kubenswrapper[7547]: I0308 03:52:05.736313 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a6c4695c-da78-46b6-8f92-ca93c5ebb96b-client-ca\") pod \"controller-manager-6999cc9685-kprrt\" (UID: \"a6c4695c-da78-46b6-8f92-ca93c5ebb96b\") " pod="openshift-controller-manager/controller-manager-6999cc9685-kprrt" Mar 08 03:52:05.737844 master-0 kubenswrapper[7547]: I0308 03:52:05.736335 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6c4695c-da78-46b6-8f92-ca93c5ebb96b-config\") pod \"controller-manager-6999cc9685-kprrt\" (UID: \"a6c4695c-da78-46b6-8f92-ca93c5ebb96b\") " 
pod="openshift-controller-manager/controller-manager-6999cc9685-kprrt" Mar 08 03:52:05.737844 master-0 kubenswrapper[7547]: I0308 03:52:05.735102 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3f7d2cef-b17b-43ba-a222-9e6e8d8352e2-client-ca\") pod \"route-controller-manager-5cb98fbc8c-xnx9t\" (UID: \"3f7d2cef-b17b-43ba-a222-9e6e8d8352e2\") " pod="openshift-route-controller-manager/route-controller-manager-5cb98fbc8c-xnx9t" Mar 08 03:52:05.737844 master-0 kubenswrapper[7547]: I0308 03:52:05.737788 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a6c4695c-da78-46b6-8f92-ca93c5ebb96b-proxy-ca-bundles\") pod \"controller-manager-6999cc9685-kprrt\" (UID: \"a6c4695c-da78-46b6-8f92-ca93c5ebb96b\") " pod="openshift-controller-manager/controller-manager-6999cc9685-kprrt" Mar 08 03:52:05.738214 master-0 kubenswrapper[7547]: I0308 03:52:05.738181 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f7d2cef-b17b-43ba-a222-9e6e8d8352e2-serving-cert\") pod \"route-controller-manager-5cb98fbc8c-xnx9t\" (UID: \"3f7d2cef-b17b-43ba-a222-9e6e8d8352e2\") " pod="openshift-route-controller-manager/route-controller-manager-5cb98fbc8c-xnx9t" Mar 08 03:52:05.739195 master-0 kubenswrapper[7547]: I0308 03:52:05.739161 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a6c4695c-da78-46b6-8f92-ca93c5ebb96b-serving-cert\") pod \"controller-manager-6999cc9685-kprrt\" (UID: \"a6c4695c-da78-46b6-8f92-ca93c5ebb96b\") " pod="openshift-controller-manager/controller-manager-6999cc9685-kprrt" Mar 08 03:52:05.755943 master-0 kubenswrapper[7547]: I0308 03:52:05.755904 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bd7d5\" (UniqueName: 
\"kubernetes.io/projected/a6c4695c-da78-46b6-8f92-ca93c5ebb96b-kube-api-access-bd7d5\") pod \"controller-manager-6999cc9685-kprrt\" (UID: \"a6c4695c-da78-46b6-8f92-ca93c5ebb96b\") " pod="openshift-controller-manager/controller-manager-6999cc9685-kprrt" Mar 08 03:52:05.759441 master-0 kubenswrapper[7547]: I0308 03:52:05.759367 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pb87l\" (UniqueName: \"kubernetes.io/projected/3f7d2cef-b17b-43ba-a222-9e6e8d8352e2-kube-api-access-pb87l\") pod \"route-controller-manager-5cb98fbc8c-xnx9t\" (UID: \"3f7d2cef-b17b-43ba-a222-9e6e8d8352e2\") " pod="openshift-route-controller-manager/route-controller-manager-5cb98fbc8c-xnx9t" Mar 08 03:52:05.822378 master-0 kubenswrapper[7547]: I0308 03:52:05.822323 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6999cc9685-kprrt" Mar 08 03:52:05.880848 master-0 kubenswrapper[7547]: I0308 03:52:05.880194 7547 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5cb98fbc8c-xnx9t" Mar 08 03:52:06.122812 master-0 kubenswrapper[7547]: I0308 03:52:06.122765 7547 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:52:06.231785 master-0 kubenswrapper[7547]: I0308 03:52:06.231716 7547 scope.go:117] "RemoveContainer" containerID="c699bcd2d347a8b5955755f0e95a151eaeb522f32880f9765692ad3e2e0369c6" Mar 08 03:52:06.270166 master-0 kubenswrapper[7547]: I0308 03:52:06.270095 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler-operator_openshift-kube-scheduler-operator-5c74bfc494-g6n58_e4541b7b-3f7f-4851-9bd9-26fcda5cab13/kube-scheduler-operator-container/1.log" Mar 08 03:52:06.323410 master-0 kubenswrapper[7547]: I0308 03:52:06.270256 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-g6n58" event={"ID":"e4541b7b-3f7f-4851-9bd9-26fcda5cab13","Type":"ContainerStarted","Data":"4cb1740edb19b1a8742cb3fae30d0aed3408253aa76bc836bdaff0453990796a"} Mar 08 03:52:06.323410 master-0 kubenswrapper[7547]: I0308 03:52:06.274919 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-5884b9cd56-vzms7_5a7752f9-7b9a-451f-997a-e9f696d38b34/etcd-operator/1.log" Mar 08 03:52:06.323410 master-0 kubenswrapper[7547]: I0308 03:52:06.274967 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-vzms7" event={"ID":"5a7752f9-7b9a-451f-997a-e9f696d38b34","Type":"ContainerStarted","Data":"361da48b967f54f48155f816e56d6a7090a322660478d9fab86bad7cccb8b43b"} Mar 08 03:52:06.376136 master-0 kubenswrapper[7547]: I0308 03:52:06.376000 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6999cc9685-kprrt"] Mar 08 
03:52:06.382071 master-0 kubenswrapper[7547]: I0308 03:52:06.381975 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cb98fbc8c-xnx9t"] Mar 08 03:52:07.238936 master-0 kubenswrapper[7547]: I0308 03:52:07.238852 7547 scope.go:117] "RemoveContainer" containerID="2f6ec83521bbab74297dfc2bb7addc122c614b4ebd158773a1f21c9c8a08aa06" Mar 08 03:52:07.288910 master-0 kubenswrapper[7547]: I0308 03:52:07.288406 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5cb98fbc8c-xnx9t" event={"ID":"3f7d2cef-b17b-43ba-a222-9e6e8d8352e2","Type":"ContainerStarted","Data":"435ec6619f140faf62f3c471feef1a5ed855198061a6c789c631830099974cc7"} Mar 08 03:52:07.290252 master-0 kubenswrapper[7547]: I0308 03:52:07.290174 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5cb98fbc8c-xnx9t" event={"ID":"3f7d2cef-b17b-43ba-a222-9e6e8d8352e2","Type":"ContainerStarted","Data":"c4262d5f7cd90d77070e291ffede65804485b27ce848841d5c9b49cfb475af2e"} Mar 08 03:52:07.290549 master-0 kubenswrapper[7547]: I0308 03:52:07.290523 7547 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5cb98fbc8c-xnx9t" Mar 08 03:52:07.292295 master-0 kubenswrapper[7547]: I0308 03:52:07.292122 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-69b6fc6b88-kg795_0d377285-0336-41b7-b48f-c44a7b563498/service-ca-operator/1.log" Mar 08 03:52:07.292605 master-0 kubenswrapper[7547]: I0308 03:52:07.292428 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-kg795" event={"ID":"0d377285-0336-41b7-b48f-c44a7b563498","Type":"ContainerStarted","Data":"5231fffe5dbecd6bd29c981e1faee946ff50d64987f0833f69411826f95ee807"} Mar 08 
03:52:07.297352 master-0 kubenswrapper[7547]: I0308 03:52:07.297298 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6999cc9685-kprrt" event={"ID":"a6c4695c-da78-46b6-8f92-ca93c5ebb96b","Type":"ContainerStarted","Data":"a7de4e022aef3ade6de907c6671db640f64c6ee4620c8dd001c2eff19b497698"} Mar 08 03:52:07.297468 master-0 kubenswrapper[7547]: I0308 03:52:07.297358 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6999cc9685-kprrt" event={"ID":"a6c4695c-da78-46b6-8f92-ca93c5ebb96b","Type":"ContainerStarted","Data":"4025d1a6cb66b179d453ec8f3c19442902ea80a04085eeeda4fa9c48c774a80e"} Mar 08 03:52:07.297899 master-0 kubenswrapper[7547]: I0308 03:52:07.297847 7547 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6999cc9685-kprrt" Mar 08 03:52:07.298681 master-0 kubenswrapper[7547]: I0308 03:52:07.298644 7547 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5cb98fbc8c-xnx9t" Mar 08 03:52:07.312249 master-0 kubenswrapper[7547]: I0308 03:52:07.306099 7547 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6999cc9685-kprrt" Mar 08 03:52:07.322540 master-0 kubenswrapper[7547]: I0308 03:52:07.322441 7547 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5cb98fbc8c-xnx9t" podStartSLOduration=225.322419073 podStartE2EDuration="3m45.322419073s" podCreationTimestamp="2026-03-08 03:48:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:52:07.314976087 +0000 UTC m=+290.260660670" watchObservedRunningTime="2026-03-08 03:52:07.322419073 +0000 UTC m=+290.268103616" Mar 08 
03:52:07.378482 master-0 kubenswrapper[7547]: I0308 03:52:07.378392 7547 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6999cc9685-kprrt" podStartSLOduration=225.378375971 podStartE2EDuration="3m45.378375971s" podCreationTimestamp="2026-03-08 03:48:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:52:07.377790167 +0000 UTC m=+290.323474690" watchObservedRunningTime="2026-03-08 03:52:07.378375971 +0000 UTC m=+290.324060484" Mar 08 03:52:08.231995 master-0 kubenswrapper[7547]: I0308 03:52:08.231929 7547 scope.go:117] "RemoveContainer" containerID="e9dbfd241ad84e1bb7af7ba76075dd9557271049d4b44017afb55ac9ce7ffb9b" Mar 08 03:52:08.232233 master-0 kubenswrapper[7547]: I0308 03:52:08.232010 7547 scope.go:117] "RemoveContainer" containerID="7b9e0618571c76237a54adfbc9471783f3afade6ddbedbe9d5d1037a9f845813" Mar 08 03:52:08.309695 master-0 kubenswrapper[7547]: I0308 03:52:08.309627 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-7f65c457f5-6fhhs_5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a/kube-storage-version-migrator-operator/1.log" Mar 08 03:52:08.311139 master-0 kubenswrapper[7547]: I0308 03:52:08.309791 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-6fhhs" event={"ID":"5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a","Type":"ContainerStarted","Data":"86d3fc79d4416e45057d013e9349568276e5bdea9068b3d31cd46c8d01de2ee8"} Mar 08 03:52:09.321469 master-0 kubenswrapper[7547]: I0308 03:52:09.321397 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-7c649bf6d4-99d2k_3ddfd0e7-fe76-41bc-b316-94505df81002/network-operator/1.log" Mar 08 03:52:09.322215 master-0 
kubenswrapper[7547]: I0308 03:52:09.321536 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7c649bf6d4-99d2k" event={"ID":"3ddfd0e7-fe76-41bc-b316-94505df81002","Type":"ContainerStarted","Data":"e86ec780189d72a3240027c6c883433aebb50582e6257e268556e19dfb2abaa8"} Mar 08 03:52:09.325595 master-0 kubenswrapper[7547]: I0308 03:52:09.325452 7547 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-olm-operator_cluster-olm-operator-77899cf6d-x9h9q_7ff63c73-62a3-44b4-acd3-1b3df175794f/cluster-olm-operator/1.log" Mar 08 03:52:09.326777 master-0 kubenswrapper[7547]: I0308 03:52:09.326733 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-x9h9q" event={"ID":"7ff63c73-62a3-44b4-acd3-1b3df175794f","Type":"ContainerStarted","Data":"29db3090b08a751660235fe4ca24e963b75ea9864f7e52c12b7c2e57d617cbd1"} Mar 08 03:52:10.543343 master-0 kubenswrapper[7547]: I0308 03:52:10.543287 7547 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-955fcfb87-zhs8n"] Mar 08 03:52:10.545003 master-0 kubenswrapper[7547]: I0308 03:52:10.544968 7547 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-zhs8n" Mar 08 03:52:10.546770 master-0 kubenswrapper[7547]: I0308 03:52:10.546737 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-92fqc" Mar 08 03:52:10.547531 master-0 kubenswrapper[7547]: I0308 03:52:10.547505 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 08 03:52:10.548677 master-0 kubenswrapper[7547]: I0308 03:52:10.548424 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 08 03:52:10.548677 master-0 kubenswrapper[7547]: I0308 03:52:10.548584 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 08 03:52:10.548781 master-0 kubenswrapper[7547]: I0308 03:52:10.548752 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 08 03:52:10.556876 master-0 kubenswrapper[7547]: I0308 03:52:10.553446 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 08 03:52:10.612988 master-0 kubenswrapper[7547]: I0308 03:52:10.611583 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/19998073-79e7-4078-91b4-418a037caa38-machine-approver-tls\") pod \"machine-approver-955fcfb87-zhs8n\" (UID: \"19998073-79e7-4078-91b4-418a037caa38\") " pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-zhs8n" Mar 08 03:52:10.612988 master-0 kubenswrapper[7547]: I0308 03:52:10.611633 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" 
(UniqueName: \"kubernetes.io/configmap/19998073-79e7-4078-91b4-418a037caa38-auth-proxy-config\") pod \"machine-approver-955fcfb87-zhs8n\" (UID: \"19998073-79e7-4078-91b4-418a037caa38\") " pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-zhs8n" Mar 08 03:52:10.612988 master-0 kubenswrapper[7547]: I0308 03:52:10.611672 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19998073-79e7-4078-91b4-418a037caa38-config\") pod \"machine-approver-955fcfb87-zhs8n\" (UID: \"19998073-79e7-4078-91b4-418a037caa38\") " pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-zhs8n" Mar 08 03:52:10.612988 master-0 kubenswrapper[7547]: I0308 03:52:10.611705 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzh2d\" (UniqueName: \"kubernetes.io/projected/19998073-79e7-4078-91b4-418a037caa38-kube-api-access-qzh2d\") pod \"machine-approver-955fcfb87-zhs8n\" (UID: \"19998073-79e7-4078-91b4-418a037caa38\") " pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-zhs8n" Mar 08 03:52:10.642406 master-0 kubenswrapper[7547]: I0308 03:52:10.642323 7547 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-6686554ddc-7bcsk"] Mar 08 03:52:10.643821 master-0 kubenswrapper[7547]: I0308 03:52:10.643771 7547 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-7bcsk" Mar 08 03:52:10.647870 master-0 kubenswrapper[7547]: I0308 03:52:10.646315 7547 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-lrnks"] Mar 08 03:52:10.647870 master-0 kubenswrapper[7547]: I0308 03:52:10.646572 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-hvbwb" Mar 08 03:52:10.647870 master-0 kubenswrapper[7547]: I0308 03:52:10.647007 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 08 03:52:10.648635 master-0 kubenswrapper[7547]: I0308 03:52:10.648510 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-lrnks" Mar 08 03:52:10.650757 master-0 kubenswrapper[7547]: I0308 03:52:10.650483 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-n8nt7" Mar 08 03:52:10.651206 master-0 kubenswrapper[7547]: I0308 03:52:10.650899 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 08 03:52:10.651206 master-0 kubenswrapper[7547]: I0308 03:52:10.650987 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 08 03:52:10.651206 master-0 kubenswrapper[7547]: I0308 03:52:10.651092 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 08 03:52:10.660414 master-0 kubenswrapper[7547]: I0308 03:52:10.660370 7547 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-5v6gs"] Mar 08 03:52:10.662476 master-0 kubenswrapper[7547]: I0308 03:52:10.661818 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-5v6gs" Mar 08 03:52:10.670886 master-0 kubenswrapper[7547]: I0308 03:52:10.668267 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-dockercfg-rnqsm" Mar 08 03:52:10.670886 master-0 kubenswrapper[7547]: I0308 03:52:10.668452 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-serving-cert" Mar 08 03:52:10.670886 master-0 kubenswrapper[7547]: I0308 03:52:10.668598 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"kube-root-ca.crt" Mar 08 03:52:10.670886 master-0 kubenswrapper[7547]: I0308 03:52:10.668730 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"cco-trusted-ca" Mar 08 03:52:10.670886 master-0 kubenswrapper[7547]: I0308 03:52:10.668854 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"openshift-service-ca.crt" Mar 08 03:52:10.670886 master-0 kubenswrapper[7547]: I0308 03:52:10.669631 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-6686554ddc-7bcsk"] Mar 08 03:52:10.672191 master-0 kubenswrapper[7547]: I0308 03:52:10.671541 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-5v6gs"] Mar 08 03:52:10.677206 master-0 kubenswrapper[7547]: I0308 03:52:10.677168 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-lrnks"] Mar 08 03:52:10.712361 master-0 kubenswrapper[7547]: I0308 03:52:10.712317 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9z8g\" (UniqueName: \"kubernetes.io/projected/139881ee-6cfa-4a7e-b002-63cece048d16-kube-api-access-h9z8g\") pod \"control-plane-machine-set-operator-6686554ddc-7bcsk\" (UID: \"139881ee-6cfa-4a7e-b002-63cece048d16\") " pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-7bcsk" Mar 08 03:52:10.712553 master-0 kubenswrapper[7547]: I0308 03:52:10.712404 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7rsc\" (UniqueName: \"kubernetes.io/projected/c0861ccd-5e86-4277-9082-95f3133508a0-kube-api-access-n7rsc\") pod \"cloud-credential-operator-55d85b7b47-5v6gs\" (UID: \"c0861ccd-5e86-4277-9082-95f3133508a0\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-5v6gs" Mar 08 03:52:10.712553 master-0 kubenswrapper[7547]: I0308 03:52:10.712450 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/19998073-79e7-4078-91b4-418a037caa38-machine-approver-tls\") pod \"machine-approver-955fcfb87-zhs8n\" (UID: \"19998073-79e7-4078-91b4-418a037caa38\") " pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-zhs8n" Mar 08 03:52:10.712553 master-0 kubenswrapper[7547]: I0308 03:52:10.712471 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/19998073-79e7-4078-91b4-418a037caa38-auth-proxy-config\") pod \"machine-approver-955fcfb87-zhs8n\" (UID: \"19998073-79e7-4078-91b4-418a037caa38\") " pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-zhs8n" Mar 08 03:52:10.712553 master-0 
kubenswrapper[7547]: I0308 03:52:10.712488 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c0861ccd-5e86-4277-9082-95f3133508a0-cco-trusted-ca\") pod \"cloud-credential-operator-55d85b7b47-5v6gs\" (UID: \"c0861ccd-5e86-4277-9082-95f3133508a0\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-5v6gs" Mar 08 03:52:10.712553 master-0 kubenswrapper[7547]: I0308 03:52:10.712518 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/634c0f6d-bce6-42cf-9253-80d1bcc7c507-samples-operator-tls\") pod \"cluster-samples-operator-664cb58b85-lrnks\" (UID: \"634c0f6d-bce6-42cf-9253-80d1bcc7c507\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-lrnks" Mar 08 03:52:10.712553 master-0 kubenswrapper[7547]: I0308 03:52:10.712539 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/139881ee-6cfa-4a7e-b002-63cece048d16-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-6686554ddc-7bcsk\" (UID: \"139881ee-6cfa-4a7e-b002-63cece048d16\") " pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-7bcsk" Mar 08 03:52:10.712739 master-0 kubenswrapper[7547]: I0308 03:52:10.712565 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19998073-79e7-4078-91b4-418a037caa38-config\") pod \"machine-approver-955fcfb87-zhs8n\" (UID: \"19998073-79e7-4078-91b4-418a037caa38\") " pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-zhs8n" Mar 08 03:52:10.712739 master-0 kubenswrapper[7547]: I0308 03:52:10.712584 7547 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-qzh2d\" (UniqueName: \"kubernetes.io/projected/19998073-79e7-4078-91b4-418a037caa38-kube-api-access-qzh2d\") pod \"machine-approver-955fcfb87-zhs8n\" (UID: \"19998073-79e7-4078-91b4-418a037caa38\") " pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-zhs8n" Mar 08 03:52:10.712739 master-0 kubenswrapper[7547]: I0308 03:52:10.712651 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cwmn\" (UniqueName: \"kubernetes.io/projected/634c0f6d-bce6-42cf-9253-80d1bcc7c507-kube-api-access-8cwmn\") pod \"cluster-samples-operator-664cb58b85-lrnks\" (UID: \"634c0f6d-bce6-42cf-9253-80d1bcc7c507\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-lrnks" Mar 08 03:52:10.712739 master-0 kubenswrapper[7547]: I0308 03:52:10.712677 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/c0861ccd-5e86-4277-9082-95f3133508a0-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-55d85b7b47-5v6gs\" (UID: \"c0861ccd-5e86-4277-9082-95f3133508a0\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-5v6gs" Mar 08 03:52:10.713419 master-0 kubenswrapper[7547]: I0308 03:52:10.713401 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/19998073-79e7-4078-91b4-418a037caa38-auth-proxy-config\") pod \"machine-approver-955fcfb87-zhs8n\" (UID: \"19998073-79e7-4078-91b4-418a037caa38\") " pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-zhs8n" Mar 08 03:52:10.713642 master-0 kubenswrapper[7547]: I0308 03:52:10.713606 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19998073-79e7-4078-91b4-418a037caa38-config\") pod 
\"machine-approver-955fcfb87-zhs8n\" (UID: \"19998073-79e7-4078-91b4-418a037caa38\") " pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-zhs8n" Mar 08 03:52:10.718547 master-0 kubenswrapper[7547]: I0308 03:52:10.718525 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/19998073-79e7-4078-91b4-418a037caa38-machine-approver-tls\") pod \"machine-approver-955fcfb87-zhs8n\" (UID: \"19998073-79e7-4078-91b4-418a037caa38\") " pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-zhs8n" Mar 08 03:52:10.727802 master-0 kubenswrapper[7547]: I0308 03:52:10.727771 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzh2d\" (UniqueName: \"kubernetes.io/projected/19998073-79e7-4078-91b4-418a037caa38-kube-api-access-qzh2d\") pod \"machine-approver-955fcfb87-zhs8n\" (UID: \"19998073-79e7-4078-91b4-418a037caa38\") " pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-zhs8n" Mar 08 03:52:10.814236 master-0 kubenswrapper[7547]: I0308 03:52:10.814105 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c0861ccd-5e86-4277-9082-95f3133508a0-cco-trusted-ca\") pod \"cloud-credential-operator-55d85b7b47-5v6gs\" (UID: \"c0861ccd-5e86-4277-9082-95f3133508a0\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-5v6gs" Mar 08 03:52:10.814236 master-0 kubenswrapper[7547]: I0308 03:52:10.814198 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/634c0f6d-bce6-42cf-9253-80d1bcc7c507-samples-operator-tls\") pod \"cluster-samples-operator-664cb58b85-lrnks\" (UID: \"634c0f6d-bce6-42cf-9253-80d1bcc7c507\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-lrnks" Mar 08 03:52:10.814471 master-0 
kubenswrapper[7547]: I0308 03:52:10.814248 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/139881ee-6cfa-4a7e-b002-63cece048d16-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-6686554ddc-7bcsk\" (UID: \"139881ee-6cfa-4a7e-b002-63cece048d16\") " pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-7bcsk" Mar 08 03:52:10.814471 master-0 kubenswrapper[7547]: I0308 03:52:10.814323 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cwmn\" (UniqueName: \"kubernetes.io/projected/634c0f6d-bce6-42cf-9253-80d1bcc7c507-kube-api-access-8cwmn\") pod \"cluster-samples-operator-664cb58b85-lrnks\" (UID: \"634c0f6d-bce6-42cf-9253-80d1bcc7c507\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-lrnks" Mar 08 03:52:10.814471 master-0 kubenswrapper[7547]: I0308 03:52:10.814370 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/c0861ccd-5e86-4277-9082-95f3133508a0-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-55d85b7b47-5v6gs\" (UID: \"c0861ccd-5e86-4277-9082-95f3133508a0\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-5v6gs" Mar 08 03:52:10.814471 master-0 kubenswrapper[7547]: I0308 03:52:10.814418 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9z8g\" (UniqueName: \"kubernetes.io/projected/139881ee-6cfa-4a7e-b002-63cece048d16-kube-api-access-h9z8g\") pod \"control-plane-machine-set-operator-6686554ddc-7bcsk\" (UID: \"139881ee-6cfa-4a7e-b002-63cece048d16\") " pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-7bcsk" Mar 08 03:52:10.814471 master-0 kubenswrapper[7547]: I0308 03:52:10.814451 7547 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7rsc\" (UniqueName: \"kubernetes.io/projected/c0861ccd-5e86-4277-9082-95f3133508a0-kube-api-access-n7rsc\") pod \"cloud-credential-operator-55d85b7b47-5v6gs\" (UID: \"c0861ccd-5e86-4277-9082-95f3133508a0\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-5v6gs" Mar 08 03:52:10.815867 master-0 kubenswrapper[7547]: I0308 03:52:10.815798 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c0861ccd-5e86-4277-9082-95f3133508a0-cco-trusted-ca\") pod \"cloud-credential-operator-55d85b7b47-5v6gs\" (UID: \"c0861ccd-5e86-4277-9082-95f3133508a0\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-5v6gs" Mar 08 03:52:10.823473 master-0 kubenswrapper[7547]: I0308 03:52:10.823440 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/634c0f6d-bce6-42cf-9253-80d1bcc7c507-samples-operator-tls\") pod \"cluster-samples-operator-664cb58b85-lrnks\" (UID: \"634c0f6d-bce6-42cf-9253-80d1bcc7c507\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-lrnks" Mar 08 03:52:10.824031 master-0 kubenswrapper[7547]: I0308 03:52:10.823792 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/139881ee-6cfa-4a7e-b002-63cece048d16-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-6686554ddc-7bcsk\" (UID: \"139881ee-6cfa-4a7e-b002-63cece048d16\") " pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-7bcsk" Mar 08 03:52:10.824619 master-0 kubenswrapper[7547]: I0308 03:52:10.824600 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-credential-operator-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/c0861ccd-5e86-4277-9082-95f3133508a0-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-55d85b7b47-5v6gs\" (UID: \"c0861ccd-5e86-4277-9082-95f3133508a0\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-5v6gs" Mar 08 03:52:10.835489 master-0 kubenswrapper[7547]: I0308 03:52:10.835439 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7rsc\" (UniqueName: \"kubernetes.io/projected/c0861ccd-5e86-4277-9082-95f3133508a0-kube-api-access-n7rsc\") pod \"cloud-credential-operator-55d85b7b47-5v6gs\" (UID: \"c0861ccd-5e86-4277-9082-95f3133508a0\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-5v6gs" Mar 08 03:52:10.836412 master-0 kubenswrapper[7547]: I0308 03:52:10.836380 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cwmn\" (UniqueName: \"kubernetes.io/projected/634c0f6d-bce6-42cf-9253-80d1bcc7c507-kube-api-access-8cwmn\") pod \"cluster-samples-operator-664cb58b85-lrnks\" (UID: \"634c0f6d-bce6-42cf-9253-80d1bcc7c507\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-lrnks" Mar 08 03:52:10.840841 master-0 kubenswrapper[7547]: I0308 03:52:10.840801 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9z8g\" (UniqueName: \"kubernetes.io/projected/139881ee-6cfa-4a7e-b002-63cece048d16-kube-api-access-h9z8g\") pod \"control-plane-machine-set-operator-6686554ddc-7bcsk\" (UID: \"139881ee-6cfa-4a7e-b002-63cece048d16\") " pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-7bcsk" Mar 08 03:52:10.864786 master-0 kubenswrapper[7547]: I0308 03:52:10.864726 7547 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-zhs8n" Mar 08 03:52:10.963589 master-0 kubenswrapper[7547]: I0308 03:52:10.963523 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-7bcsk" Mar 08 03:52:10.978798 master-0 kubenswrapper[7547]: I0308 03:52:10.978734 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-lrnks" Mar 08 03:52:10.988221 master-0 kubenswrapper[7547]: I0308 03:52:10.988157 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-5v6gs" Mar 08 03:52:11.248798 master-0 kubenswrapper[7547]: I0308 03:52:11.248758 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-6686554ddc-7bcsk"] Mar 08 03:52:11.252457 master-0 kubenswrapper[7547]: W0308 03:52:11.252414 7547 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod139881ee_6cfa_4a7e_b002_63cece048d16.slice/crio-a77b94c864745863d968467e91271f57be5b9449652226a9e1e9789e20eef38f WatchSource:0}: Error finding container a77b94c864745863d968467e91271f57be5b9449652226a9e1e9789e20eef38f: Status 404 returned error can't find the container with id a77b94c864745863d968467e91271f57be5b9449652226a9e1e9789e20eef38f Mar 08 03:52:11.345342 master-0 kubenswrapper[7547]: I0308 03:52:11.345244 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-7bcsk" event={"ID":"139881ee-6cfa-4a7e-b002-63cece048d16","Type":"ContainerStarted","Data":"a77b94c864745863d968467e91271f57be5b9449652226a9e1e9789e20eef38f"} Mar 08 03:52:11.347191 master-0 kubenswrapper[7547]: I0308 03:52:11.347127 
7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-zhs8n" event={"ID":"19998073-79e7-4078-91b4-418a037caa38","Type":"ContainerStarted","Data":"2225b49859b76acb9ea5708c2c9c3e17ba7fe3b15b9def2ac0929094811fc07b"} Mar 08 03:52:11.347272 master-0 kubenswrapper[7547]: I0308 03:52:11.347207 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-zhs8n" event={"ID":"19998073-79e7-4078-91b4-418a037caa38","Type":"ContainerStarted","Data":"399bfaebb310269b0096a7a05e3f4655b777b39ab76bccec71929c119f7598f0"} Mar 08 03:52:11.391728 master-0 kubenswrapper[7547]: I0308 03:52:11.390550 7547 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/cluster-autoscaler-operator-69576476f7-bv67b"] Mar 08 03:52:11.391728 master-0 kubenswrapper[7547]: I0308 03:52:11.391455 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-bv67b" Mar 08 03:52:11.393563 master-0 kubenswrapper[7547]: I0308 03:52:11.393483 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-dockercfg-4rdlv" Mar 08 03:52:11.393794 master-0 kubenswrapper[7547]: I0308 03:52:11.393771 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy-cluster-autoscaler-operator" Mar 08 03:52:11.394585 master-0 kubenswrapper[7547]: I0308 03:52:11.394037 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-cert" Mar 08 03:52:11.410229 master-0 kubenswrapper[7547]: I0308 03:52:11.410196 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/cluster-autoscaler-operator-69576476f7-bv67b"] Mar 08 03:52:11.422886 master-0 kubenswrapper[7547]: I0308 03:52:11.422847 7547 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cv7sd\" (UniqueName: \"kubernetes.io/projected/1b700d17-83d2-46c8-afbc-e5774822eabe-kube-api-access-cv7sd\") pod \"cluster-autoscaler-operator-69576476f7-bv67b\" (UID: \"1b700d17-83d2-46c8-afbc-e5774822eabe\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-bv67b" Mar 08 03:52:11.423041 master-0 kubenswrapper[7547]: I0308 03:52:11.422999 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1b700d17-83d2-46c8-afbc-e5774822eabe-cert\") pod \"cluster-autoscaler-operator-69576476f7-bv67b\" (UID: \"1b700d17-83d2-46c8-afbc-e5774822eabe\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-bv67b" Mar 08 03:52:11.423041 master-0 kubenswrapper[7547]: I0308 03:52:11.423035 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1b700d17-83d2-46c8-afbc-e5774822eabe-auth-proxy-config\") pod \"cluster-autoscaler-operator-69576476f7-bv67b\" (UID: \"1b700d17-83d2-46c8-afbc-e5774822eabe\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-bv67b" Mar 08 03:52:11.532039 master-0 kubenswrapper[7547]: I0308 03:52:11.523993 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1b700d17-83d2-46c8-afbc-e5774822eabe-auth-proxy-config\") pod \"cluster-autoscaler-operator-69576476f7-bv67b\" (UID: \"1b700d17-83d2-46c8-afbc-e5774822eabe\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-bv67b" Mar 08 03:52:11.532039 master-0 kubenswrapper[7547]: I0308 03:52:11.524064 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cv7sd\" (UniqueName: 
\"kubernetes.io/projected/1b700d17-83d2-46c8-afbc-e5774822eabe-kube-api-access-cv7sd\") pod \"cluster-autoscaler-operator-69576476f7-bv67b\" (UID: \"1b700d17-83d2-46c8-afbc-e5774822eabe\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-bv67b" Mar 08 03:52:11.532039 master-0 kubenswrapper[7547]: I0308 03:52:11.524148 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1b700d17-83d2-46c8-afbc-e5774822eabe-cert\") pod \"cluster-autoscaler-operator-69576476f7-bv67b\" (UID: \"1b700d17-83d2-46c8-afbc-e5774822eabe\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-bv67b" Mar 08 03:52:11.532039 master-0 kubenswrapper[7547]: I0308 03:52:11.527097 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1b700d17-83d2-46c8-afbc-e5774822eabe-cert\") pod \"cluster-autoscaler-operator-69576476f7-bv67b\" (UID: \"1b700d17-83d2-46c8-afbc-e5774822eabe\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-bv67b" Mar 08 03:52:11.532039 master-0 kubenswrapper[7547]: I0308 03:52:11.527840 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1b700d17-83d2-46c8-afbc-e5774822eabe-auth-proxy-config\") pod \"cluster-autoscaler-operator-69576476f7-bv67b\" (UID: \"1b700d17-83d2-46c8-afbc-e5774822eabe\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-bv67b" Mar 08 03:52:11.561650 master-0 kubenswrapper[7547]: I0308 03:52:11.561614 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cv7sd\" (UniqueName: \"kubernetes.io/projected/1b700d17-83d2-46c8-afbc-e5774822eabe-kube-api-access-cv7sd\") pod \"cluster-autoscaler-operator-69576476f7-bv67b\" (UID: \"1b700d17-83d2-46c8-afbc-e5774822eabe\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-bv67b" Mar 
08 03:52:11.571143 master-0 kubenswrapper[7547]: I0308 03:52:11.571087 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-lrnks"] Mar 08 03:52:11.586860 master-0 kubenswrapper[7547]: I0308 03:52:11.586635 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-5v6gs"] Mar 08 03:52:11.721608 master-0 kubenswrapper[7547]: I0308 03:52:11.721545 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-bv67b" Mar 08 03:52:12.139569 master-0 kubenswrapper[7547]: I0308 03:52:12.138706 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/cluster-autoscaler-operator-69576476f7-bv67b"] Mar 08 03:52:12.144199 master-0 kubenswrapper[7547]: W0308 03:52:12.144163 7547 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b700d17_83d2_46c8_afbc_e5774822eabe.slice/crio-0362cae60811bf2651cbf0189c6f7ef23ef9a8b3134d671278cc424b4c2ad9ec WatchSource:0}: Error finding container 0362cae60811bf2651cbf0189c6f7ef23ef9a8b3134d671278cc424b4c2ad9ec: Status 404 returned error can't find the container with id 0362cae60811bf2651cbf0189c6f7ef23ef9a8b3134d671278cc424b4c2ad9ec Mar 08 03:52:12.356307 master-0 kubenswrapper[7547]: I0308 03:52:12.356246 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-lrnks" event={"ID":"634c0f6d-bce6-42cf-9253-80d1bcc7c507","Type":"ContainerStarted","Data":"d9c3b609dc5ba9405c265ea8312eb684aae5364c416fb8a8d02c96c1413b155d"} Mar 08 03:52:12.360361 master-0 kubenswrapper[7547]: I0308 03:52:12.360331 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-5v6gs" 
event={"ID":"c0861ccd-5e86-4277-9082-95f3133508a0","Type":"ContainerStarted","Data":"4e3a22af6350dc7838dab284ae8d96383b1cb0a0cdf2466bd2df60f8428f650b"} Mar 08 03:52:12.360450 master-0 kubenswrapper[7547]: I0308 03:52:12.360367 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-5v6gs" event={"ID":"c0861ccd-5e86-4277-9082-95f3133508a0","Type":"ContainerStarted","Data":"b259f65d2759400bc4903e27e05a8a6318da137f5779b59b0781ca65575183e7"} Mar 08 03:52:12.362098 master-0 kubenswrapper[7547]: I0308 03:52:12.362062 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-bv67b" event={"ID":"1b700d17-83d2-46c8-afbc-e5774822eabe","Type":"ContainerStarted","Data":"b88ddfc5ea0775e6458c2baf75f9cb33a5fcb34e26bca9978908441767061df3"} Mar 08 03:52:12.362098 master-0 kubenswrapper[7547]: I0308 03:52:12.362094 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-bv67b" event={"ID":"1b700d17-83d2-46c8-afbc-e5774822eabe","Type":"ContainerStarted","Data":"0362cae60811bf2651cbf0189c6f7ef23ef9a8b3134d671278cc424b4c2ad9ec"} Mar 08 03:52:13.495064 master-0 kubenswrapper[7547]: I0308 03:52:13.494937 7547 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-mbxg8"] Mar 08 03:52:13.498657 master-0 kubenswrapper[7547]: I0308 03:52:13.498625 7547 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-mbxg8" Mar 08 03:52:13.502314 master-0 kubenswrapper[7547]: I0308 03:52:13.502253 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images" Mar 08 03:52:13.502524 master-0 kubenswrapper[7547]: I0308 03:52:13.502486 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt" Mar 08 03:52:13.502629 master-0 kubenswrapper[7547]: I0308 03:52:13.502608 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt" Mar 08 03:52:13.502749 master-0 kubenswrapper[7547]: I0308 03:52:13.502721 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls" Mar 08 03:52:13.503716 master-0 kubenswrapper[7547]: I0308 03:52:13.503679 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy" Mar 08 03:52:13.503989 master-0 kubenswrapper[7547]: I0308 03:52:13.503960 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cluster-cloud-controller-manager-dockercfg-lnbcj" Mar 08 03:52:13.556096 master-0 kubenswrapper[7547]: I0308 03:52:13.555773 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/efcc6976-2815-4d96-8efb-1333102ccfd0-images\") pod \"cluster-cloud-controller-manager-operator-559568b945-mbxg8\" (UID: \"efcc6976-2815-4d96-8efb-1333102ccfd0\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-mbxg8" Mar 08 03:52:13.556096 master-0 
kubenswrapper[7547]: I0308 03:52:13.555920 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/efcc6976-2815-4d96-8efb-1333102ccfd0-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-559568b945-mbxg8\" (UID: \"efcc6976-2815-4d96-8efb-1333102ccfd0\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-mbxg8" Mar 08 03:52:13.556096 master-0 kubenswrapper[7547]: I0308 03:52:13.555977 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59nzz\" (UniqueName: \"kubernetes.io/projected/efcc6976-2815-4d96-8efb-1333102ccfd0-kube-api-access-59nzz\") pod \"cluster-cloud-controller-manager-operator-559568b945-mbxg8\" (UID: \"efcc6976-2815-4d96-8efb-1333102ccfd0\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-mbxg8" Mar 08 03:52:13.556491 master-0 kubenswrapper[7547]: I0308 03:52:13.556124 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/efcc6976-2815-4d96-8efb-1333102ccfd0-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-559568b945-mbxg8\" (UID: \"efcc6976-2815-4d96-8efb-1333102ccfd0\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-mbxg8" Mar 08 03:52:13.556491 master-0 kubenswrapper[7547]: I0308 03:52:13.556260 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/efcc6976-2815-4d96-8efb-1333102ccfd0-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-559568b945-mbxg8\" (UID: \"efcc6976-2815-4d96-8efb-1333102ccfd0\") " 
pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-mbxg8" Mar 08 03:52:13.659403 master-0 kubenswrapper[7547]: I0308 03:52:13.657691 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/efcc6976-2815-4d96-8efb-1333102ccfd0-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-559568b945-mbxg8\" (UID: \"efcc6976-2815-4d96-8efb-1333102ccfd0\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-mbxg8" Mar 08 03:52:13.659403 master-0 kubenswrapper[7547]: I0308 03:52:13.657776 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59nzz\" (UniqueName: \"kubernetes.io/projected/efcc6976-2815-4d96-8efb-1333102ccfd0-kube-api-access-59nzz\") pod \"cluster-cloud-controller-manager-operator-559568b945-mbxg8\" (UID: \"efcc6976-2815-4d96-8efb-1333102ccfd0\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-mbxg8" Mar 08 03:52:13.659403 master-0 kubenswrapper[7547]: I0308 03:52:13.657868 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/efcc6976-2815-4d96-8efb-1333102ccfd0-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-559568b945-mbxg8\" (UID: \"efcc6976-2815-4d96-8efb-1333102ccfd0\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-mbxg8" Mar 08 03:52:13.659403 master-0 kubenswrapper[7547]: I0308 03:52:13.657917 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/efcc6976-2815-4d96-8efb-1333102ccfd0-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-559568b945-mbxg8\" (UID: 
\"efcc6976-2815-4d96-8efb-1333102ccfd0\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-mbxg8" Mar 08 03:52:13.659403 master-0 kubenswrapper[7547]: I0308 03:52:13.657999 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/efcc6976-2815-4d96-8efb-1333102ccfd0-images\") pod \"cluster-cloud-controller-manager-operator-559568b945-mbxg8\" (UID: \"efcc6976-2815-4d96-8efb-1333102ccfd0\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-mbxg8" Mar 08 03:52:13.659403 master-0 kubenswrapper[7547]: I0308 03:52:13.658087 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/efcc6976-2815-4d96-8efb-1333102ccfd0-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-559568b945-mbxg8\" (UID: \"efcc6976-2815-4d96-8efb-1333102ccfd0\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-mbxg8" Mar 08 03:52:13.659403 master-0 kubenswrapper[7547]: I0308 03:52:13.658534 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/efcc6976-2815-4d96-8efb-1333102ccfd0-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-559568b945-mbxg8\" (UID: \"efcc6976-2815-4d96-8efb-1333102ccfd0\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-mbxg8" Mar 08 03:52:13.660871 master-0 kubenswrapper[7547]: I0308 03:52:13.660797 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/efcc6976-2815-4d96-8efb-1333102ccfd0-images\") pod \"cluster-cloud-controller-manager-operator-559568b945-mbxg8\" (UID: \"efcc6976-2815-4d96-8efb-1333102ccfd0\") " 
pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-mbxg8" Mar 08 03:52:13.663329 master-0 kubenswrapper[7547]: I0308 03:52:13.663280 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/efcc6976-2815-4d96-8efb-1333102ccfd0-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-559568b945-mbxg8\" (UID: \"efcc6976-2815-4d96-8efb-1333102ccfd0\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-mbxg8" Mar 08 03:52:13.672954 master-0 kubenswrapper[7547]: I0308 03:52:13.672913 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59nzz\" (UniqueName: \"kubernetes.io/projected/efcc6976-2815-4d96-8efb-1333102ccfd0-kube-api-access-59nzz\") pod \"cluster-cloud-controller-manager-operator-559568b945-mbxg8\" (UID: \"efcc6976-2815-4d96-8efb-1333102ccfd0\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-mbxg8" Mar 08 03:52:13.824879 master-0 kubenswrapper[7547]: I0308 03:52:13.823154 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-mbxg8" Mar 08 03:52:13.827019 master-0 kubenswrapper[7547]: I0308 03:52:13.826926 7547 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-84bf6db4f9-tdrf8"] Mar 08 03:52:13.831475 master-0 kubenswrapper[7547]: I0308 03:52:13.831440 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-84bf6db4f9-tdrf8"] Mar 08 03:52:13.831651 master-0 kubenswrapper[7547]: I0308 03:52:13.831573 7547 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-84bf6db4f9-tdrf8" Mar 08 03:52:13.833472 master-0 kubenswrapper[7547]: I0308 03:52:13.833438 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-slxzk" Mar 08 03:52:13.834889 master-0 kubenswrapper[7547]: I0308 03:52:13.834720 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 08 03:52:13.834889 master-0 kubenswrapper[7547]: I0308 03:52:13.834854 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 08 03:52:13.836504 master-0 kubenswrapper[7547]: I0308 03:52:13.836475 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 08 03:52:13.861172 master-0 kubenswrapper[7547]: I0308 03:52:13.860910 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/b70adfe9-94f1-44bc-85ce-498e5f0a1ca7-machine-api-operator-tls\") pod \"machine-api-operator-84bf6db4f9-tdrf8\" (UID: \"b70adfe9-94f1-44bc-85ce-498e5f0a1ca7\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-tdrf8" Mar 08 03:52:13.861172 master-0 kubenswrapper[7547]: I0308 03:52:13.861024 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nlq2\" (UniqueName: \"kubernetes.io/projected/b70adfe9-94f1-44bc-85ce-498e5f0a1ca7-kube-api-access-6nlq2\") pod \"machine-api-operator-84bf6db4f9-tdrf8\" (UID: \"b70adfe9-94f1-44bc-85ce-498e5f0a1ca7\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-tdrf8" Mar 08 03:52:13.861172 master-0 kubenswrapper[7547]: I0308 03:52:13.861058 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" 
(UniqueName: \"kubernetes.io/configmap/b70adfe9-94f1-44bc-85ce-498e5f0a1ca7-images\") pod \"machine-api-operator-84bf6db4f9-tdrf8\" (UID: \"b70adfe9-94f1-44bc-85ce-498e5f0a1ca7\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-tdrf8" Mar 08 03:52:13.861172 master-0 kubenswrapper[7547]: I0308 03:52:13.861096 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b70adfe9-94f1-44bc-85ce-498e5f0a1ca7-config\") pod \"machine-api-operator-84bf6db4f9-tdrf8\" (UID: \"b70adfe9-94f1-44bc-85ce-498e5f0a1ca7\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-tdrf8" Mar 08 03:52:13.962802 master-0 kubenswrapper[7547]: I0308 03:52:13.962749 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nlq2\" (UniqueName: \"kubernetes.io/projected/b70adfe9-94f1-44bc-85ce-498e5f0a1ca7-kube-api-access-6nlq2\") pod \"machine-api-operator-84bf6db4f9-tdrf8\" (UID: \"b70adfe9-94f1-44bc-85ce-498e5f0a1ca7\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-tdrf8" Mar 08 03:52:13.962802 master-0 kubenswrapper[7547]: I0308 03:52:13.962800 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b70adfe9-94f1-44bc-85ce-498e5f0a1ca7-images\") pod \"machine-api-operator-84bf6db4f9-tdrf8\" (UID: \"b70adfe9-94f1-44bc-85ce-498e5f0a1ca7\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-tdrf8" Mar 08 03:52:13.963046 master-0 kubenswrapper[7547]: I0308 03:52:13.962847 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b70adfe9-94f1-44bc-85ce-498e5f0a1ca7-config\") pod \"machine-api-operator-84bf6db4f9-tdrf8\" (UID: \"b70adfe9-94f1-44bc-85ce-498e5f0a1ca7\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-tdrf8" Mar 08 03:52:13.963046 master-0 
kubenswrapper[7547]: I0308 03:52:13.962885 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/b70adfe9-94f1-44bc-85ce-498e5f0a1ca7-machine-api-operator-tls\") pod \"machine-api-operator-84bf6db4f9-tdrf8\" (UID: \"b70adfe9-94f1-44bc-85ce-498e5f0a1ca7\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-tdrf8" Mar 08 03:52:13.963046 master-0 kubenswrapper[7547]: E0308 03:52:13.963004 7547 secret.go:189] Couldn't get secret openshift-machine-api/machine-api-operator-tls: secret "machine-api-operator-tls" not found Mar 08 03:52:13.963046 master-0 kubenswrapper[7547]: E0308 03:52:13.963050 7547 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b70adfe9-94f1-44bc-85ce-498e5f0a1ca7-machine-api-operator-tls podName:b70adfe9-94f1-44bc-85ce-498e5f0a1ca7 nodeName:}" failed. No retries permitted until 2026-03-08 03:52:14.463037118 +0000 UTC m=+297.408721631 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "machine-api-operator-tls" (UniqueName: "kubernetes.io/secret/b70adfe9-94f1-44bc-85ce-498e5f0a1ca7-machine-api-operator-tls") pod "machine-api-operator-84bf6db4f9-tdrf8" (UID: "b70adfe9-94f1-44bc-85ce-498e5f0a1ca7") : secret "machine-api-operator-tls" not found Mar 08 03:52:13.965754 master-0 kubenswrapper[7547]: I0308 03:52:13.965710 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b70adfe9-94f1-44bc-85ce-498e5f0a1ca7-images\") pod \"machine-api-operator-84bf6db4f9-tdrf8\" (UID: \"b70adfe9-94f1-44bc-85ce-498e5f0a1ca7\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-tdrf8" Mar 08 03:52:13.967761 master-0 kubenswrapper[7547]: I0308 03:52:13.967737 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b70adfe9-94f1-44bc-85ce-498e5f0a1ca7-config\") pod \"machine-api-operator-84bf6db4f9-tdrf8\" (UID: \"b70adfe9-94f1-44bc-85ce-498e5f0a1ca7\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-tdrf8" Mar 08 03:52:13.983913 master-0 kubenswrapper[7547]: I0308 03:52:13.983869 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nlq2\" (UniqueName: \"kubernetes.io/projected/b70adfe9-94f1-44bc-85ce-498e5f0a1ca7-kube-api-access-6nlq2\") pod \"machine-api-operator-84bf6db4f9-tdrf8\" (UID: \"b70adfe9-94f1-44bc-85ce-498e5f0a1ca7\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-tdrf8" Mar 08 03:52:14.468913 master-0 kubenswrapper[7547]: I0308 03:52:14.468855 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/b70adfe9-94f1-44bc-85ce-498e5f0a1ca7-machine-api-operator-tls\") pod \"machine-api-operator-84bf6db4f9-tdrf8\" (UID: \"b70adfe9-94f1-44bc-85ce-498e5f0a1ca7\") " 
pod="openshift-machine-api/machine-api-operator-84bf6db4f9-tdrf8" Mar 08 03:52:14.473306 master-0 kubenswrapper[7547]: I0308 03:52:14.473230 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/b70adfe9-94f1-44bc-85ce-498e5f0a1ca7-machine-api-operator-tls\") pod \"machine-api-operator-84bf6db4f9-tdrf8\" (UID: \"b70adfe9-94f1-44bc-85ce-498e5f0a1ca7\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-tdrf8" Mar 08 03:52:14.752342 master-0 kubenswrapper[7547]: I0308 03:52:14.752198 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-84bf6db4f9-tdrf8" Mar 08 03:52:14.849943 master-0 kubenswrapper[7547]: W0308 03:52:14.849802 7547 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podefcc6976_2815_4d96_8efb_1333102ccfd0.slice/crio-75f4eb3be0a13ebaf0adfb5408aab17726c88687c273eeef67438587a2bdc267 WatchSource:0}: Error finding container 75f4eb3be0a13ebaf0adfb5408aab17726c88687c273eeef67438587a2bdc267: Status 404 returned error can't find the container with id 75f4eb3be0a13ebaf0adfb5408aab17726c88687c273eeef67438587a2bdc267 Mar 08 03:52:15.210928 master-0 kubenswrapper[7547]: I0308 03:52:15.210887 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-84bf6db4f9-tdrf8"] Mar 08 03:52:15.214159 master-0 kubenswrapper[7547]: W0308 03:52:15.214114 7547 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb70adfe9_94f1_44bc_85ce_498e5f0a1ca7.slice/crio-2a6fcf9f9144c9d6bd5dcc37a1eb71dcefd7c65bbd0149ae8f69eff02142d6ae WatchSource:0}: Error finding container 2a6fcf9f9144c9d6bd5dcc37a1eb71dcefd7c65bbd0149ae8f69eff02142d6ae: Status 404 returned error can't find the container with id 
2a6fcf9f9144c9d6bd5dcc37a1eb71dcefd7c65bbd0149ae8f69eff02142d6ae Mar 08 03:52:15.400901 master-0 kubenswrapper[7547]: I0308 03:52:15.400851 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-lrnks" event={"ID":"634c0f6d-bce6-42cf-9253-80d1bcc7c507","Type":"ContainerStarted","Data":"70b104f3e2ed80188bf6e0192c16ab4d28e4d8e8fc74c3bbdb091a061695368e"} Mar 08 03:52:15.400901 master-0 kubenswrapper[7547]: I0308 03:52:15.400894 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-lrnks" event={"ID":"634c0f6d-bce6-42cf-9253-80d1bcc7c507","Type":"ContainerStarted","Data":"37e70aaf4719f91dec95cfcc08645accb70c221f8a1ba736e0900e892d58d3b7"} Mar 08 03:52:15.403878 master-0 kubenswrapper[7547]: I0308 03:52:15.403781 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-84bf6db4f9-tdrf8" event={"ID":"b70adfe9-94f1-44bc-85ce-498e5f0a1ca7","Type":"ContainerStarted","Data":"9bdfec76a278642c3126aaacacb04ba27dd5bce04852e8c6a5f9bb00816b01db"} Mar 08 03:52:15.403878 master-0 kubenswrapper[7547]: I0308 03:52:15.403872 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-84bf6db4f9-tdrf8" event={"ID":"b70adfe9-94f1-44bc-85ce-498e5f0a1ca7","Type":"ContainerStarted","Data":"2a6fcf9f9144c9d6bd5dcc37a1eb71dcefd7c65bbd0149ae8f69eff02142d6ae"} Mar 08 03:52:15.406289 master-0 kubenswrapper[7547]: I0308 03:52:15.406255 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-zhs8n" event={"ID":"19998073-79e7-4078-91b4-418a037caa38","Type":"ContainerStarted","Data":"cff54322e0a4836432d190c57034a7703fc9c7cc470c2da361243bd1f845fca1"} Mar 08 03:52:15.407616 master-0 kubenswrapper[7547]: I0308 03:52:15.407583 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-bv67b" event={"ID":"1b700d17-83d2-46c8-afbc-e5774822eabe","Type":"ContainerStarted","Data":"b9be8ece5a79f90d1b08d38693357868aba4825cbc7bfe504b163d5a110c50ad"} Mar 08 03:52:15.409392 master-0 kubenswrapper[7547]: I0308 03:52:15.409358 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-7bcsk" event={"ID":"139881ee-6cfa-4a7e-b002-63cece048d16","Type":"ContainerStarted","Data":"ee626e564b55f5d917bbbc7e14f501fa99cbf93b071cbbb29f2046199512a0ec"} Mar 08 03:52:15.410460 master-0 kubenswrapper[7547]: I0308 03:52:15.410434 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-mbxg8" event={"ID":"efcc6976-2815-4d96-8efb-1333102ccfd0","Type":"ContainerStarted","Data":"75f4eb3be0a13ebaf0adfb5408aab17726c88687c273eeef67438587a2bdc267"} Mar 08 03:52:15.417406 master-0 kubenswrapper[7547]: I0308 03:52:15.417351 7547 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-lrnks" podStartSLOduration=2.228036605 podStartE2EDuration="5.417339281s" podCreationTimestamp="2026-03-08 03:52:10 +0000 UTC" firstStartedPulling="2026-03-08 03:52:11.650012977 +0000 UTC m=+294.595697490" lastFinishedPulling="2026-03-08 03:52:14.839315663 +0000 UTC m=+297.785000166" observedRunningTime="2026-03-08 03:52:15.414878283 +0000 UTC m=+298.360562796" watchObservedRunningTime="2026-03-08 03:52:15.417339281 +0000 UTC m=+298.363023794" Mar 08 03:52:15.440151 master-0 kubenswrapper[7547]: I0308 03:52:15.440063 7547 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-7bcsk" podStartSLOduration=1.85682202 podStartE2EDuration="5.440034986s" podCreationTimestamp="2026-03-08 03:52:10 +0000 UTC" 
firstStartedPulling="2026-03-08 03:52:11.256169378 +0000 UTC m=+294.201853891" lastFinishedPulling="2026-03-08 03:52:14.839382294 +0000 UTC m=+297.785066857" observedRunningTime="2026-03-08 03:52:15.435702463 +0000 UTC m=+298.381386976" watchObservedRunningTime="2026-03-08 03:52:15.440034986 +0000 UTC m=+298.385719509" Mar 08 03:52:15.486938 master-0 kubenswrapper[7547]: I0308 03:52:15.486253 7547 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-zhs8n" podStartSLOduration=1.875920748 podStartE2EDuration="5.486228823s" podCreationTimestamp="2026-03-08 03:52:10 +0000 UTC" firstStartedPulling="2026-03-08 03:52:11.229033568 +0000 UTC m=+294.174718081" lastFinishedPulling="2026-03-08 03:52:14.839341633 +0000 UTC m=+297.785026156" observedRunningTime="2026-03-08 03:52:15.474852195 +0000 UTC m=+298.420536758" watchObservedRunningTime="2026-03-08 03:52:15.486228823 +0000 UTC m=+298.431913356" Mar 08 03:52:15.486938 master-0 kubenswrapper[7547]: I0308 03:52:15.486377 7547 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-bv67b" podStartSLOduration=1.910061512 podStartE2EDuration="4.486372687s" podCreationTimestamp="2026-03-08 03:52:11 +0000 UTC" firstStartedPulling="2026-03-08 03:52:12.30427195 +0000 UTC m=+295.249956473" lastFinishedPulling="2026-03-08 03:52:14.880583125 +0000 UTC m=+297.826267648" observedRunningTime="2026-03-08 03:52:15.457505197 +0000 UTC m=+298.403189710" watchObservedRunningTime="2026-03-08 03:52:15.486372687 +0000 UTC m=+298.432057210" Mar 08 03:52:17.376202 master-0 kubenswrapper[7547]: I0308 03:52:17.376130 7547 scope.go:117] "RemoveContainer" containerID="4998ca5636ddd8d905b67c8fb24fdf903c161f9cdcca4bdb3b01719d5f1d5376" Mar 08 03:52:19.024322 master-0 kubenswrapper[7547]: I0308 03:52:19.022992 7547 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-kube-apiserver/installer-1-retry-1-master-0"] Mar 08 03:52:19.024322 master-0 kubenswrapper[7547]: I0308 03:52:19.024243 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 08 03:52:19.033518 master-0 kubenswrapper[7547]: I0308 03:52:19.031772 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-ccz75" Mar 08 03:52:19.033518 master-0 kubenswrapper[7547]: I0308 03:52:19.032216 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 08 03:52:19.037737 master-0 kubenswrapper[7547]: I0308 03:52:19.037685 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-1-retry-1-master-0"] Mar 08 03:52:19.126766 master-0 kubenswrapper[7547]: I0308 03:52:19.126678 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9c95709c-c3cb-46fb-afe7-626c8013f3c6-kube-api-access\") pod \"installer-1-retry-1-master-0\" (UID: \"9c95709c-c3cb-46fb-afe7-626c8013f3c6\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 08 03:52:19.126766 master-0 kubenswrapper[7547]: I0308 03:52:19.126764 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/9c95709c-c3cb-46fb-afe7-626c8013f3c6-var-lock\") pod \"installer-1-retry-1-master-0\" (UID: \"9c95709c-c3cb-46fb-afe7-626c8013f3c6\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 08 03:52:19.126766 master-0 kubenswrapper[7547]: I0308 03:52:19.126792 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9c95709c-c3cb-46fb-afe7-626c8013f3c6-kubelet-dir\") 
pod \"installer-1-retry-1-master-0\" (UID: \"9c95709c-c3cb-46fb-afe7-626c8013f3c6\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 08 03:52:19.228015 master-0 kubenswrapper[7547]: I0308 03:52:19.227608 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9c95709c-c3cb-46fb-afe7-626c8013f3c6-kube-api-access\") pod \"installer-1-retry-1-master-0\" (UID: \"9c95709c-c3cb-46fb-afe7-626c8013f3c6\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 08 03:52:19.228015 master-0 kubenswrapper[7547]: I0308 03:52:19.227895 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/9c95709c-c3cb-46fb-afe7-626c8013f3c6-var-lock\") pod \"installer-1-retry-1-master-0\" (UID: \"9c95709c-c3cb-46fb-afe7-626c8013f3c6\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 08 03:52:19.228015 master-0 kubenswrapper[7547]: I0308 03:52:19.227950 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9c95709c-c3cb-46fb-afe7-626c8013f3c6-kubelet-dir\") pod \"installer-1-retry-1-master-0\" (UID: \"9c95709c-c3cb-46fb-afe7-626c8013f3c6\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 08 03:52:19.228378 master-0 kubenswrapper[7547]: I0308 03:52:19.228037 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/9c95709c-c3cb-46fb-afe7-626c8013f3c6-var-lock\") pod \"installer-1-retry-1-master-0\" (UID: \"9c95709c-c3cb-46fb-afe7-626c8013f3c6\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 08 03:52:19.228378 master-0 kubenswrapper[7547]: I0308 03:52:19.228080 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/9c95709c-c3cb-46fb-afe7-626c8013f3c6-kubelet-dir\") pod \"installer-1-retry-1-master-0\" (UID: \"9c95709c-c3cb-46fb-afe7-626c8013f3c6\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 08 03:52:19.256920 master-0 kubenswrapper[7547]: I0308 03:52:19.256883 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9c95709c-c3cb-46fb-afe7-626c8013f3c6-kube-api-access\") pod \"installer-1-retry-1-master-0\" (UID: \"9c95709c-c3cb-46fb-afe7-626c8013f3c6\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 08 03:52:19.356777 master-0 kubenswrapper[7547]: I0308 03:52:19.356529 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 08 03:52:20.439313 master-0 kubenswrapper[7547]: I0308 03:52:20.437790 7547 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-1-retry-1-master-0"] Mar 08 03:52:20.447479 master-0 kubenswrapper[7547]: I0308 03:52:20.447408 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-mbxg8" event={"ID":"efcc6976-2815-4d96-8efb-1333102ccfd0","Type":"ContainerStarted","Data":"d3b47d413a69ffd8db28b44cea31a8bbb895696445c355f71f0fce4460815cc9"} Mar 08 03:52:20.451843 master-0 kubenswrapper[7547]: I0308 03:52:20.451519 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-5v6gs" event={"ID":"c0861ccd-5e86-4277-9082-95f3133508a0","Type":"ContainerStarted","Data":"2fb58845ca39eceb501bc9ade78a5622e2b644931683a224c8b7004baa279f7a"} Mar 08 03:52:20.473081 master-0 kubenswrapper[7547]: I0308 03:52:20.473023 7547 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-5v6gs" podStartSLOduration=2.108155489 podStartE2EDuration="10.473008066s" podCreationTimestamp="2026-03-08 03:52:10 +0000 UTC" firstStartedPulling="2026-03-08 03:52:11.702012521 +0000 UTC m=+294.647697034" lastFinishedPulling="2026-03-08 03:52:20.066865088 +0000 UTC m=+303.012549611" observedRunningTime="2026-03-08 03:52:20.47271654 +0000 UTC m=+303.418401063" watchObservedRunningTime="2026-03-08 03:52:20.473008066 +0000 UTC m=+303.418692579" Mar 08 03:52:21.460934 master-0 kubenswrapper[7547]: I0308 03:52:21.460874 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-mbxg8" event={"ID":"efcc6976-2815-4d96-8efb-1333102ccfd0","Type":"ContainerStarted","Data":"64fb2c8886467e00d90db0501929cd3905eae7b3b38fbfdd208d15a3524f6fcf"} Mar 08 03:52:21.460934 master-0 kubenswrapper[7547]: I0308 03:52:21.460930 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-mbxg8" event={"ID":"efcc6976-2815-4d96-8efb-1333102ccfd0","Type":"ContainerStarted","Data":"c90ce88e45cb4f3ff0fbd630cc99faceba0353930c1ba8a8de2460941554e134"} Mar 08 03:52:21.463290 master-0 kubenswrapper[7547]: I0308 03:52:21.463219 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" event={"ID":"9c95709c-c3cb-46fb-afe7-626c8013f3c6","Type":"ContainerStarted","Data":"5f3bdf25350f3735e74258b774375768d3fdf1215b280e2ea275e2a1b21b5161"} Mar 08 03:52:21.463345 master-0 kubenswrapper[7547]: I0308 03:52:21.463291 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" event={"ID":"9c95709c-c3cb-46fb-afe7-626c8013f3c6","Type":"ContainerStarted","Data":"404a0d2a7b8ca71df97070480a7d3f018db46911d3635a93f748cb0ea044da91"} Mar 08 
03:52:21.486577 master-0 kubenswrapper[7547]: I0308 03:52:21.486486 7547 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-mbxg8" podStartSLOduration=3.297763652 podStartE2EDuration="8.486454912s" podCreationTimestamp="2026-03-08 03:52:13 +0000 UTC" firstStartedPulling="2026-03-08 03:52:14.867415325 +0000 UTC m=+297.813099838" lastFinishedPulling="2026-03-08 03:52:20.056106585 +0000 UTC m=+303.001791098" observedRunningTime="2026-03-08 03:52:21.480469802 +0000 UTC m=+304.426154335" watchObservedRunningTime="2026-03-08 03:52:21.486454912 +0000 UTC m=+304.432139445" Mar 08 03:52:21.501556 master-0 kubenswrapper[7547]: I0308 03:52:21.501101 7547 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" podStartSLOduration=2.5010804269999998 podStartE2EDuration="2.501080427s" podCreationTimestamp="2026-03-08 03:52:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:52:21.500223167 +0000 UTC m=+304.445907680" watchObservedRunningTime="2026-03-08 03:52:21.501080427 +0000 UTC m=+304.446764950" Mar 08 03:52:24.492355 master-0 kubenswrapper[7547]: I0308 03:52:24.492146 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-84bf6db4f9-tdrf8" event={"ID":"b70adfe9-94f1-44bc-85ce-498e5f0a1ca7","Type":"ContainerStarted","Data":"029a27fb736bdc11d71d13f7cee2fad9d45777a98ca3db8fbbd922eea19ce40e"} Mar 08 03:52:24.525608 master-0 kubenswrapper[7547]: I0308 03:52:24.525467 7547 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-84bf6db4f9-tdrf8" podStartSLOduration=3.451941676 podStartE2EDuration="11.525438008s" podCreationTimestamp="2026-03-08 03:52:13 +0000 UTC" 
firstStartedPulling="2026-03-08 03:52:15.35490656 +0000 UTC m=+298.300591093" lastFinishedPulling="2026-03-08 03:52:23.428402902 +0000 UTC m=+306.374087425" observedRunningTime="2026-03-08 03:52:24.520511992 +0000 UTC m=+307.466196535" watchObservedRunningTime="2026-03-08 03:52:24.525438008 +0000 UTC m=+307.471122561" Mar 08 03:52:28.689263 master-0 kubenswrapper[7547]: I0308 03:52:28.684762 7547 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cluster-machine-approver/machine-approver-955fcfb87-zhs8n"] Mar 08 03:52:28.689263 master-0 kubenswrapper[7547]: I0308 03:52:28.685164 7547 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-zhs8n" podUID="19998073-79e7-4078-91b4-418a037caa38" containerName="kube-rbac-proxy" containerID="cri-o://2225b49859b76acb9ea5708c2c9c3e17ba7fe3b15b9def2ac0929094811fc07b" gracePeriod=30 Mar 08 03:52:28.689263 master-0 kubenswrapper[7547]: I0308 03:52:28.685295 7547 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-zhs8n" podUID="19998073-79e7-4078-91b4-418a037caa38" containerName="machine-approver-controller" containerID="cri-o://cff54322e0a4836432d190c57034a7703fc9c7cc470c2da361243bd1f845fca1" gracePeriod=30 Mar 08 03:52:29.378458 master-0 kubenswrapper[7547]: I0308 03:52:29.377988 7547 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-zhs8n" Mar 08 03:52:29.492420 master-0 kubenswrapper[7547]: I0308 03:52:29.492314 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzh2d\" (UniqueName: \"kubernetes.io/projected/19998073-79e7-4078-91b4-418a037caa38-kube-api-access-qzh2d\") pod \"19998073-79e7-4078-91b4-418a037caa38\" (UID: \"19998073-79e7-4078-91b4-418a037caa38\") " Mar 08 03:52:29.492725 master-0 kubenswrapper[7547]: I0308 03:52:29.492445 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/19998073-79e7-4078-91b4-418a037caa38-auth-proxy-config\") pod \"19998073-79e7-4078-91b4-418a037caa38\" (UID: \"19998073-79e7-4078-91b4-418a037caa38\") " Mar 08 03:52:29.492725 master-0 kubenswrapper[7547]: I0308 03:52:29.492499 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19998073-79e7-4078-91b4-418a037caa38-config\") pod \"19998073-79e7-4078-91b4-418a037caa38\" (UID: \"19998073-79e7-4078-91b4-418a037caa38\") " Mar 08 03:52:29.492725 master-0 kubenswrapper[7547]: I0308 03:52:29.492602 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/19998073-79e7-4078-91b4-418a037caa38-machine-approver-tls\") pod \"19998073-79e7-4078-91b4-418a037caa38\" (UID: \"19998073-79e7-4078-91b4-418a037caa38\") " Mar 08 03:52:29.493407 master-0 kubenswrapper[7547]: I0308 03:52:29.493332 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19998073-79e7-4078-91b4-418a037caa38-config" (OuterVolumeSpecName: "config") pod "19998073-79e7-4078-91b4-418a037caa38" (UID: "19998073-79e7-4078-91b4-418a037caa38"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:52:29.493407 master-0 kubenswrapper[7547]: I0308 03:52:29.493359 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19998073-79e7-4078-91b4-418a037caa38-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "19998073-79e7-4078-91b4-418a037caa38" (UID: "19998073-79e7-4078-91b4-418a037caa38"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:52:29.497453 master-0 kubenswrapper[7547]: I0308 03:52:29.497370 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19998073-79e7-4078-91b4-418a037caa38-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "19998073-79e7-4078-91b4-418a037caa38" (UID: "19998073-79e7-4078-91b4-418a037caa38"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:52:29.497778 master-0 kubenswrapper[7547]: I0308 03:52:29.497741 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19998073-79e7-4078-91b4-418a037caa38-kube-api-access-qzh2d" (OuterVolumeSpecName: "kube-api-access-qzh2d") pod "19998073-79e7-4078-91b4-418a037caa38" (UID: "19998073-79e7-4078-91b4-418a037caa38"). InnerVolumeSpecName "kube-api-access-qzh2d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:52:29.534073 master-0 kubenswrapper[7547]: I0308 03:52:29.534000 7547 generic.go:334] "Generic (PLEG): container finished" podID="19998073-79e7-4078-91b4-418a037caa38" containerID="cff54322e0a4836432d190c57034a7703fc9c7cc470c2da361243bd1f845fca1" exitCode=0 Mar 08 03:52:29.534073 master-0 kubenswrapper[7547]: I0308 03:52:29.534047 7547 generic.go:334] "Generic (PLEG): container finished" podID="19998073-79e7-4078-91b4-418a037caa38" containerID="2225b49859b76acb9ea5708c2c9c3e17ba7fe3b15b9def2ac0929094811fc07b" exitCode=0 Mar 08 03:52:29.534300 master-0 kubenswrapper[7547]: I0308 03:52:29.534079 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-zhs8n" event={"ID":"19998073-79e7-4078-91b4-418a037caa38","Type":"ContainerDied","Data":"cff54322e0a4836432d190c57034a7703fc9c7cc470c2da361243bd1f845fca1"} Mar 08 03:52:29.534300 master-0 kubenswrapper[7547]: I0308 03:52:29.534119 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-zhs8n" event={"ID":"19998073-79e7-4078-91b4-418a037caa38","Type":"ContainerDied","Data":"2225b49859b76acb9ea5708c2c9c3e17ba7fe3b15b9def2ac0929094811fc07b"} Mar 08 03:52:29.534300 master-0 kubenswrapper[7547]: I0308 03:52:29.534140 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-zhs8n" event={"ID":"19998073-79e7-4078-91b4-418a037caa38","Type":"ContainerDied","Data":"399bfaebb310269b0096a7a05e3f4655b777b39ab76bccec71929c119f7598f0"} Mar 08 03:52:29.534300 master-0 kubenswrapper[7547]: I0308 03:52:29.534169 7547 scope.go:117] "RemoveContainer" containerID="cff54322e0a4836432d190c57034a7703fc9c7cc470c2da361243bd1f845fca1" Mar 08 03:52:29.534300 master-0 kubenswrapper[7547]: I0308 03:52:29.534085 7547 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-zhs8n" Mar 08 03:52:29.556961 master-0 kubenswrapper[7547]: I0308 03:52:29.556700 7547 scope.go:117] "RemoveContainer" containerID="2225b49859b76acb9ea5708c2c9c3e17ba7fe3b15b9def2ac0929094811fc07b" Mar 08 03:52:29.584742 master-0 kubenswrapper[7547]: I0308 03:52:29.583199 7547 scope.go:117] "RemoveContainer" containerID="cff54322e0a4836432d190c57034a7703fc9c7cc470c2da361243bd1f845fca1" Mar 08 03:52:29.584742 master-0 kubenswrapper[7547]: E0308 03:52:29.584007 7547 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cff54322e0a4836432d190c57034a7703fc9c7cc470c2da361243bd1f845fca1\": container with ID starting with cff54322e0a4836432d190c57034a7703fc9c7cc470c2da361243bd1f845fca1 not found: ID does not exist" containerID="cff54322e0a4836432d190c57034a7703fc9c7cc470c2da361243bd1f845fca1" Mar 08 03:52:29.584742 master-0 kubenswrapper[7547]: I0308 03:52:29.584061 7547 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cff54322e0a4836432d190c57034a7703fc9c7cc470c2da361243bd1f845fca1"} err="failed to get container status \"cff54322e0a4836432d190c57034a7703fc9c7cc470c2da361243bd1f845fca1\": rpc error: code = NotFound desc = could not find container \"cff54322e0a4836432d190c57034a7703fc9c7cc470c2da361243bd1f845fca1\": container with ID starting with cff54322e0a4836432d190c57034a7703fc9c7cc470c2da361243bd1f845fca1 not found: ID does not exist" Mar 08 03:52:29.584742 master-0 kubenswrapper[7547]: I0308 03:52:29.584095 7547 scope.go:117] "RemoveContainer" containerID="2225b49859b76acb9ea5708c2c9c3e17ba7fe3b15b9def2ac0929094811fc07b" Mar 08 03:52:29.594543 master-0 kubenswrapper[7547]: I0308 03:52:29.594477 7547 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/19998073-79e7-4078-91b4-418a037caa38-auth-proxy-config\") on node \"master-0\" DevicePath \"\"" Mar 08 03:52:29.594543 master-0 kubenswrapper[7547]: I0308 03:52:29.594536 7547 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19998073-79e7-4078-91b4-418a037caa38-config\") on node \"master-0\" DevicePath \"\"" Mar 08 03:52:29.594850 master-0 kubenswrapper[7547]: I0308 03:52:29.594558 7547 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/19998073-79e7-4078-91b4-418a037caa38-machine-approver-tls\") on node \"master-0\" DevicePath \"\"" Mar 08 03:52:29.594850 master-0 kubenswrapper[7547]: I0308 03:52:29.594577 7547 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qzh2d\" (UniqueName: \"kubernetes.io/projected/19998073-79e7-4078-91b4-418a037caa38-kube-api-access-qzh2d\") on node \"master-0\" DevicePath \"\"" Mar 08 03:52:29.595616 master-0 kubenswrapper[7547]: E0308 03:52:29.595545 7547 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2225b49859b76acb9ea5708c2c9c3e17ba7fe3b15b9def2ac0929094811fc07b\": container with ID starting with 2225b49859b76acb9ea5708c2c9c3e17ba7fe3b15b9def2ac0929094811fc07b not found: ID does not exist" containerID="2225b49859b76acb9ea5708c2c9c3e17ba7fe3b15b9def2ac0929094811fc07b" Mar 08 03:52:29.595715 master-0 kubenswrapper[7547]: I0308 03:52:29.595626 7547 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2225b49859b76acb9ea5708c2c9c3e17ba7fe3b15b9def2ac0929094811fc07b"} err="failed to get container status \"2225b49859b76acb9ea5708c2c9c3e17ba7fe3b15b9def2ac0929094811fc07b\": rpc error: code = NotFound desc = could not find container \"2225b49859b76acb9ea5708c2c9c3e17ba7fe3b15b9def2ac0929094811fc07b\": container with ID starting with 
2225b49859b76acb9ea5708c2c9c3e17ba7fe3b15b9def2ac0929094811fc07b not found: ID does not exist" Mar 08 03:52:29.595715 master-0 kubenswrapper[7547]: I0308 03:52:29.595672 7547 scope.go:117] "RemoveContainer" containerID="cff54322e0a4836432d190c57034a7703fc9c7cc470c2da361243bd1f845fca1" Mar 08 03:52:29.596450 master-0 kubenswrapper[7547]: I0308 03:52:29.596371 7547 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cff54322e0a4836432d190c57034a7703fc9c7cc470c2da361243bd1f845fca1"} err="failed to get container status \"cff54322e0a4836432d190c57034a7703fc9c7cc470c2da361243bd1f845fca1\": rpc error: code = NotFound desc = could not find container \"cff54322e0a4836432d190c57034a7703fc9c7cc470c2da361243bd1f845fca1\": container with ID starting with cff54322e0a4836432d190c57034a7703fc9c7cc470c2da361243bd1f845fca1 not found: ID does not exist" Mar 08 03:52:29.596538 master-0 kubenswrapper[7547]: I0308 03:52:29.596450 7547 scope.go:117] "RemoveContainer" containerID="2225b49859b76acb9ea5708c2c9c3e17ba7fe3b15b9def2ac0929094811fc07b" Mar 08 03:52:29.596939 master-0 kubenswrapper[7547]: I0308 03:52:29.596893 7547 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cluster-machine-approver/machine-approver-955fcfb87-zhs8n"] Mar 08 03:52:29.597046 master-0 kubenswrapper[7547]: I0308 03:52:29.597008 7547 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2225b49859b76acb9ea5708c2c9c3e17ba7fe3b15b9def2ac0929094811fc07b"} err="failed to get container status \"2225b49859b76acb9ea5708c2c9c3e17ba7fe3b15b9def2ac0929094811fc07b\": rpc error: code = NotFound desc = could not find container \"2225b49859b76acb9ea5708c2c9c3e17ba7fe3b15b9def2ac0929094811fc07b\": container with ID starting with 2225b49859b76acb9ea5708c2c9c3e17ba7fe3b15b9def2ac0929094811fc07b not found: ID does not exist" Mar 08 03:52:29.602412 master-0 kubenswrapper[7547]: I0308 03:52:29.602319 7547 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openshift-cluster-machine-approver/machine-approver-955fcfb87-zhs8n"] Mar 08 03:52:29.633979 master-0 kubenswrapper[7547]: I0308 03:52:29.633818 7547 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-754bdc9f9d-z4sdd"] Mar 08 03:52:29.634313 master-0 kubenswrapper[7547]: E0308 03:52:29.634268 7547 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19998073-79e7-4078-91b4-418a037caa38" containerName="machine-approver-controller" Mar 08 03:52:29.634395 master-0 kubenswrapper[7547]: I0308 03:52:29.634309 7547 state_mem.go:107] "Deleted CPUSet assignment" podUID="19998073-79e7-4078-91b4-418a037caa38" containerName="machine-approver-controller" Mar 08 03:52:29.634395 master-0 kubenswrapper[7547]: E0308 03:52:29.634350 7547 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19998073-79e7-4078-91b4-418a037caa38" containerName="kube-rbac-proxy" Mar 08 03:52:29.634395 master-0 kubenswrapper[7547]: I0308 03:52:29.634371 7547 state_mem.go:107] "Deleted CPUSet assignment" podUID="19998073-79e7-4078-91b4-418a037caa38" containerName="kube-rbac-proxy" Mar 08 03:52:29.634675 master-0 kubenswrapper[7547]: I0308 03:52:29.634628 7547 memory_manager.go:354] "RemoveStaleState removing state" podUID="19998073-79e7-4078-91b4-418a037caa38" containerName="kube-rbac-proxy" Mar 08 03:52:29.634753 master-0 kubenswrapper[7547]: I0308 03:52:29.634711 7547 memory_manager.go:354] "RemoveStaleState removing state" podUID="19998073-79e7-4078-91b4-418a037caa38" containerName="machine-approver-controller" Mar 08 03:52:29.636216 master-0 kubenswrapper[7547]: I0308 03:52:29.636164 7547 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-z4sdd" Mar 08 03:52:29.638893 master-0 kubenswrapper[7547]: I0308 03:52:29.638796 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 08 03:52:29.639924 master-0 kubenswrapper[7547]: I0308 03:52:29.639860 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 08 03:52:29.640227 master-0 kubenswrapper[7547]: I0308 03:52:29.640184 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 08 03:52:29.642120 master-0 kubenswrapper[7547]: I0308 03:52:29.642062 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 08 03:52:29.642393 master-0 kubenswrapper[7547]: I0308 03:52:29.642345 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 08 03:52:29.643075 master-0 kubenswrapper[7547]: I0308 03:52:29.643000 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-92fqc" Mar 08 03:52:29.824869 master-0 kubenswrapper[7547]: I0308 03:52:29.824748 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/127c3f92-8283-4179-9e40-a12dcabaaa12-auth-proxy-config\") pod \"machine-approver-754bdc9f9d-z4sdd\" (UID: \"127c3f92-8283-4179-9e40-a12dcabaaa12\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-z4sdd" Mar 08 03:52:29.825456 master-0 kubenswrapper[7547]: I0308 03:52:29.824923 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/127c3f92-8283-4179-9e40-a12dcabaaa12-config\") pod \"machine-approver-754bdc9f9d-z4sdd\" (UID: \"127c3f92-8283-4179-9e40-a12dcabaaa12\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-z4sdd" Mar 08 03:52:29.825456 master-0 kubenswrapper[7547]: I0308 03:52:29.825020 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/127c3f92-8283-4179-9e40-a12dcabaaa12-machine-approver-tls\") pod \"machine-approver-754bdc9f9d-z4sdd\" (UID: \"127c3f92-8283-4179-9e40-a12dcabaaa12\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-z4sdd" Mar 08 03:52:29.825456 master-0 kubenswrapper[7547]: I0308 03:52:29.825156 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdn9r\" (UniqueName: \"kubernetes.io/projected/127c3f92-8283-4179-9e40-a12dcabaaa12-kube-api-access-zdn9r\") pod \"machine-approver-754bdc9f9d-z4sdd\" (UID: \"127c3f92-8283-4179-9e40-a12dcabaaa12\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-z4sdd" Mar 08 03:52:29.926727 master-0 kubenswrapper[7547]: I0308 03:52:29.926547 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/127c3f92-8283-4179-9e40-a12dcabaaa12-config\") pod \"machine-approver-754bdc9f9d-z4sdd\" (UID: \"127c3f92-8283-4179-9e40-a12dcabaaa12\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-z4sdd" Mar 08 03:52:29.926727 master-0 kubenswrapper[7547]: I0308 03:52:29.926637 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/127c3f92-8283-4179-9e40-a12dcabaaa12-machine-approver-tls\") pod \"machine-approver-754bdc9f9d-z4sdd\" (UID: \"127c3f92-8283-4179-9e40-a12dcabaaa12\") " 
pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-z4sdd" Mar 08 03:52:29.927097 master-0 kubenswrapper[7547]: I0308 03:52:29.926751 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdn9r\" (UniqueName: \"kubernetes.io/projected/127c3f92-8283-4179-9e40-a12dcabaaa12-kube-api-access-zdn9r\") pod \"machine-approver-754bdc9f9d-z4sdd\" (UID: \"127c3f92-8283-4179-9e40-a12dcabaaa12\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-z4sdd" Mar 08 03:52:29.927097 master-0 kubenswrapper[7547]: I0308 03:52:29.926803 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/127c3f92-8283-4179-9e40-a12dcabaaa12-auth-proxy-config\") pod \"machine-approver-754bdc9f9d-z4sdd\" (UID: \"127c3f92-8283-4179-9e40-a12dcabaaa12\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-z4sdd" Mar 08 03:52:29.927456 master-0 kubenswrapper[7547]: I0308 03:52:29.927399 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/127c3f92-8283-4179-9e40-a12dcabaaa12-config\") pod \"machine-approver-754bdc9f9d-z4sdd\" (UID: \"127c3f92-8283-4179-9e40-a12dcabaaa12\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-z4sdd" Mar 08 03:52:29.928083 master-0 kubenswrapper[7547]: I0308 03:52:29.927999 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/127c3f92-8283-4179-9e40-a12dcabaaa12-auth-proxy-config\") pod \"machine-approver-754bdc9f9d-z4sdd\" (UID: \"127c3f92-8283-4179-9e40-a12dcabaaa12\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-z4sdd" Mar 08 03:52:29.931600 master-0 kubenswrapper[7547]: I0308 03:52:29.931545 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/127c3f92-8283-4179-9e40-a12dcabaaa12-machine-approver-tls\") pod \"machine-approver-754bdc9f9d-z4sdd\" (UID: \"127c3f92-8283-4179-9e40-a12dcabaaa12\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-z4sdd" Mar 08 03:52:29.958695 master-0 kubenswrapper[7547]: I0308 03:52:29.958624 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdn9r\" (UniqueName: \"kubernetes.io/projected/127c3f92-8283-4179-9e40-a12dcabaaa12-kube-api-access-zdn9r\") pod \"machine-approver-754bdc9f9d-z4sdd\" (UID: \"127c3f92-8283-4179-9e40-a12dcabaaa12\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-z4sdd" Mar 08 03:52:29.966575 master-0 kubenswrapper[7547]: I0308 03:52:29.966518 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-z4sdd" Mar 08 03:52:29.988777 master-0 kubenswrapper[7547]: W0308 03:52:29.988715 7547 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod127c3f92_8283_4179_9e40_a12dcabaaa12.slice/crio-e60e7e3ed5830ae078195a56d09f656f257e2641daffbfad2ebaca1e467bb613 WatchSource:0}: Error finding container e60e7e3ed5830ae078195a56d09f656f257e2641daffbfad2ebaca1e467bb613: Status 404 returned error can't find the container with id e60e7e3ed5830ae078195a56d09f656f257e2641daffbfad2ebaca1e467bb613 Mar 08 03:52:30.545800 master-0 kubenswrapper[7547]: I0308 03:52:30.545744 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-z4sdd" event={"ID":"127c3f92-8283-4179-9e40-a12dcabaaa12","Type":"ContainerStarted","Data":"62261ca410614fc5246abc7ac14619f8464f27657d0264235d202b419c622a76"} Mar 08 03:52:30.545800 master-0 kubenswrapper[7547]: I0308 03:52:30.545791 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-z4sdd" event={"ID":"127c3f92-8283-4179-9e40-a12dcabaaa12","Type":"ContainerStarted","Data":"e60e7e3ed5830ae078195a56d09f656f257e2641daffbfad2ebaca1e467bb613"} Mar 08 03:52:31.245798 master-0 kubenswrapper[7547]: I0308 03:52:31.245707 7547 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19998073-79e7-4078-91b4-418a037caa38" path="/var/lib/kubelet/pods/19998073-79e7-4078-91b4-418a037caa38/volumes" Mar 08 03:52:31.564120 master-0 kubenswrapper[7547]: I0308 03:52:31.563952 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-z4sdd" event={"ID":"127c3f92-8283-4179-9e40-a12dcabaaa12","Type":"ContainerStarted","Data":"8eae55733094f3b0776d6b5120bc7a09315bf36094aa0d778ddab4301f0dda90"} Mar 08 03:52:32.137189 master-0 kubenswrapper[7547]: I0308 03:52:32.137103 7547 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["kube-system/bootstrap-kube-controller-manager-master-0"] Mar 08 03:52:32.137622 master-0 kubenswrapper[7547]: I0308 03:52:32.137461 7547 kuberuntime_container.go:808] "Killing container with a grace period" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" containerName="cluster-policy-controller" containerID="cri-o://b00978d6151280d243ba1f6c8276b934ba5c5276b57bc3800284f048820f905f" gracePeriod=30 Mar 08 03:52:32.137622 master-0 kubenswrapper[7547]: I0308 03:52:32.137582 7547 kuberuntime_container.go:808] "Killing container with a grace period" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" containerID="cri-o://28bcae4d70566beaa13732bd5095c7d8d6a2ad6f8be2ed4c2e4b067a051fc9f1" gracePeriod=30 Mar 08 03:52:32.139677 master-0 kubenswrapper[7547]: I0308 03:52:32.139620 7547 kubelet.go:2421] "SyncLoop ADD" source="file" 
pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 08 03:52:32.140197 master-0 kubenswrapper[7547]: E0308 03:52:32.140116 7547 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 08 03:52:32.140197 master-0 kubenswrapper[7547]: I0308 03:52:32.140161 7547 state_mem.go:107] "Deleted CPUSet assignment" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 08 03:52:32.140197 master-0 kubenswrapper[7547]: E0308 03:52:32.140195 7547 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 08 03:52:32.140544 master-0 kubenswrapper[7547]: I0308 03:52:32.140216 7547 state_mem.go:107] "Deleted CPUSet assignment" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 08 03:52:32.140544 master-0 kubenswrapper[7547]: E0308 03:52:32.140255 7547 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 08 03:52:32.140544 master-0 kubenswrapper[7547]: I0308 03:52:32.140274 7547 state_mem.go:107] "Deleted CPUSet assignment" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 08 03:52:32.140544 master-0 kubenswrapper[7547]: E0308 03:52:32.140302 7547 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f78c05e1499b533b83f091333d61f045" containerName="cluster-policy-controller" Mar 08 03:52:32.140544 master-0 kubenswrapper[7547]: I0308 03:52:32.140320 7547 state_mem.go:107] "Deleted CPUSet assignment" podUID="f78c05e1499b533b83f091333d61f045" containerName="cluster-policy-controller" Mar 08 03:52:32.140889 master-0 kubenswrapper[7547]: I0308 03:52:32.140589 7547 memory_manager.go:354] "RemoveStaleState removing state" podUID="f78c05e1499b533b83f091333d61f045" 
containerName="kube-controller-manager" Mar 08 03:52:32.140889 master-0 kubenswrapper[7547]: I0308 03:52:32.140616 7547 memory_manager.go:354] "RemoveStaleState removing state" podUID="f78c05e1499b533b83f091333d61f045" containerName="cluster-policy-controller" Mar 08 03:52:32.140889 master-0 kubenswrapper[7547]: I0308 03:52:32.140638 7547 memory_manager.go:354] "RemoveStaleState removing state" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 08 03:52:32.140889 master-0 kubenswrapper[7547]: I0308 03:52:32.140671 7547 memory_manager.go:354] "RemoveStaleState removing state" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 08 03:52:32.141136 master-0 kubenswrapper[7547]: E0308 03:52:32.140941 7547 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 08 03:52:32.141136 master-0 kubenswrapper[7547]: I0308 03:52:32.140969 7547 state_mem.go:107] "Deleted CPUSet assignment" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 08 03:52:32.141270 master-0 kubenswrapper[7547]: I0308 03:52:32.141249 7547 memory_manager.go:354] "RemoveStaleState removing state" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 08 03:52:32.149276 master-0 kubenswrapper[7547]: I0308 03:52:32.149217 7547 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:52:32.266040 master-0 kubenswrapper[7547]: I0308 03:52:32.265865 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/a12e67e5b53279c862df229026c8d16c-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"a12e67e5b53279c862df229026c8d16c\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:52:32.266968 master-0 kubenswrapper[7547]: I0308 03:52:32.266098 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/a12e67e5b53279c862df229026c8d16c-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"a12e67e5b53279c862df229026c8d16c\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:52:32.334857 master-0 kubenswrapper[7547]: I0308 03:52:32.334787 7547 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:52:32.338142 master-0 kubenswrapper[7547]: I0308 03:52:32.338078 7547 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-z4sdd" podStartSLOduration=3.338063595 podStartE2EDuration="3.338063595s" podCreationTimestamp="2026-03-08 03:52:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:52:31.595286157 +0000 UTC m=+314.540970710" watchObservedRunningTime="2026-03-08 03:52:32.338063595 +0000 UTC m=+315.283748108" Mar 08 03:52:32.339712 master-0 kubenswrapper[7547]: I0308 03:52:32.339689 7547 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 08 03:52:32.364002 master-0 kubenswrapper[7547]: I0308 03:52:32.363924 7547 kubelet.go:2706] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-controller-manager-master-0" mirrorPodUID="15e9b25c-cc8d-42a7-8aef-8761fb1719cb" Mar 08 03:52:32.367791 master-0 kubenswrapper[7547]: I0308 03:52:32.367753 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/a12e67e5b53279c862df229026c8d16c-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"a12e67e5b53279c862df229026c8d16c\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:52:32.367896 master-0 kubenswrapper[7547]: I0308 03:52:32.367799 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/a12e67e5b53279c862df229026c8d16c-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"a12e67e5b53279c862df229026c8d16c\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 
03:52:32.367896 master-0 kubenswrapper[7547]: I0308 03:52:32.367836 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/a12e67e5b53279c862df229026c8d16c-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"a12e67e5b53279c862df229026c8d16c\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:52:32.368006 master-0 kubenswrapper[7547]: I0308 03:52:32.367959 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/a12e67e5b53279c862df229026c8d16c-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"a12e67e5b53279c862df229026c8d16c\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:52:32.468710 master-0 kubenswrapper[7547]: I0308 03:52:32.468637 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-config\") pod \"f78c05e1499b533b83f091333d61f045\" (UID: \"f78c05e1499b533b83f091333d61f045\") " Mar 08 03:52:32.468710 master-0 kubenswrapper[7547]: I0308 03:52:32.468690 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-ssl-certs-host\") pod \"f78c05e1499b533b83f091333d61f045\" (UID: \"f78c05e1499b533b83f091333d61f045\") " Mar 08 03:52:32.468973 master-0 kubenswrapper[7547]: I0308 03:52:32.468727 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-etc-kubernetes-cloud\") pod \"f78c05e1499b533b83f091333d61f045\" (UID: \"f78c05e1499b533b83f091333d61f045\") " Mar 08 03:52:32.468973 master-0 kubenswrapper[7547]: I0308 03:52:32.468757 7547 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-secrets\") pod \"f78c05e1499b533b83f091333d61f045\" (UID: \"f78c05e1499b533b83f091333d61f045\") " Mar 08 03:52:32.468973 master-0 kubenswrapper[7547]: I0308 03:52:32.468776 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-config" (OuterVolumeSpecName: "config") pod "f78c05e1499b533b83f091333d61f045" (UID: "f78c05e1499b533b83f091333d61f045"). InnerVolumeSpecName "config". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:52:32.468973 master-0 kubenswrapper[7547]: I0308 03:52:32.468884 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-secrets" (OuterVolumeSpecName: "secrets") pod "f78c05e1499b533b83f091333d61f045" (UID: "f78c05e1499b533b83f091333d61f045"). InnerVolumeSpecName "secrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:52:32.468973 master-0 kubenswrapper[7547]: I0308 03:52:32.468932 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-etc-kubernetes-cloud" (OuterVolumeSpecName: "etc-kubernetes-cloud") pod "f78c05e1499b533b83f091333d61f045" (UID: "f78c05e1499b533b83f091333d61f045"). InnerVolumeSpecName "etc-kubernetes-cloud". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:52:32.469178 master-0 kubenswrapper[7547]: I0308 03:52:32.468929 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-ssl-certs-host" (OuterVolumeSpecName: "ssl-certs-host") pod "f78c05e1499b533b83f091333d61f045" (UID: "f78c05e1499b533b83f091333d61f045"). InnerVolumeSpecName "ssl-certs-host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:52:32.469178 master-0 kubenswrapper[7547]: I0308 03:52:32.469065 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-logs\") pod \"f78c05e1499b533b83f091333d61f045\" (UID: \"f78c05e1499b533b83f091333d61f045\") " Mar 08 03:52:32.469260 master-0 kubenswrapper[7547]: I0308 03:52:32.469179 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-logs" (OuterVolumeSpecName: "logs") pod "f78c05e1499b533b83f091333d61f045" (UID: "f78c05e1499b533b83f091333d61f045"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:52:32.469611 master-0 kubenswrapper[7547]: I0308 03:52:32.469571 7547 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-config\") on node \"master-0\" DevicePath \"\"" Mar 08 03:52:32.469658 master-0 kubenswrapper[7547]: I0308 03:52:32.469607 7547 reconciler_common.go:293] "Volume detached for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-ssl-certs-host\") on node \"master-0\" DevicePath \"\"" Mar 08 03:52:32.469658 master-0 kubenswrapper[7547]: I0308 03:52:32.469630 7547 reconciler_common.go:293] "Volume detached for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-etc-kubernetes-cloud\") on node \"master-0\" DevicePath \"\"" Mar 08 03:52:32.469658 master-0 kubenswrapper[7547]: I0308 03:52:32.469649 7547 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-secrets\") on node \"master-0\" DevicePath \"\"" Mar 08 03:52:32.469800 master-0 kubenswrapper[7547]: I0308 03:52:32.469670 7547 
reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-logs\") on node \"master-0\" DevicePath \"\"" Mar 08 03:52:32.577125 master-0 kubenswrapper[7547]: I0308 03:52:32.577007 7547 generic.go:334] "Generic (PLEG): container finished" podID="f78c05e1499b533b83f091333d61f045" containerID="28bcae4d70566beaa13732bd5095c7d8d6a2ad6f8be2ed4c2e4b067a051fc9f1" exitCode=0 Mar 08 03:52:32.577125 master-0 kubenswrapper[7547]: I0308 03:52:32.577077 7547 generic.go:334] "Generic (PLEG): container finished" podID="f78c05e1499b533b83f091333d61f045" containerID="b00978d6151280d243ba1f6c8276b934ba5c5276b57bc3800284f048820f905f" exitCode=0 Mar 08 03:52:32.577125 master-0 kubenswrapper[7547]: I0308 03:52:32.577123 7547 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:52:32.577647 master-0 kubenswrapper[7547]: I0308 03:52:32.577151 7547 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="758a2c2e2af7455b02804a595f36886f4047114b8dbd25a8393a292e35b7254e" Mar 08 03:52:32.577647 master-0 kubenswrapper[7547]: I0308 03:52:32.577200 7547 scope.go:117] "RemoveContainer" containerID="a234e673236026ace29cfb9074b221693da101dadaf715ffecb4dfd643bf0e5f" Mar 08 03:52:32.580098 master-0 kubenswrapper[7547]: I0308 03:52:32.580031 7547 generic.go:334] "Generic (PLEG): container finished" podID="d191ff84-f4e4-4d99-8cbb-c10771e68baf" containerID="ab476bbfa4b9ad96fb2348ef6d3d71a1b60822ddb1c07515c6d2e7af7a64fce8" exitCode=0 Mar 08 03:52:32.580235 master-0 kubenswrapper[7547]: I0308 03:52:32.580148 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-master-0" event={"ID":"d191ff84-f4e4-4d99-8cbb-c10771e68baf","Type":"ContainerDied","Data":"ab476bbfa4b9ad96fb2348ef6d3d71a1b60822ddb1c07515c6d2e7af7a64fce8"} Mar 08 03:52:32.641599 master-0 
kubenswrapper[7547]: I0308 03:52:32.641496 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:52:32.682113 master-0 kubenswrapper[7547]: W0308 03:52:32.682050 7547 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda12e67e5b53279c862df229026c8d16c.slice/crio-9af5ebd3eee3c3de99e27a671d715ba12c7da929014abc4a9a4424a8fb8aad4e WatchSource:0}: Error finding container 9af5ebd3eee3c3de99e27a671d715ba12c7da929014abc4a9a4424a8fb8aad4e: Status 404 returned error can't find the container with id 9af5ebd3eee3c3de99e27a671d715ba12c7da929014abc4a9a4424a8fb8aad4e Mar 08 03:52:33.243233 master-0 kubenswrapper[7547]: I0308 03:52:33.243150 7547 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f78c05e1499b533b83f091333d61f045" path="/var/lib/kubelet/pods/f78c05e1499b533b83f091333d61f045/volumes" Mar 08 03:52:33.243725 master-0 kubenswrapper[7547]: I0308 03:52:33.243673 7547 mirror_client.go:130] "Deleting a mirror pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="" Mar 08 03:52:33.269705 master-0 kubenswrapper[7547]: I0308 03:52:33.269650 7547 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["kube-system/bootstrap-kube-controller-manager-master-0"] Mar 08 03:52:33.270066 master-0 kubenswrapper[7547]: I0308 03:52:33.269704 7547 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-controller-manager-master-0" mirrorPodUID="15e9b25c-cc8d-42a7-8aef-8761fb1719cb" Mar 08 03:52:33.275217 master-0 kubenswrapper[7547]: I0308 03:52:33.275151 7547 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["kube-system/bootstrap-kube-controller-manager-master-0"] Mar 08 03:52:33.275286 master-0 kubenswrapper[7547]: I0308 03:52:33.275223 7547 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" 
mirrorPod="kube-system/bootstrap-kube-controller-manager-master-0" mirrorPodUID="15e9b25c-cc8d-42a7-8aef-8761fb1719cb" Mar 08 03:52:33.592657 master-0 kubenswrapper[7547]: I0308 03:52:33.592544 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"a12e67e5b53279c862df229026c8d16c","Type":"ContainerStarted","Data":"d92fdcd0bd88e0c579bd858a04d7e6e266a7f72aec3885543e0de2cee51140ac"} Mar 08 03:52:33.592657 master-0 kubenswrapper[7547]: I0308 03:52:33.592650 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"a12e67e5b53279c862df229026c8d16c","Type":"ContainerStarted","Data":"f5cec83dc05dfae95933e7d5e4646a470fd6b2150eeed3507d1b115fc1dfcb34"} Mar 08 03:52:33.593084 master-0 kubenswrapper[7547]: I0308 03:52:33.592678 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"a12e67e5b53279c862df229026c8d16c","Type":"ContainerStarted","Data":"9d1a3af9468d450b8ce515e818a31e6bfe522f30f01bccb1080ebaabf3f6d3f1"} Mar 08 03:52:33.593084 master-0 kubenswrapper[7547]: I0308 03:52:33.592699 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"a12e67e5b53279c862df229026c8d16c","Type":"ContainerStarted","Data":"9af5ebd3eee3c3de99e27a671d715ba12c7da929014abc4a9a4424a8fb8aad4e"} Mar 08 03:52:33.942469 master-0 kubenswrapper[7547]: I0308 03:52:33.942353 7547 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-3-master-0" Mar 08 03:52:34.094339 master-0 kubenswrapper[7547]: I0308 03:52:34.094280 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d191ff84-f4e4-4d99-8cbb-c10771e68baf-kubelet-dir\") pod \"d191ff84-f4e4-4d99-8cbb-c10771e68baf\" (UID: \"d191ff84-f4e4-4d99-8cbb-c10771e68baf\") " Mar 08 03:52:34.094548 master-0 kubenswrapper[7547]: I0308 03:52:34.094435 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d191ff84-f4e4-4d99-8cbb-c10771e68baf-kube-api-access\") pod \"d191ff84-f4e4-4d99-8cbb-c10771e68baf\" (UID: \"d191ff84-f4e4-4d99-8cbb-c10771e68baf\") " Mar 08 03:52:34.094548 master-0 kubenswrapper[7547]: I0308 03:52:34.094470 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d191ff84-f4e4-4d99-8cbb-c10771e68baf-var-lock\") pod \"d191ff84-f4e4-4d99-8cbb-c10771e68baf\" (UID: \"d191ff84-f4e4-4d99-8cbb-c10771e68baf\") " Mar 08 03:52:34.094548 master-0 kubenswrapper[7547]: I0308 03:52:34.094481 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d191ff84-f4e4-4d99-8cbb-c10771e68baf-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d191ff84-f4e4-4d99-8cbb-c10771e68baf" (UID: "d191ff84-f4e4-4d99-8cbb-c10771e68baf"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:52:34.094683 master-0 kubenswrapper[7547]: I0308 03:52:34.094614 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d191ff84-f4e4-4d99-8cbb-c10771e68baf-var-lock" (OuterVolumeSpecName: "var-lock") pod "d191ff84-f4e4-4d99-8cbb-c10771e68baf" (UID: "d191ff84-f4e4-4d99-8cbb-c10771e68baf"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:52:34.094951 master-0 kubenswrapper[7547]: I0308 03:52:34.094927 7547 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d191ff84-f4e4-4d99-8cbb-c10771e68baf-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 03:52:34.094951 master-0 kubenswrapper[7547]: I0308 03:52:34.094943 7547 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d191ff84-f4e4-4d99-8cbb-c10771e68baf-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 08 03:52:34.098541 master-0 kubenswrapper[7547]: I0308 03:52:34.098492 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d191ff84-f4e4-4d99-8cbb-c10771e68baf-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d191ff84-f4e4-4d99-8cbb-c10771e68baf" (UID: "d191ff84-f4e4-4d99-8cbb-c10771e68baf"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:52:34.196601 master-0 kubenswrapper[7547]: I0308 03:52:34.196387 7547 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d191ff84-f4e4-4d99-8cbb-c10771e68baf-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 08 03:52:34.610141 master-0 kubenswrapper[7547]: I0308 03:52:34.609961 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"a12e67e5b53279c862df229026c8d16c","Type":"ContainerStarted","Data":"6979155324a9775c0f334fc4aa6afa070463810c3191479ea2bb2dbfe2843ea3"} Mar 08 03:52:34.612240 master-0 kubenswrapper[7547]: I0308 03:52:34.612195 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-master-0" event={"ID":"d191ff84-f4e4-4d99-8cbb-c10771e68baf","Type":"ContainerDied","Data":"490c7966b29451303b6f42ccaf5b249c853bf84b3cd4be6c7f5b23f3365fe971"} Mar 08 03:52:34.612240 master-0 kubenswrapper[7547]: I0308 03:52:34.612233 7547 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="490c7966b29451303b6f42ccaf5b249c853bf84b3cd4be6c7f5b23f3365fe971" Mar 08 03:52:34.612410 master-0 kubenswrapper[7547]: I0308 03:52:34.612306 7547 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-3-master-0" Mar 08 03:52:34.652950 master-0 kubenswrapper[7547]: I0308 03:52:34.652817 7547 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podStartSLOduration=2.6527909469999997 podStartE2EDuration="2.652790947s" podCreationTimestamp="2026-03-08 03:52:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:52:34.646350046 +0000 UTC m=+317.592034599" watchObservedRunningTime="2026-03-08 03:52:34.652790947 +0000 UTC m=+317.598475490" Mar 08 03:52:42.642424 master-0 kubenswrapper[7547]: I0308 03:52:42.642308 7547 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:52:42.642424 master-0 kubenswrapper[7547]: I0308 03:52:42.642386 7547 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:52:42.642424 master-0 kubenswrapper[7547]: I0308 03:52:42.642408 7547 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:52:42.642424 master-0 kubenswrapper[7547]: I0308 03:52:42.642428 7547 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:52:42.648611 master-0 kubenswrapper[7547]: I0308 03:52:42.648569 7547 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:52:42.649310 master-0 kubenswrapper[7547]: I0308 03:52:42.649242 7547 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:52:42.687081 master-0 kubenswrapper[7547]: I0308 03:52:42.687006 7547 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:52:43.692451 master-0 kubenswrapper[7547]: I0308 03:52:43.692382 7547 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:52:50.759705 master-0 kubenswrapper[7547]: I0308 03:52:50.759603 7547 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-mbxg8"] Mar 08 03:52:50.760467 master-0 kubenswrapper[7547]: I0308 03:52:50.759900 7547 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-mbxg8" podUID="efcc6976-2815-4d96-8efb-1333102ccfd0" containerName="cluster-cloud-controller-manager" containerID="cri-o://d3b47d413a69ffd8db28b44cea31a8bbb895696445c355f71f0fce4460815cc9" gracePeriod=30 Mar 08 03:52:50.760467 master-0 kubenswrapper[7547]: I0308 03:52:50.760148 7547 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-mbxg8" podUID="efcc6976-2815-4d96-8efb-1333102ccfd0" containerName="kube-rbac-proxy" containerID="cri-o://64fb2c8886467e00d90db0501929cd3905eae7b3b38fbfdd208d15a3524f6fcf" gracePeriod=30 Mar 08 03:52:50.760467 master-0 kubenswrapper[7547]: I0308 03:52:50.760359 7547 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-mbxg8" podUID="efcc6976-2815-4d96-8efb-1333102ccfd0" containerName="config-sync-controllers" 
containerID="cri-o://c90ce88e45cb4f3ff0fbd630cc99faceba0353930c1ba8a8de2460941554e134" gracePeriod=30 Mar 08 03:52:50.912723 master-0 kubenswrapper[7547]: I0308 03:52:50.912672 7547 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-mbxg8" Mar 08 03:52:51.040237 master-0 kubenswrapper[7547]: I0308 03:52:51.040112 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/efcc6976-2815-4d96-8efb-1333102ccfd0-cloud-controller-manager-operator-tls\") pod \"efcc6976-2815-4d96-8efb-1333102ccfd0\" (UID: \"efcc6976-2815-4d96-8efb-1333102ccfd0\") " Mar 08 03:52:51.040237 master-0 kubenswrapper[7547]: I0308 03:52:51.040169 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/efcc6976-2815-4d96-8efb-1333102ccfd0-auth-proxy-config\") pod \"efcc6976-2815-4d96-8efb-1333102ccfd0\" (UID: \"efcc6976-2815-4d96-8efb-1333102ccfd0\") " Mar 08 03:52:51.040237 master-0 kubenswrapper[7547]: I0308 03:52:51.040198 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/efcc6976-2815-4d96-8efb-1333102ccfd0-images\") pod \"efcc6976-2815-4d96-8efb-1333102ccfd0\" (UID: \"efcc6976-2815-4d96-8efb-1333102ccfd0\") " Mar 08 03:52:51.040237 master-0 kubenswrapper[7547]: I0308 03:52:51.040230 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59nzz\" (UniqueName: \"kubernetes.io/projected/efcc6976-2815-4d96-8efb-1333102ccfd0-kube-api-access-59nzz\") pod \"efcc6976-2815-4d96-8efb-1333102ccfd0\" (UID: \"efcc6976-2815-4d96-8efb-1333102ccfd0\") " Mar 08 03:52:51.040550 master-0 kubenswrapper[7547]: I0308 03:52:51.040251 7547 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/efcc6976-2815-4d96-8efb-1333102ccfd0-host-etc-kube\") pod \"efcc6976-2815-4d96-8efb-1333102ccfd0\" (UID: \"efcc6976-2815-4d96-8efb-1333102ccfd0\") " Mar 08 03:52:51.040550 master-0 kubenswrapper[7547]: I0308 03:52:51.040436 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/efcc6976-2815-4d96-8efb-1333102ccfd0-host-etc-kube" (OuterVolumeSpecName: "host-etc-kube") pod "efcc6976-2815-4d96-8efb-1333102ccfd0" (UID: "efcc6976-2815-4d96-8efb-1333102ccfd0"). InnerVolumeSpecName "host-etc-kube". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:52:51.040807 master-0 kubenswrapper[7547]: I0308 03:52:51.040743 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efcc6976-2815-4d96-8efb-1333102ccfd0-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "efcc6976-2815-4d96-8efb-1333102ccfd0" (UID: "efcc6976-2815-4d96-8efb-1333102ccfd0"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:52:51.040807 master-0 kubenswrapper[7547]: I0308 03:52:51.040757 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efcc6976-2815-4d96-8efb-1333102ccfd0-images" (OuterVolumeSpecName: "images") pod "efcc6976-2815-4d96-8efb-1333102ccfd0" (UID: "efcc6976-2815-4d96-8efb-1333102ccfd0"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:52:51.045084 master-0 kubenswrapper[7547]: I0308 03:52:51.045045 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efcc6976-2815-4d96-8efb-1333102ccfd0-cloud-controller-manager-operator-tls" (OuterVolumeSpecName: "cloud-controller-manager-operator-tls") pod "efcc6976-2815-4d96-8efb-1333102ccfd0" (UID: "efcc6976-2815-4d96-8efb-1333102ccfd0"). InnerVolumeSpecName "cloud-controller-manager-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:52:51.049088 master-0 kubenswrapper[7547]: I0308 03:52:51.049041 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efcc6976-2815-4d96-8efb-1333102ccfd0-kube-api-access-59nzz" (OuterVolumeSpecName: "kube-api-access-59nzz") pod "efcc6976-2815-4d96-8efb-1333102ccfd0" (UID: "efcc6976-2815-4d96-8efb-1333102ccfd0"). InnerVolumeSpecName "kube-api-access-59nzz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:52:51.141411 master-0 kubenswrapper[7547]: I0308 03:52:51.141373 7547 reconciler_common.go:293] "Volume detached for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/efcc6976-2815-4d96-8efb-1333102ccfd0-cloud-controller-manager-operator-tls\") on node \"master-0\" DevicePath \"\"" Mar 08 03:52:51.141411 master-0 kubenswrapper[7547]: I0308 03:52:51.141405 7547 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/efcc6976-2815-4d96-8efb-1333102ccfd0-auth-proxy-config\") on node \"master-0\" DevicePath \"\"" Mar 08 03:52:51.141411 master-0 kubenswrapper[7547]: I0308 03:52:51.141419 7547 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/efcc6976-2815-4d96-8efb-1333102ccfd0-images\") on node \"master-0\" DevicePath \"\"" Mar 08 03:52:51.141411 master-0 kubenswrapper[7547]: I0308 
03:52:51.141428 7547 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59nzz\" (UniqueName: \"kubernetes.io/projected/efcc6976-2815-4d96-8efb-1333102ccfd0-kube-api-access-59nzz\") on node \"master-0\" DevicePath \"\"" Mar 08 03:52:51.141411 master-0 kubenswrapper[7547]: I0308 03:52:51.141438 7547 reconciler_common.go:293] "Volume detached for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/efcc6976-2815-4d96-8efb-1333102ccfd0-host-etc-kube\") on node \"master-0\" DevicePath \"\"" Mar 08 03:52:51.749434 master-0 kubenswrapper[7547]: I0308 03:52:51.749365 7547 generic.go:334] "Generic (PLEG): container finished" podID="efcc6976-2815-4d96-8efb-1333102ccfd0" containerID="64fb2c8886467e00d90db0501929cd3905eae7b3b38fbfdd208d15a3524f6fcf" exitCode=0 Mar 08 03:52:51.749434 master-0 kubenswrapper[7547]: I0308 03:52:51.749406 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-mbxg8" event={"ID":"efcc6976-2815-4d96-8efb-1333102ccfd0","Type":"ContainerDied","Data":"64fb2c8886467e00d90db0501929cd3905eae7b3b38fbfdd208d15a3524f6fcf"} Mar 08 03:52:51.749670 master-0 kubenswrapper[7547]: I0308 03:52:51.749450 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-mbxg8" event={"ID":"efcc6976-2815-4d96-8efb-1333102ccfd0","Type":"ContainerDied","Data":"c90ce88e45cb4f3ff0fbd630cc99faceba0353930c1ba8a8de2460941554e134"} Mar 08 03:52:51.749670 master-0 kubenswrapper[7547]: I0308 03:52:51.749474 7547 scope.go:117] "RemoveContainer" containerID="64fb2c8886467e00d90db0501929cd3905eae7b3b38fbfdd208d15a3524f6fcf" Mar 08 03:52:51.749670 master-0 kubenswrapper[7547]: I0308 03:52:51.749534 7547 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-mbxg8" Mar 08 03:52:51.749670 master-0 kubenswrapper[7547]: I0308 03:52:51.749417 7547 generic.go:334] "Generic (PLEG): container finished" podID="efcc6976-2815-4d96-8efb-1333102ccfd0" containerID="c90ce88e45cb4f3ff0fbd630cc99faceba0353930c1ba8a8de2460941554e134" exitCode=0 Mar 08 03:52:51.749670 master-0 kubenswrapper[7547]: I0308 03:52:51.749658 7547 generic.go:334] "Generic (PLEG): container finished" podID="efcc6976-2815-4d96-8efb-1333102ccfd0" containerID="d3b47d413a69ffd8db28b44cea31a8bbb895696445c355f71f0fce4460815cc9" exitCode=0 Mar 08 03:52:51.749808 master-0 kubenswrapper[7547]: I0308 03:52:51.749678 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-mbxg8" event={"ID":"efcc6976-2815-4d96-8efb-1333102ccfd0","Type":"ContainerDied","Data":"d3b47d413a69ffd8db28b44cea31a8bbb895696445c355f71f0fce4460815cc9"} Mar 08 03:52:51.749808 master-0 kubenswrapper[7547]: I0308 03:52:51.749693 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-mbxg8" event={"ID":"efcc6976-2815-4d96-8efb-1333102ccfd0","Type":"ContainerDied","Data":"75f4eb3be0a13ebaf0adfb5408aab17726c88687c273eeef67438587a2bdc267"} Mar 08 03:52:51.781730 master-0 kubenswrapper[7547]: I0308 03:52:51.781680 7547 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-mbxg8"] Mar 08 03:52:51.782262 master-0 kubenswrapper[7547]: I0308 03:52:51.782194 7547 scope.go:117] "RemoveContainer" containerID="c90ce88e45cb4f3ff0fbd630cc99faceba0353930c1ba8a8de2460941554e134" Mar 08 03:52:51.785837 master-0 kubenswrapper[7547]: I0308 03:52:51.785770 7547 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-mbxg8"] Mar 08 03:52:51.826276 master-0 kubenswrapper[7547]: I0308 03:52:51.824601 7547 scope.go:117] "RemoveContainer" containerID="d3b47d413a69ffd8db28b44cea31a8bbb895696445c355f71f0fce4460815cc9" Mar 08 03:52:51.832920 master-0 kubenswrapper[7547]: I0308 03:52:51.832875 7547 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-f89bv"] Mar 08 03:52:51.833930 master-0 kubenswrapper[7547]: E0308 03:52:51.833203 7547 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efcc6976-2815-4d96-8efb-1333102ccfd0" containerName="cluster-cloud-controller-manager" Mar 08 03:52:51.833930 master-0 kubenswrapper[7547]: I0308 03:52:51.833223 7547 state_mem.go:107] "Deleted CPUSet assignment" podUID="efcc6976-2815-4d96-8efb-1333102ccfd0" containerName="cluster-cloud-controller-manager" Mar 08 03:52:51.833930 master-0 kubenswrapper[7547]: E0308 03:52:51.833239 7547 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d191ff84-f4e4-4d99-8cbb-c10771e68baf" containerName="installer" Mar 08 03:52:51.833930 master-0 kubenswrapper[7547]: I0308 03:52:51.833248 7547 state_mem.go:107] "Deleted CPUSet assignment" podUID="d191ff84-f4e4-4d99-8cbb-c10771e68baf" containerName="installer" Mar 08 03:52:51.833930 master-0 kubenswrapper[7547]: E0308 03:52:51.833265 7547 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efcc6976-2815-4d96-8efb-1333102ccfd0" containerName="kube-rbac-proxy" Mar 08 03:52:51.833930 master-0 kubenswrapper[7547]: I0308 03:52:51.833273 7547 state_mem.go:107] "Deleted CPUSet assignment" podUID="efcc6976-2815-4d96-8efb-1333102ccfd0" containerName="kube-rbac-proxy" Mar 08 03:52:51.833930 master-0 kubenswrapper[7547]: E0308 03:52:51.833287 7547 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="efcc6976-2815-4d96-8efb-1333102ccfd0" containerName="config-sync-controllers" Mar 08 03:52:51.833930 master-0 kubenswrapper[7547]: I0308 03:52:51.833294 7547 state_mem.go:107] "Deleted CPUSet assignment" podUID="efcc6976-2815-4d96-8efb-1333102ccfd0" containerName="config-sync-controllers" Mar 08 03:52:51.834768 master-0 kubenswrapper[7547]: I0308 03:52:51.834732 7547 memory_manager.go:354] "RemoveStaleState removing state" podUID="efcc6976-2815-4d96-8efb-1333102ccfd0" containerName="cluster-cloud-controller-manager" Mar 08 03:52:51.834768 master-0 kubenswrapper[7547]: I0308 03:52:51.834757 7547 memory_manager.go:354] "RemoveStaleState removing state" podUID="efcc6976-2815-4d96-8efb-1333102ccfd0" containerName="config-sync-controllers" Mar 08 03:52:51.834768 master-0 kubenswrapper[7547]: I0308 03:52:51.834765 7547 memory_manager.go:354] "RemoveStaleState removing state" podUID="d191ff84-f4e4-4d99-8cbb-c10771e68baf" containerName="installer" Mar 08 03:52:51.834768 master-0 kubenswrapper[7547]: I0308 03:52:51.834775 7547 memory_manager.go:354] "RemoveStaleState removing state" podUID="efcc6976-2815-4d96-8efb-1333102ccfd0" containerName="kube-rbac-proxy" Mar 08 03:52:51.835798 master-0 kubenswrapper[7547]: I0308 03:52:51.835762 7547 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-f89bv" Mar 08 03:52:51.838550 master-0 kubenswrapper[7547]: I0308 03:52:51.838370 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cluster-cloud-controller-manager-dockercfg-lnbcj" Mar 08 03:52:51.839491 master-0 kubenswrapper[7547]: I0308 03:52:51.838887 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt" Mar 08 03:52:51.839491 master-0 kubenswrapper[7547]: I0308 03:52:51.838935 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images" Mar 08 03:52:51.841975 master-0 kubenswrapper[7547]: I0308 03:52:51.840032 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy" Mar 08 03:52:51.841975 master-0 kubenswrapper[7547]: I0308 03:52:51.840032 7547 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls" Mar 08 03:52:51.841975 master-0 kubenswrapper[7547]: I0308 03:52:51.840232 7547 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt" Mar 08 03:52:51.847214 master-0 kubenswrapper[7547]: I0308 03:52:51.847129 7547 scope.go:117] "RemoveContainer" containerID="64fb2c8886467e00d90db0501929cd3905eae7b3b38fbfdd208d15a3524f6fcf" Mar 08 03:52:51.849550 master-0 kubenswrapper[7547]: E0308 03:52:51.849482 7547 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64fb2c8886467e00d90db0501929cd3905eae7b3b38fbfdd208d15a3524f6fcf\": container with ID starting with 
64fb2c8886467e00d90db0501929cd3905eae7b3b38fbfdd208d15a3524f6fcf not found: ID does not exist" containerID="64fb2c8886467e00d90db0501929cd3905eae7b3b38fbfdd208d15a3524f6fcf" Mar 08 03:52:51.849666 master-0 kubenswrapper[7547]: I0308 03:52:51.849563 7547 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64fb2c8886467e00d90db0501929cd3905eae7b3b38fbfdd208d15a3524f6fcf"} err="failed to get container status \"64fb2c8886467e00d90db0501929cd3905eae7b3b38fbfdd208d15a3524f6fcf\": rpc error: code = NotFound desc = could not find container \"64fb2c8886467e00d90db0501929cd3905eae7b3b38fbfdd208d15a3524f6fcf\": container with ID starting with 64fb2c8886467e00d90db0501929cd3905eae7b3b38fbfdd208d15a3524f6fcf not found: ID does not exist" Mar 08 03:52:51.849666 master-0 kubenswrapper[7547]: I0308 03:52:51.849606 7547 scope.go:117] "RemoveContainer" containerID="c90ce88e45cb4f3ff0fbd630cc99faceba0353930c1ba8a8de2460941554e134" Mar 08 03:52:51.850398 master-0 kubenswrapper[7547]: E0308 03:52:51.850342 7547 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c90ce88e45cb4f3ff0fbd630cc99faceba0353930c1ba8a8de2460941554e134\": container with ID starting with c90ce88e45cb4f3ff0fbd630cc99faceba0353930c1ba8a8de2460941554e134 not found: ID does not exist" containerID="c90ce88e45cb4f3ff0fbd630cc99faceba0353930c1ba8a8de2460941554e134" Mar 08 03:52:51.850523 master-0 kubenswrapper[7547]: I0308 03:52:51.850393 7547 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c90ce88e45cb4f3ff0fbd630cc99faceba0353930c1ba8a8de2460941554e134"} err="failed to get container status \"c90ce88e45cb4f3ff0fbd630cc99faceba0353930c1ba8a8de2460941554e134\": rpc error: code = NotFound desc = could not find container \"c90ce88e45cb4f3ff0fbd630cc99faceba0353930c1ba8a8de2460941554e134\": container with ID starting with 
c90ce88e45cb4f3ff0fbd630cc99faceba0353930c1ba8a8de2460941554e134 not found: ID does not exist" Mar 08 03:52:51.850523 master-0 kubenswrapper[7547]: I0308 03:52:51.850422 7547 scope.go:117] "RemoveContainer" containerID="d3b47d413a69ffd8db28b44cea31a8bbb895696445c355f71f0fce4460815cc9" Mar 08 03:52:51.851059 master-0 kubenswrapper[7547]: E0308 03:52:51.850967 7547 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3b47d413a69ffd8db28b44cea31a8bbb895696445c355f71f0fce4460815cc9\": container with ID starting with d3b47d413a69ffd8db28b44cea31a8bbb895696445c355f71f0fce4460815cc9 not found: ID does not exist" containerID="d3b47d413a69ffd8db28b44cea31a8bbb895696445c355f71f0fce4460815cc9" Mar 08 03:52:51.851148 master-0 kubenswrapper[7547]: I0308 03:52:51.851059 7547 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3b47d413a69ffd8db28b44cea31a8bbb895696445c355f71f0fce4460815cc9"} err="failed to get container status \"d3b47d413a69ffd8db28b44cea31a8bbb895696445c355f71f0fce4460815cc9\": rpc error: code = NotFound desc = could not find container \"d3b47d413a69ffd8db28b44cea31a8bbb895696445c355f71f0fce4460815cc9\": container with ID starting with d3b47d413a69ffd8db28b44cea31a8bbb895696445c355f71f0fce4460815cc9 not found: ID does not exist" Mar 08 03:52:51.851148 master-0 kubenswrapper[7547]: I0308 03:52:51.851092 7547 scope.go:117] "RemoveContainer" containerID="64fb2c8886467e00d90db0501929cd3905eae7b3b38fbfdd208d15a3524f6fcf" Mar 08 03:52:51.851682 master-0 kubenswrapper[7547]: I0308 03:52:51.851571 7547 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64fb2c8886467e00d90db0501929cd3905eae7b3b38fbfdd208d15a3524f6fcf"} err="failed to get container status \"64fb2c8886467e00d90db0501929cd3905eae7b3b38fbfdd208d15a3524f6fcf\": rpc error: code = NotFound desc = could not find container 
\"64fb2c8886467e00d90db0501929cd3905eae7b3b38fbfdd208d15a3524f6fcf\": container with ID starting with 64fb2c8886467e00d90db0501929cd3905eae7b3b38fbfdd208d15a3524f6fcf not found: ID does not exist" Mar 08 03:52:51.851682 master-0 kubenswrapper[7547]: I0308 03:52:51.851668 7547 scope.go:117] "RemoveContainer" containerID="c90ce88e45cb4f3ff0fbd630cc99faceba0353930c1ba8a8de2460941554e134" Mar 08 03:52:51.852491 master-0 kubenswrapper[7547]: I0308 03:52:51.852426 7547 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c90ce88e45cb4f3ff0fbd630cc99faceba0353930c1ba8a8de2460941554e134"} err="failed to get container status \"c90ce88e45cb4f3ff0fbd630cc99faceba0353930c1ba8a8de2460941554e134\": rpc error: code = NotFound desc = could not find container \"c90ce88e45cb4f3ff0fbd630cc99faceba0353930c1ba8a8de2460941554e134\": container with ID starting with c90ce88e45cb4f3ff0fbd630cc99faceba0353930c1ba8a8de2460941554e134 not found: ID does not exist" Mar 08 03:52:51.852603 master-0 kubenswrapper[7547]: I0308 03:52:51.852518 7547 scope.go:117] "RemoveContainer" containerID="d3b47d413a69ffd8db28b44cea31a8bbb895696445c355f71f0fce4460815cc9" Mar 08 03:52:51.853579 master-0 kubenswrapper[7547]: I0308 03:52:51.853515 7547 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3b47d413a69ffd8db28b44cea31a8bbb895696445c355f71f0fce4460815cc9"} err="failed to get container status \"d3b47d413a69ffd8db28b44cea31a8bbb895696445c355f71f0fce4460815cc9\": rpc error: code = NotFound desc = could not find container \"d3b47d413a69ffd8db28b44cea31a8bbb895696445c355f71f0fce4460815cc9\": container with ID starting with d3b47d413a69ffd8db28b44cea31a8bbb895696445c355f71f0fce4460815cc9 not found: ID does not exist" Mar 08 03:52:51.853579 master-0 kubenswrapper[7547]: I0308 03:52:51.853560 7547 scope.go:117] "RemoveContainer" containerID="64fb2c8886467e00d90db0501929cd3905eae7b3b38fbfdd208d15a3524f6fcf" Mar 08 
03:52:51.857143 master-0 kubenswrapper[7547]: I0308 03:52:51.855575 7547 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64fb2c8886467e00d90db0501929cd3905eae7b3b38fbfdd208d15a3524f6fcf"} err="failed to get container status \"64fb2c8886467e00d90db0501929cd3905eae7b3b38fbfdd208d15a3524f6fcf\": rpc error: code = NotFound desc = could not find container \"64fb2c8886467e00d90db0501929cd3905eae7b3b38fbfdd208d15a3524f6fcf\": container with ID starting with 64fb2c8886467e00d90db0501929cd3905eae7b3b38fbfdd208d15a3524f6fcf not found: ID does not exist" Mar 08 03:52:51.857143 master-0 kubenswrapper[7547]: I0308 03:52:51.855654 7547 scope.go:117] "RemoveContainer" containerID="c90ce88e45cb4f3ff0fbd630cc99faceba0353930c1ba8a8de2460941554e134" Mar 08 03:52:51.858610 master-0 kubenswrapper[7547]: I0308 03:52:51.858549 7547 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c90ce88e45cb4f3ff0fbd630cc99faceba0353930c1ba8a8de2460941554e134"} err="failed to get container status \"c90ce88e45cb4f3ff0fbd630cc99faceba0353930c1ba8a8de2460941554e134\": rpc error: code = NotFound desc = could not find container \"c90ce88e45cb4f3ff0fbd630cc99faceba0353930c1ba8a8de2460941554e134\": container with ID starting with c90ce88e45cb4f3ff0fbd630cc99faceba0353930c1ba8a8de2460941554e134 not found: ID does not exist" Mar 08 03:52:51.858862 master-0 kubenswrapper[7547]: I0308 03:52:51.858625 7547 scope.go:117] "RemoveContainer" containerID="d3b47d413a69ffd8db28b44cea31a8bbb895696445c355f71f0fce4460815cc9" Mar 08 03:52:51.859279 master-0 kubenswrapper[7547]: I0308 03:52:51.859197 7547 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3b47d413a69ffd8db28b44cea31a8bbb895696445c355f71f0fce4460815cc9"} err="failed to get container status \"d3b47d413a69ffd8db28b44cea31a8bbb895696445c355f71f0fce4460815cc9\": rpc error: code = NotFound desc = could not find container 
\"d3b47d413a69ffd8db28b44cea31a8bbb895696445c355f71f0fce4460815cc9\": container with ID starting with d3b47d413a69ffd8db28b44cea31a8bbb895696445c355f71f0fce4460815cc9 not found: ID does not exist" Mar 08 03:52:51.952023 master-0 kubenswrapper[7547]: I0308 03:52:51.951936 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/33ed331b-89e9-45f8-ab3c-4533a77cc7b6-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-f89bv\" (UID: \"33ed331b-89e9-45f8-ab3c-4533a77cc7b6\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-f89bv" Mar 08 03:52:51.952511 master-0 kubenswrapper[7547]: I0308 03:52:51.952463 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/33ed331b-89e9-45f8-ab3c-4533a77cc7b6-images\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-f89bv\" (UID: \"33ed331b-89e9-45f8-ab3c-4533a77cc7b6\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-f89bv" Mar 08 03:52:51.952594 master-0 kubenswrapper[7547]: I0308 03:52:51.952566 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/33ed331b-89e9-45f8-ab3c-4533a77cc7b6-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-f89bv\" (UID: \"33ed331b-89e9-45f8-ab3c-4533a77cc7b6\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-f89bv" Mar 08 03:52:51.953222 master-0 kubenswrapper[7547]: I0308 03:52:51.952722 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/33ed331b-89e9-45f8-ab3c-4533a77cc7b6-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-f89bv\" (UID: \"33ed331b-89e9-45f8-ab3c-4533a77cc7b6\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-f89bv" Mar 08 03:52:51.953222 master-0 kubenswrapper[7547]: I0308 03:52:51.952947 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmsj5\" (UniqueName: \"kubernetes.io/projected/33ed331b-89e9-45f8-ab3c-4533a77cc7b6-kube-api-access-hmsj5\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-f89bv\" (UID: \"33ed331b-89e9-45f8-ab3c-4533a77cc7b6\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-f89bv" Mar 08 03:52:52.054280 master-0 kubenswrapper[7547]: I0308 03:52:52.054151 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmsj5\" (UniqueName: \"kubernetes.io/projected/33ed331b-89e9-45f8-ab3c-4533a77cc7b6-kube-api-access-hmsj5\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-f89bv\" (UID: \"33ed331b-89e9-45f8-ab3c-4533a77cc7b6\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-f89bv" Mar 08 03:52:52.054280 master-0 kubenswrapper[7547]: I0308 03:52:52.054261 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/33ed331b-89e9-45f8-ab3c-4533a77cc7b6-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-f89bv\" (UID: \"33ed331b-89e9-45f8-ab3c-4533a77cc7b6\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-f89bv" Mar 08 03:52:52.054495 master-0 kubenswrapper[7547]: I0308 03:52:52.054306 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"images\" (UniqueName: \"kubernetes.io/configmap/33ed331b-89e9-45f8-ab3c-4533a77cc7b6-images\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-f89bv\" (UID: \"33ed331b-89e9-45f8-ab3c-4533a77cc7b6\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-f89bv" Mar 08 03:52:52.054495 master-0 kubenswrapper[7547]: I0308 03:52:52.054337 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/33ed331b-89e9-45f8-ab3c-4533a77cc7b6-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-f89bv\" (UID: \"33ed331b-89e9-45f8-ab3c-4533a77cc7b6\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-f89bv" Mar 08 03:52:52.054495 master-0 kubenswrapper[7547]: I0308 03:52:52.054356 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/33ed331b-89e9-45f8-ab3c-4533a77cc7b6-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-f89bv\" (UID: \"33ed331b-89e9-45f8-ab3c-4533a77cc7b6\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-f89bv" Mar 08 03:52:52.054495 master-0 kubenswrapper[7547]: I0308 03:52:52.054435 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/33ed331b-89e9-45f8-ab3c-4533a77cc7b6-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-f89bv\" (UID: \"33ed331b-89e9-45f8-ab3c-4533a77cc7b6\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-f89bv" Mar 08 03:52:52.055108 master-0 kubenswrapper[7547]: I0308 03:52:52.055086 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/33ed331b-89e9-45f8-ab3c-4533a77cc7b6-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-f89bv\" (UID: \"33ed331b-89e9-45f8-ab3c-4533a77cc7b6\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-f89bv" Mar 08 03:52:52.055590 master-0 kubenswrapper[7547]: I0308 03:52:52.055544 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/33ed331b-89e9-45f8-ab3c-4533a77cc7b6-images\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-f89bv\" (UID: \"33ed331b-89e9-45f8-ab3c-4533a77cc7b6\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-f89bv" Mar 08 03:52:52.066574 master-0 kubenswrapper[7547]: I0308 03:52:52.066538 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/33ed331b-89e9-45f8-ab3c-4533a77cc7b6-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-f89bv\" (UID: \"33ed331b-89e9-45f8-ab3c-4533a77cc7b6\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-f89bv" Mar 08 03:52:52.072845 master-0 kubenswrapper[7547]: I0308 03:52:52.072776 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmsj5\" (UniqueName: \"kubernetes.io/projected/33ed331b-89e9-45f8-ab3c-4533a77cc7b6-kube-api-access-hmsj5\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-f89bv\" (UID: \"33ed331b-89e9-45f8-ab3c-4533a77cc7b6\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-f89bv" Mar 08 03:52:52.177769 master-0 kubenswrapper[7547]: I0308 03:52:52.177704 7547 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-f89bv" Mar 08 03:52:52.200306 master-0 kubenswrapper[7547]: W0308 03:52:52.200227 7547 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33ed331b_89e9_45f8_ab3c_4533a77cc7b6.slice/crio-db3e211f71e6d36cf104a5781a02c4e98905e1bbc8fec6cc754858473d74a96c WatchSource:0}: Error finding container db3e211f71e6d36cf104a5781a02c4e98905e1bbc8fec6cc754858473d74a96c: Status 404 returned error can't find the container with id db3e211f71e6d36cf104a5781a02c4e98905e1bbc8fec6cc754858473d74a96c Mar 08 03:52:52.763773 master-0 kubenswrapper[7547]: I0308 03:52:52.763687 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-f89bv" event={"ID":"33ed331b-89e9-45f8-ab3c-4533a77cc7b6","Type":"ContainerStarted","Data":"91401e988995822ef2518e319b542e37936c2f35288b0309adc7d3f08edb68f5"} Mar 08 03:52:52.763977 master-0 kubenswrapper[7547]: I0308 03:52:52.763765 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-f89bv" event={"ID":"33ed331b-89e9-45f8-ab3c-4533a77cc7b6","Type":"ContainerStarted","Data":"db3e211f71e6d36cf104a5781a02c4e98905e1bbc8fec6cc754858473d74a96c"} Mar 08 03:52:53.245171 master-0 kubenswrapper[7547]: I0308 03:52:53.245077 7547 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efcc6976-2815-4d96-8efb-1333102ccfd0" path="/var/lib/kubelet/pods/efcc6976-2815-4d96-8efb-1333102ccfd0/volumes" Mar 08 03:52:53.776964 master-0 kubenswrapper[7547]: I0308 03:52:53.776858 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-f89bv" 
event={"ID":"33ed331b-89e9-45f8-ab3c-4533a77cc7b6","Type":"ContainerStarted","Data":"33606e9eb37da9d182b32bf13cca73f6fa2418440a52ef4405c6cab7916eb9eb"} Mar 08 03:52:53.776964 master-0 kubenswrapper[7547]: I0308 03:52:53.776932 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-f89bv" event={"ID":"33ed331b-89e9-45f8-ab3c-4533a77cc7b6","Type":"ContainerStarted","Data":"310c1bcf18c66c58fd78fe7a8197fd35c5d130edf5caca232ed46868d02e501d"} Mar 08 03:52:53.805897 master-0 kubenswrapper[7547]: I0308 03:52:53.805772 7547 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-f89bv" podStartSLOduration=2.805745407 podStartE2EDuration="2.805745407s" podCreationTimestamp="2026-03-08 03:52:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:52:53.803330171 +0000 UTC m=+336.749014714" watchObservedRunningTime="2026-03-08 03:52:53.805745407 +0000 UTC m=+336.751429960" Mar 08 03:52:58.636259 master-0 kubenswrapper[7547]: I0308 03:52:58.636153 7547 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Mar 08 03:52:58.637277 master-0 kubenswrapper[7547]: I0308 03:52:58.637227 7547 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:52:58.638964 master-0 kubenswrapper[7547]: I0308 03:52:58.637507 7547 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"] Mar 08 03:52:58.638964 master-0 kubenswrapper[7547]: I0308 03:52:58.637947 7547 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="kube-apiserver" containerID="cri-o://a87790639e12c044cf9f716dfd6c742c89b97ffde357a755afcc44a38db6328d" gracePeriod=15 Mar 08 03:52:58.638964 master-0 kubenswrapper[7547]: I0308 03:52:58.638024 7547 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://512d196861598af69e92dd9aa3d25b53c40e97b92520ddd9df4d73c8065df7e5" gracePeriod=15 Mar 08 03:52:58.643638 master-0 kubenswrapper[7547]: I0308 03:52:58.642728 7547 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Mar 08 03:52:58.643638 master-0 kubenswrapper[7547]: E0308 03:52:58.643331 7547 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="kube-apiserver" Mar 08 03:52:58.643638 master-0 kubenswrapper[7547]: I0308 03:52:58.643364 7547 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="kube-apiserver" Mar 08 03:52:58.643638 master-0 kubenswrapper[7547]: E0308 03:52:58.643395 7547 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="kube-apiserver-insecure-readyz" Mar 08 03:52:58.643638 master-0 kubenswrapper[7547]: I0308 03:52:58.643411 7547 
state_mem.go:107] "Deleted CPUSet assignment" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="kube-apiserver-insecure-readyz" Mar 08 03:52:58.643638 master-0 kubenswrapper[7547]: E0308 03:52:58.643582 7547 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="setup" Mar 08 03:52:58.644290 master-0 kubenswrapper[7547]: I0308 03:52:58.643661 7547 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="setup" Mar 08 03:52:58.644290 master-0 kubenswrapper[7547]: I0308 03:52:58.644017 7547 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="kube-apiserver-insecure-readyz" Mar 08 03:52:58.644290 master-0 kubenswrapper[7547]: I0308 03:52:58.644043 7547 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="kube-apiserver" Mar 08 03:52:58.644290 master-0 kubenswrapper[7547]: I0308 03:52:58.644072 7547 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="setup" Mar 08 03:52:58.648257 master-0 kubenswrapper[7547]: I0308 03:52:58.647942 7547 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:52:58.692991 master-0 kubenswrapper[7547]: I0308 03:52:58.691733 7547 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Mar 08 03:52:58.723384 master-0 kubenswrapper[7547]: E0308 03:52:58.723277 7547 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:52:58.766041 master-0 kubenswrapper[7547]: I0308 03:52:58.765973 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:52:58.766310 master-0 kubenswrapper[7547]: I0308 03:52:58.766281 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:52:58.766546 master-0 kubenswrapper[7547]: I0308 03:52:58.766514 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:52:58.766748 master-0 kubenswrapper[7547]: I0308 03:52:58.766720 7547 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:52:58.767180 master-0 kubenswrapper[7547]: I0308 03:52:58.767150 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:52:58.767380 master-0 kubenswrapper[7547]: I0308 03:52:58.767351 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:52:58.767544 master-0 kubenswrapper[7547]: I0308 03:52:58.767518 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:52:58.767719 master-0 kubenswrapper[7547]: I0308 03:52:58.767692 7547 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:52:58.819536 master-0 kubenswrapper[7547]: I0308 03:52:58.819444 7547 generic.go:334] "Generic (PLEG): container finished" podID="9c95709c-c3cb-46fb-afe7-626c8013f3c6" containerID="5f3bdf25350f3735e74258b774375768d3fdf1215b280e2ea275e2a1b21b5161" exitCode=0 Mar 08 03:52:58.819536 master-0 kubenswrapper[7547]: I0308 03:52:58.819529 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" event={"ID":"9c95709c-c3cb-46fb-afe7-626c8013f3c6","Type":"ContainerDied","Data":"5f3bdf25350f3735e74258b774375768d3fdf1215b280e2ea275e2a1b21b5161"} Mar 08 03:52:58.820713 master-0 kubenswrapper[7547]: I0308 03:52:58.820626 7547 status_manager.go:851] "Failed to get status for pod" podUID="f417e14665db2ffffa887ce21c9ff0ed" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:52:58.821615 master-0 kubenswrapper[7547]: I0308 03:52:58.821537 7547 status_manager.go:851] "Failed to get status for pod" podUID="9c95709c-c3cb-46fb-afe7-626c8013f3c6" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-1-retry-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:52:58.822663 master-0 kubenswrapper[7547]: I0308 03:52:58.822604 7547 generic.go:334] "Generic (PLEG): container finished" podID="5f77c8e18b751d90bc0dfe2d4e304050" containerID="512d196861598af69e92dd9aa3d25b53c40e97b92520ddd9df4d73c8065df7e5" exitCode=0 Mar 08 03:52:58.869292 master-0 kubenswrapper[7547]: I0308 03:52:58.869240 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: 
\"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:52:58.869292 master-0 kubenswrapper[7547]: I0308 03:52:58.869310 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:52:58.869648 master-0 kubenswrapper[7547]: I0308 03:52:58.869431 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:52:58.869648 master-0 kubenswrapper[7547]: I0308 03:52:58.869528 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:52:58.869648 master-0 kubenswrapper[7547]: I0308 03:52:58.869620 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:52:58.869995 master-0 kubenswrapper[7547]: I0308 03:52:58.869702 7547 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:52:58.869995 master-0 kubenswrapper[7547]: I0308 03:52:58.869710 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:52:58.869995 master-0 kubenswrapper[7547]: I0308 03:52:58.869754 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:52:58.869995 master-0 kubenswrapper[7547]: I0308 03:52:58.869855 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:52:58.869995 master-0 kubenswrapper[7547]: I0308 03:52:58.869898 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:52:58.869995 master-0 kubenswrapper[7547]: I0308 03:52:58.869912 7547 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:52:58.869995 master-0 kubenswrapper[7547]: I0308 03:52:58.869945 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:52:58.869995 master-0 kubenswrapper[7547]: I0308 03:52:58.869983 7547 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:52:58.870753 master-0 kubenswrapper[7547]: I0308 03:52:58.870043 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:52:58.870753 master-0 kubenswrapper[7547]: I0308 03:52:58.870127 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:52:58.870753 master-0 
kubenswrapper[7547]: I0308 03:52:58.870165 7547 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:52:58.981475 master-0 kubenswrapper[7547]: I0308 03:52:58.981374 7547 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:52:58.989929 master-0 kubenswrapper[7547]: I0308 03:52:58.989863 7547 patch_prober.go:28] interesting pod/bootstrap-kube-apiserver-master-0 container/kube-apiserver namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.32.10:6443/readyz\": dial tcp 192.168.32.10:6443: connect: connection refused" start-of-body= Mar 08 03:52:58.990057 master-0 kubenswrapper[7547]: I0308 03:52:58.989957 7547 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.32.10:6443/readyz\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:52:58.991187 master-0 kubenswrapper[7547]: E0308 03:52:58.991001 7547 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event=< Mar 08 03:52:58.991187 master-0 kubenswrapper[7547]: &Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189ac15621128abf openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Readiness probe error: Get "https://192.168.32.10:6443/readyz": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:52:58.991187 master-0 kubenswrapper[7547]: body: Mar 08 03:52:58.991187 master-0 kubenswrapper[7547]: ,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:52:58.989923007 +0000 UTC m=+341.935607560,LastTimestamp:2026-03-08 03:52:58.989923007 +0000 UTC m=+341.935607560,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,} Mar 08 03:52:58.991187 master-0 kubenswrapper[7547]: > Mar 08 03:52:59.023254 master-0 kubenswrapper[7547]: W0308 03:52:59.023184 7547 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf417e14665db2ffffa887ce21c9ff0ed.slice/crio-e169486121bbc52c7ca877ad3d815dc4a35f6b8ee220e0fd43b9661c26e26d92 WatchSource:0}: Error finding container e169486121bbc52c7ca877ad3d815dc4a35f6b8ee220e0fd43b9661c26e26d92: Status 404 returned error can't find the container with id e169486121bbc52c7ca877ad3d815dc4a35f6b8ee220e0fd43b9661c26e26d92 Mar 08 03:52:59.024130 master-0 kubenswrapper[7547]: I0308 03:52:59.024015 7547 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:52:59.082857 master-0 kubenswrapper[7547]: W0308 03:52:59.082757 7547 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdcecc61ff5eeb08bd2a3ac12599e4f9.slice/crio-65320fce1a0608c5e233ad7039ccb30dfdee6ba6adad349424d74cf44c08e2db WatchSource:0}: Error finding container 65320fce1a0608c5e233ad7039ccb30dfdee6ba6adad349424d74cf44c08e2db: Status 404 returned error can't find the container with id 65320fce1a0608c5e233ad7039ccb30dfdee6ba6adad349424d74cf44c08e2db Mar 08 03:52:59.845672 master-0 kubenswrapper[7547]: I0308 03:52:59.845590 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"f417e14665db2ffffa887ce21c9ff0ed","Type":"ContainerStarted","Data":"0bbd3b73d51b06514693db13893aa6ce69354b9ab4f18d355441678c9479dc95"} Mar 08 03:52:59.845672 master-0 kubenswrapper[7547]: I0308 03:52:59.845671 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"f417e14665db2ffffa887ce21c9ff0ed","Type":"ContainerStarted","Data":"e169486121bbc52c7ca877ad3d815dc4a35f6b8ee220e0fd43b9661c26e26d92"} Mar 08 03:52:59.847252 master-0 kubenswrapper[7547]: I0308 03:52:59.847187 7547 status_manager.go:851] "Failed to get status for pod" podUID="9c95709c-c3cb-46fb-afe7-626c8013f3c6" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-1-retry-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:52:59.848265 master-0 kubenswrapper[7547]: I0308 03:52:59.848184 7547 status_manager.go:851] "Failed to get status for pod" podUID="f417e14665db2ffffa887ce21c9ff0ed" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" 
err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:52:59.850524 master-0 kubenswrapper[7547]: I0308 03:52:59.850398 7547 generic.go:334] "Generic (PLEG): container finished" podID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerID="1570887c60156b6fbbdb4d53007ec6f0d11589a7feaf962ad0cf0545fdd489d2" exitCode=0 Mar 08 03:52:59.850668 master-0 kubenswrapper[7547]: I0308 03:52:59.850482 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"cdcecc61ff5eeb08bd2a3ac12599e4f9","Type":"ContainerDied","Data":"1570887c60156b6fbbdb4d53007ec6f0d11589a7feaf962ad0cf0545fdd489d2"} Mar 08 03:52:59.850742 master-0 kubenswrapper[7547]: I0308 03:52:59.850671 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"cdcecc61ff5eeb08bd2a3ac12599e4f9","Type":"ContainerStarted","Data":"65320fce1a0608c5e233ad7039ccb30dfdee6ba6adad349424d74cf44c08e2db"} Mar 08 03:52:59.852137 master-0 kubenswrapper[7547]: E0308 03:52:59.852076 7547 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:52:59.853068 master-0 kubenswrapper[7547]: I0308 03:52:59.853015 7547 status_manager.go:851] "Failed to get status for pod" podUID="9c95709c-c3cb-46fb-afe7-626c8013f3c6" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-1-retry-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:52:59.854441 master-0 kubenswrapper[7547]: I0308 03:52:59.854378 7547 
status_manager.go:851] "Failed to get status for pod" podUID="f417e14665db2ffffa887ce21c9ff0ed" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:53:00.240891 master-0 kubenswrapper[7547]: I0308 03:53:00.237673 7547 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 08 03:53:00.240891 master-0 kubenswrapper[7547]: I0308 03:53:00.238646 7547 status_manager.go:851] "Failed to get status for pod" podUID="9c95709c-c3cb-46fb-afe7-626c8013f3c6" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-1-retry-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:53:00.240891 master-0 kubenswrapper[7547]: I0308 03:53:00.239278 7547 status_manager.go:851] "Failed to get status for pod" podUID="f417e14665db2ffffa887ce21c9ff0ed" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:53:00.300303 master-0 kubenswrapper[7547]: I0308 03:53:00.300243 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9c95709c-c3cb-46fb-afe7-626c8013f3c6-kubelet-dir\") pod \"9c95709c-c3cb-46fb-afe7-626c8013f3c6\" (UID: \"9c95709c-c3cb-46fb-afe7-626c8013f3c6\") " Mar 08 03:53:00.300450 master-0 kubenswrapper[7547]: I0308 03:53:00.300337 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/9c95709c-c3cb-46fb-afe7-626c8013f3c6-kube-api-access\") pod \"9c95709c-c3cb-46fb-afe7-626c8013f3c6\" (UID: \"9c95709c-c3cb-46fb-afe7-626c8013f3c6\") " Mar 08 03:53:00.300450 master-0 kubenswrapper[7547]: I0308 03:53:00.300374 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/9c95709c-c3cb-46fb-afe7-626c8013f3c6-var-lock\") pod \"9c95709c-c3cb-46fb-afe7-626c8013f3c6\" (UID: \"9c95709c-c3cb-46fb-afe7-626c8013f3c6\") " Mar 08 03:53:00.300450 master-0 kubenswrapper[7547]: I0308 03:53:00.300382 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9c95709c-c3cb-46fb-afe7-626c8013f3c6-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "9c95709c-c3cb-46fb-afe7-626c8013f3c6" (UID: "9c95709c-c3cb-46fb-afe7-626c8013f3c6"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:53:00.300631 master-0 kubenswrapper[7547]: I0308 03:53:00.300569 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9c95709c-c3cb-46fb-afe7-626c8013f3c6-var-lock" (OuterVolumeSpecName: "var-lock") pod "9c95709c-c3cb-46fb-afe7-626c8013f3c6" (UID: "9c95709c-c3cb-46fb-afe7-626c8013f3c6"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:53:00.300757 master-0 kubenswrapper[7547]: I0308 03:53:00.300725 7547 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9c95709c-c3cb-46fb-afe7-626c8013f3c6-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 03:53:00.300757 master-0 kubenswrapper[7547]: I0308 03:53:00.300753 7547 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/9c95709c-c3cb-46fb-afe7-626c8013f3c6-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 08 03:53:00.314585 master-0 kubenswrapper[7547]: I0308 03:53:00.314414 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c95709c-c3cb-46fb-afe7-626c8013f3c6-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "9c95709c-c3cb-46fb-afe7-626c8013f3c6" (UID: "9c95709c-c3cb-46fb-afe7-626c8013f3c6"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:53:00.401717 master-0 kubenswrapper[7547]: I0308 03:53:00.401621 7547 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9c95709c-c3cb-46fb-afe7-626c8013f3c6-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 08 03:53:00.862760 master-0 kubenswrapper[7547]: I0308 03:53:00.860925 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"cdcecc61ff5eeb08bd2a3ac12599e4f9","Type":"ContainerStarted","Data":"865460c774c2766f8b86ebf8237c6f8af6ae97a526279d303aebe43f358dbff8"} Mar 08 03:53:00.862760 master-0 kubenswrapper[7547]: I0308 03:53:00.860966 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"cdcecc61ff5eeb08bd2a3ac12599e4f9","Type":"ContainerStarted","Data":"77d9f19c7fff32bc633b77d809c0704eaf44b3aee7eeaf009773338793ad2dd5"} Mar 08 03:53:00.862760 master-0 kubenswrapper[7547]: I0308 03:53:00.860976 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"cdcecc61ff5eeb08bd2a3ac12599e4f9","Type":"ContainerStarted","Data":"80a13278743d26b7b1321c7095283277668741654b1e182af894d61a0ac675ff"} Mar 08 03:53:00.863719 master-0 kubenswrapper[7547]: I0308 03:53:00.863703 7547 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 08 03:53:00.863852 master-0 kubenswrapper[7547]: I0308 03:53:00.863706 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" event={"ID":"9c95709c-c3cb-46fb-afe7-626c8013f3c6","Type":"ContainerDied","Data":"404a0d2a7b8ca71df97070480a7d3f018db46911d3635a93f748cb0ea044da91"} Mar 08 03:53:00.863899 master-0 kubenswrapper[7547]: I0308 03:53:00.863857 7547 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="404a0d2a7b8ca71df97070480a7d3f018db46911d3635a93f748cb0ea044da91" Mar 08 03:53:00.906730 master-0 kubenswrapper[7547]: E0308 03:53:00.906596 7547 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f77c8e18b751d90bc0dfe2d4e304050.slice/crio-conmon-a87790639e12c044cf9f716dfd6c742c89b97ffde357a755afcc44a38db6328d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f77c8e18b751d90bc0dfe2d4e304050.slice/crio-a87790639e12c044cf9f716dfd6c742c89b97ffde357a755afcc44a38db6328d.scope\": RecentStats: unable to find data in memory cache]" Mar 08 03:53:01.401074 master-0 kubenswrapper[7547]: I0308 03:53:01.401016 7547 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 08 03:53:01.417423 master-0 kubenswrapper[7547]: I0308 03:53:01.417363 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-ssl-certs-host\") pod \"5f77c8e18b751d90bc0dfe2d4e304050\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " Mar 08 03:53:01.417616 master-0 kubenswrapper[7547]: I0308 03:53:01.417470 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-audit-dir\") pod \"5f77c8e18b751d90bc0dfe2d4e304050\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " Mar 08 03:53:01.417616 master-0 kubenswrapper[7547]: I0308 03:53:01.417543 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-etc-kubernetes-cloud\") pod \"5f77c8e18b751d90bc0dfe2d4e304050\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " Mar 08 03:53:01.417616 master-0 kubenswrapper[7547]: I0308 03:53:01.417546 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-ssl-certs-host" (OuterVolumeSpecName: "ssl-certs-host") pod "5f77c8e18b751d90bc0dfe2d4e304050" (UID: "5f77c8e18b751d90bc0dfe2d4e304050"). InnerVolumeSpecName "ssl-certs-host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:53:01.417616 master-0 kubenswrapper[7547]: I0308 03:53:01.417578 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-logs\") pod \"5f77c8e18b751d90bc0dfe2d4e304050\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " Mar 08 03:53:01.417737 master-0 kubenswrapper[7547]: I0308 03:53:01.417597 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "5f77c8e18b751d90bc0dfe2d4e304050" (UID: "5f77c8e18b751d90bc0dfe2d4e304050"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:53:01.417737 master-0 kubenswrapper[7547]: I0308 03:53:01.417643 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-etc-kubernetes-cloud" (OuterVolumeSpecName: "etc-kubernetes-cloud") pod "5f77c8e18b751d90bc0dfe2d4e304050" (UID: "5f77c8e18b751d90bc0dfe2d4e304050"). InnerVolumeSpecName "etc-kubernetes-cloud". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:53:01.417737 master-0 kubenswrapper[7547]: I0308 03:53:01.417662 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-secrets\") pod \"5f77c8e18b751d90bc0dfe2d4e304050\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " Mar 08 03:53:01.417737 master-0 kubenswrapper[7547]: I0308 03:53:01.417719 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-secrets" (OuterVolumeSpecName: "secrets") pod "5f77c8e18b751d90bc0dfe2d4e304050" (UID: "5f77c8e18b751d90bc0dfe2d4e304050"). InnerVolumeSpecName "secrets". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:53:01.417737 master-0 kubenswrapper[7547]: I0308 03:53:01.417721 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-logs" (OuterVolumeSpecName: "logs") pod "5f77c8e18b751d90bc0dfe2d4e304050" (UID: "5f77c8e18b751d90bc0dfe2d4e304050"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:53:01.417900 master-0 kubenswrapper[7547]: I0308 03:53:01.417795 7547 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-config\") pod \"5f77c8e18b751d90bc0dfe2d4e304050\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " Mar 08 03:53:01.417900 master-0 kubenswrapper[7547]: I0308 03:53:01.417870 7547 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-config" (OuterVolumeSpecName: "config") pod "5f77c8e18b751d90bc0dfe2d4e304050" (UID: "5f77c8e18b751d90bc0dfe2d4e304050"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:53:01.418350 master-0 kubenswrapper[7547]: I0308 03:53:01.418323 7547 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-config\") on node \"master-0\" DevicePath \"\"" Mar 08 03:53:01.418398 master-0 kubenswrapper[7547]: I0308 03:53:01.418349 7547 reconciler_common.go:293] "Volume detached for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-ssl-certs-host\") on node \"master-0\" DevicePath \"\"" Mar 08 03:53:01.418398 master-0 kubenswrapper[7547]: I0308 03:53:01.418365 7547 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-audit-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 03:53:01.418398 master-0 kubenswrapper[7547]: I0308 03:53:01.418377 7547 reconciler_common.go:293] "Volume detached for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-etc-kubernetes-cloud\") on node \"master-0\" DevicePath \"\"" Mar 08 03:53:01.418398 master-0 kubenswrapper[7547]: I0308 03:53:01.418390 7547 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-logs\") on node \"master-0\" DevicePath \"\"" Mar 08 03:53:01.418528 master-0 kubenswrapper[7547]: I0308 03:53:01.418402 7547 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-secrets\") on node \"master-0\" DevicePath \"\"" Mar 08 03:53:01.878324 master-0 kubenswrapper[7547]: I0308 03:53:01.878217 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" 
event={"ID":"cdcecc61ff5eeb08bd2a3ac12599e4f9","Type":"ContainerStarted","Data":"222e8ca389049069b4efae8be97f8ff91fe671c190224c8b6f05f39079d825cf"} Mar 08 03:53:01.878324 master-0 kubenswrapper[7547]: I0308 03:53:01.878279 7547 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"cdcecc61ff5eeb08bd2a3ac12599e4f9","Type":"ContainerStarted","Data":"5bc596a566a004204d8781e6880a298269208812a64f684e8f90b164a5a846fe"} Mar 08 03:53:01.878902 master-0 kubenswrapper[7547]: I0308 03:53:01.878881 7547 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:53:01.882444 master-0 kubenswrapper[7547]: I0308 03:53:01.882372 7547 generic.go:334] "Generic (PLEG): container finished" podID="5f77c8e18b751d90bc0dfe2d4e304050" containerID="a87790639e12c044cf9f716dfd6c742c89b97ffde357a755afcc44a38db6328d" exitCode=0 Mar 08 03:53:01.882444 master-0 kubenswrapper[7547]: I0308 03:53:01.882437 7547 scope.go:117] "RemoveContainer" containerID="512d196861598af69e92dd9aa3d25b53c40e97b92520ddd9df4d73c8065df7e5" Mar 08 03:53:01.882642 master-0 kubenswrapper[7547]: I0308 03:53:01.882613 7547 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 08 03:53:01.911017 master-0 kubenswrapper[7547]: I0308 03:53:01.907795 7547 scope.go:117] "RemoveContainer" containerID="a87790639e12c044cf9f716dfd6c742c89b97ffde357a755afcc44a38db6328d" Mar 08 03:53:01.923227 master-0 kubenswrapper[7547]: I0308 03:53:01.923196 7547 scope.go:117] "RemoveContainer" containerID="92b1c53e472127e182a48a6a8f941f7dd97106f322656ce4711b76ad8c4fc359" Mar 08 03:53:01.936259 master-0 kubenswrapper[7547]: I0308 03:53:01.936212 7547 scope.go:117] "RemoveContainer" containerID="512d196861598af69e92dd9aa3d25b53c40e97b92520ddd9df4d73c8065df7e5" Mar 08 03:53:01.936697 master-0 kubenswrapper[7547]: E0308 03:53:01.936629 7547 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"512d196861598af69e92dd9aa3d25b53c40e97b92520ddd9df4d73c8065df7e5\": container with ID starting with 512d196861598af69e92dd9aa3d25b53c40e97b92520ddd9df4d73c8065df7e5 not found: ID does not exist" containerID="512d196861598af69e92dd9aa3d25b53c40e97b92520ddd9df4d73c8065df7e5" Mar 08 03:53:01.936697 master-0 kubenswrapper[7547]: I0308 03:53:01.936671 7547 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"512d196861598af69e92dd9aa3d25b53c40e97b92520ddd9df4d73c8065df7e5"} err="failed to get container status \"512d196861598af69e92dd9aa3d25b53c40e97b92520ddd9df4d73c8065df7e5\": rpc error: code = NotFound desc = could not find container \"512d196861598af69e92dd9aa3d25b53c40e97b92520ddd9df4d73c8065df7e5\": container with ID starting with 512d196861598af69e92dd9aa3d25b53c40e97b92520ddd9df4d73c8065df7e5 not found: ID does not exist" Mar 08 03:53:01.936792 master-0 kubenswrapper[7547]: I0308 03:53:01.936697 7547 scope.go:117] "RemoveContainer" containerID="a87790639e12c044cf9f716dfd6c742c89b97ffde357a755afcc44a38db6328d" Mar 08 03:53:01.939059 master-0 kubenswrapper[7547]: E0308 
03:53:01.936975 7547 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a87790639e12c044cf9f716dfd6c742c89b97ffde357a755afcc44a38db6328d\": container with ID starting with a87790639e12c044cf9f716dfd6c742c89b97ffde357a755afcc44a38db6328d not found: ID does not exist" containerID="a87790639e12c044cf9f716dfd6c742c89b97ffde357a755afcc44a38db6328d" Mar 08 03:53:01.939059 master-0 kubenswrapper[7547]: I0308 03:53:01.936993 7547 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a87790639e12c044cf9f716dfd6c742c89b97ffde357a755afcc44a38db6328d"} err="failed to get container status \"a87790639e12c044cf9f716dfd6c742c89b97ffde357a755afcc44a38db6328d\": rpc error: code = NotFound desc = could not find container \"a87790639e12c044cf9f716dfd6c742c89b97ffde357a755afcc44a38db6328d\": container with ID starting with a87790639e12c044cf9f716dfd6c742c89b97ffde357a755afcc44a38db6328d not found: ID does not exist" Mar 08 03:53:01.939059 master-0 kubenswrapper[7547]: I0308 03:53:01.937014 7547 scope.go:117] "RemoveContainer" containerID="92b1c53e472127e182a48a6a8f941f7dd97106f322656ce4711b76ad8c4fc359" Mar 08 03:53:01.939059 master-0 kubenswrapper[7547]: E0308 03:53:01.937189 7547 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92b1c53e472127e182a48a6a8f941f7dd97106f322656ce4711b76ad8c4fc359\": container with ID starting with 92b1c53e472127e182a48a6a8f941f7dd97106f322656ce4711b76ad8c4fc359 not found: ID does not exist" containerID="92b1c53e472127e182a48a6a8f941f7dd97106f322656ce4711b76ad8c4fc359" Mar 08 03:53:01.939059 master-0 kubenswrapper[7547]: I0308 03:53:01.937203 7547 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92b1c53e472127e182a48a6a8f941f7dd97106f322656ce4711b76ad8c4fc359"} err="failed to get container status 
\"92b1c53e472127e182a48a6a8f941f7dd97106f322656ce4711b76ad8c4fc359\": rpc error: code = NotFound desc = could not find container \"92b1c53e472127e182a48a6a8f941f7dd97106f322656ce4711b76ad8c4fc359\": container with ID starting with 92b1c53e472127e182a48a6a8f941f7dd97106f322656ce4711b76ad8c4fc359 not found: ID does not exist" Mar 08 03:53:03.245859 master-0 kubenswrapper[7547]: I0308 03:53:03.245742 7547 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f77c8e18b751d90bc0dfe2d4e304050" path="/var/lib/kubelet/pods/5f77c8e18b751d90bc0dfe2d4e304050/volumes" Mar 08 03:53:03.246903 master-0 kubenswrapper[7547]: I0308 03:53:03.246669 7547 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID="" Mar 08 03:53:04.024805 master-0 kubenswrapper[7547]: I0308 03:53:04.024724 7547 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:53:04.024805 master-0 kubenswrapper[7547]: I0308 03:53:04.024801 7547 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:53:04.033697 master-0 kubenswrapper[7547]: I0308 03:53:04.033632 7547 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:53:05.245348 master-0 kubenswrapper[7547]: I0308 03:53:05.245260 7547 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID="" Mar 08 03:53:07.242322 master-0 kubenswrapper[7547]: I0308 03:53:07.242268 7547 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID="" Mar 08 03:53:07.764011 master-0 kubenswrapper[7547]: I0308 03:53:07.761647 7547 request.go:700] Waited for 1.15762535s, retries: 1, retry-after: 5s - retry-reason: 503 - request: 
GET:https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-image-registry/secrets?allowWatchBookmarks=true&fieldSelector=metadata.name%3Dimage-registry-operator-tls&resourceVersion=10362&timeout=54m18s&timeoutSeconds=3258&watch=true Mar 08 03:53:07.801791 master-0 systemd[1]: Stopping Kubernetes Kubelet... Mar 08 03:53:07.824404 master-0 systemd[1]: kubelet.service: Deactivated successfully. Mar 08 03:53:07.824648 master-0 systemd[1]: Stopped Kubernetes Kubelet. Mar 08 03:53:07.825885 master-0 systemd[1]: kubelet.service: Consumed 46.643s CPU time. Mar 08 03:53:07.842705 master-0 systemd[1]: Starting Kubernetes Kubelet... Mar 08 03:53:07.953565 master-0 kubenswrapper[18592]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 08 03:53:07.953565 master-0 kubenswrapper[18592]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 08 03:53:07.953565 master-0 kubenswrapper[18592]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 08 03:53:07.953565 master-0 kubenswrapper[18592]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 08 03:53:07.953565 master-0 kubenswrapper[18592]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. 
Mar 08 03:53:07.953565 master-0 kubenswrapper[18592]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 08 03:53:07.954191 master-0 kubenswrapper[18592]: I0308 03:53:07.953606 18592 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 08 03:53:07.957070 master-0 kubenswrapper[18592]: W0308 03:53:07.957040 18592 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 08 03:53:07.957070 master-0 kubenswrapper[18592]: W0308 03:53:07.957062 18592 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 08 03:53:07.957070 master-0 kubenswrapper[18592]: W0308 03:53:07.957066 18592 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 08 03:53:07.957070 master-0 kubenswrapper[18592]: W0308 03:53:07.957071 18592 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 08 03:53:07.957259 master-0 kubenswrapper[18592]: W0308 03:53:07.957077 18592 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 08 03:53:07.957259 master-0 kubenswrapper[18592]: W0308 03:53:07.957081 18592 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 08 03:53:07.957259 master-0 kubenswrapper[18592]: W0308 03:53:07.957116 18592 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 08 03:53:07.957259 master-0 kubenswrapper[18592]: W0308 03:53:07.957120 18592 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 08 03:53:07.957259 master-0 kubenswrapper[18592]: W0308 03:53:07.957125 18592 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 08 03:53:07.957259 master-0 kubenswrapper[18592]: W0308 03:53:07.957130 18592 
feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 08 03:53:07.957259 master-0 kubenswrapper[18592]: W0308 03:53:07.957133 18592 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 08 03:53:07.957259 master-0 kubenswrapper[18592]: W0308 03:53:07.957137 18592 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 08 03:53:07.957259 master-0 kubenswrapper[18592]: W0308 03:53:07.957142 18592 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 08 03:53:07.957259 master-0 kubenswrapper[18592]: W0308 03:53:07.957146 18592 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 08 03:53:07.957259 master-0 kubenswrapper[18592]: W0308 03:53:07.957154 18592 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 08 03:53:07.957259 master-0 kubenswrapper[18592]: W0308 03:53:07.957158 18592 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 08 03:53:07.957259 master-0 kubenswrapper[18592]: W0308 03:53:07.957162 18592 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 08 03:53:07.957259 master-0 kubenswrapper[18592]: W0308 03:53:07.957167 18592 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 08 03:53:07.957259 master-0 kubenswrapper[18592]: W0308 03:53:07.957171 18592 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 08 03:53:07.957259 master-0 kubenswrapper[18592]: W0308 03:53:07.957175 18592 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 08 03:53:07.957259 master-0 kubenswrapper[18592]: W0308 03:53:07.957178 18592 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 08 03:53:07.957259 master-0 kubenswrapper[18592]: W0308 03:53:07.957183 18592 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 08 03:53:07.957259 master-0 kubenswrapper[18592]: W0308 03:53:07.957186 18592 feature_gate.go:330] 
unrecognized feature gate: AdditionalRoutingCapabilities Mar 08 03:53:07.957259 master-0 kubenswrapper[18592]: W0308 03:53:07.957190 18592 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 08 03:53:07.957962 master-0 kubenswrapper[18592]: W0308 03:53:07.957196 18592 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 08 03:53:07.957962 master-0 kubenswrapper[18592]: W0308 03:53:07.957200 18592 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 08 03:53:07.957962 master-0 kubenswrapper[18592]: W0308 03:53:07.957206 18592 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 08 03:53:07.957962 master-0 kubenswrapper[18592]: W0308 03:53:07.957211 18592 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 08 03:53:07.957962 master-0 kubenswrapper[18592]: W0308 03:53:07.957222 18592 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 08 03:53:07.957962 master-0 kubenswrapper[18592]: W0308 03:53:07.957226 18592 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 08 03:53:07.957962 master-0 kubenswrapper[18592]: W0308 03:53:07.957230 18592 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 08 03:53:07.957962 master-0 kubenswrapper[18592]: W0308 03:53:07.957253 18592 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 08 03:53:07.957962 master-0 kubenswrapper[18592]: W0308 03:53:07.957257 18592 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 08 03:53:07.957962 master-0 kubenswrapper[18592]: W0308 03:53:07.957261 18592 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 08 03:53:07.957962 master-0 kubenswrapper[18592]: W0308 03:53:07.957264 18592 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 08 03:53:07.957962 master-0 kubenswrapper[18592]: W0308 03:53:07.957269 18592 feature_gate.go:330] 
unrecognized feature gate: OnClusterBuild Mar 08 03:53:07.957962 master-0 kubenswrapper[18592]: W0308 03:53:07.957273 18592 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 08 03:53:07.957962 master-0 kubenswrapper[18592]: W0308 03:53:07.957278 18592 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 08 03:53:07.957962 master-0 kubenswrapper[18592]: W0308 03:53:07.957282 18592 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 08 03:53:07.957962 master-0 kubenswrapper[18592]: W0308 03:53:07.957288 18592 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 08 03:53:07.957962 master-0 kubenswrapper[18592]: W0308 03:53:07.957292 18592 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 08 03:53:07.957962 master-0 kubenswrapper[18592]: W0308 03:53:07.957296 18592 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 08 03:53:07.957962 master-0 kubenswrapper[18592]: W0308 03:53:07.957300 18592 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 08 03:53:07.957962 master-0 kubenswrapper[18592]: W0308 03:53:07.957303 18592 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 08 03:53:07.958649 master-0 kubenswrapper[18592]: W0308 03:53:07.957307 18592 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 08 03:53:07.958649 master-0 kubenswrapper[18592]: W0308 03:53:07.957312 18592 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 08 03:53:07.958649 master-0 kubenswrapper[18592]: W0308 03:53:07.957316 18592 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Mar 08 03:53:07.958649 master-0 kubenswrapper[18592]: W0308 03:53:07.957320 18592 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 08 03:53:07.958649 master-0 kubenswrapper[18592]: W0308 03:53:07.957323 18592 feature_gate.go:330] unrecognized feature 
gate: NutanixMultiSubnets Mar 08 03:53:07.958649 master-0 kubenswrapper[18592]: W0308 03:53:07.957327 18592 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 08 03:53:07.958649 master-0 kubenswrapper[18592]: W0308 03:53:07.957332 18592 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 08 03:53:07.958649 master-0 kubenswrapper[18592]: W0308 03:53:07.957339 18592 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 08 03:53:07.958649 master-0 kubenswrapper[18592]: W0308 03:53:07.957343 18592 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 08 03:53:07.958649 master-0 kubenswrapper[18592]: W0308 03:53:07.957347 18592 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 08 03:53:07.958649 master-0 kubenswrapper[18592]: W0308 03:53:07.957350 18592 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 08 03:53:07.958649 master-0 kubenswrapper[18592]: W0308 03:53:07.957354 18592 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 08 03:53:07.958649 master-0 kubenswrapper[18592]: W0308 03:53:07.957358 18592 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 08 03:53:07.958649 master-0 kubenswrapper[18592]: W0308 03:53:07.957362 18592 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 08 03:53:07.958649 master-0 kubenswrapper[18592]: W0308 03:53:07.957367 18592 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 08 03:53:07.958649 master-0 kubenswrapper[18592]: W0308 03:53:07.957370 18592 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 08 03:53:07.958649 master-0 kubenswrapper[18592]: W0308 03:53:07.957374 18592 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 08 03:53:07.958649 master-0 kubenswrapper[18592]: W0308 03:53:07.957380 18592 feature_gate.go:353] Setting GA feature gate 
DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 08 03:53:07.958649 master-0 kubenswrapper[18592]: W0308 03:53:07.957395 18592 feature_gate.go:330] unrecognized feature gate: Example Mar 08 03:53:07.959320 master-0 kubenswrapper[18592]: W0308 03:53:07.957401 18592 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 08 03:53:07.959320 master-0 kubenswrapper[18592]: W0308 03:53:07.957405 18592 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 08 03:53:07.959320 master-0 kubenswrapper[18592]: W0308 03:53:07.957410 18592 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 08 03:53:07.959320 master-0 kubenswrapper[18592]: W0308 03:53:07.957414 18592 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 08 03:53:07.959320 master-0 kubenswrapper[18592]: W0308 03:53:07.957419 18592 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 08 03:53:07.959320 master-0 kubenswrapper[18592]: W0308 03:53:07.957424 18592 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 08 03:53:07.959320 master-0 kubenswrapper[18592]: W0308 03:53:07.957429 18592 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 08 03:53:07.959320 master-0 kubenswrapper[18592]: W0308 03:53:07.957434 18592 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 08 03:53:07.959320 master-0 kubenswrapper[18592]: W0308 03:53:07.957441 18592 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 08 03:53:07.959320 master-0 kubenswrapper[18592]: I0308 03:53:07.957540 18592 flags.go:64] FLAG: --address="0.0.0.0" Mar 08 03:53:07.959320 master-0 kubenswrapper[18592]: I0308 03:53:07.957550 18592 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Mar 08 03:53:07.959320 master-0 kubenswrapper[18592]: I0308 03:53:07.957568 18592 flags.go:64] FLAG: --anonymous-auth="true" Mar 08 03:53:07.959320 master-0 kubenswrapper[18592]: I0308 03:53:07.957575 18592 flags.go:64] FLAG: --application-metrics-count-limit="100" Mar 08 03:53:07.959320 master-0 kubenswrapper[18592]: I0308 03:53:07.957583 18592 flags.go:64] FLAG: --authentication-token-webhook="false" Mar 08 03:53:07.959320 master-0 kubenswrapper[18592]: I0308 03:53:07.957588 18592 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Mar 08 03:53:07.959320 master-0 kubenswrapper[18592]: I0308 03:53:07.957594 18592 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Mar 08 03:53:07.959320 master-0 kubenswrapper[18592]: I0308 03:53:07.957599 18592 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Mar 08 03:53:07.959320 master-0 kubenswrapper[18592]: I0308 03:53:07.957603 18592 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Mar 08 03:53:07.959320 master-0 kubenswrapper[18592]: I0308 03:53:07.957608 18592 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Mar 08 03:53:07.959320 master-0 kubenswrapper[18592]: I0308 03:53:07.957613 18592 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Mar 08 03:53:07.959320 master-0 kubenswrapper[18592]: I0308 03:53:07.957628 18592 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Mar 08 03:53:07.960069 master-0 kubenswrapper[18592]: I0308 03:53:07.957636 18592 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Mar 08 03:53:07.960069 master-0 kubenswrapper[18592]: I0308 03:53:07.957641 18592 flags.go:64] FLAG: --cgroup-root="" Mar 08 03:53:07.960069 master-0 kubenswrapper[18592]: I0308 
03:53:07.957646 18592 flags.go:64] FLAG: --cgroups-per-qos="true" Mar 08 03:53:07.960069 master-0 kubenswrapper[18592]: I0308 03:53:07.957650 18592 flags.go:64] FLAG: --client-ca-file="" Mar 08 03:53:07.960069 master-0 kubenswrapper[18592]: I0308 03:53:07.957655 18592 flags.go:64] FLAG: --cloud-config="" Mar 08 03:53:07.960069 master-0 kubenswrapper[18592]: I0308 03:53:07.957659 18592 flags.go:64] FLAG: --cloud-provider="" Mar 08 03:53:07.960069 master-0 kubenswrapper[18592]: I0308 03:53:07.957663 18592 flags.go:64] FLAG: --cluster-dns="[]" Mar 08 03:53:07.960069 master-0 kubenswrapper[18592]: I0308 03:53:07.957669 18592 flags.go:64] FLAG: --cluster-domain="" Mar 08 03:53:07.960069 master-0 kubenswrapper[18592]: I0308 03:53:07.957673 18592 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Mar 08 03:53:07.960069 master-0 kubenswrapper[18592]: I0308 03:53:07.957680 18592 flags.go:64] FLAG: --config-dir="" Mar 08 03:53:07.960069 master-0 kubenswrapper[18592]: I0308 03:53:07.957684 18592 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Mar 08 03:53:07.960069 master-0 kubenswrapper[18592]: I0308 03:53:07.957689 18592 flags.go:64] FLAG: --container-log-max-files="5" Mar 08 03:53:07.960069 master-0 kubenswrapper[18592]: I0308 03:53:07.957694 18592 flags.go:64] FLAG: --container-log-max-size="10Mi" Mar 08 03:53:07.960069 master-0 kubenswrapper[18592]: I0308 03:53:07.957698 18592 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Mar 08 03:53:07.960069 master-0 kubenswrapper[18592]: I0308 03:53:07.957703 18592 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Mar 08 03:53:07.960069 master-0 kubenswrapper[18592]: I0308 03:53:07.957707 18592 flags.go:64] FLAG: --containerd-namespace="k8s.io" Mar 08 03:53:07.960069 master-0 kubenswrapper[18592]: I0308 03:53:07.957712 18592 flags.go:64] FLAG: --contention-profiling="false" Mar 08 03:53:07.960069 master-0 kubenswrapper[18592]: I0308 03:53:07.957718 18592 
flags.go:64] FLAG: --cpu-cfs-quota="true" Mar 08 03:53:07.960069 master-0 kubenswrapper[18592]: I0308 03:53:07.957723 18592 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Mar 08 03:53:07.960069 master-0 kubenswrapper[18592]: I0308 03:53:07.957728 18592 flags.go:64] FLAG: --cpu-manager-policy="none" Mar 08 03:53:07.960069 master-0 kubenswrapper[18592]: I0308 03:53:07.957732 18592 flags.go:64] FLAG: --cpu-manager-policy-options="" Mar 08 03:53:07.960069 master-0 kubenswrapper[18592]: I0308 03:53:07.957738 18592 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Mar 08 03:53:07.960069 master-0 kubenswrapper[18592]: I0308 03:53:07.957742 18592 flags.go:64] FLAG: --enable-controller-attach-detach="true" Mar 08 03:53:07.960069 master-0 kubenswrapper[18592]: I0308 03:53:07.957747 18592 flags.go:64] FLAG: --enable-debugging-handlers="true" Mar 08 03:53:07.960069 master-0 kubenswrapper[18592]: I0308 03:53:07.957752 18592 flags.go:64] FLAG: --enable-load-reader="false" Mar 08 03:53:07.960932 master-0 kubenswrapper[18592]: I0308 03:53:07.957756 18592 flags.go:64] FLAG: --enable-server="true" Mar 08 03:53:07.960932 master-0 kubenswrapper[18592]: I0308 03:53:07.957762 18592 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Mar 08 03:53:07.960932 master-0 kubenswrapper[18592]: I0308 03:53:07.957768 18592 flags.go:64] FLAG: --event-burst="100" Mar 08 03:53:07.960932 master-0 kubenswrapper[18592]: I0308 03:53:07.957773 18592 flags.go:64] FLAG: --event-qps="50" Mar 08 03:53:07.960932 master-0 kubenswrapper[18592]: I0308 03:53:07.957777 18592 flags.go:64] FLAG: --event-storage-age-limit="default=0" Mar 08 03:53:07.960932 master-0 kubenswrapper[18592]: I0308 03:53:07.957782 18592 flags.go:64] FLAG: --event-storage-event-limit="default=0" Mar 08 03:53:07.960932 master-0 kubenswrapper[18592]: I0308 03:53:07.957792 18592 flags.go:64] FLAG: --eviction-hard="" Mar 08 03:53:07.960932 master-0 kubenswrapper[18592]: I0308 03:53:07.957797 18592 flags.go:64] FLAG: 
--eviction-max-pod-grace-period="0" Mar 08 03:53:07.960932 master-0 kubenswrapper[18592]: I0308 03:53:07.957802 18592 flags.go:64] FLAG: --eviction-minimum-reclaim="" Mar 08 03:53:07.960932 master-0 kubenswrapper[18592]: I0308 03:53:07.957808 18592 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Mar 08 03:53:07.960932 master-0 kubenswrapper[18592]: I0308 03:53:07.957813 18592 flags.go:64] FLAG: --eviction-soft="" Mar 08 03:53:07.960932 master-0 kubenswrapper[18592]: I0308 03:53:07.957817 18592 flags.go:64] FLAG: --eviction-soft-grace-period="" Mar 08 03:53:07.960932 master-0 kubenswrapper[18592]: I0308 03:53:07.957847 18592 flags.go:64] FLAG: --exit-on-lock-contention="false" Mar 08 03:53:07.960932 master-0 kubenswrapper[18592]: I0308 03:53:07.957852 18592 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Mar 08 03:53:07.960932 master-0 kubenswrapper[18592]: I0308 03:53:07.957856 18592 flags.go:64] FLAG: --experimental-mounter-path="" Mar 08 03:53:07.960932 master-0 kubenswrapper[18592]: I0308 03:53:07.957861 18592 flags.go:64] FLAG: --fail-cgroupv1="false" Mar 08 03:53:07.960932 master-0 kubenswrapper[18592]: I0308 03:53:07.957865 18592 flags.go:64] FLAG: --fail-swap-on="true" Mar 08 03:53:07.960932 master-0 kubenswrapper[18592]: I0308 03:53:07.957869 18592 flags.go:64] FLAG: --feature-gates="" Mar 08 03:53:07.960932 master-0 kubenswrapper[18592]: I0308 03:53:07.957877 18592 flags.go:64] FLAG: --file-check-frequency="20s" Mar 08 03:53:07.960932 master-0 kubenswrapper[18592]: I0308 03:53:07.957881 18592 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Mar 08 03:53:07.960932 master-0 kubenswrapper[18592]: I0308 03:53:07.957886 18592 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Mar 08 03:53:07.960932 master-0 kubenswrapper[18592]: I0308 03:53:07.958119 18592 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Mar 08 03:53:07.960932 master-0 kubenswrapper[18592]: I0308 03:53:07.958128 18592 flags.go:64] FLAG: 
--healthz-port="10248" Mar 08 03:53:07.960932 master-0 kubenswrapper[18592]: I0308 03:53:07.958132 18592 flags.go:64] FLAG: --help="false" Mar 08 03:53:07.960932 master-0 kubenswrapper[18592]: I0308 03:53:07.958137 18592 flags.go:64] FLAG: --hostname-override="" Mar 08 03:53:07.960932 master-0 kubenswrapper[18592]: I0308 03:53:07.958141 18592 flags.go:64] FLAG: --housekeeping-interval="10s" Mar 08 03:53:07.961813 master-0 kubenswrapper[18592]: I0308 03:53:07.958146 18592 flags.go:64] FLAG: --http-check-frequency="20s" Mar 08 03:53:07.961813 master-0 kubenswrapper[18592]: I0308 03:53:07.958151 18592 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Mar 08 03:53:07.961813 master-0 kubenswrapper[18592]: I0308 03:53:07.958155 18592 flags.go:64] FLAG: --image-credential-provider-config="" Mar 08 03:53:07.961813 master-0 kubenswrapper[18592]: I0308 03:53:07.958159 18592 flags.go:64] FLAG: --image-gc-high-threshold="85" Mar 08 03:53:07.961813 master-0 kubenswrapper[18592]: I0308 03:53:07.958164 18592 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 08 03:53:07.961813 master-0 kubenswrapper[18592]: I0308 03:53:07.958168 18592 flags.go:64] FLAG: --image-service-endpoint="" Mar 08 03:53:07.961813 master-0 kubenswrapper[18592]: I0308 03:53:07.958172 18592 flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 08 03:53:07.961813 master-0 kubenswrapper[18592]: I0308 03:53:07.958176 18592 flags.go:64] FLAG: --kube-api-burst="100" Mar 08 03:53:07.961813 master-0 kubenswrapper[18592]: I0308 03:53:07.958180 18592 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 08 03:53:07.961813 master-0 kubenswrapper[18592]: I0308 03:53:07.958184 18592 flags.go:64] FLAG: --kube-api-qps="50" Mar 08 03:53:07.961813 master-0 kubenswrapper[18592]: I0308 03:53:07.958188 18592 flags.go:64] FLAG: --kube-reserved="" Mar 08 03:53:07.961813 master-0 kubenswrapper[18592]: I0308 03:53:07.958193 18592 flags.go:64] FLAG: --kube-reserved-cgroup="" Mar 08 
03:53:07.961813 master-0 kubenswrapper[18592]: I0308 03:53:07.958200 18592 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Mar 08 03:53:07.961813 master-0 kubenswrapper[18592]: I0308 03:53:07.958205 18592 flags.go:64] FLAG: --kubelet-cgroups="" Mar 08 03:53:07.961813 master-0 kubenswrapper[18592]: I0308 03:53:07.958209 18592 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Mar 08 03:53:07.961813 master-0 kubenswrapper[18592]: I0308 03:53:07.958214 18592 flags.go:64] FLAG: --lock-file="" Mar 08 03:53:07.961813 master-0 kubenswrapper[18592]: I0308 03:53:07.958218 18592 flags.go:64] FLAG: --log-cadvisor-usage="false" Mar 08 03:53:07.961813 master-0 kubenswrapper[18592]: I0308 03:53:07.958223 18592 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 08 03:53:07.961813 master-0 kubenswrapper[18592]: I0308 03:53:07.958227 18592 flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 08 03:53:07.961813 master-0 kubenswrapper[18592]: I0308 03:53:07.958249 18592 flags.go:64] FLAG: --log-json-split-stream="false" Mar 08 03:53:07.961813 master-0 kubenswrapper[18592]: I0308 03:53:07.958254 18592 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 08 03:53:07.961813 master-0 kubenswrapper[18592]: I0308 03:53:07.958258 18592 flags.go:64] FLAG: --log-text-split-stream="false" Mar 08 03:53:07.961813 master-0 kubenswrapper[18592]: I0308 03:53:07.958262 18592 flags.go:64] FLAG: --logging-format="text" Mar 08 03:53:07.961813 master-0 kubenswrapper[18592]: I0308 03:53:07.958266 18592 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 08 03:53:07.961813 master-0 kubenswrapper[18592]: I0308 03:53:07.958271 18592 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 08 03:53:07.962762 master-0 kubenswrapper[18592]: I0308 03:53:07.958275 18592 flags.go:64] FLAG: --manifest-url="" Mar 08 03:53:07.962762 master-0 kubenswrapper[18592]: I0308 03:53:07.958280 18592 flags.go:64] FLAG: --manifest-url-header="" Mar 08 03:53:07.962762 
master-0 kubenswrapper[18592]: I0308 03:53:07.958286 18592 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Mar 08 03:53:07.962762 master-0 kubenswrapper[18592]: I0308 03:53:07.958291 18592 flags.go:64] FLAG: --max-open-files="1000000"
Mar 08 03:53:07.962762 master-0 kubenswrapper[18592]: I0308 03:53:07.958297 18592 flags.go:64] FLAG: --max-pods="110"
Mar 08 03:53:07.962762 master-0 kubenswrapper[18592]: I0308 03:53:07.958301 18592 flags.go:64] FLAG: --maximum-dead-containers="-1"
Mar 08 03:53:07.962762 master-0 kubenswrapper[18592]: I0308 03:53:07.958306 18592 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Mar 08 03:53:07.962762 master-0 kubenswrapper[18592]: I0308 03:53:07.958310 18592 flags.go:64] FLAG: --memory-manager-policy="None"
Mar 08 03:53:07.962762 master-0 kubenswrapper[18592]: I0308 03:53:07.958314 18592 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Mar 08 03:53:07.962762 master-0 kubenswrapper[18592]: I0308 03:53:07.958318 18592 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Mar 08 03:53:07.962762 master-0 kubenswrapper[18592]: I0308 03:53:07.958323 18592 flags.go:64] FLAG: --node-ip="192.168.32.10"
Mar 08 03:53:07.962762 master-0 kubenswrapper[18592]: I0308 03:53:07.958327 18592 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Mar 08 03:53:07.962762 master-0 kubenswrapper[18592]: I0308 03:53:07.958346 18592 flags.go:64] FLAG: --node-status-max-images="50"
Mar 08 03:53:07.962762 master-0 kubenswrapper[18592]: I0308 03:53:07.958354 18592 flags.go:64] FLAG: --node-status-update-frequency="10s"
Mar 08 03:53:07.962762 master-0 kubenswrapper[18592]: I0308 03:53:07.958359 18592 flags.go:64] FLAG: --oom-score-adj="-999"
Mar 08 03:53:07.962762 master-0 kubenswrapper[18592]: I0308 03:53:07.958363 18592 flags.go:64] FLAG: --pod-cidr=""
Mar 08 03:53:07.962762 master-0 kubenswrapper[18592]: I0308 03:53:07.958371 18592 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1d605384f31a8085f78a96145c2c3dc51afe22721144196140a2699b7c07ebe3"
Mar 08 03:53:07.962762 master-0 kubenswrapper[18592]: I0308 03:53:07.958407 18592 flags.go:64] FLAG: --pod-manifest-path=""
Mar 08 03:53:07.962762 master-0 kubenswrapper[18592]: I0308 03:53:07.958412 18592 flags.go:64] FLAG: --pod-max-pids="-1"
Mar 08 03:53:07.962762 master-0 kubenswrapper[18592]: I0308 03:53:07.958421 18592 flags.go:64] FLAG: --pods-per-core="0"
Mar 08 03:53:07.962762 master-0 kubenswrapper[18592]: I0308 03:53:07.958427 18592 flags.go:64] FLAG: --port="10250"
Mar 08 03:53:07.962762 master-0 kubenswrapper[18592]: I0308 03:53:07.958439 18592 flags.go:64] FLAG: --protect-kernel-defaults="false"
Mar 08 03:53:07.962762 master-0 kubenswrapper[18592]: I0308 03:53:07.958449 18592 flags.go:64] FLAG: --provider-id=""
Mar 08 03:53:07.962762 master-0 kubenswrapper[18592]: I0308 03:53:07.958460 18592 flags.go:64] FLAG: --qos-reserved=""
Mar 08 03:53:07.964265 master-0 kubenswrapper[18592]: I0308 03:53:07.958465 18592 flags.go:64] FLAG: --read-only-port="10255"
Mar 08 03:53:07.964265 master-0 kubenswrapper[18592]: I0308 03:53:07.958470 18592 flags.go:64] FLAG: --register-node="true"
Mar 08 03:53:07.964265 master-0 kubenswrapper[18592]: I0308 03:53:07.958474 18592 flags.go:64] FLAG: --register-schedulable="true"
Mar 08 03:53:07.964265 master-0 kubenswrapper[18592]: I0308 03:53:07.958483 18592 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Mar 08 03:53:07.964265 master-0 kubenswrapper[18592]: I0308 03:53:07.958491 18592 flags.go:64] FLAG: --registry-burst="10"
Mar 08 03:53:07.964265 master-0 kubenswrapper[18592]: I0308 03:53:07.958499 18592 flags.go:64] FLAG: --registry-qps="5"
Mar 08 03:53:07.964265 master-0 kubenswrapper[18592]: I0308 03:53:07.958503 18592 flags.go:64] FLAG: --reserved-cpus=""
Mar 08 03:53:07.964265 master-0 kubenswrapper[18592]: I0308 03:53:07.958507 18592 flags.go:64] FLAG: --reserved-memory=""
Mar 08 03:53:07.964265 master-0 kubenswrapper[18592]: I0308 03:53:07.958517 18592 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Mar 08 03:53:07.964265 master-0 kubenswrapper[18592]: I0308 03:53:07.958526 18592 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Mar 08 03:53:07.964265 master-0 kubenswrapper[18592]: I0308 03:53:07.958530 18592 flags.go:64] FLAG: --rotate-certificates="false"
Mar 08 03:53:07.964265 master-0 kubenswrapper[18592]: I0308 03:53:07.958538 18592 flags.go:64] FLAG: --rotate-server-certificates="false"
Mar 08 03:53:07.964265 master-0 kubenswrapper[18592]: I0308 03:53:07.958546 18592 flags.go:64] FLAG: --runonce="false"
Mar 08 03:53:07.964265 master-0 kubenswrapper[18592]: I0308 03:53:07.958551 18592 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Mar 08 03:53:07.964265 master-0 kubenswrapper[18592]: I0308 03:53:07.958559 18592 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Mar 08 03:53:07.964265 master-0 kubenswrapper[18592]: I0308 03:53:07.958564 18592 flags.go:64] FLAG: --seccomp-default="false"
Mar 08 03:53:07.964265 master-0 kubenswrapper[18592]: I0308 03:53:07.958568 18592 flags.go:64] FLAG: --serialize-image-pulls="true"
Mar 08 03:53:07.964265 master-0 kubenswrapper[18592]: I0308 03:53:07.958573 18592 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Mar 08 03:53:07.964265 master-0 kubenswrapper[18592]: I0308 03:53:07.958577 18592 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Mar 08 03:53:07.964265 master-0 kubenswrapper[18592]: I0308 03:53:07.958581 18592 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Mar 08 03:53:07.964265 master-0 kubenswrapper[18592]: I0308 03:53:07.958585 18592 flags.go:64] FLAG: --storage-driver-password="root"
Mar 08 03:53:07.964265 master-0 kubenswrapper[18592]: I0308 03:53:07.958590 18592 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 08 03:53:07.964265 master-0 kubenswrapper[18592]: I0308 03:53:07.958594 18592 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 08 03:53:07.964265 master-0 kubenswrapper[18592]: I0308 03:53:07.958598 18592 flags.go:64] FLAG: --storage-driver-user="root"
Mar 08 03:53:07.964265 master-0 kubenswrapper[18592]: I0308 03:53:07.958602 18592 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 08 03:53:07.965200 master-0 kubenswrapper[18592]: I0308 03:53:07.958607 18592 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 08 03:53:07.965200 master-0 kubenswrapper[18592]: I0308 03:53:07.958611 18592 flags.go:64] FLAG: --system-cgroups=""
Mar 08 03:53:07.965200 master-0 kubenswrapper[18592]: I0308 03:53:07.958615 18592 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Mar 08 03:53:07.965200 master-0 kubenswrapper[18592]: I0308 03:53:07.958625 18592 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 08 03:53:07.965200 master-0 kubenswrapper[18592]: I0308 03:53:07.958629 18592 flags.go:64] FLAG: --tls-cert-file=""
Mar 08 03:53:07.965200 master-0 kubenswrapper[18592]: I0308 03:53:07.958634 18592 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 08 03:53:07.965200 master-0 kubenswrapper[18592]: I0308 03:53:07.958645 18592 flags.go:64] FLAG: --tls-min-version=""
Mar 08 03:53:07.965200 master-0 kubenswrapper[18592]: I0308 03:53:07.958650 18592 flags.go:64] FLAG: --tls-private-key-file=""
Mar 08 03:53:07.965200 master-0 kubenswrapper[18592]: I0308 03:53:07.958655 18592 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 08 03:53:07.965200 master-0 kubenswrapper[18592]: I0308 03:53:07.958660 18592 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 08 03:53:07.965200 master-0 kubenswrapper[18592]: I0308 03:53:07.958665 18592 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 08 03:53:07.965200 master-0 kubenswrapper[18592]: I0308 03:53:07.958669 18592 flags.go:64] FLAG: --v="2"
Mar 08 03:53:07.965200 master-0 kubenswrapper[18592]: I0308 03:53:07.958676 18592 flags.go:64] FLAG: --version="false"
Mar 08 03:53:07.965200 master-0 kubenswrapper[18592]: I0308 03:53:07.958682 18592 flags.go:64] FLAG: --vmodule=""
Mar 08 03:53:07.965200 master-0 kubenswrapper[18592]: I0308 03:53:07.958687 18592 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 08 03:53:07.965200 master-0 kubenswrapper[18592]: I0308 03:53:07.958692 18592 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 08 03:53:07.965200 master-0 kubenswrapper[18592]: W0308 03:53:07.958815 18592 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 08 03:53:07.965200 master-0 kubenswrapper[18592]: W0308 03:53:07.958836 18592 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 08 03:53:07.965200 master-0 kubenswrapper[18592]: W0308 03:53:07.958841 18592 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 08 03:53:07.965200 master-0 kubenswrapper[18592]: W0308 03:53:07.958845 18592 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 08 03:53:07.965200 master-0 kubenswrapper[18592]: W0308 03:53:07.958849 18592 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 08 03:53:07.965200 master-0 kubenswrapper[18592]: W0308 03:53:07.958853 18592 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 08 03:53:07.965200 master-0 kubenswrapper[18592]: W0308 03:53:07.958856 18592 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 08 03:53:07.965200 master-0 kubenswrapper[18592]: W0308 03:53:07.958860 18592 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 08 03:53:07.966255 master-0 kubenswrapper[18592]: W0308 03:53:07.958863 18592 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 08 03:53:07.966255 master-0 kubenswrapper[18592]: W0308 03:53:07.958867 18592 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 08 03:53:07.966255 master-0 kubenswrapper[18592]: W0308 03:53:07.958870 18592 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 08 03:53:07.966255 master-0 kubenswrapper[18592]: W0308 03:53:07.958875 18592 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 08 03:53:07.966255 master-0 kubenswrapper[18592]: W0308 03:53:07.958880 18592 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 08 03:53:07.966255 master-0 kubenswrapper[18592]: W0308 03:53:07.958884 18592 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 08 03:53:07.966255 master-0 kubenswrapper[18592]: W0308 03:53:07.958888 18592 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 08 03:53:07.966255 master-0 kubenswrapper[18592]: W0308 03:53:07.958892 18592 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 08 03:53:07.966255 master-0 kubenswrapper[18592]: W0308 03:53:07.958896 18592 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 08 03:53:07.966255 master-0 kubenswrapper[18592]: W0308 03:53:07.958900 18592 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 08 03:53:07.966255 master-0 kubenswrapper[18592]: W0308 03:53:07.958904 18592 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 08 03:53:07.966255 master-0 kubenswrapper[18592]: W0308 03:53:07.958910 18592 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 08 03:53:07.966255 master-0 kubenswrapper[18592]: W0308 03:53:07.958914 18592 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
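The `flags.go:64] FLAG:` lines above record the kubelet's effective command-line configuration, one flag per entry. A minimal sketch of pulling those values out of a journal excerpt into a dictionary — `collect_flags` is a hypothetical helper, not part of the kubelet or any library, and assumes the `FLAG: --name="value"` shape seen in this log:

```python
import re

# Matches the kubelet's flag-dump lines: flags.go:64] FLAG: --name="value"
# (hypothetical helper for log triage; assumes values contain no embedded quotes)
FLAG_RE = re.compile(r'flags\.go:\d+\] FLAG: --([\w-]+)="([^"]*)"')

def collect_flags(log_text):
    """Return {flag_name: value} for every FLAG line found in log_text."""
    return {m.group(1): m.group(2) for m in FLAG_RE.finditer(log_text)}

sample = (
    'Mar 08 03:53:07.962762 master-0 kubenswrapper[18592]: I0308 03:53:07.958297 '
    '18592 flags.go:64] FLAG: --max-pods="110" '
    'Mar 08 03:53:07.962762 master-0 kubenswrapper[18592]: I0308 03:53:07.958323 '
    '18592 flags.go:64] FLAG: --node-ip="192.168.32.10"'
)

print(collect_flags(sample))  # → {'max-pods': '110', 'node-ip': '192.168.32.10'}
```

The same pattern works whether the journal entries are one per line or merged, since the regex anchors on the `flags.go:` marker rather than on line boundaries.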
Mar 08 03:53:07.966255 master-0 kubenswrapper[18592]: W0308 03:53:07.958919 18592 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 08 03:53:07.966255 master-0 kubenswrapper[18592]: W0308 03:53:07.958924 18592 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 08 03:53:07.966255 master-0 kubenswrapper[18592]: W0308 03:53:07.958928 18592 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 08 03:53:07.966255 master-0 kubenswrapper[18592]: W0308 03:53:07.958932 18592 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 08 03:53:07.966255 master-0 kubenswrapper[18592]: W0308 03:53:07.958937 18592 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 08 03:53:07.966255 master-0 kubenswrapper[18592]: W0308 03:53:07.958940 18592 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 08 03:53:07.966939 master-0 kubenswrapper[18592]: W0308 03:53:07.958944 18592 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 08 03:53:07.966939 master-0 kubenswrapper[18592]: W0308 03:53:07.958947 18592 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 08 03:53:07.966939 master-0 kubenswrapper[18592]: W0308 03:53:07.958951 18592 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 08 03:53:07.966939 master-0 kubenswrapper[18592]: W0308 03:53:07.958956 18592 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 08 03:53:07.966939 master-0 kubenswrapper[18592]: W0308 03:53:07.958960 18592 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 08 03:53:07.966939 master-0 kubenswrapper[18592]: W0308 03:53:07.958963 18592 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 08 03:53:07.966939 master-0 kubenswrapper[18592]: W0308 03:53:07.958967 18592 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 08 03:53:07.966939 master-0 kubenswrapper[18592]: W0308 03:53:07.958971 18592 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 08 03:53:07.966939 master-0 kubenswrapper[18592]: W0308 03:53:07.958974 18592 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 08 03:53:07.966939 master-0 kubenswrapper[18592]: W0308 03:53:07.958978 18592 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 08 03:53:07.966939 master-0 kubenswrapper[18592]: W0308 03:53:07.958982 18592 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 08 03:53:07.966939 master-0 kubenswrapper[18592]: W0308 03:53:07.958985 18592 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 08 03:53:07.966939 master-0 kubenswrapper[18592]: W0308 03:53:07.958989 18592 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 08 03:53:07.966939 master-0 kubenswrapper[18592]: W0308 03:53:07.958992 18592 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 08 03:53:07.966939 master-0 kubenswrapper[18592]: W0308 03:53:07.958996 18592 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 08 03:53:07.966939 master-0 kubenswrapper[18592]: W0308 03:53:07.958999 18592 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 08 03:53:07.966939 master-0 kubenswrapper[18592]: W0308 03:53:07.959003 18592 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 08 03:53:07.966939 master-0 kubenswrapper[18592]: W0308 03:53:07.959006 18592 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 08 03:53:07.966939 master-0 kubenswrapper[18592]: W0308 03:53:07.959010 18592 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 08 03:53:07.966939 master-0 kubenswrapper[18592]: W0308 03:53:07.959014 18592 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 08 03:53:07.967581 master-0 kubenswrapper[18592]: W0308 03:53:07.959018 18592 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 08 03:53:07.967581 master-0 kubenswrapper[18592]: W0308 03:53:07.959022 18592 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 08 03:53:07.967581 master-0 kubenswrapper[18592]: W0308 03:53:07.959026 18592 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 08 03:53:07.967581 master-0 kubenswrapper[18592]: W0308 03:53:07.959029 18592 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 08 03:53:07.967581 master-0 kubenswrapper[18592]: W0308 03:53:07.959033 18592 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 08 03:53:07.967581 master-0 kubenswrapper[18592]: W0308 03:53:07.959036 18592 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 08 03:53:07.967581 master-0 kubenswrapper[18592]: W0308 03:53:07.959041 18592 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 08 03:53:07.967581 master-0 kubenswrapper[18592]: W0308 03:53:07.959046 18592 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 08 03:53:07.967581 master-0 kubenswrapper[18592]: W0308 03:53:07.959050 18592 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 08 03:53:07.967581 master-0 kubenswrapper[18592]: W0308 03:53:07.959054 18592 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 08 03:53:07.967581 master-0 kubenswrapper[18592]: W0308 03:53:07.959058 18592 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 08 03:53:07.967581 master-0 kubenswrapper[18592]: W0308 03:53:07.959061 18592 feature_gate.go:330] unrecognized feature gate: Example
Mar 08 03:53:07.967581 master-0 kubenswrapper[18592]: W0308 03:53:07.959065 18592 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 08 03:53:07.967581 master-0 kubenswrapper[18592]: W0308 03:53:07.959069 18592 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 08 03:53:07.967581 master-0 kubenswrapper[18592]: W0308 03:53:07.959072 18592 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 08 03:53:07.967581 master-0 kubenswrapper[18592]: W0308 03:53:07.959076 18592 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 08 03:53:07.967581 master-0 kubenswrapper[18592]: W0308 03:53:07.959080 18592 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 08 03:53:07.967581 master-0 kubenswrapper[18592]: W0308 03:53:07.959083 18592 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 08 03:53:07.967581 master-0 kubenswrapper[18592]: W0308 03:53:07.959087 18592 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 08 03:53:07.968227 master-0 kubenswrapper[18592]: W0308 03:53:07.959092 18592 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 08 03:53:07.968227 master-0 kubenswrapper[18592]: W0308 03:53:07.959095 18592 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 08 03:53:07.968227 master-0 kubenswrapper[18592]: W0308 03:53:07.959099 18592 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 08 03:53:07.968227 master-0 kubenswrapper[18592]: W0308 03:53:07.959103 18592 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 08 03:53:07.968227 master-0 kubenswrapper[18592]: W0308 03:53:07.959107 18592 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 08 03:53:07.968227 master-0 kubenswrapper[18592]: W0308 03:53:07.959110 18592 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 08 03:53:07.968227 master-0 kubenswrapper[18592]: I0308 03:53:07.959123 18592 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 08 03:53:07.968227 master-0 kubenswrapper[18592]: I0308 03:53:07.963453 18592 server.go:491] "Kubelet version" kubeletVersion="v1.31.14"
Mar 08 03:53:07.968227 master-0 kubenswrapper[18592]: I0308 03:53:07.963481 18592 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 08 03:53:07.968227 master-0 kubenswrapper[18592]: W0308 03:53:07.963548 18592 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 08 03:53:07.968227 master-0 kubenswrapper[18592]: W0308 03:53:07.963555 18592 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 08 03:53:07.968227 master-0 kubenswrapper[18592]: W0308 03:53:07.963559 18592 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 08 03:53:07.968227 master-0 kubenswrapper[18592]: W0308 03:53:07.963564 18592 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 08 03:53:07.968227 master-0 kubenswrapper[18592]: W0308 03:53:07.963568 18592 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 08 03:53:07.968227 master-0 kubenswrapper[18592]: W0308 03:53:07.963571 18592 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 08 03:53:07.968227 master-0 kubenswrapper[18592]: W0308 03:53:07.963575 18592 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 08 03:53:07.968781 master-0 kubenswrapper[18592]: W0308 03:53:07.963580 18592 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 08 03:53:07.968781 master-0 kubenswrapper[18592]: W0308 03:53:07.963583 18592 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 08 03:53:07.968781 master-0 kubenswrapper[18592]: W0308 03:53:07.963587 18592 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 08 03:53:07.968781 master-0 kubenswrapper[18592]: W0308 03:53:07.963591 18592 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 08 03:53:07.968781 master-0 kubenswrapper[18592]: W0308 03:53:07.963594 18592 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 08 03:53:07.968781 master-0 kubenswrapper[18592]: W0308 03:53:07.963598 18592 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 08 03:53:07.968781 master-0 kubenswrapper[18592]: W0308 03:53:07.963602 18592 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 08 03:53:07.968781 master-0 kubenswrapper[18592]: W0308 03:53:07.963607 18592 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 08 03:53:07.968781 master-0 kubenswrapper[18592]: W0308 03:53:07.963612 18592 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 08 03:53:07.968781 master-0 kubenswrapper[18592]: W0308 03:53:07.963616 18592 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 08 03:53:07.968781 master-0 kubenswrapper[18592]: W0308 03:53:07.963620 18592 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 08 03:53:07.968781 master-0 kubenswrapper[18592]: W0308 03:53:07.963623 18592 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 08 03:53:07.968781 master-0 kubenswrapper[18592]: W0308 03:53:07.963627 18592 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 08 03:53:07.968781 master-0 kubenswrapper[18592]: W0308 03:53:07.963631 18592 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 08 03:53:07.968781 master-0 kubenswrapper[18592]: W0308 03:53:07.963635 18592 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 08 03:53:07.968781 master-0 kubenswrapper[18592]: W0308 03:53:07.963638 18592 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 08 03:53:07.968781 master-0 kubenswrapper[18592]: W0308 03:53:07.963643 18592 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 08 03:53:07.968781 master-0 kubenswrapper[18592]: W0308 03:53:07.963647 18592 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 08 03:53:07.968781 master-0 kubenswrapper[18592]: W0308 03:53:07.963651 18592 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 08 03:53:07.969432 master-0 kubenswrapper[18592]: W0308 03:53:07.963655 18592 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 08 03:53:07.969432 master-0 kubenswrapper[18592]: W0308 03:53:07.963659 18592 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 08 03:53:07.969432 master-0 kubenswrapper[18592]: W0308 03:53:07.963662 18592 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 08 03:53:07.969432 master-0 kubenswrapper[18592]: W0308 03:53:07.963667 18592 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 08 03:53:07.969432 master-0 kubenswrapper[18592]: W0308 03:53:07.963674 18592 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 08 03:53:07.969432 master-0 kubenswrapper[18592]: W0308 03:53:07.963679 18592 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 08 03:53:07.969432 master-0 kubenswrapper[18592]: W0308 03:53:07.963683 18592 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 08 03:53:07.969432 master-0 kubenswrapper[18592]: W0308 03:53:07.963687 18592 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 08 03:53:07.969432 master-0 kubenswrapper[18592]: W0308 03:53:07.963692 18592 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 08 03:53:07.969432 master-0 kubenswrapper[18592]: W0308 03:53:07.963696 18592 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 08 03:53:07.969432 master-0 kubenswrapper[18592]: W0308 03:53:07.963700 18592 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 08 03:53:07.969432 master-0 kubenswrapper[18592]: W0308 03:53:07.963704 18592 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 08 03:53:07.969432 master-0 kubenswrapper[18592]: W0308 03:53:07.963708 18592 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 08 03:53:07.969432 master-0 kubenswrapper[18592]: W0308 03:53:07.963712 18592 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 08 03:53:07.969432 master-0 kubenswrapper[18592]: W0308 03:53:07.963715 18592 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 08 03:53:07.969432 master-0 kubenswrapper[18592]: W0308 03:53:07.963719 18592 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 08 03:53:07.969432 master-0 kubenswrapper[18592]: W0308 03:53:07.963723 18592 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 08 03:53:07.969432 master-0 kubenswrapper[18592]: W0308 03:53:07.963727 18592 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 08 03:53:07.969432 master-0 kubenswrapper[18592]: W0308 03:53:07.963731 18592 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 08 03:53:07.970141 master-0 kubenswrapper[18592]: W0308 03:53:07.963736 18592 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 08 03:53:07.970141 master-0 kubenswrapper[18592]: W0308 03:53:07.963740 18592 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 08 03:53:07.970141 master-0 kubenswrapper[18592]: W0308 03:53:07.963744 18592 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 08 03:53:07.970141 master-0 kubenswrapper[18592]: W0308 03:53:07.963748 18592 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 08 03:53:07.970141 master-0 kubenswrapper[18592]: W0308 03:53:07.963752 18592 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 08 03:53:07.970141 master-0 kubenswrapper[18592]: W0308 03:53:07.963755 18592 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 08 03:53:07.970141 master-0 kubenswrapper[18592]: W0308 03:53:07.963759 18592 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 08 03:53:07.970141 master-0 kubenswrapper[18592]: W0308 03:53:07.963763 18592 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 08 03:53:07.970141 master-0 kubenswrapper[18592]: W0308 03:53:07.963767 18592 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 08 03:53:07.970141 master-0 kubenswrapper[18592]: W0308 03:53:07.963771 18592 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 08 03:53:07.970141 master-0 kubenswrapper[18592]: W0308 03:53:07.963775 18592 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 08 03:53:07.970141 master-0 kubenswrapper[18592]: W0308 03:53:07.963780 18592 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 08 03:53:07.970141 master-0 kubenswrapper[18592]: W0308 03:53:07.963784 18592 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 08 03:53:07.970141 master-0 kubenswrapper[18592]: W0308 03:53:07.963787 18592 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 08 03:53:07.970141 master-0 kubenswrapper[18592]: W0308 03:53:07.963791 18592 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 08 03:53:07.970141 master-0 kubenswrapper[18592]: W0308 03:53:07.963795 18592 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 08 03:53:07.970141 master-0 kubenswrapper[18592]: W0308 03:53:07.963799 18592 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 08 03:53:07.970141 master-0 kubenswrapper[18592]: W0308 03:53:07.963802 18592 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 08 03:53:07.970141 master-0 kubenswrapper[18592]: W0308 03:53:07.963806 18592 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 08 03:53:07.971021 master-0 kubenswrapper[18592]: W0308 03:53:07.963811 18592 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 08 03:53:07.971021 master-0 kubenswrapper[18592]: W0308 03:53:07.963814 18592 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 08 03:53:07.971021 master-0 kubenswrapper[18592]: W0308 03:53:07.963818 18592 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 08 03:53:07.971021 master-0 kubenswrapper[18592]: W0308 03:53:07.963852 18592 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 08 03:53:07.971021 master-0 kubenswrapper[18592]: W0308 03:53:07.963857 18592 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 08 03:53:07.971021 master-0 kubenswrapper[18592]: W0308 03:53:07.963861 18592 feature_gate.go:330] unrecognized feature gate: Example
Mar 08 03:53:07.971021 master-0 kubenswrapper[18592]: W0308 03:53:07.963868 18592 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 08 03:53:07.971021 master-0 kubenswrapper[18592]: W0308 03:53:07.963873 18592 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 08 03:53:07.971021 master-0 kubenswrapper[18592]: I0308 03:53:07.963880 18592 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 08 03:53:07.971021 master-0 kubenswrapper[18592]: W0308 03:53:07.963999 18592 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 08 03:53:07.971021 master-0 kubenswrapper[18592]: W0308 03:53:07.964007 18592 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 08 03:53:07.971021 master-0 kubenswrapper[18592]: W0308 03:53:07.964014 18592 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 08 03:53:07.971021 master-0 kubenswrapper[18592]: W0308 03:53:07.964018 18592 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 08 03:53:07.971021 master-0 kubenswrapper[18592]: W0308 03:53:07.964022 18592 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 08 03:53:07.971021 master-0 kubenswrapper[18592]: W0308 03:53:07.964026 18592 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 08 03:53:07.971510 master-0 kubenswrapper[18592]: W0308 03:53:07.964030 18592 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 08 03:53:07.971510 master-0 kubenswrapper[18592]: W0308 03:53:07.964034 18592 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 08 03:53:07.971510 master-0 kubenswrapper[18592]: W0308 03:53:07.964039 18592 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 08 03:53:07.971510 master-0 kubenswrapper[18592]: W0308 03:53:07.964045 18592 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 08 03:53:07.971510 master-0 kubenswrapper[18592]: W0308 03:53:07.964049 18592 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 08 03:53:07.971510 master-0 kubenswrapper[18592]: W0308 03:53:07.964053 18592 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 08 03:53:07.971510 master-0 kubenswrapper[18592]: W0308 03:53:07.964057 18592 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 08 03:53:07.971510 master-0 kubenswrapper[18592]: W0308 03:53:07.964061 18592 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 08 03:53:07.971510 master-0 kubenswrapper[18592]: W0308 03:53:07.964065 18592 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 08 03:53:07.971510 master-0 kubenswrapper[18592]: W0308 03:53:07.964069 18592 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 08 03:53:07.971510 master-0 kubenswrapper[18592]: W0308 03:53:07.964073 18592 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 08 03:53:07.971510 master-0 kubenswrapper[18592]: W0308 03:53:07.964077 18592 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 08 03:53:07.971510 master-0 kubenswrapper[18592]: W0308 03:53:07.964081 18592 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 08 03:53:07.971510 master-0 kubenswrapper[18592]: W0308 03:53:07.964085 18592 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 08 03:53:07.971510 master-0 kubenswrapper[18592]: W0308 03:53:07.964089 18592 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 08 03:53:07.971510 master-0 kubenswrapper[18592]: W0308 03:53:07.964092 18592 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 08 03:53:07.971510 master-0 kubenswrapper[18592]: W0308 03:53:07.964096 18592 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 08 03:53:07.971510 master-0 kubenswrapper[18592]: W0308 03:53:07.964100 18592 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 08 03:53:07.971510 master-0 kubenswrapper[18592]: W0308 03:53:07.964104 18592 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 08 03:53:07.972188 master-0 kubenswrapper[18592]: W0308 03:53:07.964108 18592 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 08 03:53:07.972188 master-0 kubenswrapper[18592]: W0308 03:53:07.964111 18592 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 08 03:53:07.972188 master-0 kubenswrapper[18592]: W0308 03:53:07.964115 18592 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 08 03:53:07.972188 master-0 kubenswrapper[18592]: W0308 03:53:07.964119 18592 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 08 03:53:07.972188 master-0 kubenswrapper[18592]: W0308 03:53:07.964123 18592 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 08 03:53:07.972188 master-0 kubenswrapper[18592]: W0308 03:53:07.964126 18592 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 08 03:53:07.972188 master-0 kubenswrapper[18592]: W0308 03:53:07.964131 18592 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 08 03:53:07.972188 master-0 kubenswrapper[18592]: W0308 03:53:07.964136 18592 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 08 03:53:07.972188 master-0 kubenswrapper[18592]: W0308 03:53:07.964141 18592 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 08 03:53:07.972188 master-0 kubenswrapper[18592]: W0308 03:53:07.964181 18592 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 08 03:53:07.972188 master-0 kubenswrapper[18592]: W0308 03:53:07.964185 18592 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 08 03:53:07.972188 master-0 kubenswrapper[18592]: W0308 03:53:07.964206 18592 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 08 03:53:07.972188 master-0 kubenswrapper[18592]: W0308 03:53:07.964211 18592 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 08 03:53:07.972188 master-0 kubenswrapper[18592]: W0308 03:53:07.964217 18592 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 08 03:53:07.972188 master-0 kubenswrapper[18592]: W0308 03:53:07.964221 18592 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 08 03:53:07.972188 master-0 kubenswrapper[18592]: W0308 03:53:07.964225 18592 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 08 03:53:07.972188 master-0 kubenswrapper[18592]: W0308 03:53:07.964230 18592 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 08 03:53:07.972188 master-0 kubenswrapper[18592]: W0308 03:53:07.964234 18592 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 08 03:53:07.972188 master-0 kubenswrapper[18592]: W0308 03:53:07.964238 18592 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 08 03:53:07.972802 master-0 kubenswrapper[18592]: W0308 03:53:07.964242 18592 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 08 03:53:07.972802 master-0 kubenswrapper[18592]: W0308 03:53:07.964246 18592 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 08 03:53:07.972802 master-0 kubenswrapper[18592]: W0308 03:53:07.964251 18592 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 08 03:53:07.972802 master-0 kubenswrapper[18592]: W0308 03:53:07.964255 18592 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 08 03:53:07.972802 master-0 kubenswrapper[18592]: W0308 03:53:07.964259 18592 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 08 03:53:07.972802 master-0 kubenswrapper[18592]: W0308 03:53:07.964264 18592 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 08 03:53:07.972802 master-0 kubenswrapper[18592]: W0308 03:53:07.964268 18592 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 08 03:53:07.972802 master-0 
kubenswrapper[18592]: W0308 03:53:07.964273 18592 feature_gate.go:330] unrecognized feature gate: Example Mar 08 03:53:07.972802 master-0 kubenswrapper[18592]: W0308 03:53:07.964277 18592 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 08 03:53:07.972802 master-0 kubenswrapper[18592]: W0308 03:53:07.964281 18592 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 08 03:53:07.972802 master-0 kubenswrapper[18592]: W0308 03:53:07.964286 18592 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 08 03:53:07.972802 master-0 kubenswrapper[18592]: W0308 03:53:07.964290 18592 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 08 03:53:07.972802 master-0 kubenswrapper[18592]: W0308 03:53:07.964295 18592 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 08 03:53:07.972802 master-0 kubenswrapper[18592]: W0308 03:53:07.964299 18592 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 08 03:53:07.972802 master-0 kubenswrapper[18592]: W0308 03:53:07.964303 18592 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 08 03:53:07.972802 master-0 kubenswrapper[18592]: W0308 03:53:07.964307 18592 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 08 03:53:07.972802 master-0 kubenswrapper[18592]: W0308 03:53:07.964312 18592 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 08 03:53:07.972802 master-0 kubenswrapper[18592]: W0308 03:53:07.964317 18592 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 08 03:53:07.972802 master-0 kubenswrapper[18592]: W0308 03:53:07.964321 18592 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 08 03:53:07.972802 master-0 kubenswrapper[18592]: W0308 03:53:07.964326 18592 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 08 03:53:07.974356 master-0 kubenswrapper[18592]: W0308 03:53:07.964330 18592 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 08 03:53:07.974356 master-0 kubenswrapper[18592]: W0308 03:53:07.964334 18592 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 08 03:53:07.974356 master-0 kubenswrapper[18592]: W0308 03:53:07.964337 18592 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 08 03:53:07.974356 master-0 kubenswrapper[18592]: W0308 03:53:07.964341 18592 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 08 03:53:07.974356 master-0 kubenswrapper[18592]: W0308 03:53:07.964345 18592 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 08 03:53:07.974356 master-0 kubenswrapper[18592]: W0308 03:53:07.964350 18592 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 08 03:53:07.974356 master-0 kubenswrapper[18592]: W0308 03:53:07.964354 18592 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 08 03:53:07.974356 master-0 kubenswrapper[18592]: W0308 03:53:07.964358 18592 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 08 03:53:07.974356 master-0 kubenswrapper[18592]: I0308 03:53:07.964365 18592 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false 
ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 08 03:53:07.974356 master-0 kubenswrapper[18592]: I0308 03:53:07.964532 18592 server.go:940] "Client rotation is on, will bootstrap in background" Mar 08 03:53:07.974356 master-0 kubenswrapper[18592]: I0308 03:53:07.966196 18592 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Mar 08 03:53:07.974356 master-0 kubenswrapper[18592]: I0308 03:53:07.966251 18592 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Mar 08 03:53:07.974356 master-0 kubenswrapper[18592]: I0308 03:53:07.966475 18592 server.go:997] "Starting client certificate rotation" Mar 08 03:53:07.974356 master-0 kubenswrapper[18592]: I0308 03:53:07.966485 18592 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Mar 08 03:53:07.974917 master-0 kubenswrapper[18592]: I0308 03:53:07.967014 18592 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 08 03:53:07.974917 master-0 kubenswrapper[18592]: I0308 03:53:07.970374 18592 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-03-09 03:37:12 +0000 UTC, rotation deadline is 2026-03-08 21:02:31.947105702 +0000 UTC Mar 08 03:53:07.974917 master-0 kubenswrapper[18592]: I0308 03:53:07.970440 18592 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 17h9m23.976668185s for next certificate rotation Mar 08 03:53:07.974917 master-0 kubenswrapper[18592]: I0308 03:53:07.972771 18592 dynamic_cafile_content.go:161] "Starting controller" 
name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 08 03:53:07.975652 master-0 kubenswrapper[18592]: I0308 03:53:07.975371 18592 log.go:25] "Validated CRI v1 runtime API" Mar 08 03:53:07.980689 master-0 kubenswrapper[18592]: I0308 03:53:07.980561 18592 log.go:25] "Validated CRI v1 image API" Mar 08 03:53:07.982598 master-0 kubenswrapper[18592]: I0308 03:53:07.982190 18592 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 08 03:53:08.003849 master-0 kubenswrapper[18592]: I0308 03:53:08.002697 18592 fs.go:135] Filesystem UUIDs: map[67898fbb-3e32-465e-b6f9-207afe668b6e:/dev/vda3 7B77-95E7:/dev/vda2 910678ff-f77e-4a7d-8d53-86f2ac47a823:/dev/vda4] Mar 08 03:53:08.034742 master-0 kubenswrapper[18592]: I0308 03:53:08.002742 18592 fs.go:136] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/005a8f12bd7e675962c64889aeb13228b895d026517d2c45f10276c0ab4cd89e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/005a8f12bd7e675962c64889aeb13228b895d026517d2c45f10276c0ab4cd89e/userdata/shm major:0 minor:91 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/0362cae60811bf2651cbf0189c6f7ef23ef9a8b3134d671278cc424b4c2ad9ec/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/0362cae60811bf2651cbf0189c6f7ef23ef9a8b3134d671278cc424b4c2ad9ec/userdata/shm major:0 minor:874 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/0ec1fcf833bb575029f4371f595adf3e92b6ae14914f83458d311cb85210d774/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/0ec1fcf833bb575029f4371f595adf3e92b6ae14914f83458d311cb85210d774/userdata/shm major:0 minor:114 fsType:tmpfs 
blockSize:0} /run/containers/storage/overlay-containers/133c0043d3a977b4007520994c1530f26391f82433e16ae8b2e991aa2092980b/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/133c0043d3a977b4007520994c1530f26391f82433e16ae8b2e991aa2092980b/userdata/shm major:0 minor:285 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/14fe5cb6383f1129ecf327e882bdb7904f8ad1a8a2cc2647d9ee96534b6ccb93/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/14fe5cb6383f1129ecf327e882bdb7904f8ad1a8a2cc2647d9ee96534b6ccb93/userdata/shm major:0 minor:271 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/15db86a68d791074ec87b05b5a5fc2f19c6862a1ebcfb5de4931251a55e195a3/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/15db86a68d791074ec87b05b5a5fc2f19c6862a1ebcfb5de4931251a55e195a3/userdata/shm major:0 minor:538 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/201fec6682662fad0acc461d7db4c8e108b597d14fa258495dcdfa10f6e193b5/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/201fec6682662fad0acc461d7db4c8e108b597d14fa258495dcdfa10f6e193b5/userdata/shm major:0 minor:528 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/2221ffbcc38435886b634ed23b63c6c48586f85323431d02b258500a200b9a2b/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/2221ffbcc38435886b634ed23b63c6c48586f85323431d02b258500a200b9a2b/userdata/shm major:0 minor:778 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/22cc81f0c9d90fe64f682c3bbb7bbcefc904c4ee2c036d7eedf6b66887f69fae/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/22cc81f0c9d90fe64f682c3bbb7bbcefc904c4ee2c036d7eedf6b66887f69fae/userdata/shm major:0 minor:100 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/24f253f45cdf8fe83c1408fe0ce3848ec429687603d0f7eff71df3320c693f47/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/24f253f45cdf8fe83c1408fe0ce3848ec429687603d0f7eff71df3320c693f47/userdata/shm major:0 minor:529 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/2899b4e2a1cabd8aea96b1bf0db490c7e98f0e9564c40236186985f7b516039b/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/2899b4e2a1cabd8aea96b1bf0db490c7e98f0e9564c40236186985f7b516039b/userdata/shm major:0 minor:269 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/2a6fcf9f9144c9d6bd5dcc37a1eb71dcefd7c65bbd0149ae8f69eff02142d6ae/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/2a6fcf9f9144c9d6bd5dcc37a1eb71dcefd7c65bbd0149ae8f69eff02142d6ae/userdata/shm major:0 minor:897 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/2a8e902cd252f0c879e3e1c00047d04c3e8646bfeed72f034a41537b464f6d14/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/2a8e902cd252f0c879e3e1c00047d04c3e8646bfeed72f034a41537b464f6d14/userdata/shm major:0 minor:273 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/2f60507250e058dbc73dd9a2defceea722aafde0bbc43ed7857b1626b36814fe/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/2f60507250e058dbc73dd9a2defceea722aafde0bbc43ed7857b1626b36814fe/userdata/shm major:0 minor:526 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/3235d3bd9c5f6c6a7e16ad74c79046e87f4d03278e4096c568a5930f544fbbf0/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/3235d3bd9c5f6c6a7e16ad74c79046e87f4d03278e4096c568a5930f544fbbf0/userdata/shm major:0 minor:227 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/34c7c8e7bf6b2608a8ed06595ef49b7b5823fe04e62631a07f3cbbca5adb876a/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/34c7c8e7bf6b2608a8ed06595ef49b7b5823fe04e62631a07f3cbbca5adb876a/userdata/shm major:0 minor:527 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/3d64c10d51d4d9009da402a9f2c51b81830f1695b7370548200097f367d254f2/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/3d64c10d51d4d9009da402a9f2c51b81830f1695b7370548200097f367d254f2/userdata/shm major:0 minor:265 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/3f36e6c60bfd2bd17ffa26adda31fc5cb46b7dc64ce396281d544af5a25539b6/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/3f36e6c60bfd2bd17ffa26adda31fc5cb46b7dc64ce396281d544af5a25539b6/userdata/shm major:0 minor:460 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/4025d1a6cb66b179d453ec8f3c19442902ea80a04085eeeda4fa9c48c774a80e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/4025d1a6cb66b179d453ec8f3c19442902ea80a04085eeeda4fa9c48c774a80e/userdata/shm major:0 minor:644 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/40effd583e5274c7bd2d572652f2adc6b94ce532187f776e02a024e30b5ff7e5/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/40effd583e5274c7bd2d572652f2adc6b94ce532187f776e02a024e30b5ff7e5/userdata/shm major:0 minor:406 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/4482916b3b4b521cf75927dd45a05e0a2072a49de37c125a72612ca885ff96ce/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/4482916b3b4b521cf75927dd45a05e0a2072a49de37c125a72612ca885ff96ce/userdata/shm major:0 minor:279 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/47433e6e63affa1ba02608e11b299ca5af00d1c85e6731e35f43a4b241522538/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/47433e6e63affa1ba02608e11b299ca5af00d1c85e6731e35f43a4b241522538/userdata/shm major:0 minor:286 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/48d46b7645a64ea18f3fa334445c914bbcaaadce3a50f149dedad680b9f63699/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/48d46b7645a64ea18f3fa334445c914bbcaaadce3a50f149dedad680b9f63699/userdata/shm major:0 minor:536 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/49aec4971047b96e14ae56703fe099426b567477422c0add4be258e7ae9b7ff1/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/49aec4971047b96e14ae56703fe099426b567477422c0add4be258e7ae9b7ff1/userdata/shm major:0 minor:259 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/59d3785f249896d312f7b27b04e39fa314d9b06309adfc4aa055444977f4fa7e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/59d3785f249896d312f7b27b04e39fa314d9b06309adfc4aa055444977f4fa7e/userdata/shm major:0 minor:584 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/5b8c31076d1db49fd8c133661fbbc131a58892112131cf3118f58212505e7460/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/5b8c31076d1db49fd8c133661fbbc131a58892112131cf3118f58212505e7460/userdata/shm major:0 minor:50 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/65320fce1a0608c5e233ad7039ccb30dfdee6ba6adad349424d74cf44c08e2db/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/65320fce1a0608c5e233ad7039ccb30dfdee6ba6adad349424d74cf44c08e2db/userdata/shm major:0 minor:98 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/6588c21791f0b9fd7a866ced5165aad3ddf504a15e8585434bc4836ba3395293/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/6588c21791f0b9fd7a866ced5165aad3ddf504a15e8585434bc4836ba3395293/userdata/shm major:0 minor:142 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/718b320dd5408c5dda7ee606dfc45fd377e09b1616f83b01ddd6bedbab6de149/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/718b320dd5408c5dda7ee606dfc45fd377e09b1616f83b01ddd6bedbab6de149/userdata/shm major:0 minor:587 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/7f9b714050ae3b09bb8faf754e7059ffdbc5afd8e225d14b9c0ab424f1262da7/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/7f9b714050ae3b09bb8faf754e7059ffdbc5afd8e225d14b9c0ab424f1262da7/userdata/shm major:0 minor:456 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/8b3d4b079b2ebb85b87310aca0e7ee26b306a1f0013e66da4d3495d792aa5402/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/8b3d4b079b2ebb85b87310aca0e7ee26b306a1f0013e66da4d3495d792aa5402/userdata/shm major:0 minor:388 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/946c23d9475281a0fd499a7ff53f910e9f2c222a2716d7d2886b8590024362cc/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/946c23d9475281a0fd499a7ff53f910e9f2c222a2716d7d2886b8590024362cc/userdata/shm major:0 minor:443 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/95267fdf9dd0903f45fd631ce455eb67b79bbeabc1d7f2fb9fb37ed66199c9e6/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/95267fdf9dd0903f45fd631ce455eb67b79bbeabc1d7f2fb9fb37ed66199c9e6/userdata/shm major:0 minor:537 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/9903faf2a78afa21fe51e82e71e6a8b65942f5c695c1b737493cfec8a1911541/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/9903faf2a78afa21fe51e82e71e6a8b65942f5c695c1b737493cfec8a1911541/userdata/shm major:0 minor:320 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/9af5ebd3eee3c3de99e27a671d715ba12c7da929014abc4a9a4424a8fb8aad4e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/9af5ebd3eee3c3de99e27a671d715ba12c7da929014abc4a9a4424a8fb8aad4e/userdata/shm major:0 minor:58 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/9b135e9cc968b9e23fda104dcc5dd8cbf50632e21d670c61642446eb2eb45282/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/9b135e9cc968b9e23fda104dcc5dd8cbf50632e21d670c61642446eb2eb45282/userdata/shm major:0 minor:263 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/a5455725a1362a8e870442eb2f0235fbea46c1d047d2183683f1ca346ec9c059/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/a5455725a1362a8e870442eb2f0235fbea46c1d047d2183683f1ca346ec9c059/userdata/shm major:0 minor:289 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/a77b94c864745863d968467e91271f57be5b9449652226a9e1e9789e20eef38f/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/a77b94c864745863d968467e91271f57be5b9449652226a9e1e9789e20eef38f/userdata/shm major:0 minor:855 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/adfc23f0d784d89240f88962b8f79cdf84a79077cb7581e94d0e19b479eeafaa/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/adfc23f0d784d89240f88962b8f79cdf84a79077cb7581e94d0e19b479eeafaa/userdata/shm major:0 minor:283 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/b1a0209e0a9a4093bed7068dc639e8f6b3aa1b820bc97e5ac17eab47d3a362ec/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/b1a0209e0a9a4093bed7068dc639e8f6b3aa1b820bc97e5ac17eab47d3a362ec/userdata/shm major:0 minor:535 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/b259f65d2759400bc4903e27e05a8a6318da137f5779b59b0781ca65575183e7/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/b259f65d2759400bc4903e27e05a8a6318da137f5779b59b0781ca65575183e7/userdata/shm major:0 minor:859 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/b57da2053e178390131e07c76308479420cda5a65a12ef7fa425c01959c1b9c5/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/b57da2053e178390131e07c76308479420cda5a65a12ef7fa425c01959c1b9c5/userdata/shm major:0 minor:548 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/b6803c40b59bc228703c8b2ce51f78af3f050cad56ceb99d544a076dbfccb803/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/b6803c40b59bc228703c8b2ce51f78af3f050cad56ceb99d544a076dbfccb803/userdata/shm major:0 minor:605 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/b7854fff4e6d290ba66d677fb1f4c348702f3c168d271f4daa5e0ff010a39d54/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/b7854fff4e6d290ba66d677fb1f4c348702f3c168d271f4daa5e0ff010a39d54/userdata/shm major:0 minor:524 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/bc8712c641a3b3ccd887956343c178e81a448d9908293a089aa942f0944b3018/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/bc8712c641a3b3ccd887956343c178e81a448d9908293a089aa942f0944b3018/userdata/shm major:0 minor:329 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/c4262d5f7cd90d77070e291ffede65804485b27ce848841d5c9b49cfb475af2e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/c4262d5f7cd90d77070e291ffede65804485b27ce848841d5c9b49cfb475af2e/userdata/shm major:0 minor:646 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/c5cc16f26a63d054e0857f2a2f1278a7512a2a20bea66d9521aa218fb1539d3c/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/c5cc16f26a63d054e0857f2a2f1278a7512a2a20bea66d9521aa218fb1539d3c/userdata/shm major:0 minor:128 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/c6cc97386aa9d8bb895c877b8849fa8dc27a6fe973ccc0760f4274b321682e77/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/c6cc97386aa9d8bb895c877b8849fa8dc27a6fe973ccc0760f4274b321682e77/userdata/shm major:0 minor:522 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/c7817ab55e81e53bd9a1c875e0b10710e15527bb4f619ad1dc5011c4087c74fe/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/c7817ab55e81e53bd9a1c875e0b10710e15527bb4f619ad1dc5011c4087c74fe/userdata/shm major:0 minor:321 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/cbd8c33fbce7b1c9cf78530bc91c2ad9c46d9601ea6ef0914dea487c85e63f0d/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/cbd8c33fbce7b1c9cf78530bc91c2ad9c46d9601ea6ef0914dea487c85e63f0d/userdata/shm major:0 minor:782 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/d47d14f256ba67306efe8da7bbcadc67f946b747f7e0a1d658a9687f1f0a1a37/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/d47d14f256ba67306efe8da7bbcadc67f946b747f7e0a1d658a9687f1f0a1a37/userdata/shm major:0 minor:275 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/d73b051671cc575452964e4ec7abae8ed2cf8ae1de2a3be5460a27e068329e94/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/d73b051671cc575452964e4ec7abae8ed2cf8ae1de2a3be5460a27e068329e94/userdata/shm major:0 minor:257 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/d9c3b609dc5ba9405c265ea8312eb684aae5364c416fb8a8d02c96c1413b155d/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/d9c3b609dc5ba9405c265ea8312eb684aae5364c416fb8a8d02c96c1413b155d/userdata/shm major:0 minor:856 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/db3e211f71e6d36cf104a5781a02c4e98905e1bbc8fec6cc754858473d74a96c/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/db3e211f71e6d36cf104a5781a02c4e98905e1bbc8fec6cc754858473d74a96c/userdata/shm major:0 minor:335 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/dc0e0feca8d08363d7aeeb0e56e61f125a0c90431bd31ea7f3ad61c6ddd5d77c/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/dc0e0feca8d08363d7aeeb0e56e61f125a0c90431bd31ea7f3ad61c6ddd5d77c/userdata/shm major:0 minor:322 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/df9f6505570d879efae2662d6149a2ae417f35b1bed956f7339c92d857b81707/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/df9f6505570d879efae2662d6149a2ae417f35b1bed956f7339c92d857b81707/userdata/shm major:0 minor:281 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/e169486121bbc52c7ca877ad3d815dc4a35f6b8ee220e0fd43b9661c26e26d92/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/e169486121bbc52c7ca877ad3d815dc4a35f6b8ee220e0fd43b9661c26e26d92/userdata/shm major:0 minor:89 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/e60e7e3ed5830ae078195a56d09f656f257e2641daffbfad2ebaca1e467bb613/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/e60e7e3ed5830ae078195a56d09f656f257e2641daffbfad2ebaca1e467bb613/userdata/shm major:0 minor:849 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/e8733f46dd1d2647e586c0cc9b5a4ebea38d695f856a8c74190015b70d99a33e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/e8733f46dd1d2647e586c0cc9b5a4ebea38d695f856a8c74190015b70d99a33e/userdata/shm major:0 minor:129 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/ec83c044c04d6837d5d5f7d4c71e74473794e6ee1e718df488cf45a934fcc03a/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/ec83c044c04d6837d5d5f7d4c71e74473794e6ee1e718df488cf45a934fcc03a/userdata/shm major:0 minor:119 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/ed400b0e1b21fe5e4ef5385a05444bf39db4c2fd9c754a3d6c45427d3b29ef99/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/ed400b0e1b21fe5e4ef5385a05444bf39db4c2fd9c754a3d6c45427d3b29ef99/userdata/shm major:0 minor:41 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/f7ac4af0eeac6f90547286a05d56708b5e0e75b0367c4826038733ce85075489/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/f7ac4af0eeac6f90547286a05d56708b5e0e75b0367c4826038733ce85075489/userdata/shm major:0 minor:521 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/f8dce45144c680f78255eabd0603f42f2f97c7b82b1aee0c5c17224722da19a3/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/f8dce45144c680f78255eabd0603f42f2f97c7b82b1aee0c5c17224722da19a3/userdata/shm major:0 minor:390 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/ff14038e05786c394b22fac9e7aff676f3eef7f98d2a8dbbbe2bd0a62e05aecf/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/ff14038e05786c394b22fac9e7aff676f3eef7f98d2a8dbbbe2bd0a62e05aecf/userdata/shm major:0 minor:531 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0031e3a9-b253-4dda-a890-bf3e4d8737e8/volumes/kubernetes.io~projected/kube-api-access-qhms8:{mountpoint:/var/lib/kubelet/pods/0031e3a9-b253-4dda-a890-bf3e4d8737e8/volumes/kubernetes.io~projected/kube-api-access-qhms8 major:0 minor:837 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0418ff42-7eac-4266-97b5-4df88623d066/volumes/kubernetes.io~projected/kube-api-access-kmpdd:{mountpoint:/var/lib/kubelet/pods/0418ff42-7eac-4266-97b5-4df88623d066/volumes/kubernetes.io~projected/kube-api-access-kmpdd major:0 minor:245 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0418ff42-7eac-4266-97b5-4df88623d066/volumes/kubernetes.io~secret/cluster-monitoring-operator-tls:{mountpoint:/var/lib/kubelet/pods/0418ff42-7eac-4266-97b5-4df88623d066/volumes/kubernetes.io~secret/cluster-monitoring-operator-tls major:0 minor:508 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0918ba32-8e55-48d0-8e50-027c0dcb4bbd/volumes/kubernetes.io~projected/kube-api-access-mghmh:{mountpoint:/var/lib/kubelet/pods/0918ba32-8e55-48d0-8e50-027c0dcb4bbd/volumes/kubernetes.io~projected/kube-api-access-mghmh major:0 minor:252 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0918ba32-8e55-48d0-8e50-027c0dcb4bbd/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/0918ba32-8e55-48d0-8e50-027c0dcb4bbd/volumes/kubernetes.io~secret/serving-cert major:0 minor:223 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/093f17f0-2818-4e24-b3c3-6ab4da9d21fb/volumes/kubernetes.io~projected/kube-api-access-7nk8r:{mountpoint:/var/lib/kubelet/pods/093f17f0-2818-4e24-b3c3-6ab4da9d21fb/volumes/kubernetes.io~projected/kube-api-access-7nk8r major:0 minor:105 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0d377285-0336-41b7-b48f-c44a7b563498/volumes/kubernetes.io~projected/kube-api-access-7qn5v:{mountpoint:/var/lib/kubelet/pods/0d377285-0336-41b7-b48f-c44a7b563498/volumes/kubernetes.io~projected/kube-api-access-7qn5v major:0 minor:249 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0d377285-0336-41b7-b48f-c44a7b563498/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/0d377285-0336-41b7-b48f-c44a7b563498/volumes/kubernetes.io~secret/serving-cert major:0 minor:225 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0ebf1330-e044-4ff5-8b48-2d667e0c5625/volumes/kubernetes.io~projected/kube-api-access-hccv4:{mountpoint:/var/lib/kubelet/pods/0ebf1330-e044-4ff5-8b48-2d667e0c5625/volumes/kubernetes.io~projected/kube-api-access-hccv4 major:0 minor:241 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0ebf1330-e044-4ff5-8b48-2d667e0c5625/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/0ebf1330-e044-4ff5-8b48-2d667e0c5625/volumes/kubernetes.io~secret/serving-cert major:0 minor:226 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/10c165f1-53b4-4e7e-9cd1-00bb4d9cbc79/volumes/kubernetes.io~projected/kube-api-access-7fzmf:{mountpoint:/var/lib/kubelet/pods/10c165f1-53b4-4e7e-9cd1-00bb4d9cbc79/volumes/kubernetes.io~projected/kube-api-access-7fzmf major:0 minor:786 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/10c165f1-53b4-4e7e-9cd1-00bb4d9cbc79/volumes/kubernetes.io~secret/apiservice-cert:{mountpoint:/var/lib/kubelet/pods/10c165f1-53b4-4e7e-9cd1-00bb4d9cbc79/volumes/kubernetes.io~secret/apiservice-cert major:0 minor:784 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/10c165f1-53b4-4e7e-9cd1-00bb4d9cbc79/volumes/kubernetes.io~secret/webhook-cert:{mountpoint:/var/lib/kubelet/pods/10c165f1-53b4-4e7e-9cd1-00bb4d9cbc79/volumes/kubernetes.io~secret/webhook-cert major:0 minor:785 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/127c3f92-8283-4179-9e40-a12dcabaaa12/volumes/kubernetes.io~projected/kube-api-access-zdn9r:{mountpoint:/var/lib/kubelet/pods/127c3f92-8283-4179-9e40-a12dcabaaa12/volumes/kubernetes.io~projected/kube-api-access-zdn9r major:0 minor:836 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/127c3f92-8283-4179-9e40-a12dcabaaa12/volumes/kubernetes.io~secret/machine-approver-tls:{mountpoint:/var/lib/kubelet/pods/127c3f92-8283-4179-9e40-a12dcabaaa12/volumes/kubernetes.io~secret/machine-approver-tls major:0 minor:822 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/139881ee-6cfa-4a7e-b002-63cece048d16/volumes/kubernetes.io~projected/kube-api-access-h9z8g:{mountpoint:/var/lib/kubelet/pods/139881ee-6cfa-4a7e-b002-63cece048d16/volumes/kubernetes.io~projected/kube-api-access-h9z8g major:0 minor:848 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/139881ee-6cfa-4a7e-b002-63cece048d16/volumes/kubernetes.io~secret/control-plane-machine-set-operator-tls:{mountpoint:/var/lib/kubelet/pods/139881ee-6cfa-4a7e-b002-63cece048d16/volumes/kubernetes.io~secret/control-plane-machine-set-operator-tls major:0 minor:838 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1482d789-884b-4337-b598-f0e2b71eb9f2/volumes/kubernetes.io~projected/kube-api-access-m2h62:{mountpoint:/var/lib/kubelet/pods/1482d789-884b-4337-b598-f0e2b71eb9f2/volumes/kubernetes.io~projected/kube-api-access-m2h62 major:0 minor:243 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1482d789-884b-4337-b598-f0e2b71eb9f2/volumes/kubernetes.io~secret/srv-cert:{mountpoint:/var/lib/kubelet/pods/1482d789-884b-4337-b598-f0e2b71eb9f2/volumes/kubernetes.io~secret/srv-cert major:0 minor:516 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/164586b1-f133-4427-8ab6-eb0839b79738/volumes/kubernetes.io~projected/kube-api-access-r4stz:{mountpoint:/var/lib/kubelet/pods/164586b1-f133-4427-8ab6-eb0839b79738/volumes/kubernetes.io~projected/kube-api-access-r4stz major:0 minor:139 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/164586b1-f133-4427-8ab6-eb0839b79738/volumes/kubernetes.io~secret/webhook-cert:{mountpoint:/var/lib/kubelet/pods/164586b1-f133-4427-8ab6-eb0839b79738/volumes/kubernetes.io~secret/webhook-cert major:0 minor:138 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1b69fbf6-1ca5-413e-bffd-965730bcec1b/volumes/kubernetes.io~projected/ca-certs:{mountpoint:/var/lib/kubelet/pods/1b69fbf6-1ca5-413e-bffd-965730bcec1b/volumes/kubernetes.io~projected/ca-certs major:0 minor:471 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1b69fbf6-1ca5-413e-bffd-965730bcec1b/volumes/kubernetes.io~projected/kube-api-access-nfz27:{mountpoint:/var/lib/kubelet/pods/1b69fbf6-1ca5-413e-bffd-965730bcec1b/volumes/kubernetes.io~projected/kube-api-access-nfz27 major:0 minor:472 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1b69fbf6-1ca5-413e-bffd-965730bcec1b/volumes/kubernetes.io~secret/catalogserver-certs:{mountpoint:/var/lib/kubelet/pods/1b69fbf6-1ca5-413e-bffd-965730bcec1b/volumes/kubernetes.io~secret/catalogserver-certs major:0 minor:586 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1b700d17-83d2-46c8-afbc-e5774822eabe/volumes/kubernetes.io~projected/kube-api-access-cv7sd:{mountpoint:/var/lib/kubelet/pods/1b700d17-83d2-46c8-afbc-e5774822eabe/volumes/kubernetes.io~projected/kube-api-access-cv7sd major:0 minor:867 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1b700d17-83d2-46c8-afbc-e5774822eabe/volumes/kubernetes.io~secret/cert:{mountpoint:/var/lib/kubelet/pods/1b700d17-83d2-46c8-afbc-e5774822eabe/volumes/kubernetes.io~secret/cert major:0 minor:866 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/1cbcb403-a424-4496-8c5c-5eb5e42dfb93/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/1cbcb403-a424-4496-8c5c-5eb5e42dfb93/volumes/kubernetes.io~projected/kube-api-access major:0 minor:244 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1cbcb403-a424-4496-8c5c-5eb5e42dfb93/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/1cbcb403-a424-4496-8c5c-5eb5e42dfb93/volumes/kubernetes.io~secret/serving-cert major:0 minor:220 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1eb851be-f157-48ea-9a39-1361b68d2639/volumes/kubernetes.io~projected/kube-api-access-nqhzl:{mountpoint:/var/lib/kubelet/pods/1eb851be-f157-48ea-9a39-1361b68d2639/volumes/kubernetes.io~projected/kube-api-access-nqhzl major:0 minor:246 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1eb851be-f157-48ea-9a39-1361b68d2639/volumes/kubernetes.io~secret/webhook-certs:{mountpoint:/var/lib/kubelet/pods/1eb851be-f157-48ea-9a39-1361b68d2639/volumes/kubernetes.io~secret/webhook-certs major:0 minor:514 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/2262647b-c315-477a-93bd-f168c1810475/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/2262647b-c315-477a-93bd-f168c1810475/volumes/kubernetes.io~projected/kube-api-access major:0 minor:824 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/2262647b-c315-477a-93bd-f168c1810475/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/2262647b-c315-477a-93bd-f168c1810475/volumes/kubernetes.io~secret/serving-cert major:0 minor:823 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/232c421d-96f0-4894-b8d8-74f43d02bbd3/volumes/kubernetes.io~projected/kube-api-access-fx4fw:{mountpoint:/var/lib/kubelet/pods/232c421d-96f0-4894-b8d8-74f43d02bbd3/volumes/kubernetes.io~projected/kube-api-access-fx4fw major:0 minor:238 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/232c421d-96f0-4894-b8d8-74f43d02bbd3/volumes/kubernetes.io~secret/apiservice-cert:{mountpoint:/var/lib/kubelet/pods/232c421d-96f0-4894-b8d8-74f43d02bbd3/volumes/kubernetes.io~secret/apiservice-cert major:0 minor:517 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/232c421d-96f0-4894-b8d8-74f43d02bbd3/volumes/kubernetes.io~secret/node-tuning-operator-tls:{mountpoint:/var/lib/kubelet/pods/232c421d-96f0-4894-b8d8-74f43d02bbd3/volumes/kubernetes.io~secret/node-tuning-operator-tls major:0 minor:509 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/26180f77-0b1a-4d0f-9ed0-a12fdee69817/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/26180f77-0b1a-4d0f-9ed0-a12fdee69817/volumes/kubernetes.io~projected/kube-api-access major:0 minor:253 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/26180f77-0b1a-4d0f-9ed0-a12fdee69817/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/26180f77-0b1a-4d0f-9ed0-a12fdee69817/volumes/kubernetes.io~secret/serving-cert major:0 minor:221 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/2b3d1dc7-22f9-4c0c-802a-d7314894b255/volumes/kubernetes.io~projected/kube-api-access-zfgc6:{mountpoint:/var/lib/kubelet/pods/2b3d1dc7-22f9-4c0c-802a-d7314894b255/volumes/kubernetes.io~projected/kube-api-access-zfgc6 major:0 minor:318 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/2dd4279d-a1a9-450a-a061-9008cd1ea8e0/volumes/kubernetes.io~projected/kube-api-access-pnzt7:{mountpoint:/var/lib/kubelet/pods/2dd4279d-a1a9-450a-a061-9008cd1ea8e0/volumes/kubernetes.io~projected/kube-api-access-pnzt7 major:0 minor:233 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/2dd4279d-a1a9-450a-a061-9008cd1ea8e0/volumes/kubernetes.io~secret/srv-cert:{mountpoint:/var/lib/kubelet/pods/2dd4279d-a1a9-450a-a061-9008cd1ea8e0/volumes/kubernetes.io~secret/srv-cert major:0 minor:519 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/2f59fe81-deee-4ced-ae9d-f17752c82c4b/volumes/kubernetes.io~projected/ca-certs:{mountpoint:/var/lib/kubelet/pods/2f59fe81-deee-4ced-ae9d-f17752c82c4b/volumes/kubernetes.io~projected/ca-certs major:0 minor:455 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/2f59fe81-deee-4ced-ae9d-f17752c82c4b/volumes/kubernetes.io~projected/kube-api-access-bm7bw:{mountpoint:/var/lib/kubelet/pods/2f59fe81-deee-4ced-ae9d-f17752c82c4b/volumes/kubernetes.io~projected/kube-api-access-bm7bw major:0 minor:473 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/30211469-7108-4820-a988-26fc4ced734e/volumes/kubernetes.io~projected/kube-api-access-fncng:{mountpoint:/var/lib/kubelet/pods/30211469-7108-4820-a988-26fc4ced734e/volumes/kubernetes.io~projected/kube-api-access-fncng major:0 minor:229 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/30211469-7108-4820-a988-26fc4ced734e/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/30211469-7108-4820-a988-26fc4ced734e/volumes/kubernetes.io~secret/serving-cert major:0 minor:218 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/33ed331b-89e9-45f8-ab3c-4533a77cc7b6/volumes/kubernetes.io~projected/kube-api-access-hmsj5:{mountpoint:/var/lib/kubelet/pods/33ed331b-89e9-45f8-ab3c-4533a77cc7b6/volumes/kubernetes.io~projected/kube-api-access-hmsj5 major:0 minor:303 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/33ed331b-89e9-45f8-ab3c-4533a77cc7b6/volumes/kubernetes.io~secret/cloud-controller-manager-operator-tls:{mountpoint:/var/lib/kubelet/pods/33ed331b-89e9-45f8-ab3c-4533a77cc7b6/volumes/kubernetes.io~secret/cloud-controller-manager-operator-tls major:0 minor:112 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3ddfd0e7-fe76-41bc-b316-94505df81002/volumes/kubernetes.io~projected/kube-api-access-bgc7c:{mountpoint:/var/lib/kubelet/pods/3ddfd0e7-fe76-41bc-b316-94505df81002/volumes/kubernetes.io~projected/kube-api-access-bgc7c major:0 minor:92 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/3ddfd0e7-fe76-41bc-b316-94505df81002/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/3ddfd0e7-fe76-41bc-b316-94505df81002/volumes/kubernetes.io~secret/metrics-tls major:0 minor:43 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3f7d2cef-b17b-43ba-a222-9e6e8d8352e2/volumes/kubernetes.io~projected/kube-api-access-pb87l:{mountpoint:/var/lib/kubelet/pods/3f7d2cef-b17b-43ba-a222-9e6e8d8352e2/volumes/kubernetes.io~projected/kube-api-access-pb87l major:0 minor:642 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3f7d2cef-b17b-43ba-a222-9e6e8d8352e2/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/3f7d2cef-b17b-43ba-a222-9e6e8d8352e2/volumes/kubernetes.io~secret/serving-cert major:0 minor:611 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/49ec083d-dc74-457e-b10f-3bde04e9e75e/volumes/kubernetes.io~projected/kube-api-access-zcjr9:{mountpoint:/var/lib/kubelet/pods/49ec083d-dc74-457e-b10f-3bde04e9e75e/volumes/kubernetes.io~projected/kube-api-access-zcjr9 major:0 minor:442 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/49ec083d-dc74-457e-b10f-3bde04e9e75e/volumes/kubernetes.io~secret/signing-key:{mountpoint:/var/lib/kubelet/pods/49ec083d-dc74-457e-b10f-3bde04e9e75e/volumes/kubernetes.io~secret/signing-key major:0 minor:441 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4a19441e-e61b-4d58-85db-813ae88e1f9b/volumes/kubernetes.io~projected/kube-api-access-dw7bx:{mountpoint:/var/lib/kubelet/pods/4a19441e-e61b-4d58-85db-813ae88e1f9b/volumes/kubernetes.io~projected/kube-api-access-dw7bx major:0 minor:118 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4c5a0c1d-867a-4ce4-9570-ea66452c8db3/volumes/kubernetes.io~projected/kube-api-access-mkzb2:{mountpoint:/var/lib/kubelet/pods/4c5a0c1d-867a-4ce4-9570-ea66452c8db3/volumes/kubernetes.io~projected/kube-api-access-mkzb2 major:0 minor:256 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/52b495ac-bb28-44f3-b925-3c54f86d5ec4/volumes/kubernetes.io~projected/kube-api-access-dd549:{mountpoint:/var/lib/kubelet/pods/52b495ac-bb28-44f3-b925-3c54f86d5ec4/volumes/kubernetes.io~projected/kube-api-access-dd549 major:0 minor:235 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/54ad284e-d40e-4e69-b898-f5093952a0e6/volumes/kubernetes.io~projected/kube-api-access-9lfcj:{mountpoint:/var/lib/kubelet/pods/54ad284e-d40e-4e69-b898-f5093952a0e6/volumes/kubernetes.io~projected/kube-api-access-9lfcj major:0 minor:242 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/54ad284e-d40e-4e69-b898-f5093952a0e6/volumes/kubernetes.io~secret/marketplace-operator-metrics:{mountpoint:/var/lib/kubelet/pods/54ad284e-d40e-4e69-b898-f5093952a0e6/volumes/kubernetes.io~secret/marketplace-operator-metrics major:0 minor:520 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a/volumes/kubernetes.io~projected/kube-api-access-cfvnn:{mountpoint:/var/lib/kubelet/pods/5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a/volumes/kubernetes.io~projected/kube-api-access-cfvnn major:0 minor:255 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a/volumes/kubernetes.io~secret/serving-cert major:0 minor:214 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5a7752f9-7b9a-451f-997a-e9f696d38b34/volumes/kubernetes.io~projected/kube-api-access-8b5zb:{mountpoint:/var/lib/kubelet/pods/5a7752f9-7b9a-451f-997a-e9f696d38b34/volumes/kubernetes.io~projected/kube-api-access-8b5zb major:0 minor:219 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5a7752f9-7b9a-451f-997a-e9f696d38b34/volumes/kubernetes.io~secret/etcd-client:{mountpoint:/var/lib/kubelet/pods/5a7752f9-7b9a-451f-997a-e9f696d38b34/volumes/kubernetes.io~secret/etcd-client major:0 minor:209 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/5a7752f9-7b9a-451f-997a-e9f696d38b34/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/5a7752f9-7b9a-451f-997a-e9f696d38b34/volumes/kubernetes.io~secret/serving-cert major:0 minor:213 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/634c0f6d-bce6-42cf-9253-80d1bcc7c507/volumes/kubernetes.io~projected/kube-api-access-8cwmn:{mountpoint:/var/lib/kubelet/pods/634c0f6d-bce6-42cf-9253-80d1bcc7c507/volumes/kubernetes.io~projected/kube-api-access-8cwmn major:0 minor:846 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/634c0f6d-bce6-42cf-9253-80d1bcc7c507/volumes/kubernetes.io~secret/samples-operator-tls:{mountpoint:/var/lib/kubelet/pods/634c0f6d-bce6-42cf-9253-80d1bcc7c507/volumes/kubernetes.io~secret/samples-operator-tls major:0 minor:840 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f/volumes/kubernetes.io~projected/bound-sa-token:{mountpoint:/var/lib/kubelet/pods/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f/volumes/kubernetes.io~projected/bound-sa-token major:0 minor:247 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f/volumes/kubernetes.io~projected/kube-api-access-fw7mr:{mountpoint:/var/lib/kubelet/pods/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f/volumes/kubernetes.io~projected/kube-api-access-fw7mr major:0 minor:231 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f/volumes/kubernetes.io~secret/image-registry-operator-tls:{mountpoint:/var/lib/kubelet/pods/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f/volumes/kubernetes.io~secret/image-registry-operator-tls major:0 minor:510 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6bee226a-2a66-4032-8aba-2c8b82abcb6a/volumes/kubernetes.io~projected/kube-api-access-tp98d:{mountpoint:/var/lib/kubelet/pods/6bee226a-2a66-4032-8aba-2c8b82abcb6a/volumes/kubernetes.io~projected/kube-api-access-tp98d major:0 minor:831 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/6cde5024-edf7-4fa4-8964-cabe7899578b/volumes/kubernetes.io~projected/kube-api-access-x997v:{mountpoint:/var/lib/kubelet/pods/6cde5024-edf7-4fa4-8964-cabe7899578b/volumes/kubernetes.io~projected/kube-api-access-x997v major:0 minor:237 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6cde5024-edf7-4fa4-8964-cabe7899578b/volumes/kubernetes.io~secret/package-server-manager-serving-cert:{mountpoint:/var/lib/kubelet/pods/6cde5024-edf7-4fa4-8964-cabe7899578b/volumes/kubernetes.io~secret/package-server-manager-serving-cert major:0 minor:503 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/76ba45a2-8945-4afe-b913-126c26725867/volumes/kubernetes.io~projected/kube-api-access-dtts2:{mountpoint:/var/lib/kubelet/pods/76ba45a2-8945-4afe-b913-126c26725867/volumes/kubernetes.io~projected/kube-api-access-dtts2 major:0 minor:583 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/76ba45a2-8945-4afe-b913-126c26725867/volumes/kubernetes.io~secret/encryption-config:{mountpoint:/var/lib/kubelet/pods/76ba45a2-8945-4afe-b913-126c26725867/volumes/kubernetes.io~secret/encryption-config major:0 minor:581 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/76ba45a2-8945-4afe-b913-126c26725867/volumes/kubernetes.io~secret/etcd-client:{mountpoint:/var/lib/kubelet/pods/76ba45a2-8945-4afe-b913-126c26725867/volumes/kubernetes.io~secret/etcd-client major:0 minor:576 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/76ba45a2-8945-4afe-b913-126c26725867/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/76ba45a2-8945-4afe-b913-126c26725867/volumes/kubernetes.io~secret/serving-cert major:0 minor:582 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7b485db9-29b5-45a1-a4fb-b4264c6bf2d6/volumes/kubernetes.io~projected/kube-api-access-nfz6w:{mountpoint:/var/lib/kubelet/pods/7b485db9-29b5-45a1-a4fb-b4264c6bf2d6/volumes/kubernetes.io~projected/kube-api-access-nfz6w major:0 minor:769 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/7b485db9-29b5-45a1-a4fb-b4264c6bf2d6/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/7b485db9-29b5-45a1-a4fb-b4264c6bf2d6/volumes/kubernetes.io~secret/metrics-tls major:0 minor:747 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7e5935ea-8d95-45e3-b836-c7892953ef3d/volumes/kubernetes.io~projected/kube-api-access-c6gml:{mountpoint:/var/lib/kubelet/pods/7e5935ea-8d95-45e3-b836-c7892953ef3d/volumes/kubernetes.io~projected/kube-api-access-c6gml major:0 minor:125 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7e5935ea-8d95-45e3-b836-c7892953ef3d/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert:{mountpoint:/var/lib/kubelet/pods/7e5935ea-8d95-45e3-b836-c7892953ef3d/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert major:0 minor:124 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7ff63c73-62a3-44b4-acd3-1b3df175794f/volumes/kubernetes.io~projected/kube-api-access-vfqc5:{mountpoint:/var/lib/kubelet/pods/7ff63c73-62a3-44b4-acd3-1b3df175794f/volumes/kubernetes.io~projected/kube-api-access-vfqc5 major:0 minor:239 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7ff63c73-62a3-44b4-acd3-1b3df175794f/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert:{mountpoint:/var/lib/kubelet/pods/7ff63c73-62a3-44b4-acd3-1b3df175794f/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert major:0 minor:222 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/84d353ae-3992-4c17-a20e-3415edd92509/volumes/kubernetes.io~projected/kube-api-access-7smmf:{mountpoint:/var/lib/kubelet/pods/84d353ae-3992-4c17-a20e-3415edd92509/volumes/kubernetes.io~projected/kube-api-access-7smmf major:0 minor:387 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8e283f49-b85d-4789-a71f-3fcb5033cdf0/volumes/kubernetes.io~projected/kube-api-access-zm59c:{mountpoint:/var/lib/kubelet/pods/8e283f49-b85d-4789-a71f-3fcb5033cdf0/volumes/kubernetes.io~projected/kube-api-access-zm59c major:0 minor:319 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/8efdcef9-9b31-4567-b7f9-cb59a894273d/volumes/kubernetes.io~projected/kube-api-access-cpsx7:{mountpoint:/var/lib/kubelet/pods/8efdcef9-9b31-4567-b7f9-cb59a894273d/volumes/kubernetes.io~projected/kube-api-access-cpsx7 major:0 minor:254 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8efdcef9-9b31-4567-b7f9-cb59a894273d/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/8efdcef9-9b31-4567-b7f9-cb59a894273d/volumes/kubernetes.io~secret/metrics-tls major:0 minor:513 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/9ec89e27-4360-48f2-a7ca-5d823bda4510/volumes/kubernetes.io~projected/kube-api-access-vndvf:{mountpoint:/var/lib/kubelet/pods/9ec89e27-4360-48f2-a7ca-5d823bda4510/volumes/kubernetes.io~projected/kube-api-access-vndvf major:0 minor:386 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a60bc804-52e7-422a-87fd-ac4c5aa90cb3/volumes/kubernetes.io~projected/kube-api-access-zxkm6:{mountpoint:/var/lib/kubelet/pods/a60bc804-52e7-422a-87fd-ac4c5aa90cb3/volumes/kubernetes.io~projected/kube-api-access-zxkm6 major:0 minor:230 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a60bc804-52e7-422a-87fd-ac4c5aa90cb3/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/a60bc804-52e7-422a-87fd-ac4c5aa90cb3/volumes/kubernetes.io~secret/serving-cert major:0 minor:217 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a6c4695c-da78-46b6-8f92-ca93c5ebb96b/volumes/kubernetes.io~projected/kube-api-access-bd7d5:{mountpoint:/var/lib/kubelet/pods/a6c4695c-da78-46b6-8f92-ca93c5ebb96b/volumes/kubernetes.io~projected/kube-api-access-bd7d5 major:0 minor:641 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a6c4695c-da78-46b6-8f92-ca93c5ebb96b/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/a6c4695c-da78-46b6-8f92-ca93c5ebb96b/volumes/kubernetes.io~secret/serving-cert major:0 minor:612 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/b3eea925-73b3-4693-8f0e-6dd26107f60a/volumes/kubernetes.io~projected/kube-api-access-6sx5s:{mountpoint:/var/lib/kubelet/pods/b3eea925-73b3-4693-8f0e-6dd26107f60a/volumes/kubernetes.io~projected/kube-api-access-6sx5s major:0 minor:234 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b3eea925-73b3-4693-8f0e-6dd26107f60a/volumes/kubernetes.io~secret/cluster-storage-operator-serving-cert:{mountpoint:/var/lib/kubelet/pods/b3eea925-73b3-4693-8f0e-6dd26107f60a/volumes/kubernetes.io~secret/cluster-storage-operator-serving-cert major:0 minor:224 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b70adfe9-94f1-44bc-85ce-498e5f0a1ca7/volumes/kubernetes.io~projected/kube-api-access-6nlq2:{mountpoint:/var/lib/kubelet/pods/b70adfe9-94f1-44bc-85ce-498e5f0a1ca7/volumes/kubernetes.io~projected/kube-api-access-6nlq2 major:0 minor:894 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b70adfe9-94f1-44bc-85ce-498e5f0a1ca7/volumes/kubernetes.io~secret/machine-api-operator-tls:{mountpoint:/var/lib/kubelet/pods/b70adfe9-94f1-44bc-85ce-498e5f0a1ca7/volumes/kubernetes.io~secret/machine-api-operator-tls major:0 minor:895 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f/volumes/kubernetes.io~projected/kube-api-access-4v45k:{mountpoint:/var/lib/kubelet/pods/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f/volumes/kubernetes.io~projected/kube-api-access-4v45k major:0 minor:449 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f/volumes/kubernetes.io~secret/encryption-config:{mountpoint:/var/lib/kubelet/pods/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f/volumes/kubernetes.io~secret/encryption-config major:0 minor:440 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f/volumes/kubernetes.io~secret/etcd-client:{mountpoint:/var/lib/kubelet/pods/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f/volumes/kubernetes.io~secret/etcd-client major:0 minor:439 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f/volumes/kubernetes.io~secret/serving-cert major:0 minor:434 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c0861ccd-5e86-4277-9082-95f3133508a0/volumes/kubernetes.io~projected/kube-api-access-n7rsc:{mountpoint:/var/lib/kubelet/pods/c0861ccd-5e86-4277-9082-95f3133508a0/volumes/kubernetes.io~projected/kube-api-access-n7rsc major:0 minor:841 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c0861ccd-5e86-4277-9082-95f3133508a0/volumes/kubernetes.io~secret/cloud-credential-operator-serving-cert:{mountpoint:/var/lib/kubelet/pods/c0861ccd-5e86-4277-9082-95f3133508a0/volumes/kubernetes.io~secret/cloud-credential-operator-serving-cert major:0 minor:839 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c9de4939-680a-4e3e-89fd-e20ecb8b10f2/volumes/kubernetes.io~projected/bound-sa-token:{mountpoint:/var/lib/kubelet/pods/c9de4939-680a-4e3e-89fd-e20ecb8b10f2/volumes/kubernetes.io~projected/bound-sa-token major:0 minor:250 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c9de4939-680a-4e3e-89fd-e20ecb8b10f2/volumes/kubernetes.io~projected/kube-api-access-29dpg:{mountpoint:/var/lib/kubelet/pods/c9de4939-680a-4e3e-89fd-e20ecb8b10f2/volumes/kubernetes.io~projected/kube-api-access-29dpg major:0 minor:248 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c9de4939-680a-4e3e-89fd-e20ecb8b10f2/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/c9de4939-680a-4e3e-89fd-e20ecb8b10f2/volumes/kubernetes.io~secret/metrics-tls major:0 minor:518 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d2cd5b23-e622-4b96-aee8-dbc942b73b4a/volumes/kubernetes.io~projected/kube-api-access-jljzc:{mountpoint:/var/lib/kubelet/pods/d2cd5b23-e622-4b96-aee8-dbc942b73b4a/volumes/kubernetes.io~projected/kube-api-access-jljzc major:0 minor:775 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/d5044ffd-0686-4679-9894-e696faf33699/volumes/kubernetes.io~projected/kube-api-access-mmhtb:{mountpoint:/var/lib/kubelet/pods/d5044ffd-0686-4679-9894-e696faf33699/volumes/kubernetes.io~projected/kube-api-access-mmhtb major:0 minor:123 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d5044ffd-0686-4679-9894-e696faf33699/volumes/kubernetes.io~secret/metrics-certs:{mountpoint:/var/lib/kubelet/pods/d5044ffd-0686-4679-9894-e696faf33699/volumes/kubernetes.io~secret/metrics-certs major:0 minor:530 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d831cb23-7411-4072-8273-c167d9afca28/volumes/kubernetes.io~projected/kube-api-access-dwkwt:{mountpoint:/var/lib/kubelet/pods/d831cb23-7411-4072-8273-c167d9afca28/volumes/kubernetes.io~projected/kube-api-access-dwkwt major:0 minor:236 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d831cb23-7411-4072-8273-c167d9afca28/volumes/kubernetes.io~secret/cert:{mountpoint:/var/lib/kubelet/pods/d831cb23-7411-4072-8273-c167d9afca28/volumes/kubernetes.io~secret/cert major:0 minor:512 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d831cb23-7411-4072-8273-c167d9afca28/volumes/kubernetes.io~secret/cluster-baremetal-operator-tls:{mountpoint:/var/lib/kubelet/pods/d831cb23-7411-4072-8273-c167d9afca28/volumes/kubernetes.io~secret/cluster-baremetal-operator-tls major:0 minor:515 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e187516f-8f33-4c17-81d6-60c10b580bb0/volumes/kubernetes.io~empty-dir/etc-tuned:{mountpoint:/var/lib/kubelet/pods/e187516f-8f33-4c17-81d6-60c10b580bb0/volumes/kubernetes.io~empty-dir/etc-tuned major:0 minor:756 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e187516f-8f33-4c17-81d6-60c10b580bb0/volumes/kubernetes.io~empty-dir/tmp:{mountpoint:/var/lib/kubelet/pods/e187516f-8f33-4c17-81d6-60c10b580bb0/volumes/kubernetes.io~empty-dir/tmp major:0 minor:755 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/e187516f-8f33-4c17-81d6-60c10b580bb0/volumes/kubernetes.io~projected/kube-api-access-vg9kg:{mountpoint:/var/lib/kubelet/pods/e187516f-8f33-4c17-81d6-60c10b580bb0/volumes/kubernetes.io~projected/kube-api-access-vg9kg major:0 minor:689 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e4541b7b-3f7f-4851-9bd9-26fcda5cab13/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/e4541b7b-3f7f-4851-9bd9-26fcda5cab13/volumes/kubernetes.io~projected/kube-api-access major:0 minor:240 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e4541b7b-3f7f-4851-9bd9-26fcda5cab13/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/e4541b7b-3f7f-4851-9bd9-26fcda5cab13/volumes/kubernetes.io~secret/serving-cert major:0 minor:216 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e78b283b-981e-48d7-a5f2-53f8401766ea/volumes/kubernetes.io~projected/kube-api-access-rchj5:{mountpoint:/var/lib/kubelet/pods/e78b283b-981e-48d7-a5f2-53f8401766ea/volumes/kubernetes.io~projected/kube-api-access-rchj5 major:0 minor:251 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e78b283b-981e-48d7-a5f2-53f8401766ea/volumes/kubernetes.io~secret/proxy-tls:{mountpoint:/var/lib/kubelet/pods/e78b283b-981e-48d7-a5f2-53f8401766ea/volumes/kubernetes.io~secret/proxy-tls major:0 minor:511 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e93b5361-30e6-44fd-a59e-2bc410c59480/volumes/kubernetes.io~projected/kube-api-access-4kc5q:{mountpoint:/var/lib/kubelet/pods/e93b5361-30e6-44fd-a59e-2bc410c59480/volumes/kubernetes.io~projected/kube-api-access-4kc5q major:0 minor:317 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ee586416-6f56-4ea4-ad62-95de1e6df23b/volumes/kubernetes.io~projected/kube-api-access-sxxhh:{mountpoint:/var/lib/kubelet/pods/ee586416-6f56-4ea4-ad62-95de1e6df23b/volumes/kubernetes.io~projected/kube-api-access-sxxhh major:0 minor:232 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/ee586416-6f56-4ea4-ad62-95de1e6df23b/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/ee586416-6f56-4ea4-ad62-95de1e6df23b/volumes/kubernetes.io~secret/serving-cert major:0 minor:215 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f0f5f3f3-0856-4da3-9157-15f65c6aba6e/volume-subpaths/run-systemd/ovnkube-controller/6:{mountpoint:/var/lib/kubelet/pods/f0f5f3f3-0856-4da3-9157-15f65c6aba6e/volume-subpaths/run-systemd/ovnkube-controller/6 major:0 minor:24 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f0f5f3f3-0856-4da3-9157-15f65c6aba6e/volumes/kubernetes.io~projected/kube-api-access-2vklx:{mountpoint:/var/lib/kubelet/pods/f0f5f3f3-0856-4da3-9157-15f65c6aba6e/volumes/kubernetes.io~projected/kube-api-access-2vklx major:0 minor:127 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f0f5f3f3-0856-4da3-9157-15f65c6aba6e/volumes/kubernetes.io~secret/ovn-node-metrics-cert:{mountpoint:/var/lib/kubelet/pods/f0f5f3f3-0856-4da3-9157-15f65c6aba6e/volumes/kubernetes.io~secret/ovn-node-metrics-cert major:0 minor:126 fsType:tmpfs blockSize:0} overlay_0-102:{mountpoint:/var/lib/containers/storage/overlay/159f70f0295d276199db67e711ddadd5fa50e846ee6c064a9f037adbe03c5960/merged major:0 minor:102 fsType:overlay blockSize:0} overlay_0-110:{mountpoint:/var/lib/containers/storage/overlay/548d8d0a6f9b5a1ac12432bf89c742ecb035e08ac66e47370f04af2499168993/merged major:0 minor:110 fsType:overlay blockSize:0} overlay_0-113:{mountpoint:/var/lib/containers/storage/overlay/554bee3be73d6209d37a327c3d569d11cf75a2e510dfcfdd35235a819675ac09/merged major:0 minor:113 fsType:overlay blockSize:0} overlay_0-116:{mountpoint:/var/lib/containers/storage/overlay/1b06806633b7877a750d9205fc7919bf2454ea5da4d239059bb94a650c396de9/merged major:0 minor:116 fsType:overlay blockSize:0} overlay_0-121:{mountpoint:/var/lib/containers/storage/overlay/f4bcba9ea7a2a47a2ec69ef757a6ece5bd493a6f40bb501912affa429fc33f84/merged major:0 minor:121 fsType:overlay blockSize:0} 
overlay_0-132:{mountpoint:/var/lib/containers/storage/overlay/5770c755fb57991d02f24c9c55b95e329ffdc777a197559deca6c85b2066f1e6/merged major:0 minor:132 fsType:overlay blockSize:0} overlay_0-134:{mountpoint:/var/lib/containers/storage/overlay/39c37fc286cbfdda810328109d5c626e28cbbea3f0c88e91d5056e30b6a6badf/merged major:0 minor:134 fsType:overlay blockSize:0} overlay_0-136:{mountpoint:/var/lib/containers/storage/overlay/8c2bc3b43f7c74fc829c708cfc407eb7d0becab6bcfc35aef288288139d72e16/merged major:0 minor:136 fsType:overlay blockSize:0} overlay_0-140:{mountpoint:/var/lib/containers/storage/overlay/ddb1448d1eb6712ee303b77c4ae081aff6e6d861313662abd10d8c9c6f60efcf/merged major:0 minor:140 fsType:overlay blockSize:0} overlay_0-150:{mountpoint:/var/lib/containers/storage/overlay/f89a5c80aa1c00850dfa9a29cae8433e43c2a966c88d623e77d0949c12d0b069/merged major:0 minor:150 fsType:overlay blockSize:0} overlay_0-152:{mountpoint:/var/lib/containers/storage/overlay/8a3b37e48224698e7a5aaf8e61fb6beb92dab875c519e7b8e4435d4d2f692716/merged major:0 minor:152 fsType:overlay blockSize:0} overlay_0-157:{mountpoint:/var/lib/containers/storage/overlay/24786a76047617ce0a0d69a958ef739a1cf940d4efe226ba35a6039040ac417d/merged major:0 minor:157 fsType:overlay blockSize:0} overlay_0-158:{mountpoint:/var/lib/containers/storage/overlay/126f320739db4f221784c36cb60b60aa19ababc3f76609e8a185043dcc806152/merged major:0 minor:158 fsType:overlay blockSize:0} overlay_0-159:{mountpoint:/var/lib/containers/storage/overlay/871523128e2c0557743aa69d441527d5c16812b9c373dbabf4dc8a4daf386022/merged major:0 minor:159 fsType:overlay blockSize:0} overlay_0-162:{mountpoint:/var/lib/containers/storage/overlay/1061be851c6ccb11e8b03404b0e9200c76dadc90e98ec6be6dc2368f42d38a14/merged major:0 minor:162 fsType:overlay blockSize:0} overlay_0-166:{mountpoint:/var/lib/containers/storage/overlay/917048e2c782fc1a37a81f33899a0f241a6460b9e6ba982339ba86525379d46b/merged major:0 minor:166 fsType:overlay blockSize:0} 
overlay_0-174:{mountpoint:/var/lib/containers/storage/overlay/cbd801d2a28c3f00fe101072390e7e2d5657c4a2fe48742e7960360869a21ca5/merged major:0 minor:174 fsType:overlay blockSize:0} overlay_0-179:{mountpoint:/var/lib/containers/storage/overlay/6eb1fbc6279a817180f1d34052d7f45b3694f592e5bc718b726fa23d8458ab4e/merged major:0 minor:179 fsType:overlay blockSize:0} overlay_0-184:{mountpoint:/var/lib/containers/storage/overlay/584d394e662410989c23408aee166d692041db1664ed67d7e4510dc6d136e9c8/merged major:0 minor:184 fsType:overlay blockSize:0} overlay_0-189:{mountpoint:/var/lib/containers/storage/overlay/28c79e3444b687b213817f4d6d7d0a36990b7c23e1e607c84f2b01b7cc09d655/merged major:0 minor:189 fsType:overlay blockSize:0} overlay_0-194:{mountpoint:/var/lib/containers/storage/overlay/61f59d1cb37f5b7067be0bbf8ff1a728f10ca3e8daddec97c07531154aeb42f4/merged major:0 minor:194 fsType:overlay blockSize:0} overlay_0-195:{mountpoint:/var/lib/containers/storage/overlay/78eaff81c4fedc9d73da1bcd07ee623dfc5d1416466b9577dd75c31b9479d4ed/merged major:0 minor:195 fsType:overlay blockSize:0} overlay_0-204:{mountpoint:/var/lib/containers/storage/overlay/202561aa60da5c15aaa47e5826d80ab313db28b1800b634c2baae69b4a2d7437/merged major:0 minor:204 fsType:overlay blockSize:0} overlay_0-261:{mountpoint:/var/lib/containers/storage/overlay/6291218680ab4337ece6a9e70cfaef9e0a7d7a472c5f31ad52212bcc84310ad6/merged major:0 minor:261 fsType:overlay blockSize:0} overlay_0-266:{mountpoint:/var/lib/containers/storage/overlay/f93bdc2936ca0c10c0d8056847178328412dc5e4a74a09cf60c44f11383d8691/merged major:0 minor:266 fsType:overlay blockSize:0} overlay_0-277:{mountpoint:/var/lib/containers/storage/overlay/40d944ab7563914f374aa3c8068539483fbb6b24296502a2c688d1c14199db6f/merged major:0 minor:277 fsType:overlay blockSize:0} overlay_0-291:{mountpoint:/var/lib/containers/storage/overlay/d57159928d5faa6c29b4ed8265a1460cf6a81432463a4a5bd8181f8ed5329d37/merged major:0 minor:291 fsType:overlay blockSize:0} 
overlay_0-293:{mountpoint:/var/lib/containers/storage/overlay/1c436f9c07c25cf14919f8e86213eb0b5e5f71577655d9feaaa9e15540b80466/merged major:0 minor:293 fsType:overlay blockSize:0} overlay_0-295:{mountpoint:/var/lib/containers/storage/overlay/d4692cd1ab9db5084d70e40c2bb93ca90f631359549979c56ce8f57dd476e6cb/merged major:0 minor:295 fsType:overlay blockSize:0} overlay_0-297:{mountpoint:/var/lib/containers/storage/overlay/3eccf431b3b03d19597cab5c7720e96141321facc11df4acaf9aeae5eb5cff4b/merged major:0 minor:297 fsType:overlay blockSize:0} overlay_0-299:{mountpoint:/var/lib/containers/storage/overlay/a252e23963211d9542968848cf9d48cc012fbf09b292403264d2a45234d9c4a1/merged major:0 minor:299 fsType:overlay blockSize:0} overlay_0-301:{mountpoint:/var/lib/containers/storage/overlay/1e9b0d7ae87d25c638e4b7cc94c220d55da06f12e8d29a756b350cbc4278b711/merged major:0 minor:301 fsType:overlay blockSize:0} overlay_0-305:{mountpoint:/var/lib/containers/storage/overlay/72b7f9c6a6eba9b6512a84c976d33d8807fc575a3d7d5f37d198ac73e29a5925/merged major:0 minor:305 fsType:overlay blockSize:0} overlay_0-307:{mountpoint:/var/lib/containers/storage/overlay/30b8cdc9c1a2def5dc5cad19ba2593e4edb09ee01e41374fafcd612baf67b667/merged major:0 minor:307 fsType:overlay blockSize:0} overlay_0-309:{mountpoint:/var/lib/containers/storage/overlay/6ba9c7d52dc9297679ca3cfa4cb0bf600312c377c92e9fe18f05b36f29678454/merged major:0 minor:309 fsType:overlay blockSize:0} overlay_0-311:{mountpoint:/var/lib/containers/storage/overlay/9de694dedd8cf20a1e634df7e411e9d1989995a6dd528c269076d3816f28e386/merged major:0 minor:311 fsType:overlay blockSize:0} overlay_0-313:{mountpoint:/var/lib/containers/storage/overlay/1871147ec5d62518aec484a24b49ffcf6277fdced02212b869f2837411ff1a9e/merged major:0 minor:313 fsType:overlay blockSize:0} overlay_0-315:{mountpoint:/var/lib/containers/storage/overlay/fd74e22116d6063def16ea49c2763e244563c8c947965c5b40c256759d772ee9/merged major:0 minor:315 fsType:overlay blockSize:0} 
overlay_0-331:{mountpoint:/var/lib/containers/storage/overlay/b67e1f8639ec2a78fcccc3e76084c5c91b6dff88ea590fc55b664bc1c956b3c5/merged major:0 minor:331 fsType:overlay blockSize:0} overlay_0-333:{mountpoint:/var/lib/containers/storage/overlay/4c1e7495b41c953b1c404986b64ba66be8b187ed307e55d1446fe6cc6cd12758/merged major:0 minor:333 fsType:overlay blockSize:0} overlay_0-336:{mountpoint:/var/lib/containers/storage/overlay/47ea16a5c100f30ddf3205c5a165de9272e6b20e291e85e02f154792fc900132/merged major:0 minor:336 fsType:overlay blockSize:0} overlay_0-341:{mountpoint:/var/lib/containers/storage/overlay/057172f4a3b00c681e6792c8d073351a194d01c3449bb5b70fb376461244b273/merged major:0 minor:341 fsType:overlay blockSize:0} overlay_0-347:{mountpoint:/var/lib/containers/storage/overlay/a435b782d202c22b337a3cf7c329808e678dd7f9468c2d25f7027578b4b2a02b/merged major:0 minor:347 fsType:overlay blockSize:0} overlay_0-352:{mountpoint:/var/lib/containers/storage/overlay/654913866f66d582a428fb8f4f2455c17f39997dac1002ec768f040684c4f742/merged major:0 minor:352 fsType:overlay blockSize:0} overlay_0-357:{mountpoint:/var/lib/containers/storage/overlay/4fa264492c26c2a31380ae2a2cf91aac4e84a4e9f6deb3f03a6fbe324226db45/merged major:0 minor:357 fsType:overlay blockSize:0} overlay_0-363:{mountpoint:/var/lib/containers/storage/overlay/9227268c896d4179018f104d17bbe296043a7858e25cf0e087735197dac6901e/merged major:0 minor:363 fsType:overlay blockSize:0} overlay_0-365:{mountpoint:/var/lib/containers/storage/overlay/fcac3c396bb271d6c915821b6c02034f8eae640e5f320f8848d1854e9d3d81c6/merged major:0 minor:365 fsType:overlay blockSize:0} overlay_0-379:{mountpoint:/var/lib/containers/storage/overlay/106dc2f184c6f955c29c397d06f40bb2312a6e47d2d5d03603206e2ce3f9d506/merged major:0 minor:379 fsType:overlay blockSize:0} overlay_0-380:{mountpoint:/var/lib/containers/storage/overlay/91f15d60dac60eac7b81582eea3b43d2045a2b3b05c88697def59980bccd7fe2/merged major:0 minor:380 fsType:overlay blockSize:0} 
overlay_0-392:{mountpoint:/var/lib/containers/storage/overlay/6b264adf14d55b5bd02bf76a308bd42bb49f479abed8f67ab26f0248f32b06ea/merged major:0 minor:392 fsType:overlay blockSize:0} overlay_0-394:{mountpoint:/var/lib/containers/storage/overlay/daa2b92af9b1277b7d84cc9589e98ff120e9dfc0b563360a08a2a8365a744992/merged major:0 minor:394 fsType:overlay blockSize:0} overlay_0-396:{mountpoint:/var/lib/containers/storage/overlay/16bea68753284a8db9d0f699fbfbe2bc2962cabaff31a5b9ff7f616266a8aaec/merged major:0 minor:396 fsType:overlay blockSize:0} overlay_0-398:{mountpoint:/var/lib/containers/storage/overlay/5bb4c64b4d126002d58b2dea73d35e8fb280ea810a1d5ed8e6d12e63addbdf50/merged major:0 minor:398 fsType:overlay blockSize:0} overlay_0-402:{mountpoint:/var/lib/containers/storage/overlay/bd4ab16ffbdf0dda77a3fff722c02524dd684f17f13e64e26ff1d32e3e27c7da/merged major:0 minor:402 fsType:overlay blockSize:0} overlay_0-431:{mountpoint:/var/lib/containers/storage/overlay/b03408211121c27db29bd3595468ca0bfe9c3a3529e2106e7f89960168ee4925/merged major:0 minor:431 fsType:overlay blockSize:0} overlay_0-437:{mountpoint:/var/lib/containers/storage/overlay/b43aeaf72d960b186bea5d66e9277ae74d4eea66d8129731b224d85eb4f9dfcc/merged major:0 minor:437 fsType:overlay blockSize:0} overlay_0-44:{mountpoint:/var/lib/containers/storage/overlay/fc618a0235caaad11133db99a4b849d974263caadaa5c2b628e24398d3b9ec59/merged major:0 minor:44 fsType:overlay blockSize:0} overlay_0-445:{mountpoint:/var/lib/containers/storage/overlay/571dbed9425b77e9843a8df4fab69c18db055829db8660f2c8c1374bd4ec0ed8/merged major:0 minor:445 fsType:overlay blockSize:0} overlay_0-447:{mountpoint:/var/lib/containers/storage/overlay/4d81dc6c502b19f97923151cf33170065c1021871988e688482b26c9fe17403a/merged major:0 minor:447 fsType:overlay blockSize:0} overlay_0-45:{mountpoint:/var/lib/containers/storage/overlay/6eadda0f399c719bc57500231ea5a71223f8ae0b89ce6d997959b0bcb7251ba6/merged major:0 minor:45 fsType:overlay blockSize:0} 
overlay_0-459:{mountpoint:/var/lib/containers/storage/overlay/5cdd98cf2577d5058ef0a5bd3456a8c613e1d9995705a7bbc569ec9dba9eb432/merged major:0 minor:459 fsType:overlay blockSize:0} overlay_0-462:{mountpoint:/var/lib/containers/storage/overlay/46b03075e6ff452d517610705424aeb7fb8eebe8a44ecd5b503f08dd9b3130f3/merged major:0 minor:462 fsType:overlay blockSize:0} overlay_0-467:{mountpoint:/var/lib/containers/storage/overlay/a77989cf63c51999fd1d137866771d858c42380ec946f16e1af1a29ba3374667/merged major:0 minor:467 fsType:overlay blockSize:0} overlay_0-474:{mountpoint:/var/lib/containers/storage/overlay/424f5ec7098899605f08a11f08588b8109cfccc30efe5aeb7b099a1d6c090384/merged major:0 minor:474 fsType:overlay blockSize:0} overlay_0-476:{mountpoint:/var/lib/containers/storage/overlay/62b61f29b16707c872b44d6a8425f860361f5e82184cc6c372ef27aa3234c515/merged major:0 minor:476 fsType:overlay blockSize:0} overlay_0-477:{mountpoint:/var/lib/containers/storage/overlay/b3f84e6934ee4812c2ee511dd025423772511671cbbf65178b067eecc1de5055/merged major:0 minor:477 fsType:overlay blockSize:0} overlay_0-48:{mountpoint:/var/lib/containers/storage/overlay/e9ba6c6084f4569d3a81ae09e2ad74ccd78f8092f77698ea7de7ca9e1f6540e5/merged major:0 minor:48 fsType:overlay blockSize:0} overlay_0-487:{mountpoint:/var/lib/containers/storage/overlay/d65a79c7a5bd2fbf85ff31b006f51c410bc606fe272bbf9786390e7cda4e1a95/merged major:0 minor:487 fsType:overlay blockSize:0} overlay_0-491:{mountpoint:/var/lib/containers/storage/overlay/1a480d8257b2c75de264975452e2a2003157a7546072f456b723ecb14defed17/merged major:0 minor:491 fsType:overlay blockSize:0} overlay_0-497:{mountpoint:/var/lib/containers/storage/overlay/1492c51256abbf7ed775f2287d7bdc19653ba310472c354965ad3ec84cf335d4/merged major:0 minor:497 fsType:overlay blockSize:0} overlay_0-507:{mountpoint:/var/lib/containers/storage/overlay/1f9e26a048e9fd2b093e3271fb5d9e098688e960447350edab38232c2a06e673/merged major:0 minor:507 fsType:overlay blockSize:0} 
overlay_0-52:{mountpoint:/var/lib/containers/storage/overlay/a890d0ffef46362decea31b67811f8a87249fe5fec0237c25fa51a883a8ab49f/merged major:0 minor:52 fsType:overlay blockSize:0} overlay_0-550:{mountpoint:/var/lib/containers/storage/overlay/92eee517ec75736916357367625cfd076b47fc870b04cf1f0baf5ec5ce316a3b/merged major:0 minor:550 fsType:overlay blockSize:0} overlay_0-552:{mountpoint:/var/lib/containers/storage/overlay/b8280330f954e79519970bb2119b383d73181c8bb35425fd403366a5dee1531f/merged major:0 minor:552 fsType:overlay blockSize:0} overlay_0-554:{mountpoint:/var/lib/containers/storage/overlay/5cac3a281dd1c5874a7b0b831b16f21ed4bf96500180002c0e489aafff53409e/merged major:0 minor:554 fsType:overlay blockSize:0} overlay_0-556:{mountpoint:/var/lib/containers/storage/overlay/826565d7c541ec4c6a50470ec4444ee47f2b34f2c5135cec5b0d079d8bb73630/merged major:0 minor:556 fsType:overlay blockSize:0} overlay_0-558:{mountpoint:/var/lib/containers/storage/overlay/53e395dbec9224e0590d5d4a4ce68c0d4bb3d57b0d5fc7a76225622188d6c0a8/merged major:0 minor:558 fsType:overlay blockSize:0} overlay_0-560:{mountpoint:/var/lib/containers/storage/overlay/dcf6560d271c6f6e495287a03ad52a7920f26260e4582e1573495506695472bf/merged major:0 minor:560 fsType:overlay blockSize:0} overlay_0-562:{mountpoint:/var/lib/containers/storage/overlay/b020bf6161566fd6a68d78f1e68c8b96da5861468733e6be4a552372268cf0a9/merged major:0 minor:562 fsType:overlay blockSize:0} overlay_0-564:{mountpoint:/var/lib/containers/storage/overlay/62583b50e7495db0ca6178546319bcbd9280b0cc0a18a678d87d5f4e46cbcd18/merged major:0 minor:564 fsType:overlay blockSize:0} overlay_0-566:{mountpoint:/var/lib/containers/storage/overlay/9dc12c8349414728dd4cb55daf2a61ae033da5002a35e50792211b9a3eaad4a8/merged major:0 minor:566 fsType:overlay blockSize:0} overlay_0-568:{mountpoint:/var/lib/containers/storage/overlay/89ffb793450dece31f4c292db42a2edb9ad62c574cb4453f5f36023fe8e15590/merged major:0 minor:568 fsType:overlay blockSize:0} 
overlay_0-570:{mountpoint:/var/lib/containers/storage/overlay/8fca9b5963759d2c47b004cf0834498011b49d5a24dcb0cad2400581bf807ce1/merged major:0 minor:570 fsType:overlay blockSize:0} overlay_0-572:{mountpoint:/var/lib/containers/storage/overlay/54a07286efa5904ce631718245c4fadcd45a3f4893fe29ffba3f8971e2141db5/merged major:0 minor:572 fsType:overlay blockSize:0} overlay_0-574:{mountpoint:/var/lib/containers/storage/overlay/7a9f43f38d48099cd6c06dcc57a00c8baeb495507f0e3199ed5a0accf9b49479/merged major:0 minor:574 fsType:overlay blockSize:0} overlay_0-589:{mountpoint:/var/lib/containers/storage/overlay/1bf4363ab8d546d6fa0c1f8c61703e6e69d0aa4a357a93ebf637b2c27b11eac8/merged major:0 minor:589 fsType:overlay blockSize:0} overlay_0-591:{mountpoint:/var/lib/containers/storage/overlay/d7306722351976dc3623e75b02b706a73f3a965abf0289827ea300ec2df723c5/merged major:0 minor:591 fsType:overlay blockSize:0} overlay_0-593:{mountpoint:/var/lib/containers/storage/overlay/337ad48b86bd689a35f367a1223261c6da148e93ced4c22e11ff2cd895e16749/merged major:0 minor:593 fsType:overlay blockSize:0} overlay_0-60:{mountpoint:/var/lib/containers/storage/overlay/4366cdbfce7b80b015a1fede6fd9cd5e1feb210bccd09a9f0654db6d43c4c172/merged major:0 minor:60 fsType:overlay blockSize:0} overlay_0-603:{mountpoint:/var/lib/containers/storage/overlay/39f58e27f30008e4a4df736a50534cbbf676de52ed271f8b1f01f991330f8fda/merged major:0 minor:603 fsType:overlay blockSize:0} overlay_0-606:{mountpoint:/var/lib/containers/storage/overlay/b29136d75ddd45668417afd15e44dbe06e81fcf631c546a742ca69b44c9b9fb9/merged major:0 minor:606 fsType:overlay blockSize:0} overlay_0-607:{mountpoint:/var/lib/containers/storage/overlay/15a7be2928fff78be20389243197257b1fa24d49ee0e8aaa9a915f05ed5cc392/merged major:0 minor:607 fsType:overlay blockSize:0} overlay_0-608:{mountpoint:/var/lib/containers/storage/overlay/27a24f57951b952918afd1da67295e41348538c73caf5a4b7933d6faef752718/merged major:0 minor:608 fsType:overlay blockSize:0} 
overlay_0-62:{mountpoint:/var/lib/containers/storage/overlay/1ed6a82112c412fce2d6a82ca1eb9ff35ded51d3e90f9a9d52eac7b8ba755ea3/merged major:0 minor:62 fsType:overlay blockSize:0} overlay_0-620:{mountpoint:/var/lib/containers/storage/overlay/2822609e474fa0b0bfa0d436dda7ea3f24e92917a8e155ab4e80efed7cc1563f/merged major:0 minor:620 fsType:overlay blockSize:0} overlay_0-622:{mountpoint:/var/lib/containers/storage/overlay/5b02bc4f87413f6d24ad7bd292b09adbf98983c6fea480293e65049770ef53c3/merged major:0 minor:622 fsType:overlay blockSize:0} overlay_0-625:{mountpoint:/var/lib/containers/storage/overlay/883451f55722e12a850605a2878e37a04d0ea5e327d271d4601e56acb8b250b4/merged major:0 minor:625 fsType:overlay blockSize:0} overlay_0-626:{mountpoint:/var/lib/containers/storage/overlay/14c690599aabc207c25e52a0ead4bf3d6adf792490fa16dd1ffb5fbb2ae5a22f/merged major:0 minor:626 fsType:overlay blockSize:0} overlay_0-628:{mountpoint:/var/lib/containers/storage/overlay/0530b3320d1cc5625b5fe9ccb66a35bdbc75e950d6d6d7ce53d4d09a2f072499/merged major:0 minor:628 fsType:overlay blockSize:0} overlay_0-63:{mountpoint:/var/lib/containers/storage/overlay/3c6ee32a82948aa299f3d2471653f88df138c61fe1c7d1794403aaa9ad19e5fa/merged major:0 minor:63 fsType:overlay blockSize:0} overlay_0-630:{mountpoint:/var/lib/containers/storage/overlay/9abc34b878742fe059cbfe91bf5e684bfa62742cf1d9025c9f4f5579177d2a22/merged major:0 minor:630 fsType:overlay blockSize:0} overlay_0-634:{mountpoint:/var/lib/containers/storage/overlay/632a4ac3f21c33f637b55095b1bf6777829485025d60c67868ac1800ba834455/merged major:0 minor:634 fsType:overlay blockSize:0} overlay_0-643:{mountpoint:/var/lib/containers/storage/overlay/5d66cdeb623bd28f9a2949f58b1fdc84c33abaaad07233288f685b41f7466ffb/merged major:0 minor:643 fsType:overlay blockSize:0} overlay_0-648:{mountpoint:/var/lib/containers/storage/overlay/a054ffcdcc45cdc6130b2695aea5ea47fc650b580ab97951372d76ea74cf3f69/merged major:0 minor:648 fsType:overlay blockSize:0} 
overlay_0-649:{mountpoint:/var/lib/containers/storage/overlay/6f30a36acf29f9dddca49a1974d45e720592d9b30b5a95e14c624fd60e7152f2/merged major:0 minor:649 fsType:overlay blockSize:0} overlay_0-65:{mountpoint:/var/lib/containers/storage/overlay/ea69cf52ec07158e65e8840728653108c98d9b10b87383331696be0406ff0ca2/merged major:0 minor:65 fsType:overlay blockSize:0} overlay_0-650:{mountpoint:/var/lib/containers/storage/overlay/afdd7649435f79f727e8547ba7cbe7953c1210b84c803e8e1cc6c223f0726a48/merged major:0 minor:650 fsType:overlay blockSize:0} overlay_0-652:{mountpoint:/var/lib/containers/storage/overlay/f987e7ae71df0425ddc33c04288f2ff0e1e8e65044f5bcab265f027f1273ad0e/merged major:0 minor:652 fsType:overlay blockSize:0} overlay_0-654:{mountpoint:/var/lib/containers/storage/overlay/be221164ca97889315f4110f6e5ff0f692489b89ce70780a3710b2223145bb4b/merged major:0 minor:654 fsType:overlay blockSize:0} overlay_0-661:{mountpoint:/var/lib/containers/storage/overlay/2c3b7650ec5d76aa8fd188a695f140e94de941b26886f5e8dbcddabdb96d1be9/merged major:0 minor:661 fsType:overlay blockSize:0} overlay_0-662:{mountpoint:/var/lib/containers/storage/overlay/abf331811f0327156a4105dd66c14b3a3a026bcf9b427c84131419633fcdcd30/merged major:0 minor:662 fsType:overlay blockSize:0} overlay_0-668:{mountpoint:/var/lib/containers/storage/overlay/dcd10099fe8ea8d10bbc7f5c930cd8d7225e7c5d0b4e386be846c5c123017b89/merged major:0 minor:668 fsType:overlay blockSize:0} overlay_0-684:{mountpoint:/var/lib/containers/storage/overlay/64fcfd9159b2b3ca4316a70bb0dd6f98859d7f865fc7e458830b4ab9f3789b32/merged major:0 minor:684 fsType:overlay blockSize:0} overlay_0-686:{mountpoint:/var/lib/containers/storage/overlay/0a4156f2994748ddc7de784b71ee6ad3509a71ac67035cc35eb952b2ab2dde82/merged major:0 minor:686 fsType:overlay blockSize:0} overlay_0-690:{mountpoint:/var/lib/containers/storage/overlay/ba3e3078b43caf53d18f12930d626b917d28e5fb19c926afbd34459173147481/merged major:0 minor:690 fsType:overlay blockSize:0} 
overlay_0-692:{mountpoint:/var/lib/containers/storage/overlay/f8ea5b8ee9ee5da0740cd4911d55c8788b6027dbf7f659815c16ed7f182e4b6d/merged major:0 minor:692 fsType:overlay blockSize:0} overlay_0-705:{mountpoint:/var/lib/containers/storage/overlay/04d2b112b4675da681eecd3ec42df4b342fbbb7daecf293d4f12ab8e0629a228/merged major:0 minor:705 fsType:overlay blockSize:0} overlay_0-707:{mountpoint:/var/lib/containers/storage/overlay/c82066b05257f67303b2f4b28e78a2e868a1e5fc6e74cdafb5c3e7e0cdc98a28/merged major:0 minor:707 fsType:overlay blockSize:0} overlay_0-719:{mountpoint:/var/lib/containers/storage/overlay/954356f24fc378a0acda90799e02ada677162f86cfb981052c587b8461caf289/merged major:0 minor:719 fsType:overlay blockSize:0} overlay_0-726:{mountpoint:/var/lib/containers/storage/overlay/fc673eda314ff71e0b40fe57727f608762fdef9de9c3dc98c17650ab64d660b3/merged major:0 minor:726 fsType:overlay blockSize:0} overlay_0-729:{mountpoint:/var/lib/containers/storage/overlay/53f6c906a830c461989849519a7fbf14b58a60eda89c09c4acbee71a4606387b/merged major:0 minor:729 fsType:overlay blockSize:0} overlay_0-731:{mountpoint:/var/lib/containers/storage/overlay/f3c6a5fcab15531b267be70f117cc6ade7a10185dc9708f6840b08f6a0590ac4/merged major:0 minor:731 fsType:overlay blockSize:0} overlay_0-733:{mountpoint:/var/lib/containers/storage/overlay/e4120c04c7dc4bab3bf846c1b7728f095e2c6a738345e3225fa2d86998781c5a/merged major:0 minor:733 fsType:overlay blockSize:0} overlay_0-74:{mountpoint:/var/lib/containers/storage/overlay/fc44ada267a4ac8be55e758490ea61d23882ec8a4428706701db96d28a18f2fd/merged major:0 minor:74 fsType:overlay blockSize:0} overlay_0-740:{mountpoint:/var/lib/containers/storage/overlay/65530952547b5bebb8ace4c3182ea73d1750b9a6a539cdde53a72983427081b6/merged major:0 minor:740 fsType:overlay blockSize:0} overlay_0-742:{mountpoint:/var/lib/containers/storage/overlay/52e2b3894f02bf083cb2302f5113b839f132333f97e0f2981786aebc4b0b19d5/merged major:0 minor:742 fsType:overlay blockSize:0} 
overlay_0-749:{mountpoint:/var/lib/containers/storage/overlay/092e091de9636612d8fc55422cec59e9abaeeb35aa93a582a53c07fb96200b4a/merged major:0 minor:749 fsType:overlay blockSize:0} overlay_0-777:{mountpoint:/var/lib/containers/storage/overlay/5d14a3c7ea8ed1e5282d4aa3d9878c14ae2a6dadeb1fbbbfa2bf642d6fc424a9/merged major:0 minor:777 fsType:overlay blockSize:0} overlay_0-78:{mountpoint:/var/lib/containers/storage/overlay/a8c5aaab0f67640ef9249ac13c5f725a07dd5d36baff187a206132a446cd7dc6/merged major:0 minor:78 fsType:overlay blockSize:0} overlay_0-780:{mountpoint:/var/lib/containers/storage/overlay/c512df31d8b0f963d9bccd66950c3f7940955beb1c6161bda117d5d9e5fc6b77/merged major:0 minor:780 fsType:overlay blockSize:0} overlay_0-79:{mountpoint:/var/lib/containers/storage/overlay/25b2dce0ee979b4b2f6933f03b35402ca650c561d46d7d238145a65a7219e565/merged major:0 minor:79 fsType:overlay blockSize:0} overlay_0-791:{mountpoint:/var/lib/containers/storage/overlay/ed2caacc196f05a111f73154da45b417daf87f72d59a6e83c34b4363f341dc25/merged major:0 minor:791 fsType:overlay blockSize:0} overlay_0-795:{mountpoint:/var/lib/containers/storage/overlay/80b5872c5d414548c6c9290a4b10c29a070410653df8955c54ea39a2caf38d6d/merged major:0 minor:795 fsType:overlay blockSize:0} overlay_0-796:{mountpoint:/var/lib/containers/storage/overlay/dcf9812018c4a4d026a09f456f661eec2a8a4990a3e8c6fa31ba2182f4eae4ff/merged major:0 minor:796 fsType:overlay blockSize:0} overlay_0-805:{mountpoint:/var/lib/containers/storage/overlay/52839f9e85486547b44acabffb7151c0a98552e11b35c315412b802afe82f8fe/merged major:0 minor:805 fsType:overlay blockSize:0} overlay_0-806:{mountpoint:/var/lib/containers/storage/overlay/6c5e88cd08603310afed2d9ed754a9d418939b47ee21fabd8f6bc103b7844865/merged major:0 minor:806 fsType:overlay blockSize:0} overlay_0-81:{mountpoint:/var/lib/containers/storage/overlay/e8e978a0a92b767880341940c557d5892fcf54f90ac9e7fa2ef664acbf72b05f/merged major:0 minor:81 fsType:overlay blockSize:0} 
overlay_0-812:{mountpoint:/var/lib/containers/storage/overlay/ddb8deb41872710393a2a9a4bf152e35312d2b0dc96ba278cdd6299e27f424ae/merged major:0 minor:812 fsType:overlay blockSize:0} overlay_0-851:{mountpoint:/var/lib/containers/storage/overlay/9377a2c2fc072bf4515f2da8314bc9de1e0c15947f69e7f205dbe27c0dfb2cc7/merged major:0 minor:851 fsType:overlay blockSize:0} overlay_0-853:{mountpoint:/var/lib/containers/storage/overlay/b34e9f0f7ad9ee8c232d3ff35e58197d256e8f492b9efd11e3a1713c7f67685c/merged major:0 minor:853 fsType:overlay blockSize:0} overlay_0-861:{mountpoint:/var/lib/containers/storage/overlay/be1924b6ee51d61c58927d4f76814755dae12ad0bb86c2630700675a8219a169/merged major:0 minor:861 fsType:overlay blockSize:0} overlay_0-862:{mountpoint:/var/lib/containers/storage/overlay/ad2e01c54116a5a32434c29111fa05de2517a2a7e3b6c18fc661a26634b9583b/merged major:0 minor:862 fsType:overlay blockSize:0} overlay_0-868:{mountpoint:/var/lib/containers/storage/overlay/f62f5dd8ef1abe886e669096f375ba627c9ca72217521287c67f228b10d35ae3/merged major:0 minor:868 fsType:overlay blockSize:0} overlay_0-870:{mountpoint:/var/lib/containers/storage/overlay/03a2ef0bdb70c6cf01f14385ad428e89960b6983cc8c2d820bf2e2b418bdd446/merged major:0 minor:870 fsType:overlay blockSize:0} overlay_0-877:{mountpoint:/var/lib/containers/storage/overlay/2a7ab94d6088d0278c0c783fb7defcaffe0a366c02fa97e0052274aea46c86d9/merged major:0 minor:877 fsType:overlay blockSize:0} overlay_0-884:{mountpoint:/var/lib/containers/storage/overlay/6a3e7c14530e834053d6197e4be72af4aa98848ae3c0bef4db539c47724f13cd/merged major:0 minor:884 fsType:overlay blockSize:0} overlay_0-886:{mountpoint:/var/lib/containers/storage/overlay/615bd89a6f2086143ecb4a064f53cb7c420bff643198ccc7aefd963a188c2d57/merged major:0 minor:886 fsType:overlay blockSize:0} overlay_0-893:{mountpoint:/var/lib/containers/storage/overlay/ca616af963a95998fd696dbb6a5cce3e6506571a4c2e1fd67a5b2b2792dd9dee/merged major:0 minor:893 fsType:overlay blockSize:0} 
overlay_0-898:{mountpoint:/var/lib/containers/storage/overlay/d79e9fff87b930fe38dbbdeda1243c49fcee8e67c6b6602d64313175885822f7/merged major:0 minor:898 fsType:overlay blockSize:0} overlay_0-904:{mountpoint:/var/lib/containers/storage/overlay/7aa8ed68563f0d0e71053ba5568a443092b41e0daf550f0524949212a5d30f82/merged major:0 minor:904 fsType:overlay blockSize:0} overlay_0-906:{mountpoint:/var/lib/containers/storage/overlay/f004de66f31df02514bddf8b314fd7901addfd8712256ca6ff903a198011721e/merged major:0 minor:906 fsType:overlay blockSize:0} overlay_0-908:{mountpoint:/var/lib/containers/storage/overlay/1e6e9e78f7e8cb5a40e933266a4383d4fb19ef07d732108df528b52e66fb9a0d/merged major:0 minor:908 fsType:overlay blockSize:0} overlay_0-923:{mountpoint:/var/lib/containers/storage/overlay/811fa9170b4729a811320fe400d6fc9d9476e68fbc97333aac057898fa90a85d/merged major:0 minor:923 fsType:overlay blockSize:0} overlay_0-924:{mountpoint:/var/lib/containers/storage/overlay/767ccd3fbd70c968943f254134bf2a234cf04a54091a3bee9ac733b9e5c1ab8e/merged major:0 minor:924 fsType:overlay blockSize:0} overlay_0-93:{mountpoint:/var/lib/containers/storage/overlay/8e75b582505318b721f7314816406cbd0ce0c3bb8e0d991c0377bbf334400db6/merged major:0 minor:93 fsType:overlay blockSize:0} overlay_0-930:{mountpoint:/var/lib/containers/storage/overlay/d24d523ec8cf8e147443aeeaaf7614aad8b842b8ccb0eb5ee1ac0a06cb3d0acc/merged major:0 minor:930 fsType:overlay blockSize:0} overlay_0-932:{mountpoint:/var/lib/containers/storage/overlay/61319e1b523ec31b4a5a2d19d00089a89b89a8e6571fc82549739e320ab613df/merged major:0 minor:932 fsType:overlay blockSize:0} overlay_0-937:{mountpoint:/var/lib/containers/storage/overlay/545986b15717b3405e63bc3cfb81c9646de9c98ae9ff17eb2ddf96ba07668f34/merged major:0 minor:937 fsType:overlay blockSize:0} overlay_0-940:{mountpoint:/var/lib/containers/storage/overlay/c66605117c5352bfc207dd21e6505f720215e041297b8dda9f0de3ef2ba7b73f/merged major:0 minor:940 fsType:overlay blockSize:0} 
overlay_0-943:{mountpoint:/var/lib/containers/storage/overlay/5145da3344ae889bdc42539c59da10d8d493e42716a92eef3e0dcf87ffdf5f1c/merged major:0 minor:943 fsType:overlay blockSize:0} overlay_0-954:{mountpoint:/var/lib/containers/storage/overlay/ed37b0d7c023004fece2e80543b512b7aa854f4b63c263367422f7bc7c5b529e/merged major:0 minor:954 fsType:overlay blockSize:0} overlay_0-959:{mountpoint:/var/lib/containers/storage/overlay/35d48096ef924731aee6f9e6c5889c3d54c97e402e6026414b15822bcf67c61c/merged major:0 minor:959 fsType:overlay blockSize:0} overlay_0-96:{mountpoint:/var/lib/containers/storage/overlay/99d7c08878ff738a6cf81568dad6b0182d6398762915c14a16f057427a16fafa/merged major:0 minor:96 fsType:overlay blockSize:0} overlay_0-960:{mountpoint:/var/lib/containers/storage/overlay/d2a4c3b3d4c08b5046a2ef4b8d9747880cb92a9cd172adfe776ddf3220dac88c/merged major:0 minor:960 fsType:overlay blockSize:0} overlay_0-962:{mountpoint:/var/lib/containers/storage/overlay/d4b5c44eac46d05373df3b4840dd3d911dabcadea3c61aa8d8d06f0e161fd2cb/merged major:0 minor:962 fsType:overlay blockSize:0} overlay_0-980:{mountpoint:/var/lib/containers/storage/overlay/817eec793d93365f5d0daf1927c9f8e4fa4ebf6e47b1668886f6bc4531200ce6/merged major:0 minor:980 fsType:overlay blockSize:0} overlay_0-99:{mountpoint:/var/lib/containers/storage/overlay/a9b1825459af37974fba5d6143a8a8f78044f6b3378513a8886388da43da8bd4/merged major:0 minor:99 fsType:overlay blockSize:0}] Mar 08 03:53:08.054523 master-0 kubenswrapper[18592]: I0308 03:53:08.053561 18592 manager.go:217] Machine: {Timestamp:2026-03-08 03:53:08.052863767 +0000 UTC m=+0.151618137 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:713fb7c44cd644b5986e15157751dddb SystemUUID:713fb7c4-4cd6-44b5-986e-15157751dddb 
BootID:30e60e76-0e70-41ea-99da-7a4dcafd0e32 Filesystems:[{Device:overlay_0-121 DeviceMajor:0 DeviceMinor:121 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-550 DeviceMajor:0 DeviceMinor:550 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-625 DeviceMajor:0 DeviceMinor:625 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/2262647b-c315-477a-93bd-f168c1810475/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:823 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/b70adfe9-94f1-44bc-85ce-498e5f0a1ca7/volumes/kubernetes.io~secret/machine-api-operator-tls DeviceMajor:0 DeviceMinor:895 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/5a7752f9-7b9a-451f-997a-e9f696d38b34/volumes/kubernetes.io~secret/etcd-client DeviceMajor:0 DeviceMinor:209 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/c9de4939-680a-4e3e-89fd-e20ecb8b10f2/volumes/kubernetes.io~projected/kube-api-access-29dpg DeviceMajor:0 DeviceMinor:248 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a/volumes/kubernetes.io~projected/kube-api-access-cfvnn DeviceMajor:0 DeviceMinor:255 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-562 DeviceMajor:0 DeviceMinor:562 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/8e283f49-b85d-4789-a71f-3fcb5033cdf0/volumes/kubernetes.io~projected/kube-api-access-zm59c DeviceMajor:0 DeviceMinor:319 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-63 DeviceMajor:0 DeviceMinor:63 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-365 DeviceMajor:0 DeviceMinor:365 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/e60e7e3ed5830ae078195a56d09f656f257e2641daffbfad2ebaca1e467bb613/userdata/shm DeviceMajor:0 DeviceMinor:849 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/7e5935ea-8d95-45e3-b836-c7892953ef3d/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert DeviceMajor:0 DeviceMinor:124 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/1cbcb403-a424-4496-8c5c-5eb5e42dfb93/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:220 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/1b69fbf6-1ca5-413e-bffd-965730bcec1b/volumes/kubernetes.io~projected/kube-api-access-nfz27 DeviceMajor:0 DeviceMinor:472 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-729 DeviceMajor:0 DeviceMinor:729 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/6bee226a-2a66-4032-8aba-2c8b82abcb6a/volumes/kubernetes.io~projected/kube-api-access-tp98d DeviceMajor:0 DeviceMinor:831 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/2262647b-c315-477a-93bd-f168c1810475/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:824 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/3ddfd0e7-fe76-41bc-b316-94505df81002/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:43 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/232c421d-96f0-4894-b8d8-74f43d02bbd3/volumes/kubernetes.io~secret/apiservice-cert DeviceMajor:0 DeviceMinor:517 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-564 DeviceMajor:0 DeviceMinor:564 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-643 DeviceMajor:0 DeviceMinor:643 Capacity:214143315968 Type:vfs Inodes:104594880 
HasInodes:true} {Device:overlay_0-158 DeviceMajor:0 DeviceMinor:158 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/adfc23f0d784d89240f88962b8f79cdf84a79077cb7581e94d0e19b479eeafaa/userdata/shm DeviceMajor:0 DeviceMinor:283 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-301 DeviceMajor:0 DeviceMinor:301 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-311 DeviceMajor:0 DeviceMinor:311 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-749 DeviceMajor:0 DeviceMinor:749 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-740 DeviceMajor:0 DeviceMinor:740 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/093f17f0-2818-4e24-b3c3-6ab4da9d21fb/volumes/kubernetes.io~projected/kube-api-access-7nk8r DeviceMajor:0 DeviceMinor:105 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-194 DeviceMajor:0 DeviceMinor:194 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/5a7752f9-7b9a-451f-997a-e9f696d38b34/volumes/kubernetes.io~projected/kube-api-access-8b5zb DeviceMajor:0 DeviceMinor:219 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/30211469-7108-4820-a988-26fc4ced734e/volumes/kubernetes.io~projected/kube-api-access-fncng DeviceMajor:0 DeviceMinor:229 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-297 DeviceMajor:0 DeviceMinor:297 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/2f59fe81-deee-4ced-ae9d-f17752c82c4b/volumes/kubernetes.io~projected/ca-certs DeviceMajor:0 DeviceMinor:455 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/var/lib/kubelet/pods/2f59fe81-deee-4ced-ae9d-f17752c82c4b/volumes/kubernetes.io~projected/kube-api-access-bm7bw DeviceMajor:0 DeviceMinor:473 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-132 DeviceMajor:0 DeviceMinor:132 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/2a8e902cd252f0c879e3e1c00047d04c3e8646bfeed72f034a41537b464f6d14/userdata/shm DeviceMajor:0 DeviceMinor:273 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-731 DeviceMajor:0 DeviceMinor:731 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/634c0f6d-bce6-42cf-9253-80d1bcc7c507/volumes/kubernetes.io~secret/samples-operator-tls DeviceMajor:0 DeviceMinor:840 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/0ebf1330-e044-4ff5-8b48-2d667e0c5625/volumes/kubernetes.io~projected/kube-api-access-hccv4 DeviceMajor:0 DeviceMinor:241 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-363 DeviceMajor:0 DeviceMinor:363 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-474 DeviceMajor:0 DeviceMinor:474 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-626 DeviceMajor:0 DeviceMinor:626 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-396 DeviceMajor:0 DeviceMinor:396 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-45 DeviceMajor:0 DeviceMinor:45 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-690 DeviceMajor:0 DeviceMinor:690 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/3f7d2cef-b17b-43ba-a222-9e6e8d8352e2/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:611 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/vda4 
DeviceMajor:252 DeviceMinor:4 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/7ff63c73-62a3-44b4-acd3-1b3df175794f/volumes/kubernetes.io~projected/kube-api-access-vfqc5 DeviceMajor:0 DeviceMinor:239 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/6cde5024-edf7-4fa4-8964-cabe7899578b/volumes/kubernetes.io~secret/package-server-manager-serving-cert DeviceMajor:0 DeviceMinor:503 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/2dd4279d-a1a9-450a-a061-9008cd1ea8e0/volumes/kubernetes.io~secret/srv-cert DeviceMajor:0 DeviceMinor:519 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-113 DeviceMajor:0 DeviceMinor:113 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-189 DeviceMajor:0 DeviceMinor:189 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/49aec4971047b96e14ae56703fe099426b567477422c0add4be258e7ae9b7ff1/userdata/shm DeviceMajor:0 DeviceMinor:259 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-733 DeviceMajor:0 DeviceMinor:733 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/c4262d5f7cd90d77070e291ffede65804485b27ce848841d5c9b49cfb475af2e/userdata/shm DeviceMajor:0 DeviceMinor:646 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/5a7752f9-7b9a-451f-997a-e9f696d38b34/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:213 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/e4541b7b-3f7f-4851-9bd9-26fcda5cab13/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:240 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-560 DeviceMajor:0 DeviceMinor:560 Capacity:214143315968 Type:vfs 
Inodes:104594880 HasInodes:true} {Device:overlay_0-431 DeviceMajor:0 DeviceMinor:431 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-81 DeviceMajor:0 DeviceMinor:81 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-184 DeviceMajor:0 DeviceMinor:184 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f/volumes/kubernetes.io~secret/image-registry-operator-tls DeviceMajor:0 DeviceMinor:510 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-554 DeviceMajor:0 DeviceMinor:554 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-556 DeviceMajor:0 DeviceMinor:556 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-692 DeviceMajor:0 DeviceMinor:692 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/1b700d17-83d2-46c8-afbc-e5774822eabe/volumes/kubernetes.io~secret/cert DeviceMajor:0 DeviceMinor:866 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/0362cae60811bf2651cbf0189c6f7ef23ef9a8b3134d671278cc424b4c2ad9ec/userdata/shm DeviceMajor:0 DeviceMinor:874 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-93 DeviceMajor:0 DeviceMinor:93 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/0418ff42-7eac-4266-97b5-4df88623d066/volumes/kubernetes.io~projected/kube-api-access-kmpdd DeviceMajor:0 DeviceMinor:245 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/e187516f-8f33-4c17-81d6-60c10b580bb0/volumes/kubernetes.io~projected/kube-api-access-vg9kg DeviceMajor:0 DeviceMinor:689 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-812 DeviceMajor:0 DeviceMinor:812 Capacity:214143315968 Type:vfs Inodes:104594880 
HasInodes:true} {Device:/run/containers/storage/overlay-containers/c7817ab55e81e53bd9a1c875e0b10710e15527bb4f619ad1dc5011c4087c74fe/userdata/shm DeviceMajor:0 DeviceMinor:321 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-347 DeviceMajor:0 DeviceMinor:347 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-648 DeviceMajor:0 DeviceMinor:648 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-140 DeviceMajor:0 DeviceMinor:140 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-620 DeviceMajor:0 DeviceMinor:620 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-870 DeviceMajor:0 DeviceMinor:870 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/47433e6e63affa1ba02608e11b299ca5af00d1c85e6731e35f43a4b241522538/userdata/shm DeviceMajor:0 DeviceMinor:286 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-309 DeviceMajor:0 DeviceMinor:309 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/e78b283b-981e-48d7-a5f2-53f8401766ea/volumes/kubernetes.io~secret/proxy-tls DeviceMajor:0 DeviceMinor:511 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/f7ac4af0eeac6f90547286a05d56708b5e0e75b0367c4826038733ce85075489/userdata/shm DeviceMajor:0 DeviceMinor:521 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-74 DeviceMajor:0 DeviceMinor:74 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-62 DeviceMajor:0 DeviceMinor:62 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-980 DeviceMajor:0 DeviceMinor:980 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-102 DeviceMajor:0 DeviceMinor:102 Capacity:214143315968 Type:vfs 
Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/2899b4e2a1cabd8aea96b1bf0db490c7e98f0e9564c40236186985f7b516039b/userdata/shm DeviceMajor:0 DeviceMinor:269 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/4482916b3b4b521cf75927dd45a05e0a2072a49de37c125a72612ca885ff96ce/userdata/shm DeviceMajor:0 DeviceMinor:279 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f/volumes/kubernetes.io~secret/etcd-client DeviceMajor:0 DeviceMinor:439 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-476 DeviceMajor:0 DeviceMinor:476 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/76ba45a2-8945-4afe-b913-126c26725867/volumes/kubernetes.io~projected/kube-api-access-dtts2 DeviceMajor:0 DeviceMinor:583 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-606 DeviceMajor:0 DeviceMinor:606 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-79 DeviceMajor:0 DeviceMinor:79 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-862 DeviceMajor:0 DeviceMinor:862 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-930 DeviceMajor:0 DeviceMinor:930 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-150 DeviceMajor:0 DeviceMinor:150 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/0ebf1330-e044-4ff5-8b48-2d667e0c5625/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:226 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-336 DeviceMajor:0 DeviceMinor:336 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-628 DeviceMajor:0 DeviceMinor:628 Capacity:214143315968 Type:vfs Inodes:104594880 
HasInodes:true} {Device:overlay_0-157 DeviceMajor:0 DeviceMinor:157 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/1eb851be-f157-48ea-9a39-1361b68d2639/volumes/kubernetes.io~projected/kube-api-access-nqhzl DeviceMajor:0 DeviceMinor:246 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-742 DeviceMajor:0 DeviceMinor:742 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/15db86a68d791074ec87b05b5a5fc2f19c6862a1ebcfb5de4931251a55e195a3/userdata/shm DeviceMajor:0 DeviceMinor:538 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/c0861ccd-5e86-4277-9082-95f3133508a0/volumes/kubernetes.io~secret/cloud-credential-operator-serving-cert DeviceMajor:0 DeviceMinor:839 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-52 DeviceMajor:0 DeviceMinor:52 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/f0f5f3f3-0856-4da3-9157-15f65c6aba6e/volumes/kubernetes.io~secret/ovn-node-metrics-cert DeviceMajor:0 DeviceMinor:126 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:214 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/0918ba32-8e55-48d0-8e50-027c0dcb4bbd/volumes/kubernetes.io~projected/kube-api-access-mghmh DeviceMajor:0 DeviceMinor:252 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-299 DeviceMajor:0 DeviceMinor:299 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-622 DeviceMajor:0 DeviceMinor:622 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/a6c4695c-da78-46b6-8f92-ca93c5ebb96b/volumes/kubernetes.io~projected/kube-api-access-bd7d5 
DeviceMajor:0 DeviceMinor:641 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-307 DeviceMajor:0 DeviceMinor:307 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-398 DeviceMajor:0 DeviceMinor:398 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/1b69fbf6-1ca5-413e-bffd-965730bcec1b/volumes/kubernetes.io~secret/catalogserver-certs DeviceMajor:0 DeviceMinor:586 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-906 DeviceMajor:0 DeviceMinor:906 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/9ec89e27-4360-48f2-a7ca-5d823bda4510/volumes/kubernetes.io~projected/kube-api-access-vndvf DeviceMajor:0 DeviceMinor:386 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/232c421d-96f0-4894-b8d8-74f43d02bbd3/volumes/kubernetes.io~secret/node-tuning-operator-tls DeviceMajor:0 DeviceMinor:509 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-162 DeviceMajor:0 DeviceMinor:162 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-662 DeviceMajor:0 DeviceMinor:662 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-60 DeviceMajor:0 DeviceMinor:60 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/5b8c31076d1db49fd8c133661fbbc131a58892112131cf3118f58212505e7460/userdata/shm DeviceMajor:0 DeviceMinor:50 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-608 DeviceMajor:0 DeviceMinor:608 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-959 DeviceMajor:0 DeviceMinor:959 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-152 DeviceMajor:0 DeviceMinor:152 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/0418ff42-7eac-4266-97b5-4df88623d066/volumes/kubernetes.io~secret/cluster-monitoring-operator-tls DeviceMajor:0 DeviceMinor:508 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-341 DeviceMajor:0 DeviceMinor:341 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-884 DeviceMajor:0 DeviceMinor:884 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-507 DeviceMajor:0 DeviceMinor:507 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-898 DeviceMajor:0 DeviceMinor:898 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/df9f6505570d879efae2662d6149a2ae417f35b1bed956f7339c92d857b81707/userdata/shm DeviceMajor:0 DeviceMinor:281 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/946c23d9475281a0fd499a7ff53f910e9f2c222a2716d7d2886b8590024362cc/userdata/shm DeviceMajor:0 DeviceMinor:443 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/1eb851be-f157-48ea-9a39-1361b68d2639/volumes/kubernetes.io~secret/webhook-certs DeviceMajor:0 DeviceMinor:514 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-437 DeviceMajor:0 DeviceMinor:437 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/9b135e9cc968b9e23fda104dcc5dd8cbf50632e21d670c61642446eb2eb45282/userdata/shm DeviceMajor:0 DeviceMinor:263 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-447 DeviceMajor:0 DeviceMinor:447 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/10c165f1-53b4-4e7e-9cd1-00bb4d9cbc79/volumes/kubernetes.io~projected/kube-api-access-7fzmf DeviceMajor:0 DeviceMinor:786 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:overlay_0-487 DeviceMajor:0 DeviceMinor:487 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-904 DeviceMajor:0 DeviceMinor:904 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/127c3f92-8283-4179-9e40-a12dcabaaa12/volumes/kubernetes.io~secret/machine-approver-tls DeviceMajor:0 DeviceMinor:822 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/30211469-7108-4820-a988-26fc4ced734e/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:218 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/84d353ae-3992-4c17-a20e-3415edd92509/volumes/kubernetes.io~projected/kube-api-access-7smmf DeviceMajor:0 DeviceMinor:387 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/95267fdf9dd0903f45fd631ce455eb67b79bbeabc1d7f2fb9fb37ed66199c9e6/userdata/shm DeviceMajor:0 DeviceMinor:537 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-589 DeviceMajor:0 DeviceMinor:589 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-630 DeviceMajor:0 DeviceMinor:630 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-134 DeviceMajor:0 DeviceMinor:134 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/0918ba32-8e55-48d0-8e50-027c0dcb4bbd/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:223 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/49ec083d-dc74-457e-b10f-3bde04e9e75e/volumes/kubernetes.io~projected/kube-api-access-zcjr9 DeviceMajor:0 DeviceMinor:442 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-634 DeviceMajor:0 DeviceMinor:634 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-726 DeviceMajor:0 DeviceMinor:726 
Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f/volumes/kubernetes.io~projected/kube-api-access-fw7mr DeviceMajor:0 DeviceMinor:231 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/34c7c8e7bf6b2608a8ed06595ef49b7b5823fe04e62631a07f3cbbca5adb876a/userdata/shm DeviceMajor:0 DeviceMinor:527 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-686 DeviceMajor:0 DeviceMinor:686 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/e93b5361-30e6-44fd-a59e-2bc410c59480/volumes/kubernetes.io~projected/kube-api-access-4kc5q DeviceMajor:0 DeviceMinor:317 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/d2cd5b23-e622-4b96-aee8-dbc942b73b4a/volumes/kubernetes.io~projected/kube-api-access-jljzc DeviceMajor:0 DeviceMinor:775 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-908 DeviceMajor:0 DeviceMinor:908 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/c5cc16f26a63d054e0857f2a2f1278a7512a2a20bea66d9521aa218fb1539d3c/userdata/shm DeviceMajor:0 DeviceMinor:128 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/1482d789-884b-4337-b598-f0e2b71eb9f2/volumes/kubernetes.io~projected/kube-api-access-m2h62 DeviceMajor:0 DeviceMinor:243 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-305 DeviceMajor:0 DeviceMinor:305 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/005a8f12bd7e675962c64889aeb13228b895d026517d2c45f10276c0ab4cd89e/userdata/shm DeviceMajor:0 DeviceMinor:91 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/var/lib/kubelet/pods/f0f5f3f3-0856-4da3-9157-15f65c6aba6e/volumes/kubernetes.io~projected/kube-api-access-2vklx DeviceMajor:0 DeviceMinor:127 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/26180f77-0b1a-4d0f-9ed0-a12fdee69817/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:253 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-295 DeviceMajor:0 DeviceMinor:295 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-313 DeviceMajor:0 DeviceMinor:313 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-110 DeviceMajor:0 DeviceMinor:110 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/a60bc804-52e7-422a-87fd-ac4c5aa90cb3/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:217 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-796 DeviceMajor:0 DeviceMinor:796 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-65 DeviceMajor:0 DeviceMinor:65 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/a6c4695c-da78-46b6-8f92-ca93c5ebb96b/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:612 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-684 DeviceMajor:0 DeviceMinor:684 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/2a6fcf9f9144c9d6bd5dcc37a1eb71dcefd7c65bbd0149ae8f69eff02142d6ae/userdata/shm DeviceMajor:0 DeviceMinor:897 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/127c3f92-8283-4179-9e40-a12dcabaaa12/volumes/kubernetes.io~projected/kube-api-access-zdn9r DeviceMajor:0 DeviceMinor:836 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/var/lib/kubelet/pods/26180f77-0b1a-4d0f-9ed0-a12fdee69817/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:221 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/232c421d-96f0-4894-b8d8-74f43d02bbd3/volumes/kubernetes.io~projected/kube-api-access-fx4fw DeviceMajor:0 DeviceMinor:238 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/3d64c10d51d4d9009da402a9f2c51b81830f1695b7370548200097f367d254f2/userdata/shm DeviceMajor:0 DeviceMinor:265 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-266 DeviceMajor:0 DeviceMinor:266 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/1b69fbf6-1ca5-413e-bffd-965730bcec1b/volumes/kubernetes.io~projected/ca-certs DeviceMajor:0 DeviceMinor:471 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/e187516f-8f33-4c17-81d6-60c10b580bb0/volumes/kubernetes.io~empty-dir/tmp DeviceMajor:0 DeviceMinor:755 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/d9c3b609dc5ba9405c265ea8312eb684aae5364c416fb8a8d02c96c1413b155d/userdata/shm DeviceMajor:0 DeviceMinor:856 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-861 DeviceMajor:0 DeviceMinor:861 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/65320fce1a0608c5e233ad7039ccb30dfdee6ba6adad349424d74cf44c08e2db/userdata/shm DeviceMajor:0 DeviceMinor:98 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-291 DeviceMajor:0 DeviceMinor:291 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-394 DeviceMajor:0 DeviceMinor:394 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-462 DeviceMajor:0 DeviceMinor:462 Capacity:214143315968 
Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-572 DeviceMajor:0 DeviceMinor:572 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/cbd8c33fbce7b1c9cf78530bc91c2ad9c46d9601ea6ef0914dea487c85e63f0d/userdata/shm DeviceMajor:0 DeviceMinor:782 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-924 DeviceMajor:0 DeviceMinor:924 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-116 DeviceMajor:0 DeviceMinor:116 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-136 DeviceMajor:0 DeviceMinor:136 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/164586b1-f133-4427-8ab6-eb0839b79738/volumes/kubernetes.io~projected/kube-api-access-r4stz DeviceMajor:0 DeviceMinor:139 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f/volumes/kubernetes.io~projected/bound-sa-token DeviceMajor:0 DeviceMinor:247 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/c9de4939-680a-4e3e-89fd-e20ecb8b10f2/volumes/kubernetes.io~projected/bound-sa-token DeviceMajor:0 DeviceMinor:250 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/201fec6682662fad0acc461d7db4c8e108b597d14fa258495dcdfa10f6e193b5/userdata/shm DeviceMajor:0 DeviceMinor:528 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-570 DeviceMajor:0 DeviceMinor:570 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/718b320dd5408c5dda7ee606dfc45fd377e09b1616f83b01ddd6bedbab6de149/userdata/shm DeviceMajor:0 DeviceMinor:587 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-780 DeviceMajor:0 DeviceMinor:780 Capacity:214143315968 Type:vfs Inodes:104594880 
HasInodes:true} {Device:overlay_0-48 DeviceMajor:0 DeviceMinor:48 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/b259f65d2759400bc4903e27e05a8a6318da137f5779b59b0781ca65575183e7/userdata/shm DeviceMajor:0 DeviceMinor:859 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-352 DeviceMajor:0 DeviceMinor:352 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/8efdcef9-9b31-4567-b7f9-cb59a894273d/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:513 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-652 DeviceMajor:0 DeviceMinor:652 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-932 DeviceMajor:0 DeviceMinor:932 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/0ec1fcf833bb575029f4371f595adf3e92b6ae14914f83458d311cb85210d774/userdata/shm DeviceMajor:0 DeviceMinor:114 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:434 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/76ba45a2-8945-4afe-b913-126c26725867/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:582 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-654 DeviceMajor:0 DeviceMinor:654 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-805 DeviceMajor:0 DeviceMinor:805 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/76ba45a2-8945-4afe-b913-126c26725867/volumes/kubernetes.io~secret/encryption-config DeviceMajor:0 DeviceMinor:581 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/var/lib/kubelet/pods/3f7d2cef-b17b-43ba-a222-9e6e8d8352e2/volumes/kubernetes.io~projected/kube-api-access-pb87l DeviceMajor:0 DeviceMinor:642 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/634c0f6d-bce6-42cf-9253-80d1bcc7c507/volumes/kubernetes.io~projected/kube-api-access-8cwmn DeviceMajor:0 DeviceMinor:846 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/ec83c044c04d6837d5d5f7d4c71e74473794e6ee1e718df488cf45a934fcc03a/userdata/shm DeviceMajor:0 DeviceMinor:119 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/ff14038e05786c394b22fac9e7aff676f3eef7f98d2a8dbbbe2bd0a62e05aecf/userdata/shm DeviceMajor:0 DeviceMinor:531 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/7b485db9-29b5-45a1-a4fb-b4264c6bf2d6/volumes/kubernetes.io~projected/kube-api-access-nfz6w DeviceMajor:0 DeviceMinor:769 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/10c165f1-53b4-4e7e-9cd1-00bb4d9cbc79/volumes/kubernetes.io~secret/apiservice-cert DeviceMajor:0 DeviceMinor:784 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/9903faf2a78afa21fe51e82e71e6a8b65942f5c695c1b737493cfec8a1911541/userdata/shm DeviceMajor:0 DeviceMinor:320 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-603 DeviceMajor:0 DeviceMinor:603 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/e8733f46dd1d2647e586c0cc9b5a4ebea38d695f856a8c74190015b70d99a33e/userdata/shm DeviceMajor:0 DeviceMinor:129 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-204 DeviceMajor:0 DeviceMinor:204 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/7f9b714050ae3b09bb8faf754e7059ffdbc5afd8e225d14b9c0ab424f1262da7/userdata/shm DeviceMajor:0 DeviceMinor:456 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/54ad284e-d40e-4e69-b898-f5093952a0e6/volumes/kubernetes.io~secret/marketplace-operator-metrics DeviceMajor:0 DeviceMinor:520 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/10c165f1-53b4-4e7e-9cd1-00bb4d9cbc79/volumes/kubernetes.io~secret/webhook-cert DeviceMajor:0 DeviceMinor:785 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/0031e3a9-b253-4dda-a890-bf3e4d8737e8/volumes/kubernetes.io~projected/kube-api-access-qhms8 DeviceMajor:0 DeviceMinor:837 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/2b3d1dc7-22f9-4c0c-802a-d7314894b255/volumes/kubernetes.io~projected/kube-api-access-zfgc6 DeviceMajor:0 DeviceMinor:318 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-467 DeviceMajor:0 DeviceMinor:467 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-877 DeviceMajor:0 DeviceMinor:877 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/ed400b0e1b21fe5e4ef5385a05444bf39db4c2fd9c754a3d6c45427d3b29ef99/userdata/shm DeviceMajor:0 DeviceMinor:41 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/ee586416-6f56-4ea4-ad62-95de1e6df23b/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:215 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/0d377285-0336-41b7-b48f-c44a7b563498/volumes/kubernetes.io~projected/kube-api-access-7qn5v DeviceMajor:0 DeviceMinor:249 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/var/lib/kubelet/pods/d831cb23-7411-4072-8273-c167d9afca28/volumes/kubernetes.io~secret/cluster-baremetal-operator-tls DeviceMajor:0 DeviceMinor:515 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/76ba45a2-8945-4afe-b913-126c26725867/volumes/kubernetes.io~secret/etcd-client DeviceMajor:0 DeviceMinor:576 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-649 DeviceMajor:0 DeviceMinor:649 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/db3e211f71e6d36cf104a5781a02c4e98905e1bbc8fec6cc754858473d74a96c/userdata/shm DeviceMajor:0 DeviceMinor:335 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-937 DeviceMajor:0 DeviceMinor:937 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/var/lib/kubelet/pods/e4541b7b-3f7f-4851-9bd9-26fcda5cab13/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:216 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/dc0e0feca8d08363d7aeeb0e56e61f125a0c90431bd31ea7f3ad61c6ddd5d77c/userdata/shm DeviceMajor:0 DeviceMinor:322 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-477 DeviceMajor:0 DeviceMinor:477 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/b7854fff4e6d290ba66d677fb1f4c348702f3c168d271f4daa5e0ff010a39d54/userdata/shm DeviceMajor:0 DeviceMinor:524 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-943 DeviceMajor:0 DeviceMinor:943 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/e169486121bbc52c7ca877ad3d815dc4a35f6b8ee220e0fd43b9661c26e26d92/userdata/shm DeviceMajor:0 DeviceMinor:89 
Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/7ff63c73-62a3-44b4-acd3-1b3df175794f/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert DeviceMajor:0 DeviceMinor:222 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/1cbcb403-a424-4496-8c5c-5eb5e42dfb93/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:244 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-940 DeviceMajor:0 DeviceMinor:940 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/33ed331b-89e9-45f8-ab3c-4533a77cc7b6/volumes/kubernetes.io~secret/cloud-controller-manager-operator-tls DeviceMajor:0 DeviceMinor:112 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/4c5a0c1d-867a-4ce4-9570-ea66452c8db3/volumes/kubernetes.io~projected/kube-api-access-mkzb2 DeviceMajor:0 DeviceMinor:256 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-333 DeviceMajor:0 DeviceMinor:333 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-491 DeviceMajor:0 DeviceMinor:491 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/b3eea925-73b3-4693-8f0e-6dd26107f60a/volumes/kubernetes.io~projected/kube-api-access-6sx5s DeviceMajor:0 DeviceMinor:234 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/59d3785f249896d312f7b27b04e39fa314d9b06309adfc4aa055444977f4fa7e/userdata/shm DeviceMajor:0 DeviceMinor:584 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/139881ee-6cfa-4a7e-b002-63cece048d16/volumes/kubernetes.io~projected/kube-api-access-h9z8g DeviceMajor:0 DeviceMinor:848 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/var/lib/kubelet/pods/4a19441e-e61b-4d58-85db-813ae88e1f9b/volumes/kubernetes.io~projected/kube-api-access-dw7bx DeviceMajor:0 DeviceMinor:118 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/6588c21791f0b9fd7a866ced5165aad3ddf504a15e8585434bc4836ba3395293/userdata/shm DeviceMajor:0 DeviceMinor:142 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/6cde5024-edf7-4fa4-8964-cabe7899578b/volumes/kubernetes.io~projected/kube-api-access-x997v DeviceMajor:0 DeviceMinor:237 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-791 DeviceMajor:0 DeviceMinor:791 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-607 DeviceMajor:0 DeviceMinor:607 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/f0f5f3f3-0856-4da3-9157-15f65c6aba6e/volume-subpaths/run-systemd/ovnkube-controller/6 DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/var/lib/kubelet/pods/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f/volumes/kubernetes.io~secret/encryption-config DeviceMajor:0 DeviceMinor:440 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-552 DeviceMajor:0 DeviceMinor:552 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-650 DeviceMajor:0 DeviceMinor:650 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-962 DeviceMajor:0 DeviceMinor:962 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/a60bc804-52e7-422a-87fd-ac4c5aa90cb3/volumes/kubernetes.io~projected/kube-api-access-zxkm6 DeviceMajor:0 DeviceMinor:230 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/d47d14f256ba67306efe8da7bbcadc67f946b747f7e0a1d658a9687f1f0a1a37/userdata/shm DeviceMajor:0 
DeviceMinor:275 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-568 DeviceMajor:0 DeviceMinor:568 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-357 DeviceMajor:0 DeviceMinor:357 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/1b700d17-83d2-46c8-afbc-e5774822eabe/volumes/kubernetes.io~projected/kube-api-access-cv7sd DeviceMajor:0 DeviceMinor:867 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-923 DeviceMajor:0 DeviceMinor:923 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-44 DeviceMajor:0 DeviceMinor:44 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-402 DeviceMajor:0 DeviceMinor:402 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/d5044ffd-0686-4679-9894-e696faf33699/volumes/kubernetes.io~projected/kube-api-access-mmhtb DeviceMajor:0 DeviceMinor:123 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/133c0043d3a977b4007520994c1530f26391f82433e16ae8b2e991aa2092980b/userdata/shm DeviceMajor:0 DeviceMinor:285 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/8b3d4b079b2ebb85b87310aca0e7ee26b306a1f0013e66da4d3495d792aa5402/userdata/shm DeviceMajor:0 DeviceMinor:388 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/d5044ffd-0686-4679-9894-e696faf33699/volumes/kubernetes.io~secret/metrics-certs DeviceMajor:0 DeviceMinor:530 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-558 DeviceMajor:0 DeviceMinor:558 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-459 DeviceMajor:0 DeviceMinor:459 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/b3eea925-73b3-4693-8f0e-6dd26107f60a/volumes/kubernetes.io~secret/cluster-storage-operator-serving-cert DeviceMajor:0 DeviceMinor:224 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/0d377285-0336-41b7-b48f-c44a7b563498/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:225 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/e78b283b-981e-48d7-a5f2-53f8401766ea/volumes/kubernetes.io~projected/kube-api-access-rchj5 DeviceMajor:0 DeviceMinor:251 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/24f253f45cdf8fe83c1408fe0ce3848ec429687603d0f7eff71df3320c693f47/userdata/shm DeviceMajor:0 DeviceMinor:529 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-380 DeviceMajor:0 DeviceMinor:380 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/3f36e6c60bfd2bd17ffa26adda31fc5cb46b7dc64ce396281d544af5a25539b6/userdata/shm DeviceMajor:0 DeviceMinor:460 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-851 DeviceMajor:0 DeviceMinor:851 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-166 DeviceMajor:0 DeviceMinor:166 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-174 DeviceMajor:0 DeviceMinor:174 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-179 DeviceMajor:0 DeviceMinor:179 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/54ad284e-d40e-4e69-b898-f5093952a0e6/volumes/kubernetes.io~projected/kube-api-access-9lfcj DeviceMajor:0 DeviceMinor:242 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-497 DeviceMajor:0 DeviceMinor:497 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/2f60507250e058dbc73dd9a2defceea722aafde0bbc43ed7857b1626b36814fe/userdata/shm DeviceMajor:0 DeviceMinor:526 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/b57da2053e178390131e07c76308479420cda5a65a12ef7fa425c01959c1b9c5/userdata/shm DeviceMajor:0 DeviceMinor:548 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-705 DeviceMajor:0 DeviceMinor:705 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/4025d1a6cb66b179d453ec8f3c19442902ea80a04085eeeda4fa9c48c774a80e/userdata/shm DeviceMajor:0 DeviceMinor:644 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-668 DeviceMajor:0 DeviceMinor:668 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-379 DeviceMajor:0 DeviceMinor:379 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/3235d3bd9c5f6c6a7e16ad74c79046e87f4d03278e4096c568a5930f544fbbf0/userdata/shm DeviceMajor:0 DeviceMinor:227 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/ee586416-6f56-4ea4-ad62-95de1e6df23b/volumes/kubernetes.io~projected/kube-api-access-sxxhh DeviceMajor:0 DeviceMinor:232 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f/volumes/kubernetes.io~projected/kube-api-access-4v45k DeviceMajor:0 DeviceMinor:449 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/c9de4939-680a-4e3e-89fd-e20ecb8b10f2/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:518 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/b1a0209e0a9a4093bed7068dc639e8f6b3aa1b820bc97e5ac17eab47d3a362ec/userdata/shm DeviceMajor:0 DeviceMinor:535 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-795 DeviceMajor:0 DeviceMinor:795 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/22cc81f0c9d90fe64f682c3bbb7bbcefc904c4ee2c036d7eedf6b66887f69fae/userdata/shm DeviceMajor:0 DeviceMinor:100 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/2dd4279d-a1a9-450a-a061-9008cd1ea8e0/volumes/kubernetes.io~projected/kube-api-access-pnzt7 DeviceMajor:0 DeviceMinor:233 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-445 DeviceMajor:0 DeviceMinor:445 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-574 DeviceMajor:0 DeviceMinor:574 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/bc8712c641a3b3ccd887956343c178e81a448d9908293a089aa942f0944b3018/userdata/shm DeviceMajor:0 DeviceMinor:329 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-331 DeviceMajor:0 DeviceMinor:331 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/139881ee-6cfa-4a7e-b002-63cece048d16/volumes/kubernetes.io~secret/control-plane-machine-set-operator-tls DeviceMajor:0 DeviceMinor:838 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/a77b94c864745863d968467e91271f57be5b9449652226a9e1e9789e20eef38f/userdata/shm DeviceMajor:0 DeviceMinor:855 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-893 DeviceMajor:0 DeviceMinor:893 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-960 DeviceMajor:0 DeviceMinor:960 Capacity:214143315968 Type:vfs Inodes:104594880 
HasInodes:true} {Device:overlay_0-99 DeviceMajor:0 DeviceMinor:99 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/164586b1-f133-4427-8ab6-eb0839b79738/volumes/kubernetes.io~secret/webhook-cert DeviceMajor:0 DeviceMinor:138 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-777 DeviceMajor:0 DeviceMinor:777 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/9af5ebd3eee3c3de99e27a671d715ba12c7da929014abc4a9a4424a8fb8aad4e/userdata/shm DeviceMajor:0 DeviceMinor:58 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/7e5935ea-8d95-45e3-b836-c7892953ef3d/volumes/kubernetes.io~projected/kube-api-access-c6gml DeviceMajor:0 DeviceMinor:125 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/52b495ac-bb28-44f3-b925-3c54f86d5ec4/volumes/kubernetes.io~projected/kube-api-access-dd549 DeviceMajor:0 DeviceMinor:235 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-261 DeviceMajor:0 DeviceMinor:261 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/a5455725a1362a8e870442eb2f0235fbea46c1d047d2183683f1ca346ec9c059/userdata/shm DeviceMajor:0 DeviceMinor:289 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/1482d789-884b-4337-b598-f0e2b71eb9f2/volumes/kubernetes.io~secret/srv-cert DeviceMajor:0 DeviceMinor:516 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/48d46b7645a64ea18f3fa334445c914bbcaaadce3a50f149dedad680b9f63699/userdata/shm DeviceMajor:0 DeviceMinor:536 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-591 DeviceMajor:0 DeviceMinor:591 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-719 DeviceMajor:0 DeviceMinor:719 
Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-707 DeviceMajor:0 DeviceMinor:707 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-96 DeviceMajor:0 DeviceMinor:96 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/49ec083d-dc74-457e-b10f-3bde04e9e75e/volumes/kubernetes.io~secret/signing-key DeviceMajor:0 DeviceMinor:441 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/e187516f-8f33-4c17-81d6-60c10b580bb0/volumes/kubernetes.io~empty-dir/etc-tuned DeviceMajor:0 DeviceMinor:756 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/b6803c40b59bc228703c8b2ce51f78af3f050cad56ceb99d544a076dbfccb803/userdata/shm DeviceMajor:0 DeviceMinor:605 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-806 DeviceMajor:0 DeviceMinor:806 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/c0861ccd-5e86-4277-9082-95f3133508a0/volumes/kubernetes.io~projected/kube-api-access-n7rsc DeviceMajor:0 DeviceMinor:841 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-315 DeviceMajor:0 DeviceMinor:315 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-593 DeviceMajor:0 DeviceMinor:593 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-868 DeviceMajor:0 DeviceMinor:868 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/var/lib/kubelet/pods/d831cb23-7411-4072-8273-c167d9afca28/volumes/kubernetes.io~projected/kube-api-access-dwkwt DeviceMajor:0 DeviceMinor:236 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/d73b051671cc575452964e4ec7abae8ed2cf8ae1de2a3be5460a27e068329e94/userdata/shm DeviceMajor:0 DeviceMinor:257 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-277 DeviceMajor:0 DeviceMinor:277 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/40effd583e5274c7bd2d572652f2adc6b94ce532187f776e02a024e30b5ff7e5/userdata/shm DeviceMajor:0 DeviceMinor:406 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/7b485db9-29b5-45a1-a4fb-b4264c6bf2d6/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:747 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-78 DeviceMajor:0 DeviceMinor:78 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/33ed331b-89e9-45f8-ab3c-4533a77cc7b6/volumes/kubernetes.io~projected/kube-api-access-hmsj5 DeviceMajor:0 DeviceMinor:303 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/14fe5cb6383f1129ecf327e882bdb7904f8ad1a8a2cc2647d9ee96534b6ccb93/userdata/shm DeviceMajor:0 DeviceMinor:271 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/f8dce45144c680f78255eabd0603f42f2f97c7b82b1aee0c5c17224722da19a3/userdata/shm DeviceMajor:0 DeviceMinor:390 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/c6cc97386aa9d8bb895c877b8849fa8dc27a6fe973ccc0760f4274b321682e77/userdata/shm DeviceMajor:0 DeviceMinor:522 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/3ddfd0e7-fe76-41bc-b316-94505df81002/volumes/kubernetes.io~projected/kube-api-access-bgc7c DeviceMajor:0 DeviceMinor:92 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-159 DeviceMajor:0 DeviceMinor:159 
Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-293 DeviceMajor:0 DeviceMinor:293 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/2221ffbcc38435886b634ed23b63c6c48586f85323431d02b258500a200b9a2b/userdata/shm DeviceMajor:0 DeviceMinor:778 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-661 DeviceMajor:0 DeviceMinor:661 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-886 DeviceMajor:0 DeviceMinor:886 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-853 DeviceMajor:0 DeviceMinor:853 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:overlay_0-195 DeviceMajor:0 DeviceMinor:195 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/8efdcef9-9b31-4567-b7f9-cb59a894273d/volumes/kubernetes.io~projected/kube-api-access-cpsx7 DeviceMajor:0 DeviceMinor:254 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-392 DeviceMajor:0 DeviceMinor:392 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/d831cb23-7411-4072-8273-c167d9afca28/volumes/kubernetes.io~secret/cert DeviceMajor:0 DeviceMinor:512 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-566 DeviceMajor:0 DeviceMinor:566 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/b70adfe9-94f1-44bc-85ce-498e5f0a1ca7/volumes/kubernetes.io~projected/kube-api-access-6nlq2 DeviceMajor:0 DeviceMinor:894 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-954 DeviceMajor:0 DeviceMinor:954 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 
Scheduler:none} 252:16:{Name:vdb Major:252 Minor:16 Size:21474836480 Scheduler:none} 252:32:{Name:vdc Major:252 Minor:32 Size:21474836480 Scheduler:none} 252:48:{Name:vdd Major:252 Minor:48 Size:21474836480 Scheduler:none} 252:64:{Name:vde Major:252 Minor:64 Size:21474836480 Scheduler:none}] NetworkDevices:[{Name:005a8f12bd7e675 MacAddress:06:9f:a6:f3:05:f5 Speed:10000 Mtu:8900} {Name:0362cae60811bf2 MacAddress:7a:f3:28:54:0a:68 Speed:10000 Mtu:8900} {Name:133c0043d3a977b MacAddress:1e:9d:27:74:5d:53 Speed:10000 Mtu:8900} {Name:14fe5cb6383f112 MacAddress:12:f3:4c:5a:09:d9 Speed:10000 Mtu:8900} {Name:15db86a68d79107 MacAddress:5a:9c:75:ac:55:b3 Speed:10000 Mtu:8900} {Name:201fec6682662fa MacAddress:72:c6:7f:a1:fa:62 Speed:10000 Mtu:8900} {Name:24f253f45cdf8fe MacAddress:2e:b4:ef:fe:85:42 Speed:10000 Mtu:8900} {Name:2899b4e2a1cabd8 MacAddress:1a:a3:47:17:ef:9e Speed:10000 Mtu:8900} {Name:2a6fcf9f9144c9d MacAddress:62:55:1e:ca:71:89 Speed:10000 Mtu:8900} {Name:2a8e902cd252f0c MacAddress:e2:86:f7:13:51:1d Speed:10000 Mtu:8900} {Name:2f60507250e058d MacAddress:aa:e5:43:9c:60:75 Speed:10000 Mtu:8900} {Name:3235d3bd9c5f6c6 MacAddress:8a:18:be:f7:22:2c Speed:10000 Mtu:8900} {Name:34c7c8e7bf6b260 MacAddress:5e:9b:14:6f:03:cc Speed:10000 Mtu:8900} {Name:3d64c10d51d4d90 MacAddress:46:60:8c:b2:fd:67 Speed:10000 Mtu:8900} {Name:4025d1a6cb66b17 MacAddress:fe:93:d1:88:46:c9 Speed:10000 Mtu:8900} {Name:40effd583e5274c MacAddress:16:23:58:59:cc:15 Speed:10000 Mtu:8900} {Name:4482916b3b4b521 MacAddress:5a:26:02:e7:e7:73 Speed:10000 Mtu:8900} {Name:47433e6e63affa1 MacAddress:96:97:7c:d3:d9:94 Speed:10000 Mtu:8900} {Name:48d46b7645a64ea MacAddress:1a:70:6c:39:1d:6e Speed:10000 Mtu:8900} {Name:59d3785f249896d MacAddress:82:b4:6f:03:5b:f9 Speed:10000 Mtu:8900} {Name:718b320dd5408c5 MacAddress:42:de:ff:18:fb:74 Speed:10000 Mtu:8900} {Name:7f9b714050ae3b0 MacAddress:52:dc:31:d2:44:1e Speed:10000 Mtu:8900} {Name:8b3d4b079b2ebb8 MacAddress:46:8f:54:21:d6:7a Speed:10000 Mtu:8900} 
{Name:946c23d9475281a MacAddress:2e:5f:54:97:d0:aa Speed:10000 Mtu:8900} {Name:95267fdf9dd0903 MacAddress:42:91:7a:2b:c8:d1 Speed:10000 Mtu:8900} {Name:9903faf2a78afa2 MacAddress:fe:ec:4e:5c:3c:16 Speed:10000 Mtu:8900} {Name:9b135e9cc968b9e MacAddress:ee:eb:82:5c:16:d8 Speed:10000 Mtu:8900} {Name:a5455725a1362a8 MacAddress:26:eb:eb:0e:4a:4a Speed:10000 Mtu:8900} {Name:a77b94c86474586 MacAddress:a6:91:a8:cd:33:fe Speed:10000 Mtu:8900} {Name:adfc23f0d784d89 MacAddress:22:f3:90:39:e7:5d Speed:10000 Mtu:8900} {Name:b1a0209e0a9a409 MacAddress:8a:da:ad:e6:ed:c1 Speed:10000 Mtu:8900} {Name:b259f65d2759400 MacAddress:b2:82:d7:3a:e3:cf Speed:10000 Mtu:8900} {Name:b57da2053e17839 MacAddress:52:06:e9:09:8c:a1 Speed:10000 Mtu:8900} {Name:b6803c40b59bc22 MacAddress:a2:de:76:f6:e3:2d Speed:10000 Mtu:8900} {Name:b7854fff4e6d290 MacAddress:e2:3b:59:c9:30:31 Speed:10000 Mtu:8900} {Name:br-ex MacAddress:fa:16:9e:81:f6:10 Speed:0 Mtu:9000} {Name:br-int MacAddress:0e:c0:e1:46:be:d0 Speed:0 Mtu:8900} {Name:c4262d5f7cd90d7 MacAddress:56:be:59:a3:4f:0d Speed:10000 Mtu:8900} {Name:c6cc97386aa9d8b MacAddress:66:8a:e0:98:c3:0a Speed:10000 Mtu:8900} {Name:c7817ab55e81e53 MacAddress:42:6c:07:6f:53:c0 Speed:10000 Mtu:8900} {Name:cbd8c33fbce7b1c MacAddress:0e:2a:0e:01:fa:00 Speed:10000 Mtu:8900} {Name:d47d14f256ba673 MacAddress:f6:22:f1:06:e6:47 Speed:10000 Mtu:8900} {Name:d73b051671cc575 MacAddress:2e:11:69:61:1c:62 Speed:10000 Mtu:8900} {Name:d9c3b609dc5ba94 MacAddress:fa:0f:0f:f6:ff:f8 Speed:10000 Mtu:8900} {Name:dc0e0feca8d0836 MacAddress:c6:e1:77:51:d6:46 Speed:10000 Mtu:8900} {Name:df9f6505570d879 MacAddress:7a:62:3f:f3:9f:50 Speed:10000 Mtu:8900} {Name:eth0 MacAddress:fa:16:9e:81:f6:10 Speed:-1 Mtu:9000} {Name:eth1 MacAddress:fa:16:3e:26:03:3b Speed:-1 Mtu:9000} {Name:eth2 MacAddress:fa:16:3e:21:09:89 Speed:-1 Mtu:9000} {Name:f7ac4af0eeac6f9 MacAddress:de:c4:d6:f6:13:78 Speed:10000 Mtu:8900} {Name:f8dce45144c680f MacAddress:32:55:81:f9:29:a4 Speed:10000 Mtu:8900} {Name:ff14038e05786c3 
MacAddress:d2:13:ae:f7:1f:28 Speed:10000 Mtu:8900} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:80:00:02 Speed:0 Mtu:8900} {Name:ovs-system MacAddress:06:d0:49:23:c0:ac Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} 
{Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 08 03:53:08.054993 master-0 kubenswrapper[18592]: I0308 03:53:08.054981 18592 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Mar 08 03:53:08.055110 master-0 kubenswrapper[18592]: I0308 03:53:08.055094 18592 manager.go:233] Version: {KernelVersion:5.14.0-427.111.1.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202602172219-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 08 03:53:08.055349 master-0 kubenswrapper[18592]: I0308 03:53:08.055337 18592 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 08 03:53:08.055536 master-0 kubenswrapper[18592]: I0308 03:53:08.055507 18592 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 08 03:53:08.055747 master-0 kubenswrapper[18592]: I0308 03:53:08.055591 18592 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"master-0","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"P
ercentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 08 03:53:08.055850 master-0 kubenswrapper[18592]: I0308 03:53:08.055839 18592 topology_manager.go:138] "Creating topology manager with none policy"
Mar 08 03:53:08.055911 master-0 kubenswrapper[18592]: I0308 03:53:08.055903 18592 container_manager_linux.go:303] "Creating device plugin manager"
Mar 08 03:53:08.055967 master-0 kubenswrapper[18592]: I0308 03:53:08.055956 18592 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 08 03:53:08.056032 master-0 kubenswrapper[18592]: I0308 03:53:08.056023 18592 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 08 03:53:08.056156 master-0 kubenswrapper[18592]: I0308 03:53:08.056144 18592 state_mem.go:36] "Initialized new in-memory state store"
Mar 08 03:53:08.056271 master-0 kubenswrapper[18592]: I0308 03:53:08.056261 18592 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Mar 08 03:53:08.056364 master-0 kubenswrapper[18592]: I0308 03:53:08.056355 18592 kubelet.go:418] "Attempting to sync node with API server"
Mar 08 03:53:08.056426 master-0 kubenswrapper[18592]: I0308 03:53:08.056417 18592 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 08 03:53:08.056484 master-0 kubenswrapper[18592]: I0308 03:53:08.056475 18592 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Mar 08 03:53:08.056543 master-0 kubenswrapper[18592]: I0308 03:53:08.056535 18592 kubelet.go:324] "Adding apiserver pod source"
Mar 08 03:53:08.056595 master-0 kubenswrapper[18592]: I0308 03:53:08.056586 18592 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 08 03:53:08.064311 master-0 kubenswrapper[18592]: I0308 03:53:08.064287 18592 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.13-8.rhaos4.18.gitd78977c.el9" apiVersion="v1"
Mar 08 03:53:08.064615 master-0 kubenswrapper[18592]: I0308 03:53:08.064603 18592 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Mar 08 03:53:08.064964 master-0 kubenswrapper[18592]: I0308 03:53:08.064951 18592 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Mar 08 03:53:08.065145 master-0 kubenswrapper[18592]: I0308 03:53:08.065134 18592 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Mar 08 03:53:08.065235 master-0 kubenswrapper[18592]: I0308 03:53:08.065225 18592 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Mar 08 03:53:08.065292 master-0 kubenswrapper[18592]: I0308 03:53:08.065283 18592 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Mar 08 03:53:08.065346 master-0 kubenswrapper[18592]: I0308 03:53:08.065334 18592 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Mar 08 03:53:08.065398 master-0 kubenswrapper[18592]: I0308 03:53:08.065390 18592 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Mar 08 03:53:08.065451 master-0 kubenswrapper[18592]: I0308 03:53:08.065443 18592 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Mar 08 03:53:08.065501 master-0 kubenswrapper[18592]: I0308 03:53:08.065493 18592 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Mar 08 03:53:08.065555 master-0 kubenswrapper[18592]: I0308 03:53:08.065547 18592 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Mar 08 03:53:08.065605 master-0 kubenswrapper[18592]: I0308 03:53:08.065597 18592 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Mar 08 03:53:08.065659 master-0 kubenswrapper[18592]: I0308 03:53:08.065650 18592 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Mar 08 03:53:08.065711 master-0 kubenswrapper[18592]: I0308 03:53:08.065703 18592 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Mar 08 03:53:08.065769 master-0 kubenswrapper[18592]: I0308 03:53:08.065760 18592 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Mar 08 03:53:08.065915 master-0 kubenswrapper[18592]: I0308 03:53:08.065906 18592 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Mar 08 03:53:08.066327 master-0 kubenswrapper[18592]: I0308 03:53:08.066316 18592 server.go:1280] "Started kubelet"
Mar 08 03:53:08.066641 master-0 kubenswrapper[18592]: I0308 03:53:08.066500 18592 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Mar 08 03:53:08.066949 master-0 kubenswrapper[18592]: I0308 03:53:08.066907 18592 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 08 03:53:08.067027 master-0 systemd[1]: Started Kubernetes Kubelet.
Mar 08 03:53:08.067223 master-0 kubenswrapper[18592]: I0308 03:53:08.067167 18592 server_v1.go:47] "podresources" method="list" useActivePods=true
Mar 08 03:53:08.067696 master-0 kubenswrapper[18592]: I0308 03:53:08.067653 18592 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 08 03:53:08.072841 master-0 kubenswrapper[18592]: I0308 03:53:08.069547 18592 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 08 03:53:08.072841 master-0 kubenswrapper[18592]: I0308 03:53:08.071365 18592 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Mar 08 03:53:08.074789 master-0 kubenswrapper[18592]: I0308 03:53:08.073325 18592 server.go:449] "Adding debug handlers to kubelet server"
Mar 08 03:53:08.079470 master-0 kubenswrapper[18592]: I0308 03:53:08.078726 18592 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Mar 08 03:53:08.079470 master-0 kubenswrapper[18592]: I0308 03:53:08.078755 18592 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 08 03:53:08.079470 master-0 kubenswrapper[18592]: I0308 03:53:08.079057 18592 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-03-09 03:37:12 +0000 UTC, rotation deadline is 2026-03-08 22:51:30.259723196 +0000 UTC
Mar 08 03:53:08.079470 master-0 kubenswrapper[18592]: I0308 03:53:08.079097 18592 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 18h58m22.180627488s for next certificate rotation
Mar 08 03:53:08.080847 master-0 kubenswrapper[18592]: I0308 03:53:08.079678 18592 volume_manager.go:287] "The desired_state_of_world populator starts"
Mar 08 03:53:08.080847 master-0 kubenswrapper[18592]: I0308 03:53:08.079693 18592 volume_manager.go:289] "Starting Kubelet Volume Manager"
Mar 08 03:53:08.080847 master-0 kubenswrapper[18592]: I0308 03:53:08.079794 18592 desired_state_of_world_populator.go:147] "Desired state populator starts to run"
Mar 08 03:53:08.083915 master-0 kubenswrapper[18592]: I0308 03:53:08.082969 18592 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.112948 18592 factory.go:153] Registering CRI-O factory
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.112994 18592 factory.go:221] Registration of the crio container factory successfully
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.114310 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7ff63c73-62a3-44b4-acd3-1b3df175794f" volumeName="kubernetes.io/secret/7ff63c73-62a3-44b4-acd3-1b3df175794f-cluster-olm-operator-serving-cert" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.114399 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a6c4695c-da78-46b6-8f92-ca93c5ebb96b" volumeName="kubernetes.io/secret/a6c4695c-da78-46b6-8f92-ca93c5ebb96b-serving-cert" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.114423 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c9de4939-680a-4e3e-89fd-e20ecb8b10f2" volumeName="kubernetes.io/projected/c9de4939-680a-4e3e-89fd-e20ecb8b10f2-bound-sa-token" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.114436 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f" volumeName="kubernetes.io/configmap/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f-config" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.114454 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e4541b7b-3f7f-4851-9bd9-26fcda5cab13" volumeName="kubernetes.io/configmap/e4541b7b-3f7f-4851-9bd9-26fcda5cab13-config" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.114468 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e78b283b-981e-48d7-a5f2-53f8401766ea" volumeName="kubernetes.io/configmap/e78b283b-981e-48d7-a5f2-53f8401766ea-images" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.114482 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ddfd0e7-fe76-41bc-b316-94505df81002" volumeName="kubernetes.io/projected/3ddfd0e7-fe76-41bc-b316-94505df81002-kube-api-access-bgc7c" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.114501 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a" volumeName="kubernetes.io/configmap/5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a-config" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.114517 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="84d353ae-3992-4c17-a20e-3415edd92509" volumeName="kubernetes.io/projected/84d353ae-3992-4c17-a20e-3415edd92509-kube-api-access-7smmf" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.114534 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8e283f49-b85d-4789-a71f-3fcb5033cdf0" volumeName="kubernetes.io/empty-dir/8e283f49-b85d-4789-a71f-3fcb5033cdf0-utilities" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.114546 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b3eea925-73b3-4693-8f0e-6dd26107f60a" volumeName="kubernetes.io/projected/b3eea925-73b3-4693-8f0e-6dd26107f60a-kube-api-access-6sx5s" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.114566 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e78b283b-981e-48d7-a5f2-53f8401766ea" volumeName="kubernetes.io/projected/e78b283b-981e-48d7-a5f2-53f8401766ea-kube-api-access-rchj5" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.114581 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="76ba45a2-8945-4afe-b913-126c26725867" volumeName="kubernetes.io/configmap/76ba45a2-8945-4afe-b913-126c26725867-audit-policies" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.114602 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7e5935ea-8d95-45e3-b836-c7892953ef3d" volumeName="kubernetes.io/projected/7e5935ea-8d95-45e3-b836-c7892953ef3d-kube-api-access-c6gml" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.114615 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d831cb23-7411-4072-8273-c167d9afca28" volumeName="kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cluster-baremetal-operator-tls" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.114626 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="164586b1-f133-4427-8ab6-eb0839b79738" volumeName="kubernetes.io/secret/164586b1-f133-4427-8ab6-eb0839b79738-webhook-cert" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.114642 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="30211469-7108-4820-a988-26fc4ced734e" volumeName="kubernetes.io/configmap/30211469-7108-4820-a988-26fc4ced734e-config" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.114652 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="33ed331b-89e9-45f8-ab3c-4533a77cc7b6" volumeName="kubernetes.io/configmap/33ed331b-89e9-45f8-ab3c-4533a77cc7b6-auth-proxy-config" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.114667 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="54ad284e-d40e-4e69-b898-f5093952a0e6" volumeName="kubernetes.io/projected/54ad284e-d40e-4e69-b898-f5093952a0e6-kube-api-access-9lfcj" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.114678 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5a7752f9-7b9a-451f-997a-e9f696d38b34" volumeName="kubernetes.io/secret/5a7752f9-7b9a-451f-997a-e9f696d38b34-serving-cert" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.114691 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3f7d2cef-b17b-43ba-a222-9e6e8d8352e2" volumeName="kubernetes.io/configmap/3f7d2cef-b17b-43ba-a222-9e6e8d8352e2-client-ca" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.114707 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4a19441e-e61b-4d58-85db-813ae88e1f9b" volumeName="kubernetes.io/projected/4a19441e-e61b-4d58-85db-813ae88e1f9b-kube-api-access-dw7bx" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.114724 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="76ba45a2-8945-4afe-b913-126c26725867" volumeName="kubernetes.io/secret/76ba45a2-8945-4afe-b913-126c26725867-etcd-client" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.114742 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7b485db9-29b5-45a1-a4fb-b4264c6bf2d6" volumeName="kubernetes.io/configmap/7b485db9-29b5-45a1-a4fb-b4264c6bf2d6-config-volume" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.114757 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c0861ccd-5e86-4277-9082-95f3133508a0" volumeName="kubernetes.io/secret/c0861ccd-5e86-4277-9082-95f3133508a0-cloud-credential-operator-serving-cert" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.114772 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0918ba32-8e55-48d0-8e50-027c0dcb4bbd" volumeName="kubernetes.io/secret/0918ba32-8e55-48d0-8e50-027c0dcb4bbd-serving-cert" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.114791 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1b69fbf6-1ca5-413e-bffd-965730bcec1b" volumeName="kubernetes.io/empty-dir/1b69fbf6-1ca5-413e-bffd-965730bcec1b-cache" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.114806 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="33ed331b-89e9-45f8-ab3c-4533a77cc7b6" volumeName="kubernetes.io/configmap/33ed331b-89e9-45f8-ab3c-4533a77cc7b6-images" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.114841 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="76ba45a2-8945-4afe-b913-126c26725867" volumeName="kubernetes.io/projected/76ba45a2-8945-4afe-b913-126c26725867-kube-api-access-dtts2" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.114863 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a60bc804-52e7-422a-87fd-ac4c5aa90cb3" volumeName="kubernetes.io/projected/a60bc804-52e7-422a-87fd-ac4c5aa90cb3-kube-api-access-zxkm6" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.114877 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a60bc804-52e7-422a-87fd-ac4c5aa90cb3" volumeName="kubernetes.io/secret/a60bc804-52e7-422a-87fd-ac4c5aa90cb3-serving-cert" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.114890 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f" volumeName="kubernetes.io/configmap/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f-image-import-ca" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.114904 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e187516f-8f33-4c17-81d6-60c10b580bb0" volumeName="kubernetes.io/empty-dir/e187516f-8f33-4c17-81d6-60c10b580bb0-tmp" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.114915 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="10c165f1-53b4-4e7e-9cd1-00bb4d9cbc79" volumeName="kubernetes.io/empty-dir/10c165f1-53b4-4e7e-9cd1-00bb4d9cbc79-tmpfs" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.114932 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="26180f77-0b1a-4d0f-9ed0-a12fdee69817" volumeName="kubernetes.io/secret/26180f77-0b1a-4d0f-9ed0-a12fdee69817-serving-cert" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.114946 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3f7d2cef-b17b-43ba-a222-9e6e8d8352e2" volumeName="kubernetes.io/projected/3f7d2cef-b17b-43ba-a222-9e6e8d8352e2-kube-api-access-pb87l" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.114959 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ec083d-dc74-457e-b10f-3bde04e9e75e" volumeName="kubernetes.io/projected/49ec083d-dc74-457e-b10f-3bde04e9e75e-kube-api-access-zcjr9" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.114973 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7ff63c73-62a3-44b4-acd3-1b3df175794f" volumeName="kubernetes.io/projected/7ff63c73-62a3-44b4-acd3-1b3df175794f-kube-api-access-vfqc5" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.114984 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ee586416-6f56-4ea4-ad62-95de1e6df23b" volumeName="kubernetes.io/secret/ee586416-6f56-4ea4-ad62-95de1e6df23b-serving-cert" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.114998 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e93b5361-30e6-44fd-a59e-2bc410c59480" volumeName="kubernetes.io/projected/e93b5361-30e6-44fd-a59e-2bc410c59480-kube-api-access-4kc5q" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.115011 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0418ff42-7eac-4266-97b5-4df88623d066" volumeName="kubernetes.io/secret/0418ff42-7eac-4266-97b5-4df88623d066-cluster-monitoring-operator-tls" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.115022 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="30211469-7108-4820-a988-26fc4ced734e" volumeName="kubernetes.io/secret/30211469-7108-4820-a988-26fc4ced734e-serving-cert" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.115036 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="69eb8ba2-7bfb-4433-8951-08f89e7bcb5f" volumeName="kubernetes.io/projected/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f-kube-api-access-fw7mr" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.115048 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8e283f49-b85d-4789-a71f-3fcb5033cdf0" volumeName="kubernetes.io/projected/8e283f49-b85d-4789-a71f-3fcb5033cdf0-kube-api-access-zm59c" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.115062 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a60bc804-52e7-422a-87fd-ac4c5aa90cb3" volumeName="kubernetes.io/configmap/a60bc804-52e7-422a-87fd-ac4c5aa90cb3-service-ca-bundle" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.115073 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1eb851be-f157-48ea-9a39-1361b68d2639" volumeName="kubernetes.io/projected/1eb851be-f157-48ea-9a39-1361b68d2639-kube-api-access-nqhzl" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.115085 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4a19441e-e61b-4d58-85db-813ae88e1f9b" volumeName="kubernetes.io/configmap/4a19441e-e61b-4d58-85db-813ae88e1f9b-cni-binary-copy" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.115100 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a" volumeName="kubernetes.io/secret/5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a-serving-cert" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.115112 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="093f17f0-2818-4e24-b3c3-6ab4da9d21fb" volumeName="kubernetes.io/projected/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-kube-api-access-7nk8r" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.115131 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0ebf1330-e044-4ff5-8b48-2d667e0c5625" volumeName="kubernetes.io/configmap/0ebf1330-e044-4ff5-8b48-2d667e0c5625-config" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.115149 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="10c165f1-53b4-4e7e-9cd1-00bb4d9cbc79" volumeName="kubernetes.io/secret/10c165f1-53b4-4e7e-9cd1-00bb4d9cbc79-apiservice-cert" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.115164 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="10c165f1-53b4-4e7e-9cd1-00bb4d9cbc79" volumeName="kubernetes.io/secret/10c165f1-53b4-4e7e-9cd1-00bb4d9cbc79-webhook-cert" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.115185 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1b69fbf6-1ca5-413e-bffd-965730bcec1b" volumeName="kubernetes.io/projected/1b69fbf6-1ca5-413e-bffd-965730bcec1b-ca-certs" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.115201 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d831cb23-7411-4072-8273-c167d9afca28" volumeName="kubernetes.io/configmap/d831cb23-7411-4072-8273-c167d9afca28-images" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.115213 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2dd4279d-a1a9-450a-a061-9008cd1ea8e0" volumeName="kubernetes.io/secret/2dd4279d-a1a9-450a-a061-9008cd1ea8e0-srv-cert" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.115229 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5a7752f9-7b9a-451f-997a-e9f696d38b34" volumeName="kubernetes.io/configmap/5a7752f9-7b9a-451f-997a-e9f696d38b34-config" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.115249 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="69eb8ba2-7bfb-4433-8951-08f89e7bcb5f" volumeName="kubernetes.io/secret/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f-image-registry-operator-tls" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.115263 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d2cd5b23-e622-4b96-aee8-dbc942b73b4a" volumeName="kubernetes.io/projected/d2cd5b23-e622-4b96-aee8-dbc942b73b4a-kube-api-access-jljzc" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.115274 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e4541b7b-3f7f-4851-9bd9-26fcda5cab13" volumeName="kubernetes.io/secret/e4541b7b-3f7f-4851-9bd9-26fcda5cab13-serving-cert" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.115287 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2dd4279d-a1a9-450a-a061-9008cd1ea8e0" volumeName="kubernetes.io/projected/2dd4279d-a1a9-450a-a061-9008cd1ea8e0-kube-api-access-pnzt7" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.115297 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="30211469-7108-4820-a988-26fc4ced734e" volumeName="kubernetes.io/projected/30211469-7108-4820-a988-26fc4ced734e-kube-api-access-fncng" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.115313 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a60bc804-52e7-422a-87fd-ac4c5aa90cb3" volumeName="kubernetes.io/configmap/a60bc804-52e7-422a-87fd-ac4c5aa90cb3-trusted-ca-bundle" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.115330 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b70adfe9-94f1-44bc-85ce-498e5f0a1ca7" volumeName="kubernetes.io/projected/b70adfe9-94f1-44bc-85ce-498e5f0a1ca7-kube-api-access-6nlq2" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.115343 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c0861ccd-5e86-4277-9082-95f3133508a0" volumeName="kubernetes.io/configmap/c0861ccd-5e86-4277-9082-95f3133508a0-cco-trusted-ca" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.115361 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="139881ee-6cfa-4a7e-b002-63cece048d16" volumeName="kubernetes.io/projected/139881ee-6cfa-4a7e-b002-63cece048d16-kube-api-access-h9z8g" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.115374 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5a7752f9-7b9a-451f-997a-e9f696d38b34" volumeName="kubernetes.io/configmap/5a7752f9-7b9a-451f-997a-e9f696d38b34-etcd-service-ca" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.115388 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="76ba45a2-8945-4afe-b913-126c26725867" volumeName="kubernetes.io/configmap/76ba45a2-8945-4afe-b913-126c26725867-trusted-ca-bundle" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.115400 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7e5935ea-8d95-45e3-b836-c7892953ef3d" volumeName="kubernetes.io/configmap/7e5935ea-8d95-45e3-b836-c7892953ef3d-ovnkube-config" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.115410 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f0f5f3f3-0856-4da3-9157-15f65c6aba6e" volumeName="kubernetes.io/configmap/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-env-overrides" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.115427 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="093f17f0-2818-4e24-b3c3-6ab4da9d21fb" volumeName="kubernetes.io/configmap/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-multus-daemon-config" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.115446 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1cbcb403-a424-4496-8c5c-5eb5e42dfb93" volumeName="kubernetes.io/projected/1cbcb403-a424-4496-8c5c-5eb5e42dfb93-kube-api-access" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.115464 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c9de4939-680a-4e3e-89fd-e20ecb8b10f2" volumeName="kubernetes.io/configmap/c9de4939-680a-4e3e-89fd-e20ecb8b10f2-trusted-ca" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.115477 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e187516f-8f33-4c17-81d6-60c10b580bb0" volumeName="kubernetes.io/projected/e187516f-8f33-4c17-81d6-60c10b580bb0-kube-api-access-vg9kg" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.115489 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f" volumeName="kubernetes.io/secret/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f-encryption-config" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.115502 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f0f5f3f3-0856-4da3-9157-15f65c6aba6e" volumeName="kubernetes.io/projected/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-kube-api-access-2vklx" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.115513 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0d377285-0336-41b7-b48f-c44a7b563498" volumeName="kubernetes.io/projected/0d377285-0336-41b7-b48f-c44a7b563498-kube-api-access-7qn5v" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.115528 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="127c3f92-8283-4179-9e40-a12dcabaaa12" volumeName="kubernetes.io/secret/127c3f92-8283-4179-9e40-a12dcabaaa12-machine-approver-tls" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.115546 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="232c421d-96f0-4894-b8d8-74f43d02bbd3" volumeName="kubernetes.io/projected/232c421d-96f0-4894-b8d8-74f43d02bbd3-kube-api-access-fx4fw" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.115561 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="26180f77-0b1a-4d0f-9ed0-a12fdee69817" volumeName="kubernetes.io/projected/26180f77-0b1a-4d0f-9ed0-a12fdee69817-kube-api-access" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.115578 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2b3d1dc7-22f9-4c0c-802a-d7314894b255" volumeName="kubernetes.io/empty-dir/2b3d1dc7-22f9-4c0c-802a-d7314894b255-catalog-content" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.115591 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b70adfe9-94f1-44bc-85ce-498e5f0a1ca7" volumeName="kubernetes.io/configmap/b70adfe9-94f1-44bc-85ce-498e5f0a1ca7-config" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.115604 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c9de4939-680a-4e3e-89fd-e20ecb8b10f2" volumeName="kubernetes.io/secret/c9de4939-680a-4e3e-89fd-e20ecb8b10f2-metrics-tls" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.115620 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ee586416-6f56-4ea4-ad62-95de1e6df23b" volumeName="kubernetes.io/configmap/ee586416-6f56-4ea4-ad62-95de1e6df23b-trusted-ca-bundle" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.115641 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="127c3f92-8283-4179-9e40-a12dcabaaa12" volumeName="kubernetes.io/configmap/127c3f92-8283-4179-9e40-a12dcabaaa12-config" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.115659 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2262647b-c315-477a-93bd-f168c1810475" volumeName="kubernetes.io/configmap/2262647b-c315-477a-93bd-f168c1810475-service-ca" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.115673 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4a19441e-e61b-4d58-85db-813ae88e1f9b" volumeName="kubernetes.io/configmap/4a19441e-e61b-4d58-85db-813ae88e1f9b-whereabouts-configmap" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.115685 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5a7752f9-7b9a-451f-997a-e9f696d38b34" volumeName="kubernetes.io/projected/5a7752f9-7b9a-451f-997a-e9f696d38b34-kube-api-access-8b5zb" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.115702 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6bee226a-2a66-4032-8aba-2c8b82abcb6a" volumeName="kubernetes.io/empty-dir/6bee226a-2a66-4032-8aba-2c8b82abcb6a-utilities" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.115715 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e4541b7b-3f7f-4851-9bd9-26fcda5cab13" volumeName="kubernetes.io/projected/e4541b7b-3f7f-4851-9bd9-26fcda5cab13-kube-api-access" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.115734 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1482d789-884b-4337-b598-f0e2b71eb9f2" volumeName="kubernetes.io/projected/1482d789-884b-4337-b598-f0e2b71eb9f2-kube-api-access-m2h62" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.115747 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="232c421d-96f0-4894-b8d8-74f43d02bbd3" volumeName="kubernetes.io/configmap/232c421d-96f0-4894-b8d8-74f43d02bbd3-trusted-ca" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.115759 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2b3d1dc7-22f9-4c0c-802a-d7314894b255" volumeName="kubernetes.io/empty-dir/2b3d1dc7-22f9-4c0c-802a-d7314894b255-utilities" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.115775 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4a19441e-e61b-4d58-85db-813ae88e1f9b" volumeName="kubernetes.io/configmap/4a19441e-e61b-4d58-85db-813ae88e1f9b-cni-sysctl-allowlist" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.115789 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="634c0f6d-bce6-42cf-9253-80d1bcc7c507" volumeName="kubernetes.io/projected/634c0f6d-bce6-42cf-9253-80d1bcc7c507-kube-api-access-8cwmn" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.115807 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d5044ffd-0686-4679-9894-e696faf33699" volumeName="kubernetes.io/projected/d5044ffd-0686-4679-9894-e696faf33699-kube-api-access-mmhtb" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.115857 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d831cb23-7411-4072-8273-c167d9afca28" volumeName="kubernetes.io/configmap/d831cb23-7411-4072-8273-c167d9afca28-config" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.115877 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0031e3a9-b253-4dda-a890-bf3e4d8737e8" volumeName="kubernetes.io/empty-dir/0031e3a9-b253-4dda-a890-bf3e4d8737e8-utilities" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.115897 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="164586b1-f133-4427-8ab6-eb0839b79738" volumeName="kubernetes.io/projected/164586b1-f133-4427-8ab6-eb0839b79738-kube-api-access-r4stz" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.115912 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3f7d2cef-b17b-43ba-a222-9e6e8d8352e2" volumeName="kubernetes.io/secret/3f7d2cef-b17b-43ba-a222-9e6e8d8352e2-serving-cert" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.115929 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8efdcef9-9b31-4567-b7f9-cb59a894273d" volumeName="kubernetes.io/projected/8efdcef9-9b31-4567-b7f9-cb59a894273d-kube-api-access-cpsx7" seLinuxMountContext=""
Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.115947 18592 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="9ec89e27-4360-48f2-a7ca-5d823bda4510" volumeName="kubernetes.io/projected/9ec89e27-4360-48f2-a7ca-5d823bda4510-kube-api-access-vndvf" seLinuxMountContext="" Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.115960 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a" volumeName="kubernetes.io/projected/5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a-kube-api-access-cfvnn" seLinuxMountContext="" Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.115978 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5a7752f9-7b9a-451f-997a-e9f696d38b34" volumeName="kubernetes.io/configmap/5a7752f9-7b9a-451f-997a-e9f696d38b34-etcd-ca" seLinuxMountContext="" Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.115992 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="634c0f6d-bce6-42cf-9253-80d1bcc7c507" volumeName="kubernetes.io/secret/634c0f6d-bce6-42cf-9253-80d1bcc7c507-samples-operator-tls" seLinuxMountContext="" Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.116039 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e78b283b-981e-48d7-a5f2-53f8401766ea" volumeName="kubernetes.io/secret/e78b283b-981e-48d7-a5f2-53f8401766ea-proxy-tls" seLinuxMountContext="" Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.116059 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a6c4695c-da78-46b6-8f92-ca93c5ebb96b" volumeName="kubernetes.io/configmap/a6c4695c-da78-46b6-8f92-ca93c5ebb96b-proxy-ca-bundles" seLinuxMountContext="" Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.116088 18592 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="ee586416-6f56-4ea4-ad62-95de1e6df23b" volumeName="kubernetes.io/empty-dir/ee586416-6f56-4ea4-ad62-95de1e6df23b-snapshots" seLinuxMountContext="" Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.116111 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0ebf1330-e044-4ff5-8b48-2d667e0c5625" volumeName="kubernetes.io/secret/0ebf1330-e044-4ff5-8b48-2d667e0c5625-serving-cert" seLinuxMountContext="" Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.116134 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1cbcb403-a424-4496-8c5c-5eb5e42dfb93" volumeName="kubernetes.io/configmap/1cbcb403-a424-4496-8c5c-5eb5e42dfb93-config" seLinuxMountContext="" Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.116151 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5a7752f9-7b9a-451f-997a-e9f696d38b34" volumeName="kubernetes.io/secret/5a7752f9-7b9a-451f-997a-e9f696d38b34-etcd-client" seLinuxMountContext="" Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.116167 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7b485db9-29b5-45a1-a4fb-b4264c6bf2d6" volumeName="kubernetes.io/projected/7b485db9-29b5-45a1-a4fb-b4264c6bf2d6-kube-api-access-nfz6w" seLinuxMountContext="" Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.116180 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a6c4695c-da78-46b6-8f92-ca93c5ebb96b" volumeName="kubernetes.io/configmap/a6c4695c-da78-46b6-8f92-ca93c5ebb96b-config" seLinuxMountContext="" Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.116196 18592 reconstruct.go:130] "Volume is marked as uncertain 
and added into the actual state" pod="" podName="0d377285-0336-41b7-b48f-c44a7b563498" volumeName="kubernetes.io/configmap/0d377285-0336-41b7-b48f-c44a7b563498-config" seLinuxMountContext="" Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.116552 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="164586b1-f133-4427-8ab6-eb0839b79738" volumeName="kubernetes.io/configmap/164586b1-f133-4427-8ab6-eb0839b79738-env-overrides" seLinuxMountContext="" Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.116574 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7b485db9-29b5-45a1-a4fb-b4264c6bf2d6" volumeName="kubernetes.io/secret/7b485db9-29b5-45a1-a4fb-b4264c6bf2d6-metrics-tls" seLinuxMountContext="" Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.116588 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d5044ffd-0686-4679-9894-e696faf33699" volumeName="kubernetes.io/secret/d5044ffd-0686-4679-9894-e696faf33699-metrics-certs" seLinuxMountContext="" Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.116609 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ee586416-6f56-4ea4-ad62-95de1e6df23b" volumeName="kubernetes.io/configmap/ee586416-6f56-4ea4-ad62-95de1e6df23b-service-ca-bundle" seLinuxMountContext="" Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.116790 18592 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.116809 18592 factory.go:55] Registering systemd factory Mar 08 03:53:08.116900 master-0 
kubenswrapper[18592]: I0308 03:53:08.116817 18592 factory.go:221] Registration of the systemd container factory successfully Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.116847 18592 factory.go:103] Registering Raw factory Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.116859 18592 manager.go:1196] Started watching for new ooms in manager Mar 08 03:53:08.116900 master-0 kubenswrapper[18592]: I0308 03:53:08.116620 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f0f5f3f3-0856-4da3-9157-15f65c6aba6e" volumeName="kubernetes.io/secret/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-ovn-node-metrics-cert" seLinuxMountContext="" Mar 08 03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.117075 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1b69fbf6-1ca5-413e-bffd-965730bcec1b" volumeName="kubernetes.io/projected/1b69fbf6-1ca5-413e-bffd-965730bcec1b-kube-api-access-nfz27" seLinuxMountContext="" Mar 08 03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.117090 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1b700d17-83d2-46c8-afbc-e5774822eabe" volumeName="kubernetes.io/projected/1b700d17-83d2-46c8-afbc-e5774822eabe-kube-api-access-cv7sd" seLinuxMountContext="" Mar 08 03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.117100 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="69eb8ba2-7bfb-4433-8951-08f89e7bcb5f" volumeName="kubernetes.io/projected/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f-bound-sa-token" seLinuxMountContext="" Mar 08 03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.117180 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6cde5024-edf7-4fa4-8964-cabe7899578b" 
volumeName="kubernetes.io/secret/6cde5024-edf7-4fa4-8964-cabe7899578b-package-server-manager-serving-cert" seLinuxMountContext="" Mar 08 03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.117200 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a6c4695c-da78-46b6-8f92-ca93c5ebb96b" volumeName="kubernetes.io/projected/a6c4695c-da78-46b6-8f92-ca93c5ebb96b-kube-api-access-bd7d5" seLinuxMountContext="" Mar 08 03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.117215 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="54ad284e-d40e-4e69-b898-f5093952a0e6" volumeName="kubernetes.io/configmap/54ad284e-d40e-4e69-b898-f5093952a0e6-marketplace-trusted-ca" seLinuxMountContext="" Mar 08 03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.117224 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b70adfe9-94f1-44bc-85ce-498e5f0a1ca7" volumeName="kubernetes.io/configmap/b70adfe9-94f1-44bc-85ce-498e5f0a1ca7-images" seLinuxMountContext="" Mar 08 03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.117235 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7e5935ea-8d95-45e3-b836-c7892953ef3d" volumeName="kubernetes.io/configmap/7e5935ea-8d95-45e3-b836-c7892953ef3d-env-overrides" seLinuxMountContext="" Mar 08 03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.117249 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f" volumeName="kubernetes.io/configmap/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f-trusted-ca-bundle" seLinuxMountContext="" Mar 08 03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.117261 18592 manager.go:319] Starting recovery of all containers Mar 08 03:53:08.120146 master-0 kubenswrapper[18592]: I0308 
03:53:08.117260 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="093f17f0-2818-4e24-b3c3-6ab4da9d21fb" volumeName="kubernetes.io/configmap/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-cni-binary-copy" seLinuxMountContext="" Mar 08 03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.117296 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="10c165f1-53b4-4e7e-9cd1-00bb4d9cbc79" volumeName="kubernetes.io/projected/10c165f1-53b4-4e7e-9cd1-00bb4d9cbc79-kube-api-access-7fzmf" seLinuxMountContext="" Mar 08 03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.117346 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="139881ee-6cfa-4a7e-b002-63cece048d16" volumeName="kubernetes.io/secret/139881ee-6cfa-4a7e-b002-63cece048d16-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 08 03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.117362 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1b700d17-83d2-46c8-afbc-e5774822eabe" volumeName="kubernetes.io/configmap/1b700d17-83d2-46c8-afbc-e5774822eabe-auth-proxy-config" seLinuxMountContext="" Mar 08 03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.117388 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="33ed331b-89e9-45f8-ab3c-4533a77cc7b6" volumeName="kubernetes.io/secret/33ed331b-89e9-45f8-ab3c-4533a77cc7b6-cloud-controller-manager-operator-tls" seLinuxMountContext="" Mar 08 03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.117400 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7ff63c73-62a3-44b4-acd3-1b3df175794f" volumeName="kubernetes.io/empty-dir/7ff63c73-62a3-44b4-acd3-1b3df175794f-operand-assets" seLinuxMountContext="" Mar 08 
03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.117415 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a60bc804-52e7-422a-87fd-ac4c5aa90cb3" volumeName="kubernetes.io/configmap/a60bc804-52e7-422a-87fd-ac4c5aa90cb3-config" seLinuxMountContext="" Mar 08 03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.117426 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f" volumeName="kubernetes.io/configmap/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f-audit" seLinuxMountContext="" Mar 08 03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.117437 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0418ff42-7eac-4266-97b5-4df88623d066" volumeName="kubernetes.io/projected/0418ff42-7eac-4266-97b5-4df88623d066-kube-api-access-kmpdd" seLinuxMountContext="" Mar 08 03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.117452 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0ebf1330-e044-4ff5-8b48-2d667e0c5625" volumeName="kubernetes.io/projected/0ebf1330-e044-4ff5-8b48-2d667e0c5625-kube-api-access-hccv4" seLinuxMountContext="" Mar 08 03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.117463 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="232c421d-96f0-4894-b8d8-74f43d02bbd3" volumeName="kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-apiservice-cert" seLinuxMountContext="" Mar 08 03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.117477 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ec083d-dc74-457e-b10f-3bde04e9e75e" volumeName="kubernetes.io/configmap/49ec083d-dc74-457e-b10f-3bde04e9e75e-signing-cabundle" seLinuxMountContext="" Mar 08 
03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.117516 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="52b495ac-bb28-44f3-b925-3c54f86d5ec4" volumeName="kubernetes.io/projected/52b495ac-bb28-44f3-b925-3c54f86d5ec4-kube-api-access-dd549" seLinuxMountContext="" Mar 08 03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.117538 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b3eea925-73b3-4693-8f0e-6dd26107f60a" volumeName="kubernetes.io/secret/b3eea925-73b3-4693-8f0e-6dd26107f60a-cluster-storage-operator-serving-cert" seLinuxMountContext="" Mar 08 03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.117551 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f" volumeName="kubernetes.io/configmap/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f-etcd-serving-ca" seLinuxMountContext="" Mar 08 03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.117566 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d831cb23-7411-4072-8273-c167d9afca28" volumeName="kubernetes.io/projected/d831cb23-7411-4072-8273-c167d9afca28-kube-api-access-dwkwt" seLinuxMountContext="" Mar 08 03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.117580 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="54ad284e-d40e-4e69-b898-f5093952a0e6" volumeName="kubernetes.io/secret/54ad284e-d40e-4e69-b898-f5093952a0e6-marketplace-operator-metrics" seLinuxMountContext="" Mar 08 03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.117591 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0031e3a9-b253-4dda-a890-bf3e4d8737e8" 
volumeName="kubernetes.io/projected/0031e3a9-b253-4dda-a890-bf3e4d8737e8-kube-api-access-qhms8" seLinuxMountContext="" Mar 08 03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.117602 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1b700d17-83d2-46c8-afbc-e5774822eabe" volumeName="kubernetes.io/secret/1b700d17-83d2-46c8-afbc-e5774822eabe-cert" seLinuxMountContext="" Mar 08 03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.117618 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2f59fe81-deee-4ced-ae9d-f17752c82c4b" volumeName="kubernetes.io/projected/2f59fe81-deee-4ced-ae9d-f17752c82c4b-kube-api-access-bm7bw" seLinuxMountContext="" Mar 08 03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.117629 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ddfd0e7-fe76-41bc-b316-94505df81002" volumeName="kubernetes.io/secret/3ddfd0e7-fe76-41bc-b316-94505df81002-metrics-tls" seLinuxMountContext="" Mar 08 03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.117639 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4c5a0c1d-867a-4ce4-9570-ea66452c8db3" volumeName="kubernetes.io/configmap/4c5a0c1d-867a-4ce4-9570-ea66452c8db3-iptables-alerter-script" seLinuxMountContext="" Mar 08 03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.117703 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0418ff42-7eac-4266-97b5-4df88623d066" volumeName="kubernetes.io/configmap/0418ff42-7eac-4266-97b5-4df88623d066-telemetry-config" seLinuxMountContext="" Mar 08 03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.117720 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1eb851be-f157-48ea-9a39-1361b68d2639" 
volumeName="kubernetes.io/secret/1eb851be-f157-48ea-9a39-1361b68d2639-webhook-certs" seLinuxMountContext="" Mar 08 03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.117735 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="33ed331b-89e9-45f8-ab3c-4533a77cc7b6" volumeName="kubernetes.io/projected/33ed331b-89e9-45f8-ab3c-4533a77cc7b6-kube-api-access-hmsj5" seLinuxMountContext="" Mar 08 03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.117746 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8e283f49-b85d-4789-a71f-3fcb5033cdf0" volumeName="kubernetes.io/empty-dir/8e283f49-b85d-4789-a71f-3fcb5033cdf0-catalog-content" seLinuxMountContext="" Mar 08 03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.117757 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6bee226a-2a66-4032-8aba-2c8b82abcb6a" volumeName="kubernetes.io/projected/6bee226a-2a66-4032-8aba-2c8b82abcb6a-kube-api-access-tp98d" seLinuxMountContext="" Mar 08 03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.117770 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="76ba45a2-8945-4afe-b913-126c26725867" volumeName="kubernetes.io/secret/76ba45a2-8945-4afe-b913-126c26725867-serving-cert" seLinuxMountContext="" Mar 08 03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.117779 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3f7d2cef-b17b-43ba-a222-9e6e8d8352e2" volumeName="kubernetes.io/configmap/3f7d2cef-b17b-43ba-a222-9e6e8d8352e2-config" seLinuxMountContext="" Mar 08 03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.117795 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="76ba45a2-8945-4afe-b913-126c26725867" 
volumeName="kubernetes.io/secret/76ba45a2-8945-4afe-b913-126c26725867-encryption-config" seLinuxMountContext="" Mar 08 03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.117805 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a6c4695c-da78-46b6-8f92-ca93c5ebb96b" volumeName="kubernetes.io/configmap/a6c4695c-da78-46b6-8f92-ca93c5ebb96b-client-ca" seLinuxMountContext="" Mar 08 03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.117817 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="127c3f92-8283-4179-9e40-a12dcabaaa12" volumeName="kubernetes.io/projected/127c3f92-8283-4179-9e40-a12dcabaaa12-kube-api-access-zdn9r" seLinuxMountContext="" Mar 08 03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.117881 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="164586b1-f133-4427-8ab6-eb0839b79738" volumeName="kubernetes.io/configmap/164586b1-f133-4427-8ab6-eb0839b79738-ovnkube-identity-cm" seLinuxMountContext="" Mar 08 03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.117893 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2b3d1dc7-22f9-4c0c-802a-d7314894b255" volumeName="kubernetes.io/projected/2b3d1dc7-22f9-4c0c-802a-d7314894b255-kube-api-access-zfgc6" seLinuxMountContext="" Mar 08 03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.117907 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2f59fe81-deee-4ced-ae9d-f17752c82c4b" volumeName="kubernetes.io/empty-dir/2f59fe81-deee-4ced-ae9d-f17752c82c4b-cache" seLinuxMountContext="" Mar 08 03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.117919 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2f59fe81-deee-4ced-ae9d-f17752c82c4b" 
volumeName="kubernetes.io/projected/2f59fe81-deee-4ced-ae9d-f17752c82c4b-ca-certs" seLinuxMountContext="" Mar 08 03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.117929 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b70adfe9-94f1-44bc-85ce-498e5f0a1ca7" volumeName="kubernetes.io/secret/b70adfe9-94f1-44bc-85ce-498e5f0a1ca7-machine-api-operator-tls" seLinuxMountContext="" Mar 08 03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.117942 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c0861ccd-5e86-4277-9082-95f3133508a0" volumeName="kubernetes.io/projected/c0861ccd-5e86-4277-9082-95f3133508a0-kube-api-access-n7rsc" seLinuxMountContext="" Mar 08 03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.117961 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c9de4939-680a-4e3e-89fd-e20ecb8b10f2" volumeName="kubernetes.io/projected/c9de4939-680a-4e3e-89fd-e20ecb8b10f2-kube-api-access-29dpg" seLinuxMountContext="" Mar 08 03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.117977 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0031e3a9-b253-4dda-a890-bf3e4d8737e8" volumeName="kubernetes.io/empty-dir/0031e3a9-b253-4dda-a890-bf3e4d8737e8-catalog-content" seLinuxMountContext="" Mar 08 03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.117989 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2262647b-c315-477a-93bd-f168c1810475" volumeName="kubernetes.io/secret/2262647b-c315-477a-93bd-f168c1810475-serving-cert" seLinuxMountContext="" Mar 08 03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.118059 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="e187516f-8f33-4c17-81d6-60c10b580bb0" volumeName="kubernetes.io/empty-dir/e187516f-8f33-4c17-81d6-60c10b580bb0-etc-tuned" seLinuxMountContext="" Mar 08 03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.118076 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f0f5f3f3-0856-4da3-9157-15f65c6aba6e" volumeName="kubernetes.io/configmap/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-ovnkube-config" seLinuxMountContext="" Mar 08 03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.118087 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="69eb8ba2-7bfb-4433-8951-08f89e7bcb5f" volumeName="kubernetes.io/configmap/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f-trusted-ca" seLinuxMountContext="" Mar 08 03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.118100 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6bee226a-2a66-4032-8aba-2c8b82abcb6a" volumeName="kubernetes.io/empty-dir/6bee226a-2a66-4032-8aba-2c8b82abcb6a-catalog-content" seLinuxMountContext="" Mar 08 03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.118114 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6cde5024-edf7-4fa4-8964-cabe7899578b" volumeName="kubernetes.io/projected/6cde5024-edf7-4fa4-8964-cabe7899578b-kube-api-access-x997v" seLinuxMountContext="" Mar 08 03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.118125 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="127c3f92-8283-4179-9e40-a12dcabaaa12" volumeName="kubernetes.io/configmap/127c3f92-8283-4179-9e40-a12dcabaaa12-auth-proxy-config" seLinuxMountContext="" Mar 08 03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.118139 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="1482d789-884b-4337-b598-f0e2b71eb9f2" volumeName="kubernetes.io/secret/1482d789-884b-4337-b598-f0e2b71eb9f2-srv-cert" seLinuxMountContext="" Mar 08 03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.118151 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1b69fbf6-1ca5-413e-bffd-965730bcec1b" volumeName="kubernetes.io/secret/1b69fbf6-1ca5-413e-bffd-965730bcec1b-catalogserver-certs" seLinuxMountContext="" Mar 08 03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.118161 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1cbcb403-a424-4496-8c5c-5eb5e42dfb93" volumeName="kubernetes.io/secret/1cbcb403-a424-4496-8c5c-5eb5e42dfb93-serving-cert" seLinuxMountContext="" Mar 08 03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.118176 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="26180f77-0b1a-4d0f-9ed0-a12fdee69817" volumeName="kubernetes.io/configmap/26180f77-0b1a-4d0f-9ed0-a12fdee69817-config" seLinuxMountContext="" Mar 08 03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.118245 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="76ba45a2-8945-4afe-b913-126c26725867" volumeName="kubernetes.io/configmap/76ba45a2-8945-4afe-b913-126c26725867-etcd-serving-ca" seLinuxMountContext="" Mar 08 03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.118578 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7e5935ea-8d95-45e3-b836-c7892953ef3d" volumeName="kubernetes.io/secret/7e5935ea-8d95-45e3-b836-c7892953ef3d-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 08 03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.118590 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f" volumeName="kubernetes.io/secret/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f-etcd-client" seLinuxMountContext="" Mar 08 03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.118600 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d831cb23-7411-4072-8273-c167d9afca28" volumeName="kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cert" seLinuxMountContext="" Mar 08 03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.118612 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f0f5f3f3-0856-4da3-9157-15f65c6aba6e" volumeName="kubernetes.io/configmap/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-ovnkube-script-lib" seLinuxMountContext="" Mar 08 03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.118624 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0d377285-0336-41b7-b48f-c44a7b563498" volumeName="kubernetes.io/secret/0d377285-0336-41b7-b48f-c44a7b563498-serving-cert" seLinuxMountContext="" Mar 08 03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.118640 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4c5a0c1d-867a-4ce4-9570-ea66452c8db3" volumeName="kubernetes.io/projected/4c5a0c1d-867a-4ce4-9570-ea66452c8db3-kube-api-access-mkzb2" seLinuxMountContext="" Mar 08 03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.118651 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8efdcef9-9b31-4567-b7f9-cb59a894273d" volumeName="kubernetes.io/secret/8efdcef9-9b31-4567-b7f9-cb59a894273d-metrics-tls" seLinuxMountContext="" Mar 08 03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.118660 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f" volumeName="kubernetes.io/projected/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f-kube-api-access-4v45k" seLinuxMountContext="" Mar 08 03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.118673 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f" volumeName="kubernetes.io/secret/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f-serving-cert" seLinuxMountContext="" Mar 08 03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.118721 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ee586416-6f56-4ea4-ad62-95de1e6df23b" volumeName="kubernetes.io/projected/ee586416-6f56-4ea4-ad62-95de1e6df23b-kube-api-access-sxxhh" seLinuxMountContext="" Mar 08 03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.118743 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0918ba32-8e55-48d0-8e50-027c0dcb4bbd" volumeName="kubernetes.io/empty-dir/0918ba32-8e55-48d0-8e50-027c0dcb4bbd-available-featuregates" seLinuxMountContext="" Mar 08 03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.118756 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0918ba32-8e55-48d0-8e50-027c0dcb4bbd" volumeName="kubernetes.io/projected/0918ba32-8e55-48d0-8e50-027c0dcb4bbd-kube-api-access-mghmh" seLinuxMountContext="" Mar 08 03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.118766 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2262647b-c315-477a-93bd-f168c1810475" volumeName="kubernetes.io/projected/2262647b-c315-477a-93bd-f168c1810475-kube-api-access" seLinuxMountContext="" Mar 08 03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.118780 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="232c421d-96f0-4894-b8d8-74f43d02bbd3" volumeName="kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-node-tuning-operator-tls" seLinuxMountContext="" Mar 08 03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.118791 18592 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ec083d-dc74-457e-b10f-3bde04e9e75e" volumeName="kubernetes.io/secret/49ec083d-dc74-457e-b10f-3bde04e9e75e-signing-key" seLinuxMountContext="" Mar 08 03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.118801 18592 reconstruct.go:97] "Volume reconstruction finished" Mar 08 03:53:08.120146 master-0 kubenswrapper[18592]: I0308 03:53:08.118809 18592 reconciler.go:26] "Reconciler: start to sync state" Mar 08 03:53:08.123237 master-0 kubenswrapper[18592]: I0308 03:53:08.123045 18592 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 08 03:53:08.128987 master-0 kubenswrapper[18592]: E0308 03:53:08.128629 18592 kubelet.go:1495] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Mar 08 03:53:08.139608 master-0 kubenswrapper[18592]: I0308 03:53:08.139550 18592 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 08 03:53:08.141556 master-0 kubenswrapper[18592]: I0308 03:53:08.141535 18592 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Mar 08 03:53:08.141609 master-0 kubenswrapper[18592]: I0308 03:53:08.141572 18592 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 08 03:53:08.141800 master-0 kubenswrapper[18592]: I0308 03:53:08.141778 18592 kubelet.go:2335] "Starting kubelet main sync loop" Mar 08 03:53:08.141857 master-0 kubenswrapper[18592]: E0308 03:53:08.141838 18592 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 08 03:53:08.143534 master-0 kubenswrapper[18592]: I0308 03:53:08.143509 18592 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 08 03:53:08.156535 master-0 kubenswrapper[18592]: I0308 03:53:08.156502 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-7f65c457f5-6fhhs_5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a/kube-storage-version-migrator-operator/1.log" Mar 08 03:53:08.156603 master-0 kubenswrapper[18592]: I0308 03:53:08.156539 18592 generic.go:334] "Generic (PLEG): container finished" podID="5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a" containerID="2f6ec83521bbab74297dfc2bb7addc122c614b4ebd158773a1f21c9c8a08aa06" exitCode=255 Mar 08 03:53:08.158633 master-0 kubenswrapper[18592]: I0308 03:53:08.158605 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-7c6989d6c4-slm72_a60bc804-52e7-422a-87fd-ac4c5aa90cb3/authentication-operator/1.log" Mar 08 03:53:08.158695 master-0 kubenswrapper[18592]: I0308 03:53:08.158632 18592 generic.go:334] "Generic (PLEG): container finished" podID="a60bc804-52e7-422a-87fd-ac4c5aa90cb3" containerID="376406ceea2c5527fe4c957342f4e7bbd7c621b656b4317d1368005ebb85d7c7" exitCode=255 Mar 08 03:53:08.178882 master-0 kubenswrapper[18592]: I0308 03:53:08.178848 18592 generic.go:334] "Generic 
(PLEG): container finished" podID="8e52bef89f4b50e4590a1719bcc5d7e5" containerID="3df151c3da265182304d84afb0b3bc1e42416ef6485b53e3bd88733c8055b421" exitCode=0 Mar 08 03:53:08.178882 master-0 kubenswrapper[18592]: I0308 03:53:08.178878 18592 generic.go:334] "Generic (PLEG): container finished" podID="8e52bef89f4b50e4590a1719bcc5d7e5" containerID="338e07cf4149947d0b4bb7aee072ff8d4da6cb3eeb924ae9f2fa6dc0d8d523b1" exitCode=0 Mar 08 03:53:08.178882 master-0 kubenswrapper[18592]: I0308 03:53:08.178886 18592 generic.go:334] "Generic (PLEG): container finished" podID="8e52bef89f4b50e4590a1719bcc5d7e5" containerID="74cf1dbcbe0d060e62a1cff77950d3cf19f4f4c11ebaceeae2f072445a583ffa" exitCode=0 Mar 08 03:53:08.180092 master-0 kubenswrapper[18592]: I0308 03:53:08.180070 18592 generic.go:334] "Generic (PLEG): container finished" podID="9b02b2c9-9ee3-436c-aa36-0c23e5fdf07b" containerID="e137d58d0275a0b444de45e72047e1d303bf2156296279cd1f222cf4c2e05cac" exitCode=0 Mar 08 03:53:08.191712 master-0 kubenswrapper[18592]: I0308 03:53:08.191671 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_e9add8df47182fc2eaf8cd78016ebe72/kube-rbac-proxy-crio/2.log" Mar 08 03:53:08.192145 master-0 kubenswrapper[18592]: I0308 03:53:08.192111 18592 generic.go:334] "Generic (PLEG): container finished" podID="e9add8df47182fc2eaf8cd78016ebe72" containerID="1611cfa5e175032b10c844270b1926150f7a6bf4a58e7bfa0e9ab7a757d448fe" exitCode=1 Mar 08 03:53:08.192145 master-0 kubenswrapper[18592]: I0308 03:53:08.192143 18592 generic.go:334] "Generic (PLEG): container finished" podID="e9add8df47182fc2eaf8cd78016ebe72" containerID="809953c4a3d0d1d245b0d287db991ef24c93f664dbc39c226e2a89fc2ba7da3d" exitCode=0 Mar 08 03:53:08.193427 master-0 kubenswrapper[18592]: I0308 03:53:08.193401 18592 generic.go:334] "Generic (PLEG): container finished" podID="30211469-7108-4820-a988-26fc4ced734e" 
containerID="47c3f7232d0f0bc4de9dbe2ca382d3e0709c3d618e0b06a088f2ef41c6b071e7" exitCode=0 Mar 08 03:53:08.197721 master-0 kubenswrapper[18592]: I0308 03:53:08.197701 18592 generic.go:334] "Generic (PLEG): container finished" podID="7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7" containerID="d11b35d3ea3d0150cbdfe887feb70180d8c9d1802a844e12699e549dc588011a" exitCode=0 Mar 08 03:53:08.200845 master-0 kubenswrapper[18592]: I0308 03:53:08.200808 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-jghp5_d831cb23-7411-4072-8273-c167d9afca28/cluster-baremetal-operator/0.log" Mar 08 03:53:08.200896 master-0 kubenswrapper[18592]: I0308 03:53:08.200865 18592 generic.go:334] "Generic (PLEG): container finished" podID="d831cb23-7411-4072-8273-c167d9afca28" containerID="712603a1b97b084eebc58893e05cde574b9f0f2e5360a98b0fe0e6acfea60707" exitCode=1 Mar 08 03:53:08.211872 master-0 kubenswrapper[18592]: I0308 03:53:08.211789 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-86d7cdfdfb-chpl6_26180f77-0b1a-4d0f-9ed0-a12fdee69817/kube-controller-manager-operator/1.log" Mar 08 03:53:08.211872 master-0 kubenswrapper[18592]: I0308 03:53:08.211836 18592 generic.go:334] "Generic (PLEG): container finished" podID="26180f77-0b1a-4d0f-9ed0-a12fdee69817" containerID="b9a51bfe829084894104463976cade708d7a51f90ef15a899d7341f663daf1dc" exitCode=255 Mar 08 03:53:08.215258 master-0 kubenswrapper[18592]: I0308 03:53:08.215234 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-5884b9cd56-vzms7_5a7752f9-7b9a-451f-997a-e9f696d38b34/etcd-operator/1.log" Mar 08 03:53:08.215320 master-0 kubenswrapper[18592]: I0308 03:53:08.215272 18592 generic.go:334] "Generic (PLEG): container finished" podID="5a7752f9-7b9a-451f-997a-e9f696d38b34" 
containerID="9cb40f8e472021b6bf28adddecb51a371c5cff426f2d0e4b345adbb4c28df1e5" exitCode=255 Mar 08 03:53:08.217483 master-0 kubenswrapper[18592]: I0308 03:53:08.217457 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-7c649bf6d4-99d2k_3ddfd0e7-fe76-41bc-b316-94505df81002/network-operator/1.log" Mar 08 03:53:08.217550 master-0 kubenswrapper[18592]: I0308 03:53:08.217487 18592 generic.go:334] "Generic (PLEG): container finished" podID="3ddfd0e7-fe76-41bc-b316-94505df81002" containerID="e9dbfd241ad84e1bb7af7ba76075dd9557271049d4b44017afb55ac9ce7ffb9b" exitCode=255 Mar 08 03:53:08.237384 master-0 kubenswrapper[18592]: I0308 03:53:08.237359 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-7f8b8b6f4c-8h6fj_1b69fbf6-1ca5-413e-bffd-965730bcec1b/manager/0.log" Mar 08 03:53:08.237849 master-0 kubenswrapper[18592]: I0308 03:53:08.237800 18592 generic.go:334] "Generic (PLEG): container finished" podID="1b69fbf6-1ca5-413e-bffd-965730bcec1b" containerID="55aa7553b7b737c589cdd0270a8ec23cc64ce136f8130219ce1dabd7e976b992" exitCode=1 Mar 08 03:53:08.239336 master-0 kubenswrapper[18592]: I0308 03:53:08.239322 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-5-master-0_0f865279-e751-456d-8c96-6381f8b45ce1/installer/0.log" Mar 08 03:53:08.239751 master-0 kubenswrapper[18592]: I0308 03:53:08.239729 18592 generic.go:334] "Generic (PLEG): container finished" podID="0f865279-e751-456d-8c96-6381f8b45ce1" containerID="9289f2928e2e95c2ade5890aeb0e93be12cf91a6e92bf8866de144086be0fb16" exitCode=1 Mar 08 03:53:08.242063 master-0 kubenswrapper[18592]: E0308 03:53:08.241889 18592 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Mar 08 03:53:08.243629 master-0 kubenswrapper[18592]: I0308 03:53:08.243606 18592 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_cdcecc61ff5eeb08bd2a3ac12599e4f9/kube-apiserver-check-endpoints/0.log" Mar 08 03:53:08.246066 master-0 kubenswrapper[18592]: I0308 03:53:08.246048 18592 generic.go:334] "Generic (PLEG): container finished" podID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerID="222e8ca389049069b4efae8be97f8ff91fe671c190224c8b6f05f39079d825cf" exitCode=255 Mar 08 03:53:08.246140 master-0 kubenswrapper[18592]: I0308 03:53:08.246128 18592 generic.go:334] "Generic (PLEG): container finished" podID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerID="1570887c60156b6fbbdb4d53007ec6f0d11589a7feaf962ad0cf0545fdd489d2" exitCode=0 Mar 08 03:53:08.255716 master-0 kubenswrapper[18592]: I0308 03:53:08.255662 18592 generic.go:334] "Generic (PLEG): container finished" podID="f0f5f3f3-0856-4da3-9157-15f65c6aba6e" containerID="e04ec38e07d8783fc2ade88328995e37c561797980580532badf766ca8953982" exitCode=0 Mar 08 03:53:08.258247 master-0 kubenswrapper[18592]: I0308 03:53:08.258212 18592 generic.go:334] "Generic (PLEG): container finished" podID="2b3d1dc7-22f9-4c0c-802a-d7314894b255" containerID="9a8e4c15154302a8bb3171ed8ab697b8ee8b484e44617249609422b38b588432" exitCode=0 Mar 08 03:53:08.258247 master-0 kubenswrapper[18592]: I0308 03:53:08.258238 18592 generic.go:334] "Generic (PLEG): container finished" podID="2b3d1dc7-22f9-4c0c-802a-d7314894b255" containerID="3762e6cf17228a56d31f9c4b27fe58a04fecaafe047e1561d4d5135a98072ca1" exitCode=0 Mar 08 03:53:08.265231 master-0 kubenswrapper[18592]: I0308 03:53:08.265209 18592 generic.go:334] "Generic (PLEG): container finished" podID="4a19441e-e61b-4d58-85db-813ae88e1f9b" containerID="1362430063d3256452d2e164a40d29841ede63c548ec607c9dddfdd02a33cead" exitCode=0 Mar 08 03:53:08.265307 master-0 kubenswrapper[18592]: I0308 03:53:08.265295 18592 generic.go:334] "Generic (PLEG): container finished" podID="4a19441e-e61b-4d58-85db-813ae88e1f9b" 
containerID="e3fd81814e45a4dba9c86317c3e1475a8abfc09eed557e2f1e1628bc58babab2" exitCode=0 Mar 08 03:53:08.265379 master-0 kubenswrapper[18592]: I0308 03:53:08.265367 18592 generic.go:334] "Generic (PLEG): container finished" podID="4a19441e-e61b-4d58-85db-813ae88e1f9b" containerID="6808f9225d491b34c3cf7a01d443c6732f48fff26280b582719d87525223329a" exitCode=0 Mar 08 03:53:08.265434 master-0 kubenswrapper[18592]: I0308 03:53:08.265423 18592 generic.go:334] "Generic (PLEG): container finished" podID="4a19441e-e61b-4d58-85db-813ae88e1f9b" containerID="5b762e908370687f296be27f837eacb773e5b2c7f10d7523e57f3d511196e87d" exitCode=0 Mar 08 03:53:08.265486 master-0 kubenswrapper[18592]: I0308 03:53:08.265476 18592 generic.go:334] "Generic (PLEG): container finished" podID="4a19441e-e61b-4d58-85db-813ae88e1f9b" containerID="3fee6f2c5c3a300e3baa43dfa5cafdfb86c438810bfee3e881783f584b000768" exitCode=0 Mar 08 03:53:08.265545 master-0 kubenswrapper[18592]: I0308 03:53:08.265532 18592 generic.go:334] "Generic (PLEG): container finished" podID="4a19441e-e61b-4d58-85db-813ae88e1f9b" containerID="46250aae369897400569e4111703b276aadaa65120ad7d4c39a342c4f39e31c8" exitCode=0 Mar 08 03:53:08.268100 master-0 kubenswrapper[18592]: I0308 03:53:08.268054 18592 generic.go:334] "Generic (PLEG): container finished" podID="c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f" containerID="1b384f11136941d514d3d61afdf401bff1b3b2c0f0bf1870bb3feb8e7a8ab041" exitCode=0 Mar 08 03:53:08.275141 master-0 kubenswrapper[18592]: I0308 03:53:08.275099 18592 generic.go:334] "Generic (PLEG): container finished" podID="9c95709c-c3cb-46fb-afe7-626c8013f3c6" containerID="5f3bdf25350f3735e74258b774375768d3fdf1215b280e2ea275e2a1b21b5161" exitCode=0 Mar 08 03:53:08.277746 master-0 kubenswrapper[18592]: I0308 03:53:08.277729 18592 generic.go:334] "Generic (PLEG): container finished" podID="1cbcb403-a424-4496-8c5c-5eb5e42dfb93" containerID="3b576ae60c0b63ec0db45afc74d3ab2b7a31ef872c28479883b2bca1465128e0" exitCode=0 Mar 08 
03:53:08.282177 master-0 kubenswrapper[18592]: I0308 03:53:08.282162 18592 generic.go:334] "Generic (PLEG): container finished" podID="d191ff84-f4e4-4d99-8cbb-c10771e68baf" containerID="ab476bbfa4b9ad96fb2348ef6d3d71a1b60822ddb1c07515c6d2e7af7a64fce8" exitCode=0 Mar 08 03:53:08.298352 master-0 kubenswrapper[18592]: I0308 03:53:08.298307 18592 generic.go:334] "Generic (PLEG): container finished" podID="0918ba32-8e55-48d0-8e50-027c0dcb4bbd" containerID="1523789c3f1ce2ac99a20cd9e6a22cf2201e9f542fc114c065ce9962d3d4debb" exitCode=0 Mar 08 03:53:08.298352 master-0 kubenswrapper[18592]: I0308 03:53:08.298337 18592 generic.go:334] "Generic (PLEG): container finished" podID="0918ba32-8e55-48d0-8e50-027c0dcb4bbd" containerID="c9cf4c65bcca879489d6d583c27aa9216a027640b05dfba4d536ce4f8192a79a" exitCode=0 Mar 08 03:53:08.302659 master-0 kubenswrapper[18592]: I0308 03:53:08.302627 18592 generic.go:334] "Generic (PLEG): container finished" podID="54ad284e-d40e-4e69-b898-f5093952a0e6" containerID="791aee9d23f28d5b9bc6bbbcd3f26705c245a61021bebb20a57835608ad72cab" exitCode=0 Mar 08 03:53:08.315617 master-0 kubenswrapper[18592]: I0308 03:53:08.315574 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-69b6fc6b88-kg795_0d377285-0336-41b7-b48f-c44a7b563498/service-ca-operator/1.log" Mar 08 03:53:08.315617 master-0 kubenswrapper[18592]: I0308 03:53:08.315605 18592 generic.go:334] "Generic (PLEG): container finished" podID="0d377285-0336-41b7-b48f-c44a7b563498" containerID="c699bcd2d347a8b5955755f0e95a151eaeb522f32880f9765692ad3e2e0369c6" exitCode=255 Mar 08 03:53:08.320080 master-0 kubenswrapper[18592]: I0308 03:53:08.320044 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-2-master-0_47643289-ac4b-425d-8ea1-913b6ca39ee0/installer/0.log" Mar 08 03:53:08.320148 master-0 kubenswrapper[18592]: I0308 03:53:08.320090 18592 generic.go:334] "Generic (PLEG): container 
finished" podID="47643289-ac4b-425d-8ea1-913b6ca39ee0" containerID="18670cb65f485400f1fdb45bed0a06f4e06d21d135459ea29b3c0fcd10f2d210" exitCode=1 Mar 08 03:53:08.325608 master-0 kubenswrapper[18592]: I0308 03:53:08.325588 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-6598bfb6c4-75qmb_2f59fe81-deee-4ced-ae9d-f17752c82c4b/manager/0.log" Mar 08 03:53:08.325669 master-0 kubenswrapper[18592]: I0308 03:53:08.325619 18592 generic.go:334] "Generic (PLEG): container finished" podID="2f59fe81-deee-4ced-ae9d-f17752c82c4b" containerID="3059f49f388319ee646920103084d28d8b0077750e77df3225c9bad4053dd550" exitCode=1 Mar 08 03:53:08.328316 master-0 kubenswrapper[18592]: I0308 03:53:08.328236 18592 generic.go:334] "Generic (PLEG): container finished" podID="76ba45a2-8945-4afe-b913-126c26725867" containerID="41eba15c47abd981b40ebf82cbf86f9a574f89d62b79b6b757b4ed7a35e235d0" exitCode=0 Mar 08 03:53:08.329796 master-0 kubenswrapper[18592]: I0308 03:53:08.329690 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-ggzm8_164586b1-f133-4427-8ab6-eb0839b79738/approver/0.log" Mar 08 03:53:08.329982 master-0 kubenswrapper[18592]: I0308 03:53:08.329931 18592 generic.go:334] "Generic (PLEG): container finished" podID="164586b1-f133-4427-8ab6-eb0839b79738" containerID="eca6f5647fbdf9b3ef8c7044a7fb91cd16de860543c74991829e340da4a238fe" exitCode=1 Mar 08 03:53:08.333889 master-0 kubenswrapper[18592]: I0308 03:53:08.333856 18592 generic.go:334] "Generic (PLEG): container finished" podID="8e283f49-b85d-4789-a71f-3fcb5033cdf0" containerID="5ff54f1371455049a6c7805b6b5fc557245d737690eacfa02101a51a25748851" exitCode=0 Mar 08 03:53:08.333889 master-0 kubenswrapper[18592]: I0308 03:53:08.333885 18592 generic.go:334] "Generic (PLEG): container finished" podID="8e283f49-b85d-4789-a71f-3fcb5033cdf0" 
containerID="97d43cbd349f40b6639c942720c80d6ef418f56edf2fb6db63bcde4714444dea" exitCode=0 Mar 08 03:53:08.336522 master-0 kubenswrapper[18592]: I0308 03:53:08.336487 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_cluster-storage-operator-6fbfc8dc8f-nm8fj_b3eea925-73b3-4693-8f0e-6dd26107f60a/cluster-storage-operator/1.log" Mar 08 03:53:08.336592 master-0 kubenswrapper[18592]: I0308 03:53:08.336533 18592 generic.go:334] "Generic (PLEG): container finished" podID="b3eea925-73b3-4693-8f0e-6dd26107f60a" containerID="11981809b9cc27f184966b17ad1925dff97bd3f4b8d6d288eb4740ef6e4ff5eb" exitCode=255 Mar 08 03:53:08.346292 master-0 kubenswrapper[18592]: I0308 03:53:08.346263 18592 generic.go:334] "Generic (PLEG): container finished" podID="7e5935ea-8d95-45e3-b836-c7892953ef3d" containerID="7fe9302ada8235a3afd5b8f3fc53b3d920a5fbae69778891c3722690a5eb8590" exitCode=0 Mar 08 03:53:08.351265 master-0 kubenswrapper[18592]: I0308 03:53:08.351225 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler-operator_openshift-kube-scheduler-operator-5c74bfc494-g6n58_e4541b7b-3f7f-4851-9bd9-26fcda5cab13/kube-scheduler-operator-container/1.log" Mar 08 03:53:08.351335 master-0 kubenswrapper[18592]: I0308 03:53:08.351280 18592 generic.go:334] "Generic (PLEG): container finished" podID="e4541b7b-3f7f-4851-9bd9-26fcda5cab13" containerID="38b4abf7d4c06fafbe1f2864c946ad3648498ee7c33fbece731408c279494ae4" exitCode=255 Mar 08 03:53:08.353398 master-0 kubenswrapper[18592]: I0308 03:53:08.353340 18592 generic.go:334] "Generic (PLEG): container finished" podID="f78c05e1499b533b83f091333d61f045" containerID="28bcae4d70566beaa13732bd5095c7d8d6a2ad6f8be2ed4c2e4b067a051fc9f1" exitCode=0 Mar 08 03:53:08.353398 master-0 kubenswrapper[18592]: I0308 03:53:08.353375 18592 generic.go:334] "Generic (PLEG): container finished" podID="f78c05e1499b533b83f091333d61f045" 
containerID="b00978d6151280d243ba1f6c8276b934ba5c5276b57bc3800284f048820f905f" exitCode=0 Mar 08 03:53:08.354904 master-0 kubenswrapper[18592]: I0308 03:53:08.354875 18592 generic.go:334] "Generic (PLEG): container finished" podID="a1a56802af72ce1aac6b5077f1695ac0" containerID="f5a0acfb3a3f4f285f366c3abcb3f9d3bebb3626e4a976de0dab27a634745185" exitCode=1 Mar 08 03:53:08.364133 master-0 kubenswrapper[18592]: I0308 03:53:08.364083 18592 generic.go:334] "Generic (PLEG): container finished" podID="ee586416-6f56-4ea4-ad62-95de1e6df23b" containerID="644f0c7d4552f15957ecfc56f2d37a06ec2757ddcc7c2c371f0c34b92aa63533" exitCode=0 Mar 08 03:53:08.386598 master-0 kubenswrapper[18592]: I0308 03:53:08.386551 18592 generic.go:334] "Generic (PLEG): container finished" podID="7bf40ef9-a79a-4f5d-933c-5276edcccb4b" containerID="20197cef49bb05fb75f2e7eda65c3e92dc7a4af95343b25ff91e78b1d42be6fb" exitCode=1 Mar 08 03:53:08.404950 master-0 kubenswrapper[18592]: I0308 03:53:08.404875 18592 generic.go:334] "Generic (PLEG): container finished" podID="0031e3a9-b253-4dda-a890-bf3e4d8737e8" containerID="f0b47768bbdcebe6eae54026a9de105f6f07e381bc83e55351fb7b2741b60d83" exitCode=0 Mar 08 03:53:08.404950 master-0 kubenswrapper[18592]: I0308 03:53:08.404939 18592 generic.go:334] "Generic (PLEG): container finished" podID="0031e3a9-b253-4dda-a890-bf3e4d8737e8" containerID="c1f98a5af9a428504d2938393c5cc9f6182f4f61d848078dae052d1d8e74b50e" exitCode=0 Mar 08 03:53:08.411837 master-0 kubenswrapper[18592]: I0308 03:53:08.408603 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-h4qlp_9ec89e27-4360-48f2-a7ca-5d823bda4510/snapshot-controller/0.log" Mar 08 03:53:08.411837 master-0 kubenswrapper[18592]: I0308 03:53:08.408655 18592 generic.go:334] "Generic (PLEG): container finished" podID="9ec89e27-4360-48f2-a7ca-5d823bda4510" containerID="e1cf094994e913e66c5a9e6e155292c3e34468235cb173dcf1919a0eed0dd4ca" exitCode=1 Mar 08 
03:53:08.427927 master-0 kubenswrapper[18592]: I0308 03:53:08.427878 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-olm-operator_cluster-olm-operator-77899cf6d-x9h9q_7ff63c73-62a3-44b4-acd3-1b3df175794f/cluster-olm-operator/1.log" Mar 08 03:53:08.430494 master-0 kubenswrapper[18592]: I0308 03:53:08.430461 18592 generic.go:334] "Generic (PLEG): container finished" podID="7ff63c73-62a3-44b4-acd3-1b3df175794f" containerID="7b9e0618571c76237a54adfbc9471783f3afade6ddbedbe9d5d1037a9f845813" exitCode=255 Mar 08 03:53:08.430804 master-0 kubenswrapper[18592]: I0308 03:53:08.430782 18592 generic.go:334] "Generic (PLEG): container finished" podID="7ff63c73-62a3-44b4-acd3-1b3df175794f" containerID="4387f662533a08651e81f74de2284f00d980e00423a2533ad7cf2a0699bd920f" exitCode=0 Mar 08 03:53:08.430804 master-0 kubenswrapper[18592]: I0308 03:53:08.430802 18592 generic.go:334] "Generic (PLEG): container finished" podID="7ff63c73-62a3-44b4-acd3-1b3df175794f" containerID="b570308b9f2efb1190e0fe5138fb6f12ce5146071b70dc77e8c4a2c70d7d56d5" exitCode=0 Mar 08 03:53:08.442464 master-0 kubenswrapper[18592]: E0308 03:53:08.442431 18592 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Mar 08 03:53:08.451082 master-0 kubenswrapper[18592]: I0308 03:53:08.451034 18592 generic.go:334] "Generic (PLEG): container finished" podID="6bee226a-2a66-4032-8aba-2c8b82abcb6a" containerID="f5a03a43b579d01ff866bab8c195d17a338aa8d73cf258ca83335e51946cf09b" exitCode=0 Mar 08 03:53:08.451082 master-0 kubenswrapper[18592]: I0308 03:53:08.451075 18592 generic.go:334] "Generic (PLEG): container finished" podID="6bee226a-2a66-4032-8aba-2c8b82abcb6a" containerID="e70077d0bc2f435dbefd1bd93a5bf3f06dc8fe76044ed3f37fa4f6ef147e9f4c" exitCode=0 Mar 08 03:53:08.454493 master-0 kubenswrapper[18592]: I0308 03:53:08.454462 18592 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_installer-1-master-0_baab6171-046d-4fc9-b7d7-ff2fd12f185f/installer/0.log" Mar 08 03:53:08.454570 master-0 kubenswrapper[18592]: I0308 03:53:08.454513 18592 generic.go:334] "Generic (PLEG): container finished" podID="baab6171-046d-4fc9-b7d7-ff2fd12f185f" containerID="5a92eed331c18522564f92e3e6e14d9dcb5be24514d5ff22fbf01a140de4cfee" exitCode=1 Mar 08 03:53:08.466621 master-0 kubenswrapper[18592]: I0308 03:53:08.466555 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-677db989d6-t77qr_c9de4939-680a-4e3e-89fd-e20ecb8b10f2/ingress-operator/0.log" Mar 08 03:53:08.466685 master-0 kubenswrapper[18592]: I0308 03:53:08.466610 18592 generic.go:334] "Generic (PLEG): container finished" podID="c9de4939-680a-4e3e-89fd-e20ecb8b10f2" containerID="db28e69e1ea518493719876e18b2faf675fe251b59f240840b24dd0b6d115924" exitCode=1 Mar 08 03:53:08.470711 master-0 kubenswrapper[18592]: I0308 03:53:08.470686 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-8565d84698-kt66j_0ebf1330-e044-4ff5-8b48-2d667e0c5625/openshift-controller-manager-operator/1.log" Mar 08 03:53:08.470770 master-0 kubenswrapper[18592]: I0308 03:53:08.470724 18592 generic.go:334] "Generic (PLEG): container finished" podID="0ebf1330-e044-4ff5-8b48-2d667e0c5625" containerID="cae216678d94c10a368ff595527d708d87bd43ed6865eacedbf892861c47fe3a" exitCode=255 Mar 08 03:53:08.648120 master-0 kubenswrapper[18592]: I0308 03:53:08.648094 18592 manager.go:324] Recovery completed Mar 08 03:53:08.722949 master-0 kubenswrapper[18592]: I0308 03:53:08.722845 18592 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 08 03:53:08.722949 master-0 kubenswrapper[18592]: I0308 03:53:08.722882 18592 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 08 03:53:08.722949 master-0 kubenswrapper[18592]: I0308 03:53:08.722925 18592 
state_mem.go:36] "Initialized new in-memory state store" Mar 08 03:53:08.723152 master-0 kubenswrapper[18592]: I0308 03:53:08.723121 18592 state_mem.go:88] "Updated default CPUSet" cpuSet="" Mar 08 03:53:08.723251 master-0 kubenswrapper[18592]: I0308 03:53:08.723138 18592 state_mem.go:96] "Updated CPUSet assignments" assignments={} Mar 08 03:53:08.723251 master-0 kubenswrapper[18592]: I0308 03:53:08.723162 18592 state_checkpoint.go:136] "State checkpoint: restored state from checkpoint" Mar 08 03:53:08.723251 master-0 kubenswrapper[18592]: I0308 03:53:08.723170 18592 state_checkpoint.go:137] "State checkpoint: defaultCPUSet" defaultCpuSet="" Mar 08 03:53:08.723251 master-0 kubenswrapper[18592]: I0308 03:53:08.723178 18592 policy_none.go:49] "None policy: Start" Mar 08 03:53:08.725594 master-0 kubenswrapper[18592]: I0308 03:53:08.725566 18592 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 08 03:53:08.725594 master-0 kubenswrapper[18592]: I0308 03:53:08.725595 18592 state_mem.go:35] "Initializing new in-memory state store" Mar 08 03:53:08.725777 master-0 kubenswrapper[18592]: I0308 03:53:08.725754 18592 state_mem.go:75] "Updated machine memory state" Mar 08 03:53:08.725777 master-0 kubenswrapper[18592]: I0308 03:53:08.725770 18592 state_checkpoint.go:82] "State checkpoint: restored state from checkpoint" Mar 08 03:53:08.743330 master-0 kubenswrapper[18592]: I0308 03:53:08.742440 18592 manager.go:334] "Starting Device Plugin manager" Mar 08 03:53:08.743330 master-0 kubenswrapper[18592]: I0308 03:53:08.742501 18592 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 08 03:53:08.743330 master-0 kubenswrapper[18592]: I0308 03:53:08.742527 18592 server.go:79] "Starting device plugin registration server" Mar 08 03:53:08.743330 master-0 kubenswrapper[18592]: I0308 03:53:08.742972 18592 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 08 03:53:08.743330 
master-0 kubenswrapper[18592]: I0308 03:53:08.742986 18592 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 08 03:53:08.746035 master-0 kubenswrapper[18592]: I0308 03:53:08.745687 18592 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Mar 08 03:53:08.746035 master-0 kubenswrapper[18592]: I0308 03:53:08.745815 18592 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Mar 08 03:53:08.746035 master-0 kubenswrapper[18592]: I0308 03:53:08.745839 18592 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Mar 08 03:53:08.842643 master-0 kubenswrapper[18592]: I0308 03:53:08.842540 18592 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-master-0","openshift-etcd/etcd-master-0","openshift-kube-apiserver/kube-apiserver-master-0","openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0","openshift-kube-controller-manager/kube-controller-manager-master-0","kube-system/bootstrap-kube-scheduler-master-0"]
Mar 08 03:53:08.843354 master-0 kubenswrapper[18592]: I0308 03:53:08.843329 18592 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:53:08.844968 master-0 kubenswrapper[18592]: I0308 03:53:08.844895 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerStarted","Data":"5d6ee3d775aef6ab7485b93f260e08787c53fee03078cb5743281f3a00c6731a"}
Mar 08 03:53:08.844968 master-0 kubenswrapper[18592]: I0308 03:53:08.844968 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerStarted","Data":"743b6d0d3328cb1e5fd90f39085d9403830aebc1de828659a1d9c0fc9660f4a2"}
Mar 08 03:53:08.844968 master-0 kubenswrapper[18592]: I0308 03:53:08.844978 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerStarted","Data":"a8391451b1644c11ad363666bf4d456fe86930894f61c4a9474dc40e3b26d78b"}
Mar 08 03:53:08.845094 master-0 kubenswrapper[18592]: I0308 03:53:08.844986 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerStarted","Data":"afc2ac57ed877bb9156ca731d8cd2f853ddb9f606dc1ae3cba22d206076d25c5"}
Mar 08 03:53:08.845094 master-0 kubenswrapper[18592]: I0308 03:53:08.844995 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerStarted","Data":"b968b6dde7d2ded374cd8ae315cb70a664d6c49c41163b10766b7ed997cf628a"}
Mar 08 03:53:08.845094 master-0 kubenswrapper[18592]: I0308 03:53:08.845003 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerDied","Data":"3df151c3da265182304d84afb0b3bc1e42416ef6485b53e3bd88733c8055b421"}
Mar 08 03:53:08.845094 master-0 kubenswrapper[18592]: I0308 03:53:08.845032 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerDied","Data":"338e07cf4149947d0b4bb7aee072ff8d4da6cb3eeb924ae9f2fa6dc0d8d523b1"}
Mar 08 03:53:08.845094 master-0 kubenswrapper[18592]: I0308 03:53:08.845040 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerDied","Data":"74cf1dbcbe0d060e62a1cff77950d3cf19f4f4c11ebaceeae2f072445a583ffa"}
Mar 08 03:53:08.845094 master-0 kubenswrapper[18592]: I0308 03:53:08.845050 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerStarted","Data":"bc8712c641a3b3ccd887956343c178e81a448d9908293a089aa942f0944b3018"}
Mar 08 03:53:08.845094 master-0 kubenswrapper[18592]: I0308 03:53:08.845063 18592 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ed8c48f565d4b8be16c6ba185f91ba3e8904463e008be6f2e6c969571e27427"
Mar 08 03:53:08.845094 master-0 kubenswrapper[18592]: I0308 03:53:08.845071 18592 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ffd7e8cf7a9593e9910a67c41b6e95af26b8d49eaf5fd007129fe49d1978425"
Mar 08 03:53:08.845094 master-0 kubenswrapper[18592]: I0308 03:53:08.845081 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"f417e14665db2ffffa887ce21c9ff0ed","Type":"ContainerStarted","Data":"0bbd3b73d51b06514693db13893aa6ce69354b9ab4f18d355441678c9479dc95"}
Mar 08 03:53:08.845314 master-0 kubenswrapper[18592]: I0308 03:53:08.845109 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"f417e14665db2ffffa887ce21c9ff0ed","Type":"ContainerStarted","Data":"e169486121bbc52c7ca877ad3d815dc4a35f6b8ee220e0fd43b9661c26e26d92"}
Mar 08 03:53:08.845314 master-0 kubenswrapper[18592]: I0308 03:53:08.845121 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerStarted","Data":"2c8736855304b1b6928cbfdc88bfeac2e98662a8092340731da4a5d87e7dfa39"}
Mar 08 03:53:08.845314 master-0 kubenswrapper[18592]: I0308 03:53:08.845130 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerDied","Data":"1611cfa5e175032b10c844270b1926150f7a6bf4a58e7bfa0e9ab7a757d448fe"}
Mar 08 03:53:08.845314 master-0 kubenswrapper[18592]: I0308 03:53:08.845141 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerDied","Data":"809953c4a3d0d1d245b0d287db991ef24c93f664dbc39c226e2a89fc2ba7da3d"}
Mar 08 03:53:08.845314 master-0 kubenswrapper[18592]: I0308 03:53:08.845301 18592 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:53:08.845442 master-0 kubenswrapper[18592]: I0308 03:53:08.845319 18592 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:53:08.845442 master-0 kubenswrapper[18592]: I0308 03:53:08.845347 18592 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:53:08.845502 master-0 kubenswrapper[18592]: I0308 03:53:08.845447 18592 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 08 03:53:08.845712 master-0 kubenswrapper[18592]: I0308 03:53:08.845151 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerStarted","Data":"ed400b0e1b21fe5e4ef5385a05444bf39db4c2fd9c754a3d6c45427d3b29ef99"}
Mar 08 03:53:08.845765 master-0 kubenswrapper[18592]: I0308 03:53:08.845718 18592 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d29086141609fa12579213578ed2d780ee581ff60e20ceb99a14fefd44548805"
Mar 08 03:53:08.845797 master-0 kubenswrapper[18592]: I0308 03:53:08.845775 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"a12e67e5b53279c862df229026c8d16c","Type":"ContainerStarted","Data":"6979155324a9775c0f334fc4aa6afa070463810c3191479ea2bb2dbfe2843ea3"}
Mar 08 03:53:08.845797 master-0 kubenswrapper[18592]: I0308 03:53:08.845786 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"a12e67e5b53279c862df229026c8d16c","Type":"ContainerStarted","Data":"d92fdcd0bd88e0c579bd858a04d7e6e266a7f72aec3885543e0de2cee51140ac"}
Mar 08 03:53:08.845877 master-0 kubenswrapper[18592]: I0308 03:53:08.845797 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"a12e67e5b53279c862df229026c8d16c","Type":"ContainerStarted","Data":"f5cec83dc05dfae95933e7d5e4646a470fd6b2150eeed3507d1b115fc1dfcb34"}
Mar 08 03:53:08.845877 master-0 kubenswrapper[18592]: I0308 03:53:08.845807 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"a12e67e5b53279c862df229026c8d16c","Type":"ContainerStarted","Data":"9d1a3af9468d450b8ce515e818a31e6bfe522f30f01bccb1080ebaabf3f6d3f1"}
Mar 08 03:53:08.845877 master-0 kubenswrapper[18592]: I0308 03:53:08.845815 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"a12e67e5b53279c862df229026c8d16c","Type":"ContainerStarted","Data":"9af5ebd3eee3c3de99e27a671d715ba12c7da929014abc4a9a4424a8fb8aad4e"}
Mar 08 03:53:08.845877 master-0 kubenswrapper[18592]: I0308 03:53:08.845858 18592 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46d4f01a8b97928ed12b356249f2c516cd9275fd33a04ced54c1129e7817bd38"
Mar 08 03:53:08.845877 master-0 kubenswrapper[18592]: I0308 03:53:08.845868 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"cdcecc61ff5eeb08bd2a3ac12599e4f9","Type":"ContainerDied","Data":"222e8ca389049069b4efae8be97f8ff91fe671c190224c8b6f05f39079d825cf"}
Mar 08 03:53:08.845877 master-0 kubenswrapper[18592]: I0308 03:53:08.845877 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"cdcecc61ff5eeb08bd2a3ac12599e4f9","Type":"ContainerStarted","Data":"5bc596a566a004204d8781e6880a298269208812a64f684e8f90b164a5a846fe"}
Mar 08 03:53:08.846024 master-0 kubenswrapper[18592]: I0308 03:53:08.845886 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"cdcecc61ff5eeb08bd2a3ac12599e4f9","Type":"ContainerStarted","Data":"865460c774c2766f8b86ebf8237c6f8af6ae97a526279d303aebe43f358dbff8"}
Mar 08 03:53:08.846024 master-0 kubenswrapper[18592]: I0308 03:53:08.845894 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"cdcecc61ff5eeb08bd2a3ac12599e4f9","Type":"ContainerStarted","Data":"77d9f19c7fff32bc633b77d809c0704eaf44b3aee7eeaf009773338793ad2dd5"}
Mar 08 03:53:08.846024 master-0 kubenswrapper[18592]: I0308 03:53:08.845919 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"cdcecc61ff5eeb08bd2a3ac12599e4f9","Type":"ContainerStarted","Data":"80a13278743d26b7b1321c7095283277668741654b1e182af894d61a0ac675ff"}
Mar 08 03:53:08.846024 master-0 kubenswrapper[18592]: I0308 03:53:08.845931 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"cdcecc61ff5eeb08bd2a3ac12599e4f9","Type":"ContainerDied","Data":"1570887c60156b6fbbdb4d53007ec6f0d11589a7feaf962ad0cf0545fdd489d2"}
Mar 08 03:53:08.846024 master-0 kubenswrapper[18592]: I0308 03:53:08.845940 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"cdcecc61ff5eeb08bd2a3ac12599e4f9","Type":"ContainerStarted","Data":"65320fce1a0608c5e233ad7039ccb30dfdee6ba6adad349424d74cf44c08e2db"}
Mar 08 03:53:08.846024 master-0 kubenswrapper[18592]: I0308 03:53:08.846005 18592 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75f4eb3be0a13ebaf0adfb5408aab17726c88687c273eeef67438587a2bdc267"
Mar 08 03:53:08.846024 master-0 kubenswrapper[18592]: I0308 03:53:08.846016 18592 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="404a0d2a7b8ca71df97070480a7d3f018db46911d3635a93f748cb0ea044da91"
Mar 08 03:53:08.846024 master-0 kubenswrapper[18592]: I0308 03:53:08.846030 18592 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="490c7966b29451303b6f42ccaf5b249c853bf84b3cd4be6c7f5b23f3365fe971"
Mar 08 03:53:08.846225 master-0 kubenswrapper[18592]: I0308 03:53:08.846077 18592 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="399bfaebb310269b0096a7a05e3f4655b777b39ab76bccec71929c119f7598f0"
Mar 08 03:53:08.846225 master-0 kubenswrapper[18592]: I0308 03:53:08.846091 18592 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68a9c31781b9210f187f31760c081b7a914e3f7c545e30283d68c9a55506f854"
Mar 08 03:53:08.846225 master-0 kubenswrapper[18592]: I0308 03:53:08.846160 18592 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="758a2c2e2af7455b02804a595f36886f4047114b8dbd25a8393a292e35b7254e"
Mar 08 03:53:08.846225 master-0 kubenswrapper[18592]: I0308 03:53:08.846168 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"a1a56802af72ce1aac6b5077f1695ac0","Type":"ContainerStarted","Data":"255dd70f3aa78d8d4e9fb681404034a533a64980f735eecd5cf5d8b6ad4838a5"}
Mar 08 03:53:08.846225 master-0 kubenswrapper[18592]: I0308 03:53:08.846177 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"a1a56802af72ce1aac6b5077f1695ac0","Type":"ContainerDied","Data":"f5a0acfb3a3f4f285f366c3abcb3f9d3bebb3626e4a976de0dab27a634745185"}
Mar 08 03:53:08.846225 master-0 kubenswrapper[18592]: I0308 03:53:08.846186 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"a1a56802af72ce1aac6b5077f1695ac0","Type":"ContainerStarted","Data":"5b8c31076d1db49fd8c133661fbbc131a58892112131cf3118f58212505e7460"}
Mar 08 03:53:08.846225 master-0 kubenswrapper[18592]: I0308 03:53:08.846209 18592 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6174c0cced28744679d07cd6bfda1e5016fe917384d58e904dd1b71ae6c4d184"
Mar 08 03:53:08.846395 master-0 kubenswrapper[18592]: I0308 03:53:08.846256 18592 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2dd6f50bb704e85814a264a6b3647cea280a2063c980541df1082c59aa92b82"
Mar 08 03:53:08.860848 master-0 kubenswrapper[18592]: E0308 03:53:08.860795 18592 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-apiserver-startup-monitor-master-0\" already exists" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 08 03:53:08.861092 master-0 kubenswrapper[18592]: E0308 03:53:08.860858 18592 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-rbac-proxy-crio-master-0\" already exists" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 08 03:53:08.861200 master-0 kubenswrapper[18592]: E0308 03:53:08.860889 18592 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-master-0\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 08 03:53:08.863374 master-0 kubenswrapper[18592]: I0308 03:53:08.863343 18592 kubelet_node_status.go:115] "Node was previously registered" node="master-0"
Mar 08 03:53:08.863457 master-0 kubenswrapper[18592]: I0308 03:53:08.863440 18592 kubelet_node_status.go:79] "Successfully registered node" node="master-0"
Mar 08 03:53:08.928841 master-0 kubenswrapper[18592]: I0308 03:53:08.928772 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 08 03:53:08.928841 master-0 kubenswrapper[18592]: I0308 03:53:08.928818 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 08 03:53:08.929063 master-0 kubenswrapper[18592]: I0308 03:53:08.928868 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/a12e67e5b53279c862df229026c8d16c-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"a12e67e5b53279c862df229026c8d16c\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 08 03:53:08.929063 master-0 kubenswrapper[18592]: I0308 03:53:08.928950 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/a12e67e5b53279c862df229026c8d16c-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"a12e67e5b53279c862df229026c8d16c\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 08 03:53:08.929063 master-0 kubenswrapper[18592]: I0308 03:53:08.928995 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-static-pod-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0"
Mar 08 03:53:08.929206 master-0 kubenswrapper[18592]: I0308 03:53:08.929138 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-cert-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0"
Mar 08 03:53:08.929259 master-0 kubenswrapper[18592]: I0308 03:53:08.929232 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-usr-local-bin\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0"
Mar 08 03:53:08.929289 master-0 kubenswrapper[18592]: I0308 03:53:08.929275 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 08 03:53:08.929335 master-0 kubenswrapper[18592]: I0308 03:53:08.929311 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 08 03:53:08.929364 master-0 kubenswrapper[18592]: I0308 03:53:08.929350 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 08 03:53:08.929396 master-0 kubenswrapper[18592]: I0308 03:53:08.929384 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 08 03:53:08.929437 master-0 kubenswrapper[18592]: I0308 03:53:08.929415 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 08 03:53:08.929466 master-0 kubenswrapper[18592]: I0308 03:53:08.929452 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 08 03:53:08.929505 master-0 kubenswrapper[18592]: I0308 03:53:08.929482 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-resource-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0"
Mar 08 03:53:08.929534 master-0 kubenswrapper[18592]: I0308 03:53:08.929521 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-log-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0"
Mar 08 03:53:08.929567 master-0 kubenswrapper[18592]: I0308 03:53:08.929552 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 08 03:53:08.929594 master-0 kubenswrapper[18592]: I0308 03:53:08.929582 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-data-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0"
Mar 08 03:53:08.929627 master-0 kubenswrapper[18592]: I0308 03:53:08.929612 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 08 03:53:08.929731 master-0 kubenswrapper[18592]: I0308 03:53:08.929691 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 08 03:53:08.929775 master-0 kubenswrapper[18592]: I0308 03:53:08.929741 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 08 03:53:08.960256 master-0 kubenswrapper[18592]: E0308 03:53:08.960215 18592 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-scheduler-master-0\" already exists" pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 08 03:53:08.966555 master-0 kubenswrapper[18592]: E0308 03:53:08.966535 18592 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-master-0\" already exists" pod="openshift-etcd/etcd-master-0"
Mar 08 03:53:09.030524 master-0 kubenswrapper[18592]: I0308 03:53:09.030374 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 08 03:53:09.030524 master-0 kubenswrapper[18592]: I0308 03:53:09.030434 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 08 03:53:09.030524 master-0 kubenswrapper[18592]: I0308 03:53:09.030455 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 08 03:53:09.030524 master-0 kubenswrapper[18592]: I0308 03:53:09.030472 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-data-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0"
Mar 08 03:53:09.030524 master-0 kubenswrapper[18592]: I0308 03:53:09.030499 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-cert-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0"
Mar 08 03:53:09.030524 master-0 kubenswrapper[18592]: I0308 03:53:09.030521 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-usr-local-bin\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0"
Mar 08 03:53:09.030524 master-0 kubenswrapper[18592]: I0308 03:53:09.030540 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 08 03:53:09.031116 master-0 kubenswrapper[18592]: I0308 03:53:09.030555 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 08 03:53:09.031116 master-0 kubenswrapper[18592]: I0308 03:53:09.030574 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 08 03:53:09.031116 master-0 kubenswrapper[18592]: I0308 03:53:09.030588 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/a12e67e5b53279c862df229026c8d16c-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"a12e67e5b53279c862df229026c8d16c\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 08 03:53:09.031116 master-0 kubenswrapper[18592]: I0308 03:53:09.030605 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/a12e67e5b53279c862df229026c8d16c-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"a12e67e5b53279c862df229026c8d16c\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 08 03:53:09.031116 master-0 kubenswrapper[18592]: I0308 03:53:09.030621 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-static-pod-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0"
Mar 08 03:53:09.031116 master-0 kubenswrapper[18592]: I0308 03:53:09.030637 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 08 03:53:09.031116 master-0 kubenswrapper[18592]: I0308 03:53:09.030652 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-resource-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0"
Mar 08 03:53:09.031116 master-0 kubenswrapper[18592]: I0308 03:53:09.030666 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-log-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0"
Mar 08 03:53:09.031116 master-0 kubenswrapper[18592]: I0308 03:53:09.030681 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 08 03:53:09.031116 master-0 kubenswrapper[18592]: I0308 03:53:09.030695 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 08 03:53:09.031116 master-0 kubenswrapper[18592]: I0308 03:53:09.030710 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 08 03:53:09.031116 master-0 kubenswrapper[18592]: I0308 03:53:09.030725 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 08 03:53:09.031116 master-0 kubenswrapper[18592]: I0308 03:53:09.030743 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 08 03:53:09.031116 master-0 kubenswrapper[18592]: I0308 03:53:09.030789 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 08 03:53:09.031116 master-0 kubenswrapper[18592]: I0308 03:53:09.030855 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 08 03:53:09.031116 master-0 kubenswrapper[18592]: I0308 03:53:09.030879 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 08 03:53:09.031116 master-0 kubenswrapper[18592]: I0308 03:53:09.030903 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 08 03:53:09.031116 master-0 kubenswrapper[18592]: I0308 03:53:09.030925 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-data-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0"
Mar 08 03:53:09.031116 master-0 kubenswrapper[18592]: I0308 03:53:09.030957 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-cert-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0"
Mar 08 03:53:09.031116 master-0 kubenswrapper[18592]: I0308 03:53:09.030979 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-usr-local-bin\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0"
Mar 08 03:53:09.031116 master-0 kubenswrapper[18592]: I0308 03:53:09.031000 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 08 03:53:09.031116 master-0 kubenswrapper[18592]: I0308 03:53:09.031023 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 08 03:53:09.031116 master-0 kubenswrapper[18592]: I0308 03:53:09.031046 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 08 03:53:09.031116 master-0 kubenswrapper[18592]: I0308 03:53:09.031070 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/a12e67e5b53279c862df229026c8d16c-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"a12e67e5b53279c862df229026c8d16c\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 08 03:53:09.031116 master-0 kubenswrapper[18592]: I0308 03:53:09.031100 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/a12e67e5b53279c862df229026c8d16c-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"a12e67e5b53279c862df229026c8d16c\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 08 03:53:09.031116 master-0 kubenswrapper[18592]: I0308 03:53:09.031150 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-static-pod-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0"
Mar 08 03:53:09.032572 master-0 kubenswrapper[18592]: I0308 03:53:09.031175 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 08 03:53:09.032572 master-0 kubenswrapper[18592]: I0308 03:53:09.031198 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-resource-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0"
Mar 08 03:53:09.032572 master-0 kubenswrapper[18592]: I0308 03:53:09.031225 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-log-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0"
Mar 08 03:53:09.032572 master-0 kubenswrapper[18592]: I0308 03:53:09.031253 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 08 03:53:09.032572 master-0 kubenswrapper[18592]: I0308 03:53:09.031272 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 08 03:53:09.032572 master-0 kubenswrapper[18592]: I0308 03:53:09.031291 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 08 03:53:09.032572 master-0 kubenswrapper[18592]: I0308 03:53:09.031311 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 08 03:53:09.058701 master-0 kubenswrapper[18592]: I0308 03:53:09.058645 18592 apiserver.go:52] "Watching apiserver"
Mar 08 03:53:09.078297 master-0 kubenswrapper[18592]: I0308 03:53:09.078215 18592 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 08 03:53:09.080239 master-0 kubenswrapper[18592]: I0308 03:53:09.080187 18592 kubelet.go:2421] "SyncLoop ADD" source="api"
pods=["openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-572xh","openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qddlp","openshift-operator-lifecycle-manager/packageserver-67b55db9c7-4qgpb","openshift-etcd/installer-1-master-0","openshift-insights/insights-operator-8f89dfddd-4mr6p","openshift-machine-config-operator/machine-config-operator-fdb5c78b5-2vjh2","openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-c46zz","openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-8h6fj","openshift-machine-api/machine-api-operator-84bf6db4f9-tdrf8","openshift-marketplace/redhat-marketplace-4h8qm","openshift-machine-api/control-plane-machine-set-operator-6686554ddc-7bcsk","openshift-cluster-node-tuning-operator/tuned-bpwdb","openshift-kube-apiserver/installer-1-retry-1-master-0","openshift-kube-controller-manager/installer-3-master-0","openshift-multus/multus-rpppb","openshift-network-operator/network-operator-7c649bf6d4-99d2k","kube-system/bootstrap-kube-scheduler-master-0","openshift-etcd/etcd-master-0","openshift-kube-scheduler/installer-5-master-0","openshift-multus/multus-admission-controller-8d675b596-j8pv6","openshift-service-ca-operator/service-ca-operator-69b6fc6b88-kg795","openshift-etcd-operator/etcd-operator-5884b9cd56-vzms7","openshift-machine-api/cluster-autoscaler-operator-69576476f7-bv67b","openshift-marketplace/certified-operators-p8nq8","openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-75qmb","openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-nm8fj","openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-kt66j","openshift-kube-controller-manager/kube-controller-manager-master-0","openshift-kube-storage-version-migrator/migrator-57ccdf9b5-wqldq","openshift-network-operator/iptables-alerter-7c28p","openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-x9h9q","openshift-cloud-credential-operator/cloud-credential-operator-
55d85b7b47-5v6gs","openshift-kube-controller-manager/installer-2-master-0","openshift-machine-config-operator/kube-rbac-proxy-crio-master-0","openshift-marketplace/marketplace-operator-64bf9778cb-9sw2d","openshift-ovn-kubernetes/ovnkube-node-jc6rf","openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-f89bv","openshift-ingress-operator/ingress-operator-677db989d6-t77qr","openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-8gfmf","openshift-cluster-machine-approver/machine-approver-754bdc9f9d-z4sdd","openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-xhbrl","openshift-dns/dns-default-4pjsn","openshift-marketplace/redhat-operators-rnz4w","openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-qlfgq","openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-bjjfh","openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-lrnks","openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-qjv52","openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-h4qlp","openshift-controller-manager/controller-manager-6999cc9685-kprrt","openshift-kube-apiserver/installer-1-master-0","openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-jghp5","openshift-multus/network-metrics-daemon-schjl","openshift-service-ca/service-ca-84bfdbbb7f-gj69x","assisted-installer/assisted-installer-controller-66tqt","openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-chpl6","openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-6fhhs","openshift-cluster-version/cluster-version-operator-8c9c967c7-zq9rp","openshift-dns-operator/dns-operator-589895fbb7-xttlz","openshift-dns/node-resolver-wjl9v","openshift-network-diagnostics/network-check-target-xmgpj","openshift-route-controller-manager/route-controller-manager-5cb98fbc8c-xnx9t","openshift-apiserver-operator/openshift-apiserver-
operator-799b6db4d7-75682","openshift-multus/multus-additional-cni-plugins-g564l","openshift-network-node-identity/network-node-identity-ggzm8","openshift-marketplace/community-operators-lwt58","openshift-authentication-operator/authentication-operator-7c6989d6c4-slm72","openshift-config-operator/openshift-config-operator-64488f9d78-vfgfp","openshift-kube-apiserver/kube-apiserver-master-0","openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0","openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-g6n58","openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-clqwj","openshift-oauth-apiserver/apiserver-7695b9f8b5-4jpgl","openshift-apiserver/apiserver-6b779d99b8-7kmck"] Mar 08 03:53:09.083318 master-0 kubenswrapper[18592]: I0308 03:53:09.083295 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-2vjh2" Mar 08 03:53:09.083951 master-0 kubenswrapper[18592]: I0308 03:53:09.083934 18592 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="assisted-installer/assisted-installer-controller-66tqt" Mar 08 03:53:09.085329 master-0 kubenswrapper[18592]: I0308 03:53:09.085310 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt" Mar 08 03:53:09.085756 master-0 kubenswrapper[18592]: I0308 03:53:09.085738 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 08 03:53:09.091811 master-0 kubenswrapper[18592]: I0308 03:53:09.091322 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 08 03:53:09.099843 master-0 kubenswrapper[18592]: I0308 03:53:09.098302 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt" Mar 08 03:53:09.099843 master-0 kubenswrapper[18592]: I0308 03:53:09.098724 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 08 03:53:09.099843 master-0 kubenswrapper[18592]: I0308 03:53:09.099355 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert" Mar 08 03:53:09.101245 master-0 kubenswrapper[18592]: I0308 03:53:09.101002 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 08 03:53:09.111461 master-0 kubenswrapper[18592]: I0308 03:53:09.111256 18592 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-1-master-0" Mar 08 03:53:09.113516 master-0 kubenswrapper[18592]: I0308 03:53:09.113402 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"cluster-baremetal-operator-images" Mar 08 03:53:09.114406 master-0 kubenswrapper[18592]: I0308 03:53:09.114357 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 08 03:53:09.114406 master-0 kubenswrapper[18592]: I0308 03:53:09.114390 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 08 03:53:09.114619 master-0 kubenswrapper[18592]: I0308 03:53:09.114591 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 08 03:53:09.114619 master-0 kubenswrapper[18592]: I0308 03:53:09.114618 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 08 03:53:09.114806 master-0 kubenswrapper[18592]: I0308 03:53:09.114735 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 08 03:53:09.114885 master-0 kubenswrapper[18592]: I0308 03:53:09.114757 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 08 03:53:09.115058 master-0 kubenswrapper[18592]: I0308 03:53:09.115037 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt" Mar 08 03:53:09.115155 master-0 kubenswrapper[18592]: I0308 03:53:09.115134 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 08 03:53:09.115675 master-0 kubenswrapper[18592]: I0308 03:53:09.115442 18592 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 08 03:53:09.115675 master-0 kubenswrapper[18592]: I0308 03:53:09.115523 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 08 03:53:09.115675 master-0 kubenswrapper[18592]: I0308 03:53:09.115606 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 08 03:53:09.115889 master-0 kubenswrapper[18592]: I0308 03:53:09.115863 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 08 03:53:09.116031 master-0 kubenswrapper[18592]: I0308 03:53:09.116004 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 08 03:53:09.116162 master-0 kubenswrapper[18592]: I0308 03:53:09.116128 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"baremetal-kube-rbac-proxy" Mar 08 03:53:09.116225 master-0 kubenswrapper[18592]: I0308 03:53:09.116169 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 08 03:53:09.116225 master-0 kubenswrapper[18592]: I0308 03:53:09.116162 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 08 03:53:09.117171 master-0 kubenswrapper[18592]: I0308 03:53:09.116724 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 08 03:53:09.117171 master-0 kubenswrapper[18592]: I0308 03:53:09.116789 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 08 
03:53:09.117171 master-0 kubenswrapper[18592]: I0308 03:53:09.116906 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 08 03:53:09.117171 master-0 kubenswrapper[18592]: I0308 03:53:09.116955 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-config" Mar 08 03:53:09.117171 master-0 kubenswrapper[18592]: I0308 03:53:09.116911 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 08 03:53:09.117171 master-0 kubenswrapper[18592]: I0308 03:53:09.117005 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 08 03:53:09.117171 master-0 kubenswrapper[18592]: I0308 03:53:09.116978 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 08 03:53:09.117171 master-0 kubenswrapper[18592]: I0308 03:53:09.117176 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 08 03:53:09.117497 master-0 kubenswrapper[18592]: I0308 03:53:09.117189 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 08 03:53:09.117497 master-0 kubenswrapper[18592]: I0308 03:53:09.117254 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 08 03:53:09.117497 master-0 kubenswrapper[18592]: I0308 03:53:09.117281 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 08 03:53:09.117497 master-0 kubenswrapper[18592]: I0308 03:53:09.117294 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 08 
03:53:09.117497 master-0 kubenswrapper[18592]: I0308 03:53:09.117423 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 08 03:53:09.117745 master-0 kubenswrapper[18592]: I0308 03:53:09.117719 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0" Mar 08 03:53:09.117906 master-0 kubenswrapper[18592]: I0308 03:53:09.117788 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt" Mar 08 03:53:09.118209 master-0 kubenswrapper[18592]: I0308 03:53:09.118082 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0" Mar 08 03:53:09.118360 master-0 kubenswrapper[18592]: I0308 03:53:09.118281 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 08 03:53:09.118360 master-0 kubenswrapper[18592]: I0308 03:53:09.118287 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 08 03:53:09.119334 master-0 kubenswrapper[18592]: I0308 03:53:09.118470 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-webhook-server-cert" Mar 08 03:53:09.119334 master-0 kubenswrapper[18592]: I0308 03:53:09.118540 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 08 03:53:09.119334 master-0 kubenswrapper[18592]: I0308 03:53:09.118628 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 08 03:53:09.119334 master-0 kubenswrapper[18592]: I0308 03:53:09.118760 18592 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 08 03:53:09.119334 master-0 kubenswrapper[18592]: I0308 03:53:09.118809 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 08 03:53:09.119334 master-0 kubenswrapper[18592]: I0308 03:53:09.118886 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 08 03:53:09.119334 master-0 kubenswrapper[18592]: I0308 03:53:09.118836 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 08 03:53:09.119334 master-0 kubenswrapper[18592]: I0308 03:53:09.118936 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert" Mar 08 03:53:09.119334 master-0 kubenswrapper[18592]: I0308 03:53:09.118939 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 08 03:53:09.119334 master-0 kubenswrapper[18592]: I0308 03:53:09.118981 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 08 03:53:09.119334 master-0 kubenswrapper[18592]: I0308 03:53:09.118990 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 08 03:53:09.119334 master-0 kubenswrapper[18592]: I0308 03:53:09.118991 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 08 03:53:09.119334 master-0 kubenswrapper[18592]: I0308 03:53:09.118902 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls" Mar 08 
03:53:09.119334 master-0 kubenswrapper[18592]: I0308 03:53:09.119089 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 08 03:53:09.119334 master-0 kubenswrapper[18592]: I0308 03:53:09.119142 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 08 03:53:09.119880 master-0 kubenswrapper[18592]: I0308 03:53:09.119603 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-5-master-0" Mar 08 03:53:09.120147 master-0 kubenswrapper[18592]: I0308 03:53:09.120109 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 08 03:53:09.120147 master-0 kubenswrapper[18592]: I0308 03:53:09.120129 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 08 03:53:09.120224 master-0 kubenswrapper[18592]: I0308 03:53:09.120168 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 08 03:53:09.120224 master-0 kubenswrapper[18592]: I0308 03:53:09.120182 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"service-ca-bundle" Mar 08 03:53:09.120293 master-0 kubenswrapper[18592]: I0308 03:53:09.120236 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 08 03:53:09.120293 master-0 kubenswrapper[18592]: I0308 03:53:09.120242 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"openshift-service-ca.crt" Mar 08 03:53:09.120293 master-0 kubenswrapper[18592]: I0308 03:53:09.120260 18592 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication-operator"/"serving-cert" Mar 08 03:53:09.120293 master-0 kubenswrapper[18592]: I0308 03:53:09.120289 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 08 03:53:09.120467 master-0 kubenswrapper[18592]: I0308 03:53:09.120358 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt" Mar 08 03:53:09.120467 master-0 kubenswrapper[18592]: I0308 03:53:09.120368 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 08 03:53:09.120467 master-0 kubenswrapper[18592]: I0308 03:53:09.120397 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt" Mar 08 03:53:09.120467 master-0 kubenswrapper[18592]: I0308 03:53:09.120435 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 08 03:53:09.120624 master-0 kubenswrapper[18592]: I0308 03:53:09.120483 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-tls" Mar 08 03:53:09.120624 master-0 kubenswrapper[18592]: I0308 03:53:09.120504 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 08 03:53:09.120624 master-0 kubenswrapper[18592]: I0308 03:53:09.120525 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"kube-root-ca.crt" Mar 08 03:53:09.120624 master-0 kubenswrapper[18592]: I0308 03:53:09.120540 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 08 03:53:09.120624 master-0 kubenswrapper[18592]: I0308 03:53:09.120582 18592 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-3-master-0" Mar 08 03:53:09.120624 master-0 kubenswrapper[18592]: I0308 03:53:09.120600 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 08 03:53:09.120624 master-0 kubenswrapper[18592]: I0308 03:53:09.120614 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 08 03:53:09.121110 master-0 kubenswrapper[18592]: I0308 03:53:09.120128 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 08 03:53:09.121392 master-0 kubenswrapper[18592]: I0308 03:53:09.121366 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 08 03:53:09.121392 master-0 kubenswrapper[18592]: I0308 03:53:09.121368 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 08 03:53:09.121490 master-0 kubenswrapper[18592]: I0308 03:53:09.121295 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 08 03:53:09.121527 master-0 kubenswrapper[18592]: I0308 03:53:09.120842 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-serving-cert" Mar 08 03:53:09.121814 master-0 kubenswrapper[18592]: I0308 03:53:09.121791 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 08 03:53:09.122006 master-0 kubenswrapper[18592]: I0308 03:53:09.121873 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 08 03:53:09.122006 master-0 kubenswrapper[18592]: I0308 03:53:09.122003 18592 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 08 03:53:09.122211 master-0 kubenswrapper[18592]: I0308 03:53:09.122177 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 08 03:53:09.122259 master-0 kubenswrapper[18592]: I0308 03:53:09.122237 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 08 03:53:09.122451 master-0 kubenswrapper[18592]: I0308 03:53:09.122198 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 08 03:53:09.122491 master-0 kubenswrapper[18592]: I0308 03:53:09.122456 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 08 03:53:09.122687 master-0 kubenswrapper[18592]: I0308 03:53:09.122662 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"openshift-insights-serving-cert" Mar 08 03:53:09.122911 master-0 kubenswrapper[18592]: I0308 03:53:09.122892 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 08 03:53:09.123234 master-0 kubenswrapper[18592]: I0308 03:53:09.123206 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls" Mar 08 03:53:09.125719 master-0 kubenswrapper[18592]: I0308 03:53:09.123995 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 08 03:53:09.125719 master-0 kubenswrapper[18592]: I0308 03:53:09.124545 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 08 03:53:09.125719 master-0 kubenswrapper[18592]: I0308 03:53:09.124592 18592 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 08 03:53:09.126588 master-0 kubenswrapper[18592]: I0308 03:53:09.126251 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt" Mar 08 03:53:09.126588 master-0 kubenswrapper[18592]: I0308 03:53:09.126381 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt" Mar 08 03:53:09.126588 master-0 kubenswrapper[18592]: I0308 03:53:09.126506 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 08 03:53:09.140778 master-0 kubenswrapper[18592]: I0308 03:53:09.138867 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config" Mar 08 03:53:09.143435 master-0 kubenswrapper[18592]: I0308 03:53:09.143400 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca" Mar 08 03:53:09.146533 master-0 kubenswrapper[18592]: I0308 03:53:09.146494 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 08 03:53:09.150776 master-0 kubenswrapper[18592]: I0308 03:53:09.148961 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"trusted-ca-bundle" Mar 08 03:53:09.152607 master-0 kubenswrapper[18592]: I0308 03:53:09.152573 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 08 03:53:09.158631 master-0 kubenswrapper[18592]: I0308 03:53:09.155856 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 08 03:53:09.159940 master-0 kubenswrapper[18592]: I0308 03:53:09.159299 18592 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 08 03:53:09.161375 master-0 kubenswrapper[18592]: I0308 03:53:09.161339 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 08 03:53:09.161939 master-0 kubenswrapper[18592]: I0308 03:53:09.161899 18592 scope.go:117] "RemoveContainer" containerID="222e8ca389049069b4efae8be97f8ff91fe671c190224c8b6f05f39079d825cf" Mar 08 03:53:09.162298 master-0 kubenswrapper[18592]: I0308 03:53:09.162267 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 08 03:53:09.166096 master-0 kubenswrapper[18592]: I0308 03:53:09.165803 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 08 03:53:09.186453 master-0 kubenswrapper[18592]: I0308 03:53:09.186398 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 08 03:53:09.186633 master-0 kubenswrapper[18592]: I0308 03:53:09.186546 18592 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Mar 08 03:53:09.208226 master-0 kubenswrapper[18592]: I0308 03:53:09.208040 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 08 03:53:09.225946 master-0 kubenswrapper[18592]: I0308 03:53:09.225897 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 08 03:53:09.232427 master-0 kubenswrapper[18592]: I0308 03:53:09.232378 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bd7d5\" (UniqueName: \"kubernetes.io/projected/a6c4695c-da78-46b6-8f92-ca93c5ebb96b-kube-api-access-bd7d5\") pod 
\"controller-manager-6999cc9685-kprrt\" (UID: \"a6c4695c-da78-46b6-8f92-ca93c5ebb96b\") " pod="openshift-controller-manager/controller-manager-6999cc9685-kprrt"
Mar 08 03:53:09.232489 master-0 kubenswrapper[18592]: I0308 03:53:09.232438 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-os-release\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb"
Mar 08 03:53:09.232579 master-0 kubenswrapper[18592]: I0308 03:53:09.232532 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2262647b-c315-477a-93bd-f168c1810475-serving-cert\") pod \"cluster-version-operator-8c9c967c7-zq9rp\" (UID: \"2262647b-c315-477a-93bd-f168c1810475\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-zq9rp"
Mar 08 03:53:09.232806 master-0 kubenswrapper[18592]: I0308 03:53:09.232673 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e78b283b-981e-48d7-a5f2-53f8401766ea-proxy-tls\") pod \"machine-config-operator-fdb5c78b5-2vjh2\" (UID: \"e78b283b-981e-48d7-a5f2-53f8401766ea\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-2vjh2"
Mar 08 03:53:09.232806 master-0 kubenswrapper[18592]: I0308 03:53:09.232709 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-run-openvswitch\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:53:09.232806 master-0 kubenswrapper[18592]: I0308 03:53:09.232779 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e187516f-8f33-4c17-81d6-60c10b580bb0-var-lib-kubelet\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb"
Mar 08 03:53:09.233019 master-0 kubenswrapper[18592]: I0308 03:53:09.232991 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e78b283b-981e-48d7-a5f2-53f8401766ea-proxy-tls\") pod \"machine-config-operator-fdb5c78b5-2vjh2\" (UID: \"e78b283b-981e-48d7-a5f2-53f8401766ea\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-2vjh2"
Mar 08 03:53:09.233076 master-0 kubenswrapper[18592]: I0308 03:53:09.233056 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-host-run-ovn-kubernetes\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:53:09.233112 master-0 kubenswrapper[18592]: I0308 03:53:09.233083 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-env-overrides\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:53:09.233141 master-0 kubenswrapper[18592]: I0308 03:53:09.233122 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/127c3f92-8283-4179-9e40-a12dcabaaa12-config\") pod \"machine-approver-754bdc9f9d-z4sdd\" (UID: \"127c3f92-8283-4179-9e40-a12dcabaaa12\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-z4sdd"
Mar 08 03:53:09.233170 master-0 kubenswrapper[18592]: I0308 03:53:09.233140 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnzt7\" (UniqueName: \"kubernetes.io/projected/2dd4279d-a1a9-450a-a061-9008cd1ea8e0-kube-api-access-pnzt7\") pod \"olm-operator-d64cfc9db-qddlp\" (UID: \"2dd4279d-a1a9-450a-a061-9008cd1ea8e0\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qddlp"
Mar 08 03:53:09.233170 master-0 kubenswrapper[18592]: I0308 03:53:09.233163 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a-serving-cert\") pod \"kube-storage-version-migrator-operator-7f65c457f5-6fhhs\" (UID: \"5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-6fhhs"
Mar 08 03:53:09.233227 master-0 kubenswrapper[18592]: I0308 03:53:09.233180 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/49ec083d-dc74-457e-b10f-3bde04e9e75e-signing-key\") pod \"service-ca-84bfdbbb7f-gj69x\" (UID: \"49ec083d-dc74-457e-b10f-3bde04e9e75e\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-gj69x"
Mar 08 03:53:09.233227 master-0 kubenswrapper[18592]: I0308 03:53:09.233197 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/76ba45a2-8945-4afe-b913-126c26725867-etcd-serving-ca\") pod \"apiserver-7695b9f8b5-4jpgl\" (UID: \"76ba45a2-8945-4afe-b913-126c26725867\") " pod="openshift-oauth-apiserver/apiserver-7695b9f8b5-4jpgl"
Mar 08 03:53:09.233317 master-0 kubenswrapper[18592]: I0308 03:53:09.233300 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a60bc804-52e7-422a-87fd-ac4c5aa90cb3-trusted-ca-bundle\") pod \"authentication-operator-7c6989d6c4-slm72\" (UID: \"a60bc804-52e7-422a-87fd-ac4c5aa90cb3\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-slm72"
Mar 08 03:53:09.233358 master-0 kubenswrapper[18592]: I0308 03:53:09.233325 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/76ba45a2-8945-4afe-b913-126c26725867-etcd-client\") pod \"apiserver-7695b9f8b5-4jpgl\" (UID: \"76ba45a2-8945-4afe-b913-126c26725867\") " pod="openshift-oauth-apiserver/apiserver-7695b9f8b5-4jpgl"
Mar 08 03:53:09.233358 master-0 kubenswrapper[18592]: I0308 03:53:09.233342 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a60bc804-52e7-422a-87fd-ac4c5aa90cb3-config\") pod \"authentication-operator-7c6989d6c4-slm72\" (UID: \"a60bc804-52e7-422a-87fd-ac4c5aa90cb3\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-slm72"
Mar 08 03:53:09.233413 master-0 kubenswrapper[18592]: I0308 03:53:09.233361 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bee226a-2a66-4032-8aba-2c8b82abcb6a-utilities\") pod \"redhat-operators-rnz4w\" (UID: \"6bee226a-2a66-4032-8aba-2c8b82abcb6a\") " pod="openshift-marketplace/redhat-operators-rnz4w"
Mar 08 03:53:09.233413 master-0 kubenswrapper[18592]: I0308 03:53:09.233378 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/1b69fbf6-1ca5-413e-bffd-965730bcec1b-ca-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-8h6fj\" (UID: \"1b69fbf6-1ca5-413e-bffd-965730bcec1b\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-8h6fj"
Mar 08 03:53:09.233413 master-0 kubenswrapper[18592]: I0308 03:53:09.233395 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d377285-0336-41b7-b48f-c44a7b563498-config\") pod \"service-ca-operator-69b6fc6b88-kg795\" (UID: \"0d377285-0336-41b7-b48f-c44a7b563498\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-kg795"
Mar 08 03:53:09.233413 master-0 kubenswrapper[18592]: I0308 03:53:09.233412 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a60bc804-52e7-422a-87fd-ac4c5aa90cb3-serving-cert\") pod \"authentication-operator-7c6989d6c4-slm72\" (UID: \"a60bc804-52e7-422a-87fd-ac4c5aa90cb3\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-slm72"
Mar 08 03:53:09.233522 master-0 kubenswrapper[18592]: I0308 03:53:09.233430 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/ee586416-6f56-4ea4-ad62-95de1e6df23b-snapshots\") pod \"insights-operator-8f89dfddd-4mr6p\" (UID: \"ee586416-6f56-4ea4-ad62-95de1e6df23b\") " pod="openshift-insights/insights-operator-8f89dfddd-4mr6p"
Mar 08 03:53:09.233522 master-0 kubenswrapper[18592]: I0308 03:53:09.233450 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/164586b1-f133-4427-8ab6-eb0839b79738-webhook-cert\") pod \"network-node-identity-ggzm8\" (UID: \"164586b1-f133-4427-8ab6-eb0839b79738\") " pod="openshift-network-node-identity/network-node-identity-ggzm8"
Mar 08 03:53:09.233522 master-0 kubenswrapper[18592]: I0308 03:53:09.233467 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e187516f-8f33-4c17-81d6-60c10b580bb0-etc-sysctl-d\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb"
Mar 08 03:53:09.233522 master-0 kubenswrapper[18592]: I0308 03:53:09.233483 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d2cd5b23-e622-4b96-aee8-dbc942b73b4a-hosts-file\") pod \"node-resolver-wjl9v\" (UID: \"d2cd5b23-e622-4b96-aee8-dbc942b73b4a\") " pod="openshift-dns/node-resolver-wjl9v"
Mar 08 03:53:09.233522 master-0 kubenswrapper[18592]: I0308 03:53:09.233502 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hccv4\" (UniqueName: \"kubernetes.io/projected/0ebf1330-e044-4ff5-8b48-2d667e0c5625-kube-api-access-hccv4\") pod \"openshift-controller-manager-operator-8565d84698-kt66j\" (UID: \"0ebf1330-e044-4ff5-8b48-2d667e0c5625\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-kt66j"
Mar 08 03:53:09.233709 master-0 kubenswrapper[18592]: I0308 03:53:09.233519 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8efdcef9-9b31-4567-b7f9-cb59a894273d-metrics-tls\") pod \"dns-operator-589895fbb7-xttlz\" (UID: \"8efdcef9-9b31-4567-b7f9-cb59a894273d\") " pod="openshift-dns-operator/dns-operator-589895fbb7-xttlz"
Mar 08 03:53:09.233709 master-0 kubenswrapper[18592]: I0308 03:53:09.233551 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/54ad284e-d40e-4e69-b898-f5093952a0e6-marketplace-trusted-ca\") pod \"marketplace-operator-64bf9778cb-9sw2d\" (UID: \"54ad284e-d40e-4e69-b898-f5093952a0e6\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-9sw2d"
Mar 08 03:53:09.233709 master-0 kubenswrapper[18592]: I0308 03:53:09.233567 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f-bound-sa-token\") pod \"cluster-image-registry-operator-86d6d77c7c-572xh\" (UID: \"69eb8ba2-7bfb-4433-8951-08f89e7bcb5f\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-572xh"
Mar 08 03:53:09.233709 master-0 kubenswrapper[18592]: I0308 03:53:09.233585 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfz6w\" (UniqueName: \"kubernetes.io/projected/7b485db9-29b5-45a1-a4fb-b4264c6bf2d6-kube-api-access-nfz6w\") pod \"dns-default-4pjsn\" (UID: \"7b485db9-29b5-45a1-a4fb-b4264c6bf2d6\") " pod="openshift-dns/dns-default-4pjsn"
Mar 08 03:53:09.233709 master-0 kubenswrapper[18592]: I0308 03:53:09.233601 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/2262647b-c315-477a-93bd-f168c1810475-etc-cvo-updatepayloads\") pod \"cluster-version-operator-8c9c967c7-zq9rp\" (UID: \"2262647b-c315-477a-93bd-f168c1810475\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-zq9rp"
Mar 08 03:53:09.233709 master-0 kubenswrapper[18592]: I0308 03:53:09.233617 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/3ddfd0e7-fe76-41bc-b316-94505df81002-host-etc-kube\") pod \"network-operator-7c649bf6d4-99d2k\" (UID: \"3ddfd0e7-fe76-41bc-b316-94505df81002\") " pod="openshift-network-operator/network-operator-7c649bf6d4-99d2k"
Mar 08 03:53:09.233709 master-0 kubenswrapper[18592]: I0308 03:53:09.233634 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/10c165f1-53b4-4e7e-9cd1-00bb4d9cbc79-apiservice-cert\") pod \"packageserver-67b55db9c7-4qgpb\" (UID: \"10c165f1-53b4-4e7e-9cd1-00bb4d9cbc79\") " pod="openshift-operator-lifecycle-manager/packageserver-67b55db9c7-4qgpb"
Mar 08 03:53:09.233709 master-0 kubenswrapper[18592]: I0308 03:53:09.233651 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/b70adfe9-94f1-44bc-85ce-498e5f0a1ca7-machine-api-operator-tls\") pod \"machine-api-operator-84bf6db4f9-tdrf8\" (UID: \"b70adfe9-94f1-44bc-85ce-498e5f0a1ca7\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-tdrf8"
Mar 08 03:53:09.233709 master-0 kubenswrapper[18592]: I0308 03:53:09.233667 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c9de4939-680a-4e3e-89fd-e20ecb8b10f2-trusted-ca\") pod \"ingress-operator-677db989d6-t77qr\" (UID: \"c9de4939-680a-4e3e-89fd-e20ecb8b10f2\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-t77qr"
Mar 08 03:53:09.233709 master-0 kubenswrapper[18592]: I0308 03:53:09.233684 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-systemd-units\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:53:09.233709 master-0 kubenswrapper[18592]: I0308 03:53:09.233701 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tp98d\" (UniqueName: \"kubernetes.io/projected/6bee226a-2a66-4032-8aba-2c8b82abcb6a-kube-api-access-tp98d\") pod \"redhat-operators-rnz4w\" (UID: \"6bee226a-2a66-4032-8aba-2c8b82abcb6a\") " pod="openshift-marketplace/redhat-operators-rnz4w"
Mar 08 03:53:09.233709 master-0 kubenswrapper[18592]: I0308 03:53:09.233717 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/2f59fe81-deee-4ced-ae9d-f17752c82c4b-ca-certs\") pod \"operator-controller-controller-manager-6598bfb6c4-75qmb\" (UID: \"2f59fe81-deee-4ced-ae9d-f17752c82c4b\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-75qmb"
Mar 08 03:53:09.234614 master-0 kubenswrapper[18592]: I0308 03:53:09.233736 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vndvf\" (UniqueName: \"kubernetes.io/projected/9ec89e27-4360-48f2-a7ca-5d823bda4510-kube-api-access-vndvf\") pod \"csi-snapshot-controller-7577d6f48-h4qlp\" (UID: \"9ec89e27-4360-48f2-a7ca-5d823bda4510\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-h4qlp"
Mar 08 03:53:09.234614 master-0 kubenswrapper[18592]: I0308 03:53:09.233760 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mghmh\" (UniqueName: \"kubernetes.io/projected/0918ba32-8e55-48d0-8e50-027c0dcb4bbd-kube-api-access-mghmh\") pod \"openshift-config-operator-64488f9d78-vfgfp\" (UID: \"0918ba32-8e55-48d0-8e50-027c0dcb4bbd\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-vfgfp"
Mar 08 03:53:09.234614 master-0 kubenswrapper[18592]: I0308 03:53:09.233780 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e78b283b-981e-48d7-a5f2-53f8401766ea-auth-proxy-config\") pod \"machine-config-operator-fdb5c78b5-2vjh2\" (UID: \"e78b283b-981e-48d7-a5f2-53f8401766ea\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-2vjh2"
Mar 08 03:53:09.234614 master-0 kubenswrapper[18592]: I0308 03:53:09.233799 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ebf1330-e044-4ff5-8b48-2d667e0c5625-config\") pod \"openshift-controller-manager-operator-8565d84698-kt66j\" (UID: \"0ebf1330-e044-4ff5-8b48-2d667e0c5625\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-kt66j"
Mar 08 03:53:09.234614 master-0 kubenswrapper[18592]: I0308 03:53:09.233945 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-env-overrides\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:53:09.234614 master-0 kubenswrapper[18592]: I0308 03:53:09.233978 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4541b7b-3f7f-4851-9bd9-26fcda5cab13-config\") pod \"openshift-kube-scheduler-operator-5c74bfc494-g6n58\" (UID: \"e4541b7b-3f7f-4851-9bd9-26fcda5cab13\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-g6n58"
Mar 08 03:53:09.234614 master-0 kubenswrapper[18592]: I0308 03:53:09.234401 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d377285-0336-41b7-b48f-c44a7b563498-config\") pod \"service-ca-operator-69b6fc6b88-kg795\" (UID: \"0d377285-0336-41b7-b48f-c44a7b563498\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-kg795"
Mar 08 03:53:09.234796 master-0 kubenswrapper[18592]: I0308 03:53:09.234651 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c9de4939-680a-4e3e-89fd-e20ecb8b10f2-trusted-ca\") pod \"ingress-operator-677db989d6-t77qr\" (UID: \"c9de4939-680a-4e3e-89fd-e20ecb8b10f2\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-t77qr"
Mar 08 03:53:09.234796 master-0 kubenswrapper[18592]: I0308 03:53:09.234733 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bee226a-2a66-4032-8aba-2c8b82abcb6a-utilities\") pod \"redhat-operators-rnz4w\" (UID: \"6bee226a-2a66-4032-8aba-2c8b82abcb6a\") " pod="openshift-marketplace/redhat-operators-rnz4w"
Mar 08 03:53:09.234868 master-0 kubenswrapper[18592]: I0308 03:53:09.234858 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/ee586416-6f56-4ea4-ad62-95de1e6df23b-snapshots\") pod \"insights-operator-8f89dfddd-4mr6p\" (UID: \"ee586416-6f56-4ea4-ad62-95de1e6df23b\") " pod="openshift-insights/insights-operator-8f89dfddd-4mr6p"
Mar 08 03:53:09.235020 master-0 kubenswrapper[18592]: I0308 03:53:09.234985 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ebf1330-e044-4ff5-8b48-2d667e0c5625-config\") pod \"openshift-controller-manager-operator-8565d84698-kt66j\" (UID: \"0ebf1330-e044-4ff5-8b48-2d667e0c5625\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-kt66j"
Mar 08 03:53:09.235111 master-0 kubenswrapper[18592]: I0308 03:53:09.235075 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8efdcef9-9b31-4567-b7f9-cb59a894273d-metrics-tls\") pod \"dns-operator-589895fbb7-xttlz\" (UID: \"8efdcef9-9b31-4567-b7f9-cb59a894273d\") " pod="openshift-dns-operator/dns-operator-589895fbb7-xttlz"
Mar 08 03:53:09.235220 master-0 kubenswrapper[18592]: I0308 03:53:09.235198 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a-serving-cert\") pod \"kube-storage-version-migrator-operator-7f65c457f5-6fhhs\" (UID: \"5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-6fhhs"
Mar 08 03:53:09.235251 master-0 kubenswrapper[18592]: I0308 03:53:09.235210 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4541b7b-3f7f-4851-9bd9-26fcda5cab13-config\") pod \"openshift-kube-scheduler-operator-5c74bfc494-g6n58\" (UID: \"e4541b7b-3f7f-4851-9bd9-26fcda5cab13\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-g6n58"
Mar 08 03:53:09.235279 master-0 kubenswrapper[18592]: I0308 03:53:09.235261 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f-image-import-ca\") pod \"apiserver-6b779d99b8-7kmck\" (UID: \"c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f\") " pod="openshift-apiserver/apiserver-6b779d99b8-7kmck"
Mar 08 03:53:09.235379 master-0 kubenswrapper[18592]: I0308 03:53:09.235357 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4v45k\" (UniqueName: \"kubernetes.io/projected/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f-kube-api-access-4v45k\") pod \"apiserver-6b779d99b8-7kmck\" (UID: \"c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f\") " pod="openshift-apiserver/apiserver-6b779d99b8-7kmck"
Mar 08 03:53:09.235624 master-0 kubenswrapper[18592]: I0308 03:53:09.235590 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1b700d17-83d2-46c8-afbc-e5774822eabe-auth-proxy-config\") pod \"cluster-autoscaler-operator-69576476f7-bv67b\" (UID: \"1b700d17-83d2-46c8-afbc-e5774822eabe\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-bv67b"
Mar 08 03:53:09.235674 master-0 kubenswrapper[18592]: I0308 03:53:09.235649 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpsx7\" (UniqueName: \"kubernetes.io/projected/8efdcef9-9b31-4567-b7f9-cb59a894273d-kube-api-access-cpsx7\") pod \"dns-operator-589895fbb7-xttlz\" (UID: \"8efdcef9-9b31-4567-b7f9-cb59a894273d\") " pod="openshift-dns-operator/dns-operator-589895fbb7-xttlz"
Mar 08 03:53:09.235732 master-0 kubenswrapper[18592]: I0308 03:53:09.235696 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e187516f-8f33-4c17-81d6-60c10b580bb0-etc-tuned\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb"
Mar 08 03:53:09.235764 master-0 kubenswrapper[18592]: I0308 03:53:09.235744 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4541b7b-3f7f-4851-9bd9-26fcda5cab13-serving-cert\") pod \"openshift-kube-scheduler-operator-5c74bfc494-g6n58\" (UID: \"e4541b7b-3f7f-4851-9bd9-26fcda5cab13\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-g6n58"
Mar 08 03:53:09.235836 master-0 kubenswrapper[18592]: I0308 03:53:09.235782 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9z8g\" (UniqueName: \"kubernetes.io/projected/139881ee-6cfa-4a7e-b002-63cece048d16-kube-api-access-h9z8g\") pod \"control-plane-machine-set-operator-6686554ddc-7bcsk\" (UID: \"139881ee-6cfa-4a7e-b002-63cece048d16\") " pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-7bcsk"
Mar 08 03:53:09.235836 master-0 kubenswrapper[18592]: I0308 03:53:09.235812 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:53:09.235900 master-0 kubenswrapper[18592]: I0308 03:53:09.235866 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/54ad284e-d40e-4e69-b898-f5093952a0e6-marketplace-trusted-ca\") pod \"marketplace-operator-64bf9778cb-9sw2d\" (UID: \"54ad284e-d40e-4e69-b898-f5093952a0e6\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-9sw2d"
Mar 08 03:53:09.235931 master-0 kubenswrapper[18592]: I0308 03:53:09.235924 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nk8r\" (UniqueName: \"kubernetes.io/projected/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-kube-api-access-7nk8r\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb"
Mar 08 03:53:09.235959 master-0 kubenswrapper[18592]: I0308 03:53:09.235947 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a7752f9-7b9a-451f-997a-e9f696d38b34-serving-cert\") pod \"etcd-operator-5884b9cd56-vzms7\" (UID: \"5a7752f9-7b9a-451f-997a-e9f696d38b34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-vzms7"
Mar 08 03:53:09.235993 master-0 kubenswrapper[18592]: I0308 03:53:09.235965 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rchj5\" (UniqueName: \"kubernetes.io/projected/e78b283b-981e-48d7-a5f2-53f8401766ea-kube-api-access-rchj5\") pod \"machine-config-operator-fdb5c78b5-2vjh2\" (UID: \"e78b283b-981e-48d7-a5f2-53f8401766ea\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-2vjh2"
Mar 08 03:53:09.235993 master-0 kubenswrapper[18592]: I0308 03:53:09.235985 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f7d2cef-b17b-43ba-a222-9e6e8d8352e2-serving-cert\") pod \"route-controller-manager-5cb98fbc8c-xnx9t\" (UID: \"3f7d2cef-b17b-43ba-a222-9e6e8d8352e2\") " pod="openshift-route-controller-manager/route-controller-manager-5cb98fbc8c-xnx9t"
Mar 08 03:53:09.236047 master-0 kubenswrapper[18592]: I0308 03:53:09.236008 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/2f59fe81-deee-4ced-ae9d-f17752c82c4b-etc-containers\") pod \"operator-controller-controller-manager-6598bfb6c4-75qmb\" (UID: \"2f59fe81-deee-4ced-ae9d-f17752c82c4b\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-75qmb"
Mar 08 03:53:09.236047 master-0 kubenswrapper[18592]: I0308 03:53:09.236028 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f-etcd-client\") pod \"apiserver-6b779d99b8-7kmck\" (UID: \"c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f\") " pod="openshift-apiserver/apiserver-6b779d99b8-7kmck"
Mar 08 03:53:09.236102 master-0 kubenswrapper[18592]: I0308 03:53:09.236045 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-host-run-netns\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb"
Mar 08 03:53:09.236171 master-0 kubenswrapper[18592]: I0308 03:53:09.236148 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e187516f-8f33-4c17-81d6-60c10b580bb0-etc-sysconfig\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb"
Mar 08 03:53:09.236207 master-0 kubenswrapper[18592]: I0308 03:53:09.236182 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e187516f-8f33-4c17-81d6-60c10b580bb0-sys\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb"
Mar 08 03:53:09.236207 master-0 kubenswrapper[18592]: I0308 03:53:09.236204 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/9c95709c-c3cb-46fb-afe7-626c8013f3c6-var-lock\") pod \"installer-1-retry-1-master-0\" (UID: \"9c95709c-c3cb-46fb-afe7-626c8013f3c6\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0"
Mar 08 03:53:09.236277 master-0 kubenswrapper[18592]: I0308 03:53:09.236222 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-572xh\" (UID: \"69eb8ba2-7bfb-4433-8951-08f89e7bcb5f\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-572xh"
Mar 08 03:53:09.236277 master-0 kubenswrapper[18592]: I0308 03:53:09.236244 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkzb2\" (UniqueName: \"kubernetes.io/projected/4c5a0c1d-867a-4ce4-9570-ea66452c8db3-kube-api-access-mkzb2\") pod \"iptables-alerter-7c28p\" (UID: \"4c5a0c1d-867a-4ce4-9570-ea66452c8db3\") " pod="openshift-network-operator/iptables-alerter-7c28p"
Mar 08 03:53:09.236334 master-0 kubenswrapper[18592]: I0308 03:53:09.236281 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dw7bx\" (UniqueName: \"kubernetes.io/projected/4a19441e-e61b-4d58-85db-813ae88e1f9b-kube-api-access-dw7bx\") pod \"multus-additional-cni-plugins-g564l\" (UID: \"4a19441e-e61b-4d58-85db-813ae88e1f9b\") " pod="openshift-multus/multus-additional-cni-plugins-g564l"
Mar 08 03:53:09.236334 master-0 kubenswrapper[18592]: I0308 03:53:09.236300 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/26180f77-0b1a-4d0f-9ed0-a12fdee69817-serving-cert\") pod \"kube-controller-manager-operator-86d7cdfdfb-chpl6\" (UID: \"26180f77-0b1a-4d0f-9ed0-a12fdee69817\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-chpl6"
Mar 08 03:53:09.236334 master-0 kubenswrapper[18592]: I0308 03:53:09.236317 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cbcb403-a424-4496-8c5c-5eb5e42dfb93-config\") pod \"kube-apiserver-operator-68bd585b-8gfmf\" (UID: \"1cbcb403-a424-4496-8c5c-5eb5e42dfb93\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-8gfmf"
Mar 08 03:53:09.236457 master-0 kubenswrapper[18592]: I0308 03:53:09.236440 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e187516f-8f33-4c17-81d6-60c10b580bb0-etc-systemd\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb"
Mar 08 03:53:09.236487 master-0 kubenswrapper[18592]: I0308 03:53:09.236466 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtts2\" (UniqueName: \"kubernetes.io/projected/76ba45a2-8945-4afe-b913-126c26725867-kube-api-access-dtts2\") pod \"apiserver-7695b9f8b5-4jpgl\" (UID: \"76ba45a2-8945-4afe-b913-126c26725867\") " pod="openshift-oauth-apiserver/apiserver-7695b9f8b5-4jpgl"
Mar 08 03:53:09.236487 master-0 kubenswrapper[18592]: I0308 03:53:09.236484 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-host-cni-netd\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:53:09.236547 master-0 kubenswrapper[18592]: I0308 03:53:09.236502 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a6c4695c-da78-46b6-8f92-ca93c5ebb96b-serving-cert\") pod \"controller-manager-6999cc9685-kprrt\" (UID: \"a6c4695c-da78-46b6-8f92-ca93c5ebb96b\") " pod="openshift-controller-manager/controller-manager-6999cc9685-kprrt"
Mar 08 03:53:09.236547 master-0 kubenswrapper[18592]: I0308 03:53:09.236521 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-multus-cni-dir\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb"
Mar 08 03:53:09.236547 master-0 kubenswrapper[18592]: I0308 03:53:09.236540 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a6c4695c-da78-46b6-8f92-ca93c5ebb96b-proxy-ca-bundles\") pod \"controller-manager-6999cc9685-kprrt\" (UID: \"a6c4695c-da78-46b6-8f92-ca93c5ebb96b\") " pod="openshift-controller-manager/controller-manager-6999cc9685-kprrt"
Mar 08 03:53:09.236620 master-0 kubenswrapper[18592]: I0308 03:53:09.236556 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-host-kubelet\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:53:09.236620 master-0 kubenswrapper[18592]: I0308 03:53:09.236578 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b3d1dc7-22f9-4c0c-802a-d7314894b255-catalog-content\") pod \"community-operators-lwt58\" (UID: \"2b3d1dc7-22f9-4c0c-802a-d7314894b255\") " pod="openshift-marketplace/community-operators-lwt58"
Mar 08 03:53:09.236620 master-0 kubenswrapper[18592]: I0308 03:53:09.236598 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmpdd\" (UniqueName: \"kubernetes.io/projected/0418ff42-7eac-4266-97b5-4df88623d066-kube-api-access-kmpdd\") pod \"cluster-monitoring-operator-674cbfbd9d-clqwj\" (UID: \"0418ff42-7eac-4266-97b5-4df88623d066\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-clqwj"
Mar 08 03:53:09.236620 master-0 kubenswrapper[18592]: I0308 03:53:09.236614 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/76ba45a2-8945-4afe-b913-126c26725867-audit-policies\") pod \"apiserver-7695b9f8b5-4jpgl\" (UID: \"76ba45a2-8945-4afe-b913-126c26725867\") " pod="openshift-oauth-apiserver/apiserver-7695b9f8b5-4jpgl"
Mar 08 03:53:09.236719 master-0 kubenswrapper[18592]: I0308 03:53:09.236632 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/4a19441e-e61b-4d58-85db-813ae88e1f9b-whereabouts-configmap\") pod \"multus-additional-cni-plugins-g564l\" (UID: \"4a19441e-e61b-4d58-85db-813ae88e1f9b\") " pod="openshift-multus/multus-additional-cni-plugins-g564l"
Mar 08 03:53:09.236719 master-0 kubenswrapper[18592]: I0308 03:53:09.236650 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e283f49-b85d-4789-a71f-3fcb5033cdf0-catalog-content\") pod \"redhat-marketplace-4h8qm\" (UID: \"8e283f49-b85d-4789-a71f-3fcb5033cdf0\") " pod="openshift-marketplace/redhat-marketplace-4h8qm"
Mar 08 03:53:09.236719 master-0 kubenswrapper[18592]: I0308 03:53:09.236666 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a6c4695c-da78-46b6-8f92-ca93c5ebb96b-client-ca\") pod \"controller-manager-6999cc9685-kprrt\" (UID: \"a6c4695c-da78-46b6-8f92-ca93c5ebb96b\") " pod="openshift-controller-manager/controller-manager-6999cc9685-kprrt"
Mar 08 03:53:09.236719 master-0 kubenswrapper[18592]: I0308 03:53:09.236684 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/5a7752f9-7b9a-451f-997a-e9f696d38b34-etcd-ca\") pod \"etcd-operator-5884b9cd56-vzms7\" (UID: \"5a7752f9-7b9a-451f-997a-e9f696d38b34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-vzms7"
Mar 08 03:53:09.236719 master-0 kubenswrapper[18592]: I0308 03:53:09.236703 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4a19441e-e61b-4d58-85db-813ae88e1f9b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-g564l\" (UID: \"4a19441e-e61b-4d58-85db-813ae88e1f9b\") " pod="openshift-multus/multus-additional-cni-plugins-g564l"
Mar 08 03:53:09.236719 master-0 kubenswrapper[18592]: I0308 03:53:09.236720 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4a19441e-e61b-4d58-85db-813ae88e1f9b-system-cni-dir\") pod \"multus-additional-cni-plugins-g564l\" (UID: \"4a19441e-e61b-4d58-85db-813ae88e1f9b\") " pod="openshift-multus/multus-additional-cni-plugins-g564l"
Mar 08 03:53:09.236896 master-0 kubenswrapper[18592]: I0308 03:53:09.236737 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName:
\"kubernetes.io/empty-dir/6bee226a-2a66-4032-8aba-2c8b82abcb6a-catalog-content\") pod \"redhat-operators-rnz4w\" (UID: \"6bee226a-2a66-4032-8aba-2c8b82abcb6a\") " pod="openshift-marketplace/redhat-operators-rnz4w" Mar 08 03:53:09.236896 master-0 kubenswrapper[18592]: I0308 03:53:09.236754 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lfcj\" (UniqueName: \"kubernetes.io/projected/54ad284e-d40e-4e69-b898-f5093952a0e6-kube-api-access-9lfcj\") pod \"marketplace-operator-64bf9778cb-9sw2d\" (UID: \"54ad284e-d40e-4e69-b898-f5093952a0e6\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-9sw2d" Mar 08 03:53:09.236896 master-0 kubenswrapper[18592]: I0308 03:53:09.236772 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6sx5s\" (UniqueName: \"kubernetes.io/projected/b3eea925-73b3-4693-8f0e-6dd26107f60a-kube-api-access-6sx5s\") pod \"cluster-storage-operator-6fbfc8dc8f-nm8fj\" (UID: \"b3eea925-73b3-4693-8f0e-6dd26107f60a\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-nm8fj" Mar 08 03:53:09.236896 master-0 kubenswrapper[18592]: I0308 03:53:09.236790 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3ddfd0e7-fe76-41bc-b316-94505df81002-metrics-tls\") pod \"network-operator-7c649bf6d4-99d2k\" (UID: \"3ddfd0e7-fe76-41bc-b316-94505df81002\") " pod="openshift-network-operator/network-operator-7c649bf6d4-99d2k" Mar 08 03:53:09.236896 master-0 kubenswrapper[18592]: I0308 03:53:09.236806 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-host-slash\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:53:09.236896 master-0 
kubenswrapper[18592]: I0308 03:53:09.236837 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-jghp5\" (UID: \"d831cb23-7411-4072-8273-c167d9afca28\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-jghp5" Mar 08 03:53:09.236896 master-0 kubenswrapper[18592]: I0308 03:53:09.236854 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e4541b7b-3f7f-4851-9bd9-26fcda5cab13-kube-api-access\") pod \"openshift-kube-scheduler-operator-5c74bfc494-g6n58\" (UID: \"e4541b7b-3f7f-4851-9bd9-26fcda5cab13\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-g6n58" Mar 08 03:53:09.236896 master-0 kubenswrapper[18592]: I0308 03:53:09.236873 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/30211469-7108-4820-a988-26fc4ced734e-serving-cert\") pod \"openshift-apiserver-operator-799b6db4d7-75682\" (UID: \"30211469-7108-4820-a988-26fc4ced734e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-75682" Mar 08 03:53:09.236896 master-0 kubenswrapper[18592]: I0308 03:53:09.236891 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4a19441e-e61b-4d58-85db-813ae88e1f9b-os-release\") pod \"multus-additional-cni-plugins-g564l\" (UID: \"4a19441e-e61b-4d58-85db-813ae88e1f9b\") " pod="openshift-multus/multus-additional-cni-plugins-g564l" Mar 08 03:53:09.237107 master-0 kubenswrapper[18592]: I0308 03:53:09.236909 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/1cbcb403-a424-4496-8c5c-5eb5e42dfb93-kube-api-access\") pod \"kube-apiserver-operator-68bd585b-8gfmf\" (UID: \"1cbcb403-a424-4496-8c5c-5eb5e42dfb93\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-8gfmf" Mar 08 03:53:09.237107 master-0 kubenswrapper[18592]: I0308 03:53:09.236927 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vg9kg\" (UniqueName: \"kubernetes.io/projected/e187516f-8f33-4c17-81d6-60c10b580bb0-kube-api-access-vg9kg\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb" Mar 08 03:53:09.237165 master-0 kubenswrapper[18592]: I0308 03:53:09.237123 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a60bc804-52e7-422a-87fd-ac4c5aa90cb3-service-ca-bundle\") pod \"authentication-operator-7c6989d6c4-slm72\" (UID: \"a60bc804-52e7-422a-87fd-ac4c5aa90cb3\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-slm72" Mar 08 03:53:09.237165 master-0 kubenswrapper[18592]: I0308 03:53:09.237147 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/33ed331b-89e9-45f8-ab3c-4533a77cc7b6-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-f89bv\" (UID: \"33ed331b-89e9-45f8-ab3c-4533a77cc7b6\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-f89bv" Mar 08 03:53:09.237221 master-0 kubenswrapper[18592]: I0308 03:53:09.237168 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/232c421d-96f0-4894-b8d8-74f43d02bbd3-trusted-ca\") pod \"cluster-node-tuning-operator-66c7586884-qjv52\" (UID: 
\"232c421d-96f0-4894-b8d8-74f43d02bbd3\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-qjv52" Mar 08 03:53:09.237221 master-0 kubenswrapper[18592]: I0308 03:53:09.237186 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fncng\" (UniqueName: \"kubernetes.io/projected/30211469-7108-4820-a988-26fc4ced734e-kube-api-access-fncng\") pod \"openshift-apiserver-operator-799b6db4d7-75682\" (UID: \"30211469-7108-4820-a988-26fc4ced734e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-75682" Mar 08 03:53:09.237221 master-0 kubenswrapper[18592]: I0308 03:53:09.237210 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-system-cni-dir\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:53:09.237301 master-0 kubenswrapper[18592]: I0308 03:53:09.237234 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwkwt\" (UniqueName: \"kubernetes.io/projected/d831cb23-7411-4072-8273-c167d9afca28-kube-api-access-dwkwt\") pod \"cluster-baremetal-operator-5cdb4c5598-jghp5\" (UID: \"d831cb23-7411-4072-8273-c167d9afca28\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-jghp5" Mar 08 03:53:09.237301 master-0 kubenswrapper[18592]: I0308 03:53:09.237256 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-multus-conf-dir\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:53:09.237301 master-0 kubenswrapper[18592]: I0308 03:53:09.237282 18592 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-fx4fw\" (UniqueName: \"kubernetes.io/projected/232c421d-96f0-4894-b8d8-74f43d02bbd3-kube-api-access-fx4fw\") pod \"cluster-node-tuning-operator-66c7586884-qjv52\" (UID: \"232c421d-96f0-4894-b8d8-74f43d02bbd3\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-qjv52" Mar 08 03:53:09.237376 master-0 kubenswrapper[18592]: I0308 03:53:09.237304 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4a19441e-e61b-4d58-85db-813ae88e1f9b-cni-binary-copy\") pod \"multus-additional-cni-plugins-g564l\" (UID: \"4a19441e-e61b-4d58-85db-813ae88e1f9b\") " pod="openshift-multus/multus-additional-cni-plugins-g564l" Mar 08 03:53:09.237376 master-0 kubenswrapper[18592]: I0308 03:53:09.237327 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7b485db9-29b5-45a1-a4fb-b4264c6bf2d6-config-volume\") pod \"dns-default-4pjsn\" (UID: \"7b485db9-29b5-45a1-a4fb-b4264c6bf2d6\") " pod="openshift-dns/dns-default-4pjsn" Mar 08 03:53:09.237376 master-0 kubenswrapper[18592]: I0308 03:53:09.237352 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pb87l\" (UniqueName: \"kubernetes.io/projected/3f7d2cef-b17b-43ba-a222-9e6e8d8352e2-kube-api-access-pb87l\") pod \"route-controller-manager-5cb98fbc8c-xnx9t\" (UID: \"3f7d2cef-b17b-43ba-a222-9e6e8d8352e2\") " pod="openshift-route-controller-manager/route-controller-manager-5cb98fbc8c-xnx9t" Mar 08 03:53:09.237448 master-0 kubenswrapper[18592]: I0308 03:53:09.237376 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f-node-pullsecrets\") pod \"apiserver-6b779d99b8-7kmck\" (UID: \"c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f\") " 
pod="openshift-apiserver/apiserver-6b779d99b8-7kmck" Mar 08 03:53:09.237448 master-0 kubenswrapper[18592]: I0308 03:53:09.237402 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c9de4939-680a-4e3e-89fd-e20ecb8b10f2-bound-sa-token\") pod \"ingress-operator-677db989d6-t77qr\" (UID: \"c9de4939-680a-4e3e-89fd-e20ecb8b10f2\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-t77qr" Mar 08 03:53:09.237448 master-0 kubenswrapper[18592]: I0308 03:53:09.237427 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-ovnkube-script-lib\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:53:09.237522 master-0 kubenswrapper[18592]: I0308 03:53:09.237451 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2h62\" (UniqueName: \"kubernetes.io/projected/1482d789-884b-4337-b598-f0e2b71eb9f2-kube-api-access-m2h62\") pod \"catalog-operator-7d9c49f57b-qlfgq\" (UID: \"1482d789-884b-4337-b598-f0e2b71eb9f2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-qlfgq" Mar 08 03:53:09.237522 master-0 kubenswrapper[18592]: I0308 03:53:09.237476 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/c0861ccd-5e86-4277-9082-95f3133508a0-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-55d85b7b47-5v6gs\" (UID: \"c0861ccd-5e86-4277-9082-95f3133508a0\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-5v6gs" Mar 08 03:53:09.237522 master-0 kubenswrapper[18592]: I0308 03:53:09.237497 18592 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e187516f-8f33-4c17-81d6-60c10b580bb0-host\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb" Mar 08 03:53:09.237667 master-0 kubenswrapper[18592]: I0308 03:53:09.237646 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/26180f77-0b1a-4d0f-9ed0-a12fdee69817-kube-api-access\") pod \"kube-controller-manager-operator-86d7cdfdfb-chpl6\" (UID: \"26180f77-0b1a-4d0f-9ed0-a12fdee69817\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-chpl6" Mar 08 03:53:09.237702 master-0 kubenswrapper[18592]: I0308 03:53:09.237675 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f-audit-dir\") pod \"apiserver-6b779d99b8-7kmck\" (UID: \"c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f\") " pod="openshift-apiserver/apiserver-6b779d99b8-7kmck" Mar 08 03:53:09.237758 master-0 kubenswrapper[18592]: I0308 03:53:09.237694 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4a19441e-e61b-4d58-85db-813ae88e1f9b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-g564l\" (UID: \"4a19441e-e61b-4d58-85db-813ae88e1f9b\") " pod="openshift-multus/multus-additional-cni-plugins-g564l" Mar 08 03:53:09.237790 master-0 kubenswrapper[18592]: I0308 03:53:09.237769 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2262647b-c315-477a-93bd-f168c1810475-service-ca\") pod \"cluster-version-operator-8c9c967c7-zq9rp\" (UID: \"2262647b-c315-477a-93bd-f168c1810475\") " 
pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-zq9rp" Mar 08 03:53:09.238004 master-0 kubenswrapper[18592]: I0308 03:53:09.237787 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nlq2\" (UniqueName: \"kubernetes.io/projected/b70adfe9-94f1-44bc-85ce-498e5f0a1ca7-kube-api-access-6nlq2\") pod \"machine-api-operator-84bf6db4f9-tdrf8\" (UID: \"b70adfe9-94f1-44bc-85ce-498e5f0a1ca7\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-tdrf8" Mar 08 03:53:09.238047 master-0 kubenswrapper[18592]: I0308 03:53:09.238009 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-etc-kubernetes\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:53:09.238047 master-0 kubenswrapper[18592]: I0308 03:53:09.238033 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/127c3f92-8283-4179-9e40-a12dcabaaa12-machine-approver-tls\") pod \"machine-approver-754bdc9f9d-z4sdd\" (UID: \"127c3f92-8283-4179-9e40-a12dcabaaa12\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-z4sdd" Mar 08 03:53:09.238098 master-0 kubenswrapper[18592]: I0308 03:53:09.238052 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-cni-binary-copy\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:53:09.238131 master-0 kubenswrapper[18592]: I0308 03:53:09.238106 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/2b3d1dc7-22f9-4c0c-802a-d7314894b255-catalog-content\") pod \"community-operators-lwt58\" (UID: \"2b3d1dc7-22f9-4c0c-802a-d7314894b255\") " pod="openshift-marketplace/community-operators-lwt58" Mar 08 03:53:09.238131 master-0 kubenswrapper[18592]: I0308 03:53:09.238108 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f-serving-cert\") pod \"apiserver-6b779d99b8-7kmck\" (UID: \"c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f\") " pod="openshift-apiserver/apiserver-6b779d99b8-7kmck" Mar 08 03:53:09.238189 master-0 kubenswrapper[18592]: I0308 03:53:09.238153 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/49ec083d-dc74-457e-b10f-3bde04e9e75e-signing-cabundle\") pod \"service-ca-84bfdbbb7f-gj69x\" (UID: \"49ec083d-dc74-457e-b10f-3bde04e9e75e\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-gj69x" Mar 08 03:53:09.238218 master-0 kubenswrapper[18592]: I0308 03:53:09.238196 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a60bc804-52e7-422a-87fd-ac4c5aa90cb3-trusted-ca-bundle\") pod \"authentication-operator-7c6989d6c4-slm72\" (UID: \"a60bc804-52e7-422a-87fd-ac4c5aa90cb3\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-slm72" Mar 08 03:53:09.238218 master-0 kubenswrapper[18592]: I0308 03:53:09.238203 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-jghp5\" (UID: \"d831cb23-7411-4072-8273-c167d9afca28\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-jghp5" Mar 08 03:53:09.238276 master-0 
kubenswrapper[18592]: I0308 03:53:09.238249 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/4a19441e-e61b-4d58-85db-813ae88e1f9b-whereabouts-configmap\") pod \"multus-additional-cni-plugins-g564l\" (UID: \"4a19441e-e61b-4d58-85db-813ae88e1f9b\") " pod="openshift-multus/multus-additional-cni-plugins-g564l" Mar 08 03:53:09.238426 master-0 kubenswrapper[18592]: I0308 03:53:09.238401 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/26180f77-0b1a-4d0f-9ed0-a12fdee69817-serving-cert\") pod \"kube-controller-manager-operator-86d7cdfdfb-chpl6\" (UID: \"26180f77-0b1a-4d0f-9ed0-a12fdee69817\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-chpl6" Mar 08 03:53:09.238475 master-0 kubenswrapper[18592]: I0308 03:53:09.238456 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1482d789-884b-4337-b598-f0e2b71eb9f2-srv-cert\") pod \"catalog-operator-7d9c49f57b-qlfgq\" (UID: \"1482d789-884b-4337-b598-f0e2b71eb9f2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-qlfgq" Mar 08 03:53:09.238511 master-0 kubenswrapper[18592]: I0308 03:53:09.238493 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4stz\" (UniqueName: \"kubernetes.io/projected/164586b1-f133-4427-8ab6-eb0839b79738-kube-api-access-r4stz\") pod \"network-node-identity-ggzm8\" (UID: \"164586b1-f133-4427-8ab6-eb0839b79738\") " pod="openshift-network-node-identity/network-node-identity-ggzm8" Mar 08 03:53:09.238539 master-0 kubenswrapper[18592]: I0308 03:53:09.238524 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fzmf\" (UniqueName: 
\"kubernetes.io/projected/10c165f1-53b4-4e7e-9cd1-00bb4d9cbc79-kube-api-access-7fzmf\") pod \"packageserver-67b55db9c7-4qgpb\" (UID: \"10c165f1-53b4-4e7e-9cd1-00bb4d9cbc79\") " pod="openshift-operator-lifecycle-manager/packageserver-67b55db9c7-4qgpb" Mar 08 03:53:09.238567 master-0 kubenswrapper[18592]: I0308 03:53:09.238537 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cbcb403-a424-4496-8c5c-5eb5e42dfb93-config\") pod \"kube-apiserver-operator-68bd585b-8gfmf\" (UID: \"1cbcb403-a424-4496-8c5c-5eb5e42dfb93\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-8gfmf" Mar 08 03:53:09.238567 master-0 kubenswrapper[18592]: I0308 03:53:09.238551 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfz27\" (UniqueName: \"kubernetes.io/projected/1b69fbf6-1ca5-413e-bffd-965730bcec1b-kube-api-access-nfz27\") pod \"catalogd-controller-manager-7f8b8b6f4c-8h6fj\" (UID: \"1b69fbf6-1ca5-413e-bffd-965730bcec1b\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-8h6fj" Mar 08 03:53:09.238621 master-0 kubenswrapper[18592]: I0308 03:53:09.238579 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/6cde5024-edf7-4fa4-8964-cabe7899578b-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-c46zz\" (UID: \"6cde5024-edf7-4fa4-8964-cabe7899578b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-c46zz" Mar 08 03:53:09.238621 master-0 kubenswrapper[18592]: I0308 03:53:09.238481 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-cni-binary-copy\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 
08 03:53:09.238745 master-0 kubenswrapper[18592]: I0308 03:53:09.238726 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-jghp5\" (UID: \"d831cb23-7411-4072-8273-c167d9afca28\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-jghp5" Mar 08 03:53:09.238791 master-0 kubenswrapper[18592]: I0308 03:53:09.238749 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a60bc804-52e7-422a-87fd-ac4c5aa90cb3-config\") pod \"authentication-operator-7c6989d6c4-slm72\" (UID: \"a60bc804-52e7-422a-87fd-ac4c5aa90cb3\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-slm72" Mar 08 03:53:09.238791 master-0 kubenswrapper[18592]: I0308 03:53:09.238778 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-572xh\" (UID: \"69eb8ba2-7bfb-4433-8951-08f89e7bcb5f\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-572xh" Mar 08 03:53:09.238912 master-0 kubenswrapper[18592]: I0308 03:53:09.238883 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4a19441e-e61b-4d58-85db-813ae88e1f9b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-g564l\" (UID: \"4a19441e-e61b-4d58-85db-813ae88e1f9b\") " pod="openshift-multus/multus-additional-cni-plugins-g564l" Mar 08 03:53:09.238954 master-0 kubenswrapper[18592]: I0308 03:53:09.238887 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4541b7b-3f7f-4851-9bd9-26fcda5cab13-serving-cert\") pod 
\"openshift-kube-scheduler-operator-5c74bfc494-g6n58\" (UID: \"e4541b7b-3f7f-4851-9bd9-26fcda5cab13\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-g6n58" Mar 08 03:53:09.238954 master-0 kubenswrapper[18592]: I0308 03:53:09.238934 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a60bc804-52e7-422a-87fd-ac4c5aa90cb3-serving-cert\") pod \"authentication-operator-7c6989d6c4-slm72\" (UID: \"a60bc804-52e7-422a-87fd-ac4c5aa90cb3\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-slm72" Mar 08 03:53:09.239007 master-0 kubenswrapper[18592]: I0308 03:53:09.238917 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/30211469-7108-4820-a988-26fc4ced734e-serving-cert\") pod \"openshift-apiserver-operator-799b6db4d7-75682\" (UID: \"30211469-7108-4820-a988-26fc4ced734e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-75682" Mar 08 03:53:09.239007 master-0 kubenswrapper[18592]: I0308 03:53:09.238986 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e187516f-8f33-4c17-81d6-60c10b580bb0-etc-tuned\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb" Mar 08 03:53:09.239061 master-0 kubenswrapper[18592]: I0308 03:53:09.239031 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/76ba45a2-8945-4afe-b913-126c26725867-encryption-config\") pod \"apiserver-7695b9f8b5-4jpgl\" (UID: \"76ba45a2-8945-4afe-b913-126c26725867\") " pod="openshift-oauth-apiserver/apiserver-7695b9f8b5-4jpgl" Mar 08 03:53:09.239061 master-0 kubenswrapper[18592]: I0308 03:53:09.239051 18592 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bee226a-2a66-4032-8aba-2c8b82abcb6a-catalog-content\") pod \"redhat-operators-rnz4w\" (UID: \"6bee226a-2a66-4032-8aba-2c8b82abcb6a\") " pod="openshift-marketplace/redhat-operators-rnz4w"
Mar 08 03:53:09.239242 master-0 kubenswrapper[18592]: I0308 03:53:09.239220 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/6cde5024-edf7-4fa4-8964-cabe7899578b-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-c46zz\" (UID: \"6cde5024-edf7-4fa4-8964-cabe7899578b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-c46zz"
Mar 08 03:53:09.239242 master-0 kubenswrapper[18592]: I0308 03:53:09.239229 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/5a7752f9-7b9a-451f-997a-e9f696d38b34-etcd-ca\") pod \"etcd-operator-5884b9cd56-vzms7\" (UID: \"5a7752f9-7b9a-451f-997a-e9f696d38b34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-vzms7"
Mar 08 03:53:09.239407 master-0 kubenswrapper[18592]: I0308 03:53:09.239373 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/232c421d-96f0-4894-b8d8-74f43d02bbd3-trusted-ca\") pod \"cluster-node-tuning-operator-66c7586884-qjv52\" (UID: \"232c421d-96f0-4894-b8d8-74f43d02bbd3\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-qjv52"
Mar 08 03:53:09.239407 master-0 kubenswrapper[18592]: I0308 03:53:09.239386 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1482d789-884b-4337-b598-f0e2b71eb9f2-srv-cert\") pod \"catalog-operator-7d9c49f57b-qlfgq\" (UID: \"1482d789-884b-4337-b598-f0e2b71eb9f2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-qlfgq"
Mar 08 03:53:09.239467 master-0 kubenswrapper[18592]: I0308 03:53:09.239312 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4a19441e-e61b-4d58-85db-813ae88e1f9b-cni-binary-copy\") pod \"multus-additional-cni-plugins-g564l\" (UID: \"4a19441e-e61b-4d58-85db-813ae88e1f9b\") " pod="openshift-multus/multus-additional-cni-plugins-g564l"
Mar 08 03:53:09.239467 master-0 kubenswrapper[18592]: I0308 03:53:09.239424 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3ddfd0e7-fe76-41bc-b316-94505df81002-metrics-tls\") pod \"network-operator-7c649bf6d4-99d2k\" (UID: \"3ddfd0e7-fe76-41bc-b316-94505df81002\") " pod="openshift-network-operator/network-operator-7c649bf6d4-99d2k"
Mar 08 03:53:09.239523 master-0 kubenswrapper[18592]: I0308 03:53:09.239473 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-run-ovn\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:53:09.239523 master-0 kubenswrapper[18592]: I0308 03:53:09.239474 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e283f49-b85d-4789-a71f-3fcb5033cdf0-catalog-content\") pod \"redhat-marketplace-4h8qm\" (UID: \"8e283f49-b85d-4789-a71f-3fcb5033cdf0\") " pod="openshift-marketplace/redhat-marketplace-4h8qm"
Mar 08 03:53:09.239523 master-0 kubenswrapper[18592]: I0308 03:53:09.239473 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a7752f9-7b9a-451f-997a-e9f696d38b34-serving-cert\") pod \"etcd-operator-5884b9cd56-vzms7\" (UID: \"5a7752f9-7b9a-451f-997a-e9f696d38b34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-vzms7"
Mar 08 03:53:09.239597 master-0 kubenswrapper[18592]: I0308 03:53:09.239515 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a60bc804-52e7-422a-87fd-ac4c5aa90cb3-service-ca-bundle\") pod \"authentication-operator-7c6989d6c4-slm72\" (UID: \"a60bc804-52e7-422a-87fd-ac4c5aa90cb3\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-slm72"
Mar 08 03:53:09.239597 master-0 kubenswrapper[18592]: I0308 03:53:09.239586 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/d831cb23-7411-4072-8273-c167d9afca28-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-jghp5\" (UID: \"d831cb23-7411-4072-8273-c167d9afca28\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-jghp5"
Mar 08 03:53:09.239657 master-0 kubenswrapper[18592]: I0308 03:53:09.239635 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/127c3f92-8283-4179-9e40-a12dcabaaa12-auth-proxy-config\") pod \"machine-approver-754bdc9f9d-z4sdd\" (UID: \"127c3f92-8283-4179-9e40-a12dcabaaa12\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-z4sdd"
Mar 08 03:53:09.239874 master-0 kubenswrapper[18592]: I0308 03:53:09.239846 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4c5a0c1d-867a-4ce4-9570-ea66452c8db3-iptables-alerter-script\") pod \"iptables-alerter-7c28p\" (UID: \"4c5a0c1d-867a-4ce4-9570-ea66452c8db3\") " pod="openshift-network-operator/iptables-alerter-7c28p"
Mar 08 03:53:09.239955 master-0 kubenswrapper[18592]: I0308 03:53:09.239891 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26180f77-0b1a-4d0f-9ed0-a12fdee69817-config\") pod \"kube-controller-manager-operator-86d7cdfdfb-chpl6\" (UID: \"26180f77-0b1a-4d0f-9ed0-a12fdee69817\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-chpl6"
Mar 08 03:53:09.239955 master-0 kubenswrapper[18592]: I0308 03:53:09.239933 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9c95709c-c3cb-46fb-afe7-626c8013f3c6-kube-api-access\") pod \"installer-1-retry-1-master-0\" (UID: \"9c95709c-c3cb-46fb-afe7-626c8013f3c6\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0"
Mar 08 03:53:09.239955 master-0 kubenswrapper[18592]: I0308 03:53:09.239954 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30211469-7108-4820-a988-26fc4ced734e-config\") pod \"openshift-apiserver-operator-799b6db4d7-75682\" (UID: \"30211469-7108-4820-a988-26fc4ced734e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-75682"
Mar 08 03:53:09.240530 master-0 kubenswrapper[18592]: I0308 03:53:09.239994 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfqc5\" (UniqueName: \"kubernetes.io/projected/7ff63c73-62a3-44b4-acd3-1b3df175794f-kube-api-access-vfqc5\") pod \"cluster-olm-operator-77899cf6d-x9h9q\" (UID: \"7ff63c73-62a3-44b4-acd3-1b3df175794f\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-x9h9q"
Mar 08 03:53:09.240530 master-0 kubenswrapper[18592]: I0308 03:53:09.240015 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxxhh\" (UniqueName: \"kubernetes.io/projected/ee586416-6f56-4ea4-ad62-95de1e6df23b-kube-api-access-sxxhh\") pod \"insights-operator-8f89dfddd-4mr6p\" (UID: \"ee586416-6f56-4ea4-ad62-95de1e6df23b\") " pod="openshift-insights/insights-operator-8f89dfddd-4mr6p"
Mar 08 03:53:09.240530 master-0 kubenswrapper[18592]: I0308 03:53:09.240034 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c0861ccd-5e86-4277-9082-95f3133508a0-cco-trusted-ca\") pod \"cloud-credential-operator-55d85b7b47-5v6gs\" (UID: \"c0861ccd-5e86-4277-9082-95f3133508a0\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-5v6gs"
Mar 08 03:53:09.240530 master-0 kubenswrapper[18592]: I0308 03:53:09.240052 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-host-run-netns\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:53:09.240530 master-0 kubenswrapper[18592]: I0308 03:53:09.240103 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4c5a0c1d-867a-4ce4-9570-ea66452c8db3-host-slash\") pod \"iptables-alerter-7c28p\" (UID: \"4c5a0c1d-867a-4ce4-9570-ea66452c8db3\") " pod="openshift-network-operator/iptables-alerter-7c28p"
Mar 08 03:53:09.240530 master-0 kubenswrapper[18592]: I0308 03:53:09.240122 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7e5935ea-8d95-45e3-b836-c7892953ef3d-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-66b55d57d-bjjfh\" (UID: \"7e5935ea-8d95-45e3-b836-c7892953ef3d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-bjjfh"
Mar 08 03:53:09.240530 master-0 kubenswrapper[18592]: I0308 03:53:09.240139 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e187516f-8f33-4c17-81d6-60c10b580bb0-run\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb"
Mar 08 03:53:09.240530 master-0 kubenswrapper[18592]: I0308 03:53:09.240157 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/76ba45a2-8945-4afe-b913-126c26725867-audit-dir\") pod \"apiserver-7695b9f8b5-4jpgl\" (UID: \"76ba45a2-8945-4afe-b913-126c26725867\") " pod="openshift-oauth-apiserver/apiserver-7695b9f8b5-4jpgl"
Mar 08 03:53:09.240530 master-0 kubenswrapper[18592]: I0308 03:53:09.240176 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d5044ffd-0686-4679-9894-e696faf33699-metrics-certs\") pod \"network-metrics-daemon-schjl\" (UID: \"d5044ffd-0686-4679-9894-e696faf33699\") " pod="openshift-multus/network-metrics-daemon-schjl"
Mar 08 03:53:09.240530 master-0 kubenswrapper[18592]: I0308 03:53:09.240193 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d831cb23-7411-4072-8273-c167d9afca28-config\") pod \"cluster-baremetal-operator-5cdb4c5598-jghp5\" (UID: \"d831cb23-7411-4072-8273-c167d9afca28\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-jghp5"
Mar 08 03:53:09.240530 master-0 kubenswrapper[18592]: I0308 03:53:09.240214 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-host-var-lib-cni-bin\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb"
Mar 08 03:53:09.240530 master-0 kubenswrapper[18592]: I0308 03:53:09.240234 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2262647b-c315-477a-93bd-f168c1810475-kube-api-access\") pod \"cluster-version-operator-8c9c967c7-zq9rp\" (UID: \"2262647b-c315-477a-93bd-f168c1810475\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-zq9rp"
Mar 08 03:53:09.241280 master-0 kubenswrapper[18592]: I0308 03:53:09.240783 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a-config\") pod \"kube-storage-version-migrator-operator-7f65c457f5-6fhhs\" (UID: \"5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-6fhhs"
Mar 08 03:53:09.241280 master-0 kubenswrapper[18592]: I0308 03:53:09.240808 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f-config\") pod \"apiserver-6b779d99b8-7kmck\" (UID: \"c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f\") " pod="openshift-apiserver/apiserver-6b779d99b8-7kmck"
Mar 08 03:53:09.241280 master-0 kubenswrapper[18592]: I0308 03:53:09.240844 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcjr9\" (UniqueName: \"kubernetes.io/projected/49ec083d-dc74-457e-b10f-3bde04e9e75e-kube-api-access-zcjr9\") pod \"service-ca-84bfdbbb7f-gj69x\" (UID: \"49ec083d-dc74-457e-b10f-3bde04e9e75e\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-gj69x"
Mar 08 03:53:09.241280 master-0 kubenswrapper[18592]: I0308 03:53:09.240864 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-run-systemd\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:53:09.241280 master-0 kubenswrapper[18592]: I0308 03:53:09.240884 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhms8\" (UniqueName: \"kubernetes.io/projected/0031e3a9-b253-4dda-a890-bf3e4d8737e8-kube-api-access-qhms8\") pod \"certified-operators-p8nq8\" (UID: \"0031e3a9-b253-4dda-a890-bf3e4d8737e8\") " pod="openshift-marketplace/certified-operators-p8nq8"
Mar 08 03:53:09.241280 master-0 kubenswrapper[18592]: I0308 03:53:09.240905 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-multus-socket-dir-parent\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb"
Mar 08 03:53:09.241280 master-0 kubenswrapper[18592]: I0308 03:53:09.240927 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6gml\" (UniqueName: \"kubernetes.io/projected/7e5935ea-8d95-45e3-b836-c7892953ef3d-kube-api-access-c6gml\") pod \"ovnkube-control-plane-66b55d57d-bjjfh\" (UID: \"7e5935ea-8d95-45e3-b836-c7892953ef3d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-bjjfh"
Mar 08 03:53:09.241280 master-0 kubenswrapper[18592]: I0308 03:53:09.240949 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/164586b1-f133-4427-8ab6-eb0839b79738-env-overrides\") pod \"network-node-identity-ggzm8\" (UID: \"164586b1-f133-4427-8ab6-eb0839b79738\") " pod="openshift-network-node-identity/network-node-identity-ggzm8"
Mar 08 03:53:09.241280 master-0 kubenswrapper[18592]: I0308 03:53:09.240970 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e187516f-8f33-4c17-81d6-60c10b580bb0-etc-sysctl-conf\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb"
Mar 08 03:53:09.241280 master-0 kubenswrapper[18592]: I0308 03:53:09.240989 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0918ba32-8e55-48d0-8e50-027c0dcb4bbd-serving-cert\") pod \"openshift-config-operator-64488f9d78-vfgfp\" (UID: \"0918ba32-8e55-48d0-8e50-027c0dcb4bbd\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-vfgfp"
Mar 08 03:53:09.241280 master-0 kubenswrapper[18592]: I0308 03:53:09.241032 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/7ff63c73-62a3-44b4-acd3-1b3df175794f-operand-assets\") pod \"cluster-olm-operator-77899cf6d-x9h9q\" (UID: \"7ff63c73-62a3-44b4-acd3-1b3df175794f\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-x9h9q"
Mar 08 03:53:09.241280 master-0 kubenswrapper[18592]: I0308 03:53:09.241164 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0918ba32-8e55-48d0-8e50-027c0dcb4bbd-serving-cert\") pod \"openshift-config-operator-64488f9d78-vfgfp\" (UID: \"0918ba32-8e55-48d0-8e50-027c0dcb4bbd\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-vfgfp"
Mar 08 03:53:09.241712 master-0 kubenswrapper[18592]: I0308 03:53:09.241198 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgc7c\" (UniqueName: \"kubernetes.io/projected/3ddfd0e7-fe76-41bc-b316-94505df81002-kube-api-access-bgc7c\") pod \"network-operator-7c649bf6d4-99d2k\" (UID: \"3ddfd0e7-fe76-41bc-b316-94505df81002\") " pod="openshift-network-operator/network-operator-7c649bf6d4-99d2k"
Mar 08 03:53:09.241712 master-0 kubenswrapper[18592]: I0308 03:53:09.241296 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a-config\") pod \"kube-storage-version-migrator-operator-7f65c457f5-6fhhs\" (UID: \"5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-6fhhs"
Mar 08 03:53:09.241712 master-0 kubenswrapper[18592]: I0308 03:53:09.241576 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7e5935ea-8d95-45e3-b836-c7892953ef3d-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-66b55d57d-bjjfh\" (UID: \"7e5935ea-8d95-45e3-b836-c7892953ef3d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-bjjfh"
Mar 08 03:53:09.242029 master-0 kubenswrapper[18592]: I0308 03:53:09.242007 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d5044ffd-0686-4679-9894-e696faf33699-metrics-certs\") pod \"network-metrics-daemon-schjl\" (UID: \"d5044ffd-0686-4679-9894-e696faf33699\") " pod="openshift-multus/network-metrics-daemon-schjl"
Mar 08 03:53:09.242132 master-0 kubenswrapper[18592]: I0308 03:53:09.242111 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d831cb23-7411-4072-8273-c167d9afca28-config\") pod \"cluster-baremetal-operator-5cdb4c5598-jghp5\" (UID: \"d831cb23-7411-4072-8273-c167d9afca28\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-jghp5"
Mar 08 03:53:09.242525 master-0 kubenswrapper[18592]: I0308 03:53:09.242327 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/7ff63c73-62a3-44b4-acd3-1b3df175794f-operand-assets\") pod \"cluster-olm-operator-77899cf6d-x9h9q\" (UID: \"7ff63c73-62a3-44b4-acd3-1b3df175794f\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-x9h9q"
Mar 08 03:53:09.243338 master-0 kubenswrapper[18592]: I0308 03:53:09.242801 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/0418ff42-7eac-4266-97b5-4df88623d066-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-clqwj\" (UID: \"0418ff42-7eac-4266-97b5-4df88623d066\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-clqwj"
Mar 08 03:53:09.243338 master-0 kubenswrapper[18592]: I0308 03:53:09.242984 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26180f77-0b1a-4d0f-9ed0-a12fdee69817-config\") pod \"kube-controller-manager-operator-86d7cdfdfb-chpl6\" (UID: \"26180f77-0b1a-4d0f-9ed0-a12fdee69817\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-chpl6"
Mar 08 03:53:09.243338 master-0 kubenswrapper[18592]: I0308 03:53:09.243105 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-node-log\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:53:09.243338 master-0 kubenswrapper[18592]: I0308 03:53:09.243151 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cwmn\" (UniqueName: \"kubernetes.io/projected/634c0f6d-bce6-42cf-9253-80d1bcc7c507-kube-api-access-8cwmn\") pod \"cluster-samples-operator-664cb58b85-lrnks\" (UID: \"634c0f6d-bce6-42cf-9253-80d1bcc7c507\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-lrnks"
Mar 08 03:53:09.243338 master-0 kubenswrapper[18592]: I0308 03:53:09.243289 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7e5935ea-8d95-45e3-b836-c7892953ef3d-env-overrides\") pod \"ovnkube-control-plane-66b55d57d-bjjfh\" (UID: \"7e5935ea-8d95-45e3-b836-c7892953ef3d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-bjjfh"
Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.243398 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/0418ff42-7eac-4266-97b5-4df88623d066-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-clqwj\" (UID: \"0418ff42-7eac-4266-97b5-4df88623d066\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-clqwj"
Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.243546 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7e5935ea-8d95-45e3-b836-c7892953ef3d-env-overrides\") pod \"ovnkube-control-plane-66b55d57d-bjjfh\" (UID: \"7e5935ea-8d95-45e3-b836-c7892953ef3d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-bjjfh"
Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.244061 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30211469-7108-4820-a988-26fc4ced734e-config\") pod \"openshift-apiserver-operator-799b6db4d7-75682\" (UID: \"30211469-7108-4820-a988-26fc4ced734e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-75682"
Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.244849 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.246458 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1b700d17-83d2-46c8-afbc-e5774822eabe-cert\") pod \"cluster-autoscaler-operator-69576476f7-bv67b\" (UID: \"1b700d17-83d2-46c8-afbc-e5774822eabe\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-bv67b"
Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.246493 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e283f49-b85d-4789-a71f-3fcb5033cdf0-utilities\") pod \"redhat-marketplace-4h8qm\" (UID: \"8e283f49-b85d-4789-a71f-3fcb5033cdf0\") " pod="openshift-marketplace/redhat-marketplace-4h8qm"
Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.246515 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7rsc\" (UniqueName: \"kubernetes.io/projected/c0861ccd-5e86-4277-9082-95f3133508a0-kube-api-access-n7rsc\") pod \"cloud-credential-operator-55d85b7b47-5v6gs\" (UID: \"c0861ccd-5e86-4277-9082-95f3133508a0\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-5v6gs"
Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.246535 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e187516f-8f33-4c17-81d6-60c10b580bb0-lib-modules\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb"
Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.246556 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kc5q\" (UniqueName: \"kubernetes.io/projected/e93b5361-30e6-44fd-a59e-2bc410c59480-kube-api-access-4kc5q\") pod \"network-check-target-xmgpj\" (UID: \"e93b5361-30e6-44fd-a59e-2bc410c59480\") " pod="openshift-network-diagnostics/network-check-target-xmgpj"
Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.246575 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f-trusted-ca-bundle\") pod \"apiserver-6b779d99b8-7kmck\" (UID: \"c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f\") " pod="openshift-apiserver/apiserver-6b779d99b8-7kmck"
Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.246592 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cv7sd\" (UniqueName: \"kubernetes.io/projected/1b700d17-83d2-46c8-afbc-e5774822eabe-kube-api-access-cv7sd\") pod \"cluster-autoscaler-operator-69576476f7-bv67b\" (UID: \"1b700d17-83d2-46c8-afbc-e5774822eabe\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-bv67b"
Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.246609 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vklx\" (UniqueName: \"kubernetes.io/projected/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-kube-api-access-2vklx\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.246631 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0031e3a9-b253-4dda-a890-bf3e4d8737e8-utilities\") pod \"certified-operators-p8nq8\" (UID: \"0031e3a9-b253-4dda-a890-bf3e4d8737e8\") " pod="openshift-marketplace/certified-operators-p8nq8"
Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.246648 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfgc6\" (UniqueName: \"kubernetes.io/projected/2b3d1dc7-22f9-4c0c-802a-d7314894b255-kube-api-access-zfgc6\") pod \"community-operators-lwt58\" (UID: \"2b3d1dc7-22f9-4c0c-802a-d7314894b255\") " pod="openshift-marketplace/community-operators-lwt58"
Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.246664 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ee586416-6f56-4ea4-ad62-95de1e6df23b-trusted-ca-bundle\") pod \"insights-operator-8f89dfddd-4mr6p\" (UID: \"ee586416-6f56-4ea4-ad62-95de1e6df23b\") " pod="openshift-insights/insights-operator-8f89dfddd-4mr6p"
Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.246682 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmsj5\" (UniqueName: \"kubernetes.io/projected/33ed331b-89e9-45f8-ab3c-4533a77cc7b6-kube-api-access-hmsj5\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-f89bv\" (UID: \"33ed331b-89e9-45f8-ab3c-4533a77cc7b6\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-f89bv"
Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.246699 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c9de4939-680a-4e3e-89fd-e20ecb8b10f2-metrics-tls\") pod \"ingress-operator-677db989d6-t77qr\" (UID: \"c9de4939-680a-4e3e-89fd-e20ecb8b10f2\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-t77qr"
Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.246715 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-ovn-node-metrics-cert\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.246734 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/2f59fe81-deee-4ced-ae9d-f17752c82c4b-cache\") pod \"operator-controller-controller-manager-6598bfb6c4-75qmb\" (UID: \"2f59fe81-deee-4ced-ae9d-f17752c82c4b\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-75qmb"
Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.246753 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7smmf\" (UniqueName: \"kubernetes.io/projected/84d353ae-3992-4c17-a20e-3415edd92509-kube-api-access-7smmf\") pod \"migrator-57ccdf9b5-wqldq\" (UID: \"84d353ae-3992-4c17-a20e-3415edd92509\") " pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-wqldq"
Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.246769 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/76ba45a2-8945-4afe-b913-126c26725867-trusted-ca-bundle\") pod \"apiserver-7695b9f8b5-4jpgl\" (UID: \"76ba45a2-8945-4afe-b913-126c26725867\") " pod="openshift-oauth-apiserver/apiserver-7695b9f8b5-4jpgl"
Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.246785 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-etc-openvswitch\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.246805 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/634c0f6d-bce6-42cf-9253-80d1bcc7c507-samples-operator-tls\") pod \"cluster-samples-operator-664cb58b85-lrnks\" (UID: \"634c0f6d-bce6-42cf-9253-80d1bcc7c507\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-lrnks"
Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.246908 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1eb851be-f157-48ea-9a39-1361b68d2639-webhook-certs\") pod \"multus-admission-controller-8d675b596-j8pv6\" (UID: \"1eb851be-f157-48ea-9a39-1361b68d2639\") " pod="openshift-multus/multus-admission-controller-8d675b596-j8pv6"
Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.246935 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d377285-0336-41b7-b48f-c44a7b563498-serving-cert\") pod \"service-ca-operator-69b6fc6b88-kg795\" (UID: \"0d377285-0336-41b7-b48f-c44a7b563498\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-kg795"
Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.246952 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3f7d2cef-b17b-43ba-a222-9e6e8d8352e2-client-ca\") pod \"route-controller-manager-5cb98fbc8c-xnx9t\" (UID: \"3f7d2cef-b17b-43ba-a222-9e6e8d8352e2\") " pod="openshift-route-controller-manager/route-controller-manager-5cb98fbc8c-xnx9t"
Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.246974 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fw7mr\" (UniqueName: \"kubernetes.io/projected/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f-kube-api-access-fw7mr\") pod \"cluster-image-registry-operator-86d6d77c7c-572xh\" (UID: \"69eb8ba2-7bfb-4433-8951-08f89e7bcb5f\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-572xh"
Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.246992 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-host-run-k8s-cni-cncf-io\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb"
Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.247009 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/2dd4279d-a1a9-450a-a061-9008cd1ea8e0-srv-cert\") pod \"olm-operator-d64cfc9db-qddlp\" (UID: \"2dd4279d-a1a9-450a-a061-9008cd1ea8e0\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qddlp"
Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.247026 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxkm6\" (UniqueName: \"kubernetes.io/projected/a60bc804-52e7-422a-87fd-ac4c5aa90cb3-kube-api-access-zxkm6\") pod \"authentication-operator-7c6989d6c4-slm72\" (UID: \"a60bc804-52e7-422a-87fd-ac4c5aa90cb3\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-slm72"
Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.247042 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee586416-6f56-4ea4-ad62-95de1e6df23b-serving-cert\") pod \"insights-operator-8f89dfddd-4mr6p\" (UID: \"ee586416-6f56-4ea4-ad62-95de1e6df23b\") " pod="openshift-insights/insights-operator-8f89dfddd-4mr6p"
Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.247059 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f7d2cef-b17b-43ba-a222-9e6e8d8352e2-config\") pod \"route-controller-manager-5cb98fbc8c-xnx9t\" (UID: \"3f7d2cef-b17b-43ba-a222-9e6e8d8352e2\") " pod="openshift-route-controller-manager/route-controller-manager-5cb98fbc8c-xnx9t"
Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.247077 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-ovnkube-config\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.247095 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0031e3a9-b253-4dda-a890-bf3e4d8737e8-catalog-content\") pod \"certified-operators-p8nq8\" (UID: \"0031e3a9-b253-4dda-a890-bf3e4d8737e8\") " pod="openshift-marketplace/certified-operators-p8nq8"
Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.247113 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ebf1330-e044-4ff5-8b48-2d667e0c5625-serving-cert\") pod \"openshift-controller-manager-operator-8565d84698-kt66j\" (UID: \"0ebf1330-e044-4ff5-8b48-2d667e0c5625\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-kt66j"
Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.247150 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/33ed331b-89e9-45f8-ab3c-4533a77cc7b6-images\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-f89bv\" (UID: \"33ed331b-89e9-45f8-ab3c-4533a77cc7b6\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-f89bv"
Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.247171 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b70adfe9-94f1-44bc-85ce-498e5f0a1ca7-config\") pod \"machine-api-operator-84bf6db4f9-tdrf8\" (UID: \"b70adfe9-94f1-44bc-85ce-498e5f0a1ca7\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-tdrf8"
Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.247188 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-host-var-lib-cni-multus\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb"
Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.247206 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/10c165f1-53b4-4e7e-9cd1-00bb4d9cbc79-webhook-cert\") pod \"packageserver-67b55db9c7-4qgpb\" (UID: \"10c165f1-53b4-4e7e-9cd1-00bb4d9cbc79\") " pod="openshift-operator-lifecycle-manager/packageserver-67b55db9c7-4qgpb"
Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.247222 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e187516f-8f33-4c17-81d6-60c10b580bb0-etc-kubernetes\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb"
Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.247239 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a7752f9-7b9a-451f-997a-e9f696d38b34-config\") pod \"etcd-operator-5884b9cd56-vzms7\" (UID: \"5a7752f9-7b9a-451f-997a-e9f696d38b34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-vzms7"
Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.247261 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/b3eea925-73b3-4693-8f0e-6dd26107f60a-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-6fbfc8dc8f-nm8fj\" (UID: \"b3eea925-73b3-4693-8f0e-6dd26107f60a\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-nm8fj"
Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.247279 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jljzc\" (UniqueName: \"kubernetes.io/projected/d2cd5b23-e622-4b96-aee8-dbc942b73b4a-kube-api-access-jljzc\") pod \"node-resolver-wjl9v\" (UID: \"d2cd5b23-e622-4b96-aee8-dbc942b73b4a\") " pod="openshift-dns/node-resolver-wjl9v"
Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.247307 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdn9r\" (UniqueName: \"kubernetes.io/projected/127c3f92-8283-4179-9e40-a12dcabaaa12-kube-api-access-zdn9r\") pod \"machine-approver-754bdc9f9d-z4sdd\" (UID: \"127c3f92-8283-4179-9e40-a12dcabaaa12\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-z4sdd"
Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.247325 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/7ff63c73-62a3-44b4-acd3-1b3df175794f-cluster-olm-operator-serving-cert\") pod
\"cluster-olm-operator-77899cf6d-x9h9q\" (UID: \"7ff63c73-62a3-44b4-acd3-1b3df175794f\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-x9h9q" Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.247555 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/7ff63c73-62a3-44b4-acd3-1b3df175794f-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-77899cf6d-x9h9q\" (UID: \"7ff63c73-62a3-44b4-acd3-1b3df175794f\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-x9h9q" Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.247721 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d377285-0336-41b7-b48f-c44a7b563498-serving-cert\") pod \"service-ca-operator-69b6fc6b88-kg795\" (UID: \"0d377285-0336-41b7-b48f-c44a7b563498\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-kg795" Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.247853 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e283f49-b85d-4789-a71f-3fcb5033cdf0-utilities\") pod \"redhat-marketplace-4h8qm\" (UID: \"8e283f49-b85d-4789-a71f-3fcb5033cdf0\") " pod="openshift-marketplace/redhat-marketplace-4h8qm" Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.247997 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/2dd4279d-a1a9-450a-a061-9008cd1ea8e0-srv-cert\") pod \"olm-operator-d64cfc9db-qddlp\" (UID: \"2dd4279d-a1a9-450a-a061-9008cd1ea8e0\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qddlp" Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.248137 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0031e3a9-b253-4dda-a890-bf3e4d8737e8-utilities\") pod \"certified-operators-p8nq8\" (UID: \"0031e3a9-b253-4dda-a890-bf3e4d8737e8\") " pod="openshift-marketplace/certified-operators-p8nq8" Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.248303 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ee586416-6f56-4ea4-ad62-95de1e6df23b-trusted-ca-bundle\") pod \"insights-operator-8f89dfddd-4mr6p\" (UID: \"ee586416-6f56-4ea4-ad62-95de1e6df23b\") " pod="openshift-insights/insights-operator-8f89dfddd-4mr6p" Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.248339 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c9de4939-680a-4e3e-89fd-e20ecb8b10f2-metrics-tls\") pod \"ingress-operator-677db989d6-t77qr\" (UID: \"c9de4939-680a-4e3e-89fd-e20ecb8b10f2\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-t77qr" Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.248358 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/2f59fe81-deee-4ced-ae9d-f17752c82c4b-cache\") pod \"operator-controller-controller-manager-6598bfb6c4-75qmb\" (UID: \"2f59fe81-deee-4ced-ae9d-f17752c82c4b\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-75qmb" Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.248422 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0031e3a9-b253-4dda-a890-bf3e4d8737e8-catalog-content\") pod \"certified-operators-p8nq8\" (UID: \"0031e3a9-b253-4dda-a890-bf3e4d8737e8\") " pod="openshift-marketplace/certified-operators-p8nq8" Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 
03:53:09.248471 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-ovnkube-config\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.248500 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zm59c\" (UniqueName: \"kubernetes.io/projected/8e283f49-b85d-4789-a71f-3fcb5033cdf0-kube-api-access-zm59c\") pod \"redhat-marketplace-4h8qm\" (UID: \"8e283f49-b85d-4789-a71f-3fcb5033cdf0\") " pod="openshift-marketplace/redhat-marketplace-4h8qm" Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.248572 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/33ed331b-89e9-45f8-ab3c-4533a77cc7b6-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-f89bv\" (UID: \"33ed331b-89e9-45f8-ab3c-4533a77cc7b6\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-f89bv" Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.248597 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfvnn\" (UniqueName: \"kubernetes.io/projected/5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a-kube-api-access-cfvnn\") pod \"kube-storage-version-migrator-operator-7f65c457f5-6fhhs\" (UID: \"5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-6fhhs" Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.248617 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmhtb\" (UniqueName: 
\"kubernetes.io/projected/d5044ffd-0686-4679-9894-e696faf33699-kube-api-access-mmhtb\") pod \"network-metrics-daemon-schjl\" (UID: \"d5044ffd-0686-4679-9894-e696faf33699\") " pod="openshift-multus/network-metrics-daemon-schjl" Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.248637 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b3d1dc7-22f9-4c0c-802a-d7314894b255-utilities\") pod \"community-operators-lwt58\" (UID: \"2b3d1dc7-22f9-4c0c-802a-d7314894b255\") " pod="openshift-marketplace/community-operators-lwt58" Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.248644 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/b3eea925-73b3-4693-8f0e-6dd26107f60a-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-6fbfc8dc8f-nm8fj\" (UID: \"b3eea925-73b3-4693-8f0e-6dd26107f60a\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-nm8fj" Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.248655 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-cnibin\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.248677 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqhzl\" (UniqueName: \"kubernetes.io/projected/1eb851be-f157-48ea-9a39-1361b68d2639-kube-api-access-nqhzl\") pod \"multus-admission-controller-8d675b596-j8pv6\" (UID: \"1eb851be-f157-48ea-9a39-1361b68d2639\") " pod="openshift-multus/multus-admission-controller-8d675b596-j8pv6" Mar 08 03:53:09.251157 master-0 
kubenswrapper[18592]: I0308 03:53:09.249198 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee586416-6f56-4ea4-ad62-95de1e6df23b-serving-cert\") pod \"insights-operator-8f89dfddd-4mr6p\" (UID: \"ee586416-6f56-4ea4-ad62-95de1e6df23b\") " pod="openshift-insights/insights-operator-8f89dfddd-4mr6p" Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.249234 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9c95709c-c3cb-46fb-afe7-626c8013f3c6-kubelet-dir\") pod \"installer-1-retry-1-master-0\" (UID: \"9c95709c-c3cb-46fb-afe7-626c8013f3c6\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.249256 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29dpg\" (UniqueName: \"kubernetes.io/projected/c9de4939-680a-4e3e-89fd-e20ecb8b10f2-kube-api-access-29dpg\") pod \"ingress-operator-677db989d6-t77qr\" (UID: \"c9de4939-680a-4e3e-89fd-e20ecb8b10f2\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-t77qr" Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.249276 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/0918ba32-8e55-48d0-8e50-027c0dcb4bbd-available-featuregates\") pod \"openshift-config-operator-64488f9d78-vfgfp\" (UID: \"0918ba32-8e55-48d0-8e50-027c0dcb4bbd\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-vfgfp" Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.249297 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f-trusted-ca\") pod 
\"cluster-image-registry-operator-86d6d77c7c-572xh\" (UID: \"69eb8ba2-7bfb-4433-8951-08f89e7bcb5f\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-572xh" Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.249315 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f-audit\") pod \"apiserver-6b779d99b8-7kmck\" (UID: \"c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f\") " pod="openshift-apiserver/apiserver-6b779d99b8-7kmck" Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.249438 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/0918ba32-8e55-48d0-8e50-027c0dcb4bbd-available-featuregates\") pod \"openshift-config-operator-64488f9d78-vfgfp\" (UID: \"0918ba32-8e55-48d0-8e50-027c0dcb4bbd\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-vfgfp" Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.249638 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f-trusted-ca\") pod \"cluster-image-registry-operator-86d6d77c7c-572xh\" (UID: \"69eb8ba2-7bfb-4433-8951-08f89e7bcb5f\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-572xh" Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.249673 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e78b283b-981e-48d7-a5f2-53f8401766ea-images\") pod \"machine-config-operator-fdb5c78b5-2vjh2\" (UID: \"e78b283b-981e-48d7-a5f2-53f8401766ea\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-2vjh2" Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.249984 18592 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b3d1dc7-22f9-4c0c-802a-d7314894b255-utilities\") pod \"community-operators-lwt58\" (UID: \"2b3d1dc7-22f9-4c0c-802a-d7314894b255\") " pod="openshift-marketplace/community-operators-lwt58" Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.250015 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ebf1330-e044-4ff5-8b48-2d667e0c5625-serving-cert\") pod \"openshift-controller-manager-operator-8565d84698-kt66j\" (UID: \"0ebf1330-e044-4ff5-8b48-2d667e0c5625\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-kt66j" Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.250329 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a7752f9-7b9a-451f-997a-e9f696d38b34-config\") pod \"etcd-operator-5884b9cd56-vzms7\" (UID: \"5a7752f9-7b9a-451f-997a-e9f696d38b34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-vzms7" Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.250365 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8b5zb\" (UniqueName: \"kubernetes.io/projected/5a7752f9-7b9a-451f-997a-e9f696d38b34-kube-api-access-8b5zb\") pod \"etcd-operator-5884b9cd56-vzms7\" (UID: \"5a7752f9-7b9a-451f-997a-e9f696d38b34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-vzms7" Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.250390 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-log-socket\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.250413 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-host-run-multus-certs\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.250433 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x997v\" (UniqueName: \"kubernetes.io/projected/6cde5024-edf7-4fa4-8964-cabe7899578b-kube-api-access-x997v\") pod \"package-server-manager-854648ff6d-c46zz\" (UID: \"6cde5024-edf7-4fa4-8964-cabe7899578b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-c46zz" Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.250455 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qn5v\" (UniqueName: \"kubernetes.io/projected/0d377285-0336-41b7-b48f-c44a7b563498-kube-api-access-7qn5v\") pod \"service-ca-operator-69b6fc6b88-kg795\" (UID: \"0d377285-0336-41b7-b48f-c44a7b563498\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-kg795" Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.250462 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e78b283b-981e-48d7-a5f2-53f8401766ea-images\") pod \"machine-config-operator-fdb5c78b5-2vjh2\" (UID: \"e78b283b-981e-48d7-a5f2-53f8401766ea\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-2vjh2" Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.250654 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" 
(UniqueName: \"kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-qjv52\" (UID: \"232c421d-96f0-4894-b8d8-74f43d02bbd3\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-qjv52" Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.250878 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bm7bw\" (UniqueName: \"kubernetes.io/projected/2f59fe81-deee-4ced-ae9d-f17752c82c4b-kube-api-access-bm7bw\") pod \"operator-controller-controller-manager-6598bfb6c4-75qmb\" (UID: \"2f59fe81-deee-4ced-ae9d-f17752c82c4b\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-75qmb" Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.250898 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e187516f-8f33-4c17-81d6-60c10b580bb0-etc-modprobe-d\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb" Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.251036 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/1b69fbf6-1ca5-413e-bffd-965730bcec1b-etc-docker\") pod \"catalogd-controller-manager-7f8b8b6f4c-8h6fj\" (UID: \"1b69fbf6-1ca5-413e-bffd-965730bcec1b\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-8h6fj" Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.251082 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/1b69fbf6-1ca5-413e-bffd-965730bcec1b-catalogserver-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-8h6fj\" (UID: 
\"1b69fbf6-1ca5-413e-bffd-965730bcec1b\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-8h6fj" Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.251108 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d831cb23-7411-4072-8273-c167d9afca28-images\") pod \"cluster-baremetal-operator-5cdb4c5598-jghp5\" (UID: \"d831cb23-7411-4072-8273-c167d9afca28\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-jghp5" Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.251128 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/164586b1-f133-4427-8ab6-eb0839b79738-ovnkube-identity-cm\") pod \"network-node-identity-ggzm8\" (UID: \"164586b1-f133-4427-8ab6-eb0839b79738\") " pod="openshift-network-node-identity/network-node-identity-ggzm8" Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.251149 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b70adfe9-94f1-44bc-85ce-498e5f0a1ca7-images\") pod \"machine-api-operator-84bf6db4f9-tdrf8\" (UID: \"b70adfe9-94f1-44bc-85ce-498e5f0a1ca7\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-tdrf8" Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.251167 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f-encryption-config\") pod \"apiserver-6b779d99b8-7kmck\" (UID: \"c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f\") " pod="openshift-apiserver/apiserver-6b779d99b8-7kmck" Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.251188 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/7e5935ea-8d95-45e3-b836-c7892953ef3d-ovnkube-config\") pod \"ovnkube-control-plane-66b55d57d-bjjfh\" (UID: \"7e5935ea-8d95-45e3-b836-c7892953ef3d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-bjjfh" Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.251206 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-qjv52\" (UID: \"232c421d-96f0-4894-b8d8-74f43d02bbd3\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-qjv52" Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.251206 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-qjv52\" (UID: \"232c421d-96f0-4894-b8d8-74f43d02bbd3\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-qjv52" Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.251225 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/2f59fe81-deee-4ced-ae9d-f17752c82c4b-etc-docker\") pod \"operator-controller-controller-manager-6598bfb6c4-75qmb\" (UID: \"2f59fe81-deee-4ced-ae9d-f17752c82c4b\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-75qmb" Mar 08 03:53:09.251157 master-0 kubenswrapper[18592]: I0308 03:53:09.251250 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/10c165f1-53b4-4e7e-9cd1-00bb4d9cbc79-tmpfs\") pod \"packageserver-67b55db9c7-4qgpb\" (UID: 
\"10c165f1-53b4-4e7e-9cd1-00bb4d9cbc79\") " pod="openshift-operator-lifecycle-manager/packageserver-67b55db9c7-4qgpb" Mar 08 03:53:09.256133 master-0 kubenswrapper[18592]: I0308 03:53:09.251270 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76ba45a2-8945-4afe-b913-126c26725867-serving-cert\") pod \"apiserver-7695b9f8b5-4jpgl\" (UID: \"76ba45a2-8945-4afe-b913-126c26725867\") " pod="openshift-oauth-apiserver/apiserver-7695b9f8b5-4jpgl" Mar 08 03:53:09.256133 master-0 kubenswrapper[18592]: I0308 03:53:09.251290 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-hostroot\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:53:09.256133 master-0 kubenswrapper[18592]: I0308 03:53:09.251308 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/1b69fbf6-1ca5-413e-bffd-965730bcec1b-cache\") pod \"catalogd-controller-manager-7f8b8b6f4c-8h6fj\" (UID: \"1b69fbf6-1ca5-413e-bffd-965730bcec1b\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-8h6fj" Mar 08 03:53:09.256133 master-0 kubenswrapper[18592]: I0308 03:53:09.251326 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-host-var-lib-kubelet\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:53:09.256133 master-0 kubenswrapper[18592]: I0308 03:53:09.251605 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/139881ee-6cfa-4a7e-b002-63cece048d16-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-6686554ddc-7bcsk\" (UID: \"139881ee-6cfa-4a7e-b002-63cece048d16\") " pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-7bcsk" Mar 08 03:53:09.256133 master-0 kubenswrapper[18592]: I0308 03:53:09.251694 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/2262647b-c315-477a-93bd-f168c1810475-etc-ssl-certs\") pod \"cluster-version-operator-8c9c967c7-zq9rp\" (UID: \"2262647b-c315-477a-93bd-f168c1810475\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-zq9rp" Mar 08 03:53:09.256133 master-0 kubenswrapper[18592]: I0308 03:53:09.251773 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5a7752f9-7b9a-451f-997a-e9f696d38b34-etcd-client\") pod \"etcd-operator-5884b9cd56-vzms7\" (UID: \"5a7752f9-7b9a-451f-997a-e9f696d38b34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-vzms7" Mar 08 03:53:09.256133 master-0 kubenswrapper[18592]: I0308 03:53:09.251852 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/1b69fbf6-1ca5-413e-bffd-965730bcec1b-etc-containers\") pod \"catalogd-controller-manager-7f8b8b6f4c-8h6fj\" (UID: \"1b69fbf6-1ca5-413e-bffd-965730bcec1b\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-8h6fj" Mar 08 03:53:09.256133 master-0 kubenswrapper[18592]: I0308 03:53:09.251925 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-host-cni-bin\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:53:09.256133 master-0 kubenswrapper[18592]: I0308 03:53:09.251998 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-multus-daemon-config\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:53:09.256133 master-0 kubenswrapper[18592]: I0308 03:53:09.252039 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4a19441e-e61b-4d58-85db-813ae88e1f9b-cnibin\") pod \"multus-additional-cni-plugins-g564l\" (UID: \"4a19441e-e61b-4d58-85db-813ae88e1f9b\") " pod="openshift-multus/multus-additional-cni-plugins-g564l" Mar 08 03:53:09.256133 master-0 kubenswrapper[18592]: I0308 03:53:09.252099 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e187516f-8f33-4c17-81d6-60c10b580bb0-tmp\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb" Mar 08 03:53:09.256133 master-0 kubenswrapper[18592]: I0308 03:53:09.252126 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-var-lib-openvswitch\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:53:09.256133 master-0 kubenswrapper[18592]: I0308 03:53:09.252179 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/0418ff42-7eac-4266-97b5-4df88623d066-telemetry-config\") pod \"cluster-monitoring-operator-674cbfbd9d-clqwj\" 
(UID: \"0418ff42-7eac-4266-97b5-4df88623d066\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-clqwj" Mar 08 03:53:09.256133 master-0 kubenswrapper[18592]: I0308 03:53:09.252206 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dd549\" (UniqueName: \"kubernetes.io/projected/52b495ac-bb28-44f3-b925-3c54f86d5ec4-kube-api-access-dd549\") pod \"csi-snapshot-controller-operator-5685fbc7d-xhbrl\" (UID: \"52b495ac-bb28-44f3-b925-3c54f86d5ec4\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-xhbrl" Mar 08 03:53:09.256133 master-0 kubenswrapper[18592]: I0308 03:53:09.252293 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/5a7752f9-7b9a-451f-997a-e9f696d38b34-etcd-service-ca\") pod \"etcd-operator-5884b9cd56-vzms7\" (UID: \"5a7752f9-7b9a-451f-997a-e9f696d38b34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-vzms7" Mar 08 03:53:09.256133 master-0 kubenswrapper[18592]: I0308 03:53:09.252327 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/33ed331b-89e9-45f8-ab3c-4533a77cc7b6-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-f89bv\" (UID: \"33ed331b-89e9-45f8-ab3c-4533a77cc7b6\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-f89bv" Mar 08 03:53:09.256133 master-0 kubenswrapper[18592]: I0308 03:53:09.252894 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-multus-daemon-config\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:53:09.256133 master-0 kubenswrapper[18592]: I0308 03:53:09.253207 18592 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7e5935ea-8d95-45e3-b836-c7892953ef3d-ovnkube-config\") pod \"ovnkube-control-plane-66b55d57d-bjjfh\" (UID: \"7e5935ea-8d95-45e3-b836-c7892953ef3d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-bjjfh" Mar 08 03:53:09.256133 master-0 kubenswrapper[18592]: I0308 03:53:09.253635 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/0418ff42-7eac-4266-97b5-4df88623d066-telemetry-config\") pod \"cluster-monitoring-operator-674cbfbd9d-clqwj\" (UID: \"0418ff42-7eac-4266-97b5-4df88623d066\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-clqwj" Mar 08 03:53:09.256133 master-0 kubenswrapper[18592]: I0308 03:53:09.253651 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5a7752f9-7b9a-451f-997a-e9f696d38b34-etcd-client\") pod \"etcd-operator-5884b9cd56-vzms7\" (UID: \"5a7752f9-7b9a-451f-997a-e9f696d38b34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-vzms7" Mar 08 03:53:09.256133 master-0 kubenswrapper[18592]: I0308 03:53:09.253782 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e187516f-8f33-4c17-81d6-60c10b580bb0-tmp\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb" Mar 08 03:53:09.256133 master-0 kubenswrapper[18592]: I0308 03:53:09.253938 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/5a7752f9-7b9a-451f-997a-e9f696d38b34-etcd-service-ca\") pod \"etcd-operator-5884b9cd56-vzms7\" (UID: \"5a7752f9-7b9a-451f-997a-e9f696d38b34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-vzms7" Mar 08 03:53:09.256133 
master-0 kubenswrapper[18592]: I0308 03:53:09.254040 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/232c421d-96f0-4894-b8d8-74f43d02bbd3-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-qjv52\" (UID: \"232c421d-96f0-4894-b8d8-74f43d02bbd3\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-qjv52" Mar 08 03:53:09.256133 master-0 kubenswrapper[18592]: I0308 03:53:09.254179 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/10c165f1-53b4-4e7e-9cd1-00bb4d9cbc79-tmpfs\") pod \"packageserver-67b55db9c7-4qgpb\" (UID: \"10c165f1-53b4-4e7e-9cd1-00bb4d9cbc79\") " pod="openshift-operator-lifecycle-manager/packageserver-67b55db9c7-4qgpb" Mar 08 03:53:09.256133 master-0 kubenswrapper[18592]: I0308 03:53:09.254450 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d831cb23-7411-4072-8273-c167d9afca28-images\") pod \"cluster-baremetal-operator-5cdb4c5598-jghp5\" (UID: \"d831cb23-7411-4072-8273-c167d9afca28\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-jghp5" Mar 08 03:53:09.256133 master-0 kubenswrapper[18592]: I0308 03:53:09.254665 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/1b69fbf6-1ca5-413e-bffd-965730bcec1b-cache\") pod \"catalogd-controller-manager-7f8b8b6f4c-8h6fj\" (UID: \"1b69fbf6-1ca5-413e-bffd-965730bcec1b\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-8h6fj" Mar 08 03:53:09.256133 master-0 kubenswrapper[18592]: I0308 03:53:09.252378 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1cbcb403-a424-4496-8c5c-5eb5e42dfb93-serving-cert\") pod \"kube-apiserver-operator-68bd585b-8gfmf\" (UID: 
\"1cbcb403-a424-4496-8c5c-5eb5e42dfb93\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-8gfmf" Mar 08 03:53:09.256133 master-0 kubenswrapper[18592]: I0308 03:53:09.254893 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ee586416-6f56-4ea4-ad62-95de1e6df23b-service-ca-bundle\") pod \"insights-operator-8f89dfddd-4mr6p\" (UID: \"ee586416-6f56-4ea4-ad62-95de1e6df23b\") " pod="openshift-insights/insights-operator-8f89dfddd-4mr6p" Mar 08 03:53:09.256133 master-0 kubenswrapper[18592]: I0308 03:53:09.254947 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7b485db9-29b5-45a1-a4fb-b4264c6bf2d6-metrics-tls\") pod \"dns-default-4pjsn\" (UID: \"7b485db9-29b5-45a1-a4fb-b4264c6bf2d6\") " pod="openshift-dns/dns-default-4pjsn" Mar 08 03:53:09.256133 master-0 kubenswrapper[18592]: I0308 03:53:09.254975 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f-etcd-serving-ca\") pod \"apiserver-6b779d99b8-7kmck\" (UID: \"c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f\") " pod="openshift-apiserver/apiserver-6b779d99b8-7kmck" Mar 08 03:53:09.256133 master-0 kubenswrapper[18592]: I0308 03:53:09.255024 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/54ad284e-d40e-4e69-b898-f5093952a0e6-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-9sw2d\" (UID: \"54ad284e-d40e-4e69-b898-f5093952a0e6\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-9sw2d" Mar 08 03:53:09.256133 master-0 kubenswrapper[18592]: I0308 03:53:09.255052 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/a6c4695c-da78-46b6-8f92-ca93c5ebb96b-config\") pod \"controller-manager-6999cc9685-kprrt\" (UID: \"a6c4695c-da78-46b6-8f92-ca93c5ebb96b\") " pod="openshift-controller-manager/controller-manager-6999cc9685-kprrt" Mar 08 03:53:09.256133 master-0 kubenswrapper[18592]: I0308 03:53:09.255609 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1cbcb403-a424-4496-8c5c-5eb5e42dfb93-serving-cert\") pod \"kube-apiserver-operator-68bd585b-8gfmf\" (UID: \"1cbcb403-a424-4496-8c5c-5eb5e42dfb93\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-8gfmf" Mar 08 03:53:09.256133 master-0 kubenswrapper[18592]: I0308 03:53:09.255900 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ee586416-6f56-4ea4-ad62-95de1e6df23b-service-ca-bundle\") pod \"insights-operator-8f89dfddd-4mr6p\" (UID: \"ee586416-6f56-4ea4-ad62-95de1e6df23b\") " pod="openshift-insights/insights-operator-8f89dfddd-4mr6p" Mar 08 03:53:09.258126 master-0 kubenswrapper[18592]: I0308 03:53:09.257876 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/54ad284e-d40e-4e69-b898-f5093952a0e6-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-9sw2d\" (UID: \"54ad284e-d40e-4e69-b898-f5093952a0e6\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-9sw2d" Mar 08 03:53:09.265605 master-0 kubenswrapper[18592]: I0308 03:53:09.265558 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 08 03:53:09.273174 master-0 kubenswrapper[18592]: I0308 03:53:09.268016 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1eb851be-f157-48ea-9a39-1361b68d2639-webhook-certs\") 
pod \"multus-admission-controller-8d675b596-j8pv6\" (UID: \"1eb851be-f157-48ea-9a39-1361b68d2639\") " pod="openshift-multus/multus-admission-controller-8d675b596-j8pv6" Mar 08 03:53:09.319956 master-0 kubenswrapper[18592]: I0308 03:53:09.319072 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 08 03:53:09.319956 master-0 kubenswrapper[18592]: I0308 03:53:09.319270 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 08 03:53:09.323468 master-0 kubenswrapper[18592]: I0308 03:53:09.323425 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/164586b1-f133-4427-8ab6-eb0839b79738-env-overrides\") pod \"network-node-identity-ggzm8\" (UID: \"164586b1-f133-4427-8ab6-eb0839b79738\") " pod="openshift-network-node-identity/network-node-identity-ggzm8" Mar 08 03:53:09.325995 master-0 kubenswrapper[18592]: I0308 03:53:09.325958 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 08 03:53:09.329846 master-0 kubenswrapper[18592]: I0308 03:53:09.329794 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/164586b1-f133-4427-8ab6-eb0839b79738-webhook-cert\") pod \"network-node-identity-ggzm8\" (UID: \"164586b1-f133-4427-8ab6-eb0839b79738\") " pod="openshift-network-node-identity/network-node-identity-ggzm8" Mar 08 03:53:09.349426 master-0 kubenswrapper[18592]: I0308 03:53:09.348741 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 08 03:53:09.356941 master-0 kubenswrapper[18592]: I0308 03:53:09.356659 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-run-openvswitch\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:53:09.356941 master-0 kubenswrapper[18592]: I0308 03:53:09.356703 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-os-release\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:53:09.356941 master-0 kubenswrapper[18592]: I0308 03:53:09.356739 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e187516f-8f33-4c17-81d6-60c10b580bb0-var-lib-kubelet\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb" Mar 08 03:53:09.356941 master-0 kubenswrapper[18592]: I0308 03:53:09.356757 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-host-run-ovn-kubernetes\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:53:09.356941 master-0 kubenswrapper[18592]: I0308 03:53:09.356793 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-host-run-ovn-kubernetes\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:53:09.356941 master-0 kubenswrapper[18592]: I0308 03:53:09.356864 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-run-openvswitch\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:53:09.356941 master-0 kubenswrapper[18592]: I0308 03:53:09.356879 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-os-release\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:53:09.356941 master-0 kubenswrapper[18592]: I0308 03:53:09.356916 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e187516f-8f33-4c17-81d6-60c10b580bb0-var-lib-kubelet\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb" Mar 08 03:53:09.357274 master-0 kubenswrapper[18592]: I0308 03:53:09.356974 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e187516f-8f33-4c17-81d6-60c10b580bb0-etc-sysctl-d\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb" Mar 08 03:53:09.357274 master-0 kubenswrapper[18592]: I0308 03:53:09.357000 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d2cd5b23-e622-4b96-aee8-dbc942b73b4a-hosts-file\") pod \"node-resolver-wjl9v\" (UID: \"d2cd5b23-e622-4b96-aee8-dbc942b73b4a\") " pod="openshift-dns/node-resolver-wjl9v" Mar 08 03:53:09.357274 master-0 kubenswrapper[18592]: I0308 03:53:09.357068 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/2262647b-c315-477a-93bd-f168c1810475-etc-cvo-updatepayloads\") pod \"cluster-version-operator-8c9c967c7-zq9rp\" (UID: \"2262647b-c315-477a-93bd-f168c1810475\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-zq9rp" Mar 08 03:53:09.357274 master-0 kubenswrapper[18592]: I0308 03:53:09.357093 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/3ddfd0e7-fe76-41bc-b316-94505df81002-host-etc-kube\") pod \"network-operator-7c649bf6d4-99d2k\" (UID: \"3ddfd0e7-fe76-41bc-b316-94505df81002\") " pod="openshift-network-operator/network-operator-7c649bf6d4-99d2k" Mar 08 03:53:09.357274 master-0 kubenswrapper[18592]: I0308 03:53:09.357138 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-systemd-units\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:53:09.357274 master-0 kubenswrapper[18592]: I0308 03:53:09.357171 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e78b283b-981e-48d7-a5f2-53f8401766ea-auth-proxy-config\") pod \"machine-config-operator-fdb5c78b5-2vjh2\" (UID: \"e78b283b-981e-48d7-a5f2-53f8401766ea\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-2vjh2" Mar 08 03:53:09.357274 master-0 kubenswrapper[18592]: I0308 03:53:09.357225 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 
03:53:09.357443 master-0 kubenswrapper[18592]: I0308 03:53:09.357281 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/2f59fe81-deee-4ced-ae9d-f17752c82c4b-etc-containers\") pod \"operator-controller-controller-manager-6598bfb6c4-75qmb\" (UID: \"2f59fe81-deee-4ced-ae9d-f17752c82c4b\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-75qmb" Mar 08 03:53:09.357443 master-0 kubenswrapper[18592]: I0308 03:53:09.357302 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e187516f-8f33-4c17-81d6-60c10b580bb0-etc-sysconfig\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb" Mar 08 03:53:09.357443 master-0 kubenswrapper[18592]: I0308 03:53:09.357323 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e187516f-8f33-4c17-81d6-60c10b580bb0-sys\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb" Mar 08 03:53:09.357443 master-0 kubenswrapper[18592]: I0308 03:53:09.357351 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-host-run-netns\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:53:09.357443 master-0 kubenswrapper[18592]: I0308 03:53:09.357372 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/9c95709c-c3cb-46fb-afe7-626c8013f3c6-var-lock\") pod \"installer-1-retry-1-master-0\" (UID: \"9c95709c-c3cb-46fb-afe7-626c8013f3c6\") " 
pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 08 03:53:09.357443 master-0 kubenswrapper[18592]: I0308 03:53:09.357393 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e187516f-8f33-4c17-81d6-60c10b580bb0-etc-systemd\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb" Mar 08 03:53:09.357594 master-0 kubenswrapper[18592]: I0308 03:53:09.357450 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-host-cni-netd\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:53:09.357594 master-0 kubenswrapper[18592]: I0308 03:53:09.357491 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-multus-cni-dir\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:53:09.357594 master-0 kubenswrapper[18592]: I0308 03:53:09.357527 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-host-kubelet\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:53:09.357594 master-0 kubenswrapper[18592]: I0308 03:53:09.357563 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4a19441e-e61b-4d58-85db-813ae88e1f9b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-g564l\" (UID: \"4a19441e-e61b-4d58-85db-813ae88e1f9b\") " 
pod="openshift-multus/multus-additional-cni-plugins-g564l" Mar 08 03:53:09.357697 master-0 kubenswrapper[18592]: I0308 03:53:09.357608 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4a19441e-e61b-4d58-85db-813ae88e1f9b-system-cni-dir\") pod \"multus-additional-cni-plugins-g564l\" (UID: \"4a19441e-e61b-4d58-85db-813ae88e1f9b\") " pod="openshift-multus/multus-additional-cni-plugins-g564l" Mar 08 03:53:09.357697 master-0 kubenswrapper[18592]: I0308 03:53:09.357630 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-host-slash\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:53:09.357749 master-0 kubenswrapper[18592]: I0308 03:53:09.357705 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-host-slash\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:53:09.357749 master-0 kubenswrapper[18592]: I0308 03:53:09.357716 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/2f59fe81-deee-4ced-ae9d-f17752c82c4b-etc-containers\") pod \"operator-controller-controller-manager-6598bfb6c4-75qmb\" (UID: \"2f59fe81-deee-4ced-ae9d-f17752c82c4b\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-75qmb" Mar 08 03:53:09.357798 master-0 kubenswrapper[18592]: I0308 03:53:09.357752 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e187516f-8f33-4c17-81d6-60c10b580bb0-etc-sysconfig\") pod 
\"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb" Mar 08 03:53:09.357843 master-0 kubenswrapper[18592]: I0308 03:53:09.357796 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e187516f-8f33-4c17-81d6-60c10b580bb0-sys\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb" Mar 08 03:53:09.357843 master-0 kubenswrapper[18592]: I0308 03:53:09.357810 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4a19441e-e61b-4d58-85db-813ae88e1f9b-os-release\") pod \"multus-additional-cni-plugins-g564l\" (UID: \"4a19441e-e61b-4d58-85db-813ae88e1f9b\") " pod="openshift-multus/multus-additional-cni-plugins-g564l" Mar 08 03:53:09.357897 master-0 kubenswrapper[18592]: I0308 03:53:09.357849 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-host-run-netns\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:53:09.357897 master-0 kubenswrapper[18592]: I0308 03:53:09.357868 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-system-cni-dir\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:53:09.357897 master-0 kubenswrapper[18592]: I0308 03:53:09.357882 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/9c95709c-c3cb-46fb-afe7-626c8013f3c6-var-lock\") pod \"installer-1-retry-1-master-0\" (UID: \"9c95709c-c3cb-46fb-afe7-626c8013f3c6\") " 
pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 08 03:53:09.358113 master-0 kubenswrapper[18592]: I0308 03:53:09.357926 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e187516f-8f33-4c17-81d6-60c10b580bb0-etc-systemd\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb" Mar 08 03:53:09.358113 master-0 kubenswrapper[18592]: I0308 03:53:09.357936 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e187516f-8f33-4c17-81d6-60c10b580bb0-etc-sysctl-d\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb" Mar 08 03:53:09.358113 master-0 kubenswrapper[18592]: I0308 03:53:09.357954 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-multus-conf-dir\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:53:09.358113 master-0 kubenswrapper[18592]: I0308 03:53:09.357976 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d2cd5b23-e622-4b96-aee8-dbc942b73b4a-hosts-file\") pod \"node-resolver-wjl9v\" (UID: \"d2cd5b23-e622-4b96-aee8-dbc942b73b4a\") " pod="openshift-dns/node-resolver-wjl9v" Mar 08 03:53:09.358113 master-0 kubenswrapper[18592]: I0308 03:53:09.358024 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-multus-conf-dir\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:53:09.358113 master-0 
kubenswrapper[18592]: I0308 03:53:09.358026 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-system-cni-dir\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:53:09.358113 master-0 kubenswrapper[18592]: I0308 03:53:09.358032 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4a19441e-e61b-4d58-85db-813ae88e1f9b-os-release\") pod \"multus-additional-cni-plugins-g564l\" (UID: \"4a19441e-e61b-4d58-85db-813ae88e1f9b\") " pod="openshift-multus/multus-additional-cni-plugins-g564l" Mar 08 03:53:09.358113 master-0 kubenswrapper[18592]: I0308 03:53:09.358058 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-host-cni-netd\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:53:09.358113 master-0 kubenswrapper[18592]: I0308 03:53:09.358111 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4a19441e-e61b-4d58-85db-813ae88e1f9b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-g564l\" (UID: \"4a19441e-e61b-4d58-85db-813ae88e1f9b\") " pod="openshift-multus/multus-additional-cni-plugins-g564l" Mar 08 03:53:09.359054 master-0 kubenswrapper[18592]: I0308 03:53:09.358156 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/3ddfd0e7-fe76-41bc-b316-94505df81002-host-etc-kube\") pod \"network-operator-7c649bf6d4-99d2k\" (UID: \"3ddfd0e7-fe76-41bc-b316-94505df81002\") " pod="openshift-network-operator/network-operator-7c649bf6d4-99d2k" Mar 08 03:53:09.359054 
master-0 kubenswrapper[18592]: I0308 03:53:09.358189 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-multus-cni-dir\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:53:09.359054 master-0 kubenswrapper[18592]: I0308 03:53:09.358229 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/2262647b-c315-477a-93bd-f168c1810475-etc-cvo-updatepayloads\") pod \"cluster-version-operator-8c9c967c7-zq9rp\" (UID: \"2262647b-c315-477a-93bd-f168c1810475\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-zq9rp" Mar 08 03:53:09.359054 master-0 kubenswrapper[18592]: I0308 03:53:09.358253 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e187516f-8f33-4c17-81d6-60c10b580bb0-host\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb" Mar 08 03:53:09.359054 master-0 kubenswrapper[18592]: I0308 03:53:09.358256 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-host-kubelet\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:53:09.359054 master-0 kubenswrapper[18592]: I0308 03:53:09.358279 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e187516f-8f33-4c17-81d6-60c10b580bb0-host\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb" Mar 08 03:53:09.359054 master-0 kubenswrapper[18592]: I0308 
03:53:09.358296 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-systemd-units\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:53:09.359054 master-0 kubenswrapper[18592]: I0308 03:53:09.358307 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f-node-pullsecrets\") pod \"apiserver-6b779d99b8-7kmck\" (UID: \"c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f\") " pod="openshift-apiserver/apiserver-6b779d99b8-7kmck" Mar 08 03:53:09.359054 master-0 kubenswrapper[18592]: I0308 03:53:09.358350 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f-node-pullsecrets\") pod \"apiserver-6b779d99b8-7kmck\" (UID: \"c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f\") " pod="openshift-apiserver/apiserver-6b779d99b8-7kmck" Mar 08 03:53:09.359054 master-0 kubenswrapper[18592]: I0308 03:53:09.358346 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4a19441e-e61b-4d58-85db-813ae88e1f9b-system-cni-dir\") pod \"multus-additional-cni-plugins-g564l\" (UID: \"4a19441e-e61b-4d58-85db-813ae88e1f9b\") " pod="openshift-multus/multus-additional-cni-plugins-g564l" Mar 08 03:53:09.359054 master-0 kubenswrapper[18592]: I0308 03:53:09.358392 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f-audit-dir\") pod \"apiserver-6b779d99b8-7kmck\" (UID: \"c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f\") " pod="openshift-apiserver/apiserver-6b779d99b8-7kmck" Mar 08 03:53:09.359054 master-0 
kubenswrapper[18592]: I0308 03:53:09.358396 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:53:09.359054 master-0 kubenswrapper[18592]: I0308 03:53:09.358422 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-etc-kubernetes\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:53:09.359054 master-0 kubenswrapper[18592]: I0308 03:53:09.358434 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f-audit-dir\") pod \"apiserver-6b779d99b8-7kmck\" (UID: \"c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f\") " pod="openshift-apiserver/apiserver-6b779d99b8-7kmck" Mar 08 03:53:09.359054 master-0 kubenswrapper[18592]: I0308 03:53:09.358484 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9c95709c-c3cb-46fb-afe7-626c8013f3c6-kube-api-access\") pod \"installer-1-retry-1-master-0\" (UID: \"9c95709c-c3cb-46fb-afe7-626c8013f3c6\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 08 03:53:09.359054 master-0 kubenswrapper[18592]: I0308 03:53:09.358509 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-run-ovn\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 
03:53:09.359054 master-0 kubenswrapper[18592]: I0308 03:53:09.358510 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-etc-kubernetes\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:53:09.359054 master-0 kubenswrapper[18592]: I0308 03:53:09.358573 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-host-run-netns\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:53:09.359054 master-0 kubenswrapper[18592]: I0308 03:53:09.358619 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-run-ovn\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:53:09.359054 master-0 kubenswrapper[18592]: I0308 03:53:09.358648 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e187516f-8f33-4c17-81d6-60c10b580bb0-run\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb" Mar 08 03:53:09.359054 master-0 kubenswrapper[18592]: I0308 03:53:09.358659 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-host-run-netns\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:53:09.359054 master-0 kubenswrapper[18592]: I0308 03:53:09.358684 18592 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/76ba45a2-8945-4afe-b913-126c26725867-audit-dir\") pod \"apiserver-7695b9f8b5-4jpgl\" (UID: \"76ba45a2-8945-4afe-b913-126c26725867\") " pod="openshift-oauth-apiserver/apiserver-7695b9f8b5-4jpgl" Mar 08 03:53:09.359054 master-0 kubenswrapper[18592]: I0308 03:53:09.358709 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4c5a0c1d-867a-4ce4-9570-ea66452c8db3-host-slash\") pod \"iptables-alerter-7c28p\" (UID: \"4c5a0c1d-867a-4ce4-9570-ea66452c8db3\") " pod="openshift-network-operator/iptables-alerter-7c28p" Mar 08 03:53:09.359054 master-0 kubenswrapper[18592]: I0308 03:53:09.358720 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e187516f-8f33-4c17-81d6-60c10b580bb0-run\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb" Mar 08 03:53:09.359054 master-0 kubenswrapper[18592]: I0308 03:53:09.358724 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/76ba45a2-8945-4afe-b913-126c26725867-audit-dir\") pod \"apiserver-7695b9f8b5-4jpgl\" (UID: \"76ba45a2-8945-4afe-b913-126c26725867\") " pod="openshift-oauth-apiserver/apiserver-7695b9f8b5-4jpgl" Mar 08 03:53:09.359054 master-0 kubenswrapper[18592]: I0308 03:53:09.358780 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-host-var-lib-cni-bin\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:53:09.359054 master-0 kubenswrapper[18592]: I0308 03:53:09.358806 18592 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4c5a0c1d-867a-4ce4-9570-ea66452c8db3-host-slash\") pod \"iptables-alerter-7c28p\" (UID: \"4c5a0c1d-867a-4ce4-9570-ea66452c8db3\") " pod="openshift-network-operator/iptables-alerter-7c28p" Mar 08 03:53:09.359054 master-0 kubenswrapper[18592]: I0308 03:53:09.358845 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-multus-socket-dir-parent\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:53:09.359054 master-0 kubenswrapper[18592]: I0308 03:53:09.358883 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-multus-socket-dir-parent\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:53:09.359054 master-0 kubenswrapper[18592]: I0308 03:53:09.358856 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-host-var-lib-cni-bin\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:53:09.359054 master-0 kubenswrapper[18592]: I0308 03:53:09.358904 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e187516f-8f33-4c17-81d6-60c10b580bb0-etc-sysctl-conf\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb" Mar 08 03:53:09.359054 master-0 kubenswrapper[18592]: I0308 03:53:09.358935 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e78b283b-981e-48d7-a5f2-53f8401766ea-auth-proxy-config\") pod \"machine-config-operator-fdb5c78b5-2vjh2\" (UID: \"e78b283b-981e-48d7-a5f2-53f8401766ea\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-2vjh2" Mar 08 03:53:09.359054 master-0 kubenswrapper[18592]: I0308 03:53:09.358964 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-run-systemd\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:53:09.359054 master-0 kubenswrapper[18592]: I0308 03:53:09.358996 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-run-systemd\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:53:09.359054 master-0 kubenswrapper[18592]: I0308 03:53:09.359012 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e187516f-8f33-4c17-81d6-60c10b580bb0-etc-sysctl-conf\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb" Mar 08 03:53:09.359054 master-0 kubenswrapper[18592]: I0308 03:53:09.359043 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e187516f-8f33-4c17-81d6-60c10b580bb0-lib-modules\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb" Mar 08 03:53:09.360284 master-0 kubenswrapper[18592]: I0308 03:53:09.359105 18592 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-node-log\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:53:09.360284 master-0 kubenswrapper[18592]: I0308 03:53:09.359171 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e187516f-8f33-4c17-81d6-60c10b580bb0-lib-modules\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb" Mar 08 03:53:09.360284 master-0 kubenswrapper[18592]: I0308 03:53:09.359215 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-node-log\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:53:09.360284 master-0 kubenswrapper[18592]: I0308 03:53:09.359261 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-etc-openvswitch\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:53:09.360284 master-0 kubenswrapper[18592]: I0308 03:53:09.359302 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-host-run-k8s-cni-cncf-io\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:53:09.360284 master-0 kubenswrapper[18592]: I0308 03:53:09.359358 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e187516f-8f33-4c17-81d6-60c10b580bb0-etc-kubernetes\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb" Mar 08 03:53:09.360284 master-0 kubenswrapper[18592]: I0308 03:53:09.359381 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-host-var-lib-cni-multus\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:53:09.360284 master-0 kubenswrapper[18592]: I0308 03:53:09.359414 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/33ed331b-89e9-45f8-ab3c-4533a77cc7b6-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-f89bv\" (UID: \"33ed331b-89e9-45f8-ab3c-4533a77cc7b6\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-f89bv" Mar 08 03:53:09.360284 master-0 kubenswrapper[18592]: I0308 03:53:09.359502 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-host-run-k8s-cni-cncf-io\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:53:09.360284 master-0 kubenswrapper[18592]: I0308 03:53:09.359540 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-etc-openvswitch\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:53:09.360284 master-0 kubenswrapper[18592]: I0308 03:53:09.359578 18592 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e187516f-8f33-4c17-81d6-60c10b580bb0-etc-kubernetes\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb" Mar 08 03:53:09.360284 master-0 kubenswrapper[18592]: I0308 03:53:09.359578 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-host-var-lib-cni-multus\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:53:09.360284 master-0 kubenswrapper[18592]: I0308 03:53:09.359651 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9c95709c-c3cb-46fb-afe7-626c8013f3c6-kubelet-dir\") pod \"installer-1-retry-1-master-0\" (UID: \"9c95709c-c3cb-46fb-afe7-626c8013f3c6\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 08 03:53:09.360284 master-0 kubenswrapper[18592]: I0308 03:53:09.360014 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/33ed331b-89e9-45f8-ab3c-4533a77cc7b6-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-f89bv\" (UID: \"33ed331b-89e9-45f8-ab3c-4533a77cc7b6\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-f89bv" Mar 08 03:53:09.360284 master-0 kubenswrapper[18592]: I0308 03:53:09.360174 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-cnibin\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:53:09.360284 master-0 
kubenswrapper[18592]: I0308 03:53:09.360219 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-log-socket\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:53:09.360284 master-0 kubenswrapper[18592]: I0308 03:53:09.360240 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-host-run-multus-certs\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:53:09.360284 master-0 kubenswrapper[18592]: I0308 03:53:09.360278 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e187516f-8f33-4c17-81d6-60c10b580bb0-etc-modprobe-d\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb" Mar 08 03:53:09.360724 master-0 kubenswrapper[18592]: I0308 03:53:09.360299 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/1b69fbf6-1ca5-413e-bffd-965730bcec1b-etc-docker\") pod \"catalogd-controller-manager-7f8b8b6f4c-8h6fj\" (UID: \"1b69fbf6-1ca5-413e-bffd-965730bcec1b\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-8h6fj" Mar 08 03:53:09.360724 master-0 kubenswrapper[18592]: I0308 03:53:09.360381 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/2f59fe81-deee-4ced-ae9d-f17752c82c4b-etc-docker\") pod \"operator-controller-controller-manager-6598bfb6c4-75qmb\" (UID: \"2f59fe81-deee-4ced-ae9d-f17752c82c4b\") " 
pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-75qmb" Mar 08 03:53:09.360724 master-0 kubenswrapper[18592]: I0308 03:53:09.360404 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-host-var-lib-kubelet\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:53:09.360724 master-0 kubenswrapper[18592]: I0308 03:53:09.360419 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-hostroot\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:53:09.360724 master-0 kubenswrapper[18592]: I0308 03:53:09.360446 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/1b69fbf6-1ca5-413e-bffd-965730bcec1b-etc-containers\") pod \"catalogd-controller-manager-7f8b8b6f4c-8h6fj\" (UID: \"1b69fbf6-1ca5-413e-bffd-965730bcec1b\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-8h6fj" Mar 08 03:53:09.360724 master-0 kubenswrapper[18592]: I0308 03:53:09.360462 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-host-cni-bin\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:53:09.360724 master-0 kubenswrapper[18592]: I0308 03:53:09.360479 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/2262647b-c315-477a-93bd-f168c1810475-etc-ssl-certs\") pod 
\"cluster-version-operator-8c9c967c7-zq9rp\" (UID: \"2262647b-c315-477a-93bd-f168c1810475\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-zq9rp" Mar 08 03:53:09.360724 master-0 kubenswrapper[18592]: I0308 03:53:09.360500 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-var-lib-openvswitch\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:53:09.360724 master-0 kubenswrapper[18592]: I0308 03:53:09.360521 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4a19441e-e61b-4d58-85db-813ae88e1f9b-cnibin\") pod \"multus-additional-cni-plugins-g564l\" (UID: \"4a19441e-e61b-4d58-85db-813ae88e1f9b\") " pod="openshift-multus/multus-additional-cni-plugins-g564l" Mar 08 03:53:09.360724 master-0 kubenswrapper[18592]: I0308 03:53:09.360629 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4a19441e-e61b-4d58-85db-813ae88e1f9b-cnibin\") pod \"multus-additional-cni-plugins-g564l\" (UID: \"4a19441e-e61b-4d58-85db-813ae88e1f9b\") " pod="openshift-multus/multus-additional-cni-plugins-g564l" Mar 08 03:53:09.360724 master-0 kubenswrapper[18592]: I0308 03:53:09.360657 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9c95709c-c3cb-46fb-afe7-626c8013f3c6-kubelet-dir\") pod \"installer-1-retry-1-master-0\" (UID: \"9c95709c-c3cb-46fb-afe7-626c8013f3c6\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 08 03:53:09.360724 master-0 kubenswrapper[18592]: I0308 03:53:09.360691 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-cnibin\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:53:09.360724 master-0 kubenswrapper[18592]: I0308 03:53:09.360710 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-log-socket\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:53:09.361078 master-0 kubenswrapper[18592]: I0308 03:53:09.360797 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-hostroot\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:53:09.361078 master-0 kubenswrapper[18592]: I0308 03:53:09.360845 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/1b69fbf6-1ca5-413e-bffd-965730bcec1b-etc-containers\") pod \"catalogd-controller-manager-7f8b8b6f4c-8h6fj\" (UID: \"1b69fbf6-1ca5-413e-bffd-965730bcec1b\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-8h6fj" Mar 08 03:53:09.361078 master-0 kubenswrapper[18592]: I0308 03:53:09.360867 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-host-cni-bin\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:53:09.361078 master-0 kubenswrapper[18592]: I0308 03:53:09.360886 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/2262647b-c315-477a-93bd-f168c1810475-etc-ssl-certs\") pod \"cluster-version-operator-8c9c967c7-zq9rp\" (UID: \"2262647b-c315-477a-93bd-f168c1810475\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-zq9rp" Mar 08 03:53:09.361078 master-0 kubenswrapper[18592]: I0308 03:53:09.360906 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-var-lib-openvswitch\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:53:09.361078 master-0 kubenswrapper[18592]: I0308 03:53:09.360994 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e187516f-8f33-4c17-81d6-60c10b580bb0-etc-modprobe-d\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb" Mar 08 03:53:09.361078 master-0 kubenswrapper[18592]: I0308 03:53:09.360750 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-host-var-lib-kubelet\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:53:09.362654 master-0 kubenswrapper[18592]: I0308 03:53:09.362620 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-host-run-multus-certs\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb" Mar 08 03:53:09.362719 master-0 kubenswrapper[18592]: I0308 03:53:09.362636 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-docker\" (UniqueName: 
\"kubernetes.io/host-path/2f59fe81-deee-4ced-ae9d-f17752c82c4b-etc-docker\") pod \"operator-controller-controller-manager-6598bfb6c4-75qmb\" (UID: \"2f59fe81-deee-4ced-ae9d-f17752c82c4b\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-75qmb" Mar 08 03:53:09.362719 master-0 kubenswrapper[18592]: I0308 03:53:09.362637 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/1b69fbf6-1ca5-413e-bffd-965730bcec1b-etc-docker\") pod \"catalogd-controller-manager-7f8b8b6f4c-8h6fj\" (UID: \"1b69fbf6-1ca5-413e-bffd-965730bcec1b\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-8h6fj" Mar 08 03:53:09.366726 master-0 kubenswrapper[18592]: I0308 03:53:09.365707 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 08 03:53:09.375715 master-0 kubenswrapper[18592]: I0308 03:53:09.375677 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/164586b1-f133-4427-8ab6-eb0839b79738-ovnkube-identity-cm\") pod \"network-node-identity-ggzm8\" (UID: \"164586b1-f133-4427-8ab6-eb0839b79738\") " pod="openshift-network-node-identity/network-node-identity-ggzm8" Mar 08 03:53:09.390526 master-0 kubenswrapper[18592]: I0308 03:53:09.390454 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 08 03:53:09.405382 master-0 kubenswrapper[18592]: I0308 03:53:09.405329 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 08 03:53:09.447895 master-0 kubenswrapper[18592]: I0308 03:53:09.447801 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 08 03:53:09.450416 master-0 kubenswrapper[18592]: I0308 
03:53:09.450338 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-ovn-node-metrics-cert\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:53:09.464931 master-0 kubenswrapper[18592]: I0308 03:53:09.464882 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 08 03:53:09.471189 master-0 kubenswrapper[18592]: I0308 03:53:09.471117 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4c5a0c1d-867a-4ce4-9570-ea66452c8db3-iptables-alerter-script\") pod \"iptables-alerter-7c28p\" (UID: \"4c5a0c1d-867a-4ce4-9570-ea66452c8db3\") " pod="openshift-network-operator/iptables-alerter-7c28p" Mar 08 03:53:09.476714 master-0 kubenswrapper[18592]: I0308 03:53:09.476677 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_cdcecc61ff5eeb08bd2a3ac12599e4f9/kube-apiserver-check-endpoints/0.log" Mar 08 03:53:09.479379 master-0 kubenswrapper[18592]: I0308 03:53:09.479232 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"cdcecc61ff5eeb08bd2a3ac12599e4f9","Type":"ContainerStarted","Data":"0a6172e8ccc8a4efe2658181eb18b7f6b4fbfb74c1d8665ad23817e21967ec14"} Mar 08 03:53:09.479379 master-0 kubenswrapper[18592]: I0308 03:53:09.479349 18592 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 08 03:53:09.485432 master-0 kubenswrapper[18592]: I0308 03:53:09.485235 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 08 03:53:09.488894 master-0 kubenswrapper[18592]: I0308 03:53:09.488872 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 08 03:53:09.489134 master-0 kubenswrapper[18592]: I0308 03:53:09.489106 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-ovnkube-script-lib\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:53:09.493569 master-0 kubenswrapper[18592]: I0308 03:53:09.493536 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:53:09.498854 master-0 kubenswrapper[18592]: I0308 03:53:09.498814 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:53:09.505376 master-0 kubenswrapper[18592]: I0308 03:53:09.505333 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 08 03:53:09.514577 master-0 kubenswrapper[18592]: I0308 03:53:09.514546 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/49ec083d-dc74-457e-b10f-3bde04e9e75e-signing-key\") pod \"service-ca-84bfdbbb7f-gj69x\" (UID: \"49ec083d-dc74-457e-b10f-3bde04e9e75e\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-gj69x" Mar 08 03:53:09.529884 master-0 kubenswrapper[18592]: I0308 03:53:09.528708 18592 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 08 03:53:09.545561 master-0 kubenswrapper[18592]: I0308 03:53:09.545517 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 08 03:53:09.563197 master-0 kubenswrapper[18592]: I0308 03:53:09.563111 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/9c95709c-c3cb-46fb-afe7-626c8013f3c6-var-lock\") pod \"9c95709c-c3cb-46fb-afe7-626c8013f3c6\" (UID: \"9c95709c-c3cb-46fb-afe7-626c8013f3c6\") " Mar 08 03:53:09.563389 master-0 kubenswrapper[18592]: I0308 03:53:09.563235 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9c95709c-c3cb-46fb-afe7-626c8013f3c6-var-lock" (OuterVolumeSpecName: "var-lock") pod "9c95709c-c3cb-46fb-afe7-626c8013f3c6" (UID: "9c95709c-c3cb-46fb-afe7-626c8013f3c6"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:53:09.563389 master-0 kubenswrapper[18592]: I0308 03:53:09.563302 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9c95709c-c3cb-46fb-afe7-626c8013f3c6-kubelet-dir\") pod \"9c95709c-c3cb-46fb-afe7-626c8013f3c6\" (UID: \"9c95709c-c3cb-46fb-afe7-626c8013f3c6\") " Mar 08 03:53:09.563852 master-0 kubenswrapper[18592]: I0308 03:53:09.563787 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9c95709c-c3cb-46fb-afe7-626c8013f3c6-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "9c95709c-c3cb-46fb-afe7-626c8013f3c6" (UID: "9c95709c-c3cb-46fb-afe7-626c8013f3c6"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:53:09.564348 master-0 kubenswrapper[18592]: I0308 03:53:09.564321 18592 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/9c95709c-c3cb-46fb-afe7-626c8013f3c6-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 08 03:53:09.564348 master-0 kubenswrapper[18592]: I0308 03:53:09.564341 18592 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9c95709c-c3cb-46fb-afe7-626c8013f3c6-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 03:53:09.565418 master-0 kubenswrapper[18592]: I0308 03:53:09.565385 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 08 03:53:09.586247 master-0 kubenswrapper[18592]: I0308 03:53:09.586137 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 08 03:53:09.605454 master-0 kubenswrapper[18592]: I0308 03:53:09.605394 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 08 03:53:09.609084 master-0 kubenswrapper[18592]: I0308 03:53:09.609053 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/49ec083d-dc74-457e-b10f-3bde04e9e75e-signing-cabundle\") pod \"service-ca-84bfdbbb7f-gj69x\" (UID: \"49ec083d-dc74-457e-b10f-3bde04e9e75e\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-gj69x" Mar 08 03:53:09.625836 master-0 kubenswrapper[18592]: I0308 03:53:09.625785 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"kube-root-ca.crt" Mar 08 03:53:09.645676 master-0 kubenswrapper[18592]: I0308 03:53:09.645635 18592 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver"/"audit-1" Mar 08 03:53:09.651200 master-0 kubenswrapper[18592]: I0308 03:53:09.651157 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f-audit\") pod \"apiserver-6b779d99b8-7kmck\" (UID: \"c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f\") " pod="openshift-apiserver/apiserver-6b779d99b8-7kmck" Mar 08 03:53:09.674073 master-0 kubenswrapper[18592]: I0308 03:53:09.674026 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"operator-controller-trusted-ca-bundle" Mar 08 03:53:09.686046 master-0 kubenswrapper[18592]: I0308 03:53:09.686003 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"openshift-service-ca.crt" Mar 08 03:53:09.691154 master-0 kubenswrapper[18592]: I0308 03:53:09.691124 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/2f59fe81-deee-4ced-ae9d-f17752c82c4b-ca-certs\") pod \"operator-controller-controller-manager-6598bfb6c4-75qmb\" (UID: \"2f59fe81-deee-4ced-ae9d-f17752c82c4b\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-75qmb" Mar 08 03:53:09.705634 master-0 kubenswrapper[18592]: I0308 03:53:09.705578 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 08 03:53:09.708695 master-0 kubenswrapper[18592]: I0308 03:53:09.708658 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f-etcd-client\") pod \"apiserver-6b779d99b8-7kmck\" (UID: \"c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f\") " pod="openshift-apiserver/apiserver-6b779d99b8-7kmck" Mar 08 03:53:09.725590 master-0 kubenswrapper[18592]: I0308 03:53:09.725536 18592 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 08 03:53:09.729213 master-0 kubenswrapper[18592]: I0308 03:53:09.729152 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f-serving-cert\") pod \"apiserver-6b779d99b8-7kmck\" (UID: \"c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f\") " pod="openshift-apiserver/apiserver-6b779d99b8-7kmck" Mar 08 03:53:09.745846 master-0 kubenswrapper[18592]: I0308 03:53:09.745777 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 08 03:53:09.751813 master-0 kubenswrapper[18592]: I0308 03:53:09.751774 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f-encryption-config\") pod \"apiserver-6b779d99b8-7kmck\" (UID: \"c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f\") " pod="openshift-apiserver/apiserver-6b779d99b8-7kmck" Mar 08 03:53:09.765539 master-0 kubenswrapper[18592]: I0308 03:53:09.765492 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 08 03:53:09.766648 master-0 kubenswrapper[18592]: I0308 03:53:09.766622 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f-etcd-serving-ca\") pod \"apiserver-6b779d99b8-7kmck\" (UID: \"c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f\") " pod="openshift-apiserver/apiserver-6b779d99b8-7kmck" Mar 08 03:53:09.780491 master-0 kubenswrapper[18592]: I0308 03:53:09.780433 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:53:09.785593 master-0 kubenswrapper[18592]: I0308 03:53:09.785554 18592 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver"/"image-import-ca" Mar 08 03:53:09.788787 master-0 kubenswrapper[18592]: I0308 03:53:09.788754 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f-image-import-ca\") pod \"apiserver-6b779d99b8-7kmck\" (UID: \"c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f\") " pod="openshift-apiserver/apiserver-6b779d99b8-7kmck" Mar 08 03:53:09.813562 master-0 kubenswrapper[18592]: I0308 03:53:09.813500 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 08 03:53:09.820412 master-0 kubenswrapper[18592]: I0308 03:53:09.820376 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f-trusted-ca-bundle\") pod \"apiserver-6b779d99b8-7kmck\" (UID: \"c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f\") " pod="openshift-apiserver/apiserver-6b779d99b8-7kmck" Mar 08 03:53:09.826961 master-0 kubenswrapper[18592]: I0308 03:53:09.826934 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 08 03:53:09.846692 master-0 kubenswrapper[18592]: I0308 03:53:09.846588 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 08 03:53:09.852617 master-0 kubenswrapper[18592]: I0308 03:53:09.852580 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f-config\") pod \"apiserver-6b779d99b8-7kmck\" (UID: \"c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f\") " pod="openshift-apiserver/apiserver-6b779d99b8-7kmck" Mar 08 03:53:09.865454 master-0 kubenswrapper[18592]: I0308 03:53:09.865405 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 08 
03:53:09.870399 master-0 kubenswrapper[18592]: I0308 03:53:09.870359 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/76ba45a2-8945-4afe-b913-126c26725867-encryption-config\") pod \"apiserver-7695b9f8b5-4jpgl\" (UID: \"76ba45a2-8945-4afe-b913-126c26725867\") " pod="openshift-oauth-apiserver/apiserver-7695b9f8b5-4jpgl" Mar 08 03:53:09.886914 master-0 kubenswrapper[18592]: I0308 03:53:09.886872 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"kube-root-ca.crt" Mar 08 03:53:09.905841 master-0 kubenswrapper[18592]: I0308 03:53:09.905785 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 08 03:53:09.915208 master-0 kubenswrapper[18592]: I0308 03:53:09.915168 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/76ba45a2-8945-4afe-b913-126c26725867-etcd-client\") pod \"apiserver-7695b9f8b5-4jpgl\" (UID: \"76ba45a2-8945-4afe-b913-126c26725867\") " pod="openshift-oauth-apiserver/apiserver-7695b9f8b5-4jpgl" Mar 08 03:53:09.926258 master-0 kubenswrapper[18592]: I0308 03:53:09.926214 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 08 03:53:09.935322 master-0 kubenswrapper[18592]: I0308 03:53:09.935284 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76ba45a2-8945-4afe-b913-126c26725867-serving-cert\") pod \"apiserver-7695b9f8b5-4jpgl\" (UID: \"76ba45a2-8945-4afe-b913-126c26725867\") " pod="openshift-oauth-apiserver/apiserver-7695b9f8b5-4jpgl" Mar 08 03:53:09.945249 master-0 kubenswrapper[18592]: I0308 03:53:09.945193 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"openshift-service-ca.crt" Mar 08 03:53:09.965474 master-0 
kubenswrapper[18592]: I0308 03:53:09.965411 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 08 03:53:09.974970 master-0 kubenswrapper[18592]: I0308 03:53:09.974923 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/76ba45a2-8945-4afe-b913-126c26725867-etcd-serving-ca\") pod \"apiserver-7695b9f8b5-4jpgl\" (UID: \"76ba45a2-8945-4afe-b913-126c26725867\") " pod="openshift-oauth-apiserver/apiserver-7695b9f8b5-4jpgl" Mar 08 03:53:09.991443 master-0 kubenswrapper[18592]: I0308 03:53:09.991383 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"catalogd-trusted-ca-bundle" Mar 08 03:53:09.994275 master-0 kubenswrapper[18592]: I0308 03:53:09.994237 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/1b69fbf6-1ca5-413e-bffd-965730bcec1b-ca-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-8h6fj\" (UID: \"1b69fbf6-1ca5-413e-bffd-965730bcec1b\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-8h6fj" Mar 08 03:53:10.005732 master-0 kubenswrapper[18592]: I0308 03:53:10.005682 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 08 03:53:10.009282 master-0 kubenswrapper[18592]: I0308 03:53:10.009241 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/76ba45a2-8945-4afe-b913-126c26725867-audit-policies\") pod \"apiserver-7695b9f8b5-4jpgl\" (UID: \"76ba45a2-8945-4afe-b913-126c26725867\") " pod="openshift-oauth-apiserver/apiserver-7695b9f8b5-4jpgl" Mar 08 03:53:10.025342 master-0 kubenswrapper[18592]: I0308 03:53:10.025287 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-catalogd"/"catalogserver-cert" Mar 08 03:53:10.035732 
master-0 kubenswrapper[18592]: I0308 03:53:10.035689 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/1b69fbf6-1ca5-413e-bffd-965730bcec1b-catalogserver-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-8h6fj\" (UID: \"1b69fbf6-1ca5-413e-bffd-965730bcec1b\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-8h6fj" Mar 08 03:53:10.048160 master-0 kubenswrapper[18592]: I0308 03:53:10.048117 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 08 03:53:10.048512 master-0 kubenswrapper[18592]: I0308 03:53:10.048486 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/76ba45a2-8945-4afe-b913-126c26725867-trusted-ca-bundle\") pod \"apiserver-7695b9f8b5-4jpgl\" (UID: \"76ba45a2-8945-4afe-b913-126c26725867\") " pod="openshift-oauth-apiserver/apiserver-7695b9f8b5-4jpgl" Mar 08 03:53:10.065174 master-0 kubenswrapper[18592]: I0308 03:53:10.065140 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 08 03:53:10.085903 master-0 kubenswrapper[18592]: I0308 03:53:10.085840 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 08 03:53:10.104969 master-0 kubenswrapper[18592]: I0308 03:53:10.104866 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 08 03:53:10.124298 master-0 kubenswrapper[18592]: I0308 03:53:10.124250 18592 request.go:700] Waited for 1.006596356s due to client-side throttling, not priority and fairness, request: GET:https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-dns/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0 Mar 08 03:53:10.125766 
master-0 kubenswrapper[18592]: I0308 03:53:10.125743 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 08 03:53:10.146466 master-0 kubenswrapper[18592]: I0308 03:53:10.146414 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 08 03:53:10.146765 master-0 kubenswrapper[18592]: I0308 03:53:10.146737 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7b485db9-29b5-45a1-a4fb-b4264c6bf2d6-metrics-tls\") pod \"dns-default-4pjsn\" (UID: \"7b485db9-29b5-45a1-a4fb-b4264c6bf2d6\") " pod="openshift-dns/dns-default-4pjsn" Mar 08 03:53:10.166083 master-0 kubenswrapper[18592]: I0308 03:53:10.166040 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 08 03:53:10.185297 master-0 kubenswrapper[18592]: I0308 03:53:10.185242 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 08 03:53:10.189533 master-0 kubenswrapper[18592]: I0308 03:53:10.189492 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7b485db9-29b5-45a1-a4fb-b4264c6bf2d6-config-volume\") pod \"dns-default-4pjsn\" (UID: \"7b485db9-29b5-45a1-a4fb-b4264c6bf2d6\") " pod="openshift-dns/dns-default-4pjsn" Mar 08 03:53:10.205369 master-0 kubenswrapper[18592]: I0308 03:53:10.205310 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 08 03:53:10.210947 master-0 kubenswrapper[18592]: I0308 03:53:10.210919 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/10c165f1-53b4-4e7e-9cd1-00bb4d9cbc79-webhook-cert\") pod \"packageserver-67b55db9c7-4qgpb\" (UID: 
\"10c165f1-53b4-4e7e-9cd1-00bb4d9cbc79\") " pod="openshift-operator-lifecycle-manager/packageserver-67b55db9c7-4qgpb" Mar 08 03:53:10.215421 master-0 kubenswrapper[18592]: I0308 03:53:10.215382 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/10c165f1-53b4-4e7e-9cd1-00bb4d9cbc79-apiservice-cert\") pod \"packageserver-67b55db9c7-4qgpb\" (UID: \"10c165f1-53b4-4e7e-9cd1-00bb4d9cbc79\") " pod="openshift-operator-lifecycle-manager/packageserver-67b55db9c7-4qgpb" Mar 08 03:53:10.225352 master-0 kubenswrapper[18592]: I0308 03:53:10.225317 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-g8pwp" Mar 08 03:53:10.233274 master-0 kubenswrapper[18592]: E0308 03:53:10.233237 18592 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 08 03:53:10.233386 master-0 kubenswrapper[18592]: E0308 03:53:10.233319 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2262647b-c315-477a-93bd-f168c1810475-serving-cert podName:2262647b-c315-477a-93bd-f168c1810475 nodeName:}" failed. No retries permitted until 2026-03-08 03:53:10.733297133 +0000 UTC m=+2.832051483 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/2262647b-c315-477a-93bd-f168c1810475-serving-cert") pod "cluster-version-operator-8c9c967c7-zq9rp" (UID: "2262647b-c315-477a-93bd-f168c1810475") : failed to sync secret cache: timed out waiting for the condition Mar 08 03:53:10.234715 master-0 kubenswrapper[18592]: E0308 03:53:10.234684 18592 configmap.go:193] Couldn't get configMap openshift-cluster-machine-approver/machine-approver-config: failed to sync configmap cache: timed out waiting for the condition Mar 08 03:53:10.234774 master-0 kubenswrapper[18592]: E0308 03:53:10.234741 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/127c3f92-8283-4179-9e40-a12dcabaaa12-config podName:127c3f92-8283-4179-9e40-a12dcabaaa12 nodeName:}" failed. No retries permitted until 2026-03-08 03:53:10.734728787 +0000 UTC m=+2.833483137 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/127c3f92-8283-4179-9e40-a12dcabaaa12-config") pod "machine-approver-754bdc9f9d-z4sdd" (UID: "127c3f92-8283-4179-9e40-a12dcabaaa12") : failed to sync configmap cache: timed out waiting for the condition Mar 08 03:53:10.235810 master-0 kubenswrapper[18592]: E0308 03:53:10.235782 18592 secret.go:189] Couldn't get secret openshift-machine-api/machine-api-operator-tls: failed to sync secret cache: timed out waiting for the condition Mar 08 03:53:10.235898 master-0 kubenswrapper[18592]: E0308 03:53:10.235847 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b70adfe9-94f1-44bc-85ce-498e5f0a1ca7-machine-api-operator-tls podName:b70adfe9-94f1-44bc-85ce-498e5f0a1ca7 nodeName:}" failed. No retries permitted until 2026-03-08 03:53:10.735835432 +0000 UTC m=+2.834589782 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "machine-api-operator-tls" (UniqueName: "kubernetes.io/secret/b70adfe9-94f1-44bc-85ce-498e5f0a1ca7-machine-api-operator-tls") pod "machine-api-operator-84bf6db4f9-tdrf8" (UID: "b70adfe9-94f1-44bc-85ce-498e5f0a1ca7") : failed to sync secret cache: timed out waiting for the condition Mar 08 03:53:10.239083 master-0 kubenswrapper[18592]: E0308 03:53:10.239050 18592 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 08 03:53:10.239152 master-0 kubenswrapper[18592]: E0308 03:53:10.239095 18592 configmap.go:193] Couldn't get configMap openshift-machine-api/kube-rbac-proxy-cluster-autoscaler-operator: failed to sync configmap cache: timed out waiting for the condition Mar 08 03:53:10.239152 master-0 kubenswrapper[18592]: E0308 03:53:10.239107 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f7d2cef-b17b-43ba-a222-9e6e8d8352e2-serving-cert podName:3f7d2cef-b17b-43ba-a222-9e6e8d8352e2 nodeName:}" failed. No retries permitted until 2026-03-08 03:53:10.739092606 +0000 UTC m=+2.837846956 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/3f7d2cef-b17b-43ba-a222-9e6e8d8352e2-serving-cert") pod "route-controller-manager-5cb98fbc8c-xnx9t" (UID: "3f7d2cef-b17b-43ba-a222-9e6e8d8352e2") : failed to sync secret cache: timed out waiting for the condition Mar 08 03:53:10.239152 master-0 kubenswrapper[18592]: E0308 03:53:10.239127 18592 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 08 03:53:10.239152 master-0 kubenswrapper[18592]: E0308 03:53:10.239137 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1b700d17-83d2-46c8-afbc-e5774822eabe-auth-proxy-config podName:1b700d17-83d2-46c8-afbc-e5774822eabe nodeName:}" failed. No retries permitted until 2026-03-08 03:53:10.739125837 +0000 UTC m=+2.837880267 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/1b700d17-83d2-46c8-afbc-e5774822eabe-auth-proxy-config") pod "cluster-autoscaler-operator-69576476f7-bv67b" (UID: "1b700d17-83d2-46c8-afbc-e5774822eabe") : failed to sync configmap cache: timed out waiting for the condition Mar 08 03:53:10.239339 master-0 kubenswrapper[18592]: E0308 03:53:10.239156 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6c4695c-da78-46b6-8f92-ca93c5ebb96b-serving-cert podName:a6c4695c-da78-46b6-8f92-ca93c5ebb96b nodeName:}" failed. No retries permitted until 2026-03-08 03:53:10.739148067 +0000 UTC m=+2.837902417 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/a6c4695c-da78-46b6-8f92-ca93c5ebb96b-serving-cert") pod "controller-manager-6999cc9685-kprrt" (UID: "a6c4695c-da78-46b6-8f92-ca93c5ebb96b") : failed to sync secret cache: timed out waiting for the condition Mar 08 03:53:10.239339 master-0 kubenswrapper[18592]: E0308 03:53:10.239167 18592 configmap.go:193] Couldn't get configMap openshift-controller-manager/openshift-global-ca: failed to sync configmap cache: timed out waiting for the condition Mar 08 03:53:10.239339 master-0 kubenswrapper[18592]: E0308 03:53:10.239182 18592 secret.go:189] Couldn't get secret openshift-cloud-credential-operator/cloud-credential-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 08 03:53:10.239339 master-0 kubenswrapper[18592]: E0308 03:53:10.239198 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a6c4695c-da78-46b6-8f92-ca93c5ebb96b-proxy-ca-bundles podName:a6c4695c-da78-46b6-8f92-ca93c5ebb96b nodeName:}" failed. No retries permitted until 2026-03-08 03:53:10.739189088 +0000 UTC m=+2.837943438 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-ca-bundles" (UniqueName: "kubernetes.io/configmap/a6c4695c-da78-46b6-8f92-ca93c5ebb96b-proxy-ca-bundles") pod "controller-manager-6999cc9685-kprrt" (UID: "a6c4695c-da78-46b6-8f92-ca93c5ebb96b") : failed to sync configmap cache: timed out waiting for the condition Mar 08 03:53:10.239339 master-0 kubenswrapper[18592]: E0308 03:53:10.239205 18592 secret.go:189] Couldn't get secret openshift-cluster-machine-approver/machine-approver-tls: failed to sync secret cache: timed out waiting for the condition Mar 08 03:53:10.239339 master-0 kubenswrapper[18592]: E0308 03:53:10.239215 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c0861ccd-5e86-4277-9082-95f3133508a0-cloud-credential-operator-serving-cert podName:c0861ccd-5e86-4277-9082-95f3133508a0 nodeName:}" failed. No retries permitted until 2026-03-08 03:53:10.739206029 +0000 UTC m=+2.837960379 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cloud-credential-operator-serving-cert" (UniqueName: "kubernetes.io/secret/c0861ccd-5e86-4277-9082-95f3133508a0-cloud-credential-operator-serving-cert") pod "cloud-credential-operator-55d85b7b47-5v6gs" (UID: "c0861ccd-5e86-4277-9082-95f3133508a0") : failed to sync secret cache: timed out waiting for the condition Mar 08 03:53:10.239339 master-0 kubenswrapper[18592]: E0308 03:53:10.239231 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/127c3f92-8283-4179-9e40-a12dcabaaa12-machine-approver-tls podName:127c3f92-8283-4179-9e40-a12dcabaaa12 nodeName:}" failed. No retries permitted until 2026-03-08 03:53:10.739223539 +0000 UTC m=+2.837977899 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "machine-approver-tls" (UniqueName: "kubernetes.io/secret/127c3f92-8283-4179-9e40-a12dcabaaa12-machine-approver-tls") pod "machine-approver-754bdc9f9d-z4sdd" (UID: "127c3f92-8283-4179-9e40-a12dcabaaa12") : failed to sync secret cache: timed out waiting for the condition Mar 08 03:53:10.240173 master-0 kubenswrapper[18592]: E0308 03:53:10.240139 18592 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: failed to sync configmap cache: timed out waiting for the condition Mar 08 03:53:10.240237 master-0 kubenswrapper[18592]: E0308 03:53:10.240187 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a6c4695c-da78-46b6-8f92-ca93c5ebb96b-client-ca podName:a6c4695c-da78-46b6-8f92-ca93c5ebb96b nodeName:}" failed. No retries permitted until 2026-03-08 03:53:10.740175911 +0000 UTC m=+2.838930261 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/a6c4695c-da78-46b6-8f92-ca93c5ebb96b-client-ca") pod "controller-manager-6999cc9685-kprrt" (UID: "a6c4695c-da78-46b6-8f92-ca93c5ebb96b") : failed to sync configmap cache: timed out waiting for the condition Mar 08 03:53:10.240237 master-0 kubenswrapper[18592]: E0308 03:53:10.240143 18592 configmap.go:193] Couldn't get configMap openshift-cluster-version/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 08 03:53:10.240237 master-0 kubenswrapper[18592]: E0308 03:53:10.240228 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2262647b-c315-477a-93bd-f168c1810475-service-ca podName:2262647b-c315-477a-93bd-f168c1810475 nodeName:}" failed. No retries permitted until 2026-03-08 03:53:10.740220333 +0000 UTC m=+2.838974783 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca" (UniqueName: "kubernetes.io/configmap/2262647b-c315-477a-93bd-f168c1810475-service-ca") pod "cluster-version-operator-8c9c967c7-zq9rp" (UID: "2262647b-c315-477a-93bd-f168c1810475") : failed to sync configmap cache: timed out waiting for the condition Mar 08 03:53:10.240417 master-0 kubenswrapper[18592]: E0308 03:53:10.240249 18592 secret.go:189] Couldn't get secret openshift-cloud-controller-manager-operator/cloud-controller-manager-operator-tls: failed to sync secret cache: timed out waiting for the condition Mar 08 03:53:10.240417 master-0 kubenswrapper[18592]: E0308 03:53:10.240283 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33ed331b-89e9-45f8-ab3c-4533a77cc7b6-cloud-controller-manager-operator-tls podName:33ed331b-89e9-45f8-ab3c-4533a77cc7b6 nodeName:}" failed. No retries permitted until 2026-03-08 03:53:10.740273754 +0000 UTC m=+2.839028114 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cloud-controller-manager-operator-tls" (UniqueName: "kubernetes.io/secret/33ed331b-89e9-45f8-ab3c-4533a77cc7b6-cloud-controller-manager-operator-tls") pod "cluster-cloud-controller-manager-operator-7c8df9b496-f89bv" (UID: "33ed331b-89e9-45f8-ab3c-4533a77cc7b6") : failed to sync secret cache: timed out waiting for the condition Mar 08 03:53:10.240417 master-0 kubenswrapper[18592]: E0308 03:53:10.240316 18592 configmap.go:193] Couldn't get configMap openshift-cluster-machine-approver/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition Mar 08 03:53:10.240417 master-0 kubenswrapper[18592]: E0308 03:53:10.240350 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/127c3f92-8283-4179-9e40-a12dcabaaa12-auth-proxy-config podName:127c3f92-8283-4179-9e40-a12dcabaaa12 nodeName:}" failed. No retries permitted until 2026-03-08 03:53:10.740338985 +0000 UTC m=+2.839093335 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/127c3f92-8283-4179-9e40-a12dcabaaa12-auth-proxy-config") pod "machine-approver-754bdc9f9d-z4sdd" (UID: "127c3f92-8283-4179-9e40-a12dcabaaa12") : failed to sync configmap cache: timed out waiting for the condition Mar 08 03:53:10.243549 master-0 kubenswrapper[18592]: E0308 03:53:10.243515 18592 configmap.go:193] Couldn't get configMap openshift-cloud-credential-operator/cco-trusted-ca: failed to sync configmap cache: timed out waiting for the condition Mar 08 03:53:10.243621 master-0 kubenswrapper[18592]: E0308 03:53:10.243570 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c0861ccd-5e86-4277-9082-95f3133508a0-cco-trusted-ca podName:c0861ccd-5e86-4277-9082-95f3133508a0 nodeName:}" failed. No retries permitted until 2026-03-08 03:53:10.743557429 +0000 UTC m=+2.842311789 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cco-trusted-ca" (UniqueName: "kubernetes.io/configmap/c0861ccd-5e86-4277-9082-95f3133508a0-cco-trusted-ca") pod "cloud-credential-operator-55d85b7b47-5v6gs" (UID: "c0861ccd-5e86-4277-9082-95f3133508a0") : failed to sync configmap cache: timed out waiting for the condition Mar 08 03:53:10.245635 master-0 kubenswrapper[18592]: I0308 03:53:10.245597 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-7dkp2" Mar 08 03:53:10.248011 master-0 kubenswrapper[18592]: E0308 03:53:10.247949 18592 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: failed to sync configmap cache: timed out waiting for the condition Mar 08 03:53:10.248011 master-0 kubenswrapper[18592]: E0308 03:53:10.247982 18592 secret.go:189] Couldn't get secret openshift-machine-api/cluster-autoscaler-operator-cert: failed to sync secret cache: timed out waiting for the condition Mar 08 03:53:10.248132 master-0 
kubenswrapper[18592]: E0308 03:53:10.248026 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3f7d2cef-b17b-43ba-a222-9e6e8d8352e2-client-ca podName:3f7d2cef-b17b-43ba-a222-9e6e8d8352e2 nodeName:}" failed. No retries permitted until 2026-03-08 03:53:10.74801088 +0000 UTC m=+2.846765230 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/3f7d2cef-b17b-43ba-a222-9e6e8d8352e2-client-ca") pod "route-controller-manager-5cb98fbc8c-xnx9t" (UID: "3f7d2cef-b17b-43ba-a222-9e6e8d8352e2") : failed to sync configmap cache: timed out waiting for the condition Mar 08 03:53:10.248132 master-0 kubenswrapper[18592]: E0308 03:53:10.248046 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b700d17-83d2-46c8-afbc-e5774822eabe-cert podName:1b700d17-83d2-46c8-afbc-e5774822eabe nodeName:}" failed. No retries permitted until 2026-03-08 03:53:10.748038861 +0000 UTC m=+2.846793221 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1b700d17-83d2-46c8-afbc-e5774822eabe-cert") pod "cluster-autoscaler-operator-69576476f7-bv67b" (UID: "1b700d17-83d2-46c8-afbc-e5774822eabe") : failed to sync secret cache: timed out waiting for the condition Mar 08 03:53:10.248748 master-0 kubenswrapper[18592]: E0308 03:53:10.248725 18592 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/config: failed to sync configmap cache: timed out waiting for the condition Mar 08 03:53:10.248801 master-0 kubenswrapper[18592]: E0308 03:53:10.248772 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3f7d2cef-b17b-43ba-a222-9e6e8d8352e2-config podName:3f7d2cef-b17b-43ba-a222-9e6e8d8352e2 nodeName:}" failed. No retries permitted until 2026-03-08 03:53:10.748759058 +0000 UTC m=+2.847513408 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/3f7d2cef-b17b-43ba-a222-9e6e8d8352e2-config") pod "route-controller-manager-5cb98fbc8c-xnx9t" (UID: "3f7d2cef-b17b-43ba-a222-9e6e8d8352e2") : failed to sync configmap cache: timed out waiting for the condition Mar 08 03:53:10.248868 master-0 kubenswrapper[18592]: E0308 03:53:10.248799 18592 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: failed to sync secret cache: timed out waiting for the condition Mar 08 03:53:10.248868 master-0 kubenswrapper[18592]: E0308 03:53:10.248845 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/634c0f6d-bce6-42cf-9253-80d1bcc7c507-samples-operator-tls podName:634c0f6d-bce6-42cf-9253-80d1bcc7c507 nodeName:}" failed. No retries permitted until 2026-03-08 03:53:10.74881813 +0000 UTC m=+2.847572470 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/634c0f6d-bce6-42cf-9253-80d1bcc7c507-samples-operator-tls") pod "cluster-samples-operator-664cb58b85-lrnks" (UID: "634c0f6d-bce6-42cf-9253-80d1bcc7c507") : failed to sync secret cache: timed out waiting for the condition Mar 08 03:53:10.249964 master-0 kubenswrapper[18592]: E0308 03:53:10.249930 18592 configmap.go:193] Couldn't get configMap openshift-machine-api/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition Mar 08 03:53:10.250028 master-0 kubenswrapper[18592]: E0308 03:53:10.249986 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b70adfe9-94f1-44bc-85ce-498e5f0a1ca7-config podName:b70adfe9-94f1-44bc-85ce-498e5f0a1ca7 nodeName:}" failed. No retries permitted until 2026-03-08 03:53:10.749974016 +0000 UTC m=+2.848728366 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/b70adfe9-94f1-44bc-85ce-498e5f0a1ca7-config") pod "machine-api-operator-84bf6db4f9-tdrf8" (UID: "b70adfe9-94f1-44bc-85ce-498e5f0a1ca7") : failed to sync configmap cache: timed out waiting for the condition Mar 08 03:53:10.250074 master-0 kubenswrapper[18592]: E0308 03:53:10.250013 18592 configmap.go:193] Couldn't get configMap openshift-cloud-controller-manager-operator/cloud-controller-manager-images: failed to sync configmap cache: timed out waiting for the condition Mar 08 03:53:10.250145 master-0 kubenswrapper[18592]: E0308 03:53:10.250121 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/33ed331b-89e9-45f8-ab3c-4533a77cc7b6-images podName:33ed331b-89e9-45f8-ab3c-4533a77cc7b6 nodeName:}" failed. No retries permitted until 2026-03-08 03:53:10.750096258 +0000 UTC m=+2.848850648 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/33ed331b-89e9-45f8-ab3c-4533a77cc7b6-images") pod "cluster-cloud-controller-manager-operator-7c8df9b496-f89bv" (UID: "33ed331b-89e9-45f8-ab3c-4533a77cc7b6") : failed to sync configmap cache: timed out waiting for the condition Mar 08 03:53:10.254122 master-0 kubenswrapper[18592]: E0308 03:53:10.254095 18592 secret.go:189] Couldn't get secret openshift-machine-api/control-plane-machine-set-operator-tls: failed to sync secret cache: timed out waiting for the condition Mar 08 03:53:10.254177 master-0 kubenswrapper[18592]: E0308 03:53:10.254153 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/139881ee-6cfa-4a7e-b002-63cece048d16-control-plane-machine-set-operator-tls podName:139881ee-6cfa-4a7e-b002-63cece048d16 nodeName:}" failed. No retries permitted until 2026-03-08 03:53:10.754140451 +0000 UTC m=+2.852894881 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "control-plane-machine-set-operator-tls" (UniqueName: "kubernetes.io/secret/139881ee-6cfa-4a7e-b002-63cece048d16-control-plane-machine-set-operator-tls") pod "control-plane-machine-set-operator-6686554ddc-7bcsk" (UID: "139881ee-6cfa-4a7e-b002-63cece048d16") : failed to sync secret cache: timed out waiting for the condition Mar 08 03:53:10.254212 master-0 kubenswrapper[18592]: E0308 03:53:10.254194 18592 configmap.go:193] Couldn't get configMap openshift-cloud-controller-manager-operator/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition Mar 08 03:53:10.254242 master-0 kubenswrapper[18592]: E0308 03:53:10.254237 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/33ed331b-89e9-45f8-ab3c-4533a77cc7b6-auth-proxy-config podName:33ed331b-89e9-45f8-ab3c-4533a77cc7b6 nodeName:}" failed. No retries permitted until 2026-03-08 03:53:10.754225714 +0000 UTC m=+2.852980174 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/33ed331b-89e9-45f8-ab3c-4533a77cc7b6-auth-proxy-config") pod "cluster-cloud-controller-manager-operator-7c8df9b496-f89bv" (UID: "33ed331b-89e9-45f8-ab3c-4533a77cc7b6") : failed to sync configmap cache: timed out waiting for the condition Mar 08 03:53:10.255447 master-0 kubenswrapper[18592]: E0308 03:53:10.255424 18592 configmap.go:193] Couldn't get configMap openshift-machine-api/machine-api-operator-images: failed to sync configmap cache: timed out waiting for the condition Mar 08 03:53:10.255511 master-0 kubenswrapper[18592]: E0308 03:53:10.255465 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b70adfe9-94f1-44bc-85ce-498e5f0a1ca7-images podName:b70adfe9-94f1-44bc-85ce-498e5f0a1ca7 nodeName:}" failed. No retries permitted until 2026-03-08 03:53:10.755456451 +0000 UTC m=+2.854210881 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/b70adfe9-94f1-44bc-85ce-498e5f0a1ca7-images") pod "machine-api-operator-84bf6db4f9-tdrf8" (UID: "b70adfe9-94f1-44bc-85ce-498e5f0a1ca7") : failed to sync configmap cache: timed out waiting for the condition Mar 08 03:53:10.258703 master-0 kubenswrapper[18592]: E0308 03:53:10.258662 18592 configmap.go:193] Couldn't get configMap openshift-controller-manager/config: failed to sync configmap cache: timed out waiting for the condition Mar 08 03:53:10.258788 master-0 kubenswrapper[18592]: E0308 03:53:10.258763 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a6c4695c-da78-46b6-8f92-ca93c5ebb96b-config podName:a6c4695c-da78-46b6-8f92-ca93c5ebb96b nodeName:}" failed. No retries permitted until 2026-03-08 03:53:10.758745477 +0000 UTC m=+2.857499837 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/a6c4695c-da78-46b6-8f92-ca93c5ebb96b-config") pod "controller-manager-6999cc9685-kprrt" (UID: "a6c4695c-da78-46b6-8f92-ca93c5ebb96b") : failed to sync configmap cache: timed out waiting for the condition Mar 08 03:53:10.264987 master-0 kubenswrapper[18592]: I0308 03:53:10.264959 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-hdmbr" Mar 08 03:53:10.285446 master-0 kubenswrapper[18592]: I0308 03:53:10.285254 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-7cjnr" Mar 08 03:53:10.305182 master-0 kubenswrapper[18592]: I0308 03:53:10.305114 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 08 03:53:10.326078 master-0 kubenswrapper[18592]: I0308 03:53:10.326022 18592 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"client-ca" Mar 08 03:53:10.351628 master-0 kubenswrapper[18592]: I0308 03:53:10.351562 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 08 03:53:10.366246 master-0 kubenswrapper[18592]: I0308 03:53:10.366129 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 08 03:53:10.384932 master-0 kubenswrapper[18592]: I0308 03:53:10.384876 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 08 03:53:10.405749 master-0 kubenswrapper[18592]: I0308 03:53:10.405703 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 08 03:53:10.426198 master-0 kubenswrapper[18592]: I0308 03:53:10.426162 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-27jjw" Mar 08 03:53:10.446409 master-0 kubenswrapper[18592]: I0308 03:53:10.446370 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 08 03:53:10.466111 master-0 kubenswrapper[18592]: I0308 03:53:10.466063 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 08 03:53:10.484820 master-0 kubenswrapper[18592]: I0308 03:53:10.484760 18592 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 08 03:53:10.485018 master-0 kubenswrapper[18592]: I0308 03:53:10.484854 18592 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 08 03:53:10.486026 master-0 kubenswrapper[18592]: I0308 03:53:10.485991 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 08 03:53:10.505700 master-0 kubenswrapper[18592]: I0308 03:53:10.505637 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 08 03:53:10.525423 master-0 kubenswrapper[18592]: I0308 03:53:10.525349 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 08 03:53:10.546632 master-0 kubenswrapper[18592]: I0308 03:53:10.546559 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-qfnls" Mar 08 03:53:10.566737 master-0 kubenswrapper[18592]: I0308 03:53:10.566632 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-hbsn4" Mar 08 03:53:10.586683 master-0 kubenswrapper[18592]: I0308 03:53:10.586584 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 08 03:53:10.605974 master-0 kubenswrapper[18592]: I0308 03:53:10.605920 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 08 03:53:10.626317 master-0 kubenswrapper[18592]: I0308 03:53:10.626139 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 08 03:53:10.646622 master-0 kubenswrapper[18592]: I0308 03:53:10.646573 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-hvbwb" Mar 08 03:53:10.666465 master-0 kubenswrapper[18592]: I0308 03:53:10.666408 
18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 08 03:53:10.685672 master-0 kubenswrapper[18592]: I0308 03:53:10.685610 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-n8nt7" Mar 08 03:53:10.706008 master-0 kubenswrapper[18592]: I0308 03:53:10.705955 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 08 03:53:10.726664 master-0 kubenswrapper[18592]: I0308 03:53:10.726633 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 08 03:53:10.747495 master-0 kubenswrapper[18592]: I0308 03:53:10.747258 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 08 03:53:10.765933 master-0 kubenswrapper[18592]: I0308 03:53:10.765819 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"openshift-service-ca.crt" Mar 08 03:53:10.786372 master-0 kubenswrapper[18592]: I0308 03:53:10.786272 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-dockercfg-rnqsm" Mar 08 03:53:10.790061 master-0 kubenswrapper[18592]: I0308 03:53:10.790010 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b70adfe9-94f1-44bc-85ce-498e5f0a1ca7-images\") pod \"machine-api-operator-84bf6db4f9-tdrf8\" (UID: \"b70adfe9-94f1-44bc-85ce-498e5f0a1ca7\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-tdrf8" Mar 08 03:53:10.790203 master-0 kubenswrapper[18592]: I0308 03:53:10.790176 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/139881ee-6cfa-4a7e-b002-63cece048d16-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-6686554ddc-7bcsk\" (UID: \"139881ee-6cfa-4a7e-b002-63cece048d16\") " pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-7bcsk" Mar 08 03:53:10.790570 master-0 kubenswrapper[18592]: I0308 03:53:10.790519 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/33ed331b-89e9-45f8-ab3c-4533a77cc7b6-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-f89bv\" (UID: \"33ed331b-89e9-45f8-ab3c-4533a77cc7b6\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-f89bv" Mar 08 03:53:10.790772 master-0 kubenswrapper[18592]: I0308 03:53:10.790743 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6c4695c-da78-46b6-8f92-ca93c5ebb96b-config\") pod \"controller-manager-6999cc9685-kprrt\" (UID: \"a6c4695c-da78-46b6-8f92-ca93c5ebb96b\") " pod="openshift-controller-manager/controller-manager-6999cc9685-kprrt" Mar 08 03:53:10.791011 master-0 kubenswrapper[18592]: I0308 03:53:10.790981 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2262647b-c315-477a-93bd-f168c1810475-serving-cert\") pod \"cluster-version-operator-8c9c967c7-zq9rp\" (UID: \"2262647b-c315-477a-93bd-f168c1810475\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-zq9rp" Mar 08 03:53:10.791248 master-0 kubenswrapper[18592]: I0308 03:53:10.791221 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/127c3f92-8283-4179-9e40-a12dcabaaa12-config\") pod 
\"machine-approver-754bdc9f9d-z4sdd\" (UID: \"127c3f92-8283-4179-9e40-a12dcabaaa12\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-z4sdd" Mar 08 03:53:10.791411 master-0 kubenswrapper[18592]: I0308 03:53:10.791375 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2262647b-c315-477a-93bd-f168c1810475-serving-cert\") pod \"cluster-version-operator-8c9c967c7-zq9rp\" (UID: \"2262647b-c315-477a-93bd-f168c1810475\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-zq9rp" Mar 08 03:53:10.791537 master-0 kubenswrapper[18592]: I0308 03:53:10.790665 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/139881ee-6cfa-4a7e-b002-63cece048d16-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-6686554ddc-7bcsk\" (UID: \"139881ee-6cfa-4a7e-b002-63cece048d16\") " pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-7bcsk" Mar 08 03:53:10.791791 master-0 kubenswrapper[18592]: I0308 03:53:10.791762 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/b70adfe9-94f1-44bc-85ce-498e5f0a1ca7-machine-api-operator-tls\") pod \"machine-api-operator-84bf6db4f9-tdrf8\" (UID: \"b70adfe9-94f1-44bc-85ce-498e5f0a1ca7\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-tdrf8" Mar 08 03:53:10.792468 master-0 kubenswrapper[18592]: I0308 03:53:10.792233 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1b700d17-83d2-46c8-afbc-e5774822eabe-auth-proxy-config\") pod \"cluster-autoscaler-operator-69576476f7-bv67b\" (UID: \"1b700d17-83d2-46c8-afbc-e5774822eabe\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-bv67b" Mar 08 
03:53:10.793457 master-0 kubenswrapper[18592]: I0308 03:53:10.792315 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6c4695c-da78-46b6-8f92-ca93c5ebb96b-config\") pod \"controller-manager-6999cc9685-kprrt\" (UID: \"a6c4695c-da78-46b6-8f92-ca93c5ebb96b\") " pod="openshift-controller-manager/controller-manager-6999cc9685-kprrt" Mar 08 03:53:10.793588 master-0 kubenswrapper[18592]: I0308 03:53:10.793021 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f7d2cef-b17b-43ba-a222-9e6e8d8352e2-serving-cert\") pod \"route-controller-manager-5cb98fbc8c-xnx9t\" (UID: \"3f7d2cef-b17b-43ba-a222-9e6e8d8352e2\") " pod="openshift-route-controller-manager/route-controller-manager-5cb98fbc8c-xnx9t" Mar 08 03:53:10.793655 master-0 kubenswrapper[18592]: I0308 03:53:10.793600 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a6c4695c-da78-46b6-8f92-ca93c5ebb96b-serving-cert\") pod \"controller-manager-6999cc9685-kprrt\" (UID: \"a6c4695c-da78-46b6-8f92-ca93c5ebb96b\") " pod="openshift-controller-manager/controller-manager-6999cc9685-kprrt" Mar 08 03:53:10.793724 master-0 kubenswrapper[18592]: I0308 03:53:10.793668 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a6c4695c-da78-46b6-8f92-ca93c5ebb96b-proxy-ca-bundles\") pod \"controller-manager-6999cc9685-kprrt\" (UID: \"a6c4695c-da78-46b6-8f92-ca93c5ebb96b\") " pod="openshift-controller-manager/controller-manager-6999cc9685-kprrt" Mar 08 03:53:10.793786 master-0 kubenswrapper[18592]: I0308 03:53:10.793729 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a6c4695c-da78-46b6-8f92-ca93c5ebb96b-client-ca\") pod 
\"controller-manager-6999cc9685-kprrt\" (UID: \"a6c4695c-da78-46b6-8f92-ca93c5ebb96b\") " pod="openshift-controller-manager/controller-manager-6999cc9685-kprrt" Mar 08 03:53:10.793964 master-0 kubenswrapper[18592]: I0308 03:53:10.793925 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/33ed331b-89e9-45f8-ab3c-4533a77cc7b6-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-f89bv\" (UID: \"33ed331b-89e9-45f8-ab3c-4533a77cc7b6\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-f89bv" Mar 08 03:53:10.793964 master-0 kubenswrapper[18592]: I0308 03:53:10.793424 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f7d2cef-b17b-43ba-a222-9e6e8d8352e2-serving-cert\") pod \"route-controller-manager-5cb98fbc8c-xnx9t\" (UID: \"3f7d2cef-b17b-43ba-a222-9e6e8d8352e2\") " pod="openshift-route-controller-manager/route-controller-manager-5cb98fbc8c-xnx9t" Mar 08 03:53:10.794588 master-0 kubenswrapper[18592]: I0308 03:53:10.794544 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/c0861ccd-5e86-4277-9082-95f3133508a0-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-55d85b7b47-5v6gs\" (UID: \"c0861ccd-5e86-4277-9082-95f3133508a0\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-5v6gs" Mar 08 03:53:10.794588 master-0 kubenswrapper[18592]: I0308 03:53:10.794577 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a6c4695c-da78-46b6-8f92-ca93c5ebb96b-client-ca\") pod \"controller-manager-6999cc9685-kprrt\" (UID: \"a6c4695c-da78-46b6-8f92-ca93c5ebb96b\") " 
pod="openshift-controller-manager/controller-manager-6999cc9685-kprrt" Mar 08 03:53:10.794740 master-0 kubenswrapper[18592]: I0308 03:53:10.794633 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2262647b-c315-477a-93bd-f168c1810475-service-ca\") pod \"cluster-version-operator-8c9c967c7-zq9rp\" (UID: \"2262647b-c315-477a-93bd-f168c1810475\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-zq9rp" Mar 08 03:53:10.794740 master-0 kubenswrapper[18592]: I0308 03:53:10.794674 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/127c3f92-8283-4179-9e40-a12dcabaaa12-machine-approver-tls\") pod \"machine-approver-754bdc9f9d-z4sdd\" (UID: \"127c3f92-8283-4179-9e40-a12dcabaaa12\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-z4sdd" Mar 08 03:53:10.794918 master-0 kubenswrapper[18592]: I0308 03:53:10.794861 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/127c3f92-8283-4179-9e40-a12dcabaaa12-auth-proxy-config\") pod \"machine-approver-754bdc9f9d-z4sdd\" (UID: \"127c3f92-8283-4179-9e40-a12dcabaaa12\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-z4sdd" Mar 08 03:53:10.795045 master-0 kubenswrapper[18592]: I0308 03:53:10.794957 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c0861ccd-5e86-4277-9082-95f3133508a0-cco-trusted-ca\") pod \"cloud-credential-operator-55d85b7b47-5v6gs\" (UID: \"c0861ccd-5e86-4277-9082-95f3133508a0\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-5v6gs" Mar 08 03:53:10.795232 master-0 kubenswrapper[18592]: I0308 03:53:10.795196 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a6c4695c-da78-46b6-8f92-ca93c5ebb96b-proxy-ca-bundles\") pod \"controller-manager-6999cc9685-kprrt\" (UID: \"a6c4695c-da78-46b6-8f92-ca93c5ebb96b\") " pod="openshift-controller-manager/controller-manager-6999cc9685-kprrt" Mar 08 03:53:10.795376 master-0 kubenswrapper[18592]: I0308 03:53:10.795199 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2262647b-c315-477a-93bd-f168c1810475-service-ca\") pod \"cluster-version-operator-8c9c967c7-zq9rp\" (UID: \"2262647b-c315-477a-93bd-f168c1810475\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-zq9rp" Mar 08 03:53:10.795524 master-0 kubenswrapper[18592]: I0308 03:53:10.795269 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1b700d17-83d2-46c8-afbc-e5774822eabe-cert\") pod \"cluster-autoscaler-operator-69576476f7-bv67b\" (UID: \"1b700d17-83d2-46c8-afbc-e5774822eabe\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-bv67b" Mar 08 03:53:10.795813 master-0 kubenswrapper[18592]: I0308 03:53:10.795782 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/634c0f6d-bce6-42cf-9253-80d1bcc7c507-samples-operator-tls\") pod \"cluster-samples-operator-664cb58b85-lrnks\" (UID: \"634c0f6d-bce6-42cf-9253-80d1bcc7c507\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-lrnks" Mar 08 03:53:10.796063 master-0 kubenswrapper[18592]: I0308 03:53:10.796026 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3f7d2cef-b17b-43ba-a222-9e6e8d8352e2-client-ca\") pod \"route-controller-manager-5cb98fbc8c-xnx9t\" (UID: \"3f7d2cef-b17b-43ba-a222-9e6e8d8352e2\") " 
pod="openshift-route-controller-manager/route-controller-manager-5cb98fbc8c-xnx9t" Mar 08 03:53:10.796225 master-0 kubenswrapper[18592]: I0308 03:53:10.796181 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/634c0f6d-bce6-42cf-9253-80d1bcc7c507-samples-operator-tls\") pod \"cluster-samples-operator-664cb58b85-lrnks\" (UID: \"634c0f6d-bce6-42cf-9253-80d1bcc7c507\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-lrnks" Mar 08 03:53:10.796400 master-0 kubenswrapper[18592]: I0308 03:53:10.796372 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f7d2cef-b17b-43ba-a222-9e6e8d8352e2-config\") pod \"route-controller-manager-5cb98fbc8c-xnx9t\" (UID: \"3f7d2cef-b17b-43ba-a222-9e6e8d8352e2\") " pod="openshift-route-controller-manager/route-controller-manager-5cb98fbc8c-xnx9t" Mar 08 03:53:10.796628 master-0 kubenswrapper[18592]: I0308 03:53:10.796601 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/33ed331b-89e9-45f8-ab3c-4533a77cc7b6-images\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-f89bv\" (UID: \"33ed331b-89e9-45f8-ab3c-4533a77cc7b6\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-f89bv" Mar 08 03:53:10.796794 master-0 kubenswrapper[18592]: I0308 03:53:10.796756 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f7d2cef-b17b-43ba-a222-9e6e8d8352e2-config\") pod \"route-controller-manager-5cb98fbc8c-xnx9t\" (UID: \"3f7d2cef-b17b-43ba-a222-9e6e8d8352e2\") " pod="openshift-route-controller-manager/route-controller-manager-5cb98fbc8c-xnx9t" Mar 08 03:53:10.796904 master-0 kubenswrapper[18592]: I0308 03:53:10.796378 18592 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3f7d2cef-b17b-43ba-a222-9e6e8d8352e2-client-ca\") pod \"route-controller-manager-5cb98fbc8c-xnx9t\" (UID: \"3f7d2cef-b17b-43ba-a222-9e6e8d8352e2\") " pod="openshift-route-controller-manager/route-controller-manager-5cb98fbc8c-xnx9t" Mar 08 03:53:10.797011 master-0 kubenswrapper[18592]: I0308 03:53:10.796982 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b70adfe9-94f1-44bc-85ce-498e5f0a1ca7-config\") pod \"machine-api-operator-84bf6db4f9-tdrf8\" (UID: \"b70adfe9-94f1-44bc-85ce-498e5f0a1ca7\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-tdrf8" Mar 08 03:53:10.797285 master-0 kubenswrapper[18592]: I0308 03:53:10.797242 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a6c4695c-da78-46b6-8f92-ca93c5ebb96b-serving-cert\") pod \"controller-manager-6999cc9685-kprrt\" (UID: \"a6c4695c-da78-46b6-8f92-ca93c5ebb96b\") " pod="openshift-controller-manager/controller-manager-6999cc9685-kprrt" Mar 08 03:53:10.805749 master-0 kubenswrapper[18592]: I0308 03:53:10.805696 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-serving-cert" Mar 08 03:53:10.806782 master-0 kubenswrapper[18592]: I0308 03:53:10.806730 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:53:10.810561 master-0 kubenswrapper[18592]: I0308 03:53:10.810511 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-master-0" Mar 08 03:53:10.816059 master-0 kubenswrapper[18592]: I0308 03:53:10.815992 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-credential-operator-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/c0861ccd-5e86-4277-9082-95f3133508a0-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-55d85b7b47-5v6gs\" (UID: \"c0861ccd-5e86-4277-9082-95f3133508a0\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-5v6gs" Mar 08 03:53:10.825719 master-0 kubenswrapper[18592]: I0308 03:53:10.825669 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"cco-trusted-ca" Mar 08 03:53:10.826438 master-0 kubenswrapper[18592]: I0308 03:53:10.826382 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c0861ccd-5e86-4277-9082-95f3133508a0-cco-trusted-ca\") pod \"cloud-credential-operator-55d85b7b47-5v6gs\" (UID: \"c0861ccd-5e86-4277-9082-95f3133508a0\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-5v6gs" Mar 08 03:53:10.847658 master-0 kubenswrapper[18592]: I0308 03:53:10.847552 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"kube-root-ca.crt" Mar 08 03:53:10.866266 master-0 kubenswrapper[18592]: I0308 03:53:10.866213 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 08 03:53:10.871353 master-0 kubenswrapper[18592]: I0308 03:53:10.871289 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b70adfe9-94f1-44bc-85ce-498e5f0a1ca7-images\") pod \"machine-api-operator-84bf6db4f9-tdrf8\" (UID: \"b70adfe9-94f1-44bc-85ce-498e5f0a1ca7\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-tdrf8" Mar 08 03:53:10.885946 master-0 kubenswrapper[18592]: I0308 03:53:10.885761 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 08 03:53:10.892327 master-0 
kubenswrapper[18592]: I0308 03:53:10.892267 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/b70adfe9-94f1-44bc-85ce-498e5f0a1ca7-machine-api-operator-tls\") pod \"machine-api-operator-84bf6db4f9-tdrf8\" (UID: \"b70adfe9-94f1-44bc-85ce-498e5f0a1ca7\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-tdrf8"
Mar 08 03:53:10.906119 master-0 kubenswrapper[18592]: I0308 03:53:10.906068 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy-cluster-autoscaler-operator"
Mar 08 03:53:10.914508 master-0 kubenswrapper[18592]: I0308 03:53:10.914466 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1b700d17-83d2-46c8-afbc-e5774822eabe-auth-proxy-config\") pod \"cluster-autoscaler-operator-69576476f7-bv67b\" (UID: \"1b700d17-83d2-46c8-afbc-e5774822eabe\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-bv67b"
Mar 08 03:53:10.926658 master-0 kubenswrapper[18592]: I0308 03:53:10.926598 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-slxzk"
Mar 08 03:53:10.946333 master-0 kubenswrapper[18592]: I0308 03:53:10.946283 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 08 03:53:10.947502 master-0 kubenswrapper[18592]: I0308 03:53:10.947466 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b70adfe9-94f1-44bc-85ce-498e5f0a1ca7-config\") pod \"machine-api-operator-84bf6db4f9-tdrf8\" (UID: \"b70adfe9-94f1-44bc-85ce-498e5f0a1ca7\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-tdrf8"
Mar 08 03:53:10.966457 master-0 kubenswrapper[18592]: I0308 03:53:10.966370 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-dockercfg-4rdlv"
Mar 08 03:53:10.986231 master-0 kubenswrapper[18592]: I0308 03:53:10.986139 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-cert"
Mar 08 03:53:10.996087 master-0 kubenswrapper[18592]: I0308 03:53:10.996029 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1b700d17-83d2-46c8-afbc-e5774822eabe-cert\") pod \"cluster-autoscaler-operator-69576476f7-bv67b\" (UID: \"1b700d17-83d2-46c8-afbc-e5774822eabe\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-bv67b"
Mar 08 03:53:11.006247 master-0 kubenswrapper[18592]: I0308 03:53:11.006174 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Mar 08 03:53:11.007067 master-0 kubenswrapper[18592]: I0308 03:53:11.006963 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/127c3f92-8283-4179-9e40-a12dcabaaa12-machine-approver-tls\") pod \"machine-approver-754bdc9f9d-z4sdd\" (UID: \"127c3f92-8283-4179-9e40-a12dcabaaa12\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-z4sdd"
Mar 08 03:53:11.026057 master-0 kubenswrapper[18592]: I0308 03:53:11.026008 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Mar 08 03:53:11.046341 master-0 kubenswrapper[18592]: I0308 03:53:11.046278 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 08 03:53:11.055584 master-0 kubenswrapper[18592]: I0308 03:53:11.055473 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/127c3f92-8283-4179-9e40-a12dcabaaa12-auth-proxy-config\") pod \"machine-approver-754bdc9f9d-z4sdd\" (UID: \"127c3f92-8283-4179-9e40-a12dcabaaa12\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-z4sdd"
Mar 08 03:53:11.066728 master-0 kubenswrapper[18592]: I0308 03:53:11.066646 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-92fqc"
Mar 08 03:53:11.087325 master-0 kubenswrapper[18592]: I0308 03:53:11.087259 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 08 03:53:11.092871 master-0 kubenswrapper[18592]: I0308 03:53:11.092782 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/127c3f92-8283-4179-9e40-a12dcabaaa12-config\") pod \"machine-approver-754bdc9f9d-z4sdd\" (UID: \"127c3f92-8283-4179-9e40-a12dcabaaa12\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-z4sdd"
Mar 08 03:53:11.106227 master-0 kubenswrapper[18592]: I0308 03:53:11.106162 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cluster-cloud-controller-manager-dockercfg-lnbcj"
Mar 08 03:53:11.124595 master-0 kubenswrapper[18592]: I0308 03:53:11.124528 18592 request.go:700] Waited for 1.999003237s due to client-side throttling, not priority and fairness, request: GET:https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-cloud-controller-manager-operator/configmaps?fieldSelector=metadata.name%3Dcloud-controller-manager-images&limit=500&resourceVersion=0
Mar 08 03:53:11.126297 master-0 kubenswrapper[18592]: I0308 03:53:11.126250 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images"
Mar 08 03:53:11.127620 master-0 kubenswrapper[18592]: I0308 03:53:11.127549 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/33ed331b-89e9-45f8-ab3c-4533a77cc7b6-images\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-f89bv\" (UID: \"33ed331b-89e9-45f8-ab3c-4533a77cc7b6\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-f89bv"
Mar 08 03:53:11.146144 master-0 kubenswrapper[18592]: I0308 03:53:11.145998 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt"
Mar 08 03:53:11.165491 master-0 kubenswrapper[18592]: I0308 03:53:11.165425 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls"
Mar 08 03:53:11.175717 master-0 kubenswrapper[18592]: I0308 03:53:11.175659 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/33ed331b-89e9-45f8-ab3c-4533a77cc7b6-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-f89bv\" (UID: \"33ed331b-89e9-45f8-ab3c-4533a77cc7b6\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-f89bv"
Mar 08 03:53:11.185936 master-0 kubenswrapper[18592]: I0308 03:53:11.185890 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt"
Mar 08 03:53:11.206513 master-0 kubenswrapper[18592]: I0308 03:53:11.206457 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy"
Mar 08 03:53:11.211294 master-0 kubenswrapper[18592]: I0308 03:53:11.211244 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/33ed331b-89e9-45f8-ab3c-4533a77cc7b6-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-f89bv\" (UID: \"33ed331b-89e9-45f8-ab3c-4533a77cc7b6\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-f89bv"
Mar 08 03:53:11.225520 master-0 kubenswrapper[18592]: I0308 03:53:11.225471 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Mar 08 03:53:11.271427 master-0 kubenswrapper[18592]: I0308 03:53:11.271337 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bd7d5\" (UniqueName: \"kubernetes.io/projected/a6c4695c-da78-46b6-8f92-ca93c5ebb96b-kube-api-access-bd7d5\") pod \"controller-manager-6999cc9685-kprrt\" (UID: \"a6c4695c-da78-46b6-8f92-ca93c5ebb96b\") " pod="openshift-controller-manager/controller-manager-6999cc9685-kprrt"
Mar 08 03:53:11.285307 master-0 kubenswrapper[18592]: I0308 03:53:11.285250 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hccv4\" (UniqueName: \"kubernetes.io/projected/0ebf1330-e044-4ff5-8b48-2d667e0c5625-kube-api-access-hccv4\") pod \"openshift-controller-manager-operator-8565d84698-kt66j\" (UID: \"0ebf1330-e044-4ff5-8b48-2d667e0c5625\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-kt66j"
Mar 08 03:53:11.299508 master-0 kubenswrapper[18592]: I0308 03:53:11.299466 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfz6w\" (UniqueName: \"kubernetes.io/projected/7b485db9-29b5-45a1-a4fb-b4264c6bf2d6-kube-api-access-nfz6w\") pod \"dns-default-4pjsn\" (UID: \"7b485db9-29b5-45a1-a4fb-b4264c6bf2d6\") " pod="openshift-dns/dns-default-4pjsn"
Mar 08 03:53:11.328054 master-0 kubenswrapper[18592]: I0308 03:53:11.328014 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mghmh\" (UniqueName: \"kubernetes.io/projected/0918ba32-8e55-48d0-8e50-027c0dcb4bbd-kube-api-access-mghmh\") pod \"openshift-config-operator-64488f9d78-vfgfp\" (UID: \"0918ba32-8e55-48d0-8e50-027c0dcb4bbd\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-vfgfp"
Mar 08 03:53:11.348372 master-0 kubenswrapper[18592]: I0308 03:53:11.348326 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tp98d\" (UniqueName: \"kubernetes.io/projected/6bee226a-2a66-4032-8aba-2c8b82abcb6a-kube-api-access-tp98d\") pod \"redhat-operators-rnz4w\" (UID: \"6bee226a-2a66-4032-8aba-2c8b82abcb6a\") " pod="openshift-marketplace/redhat-operators-rnz4w"
Mar 08 03:53:11.371629 master-0 kubenswrapper[18592]: I0308 03:53:11.371542 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnzt7\" (UniqueName: \"kubernetes.io/projected/2dd4279d-a1a9-450a-a061-9008cd1ea8e0-kube-api-access-pnzt7\") pod \"olm-operator-d64cfc9db-qddlp\" (UID: \"2dd4279d-a1a9-450a-a061-9008cd1ea8e0\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qddlp"
Mar 08 03:53:11.376867 master-0 kubenswrapper[18592]: I0308 03:53:11.376787 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f-bound-sa-token\") pod \"cluster-image-registry-operator-86d6d77c7c-572xh\" (UID: \"69eb8ba2-7bfb-4433-8951-08f89e7bcb5f\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-572xh"
Mar 08 03:53:11.406871 master-0 kubenswrapper[18592]: I0308 03:53:11.406698 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vndvf\" (UniqueName: \"kubernetes.io/projected/9ec89e27-4360-48f2-a7ca-5d823bda4510-kube-api-access-vndvf\") pod \"csi-snapshot-controller-7577d6f48-h4qlp\" (UID: \"9ec89e27-4360-48f2-a7ca-5d823bda4510\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-h4qlp"
Mar 08 03:53:11.426442 master-0 kubenswrapper[18592]: I0308 03:53:11.426376 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmpdd\" (UniqueName: \"kubernetes.io/projected/0418ff42-7eac-4266-97b5-4df88623d066-kube-api-access-kmpdd\") pod \"cluster-monitoring-operator-674cbfbd9d-clqwj\" (UID: \"0418ff42-7eac-4266-97b5-4df88623d066\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-clqwj"
Mar 08 03:53:11.447636 master-0 kubenswrapper[18592]: I0308 03:53:11.447550 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rchj5\" (UniqueName: \"kubernetes.io/projected/e78b283b-981e-48d7-a5f2-53f8401766ea-kube-api-access-rchj5\") pod \"machine-config-operator-fdb5c78b5-2vjh2\" (UID: \"e78b283b-981e-48d7-a5f2-53f8401766ea\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-2vjh2"
Mar 08 03:53:11.456540 master-0 kubenswrapper[18592]: I0308 03:53:11.456464 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dw7bx\" (UniqueName: \"kubernetes.io/projected/4a19441e-e61b-4d58-85db-813ae88e1f9b-kube-api-access-dw7bx\") pod \"multus-additional-cni-plugins-g564l\" (UID: \"4a19441e-e61b-4d58-85db-813ae88e1f9b\") " pod="openshift-multus/multus-additional-cni-plugins-g564l"
Mar 08 03:53:11.491289 master-0 kubenswrapper[18592]: I0308 03:53:11.491202 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtts2\" (UniqueName: \"kubernetes.io/projected/76ba45a2-8945-4afe-b913-126c26725867-kube-api-access-dtts2\") pod \"apiserver-7695b9f8b5-4jpgl\" (UID: \"76ba45a2-8945-4afe-b913-126c26725867\") " pod="openshift-oauth-apiserver/apiserver-7695b9f8b5-4jpgl"
Mar 08 03:53:11.492346 master-0 kubenswrapper[18592]: I0308 03:53:11.492290 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vfgfp"
Mar 08 03:53:11.493592 master-0 kubenswrapper[18592]: I0308 03:53:11.493403 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-2vjh2"
Mar 08 03:53:11.493741 master-0 kubenswrapper[18592]: I0308 03:53:11.493626 18592 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 08 03:53:11.493741 master-0 kubenswrapper[18592]: I0308 03:53:11.493666 18592 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 08 03:53:11.503224 master-0 kubenswrapper[18592]: I0308 03:53:11.503173 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-64488f9d78-vfgfp"
Mar 08 03:53:11.503669 master-0 kubenswrapper[18592]: I0308 03:53:11.503623 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vg9kg\" (UniqueName: \"kubernetes.io/projected/e187516f-8f33-4c17-81d6-60c10b580bb0-kube-api-access-vg9kg\") pod \"tuned-bpwdb\" (UID: \"e187516f-8f33-4c17-81d6-60c10b580bb0\") " pod="openshift-cluster-node-tuning-operator/tuned-bpwdb"
Mar 08 03:53:11.529623 master-0 kubenswrapper[18592]: I0308 03:53:11.529570 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-4pjsn"
Mar 08 03:53:11.530962 master-0 kubenswrapper[18592]: I0308 03:53:11.530900 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-4pjsn"
Mar 08 03:53:11.541530 master-0 kubenswrapper[18592]: I0308 03:53:11.540431 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwkwt\" (UniqueName: \"kubernetes.io/projected/d831cb23-7411-4072-8273-c167d9afca28-kube-api-access-dwkwt\") pod \"cluster-baremetal-operator-5cdb4c5598-jghp5\" (UID: \"d831cb23-7411-4072-8273-c167d9afca28\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-jghp5"
Mar 08 03:53:11.546717 master-0 kubenswrapper[18592]: I0308 03:53:11.546667 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fncng\" (UniqueName: \"kubernetes.io/projected/30211469-7108-4820-a988-26fc4ced734e-kube-api-access-fncng\") pod \"openshift-apiserver-operator-799b6db4d7-75682\" (UID: \"30211469-7108-4820-a988-26fc4ced734e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-75682"
Mar 08 03:53:11.561900 master-0 kubenswrapper[18592]: I0308 03:53:11.561810 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4v45k\" (UniqueName: \"kubernetes.io/projected/c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f-kube-api-access-4v45k\") pod \"apiserver-6b779d99b8-7kmck\" (UID: \"c01bb96f-07c6-4d6a-8ee6-0f3d07e0510f\") " pod="openshift-apiserver/apiserver-6b779d99b8-7kmck"
Mar 08 03:53:11.580948 master-0 kubenswrapper[18592]: I0308 03:53:11.580806 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkzb2\" (UniqueName: \"kubernetes.io/projected/4c5a0c1d-867a-4ce4-9570-ea66452c8db3-kube-api-access-mkzb2\") pod \"iptables-alerter-7c28p\" (UID: \"4c5a0c1d-867a-4ce4-9570-ea66452c8db3\") " pod="openshift-network-operator/iptables-alerter-7c28p"
Mar 08 03:53:11.605287 master-0 kubenswrapper[18592]: I0308 03:53:11.605228 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1cbcb403-a424-4496-8c5c-5eb5e42dfb93-kube-api-access\") pod \"kube-apiserver-operator-68bd585b-8gfmf\" (UID: \"1cbcb403-a424-4496-8c5c-5eb5e42dfb93\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-8gfmf"
Mar 08 03:53:11.620906 master-0 kubenswrapper[18592]: I0308 03:53:11.620803 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/26180f77-0b1a-4d0f-9ed0-a12fdee69817-kube-api-access\") pod \"kube-controller-manager-operator-86d7cdfdfb-chpl6\" (UID: \"26180f77-0b1a-4d0f-9ed0-a12fdee69817\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-chpl6"
Mar 08 03:53:11.702587 master-0 kubenswrapper[18592]: I0308 03:53:11.664238 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2h62\" (UniqueName: \"kubernetes.io/projected/1482d789-884b-4337-b598-f0e2b71eb9f2-kube-api-access-m2h62\") pod \"catalog-operator-7d9c49f57b-qlfgq\" (UID: \"1482d789-884b-4337-b598-f0e2b71eb9f2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-qlfgq"
Mar 08 03:53:11.717237 master-0 kubenswrapper[18592]: I0308 03:53:11.716766 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4stz\" (UniqueName: \"kubernetes.io/projected/164586b1-f133-4427-8ab6-eb0839b79738-kube-api-access-r4stz\") pod \"network-node-identity-ggzm8\" (UID: \"164586b1-f133-4427-8ab6-eb0839b79738\") " pod="openshift-network-node-identity/network-node-identity-ggzm8"
Mar 08 03:53:11.717237 master-0 kubenswrapper[18592]: I0308 03:53:11.716778 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lfcj\" (UniqueName: \"kubernetes.io/projected/54ad284e-d40e-4e69-b898-f5093952a0e6-kube-api-access-9lfcj\") pod \"marketplace-operator-64bf9778cb-9sw2d\" (UID: \"54ad284e-d40e-4e69-b898-f5093952a0e6\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-9sw2d"
Mar 08 03:53:11.717237 master-0 kubenswrapper[18592]: I0308 03:53:11.717182 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sx5s\" (UniqueName: \"kubernetes.io/projected/b3eea925-73b3-4693-8f0e-6dd26107f60a-kube-api-access-6sx5s\") pod \"cluster-storage-operator-6fbfc8dc8f-nm8fj\" (UID: \"b3eea925-73b3-4693-8f0e-6dd26107f60a\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-nm8fj"
Mar 08 03:53:11.718273 master-0 kubenswrapper[18592]: I0308 03:53:11.718232 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pb87l\" (UniqueName: \"kubernetes.io/projected/3f7d2cef-b17b-43ba-a222-9e6e8d8352e2-kube-api-access-pb87l\") pod \"route-controller-manager-5cb98fbc8c-xnx9t\" (UID: \"3f7d2cef-b17b-43ba-a222-9e6e8d8352e2\") " pod="openshift-route-controller-manager/route-controller-manager-5cb98fbc8c-xnx9t"
Mar 08 03:53:11.736810 master-0 kubenswrapper[18592]: I0308 03:53:11.736765 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fx4fw\" (UniqueName: \"kubernetes.io/projected/232c421d-96f0-4894-b8d8-74f43d02bbd3-kube-api-access-fx4fw\") pod \"cluster-node-tuning-operator-66c7586884-qjv52\" (UID: \"232c421d-96f0-4894-b8d8-74f43d02bbd3\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-qjv52"
Mar 08 03:53:11.758153 master-0 kubenswrapper[18592]: I0308 03:53:11.758090 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fzmf\" (UniqueName: \"kubernetes.io/projected/10c165f1-53b4-4e7e-9cd1-00bb4d9cbc79-kube-api-access-7fzmf\") pod \"packageserver-67b55db9c7-4qgpb\" (UID: \"10c165f1-53b4-4e7e-9cd1-00bb4d9cbc79\") " pod="openshift-operator-lifecycle-manager/packageserver-67b55db9c7-4qgpb"
Mar 08 03:53:11.780597 master-0 kubenswrapper[18592]: I0308 03:53:11.780535 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e4541b7b-3f7f-4851-9bd9-26fcda5cab13-kube-api-access\") pod \"openshift-kube-scheduler-operator-5c74bfc494-g6n58\" (UID: \"e4541b7b-3f7f-4851-9bd9-26fcda5cab13\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-g6n58"
Mar 08 03:53:11.796490 master-0 kubenswrapper[18592]: I0308 03:53:11.796438 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfz27\" (UniqueName: \"kubernetes.io/projected/1b69fbf6-1ca5-413e-bffd-965730bcec1b-kube-api-access-nfz27\") pod \"catalogd-controller-manager-7f8b8b6f4c-8h6fj\" (UID: \"1b69fbf6-1ca5-413e-bffd-965730bcec1b\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-8h6fj"
Mar 08 03:53:11.818634 master-0 kubenswrapper[18592]: I0308 03:53:11.818584 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c9de4939-680a-4e3e-89fd-e20ecb8b10f2-bound-sa-token\") pod \"ingress-operator-677db989d6-t77qr\" (UID: \"c9de4939-680a-4e3e-89fd-e20ecb8b10f2\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-t77qr"
Mar 08 03:53:11.836555 master-0 kubenswrapper[18592]: I0308 03:53:11.836474 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9z8g\" (UniqueName: \"kubernetes.io/projected/139881ee-6cfa-4a7e-b002-63cece048d16-kube-api-access-h9z8g\") pod \"control-plane-machine-set-operator-6686554ddc-7bcsk\" (UID: \"139881ee-6cfa-4a7e-b002-63cece048d16\") " pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-7bcsk"
Mar 08 03:53:11.861344 master-0 kubenswrapper[18592]: I0308 03:53:11.861295 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nlq2\" (UniqueName: \"kubernetes.io/projected/b70adfe9-94f1-44bc-85ce-498e5f0a1ca7-kube-api-access-6nlq2\") pod \"machine-api-operator-84bf6db4f9-tdrf8\" (UID: \"b70adfe9-94f1-44bc-85ce-498e5f0a1ca7\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-tdrf8"
Mar 08 03:53:11.891516 master-0 kubenswrapper[18592]: I0308 03:53:11.891364 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nk8r\" (UniqueName: \"kubernetes.io/projected/093f17f0-2818-4e24-b3c3-6ab4da9d21fb-kube-api-access-7nk8r\") pod \"multus-rpppb\" (UID: \"093f17f0-2818-4e24-b3c3-6ab4da9d21fb\") " pod="openshift-multus/multus-rpppb"
Mar 08 03:53:11.900501 master-0 kubenswrapper[18592]: I0308 03:53:11.900447 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpsx7\" (UniqueName: \"kubernetes.io/projected/8efdcef9-9b31-4567-b7f9-cb59a894273d-kube-api-access-cpsx7\") pod \"dns-operator-589895fbb7-xttlz\" (UID: \"8efdcef9-9b31-4567-b7f9-cb59a894273d\") " pod="openshift-dns-operator/dns-operator-589895fbb7-xttlz"
Mar 08 03:53:11.920300 master-0 kubenswrapper[18592]: I0308 03:53:11.920276 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhms8\" (UniqueName: \"kubernetes.io/projected/0031e3a9-b253-4dda-a890-bf3e4d8737e8-kube-api-access-qhms8\") pod \"certified-operators-p8nq8\" (UID: \"0031e3a9-b253-4dda-a890-bf3e4d8737e8\") " pod="openshift-marketplace/certified-operators-p8nq8"
Mar 08 03:53:11.937616 master-0 kubenswrapper[18592]: I0308 03:53:11.937585 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgc7c\" (UniqueName: \"kubernetes.io/projected/3ddfd0e7-fe76-41bc-b316-94505df81002-kube-api-access-bgc7c\") pod \"network-operator-7c649bf6d4-99d2k\" (UID: \"3ddfd0e7-fe76-41bc-b316-94505df81002\") " pod="openshift-network-operator/network-operator-7c649bf6d4-99d2k"
Mar 08 03:53:11.956889 master-0 kubenswrapper[18592]: I0308 03:53:11.956772 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6gml\" (UniqueName: \"kubernetes.io/projected/7e5935ea-8d95-45e3-b836-c7892953ef3d-kube-api-access-c6gml\") pod \"ovnkube-control-plane-66b55d57d-bjjfh\" (UID: \"7e5935ea-8d95-45e3-b836-c7892953ef3d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-bjjfh"
Mar 08 03:53:11.976618 master-0 kubenswrapper[18592]: I0308 03:53:11.976565 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2262647b-c315-477a-93bd-f168c1810475-kube-api-access\") pod \"cluster-version-operator-8c9c967c7-zq9rp\" (UID: \"2262647b-c315-477a-93bd-f168c1810475\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-zq9rp"
Mar 08 03:53:11.997098 master-0 kubenswrapper[18592]: I0308 03:53:11.997062 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcjr9\" (UniqueName: \"kubernetes.io/projected/49ec083d-dc74-457e-b10f-3bde04e9e75e-kube-api-access-zcjr9\") pod \"service-ca-84bfdbbb7f-gj69x\" (UID: \"49ec083d-dc74-457e-b10f-3bde04e9e75e\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-gj69x"
Mar 08 03:53:12.002633 master-0 kubenswrapper[18592]: I0308 03:53:12.002604 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 08 03:53:12.008131 master-0 kubenswrapper[18592]: I0308 03:53:12.008100 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 08 03:53:12.018659 master-0 kubenswrapper[18592]: I0308 03:53:12.018633 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxxhh\" (UniqueName: \"kubernetes.io/projected/ee586416-6f56-4ea4-ad62-95de1e6df23b-kube-api-access-sxxhh\") pod \"insights-operator-8f89dfddd-4mr6p\" (UID: \"ee586416-6f56-4ea4-ad62-95de1e6df23b\") " pod="openshift-insights/insights-operator-8f89dfddd-4mr6p"
Mar 08 03:53:12.035083 master-0 kubenswrapper[18592]: I0308 03:53:12.035044 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-fdb5c78b5-2vjh2"]
Mar 08 03:53:12.056137 master-0 kubenswrapper[18592]: I0308 03:53:12.056101 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfqc5\" (UniqueName: \"kubernetes.io/projected/7ff63c73-62a3-44b4-acd3-1b3df175794f-kube-api-access-vfqc5\") pod \"cluster-olm-operator-77899cf6d-x9h9q\" (UID: \"7ff63c73-62a3-44b4-acd3-1b3df175794f\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-x9h9q"
Mar 08 03:53:12.060208 master-0 kubenswrapper[18592]: I0308 03:53:12.060147 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cwmn\" (UniqueName: \"kubernetes.io/projected/634c0f6d-bce6-42cf-9253-80d1bcc7c507-kube-api-access-8cwmn\") pod \"cluster-samples-operator-664cb58b85-lrnks\" (UID: \"634c0f6d-bce6-42cf-9253-80d1bcc7c507\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-lrnks"
Mar 08 03:53:12.082654 master-0 kubenswrapper[18592]: I0308 03:53:12.082613 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fw7mr\" (UniqueName: \"kubernetes.io/projected/69eb8ba2-7bfb-4433-8951-08f89e7bcb5f-kube-api-access-fw7mr\") pod \"cluster-image-registry-operator-86d6d77c7c-572xh\" (UID: \"69eb8ba2-7bfb-4433-8951-08f89e7bcb5f\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-572xh"
Mar 08 03:53:12.108320 master-0 kubenswrapper[18592]: I0308 03:53:12.108284 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7rsc\" (UniqueName: \"kubernetes.io/projected/c0861ccd-5e86-4277-9082-95f3133508a0-kube-api-access-n7rsc\") pod \"cloud-credential-operator-55d85b7b47-5v6gs\" (UID: \"c0861ccd-5e86-4277-9082-95f3133508a0\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-5v6gs"
Mar 08 03:53:12.120514 master-0 kubenswrapper[18592]: I0308 03:53:12.120480 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmsj5\" (UniqueName: \"kubernetes.io/projected/33ed331b-89e9-45f8-ab3c-4533a77cc7b6-kube-api-access-hmsj5\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-f89bv\" (UID: \"33ed331b-89e9-45f8-ab3c-4533a77cc7b6\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-f89bv"
Mar 08 03:53:12.124689 master-0 kubenswrapper[18592]: I0308 03:53:12.124649 18592 request.go:700] Waited for 2.876549225s due to client-side throttling, not priority and fairness, request: POST:https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-ovn-kubernetes/serviceaccounts/ovn-kubernetes-node/token
Mar 08 03:53:12.138187 master-0 kubenswrapper[18592]: I0308 03:53:12.138142 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vklx\" (UniqueName: \"kubernetes.io/projected/f0f5f3f3-0856-4da3-9157-15f65c6aba6e-kube-api-access-2vklx\") pod \"ovnkube-node-jc6rf\" (UID: \"f0f5f3f3-0856-4da3-9157-15f65c6aba6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf"
Mar 08 03:53:12.171203 master-0 kubenswrapper[18592]: I0308 03:53:12.171132 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxkm6\" (UniqueName: \"kubernetes.io/projected/a60bc804-52e7-422a-87fd-ac4c5aa90cb3-kube-api-access-zxkm6\") pod \"authentication-operator-7c6989d6c4-slm72\" (UID: \"a60bc804-52e7-422a-87fd-ac4c5aa90cb3\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-slm72"
Mar 08 03:53:12.179034 master-0 kubenswrapper[18592]: I0308 03:53:12.178988 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfgc6\" (UniqueName: \"kubernetes.io/projected/2b3d1dc7-22f9-4c0c-802a-d7314894b255-kube-api-access-zfgc6\") pod \"community-operators-lwt58\" (UID: \"2b3d1dc7-22f9-4c0c-802a-d7314894b255\") " pod="openshift-marketplace/community-operators-lwt58"
Mar 08 03:53:12.210816 master-0 kubenswrapper[18592]: I0308 03:53:12.210763 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cv7sd\" (UniqueName: \"kubernetes.io/projected/1b700d17-83d2-46c8-afbc-e5774822eabe-kube-api-access-cv7sd\") pod \"cluster-autoscaler-operator-69576476f7-bv67b\" (UID: \"1b700d17-83d2-46c8-afbc-e5774822eabe\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-bv67b"
Mar 08 03:53:12.223660 master-0 kubenswrapper[18592]: I0308 03:53:12.223576 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zm59c\" (UniqueName: \"kubernetes.io/projected/8e283f49-b85d-4789-a71f-3fcb5033cdf0-kube-api-access-zm59c\") pod \"redhat-marketplace-4h8qm\" (UID: \"8e283f49-b85d-4789-a71f-3fcb5033cdf0\") " pod="openshift-marketplace/redhat-marketplace-4h8qm"
Mar 08 03:53:12.241359 master-0 kubenswrapper[18592]: I0308 03:53:12.241266 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jljzc\" (UniqueName: \"kubernetes.io/projected/d2cd5b23-e622-4b96-aee8-dbc942b73b4a-kube-api-access-jljzc\") pod \"node-resolver-wjl9v\" (UID: \"d2cd5b23-e622-4b96-aee8-dbc942b73b4a\") " pod="openshift-dns/node-resolver-wjl9v"
Mar 08 03:53:12.263511 master-0 kubenswrapper[18592]: I0308 03:53:12.263461 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdn9r\" (UniqueName: \"kubernetes.io/projected/127c3f92-8283-4179-9e40-a12dcabaaa12-kube-api-access-zdn9r\") pod \"machine-approver-754bdc9f9d-z4sdd\" (UID: \"127c3f92-8283-4179-9e40-a12dcabaaa12\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-z4sdd"
Mar 08 03:53:12.278574 master-0 kubenswrapper[18592]: I0308 03:53:12.278525 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kc5q\" (UniqueName: \"kubernetes.io/projected/e93b5361-30e6-44fd-a59e-2bc410c59480-kube-api-access-4kc5q\") pod \"network-check-target-xmgpj\" (UID: \"e93b5361-30e6-44fd-a59e-2bc410c59480\") " pod="openshift-network-diagnostics/network-check-target-xmgpj"
Mar 08 03:53:12.304015 master-0 kubenswrapper[18592]: I0308 03:53:12.303668 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqhzl\" (UniqueName: \"kubernetes.io/projected/1eb851be-f157-48ea-9a39-1361b68d2639-kube-api-access-nqhzl\") pod \"multus-admission-controller-8d675b596-j8pv6\" (UID: \"1eb851be-f157-48ea-9a39-1361b68d2639\") " pod="openshift-multus/multus-admission-controller-8d675b596-j8pv6"
Mar 08 03:53:12.325757 master-0 kubenswrapper[18592]: I0308 03:53:12.325652 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29dpg\" (UniqueName: \"kubernetes.io/projected/c9de4939-680a-4e3e-89fd-e20ecb8b10f2-kube-api-access-29dpg\") pod \"ingress-operator-677db989d6-t77qr\" (UID: \"c9de4939-680a-4e3e-89fd-e20ecb8b10f2\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-t77qr"
Mar 08 03:53:12.351986 master-0 kubenswrapper[18592]: I0308 03:53:12.351892 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfvnn\" (UniqueName: \"kubernetes.io/projected/5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a-kube-api-access-cfvnn\") pod \"kube-storage-version-migrator-operator-7f65c457f5-6fhhs\" (UID: \"5953ccfc-d0f9-4e24-bf59-bfa85a2b9e4a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-6fhhs"
Mar 08 03:53:12.380166 master-0 kubenswrapper[18592]: I0308 03:53:12.375399 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x997v\" (UniqueName: \"kubernetes.io/projected/6cde5024-edf7-4fa4-8964-cabe7899578b-kube-api-access-x997v\") pod \"package-server-manager-854648ff6d-c46zz\" (UID: \"6cde5024-edf7-4fa4-8964-cabe7899578b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-c46zz"
Mar 08 03:53:12.391487 master-0 kubenswrapper[18592]: I0308 03:53:12.391364 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8b5zb\" (UniqueName: \"kubernetes.io/projected/5a7752f9-7b9a-451f-997a-e9f696d38b34-kube-api-access-8b5zb\") pod \"etcd-operator-5884b9cd56-vzms7\" (UID: \"5a7752f9-7b9a-451f-997a-e9f696d38b34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-vzms7"
Mar 08 03:53:12.403278 master-0 kubenswrapper[18592]: I0308 03:53:12.403231 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qn5v\" (UniqueName: \"kubernetes.io/projected/0d377285-0336-41b7-b48f-c44a7b563498-kube-api-access-7qn5v\") pod \"service-ca-operator-69b6fc6b88-kg795\" (UID: \"0d377285-0336-41b7-b48f-c44a7b563498\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-kg795"
Mar 08 03:53:12.420399 master-0 kubenswrapper[18592]: I0308 03:53:12.420343 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7smmf\" (UniqueName: \"kubernetes.io/projected/84d353ae-3992-4c17-a20e-3415edd92509-kube-api-access-7smmf\") pod \"migrator-57ccdf9b5-wqldq\" (UID: \"84d353ae-3992-4c17-a20e-3415edd92509\") " pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-wqldq"
Mar 08 03:53:12.446396 master-0 kubenswrapper[18592]: I0308 03:53:12.446340 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bm7bw\" (UniqueName: \"kubernetes.io/projected/2f59fe81-deee-4ced-ae9d-f17752c82c4b-kube-api-access-bm7bw\") pod \"operator-controller-controller-manager-6598bfb6c4-75qmb\" (UID: \"2f59fe81-deee-4ced-ae9d-f17752c82c4b\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-75qmb"
Mar 08 03:53:12.460589 master-0 kubenswrapper[18592]: I0308 03:53:12.460508 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmhtb\" (UniqueName: \"kubernetes.io/projected/d5044ffd-0686-4679-9894-e696faf33699-kube-api-access-mmhtb\") pod \"network-metrics-daemon-schjl\" (UID: \"d5044ffd-0686-4679-9894-e696faf33699\") " pod="openshift-multus/network-metrics-daemon-schjl"
Mar 08 03:53:12.476322 master-0 kubenswrapper[18592]: I0308 03:53:12.476291 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dd549\" (UniqueName: \"kubernetes.io/projected/52b495ac-bb28-44f3-b925-3c54f86d5ec4-kube-api-access-dd549\") pod \"csi-snapshot-controller-operator-5685fbc7d-xhbrl\" (UID: \"52b495ac-bb28-44f3-b925-3c54f86d5ec4\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-xhbrl"
Mar 08 03:53:12.497507 master-0 kubenswrapper[18592]: E0308 03:53:12.497457 18592 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 08 03:53:12.497507 master-0 kubenswrapper[18592]: E0308 03:53:12.497500 18592 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-1-retry-1-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 08 03:53:12.497631 master-0 kubenswrapper[18592]: E0308 03:53:12.497580 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9c95709c-c3cb-46fb-afe7-626c8013f3c6-kube-api-access podName:9c95709c-c3cb-46fb-afe7-626c8013f3c6 nodeName:}" failed. No retries permitted until 2026-03-08 03:53:12.997557632 +0000 UTC m=+5.096311982 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/9c95709c-c3cb-46fb-afe7-626c8013f3c6-kube-api-access") pod "installer-1-retry-1-master-0" (UID: "9c95709c-c3cb-46fb-afe7-626c8013f3c6") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 08 03:53:12.499444 master-0 kubenswrapper[18592]: I0308 03:53:12.499406 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-2vjh2" event={"ID":"e78b283b-981e-48d7-a5f2-53f8401766ea","Type":"ContainerStarted","Data":"681e71b6a629264d3ef580ed43d956b818fe75c6272b5ca29255c2efce510702"}
Mar 08 03:53:12.499444 master-0 kubenswrapper[18592]: I0308 03:53:12.499445 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-2vjh2" event={"ID":"e78b283b-981e-48d7-a5f2-53f8401766ea","Type":"ContainerStarted","Data":"a9cb63d374e634553343804bbe1694fd59777432c1442ac27146224b4af9f933"}
Mar 08 03:53:12.499563 master-0 kubenswrapper[18592]: I0308 03:53:12.499459 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-2vjh2" event={"ID":"e78b283b-981e-48d7-a5f2-53f8401766ea","Type":"ContainerStarted","Data":"51929a4039df8b43629777c0cb6bbf00b061c71b13837a786f61b14553dfedfa"}
Mar 08 03:53:12.505516 master-0 kubenswrapper[18592]: I0308 03:53:12.505494 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 08 03:53:12.577086 master-0 kubenswrapper[18592]: I0308 03:53:12.577028 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 08 03:53:12.577336 master-0 kubenswrapper[18592]: I0308 03:53:12.577169 18592 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 08
03:53:12.577336 master-0 kubenswrapper[18592]: I0308 03:53:12.577180 18592 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 08 03:53:12.581663 master-0 kubenswrapper[18592]: I0308 03:53:12.581582 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:53:13.032147 master-0 kubenswrapper[18592]: I0308 03:53:13.032090 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9c95709c-c3cb-46fb-afe7-626c8013f3c6-kube-api-access\") pod \"installer-1-retry-1-master-0\" (UID: \"9c95709c-c3cb-46fb-afe7-626c8013f3c6\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 08 03:53:13.032736 master-0 kubenswrapper[18592]: E0308 03:53:13.032249 18592 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 08 03:53:13.032736 master-0 kubenswrapper[18592]: E0308 03:53:13.032266 18592 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-1-retry-1-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 08 03:53:13.032736 master-0 kubenswrapper[18592]: E0308 03:53:13.032310 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9c95709c-c3cb-46fb-afe7-626c8013f3c6-kube-api-access podName:9c95709c-c3cb-46fb-afe7-626c8013f3c6 nodeName:}" failed. No retries permitted until 2026-03-08 03:53:14.032296491 +0000 UTC m=+6.131050841 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/9c95709c-c3cb-46fb-afe7-626c8013f3c6-kube-api-access") pod "installer-1-retry-1-master-0" (UID: "9c95709c-c3cb-46fb-afe7-626c8013f3c6") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 08 03:53:13.059779 master-0 kubenswrapper[18592]: I0308 03:53:13.059703 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7695b9f8b5-4jpgl" Mar 08 03:53:13.066462 master-0 kubenswrapper[18592]: I0308 03:53:13.066411 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7695b9f8b5-4jpgl" Mar 08 03:53:13.390063 master-0 kubenswrapper[18592]: I0308 03:53:13.389917 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5cb98fbc8c-xnx9t" Mar 08 03:53:13.396965 master-0 kubenswrapper[18592]: I0308 03:53:13.396902 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5cb98fbc8c-xnx9t" Mar 08 03:53:13.438127 master-0 kubenswrapper[18592]: I0308 03:53:13.438047 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podStartSLOduration=15.438024023 podStartE2EDuration="15.438024023s" podCreationTimestamp="2026-03-08 03:52:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:53:13.436889327 +0000 UTC m=+5.535643707" watchObservedRunningTime="2026-03-08 03:53:13.438024023 +0000 UTC m=+5.536778413" Mar 08 03:53:13.438411 master-0 kubenswrapper[18592]: I0308 03:53:13.438350 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-master-0" 
podStartSLOduration=5.43834123 podStartE2EDuration="5.43834123s" podCreationTimestamp="2026-03-08 03:53:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:53:13.386749587 +0000 UTC m=+5.485503977" watchObservedRunningTime="2026-03-08 03:53:13.43834123 +0000 UTC m=+5.537095620" Mar 08 03:53:13.474612 master-0 kubenswrapper[18592]: I0308 03:53:13.474541 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:53:13.509448 master-0 kubenswrapper[18592]: I0308 03:53:13.509378 18592 generic.go:334] "Generic (PLEG): container finished" podID="ee586416-6f56-4ea4-ad62-95de1e6df23b" containerID="ee2bfb125e22b7f5901652b9d324e5701d25b6ae22870a7e30683877ccc3b4cb" exitCode=0 Mar 08 03:53:13.510072 master-0 kubenswrapper[18592]: I0308 03:53:13.509965 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-8f89dfddd-4mr6p" event={"ID":"ee586416-6f56-4ea4-ad62-95de1e6df23b","Type":"ContainerDied","Data":"ee2bfb125e22b7f5901652b9d324e5701d25b6ae22870a7e30683877ccc3b4cb"} Mar 08 03:53:13.510295 master-0 kubenswrapper[18592]: I0308 03:53:13.510255 18592 scope.go:117] "RemoveContainer" containerID="644f0c7d4552f15957ecfc56f2d37a06ec2757ddcc7c2c371f0c34b92aa63533" Mar 08 03:53:13.524527 master-0 kubenswrapper[18592]: I0308 03:53:13.522953 18592 scope.go:117] "RemoveContainer" containerID="ee2bfb125e22b7f5901652b9d324e5701d25b6ae22870a7e30683877ccc3b4cb" Mar 08 03:53:13.539496 master-0 kubenswrapper[18592]: I0308 03:53:13.539404 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rnz4w" Mar 08 03:53:13.617186 master-0 kubenswrapper[18592]: I0308 03:53:13.617133 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rnz4w" Mar 08 03:53:13.909708 
master-0 kubenswrapper[18592]: I0308 03:53:13.909593 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-master-0" Mar 08 03:53:13.922129 master-0 kubenswrapper[18592]: I0308 03:53:13.922072 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-master-0" Mar 08 03:53:14.049122 master-0 kubenswrapper[18592]: I0308 03:53:14.049058 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9c95709c-c3cb-46fb-afe7-626c8013f3c6-kube-api-access\") pod \"installer-1-retry-1-master-0\" (UID: \"9c95709c-c3cb-46fb-afe7-626c8013f3c6\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 08 03:53:14.049886 master-0 kubenswrapper[18592]: E0308 03:53:14.049209 18592 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 08 03:53:14.049886 master-0 kubenswrapper[18592]: E0308 03:53:14.049234 18592 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-1-retry-1-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 08 03:53:14.049886 master-0 kubenswrapper[18592]: E0308 03:53:14.049283 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9c95709c-c3cb-46fb-afe7-626c8013f3c6-kube-api-access podName:9c95709c-c3cb-46fb-afe7-626c8013f3c6 nodeName:}" failed. No retries permitted until 2026-03-08 03:53:16.049268185 +0000 UTC m=+8.148022535 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/9c95709c-c3cb-46fb-afe7-626c8013f3c6-kube-api-access") pod "installer-1-retry-1-master-0" (UID: "9c95709c-c3cb-46fb-afe7-626c8013f3c6") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 08 03:53:14.116157 master-0 kubenswrapper[18592]: I0308 03:53:14.116102 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-c46zz" Mar 08 03:53:14.121615 master-0 kubenswrapper[18592]: I0308 03:53:14.121528 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-c46zz" Mar 08 03:53:14.172747 master-0 kubenswrapper[18592]: I0308 03:53:14.172575 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rnz4w" Mar 08 03:53:14.204296 master-0 kubenswrapper[18592]: I0308 03:53:14.204247 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rnz4w" Mar 08 03:53:14.470025 master-0 kubenswrapper[18592]: I0308 03:53:14.469959 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-64bf9778cb-9sw2d" Mar 08 03:53:14.478536 master-0 kubenswrapper[18592]: I0308 03:53:14.477988 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-64bf9778cb-9sw2d" Mar 08 03:53:14.481185 master-0 kubenswrapper[18592]: I0308 03:53:14.480450 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:53:14.487923 master-0 kubenswrapper[18592]: I0308 03:53:14.487863 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:53:14.528029 master-0 kubenswrapper[18592]: I0308 03:53:14.524700 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-8f89dfddd-4mr6p" event={"ID":"ee586416-6f56-4ea4-ad62-95de1e6df23b","Type":"ContainerStarted","Data":"b993a7c9605bb38752ad78b483fef1e87627a44d0b8204e7dbbc52680443a98d"} Mar 08 03:53:14.532737 master-0 kubenswrapper[18592]: I0308 03:53:14.532692 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:53:14.549573 master-0 kubenswrapper[18592]: I0308 03:53:14.549514 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-master-0" Mar 08 03:53:14.637580 master-0 kubenswrapper[18592]: I0308 03:53:14.637528 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-6b779d99b8-7kmck" Mar 08 03:53:14.642715 master-0 kubenswrapper[18592]: I0308 03:53:14.642631 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-6b779d99b8-7kmck" Mar 08 03:53:15.052595 master-0 kubenswrapper[18592]: I0308 03:53:15.052543 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:53:15.085071 master-0 kubenswrapper[18592]: I0308 03:53:15.085006 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7695b9f8b5-4jpgl" Mar 08 03:53:15.088788 master-0 kubenswrapper[18592]: I0308 03:53:15.088709 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:53:15.099763 master-0 kubenswrapper[18592]: I0308 03:53:15.099257 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-oauth-apiserver/apiserver-7695b9f8b5-4jpgl" Mar 08 03:53:15.163346 master-0 kubenswrapper[18592]: I0308 03:53:15.163286 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-2cbnp"] Mar 08 03:53:15.163545 master-0 kubenswrapper[18592]: E0308 03:53:15.163536 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f78c05e1499b533b83f091333d61f045" containerName="cluster-policy-controller" Mar 08 03:53:15.163588 master-0 kubenswrapper[18592]: I0308 03:53:15.163551 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="f78c05e1499b533b83f091333d61f045" containerName="cluster-policy-controller" Mar 08 03:53:15.163588 master-0 kubenswrapper[18592]: E0308 03:53:15.163577 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 08 03:53:15.163588 master-0 kubenswrapper[18592]: I0308 03:53:15.163585 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 08 03:53:15.163670 master-0 kubenswrapper[18592]: E0308 03:53:15.163598 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d191ff84-f4e4-4d99-8cbb-c10771e68baf" containerName="installer" Mar 08 03:53:15.163670 master-0 kubenswrapper[18592]: I0308 03:53:15.163607 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="d191ff84-f4e4-4d99-8cbb-c10771e68baf" containerName="installer" Mar 08 03:53:15.163670 master-0 kubenswrapper[18592]: E0308 03:53:15.163616 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f865279-e751-456d-8c96-6381f8b45ce1" containerName="installer" Mar 08 03:53:15.163670 master-0 kubenswrapper[18592]: I0308 03:53:15.163624 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f865279-e751-456d-8c96-6381f8b45ce1" containerName="installer" Mar 08 03:53:15.163670 master-0 
kubenswrapper[18592]: E0308 03:53:15.163645 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="baab6171-046d-4fc9-b7d7-ff2fd12f185f" containerName="installer" Mar 08 03:53:15.163670 master-0 kubenswrapper[18592]: I0308 03:53:15.163652 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="baab6171-046d-4fc9-b7d7-ff2fd12f185f" containerName="installer" Mar 08 03:53:15.163670 master-0 kubenswrapper[18592]: E0308 03:53:15.163665 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bf40ef9-a79a-4f5d-933c-5276edcccb4b" containerName="installer" Mar 08 03:53:15.163670 master-0 kubenswrapper[18592]: I0308 03:53:15.163674 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bf40ef9-a79a-4f5d-933c-5276edcccb4b" containerName="installer" Mar 08 03:53:15.163945 master-0 kubenswrapper[18592]: E0308 03:53:15.163686 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c95709c-c3cb-46fb-afe7-626c8013f3c6" containerName="installer" Mar 08 03:53:15.163945 master-0 kubenswrapper[18592]: I0308 03:53:15.163694 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c95709c-c3cb-46fb-afe7-626c8013f3c6" containerName="installer" Mar 08 03:53:15.163945 master-0 kubenswrapper[18592]: E0308 03:53:15.163702 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7" containerName="assisted-installer-controller" Mar 08 03:53:15.163945 master-0 kubenswrapper[18592]: I0308 03:53:15.163710 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7" containerName="assisted-installer-controller" Mar 08 03:53:15.163945 master-0 kubenswrapper[18592]: E0308 03:53:15.163720 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47643289-ac4b-425d-8ea1-913b6ca39ee0" containerName="installer" Mar 08 03:53:15.163945 master-0 kubenswrapper[18592]: I0308 03:53:15.163727 18592 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="47643289-ac4b-425d-8ea1-913b6ca39ee0" containerName="installer" Mar 08 03:53:15.163945 master-0 kubenswrapper[18592]: E0308 03:53:15.163737 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b02b2c9-9ee3-436c-aa36-0c23e5fdf07b" containerName="installer" Mar 08 03:53:15.163945 master-0 kubenswrapper[18592]: I0308 03:53:15.163745 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b02b2c9-9ee3-436c-aa36-0c23e5fdf07b" containerName="installer" Mar 08 03:53:15.163945 master-0 kubenswrapper[18592]: I0308 03:53:15.163868 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="d191ff84-f4e4-4d99-8cbb-c10771e68baf" containerName="installer" Mar 08 03:53:15.163945 master-0 kubenswrapper[18592]: I0308 03:53:15.163916 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bf40ef9-a79a-4f5d-933c-5276edcccb4b" containerName="installer" Mar 08 03:53:15.163945 master-0 kubenswrapper[18592]: I0308 03:53:15.163929 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 08 03:53:15.163945 master-0 kubenswrapper[18592]: I0308 03:53:15.163942 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c95709c-c3cb-46fb-afe7-626c8013f3c6" containerName="installer" Mar 08 03:53:15.164255 master-0 kubenswrapper[18592]: I0308 03:53:15.163955 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b02b2c9-9ee3-436c-aa36-0c23e5fdf07b" containerName="installer" Mar 08 03:53:15.164255 master-0 kubenswrapper[18592]: I0308 03:53:15.163970 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f865279-e751-456d-8c96-6381f8b45ce1" containerName="installer" Mar 08 03:53:15.164255 master-0 kubenswrapper[18592]: I0308 03:53:15.163982 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="47643289-ac4b-425d-8ea1-913b6ca39ee0" containerName="installer" Mar 08 
03:53:15.164255 master-0 kubenswrapper[18592]: I0308 03:53:15.163998 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="baab6171-046d-4fc9-b7d7-ff2fd12f185f" containerName="installer" Mar 08 03:53:15.164255 master-0 kubenswrapper[18592]: I0308 03:53:15.164010 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="f78c05e1499b533b83f091333d61f045" containerName="cluster-policy-controller" Mar 08 03:53:15.164255 master-0 kubenswrapper[18592]: I0308 03:53:15.164021 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7" containerName="assisted-installer-controller" Mar 08 03:53:15.164678 master-0 kubenswrapper[18592]: I0308 03:53:15.164648 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-2cbnp" Mar 08 03:53:15.167626 master-0 kubenswrapper[18592]: I0308 03:53:15.167591 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 08 03:53:15.175986 master-0 kubenswrapper[18592]: I0308 03:53:15.175945 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6999cc9685-kprrt" Mar 08 03:53:15.182304 master-0 kubenswrapper[18592]: I0308 03:53:15.182269 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6999cc9685-kprrt" Mar 08 03:53:15.267615 master-0 kubenswrapper[18592]: I0308 03:53:15.267566 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/77a4b443-2341-4682-bca1-d3481da34e16-proxy-tls\") pod \"machine-config-daemon-2cbnp\" (UID: \"77a4b443-2341-4682-bca1-d3481da34e16\") " pod="openshift-machine-config-operator/machine-config-daemon-2cbnp" Mar 08 03:53:15.267615 master-0 kubenswrapper[18592]: 
I0308 03:53:15.267619 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xhvs\" (UniqueName: \"kubernetes.io/projected/77a4b443-2341-4682-bca1-d3481da34e16-kube-api-access-6xhvs\") pod \"machine-config-daemon-2cbnp\" (UID: \"77a4b443-2341-4682-bca1-d3481da34e16\") " pod="openshift-machine-config-operator/machine-config-daemon-2cbnp" Mar 08 03:53:15.267894 master-0 kubenswrapper[18592]: I0308 03:53:15.267637 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/77a4b443-2341-4682-bca1-d3481da34e16-mcd-auth-proxy-config\") pod \"machine-config-daemon-2cbnp\" (UID: \"77a4b443-2341-4682-bca1-d3481da34e16\") " pod="openshift-machine-config-operator/machine-config-daemon-2cbnp" Mar 08 03:53:15.267894 master-0 kubenswrapper[18592]: I0308 03:53:15.267749 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/77a4b443-2341-4682-bca1-d3481da34e16-rootfs\") pod \"machine-config-daemon-2cbnp\" (UID: \"77a4b443-2341-4682-bca1-d3481da34e16\") " pod="openshift-machine-config-operator/machine-config-daemon-2cbnp" Mar 08 03:53:15.370159 master-0 kubenswrapper[18592]: I0308 03:53:15.370054 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/77a4b443-2341-4682-bca1-d3481da34e16-proxy-tls\") pod \"machine-config-daemon-2cbnp\" (UID: \"77a4b443-2341-4682-bca1-d3481da34e16\") " pod="openshift-machine-config-operator/machine-config-daemon-2cbnp" Mar 08 03:53:15.370159 master-0 kubenswrapper[18592]: I0308 03:53:15.370108 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xhvs\" (UniqueName: \"kubernetes.io/projected/77a4b443-2341-4682-bca1-d3481da34e16-kube-api-access-6xhvs\") pod 
\"machine-config-daemon-2cbnp\" (UID: \"77a4b443-2341-4682-bca1-d3481da34e16\") " pod="openshift-machine-config-operator/machine-config-daemon-2cbnp" Mar 08 03:53:15.370355 master-0 kubenswrapper[18592]: I0308 03:53:15.370231 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/77a4b443-2341-4682-bca1-d3481da34e16-mcd-auth-proxy-config\") pod \"machine-config-daemon-2cbnp\" (UID: \"77a4b443-2341-4682-bca1-d3481da34e16\") " pod="openshift-machine-config-operator/machine-config-daemon-2cbnp" Mar 08 03:53:15.370355 master-0 kubenswrapper[18592]: I0308 03:53:15.370267 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/77a4b443-2341-4682-bca1-d3481da34e16-rootfs\") pod \"machine-config-daemon-2cbnp\" (UID: \"77a4b443-2341-4682-bca1-d3481da34e16\") " pod="openshift-machine-config-operator/machine-config-daemon-2cbnp" Mar 08 03:53:15.370420 master-0 kubenswrapper[18592]: I0308 03:53:15.370403 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/77a4b443-2341-4682-bca1-d3481da34e16-rootfs\") pod \"machine-config-daemon-2cbnp\" (UID: \"77a4b443-2341-4682-bca1-d3481da34e16\") " pod="openshift-machine-config-operator/machine-config-daemon-2cbnp" Mar 08 03:53:15.370672 master-0 kubenswrapper[18592]: I0308 03:53:15.370634 18592 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 08 03:53:15.371576 master-0 kubenswrapper[18592]: I0308 03:53:15.371548 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/77a4b443-2341-4682-bca1-d3481da34e16-mcd-auth-proxy-config\") pod \"machine-config-daemon-2cbnp\" (UID: \"77a4b443-2341-4682-bca1-d3481da34e16\") " pod="openshift-machine-config-operator/machine-config-daemon-2cbnp" Mar 08 03:53:15.373076 master-0 kubenswrapper[18592]: I0308 03:53:15.373048 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/77a4b443-2341-4682-bca1-d3481da34e16-proxy-tls\") pod \"machine-config-daemon-2cbnp\" (UID: \"77a4b443-2341-4682-bca1-d3481da34e16\") " pod="openshift-machine-config-operator/machine-config-daemon-2cbnp" Mar 08 03:53:15.397518 master-0 kubenswrapper[18592]: I0308 03:53:15.397477 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xhvs\" (UniqueName: \"kubernetes.io/projected/77a4b443-2341-4682-bca1-d3481da34e16-kube-api-access-6xhvs\") pod \"machine-config-daemon-2cbnp\" (UID: \"77a4b443-2341-4682-bca1-d3481da34e16\") " pod="openshift-machine-config-operator/machine-config-daemon-2cbnp" Mar 08 03:53:15.485870 master-0 kubenswrapper[18592]: I0308 03:53:15.485789 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-2cbnp" Mar 08 03:53:15.534121 master-0 kubenswrapper[18592]: I0308 03:53:15.534086 18592 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 08 03:53:15.534232 master-0 kubenswrapper[18592]: I0308 03:53:15.534091 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2cbnp" event={"ID":"77a4b443-2341-4682-bca1-d3481da34e16","Type":"ContainerStarted","Data":"f807efd5b0378602c7a8e9ab37dc3979bbaa51560a0a90acd44dde1819dddf9c"} Mar 08 03:53:15.534232 master-0 kubenswrapper[18592]: I0308 03:53:15.534122 18592 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 08 03:53:15.888163 master-0 kubenswrapper[18592]: I0308 03:53:15.888109 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-6b779d99b8-7kmck" Mar 08 03:53:15.897444 master-0 kubenswrapper[18592]: I0308 03:53:15.897385 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-6b779d99b8-7kmck" Mar 08 03:53:16.083471 master-0 kubenswrapper[18592]: I0308 03:53:16.083408 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9c95709c-c3cb-46fb-afe7-626c8013f3c6-kube-api-access\") pod \"installer-1-retry-1-master-0\" (UID: \"9c95709c-c3cb-46fb-afe7-626c8013f3c6\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 08 03:53:16.083940 master-0 kubenswrapper[18592]: E0308 03:53:16.083592 18592 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 08 03:53:16.083940 master-0 kubenswrapper[18592]: E0308 03:53:16.083618 18592 projected.go:194] Error preparing data for projected volume kube-api-access for pod 
openshift-kube-apiserver/installer-1-retry-1-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 08 03:53:16.083940 master-0 kubenswrapper[18592]: E0308 03:53:16.083679 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9c95709c-c3cb-46fb-afe7-626c8013f3c6-kube-api-access podName:9c95709c-c3cb-46fb-afe7-626c8013f3c6 nodeName:}" failed. No retries permitted until 2026-03-08 03:53:20.083662274 +0000 UTC m=+12.182416634 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/9c95709c-c3cb-46fb-afe7-626c8013f3c6-kube-api-access") pod "installer-1-retry-1-master-0" (UID: "9c95709c-c3cb-46fb-afe7-626c8013f3c6") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 08 03:53:16.367134 master-0 kubenswrapper[18592]: I0308 03:53:16.367054 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-8h6fj" Mar 08 03:53:16.369238 master-0 kubenswrapper[18592]: I0308 03:53:16.369183 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-8h6fj" Mar 08 03:53:16.566294 master-0 kubenswrapper[18592]: I0308 03:53:16.566178 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2cbnp" event={"ID":"77a4b443-2341-4682-bca1-d3481da34e16","Type":"ContainerStarted","Data":"e711477ca6b161717a636b0278ba191de29e3b6bab1bef3fe6629f4afcca9c1e"} Mar 08 03:53:16.566294 master-0 kubenswrapper[18592]: I0308 03:53:16.566235 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2cbnp" event={"ID":"77a4b443-2341-4682-bca1-d3481da34e16","Type":"ContainerStarted","Data":"bc991bbcb9f09b499e651da9036ea13932bc3011290e97a4362ec00d28582d4e"} Mar 08 03:53:16.614374 master-0 
kubenswrapper[18592]: I0308 03:53:16.613929 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-2cbnp" podStartSLOduration=1.613911321 podStartE2EDuration="1.613911321s" podCreationTimestamp="2026-03-08 03:53:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:53:16.58813708 +0000 UTC m=+8.686891430" watchObservedRunningTime="2026-03-08 03:53:16.613911321 +0000 UTC m=+8.712665671" Mar 08 03:53:17.476844 master-0 kubenswrapper[18592]: I0308 03:53:17.476761 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lwt58" Mar 08 03:53:18.187943 master-0 kubenswrapper[18592]: I0308 03:53:18.187869 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-p8nq8" Mar 08 03:53:19.181754 master-0 kubenswrapper[18592]: I0308 03:53:19.181694 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-qlfgq" Mar 08 03:53:19.186997 master-0 kubenswrapper[18592]: I0308 03:53:19.186956 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-qlfgq" Mar 08 03:53:19.403933 master-0 kubenswrapper[18592]: I0308 03:53:19.403889 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-ff46b7bdf-6wkth"] Mar 08 03:53:19.404895 master-0 kubenswrapper[18592]: I0308 03:53:19.404871 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-6wkth" Mar 08 03:53:19.406788 master-0 kubenswrapper[18592]: I0308 03:53:19.406760 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 08 03:53:19.419417 master-0 kubenswrapper[18592]: I0308 03:53:19.419350 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-ff46b7bdf-6wkth"] Mar 08 03:53:19.539671 master-0 kubenswrapper[18592]: I0308 03:53:19.539567 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6552620e-b23b-4102-a6ed-a0fcaff0f144-proxy-tls\") pod \"machine-config-controller-ff46b7bdf-6wkth\" (UID: \"6552620e-b23b-4102-a6ed-a0fcaff0f144\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-6wkth" Mar 08 03:53:19.539952 master-0 kubenswrapper[18592]: I0308 03:53:19.539680 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wc7tz\" (UniqueName: \"kubernetes.io/projected/6552620e-b23b-4102-a6ed-a0fcaff0f144-kube-api-access-wc7tz\") pod \"machine-config-controller-ff46b7bdf-6wkth\" (UID: \"6552620e-b23b-4102-a6ed-a0fcaff0f144\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-6wkth" Mar 08 03:53:19.539952 master-0 kubenswrapper[18592]: I0308 03:53:19.539785 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6552620e-b23b-4102-a6ed-a0fcaff0f144-mcc-auth-proxy-config\") pod \"machine-config-controller-ff46b7bdf-6wkth\" (UID: \"6552620e-b23b-4102-a6ed-a0fcaff0f144\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-6wkth" Mar 08 03:53:19.641172 master-0 
kubenswrapper[18592]: I0308 03:53:19.641096 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6552620e-b23b-4102-a6ed-a0fcaff0f144-mcc-auth-proxy-config\") pod \"machine-config-controller-ff46b7bdf-6wkth\" (UID: \"6552620e-b23b-4102-a6ed-a0fcaff0f144\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-6wkth" Mar 08 03:53:19.641459 master-0 kubenswrapper[18592]: I0308 03:53:19.641229 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6552620e-b23b-4102-a6ed-a0fcaff0f144-proxy-tls\") pod \"machine-config-controller-ff46b7bdf-6wkth\" (UID: \"6552620e-b23b-4102-a6ed-a0fcaff0f144\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-6wkth" Mar 08 03:53:19.641459 master-0 kubenswrapper[18592]: I0308 03:53:19.641272 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wc7tz\" (UniqueName: \"kubernetes.io/projected/6552620e-b23b-4102-a6ed-a0fcaff0f144-kube-api-access-wc7tz\") pod \"machine-config-controller-ff46b7bdf-6wkth\" (UID: \"6552620e-b23b-4102-a6ed-a0fcaff0f144\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-6wkth" Mar 08 03:53:19.642532 master-0 kubenswrapper[18592]: I0308 03:53:19.642455 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6552620e-b23b-4102-a6ed-a0fcaff0f144-mcc-auth-proxy-config\") pod \"machine-config-controller-ff46b7bdf-6wkth\" (UID: \"6552620e-b23b-4102-a6ed-a0fcaff0f144\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-6wkth" Mar 08 03:53:19.660373 master-0 kubenswrapper[18592]: I0308 03:53:19.659853 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/6552620e-b23b-4102-a6ed-a0fcaff0f144-proxy-tls\") pod \"machine-config-controller-ff46b7bdf-6wkth\" (UID: \"6552620e-b23b-4102-a6ed-a0fcaff0f144\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-6wkth" Mar 08 03:53:19.665069 master-0 kubenswrapper[18592]: I0308 03:53:19.664989 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wc7tz\" (UniqueName: \"kubernetes.io/projected/6552620e-b23b-4102-a6ed-a0fcaff0f144-kube-api-access-wc7tz\") pod \"machine-config-controller-ff46b7bdf-6wkth\" (UID: \"6552620e-b23b-4102-a6ed-a0fcaff0f144\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-6wkth" Mar 08 03:53:19.734492 master-0 kubenswrapper[18592]: I0308 03:53:19.734448 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-6wkth" Mar 08 03:53:19.747373 master-0 kubenswrapper[18592]: I0308 03:53:19.747337 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-75qmb" Mar 08 03:53:19.748002 master-0 kubenswrapper[18592]: I0308 03:53:19.747980 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-75qmb" Mar 08 03:53:19.964651 master-0 kubenswrapper[18592]: I0308 03:53:19.962483 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qddlp" Mar 08 03:53:19.971418 master-0 kubenswrapper[18592]: I0308 03:53:19.969533 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qddlp" Mar 08 03:53:20.173070 master-0 kubenswrapper[18592]: I0308 03:53:20.172475 18592 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9c95709c-c3cb-46fb-afe7-626c8013f3c6-kube-api-access\") pod \"installer-1-retry-1-master-0\" (UID: \"9c95709c-c3cb-46fb-afe7-626c8013f3c6\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 08 03:53:20.173070 master-0 kubenswrapper[18592]: E0308 03:53:20.172707 18592 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 08 03:53:20.173070 master-0 kubenswrapper[18592]: E0308 03:53:20.172732 18592 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-1-retry-1-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 08 03:53:20.173070 master-0 kubenswrapper[18592]: E0308 03:53:20.172786 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9c95709c-c3cb-46fb-afe7-626c8013f3c6-kube-api-access podName:9c95709c-c3cb-46fb-afe7-626c8013f3c6 nodeName:}" failed. No retries permitted until 2026-03-08 03:53:28.172768898 +0000 UTC m=+20.271523248 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/9c95709c-c3cb-46fb-afe7-626c8013f3c6-kube-api-access") pod "installer-1-retry-1-master-0" (UID: "9c95709c-c3cb-46fb-afe7-626c8013f3c6") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 08 03:53:20.205246 master-0 kubenswrapper[18592]: I0308 03:53:20.205198 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-ff46b7bdf-6wkth"] Mar 08 03:53:20.213021 master-0 kubenswrapper[18592]: W0308 03:53:20.212972 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6552620e_b23b_4102_a6ed_a0fcaff0f144.slice/crio-0549f3f77de8f11e8416c5f57c2870158f0f5c30aeba0b65ef2d085a46a05675 WatchSource:0}: Error finding container 0549f3f77de8f11e8416c5f57c2870158f0f5c30aeba0b65ef2d085a46a05675: Status 404 returned error can't find the container with id 0549f3f77de8f11e8416c5f57c2870158f0f5c30aeba0b65ef2d085a46a05675 Mar 08 03:53:20.424240 master-0 kubenswrapper[18592]: I0308 03:53:20.424151 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xmgpj" Mar 08 03:53:20.426732 master-0 kubenswrapper[18592]: I0308 03:53:20.426705 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xmgpj" Mar 08 03:53:20.609077 master-0 kubenswrapper[18592]: I0308 03:53:20.608693 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-kkrx6"] Mar 08 03:53:20.610350 master-0 kubenswrapper[18592]: I0308 03:53:20.609273 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-79f8cd6fdd-zmrb9"] Mar 08 03:53:20.610350 master-0 kubenswrapper[18592]: I0308 03:53:20.609630 18592 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-7c67b67d47-ghjxr"] Mar 08 03:53:20.610350 master-0 kubenswrapper[18592]: I0308 03:53:20.609950 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-kkrx6" Mar 08 03:53:20.614048 master-0 kubenswrapper[18592]: I0308 03:53:20.613975 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-79f8cd6fdd-zmrb9" Mar 08 03:53:20.616284 master-0 kubenswrapper[18592]: I0308 03:53:20.616078 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7c67b67d47-ghjxr" Mar 08 03:53:20.616284 master-0 kubenswrapper[18592]: I0308 03:53:20.616156 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls" Mar 08 03:53:20.618329 master-0 kubenswrapper[18592]: I0308 03:53:20.618307 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 08 03:53:20.618925 master-0 kubenswrapper[18592]: I0308 03:53:20.618582 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 08 03:53:20.618925 master-0 kubenswrapper[18592]: I0308 03:53:20.618707 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 08 03:53:20.618925 master-0 kubenswrapper[18592]: I0308 03:53:20.618839 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 08 03:53:20.623409 master-0 kubenswrapper[18592]: I0308 03:53:20.623372 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 08 03:53:20.623577 master-0 kubenswrapper[18592]: I0308 
03:53:20.623550 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 08 03:53:20.626688 master-0 kubenswrapper[18592]: I0308 03:53:20.626662 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-6wkth" event={"ID":"6552620e-b23b-4102-a6ed-a0fcaff0f144","Type":"ContainerStarted","Data":"ebe87f562840881073dc3dbc24a404d096c67eaa7b6d5850985951d9e75a2616"} Mar 08 03:53:20.626768 master-0 kubenswrapper[18592]: I0308 03:53:20.626695 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-6wkth" event={"ID":"6552620e-b23b-4102-a6ed-a0fcaff0f144","Type":"ContainerStarted","Data":"c162dc56ff027f45de8ba28290228ab4d7bf1321c54c8a9b17c086435474a021"} Mar 08 03:53:20.626768 master-0 kubenswrapper[18592]: I0308 03:53:20.626706 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7c67b67d47-ghjxr"] Mar 08 03:53:20.626768 master-0 kubenswrapper[18592]: I0308 03:53:20.626721 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-6wkth" event={"ID":"6552620e-b23b-4102-a6ed-a0fcaff0f144","Type":"ContainerStarted","Data":"0549f3f77de8f11e8416c5f57c2870158f0f5c30aeba0b65ef2d085a46a05675"} Mar 08 03:53:20.643712 master-0 kubenswrapper[18592]: I0308 03:53:20.643666 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-kkrx6"] Mar 08 03:53:20.660145 master-0 kubenswrapper[18592]: I0308 03:53:20.660093 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-xs7l6"] Mar 08 03:53:20.661287 master-0 kubenswrapper[18592]: I0308 03:53:20.661259 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-xs7l6" Mar 08 03:53:20.663567 master-0 kubenswrapper[18592]: I0308 03:53:20.663192 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4h8qm" Mar 08 03:53:20.664491 master-0 kubenswrapper[18592]: I0308 03:53:20.664457 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 08 03:53:20.664621 master-0 kubenswrapper[18592]: I0308 03:53:20.664591 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 08 03:53:20.665087 master-0 kubenswrapper[18592]: I0308 03:53:20.664463 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 08 03:53:20.691971 master-0 kubenswrapper[18592]: I0308 03:53:20.691180 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/acaa7c53-f877-480c-8f36-58af35e0e305-stats-auth\") pod \"router-default-79f8cd6fdd-zmrb9\" (UID: \"acaa7c53-f877-480c-8f36-58af35e0e305\") " pod="openshift-ingress/router-default-79f8cd6fdd-zmrb9" Mar 08 03:53:20.691971 master-0 kubenswrapper[18592]: I0308 03:53:20.691258 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/acaa7c53-f877-480c-8f36-58af35e0e305-service-ca-bundle\") pod \"router-default-79f8cd6fdd-zmrb9\" (UID: \"acaa7c53-f877-480c-8f36-58af35e0e305\") " pod="openshift-ingress/router-default-79f8cd6fdd-zmrb9" Mar 08 03:53:20.691971 master-0 kubenswrapper[18592]: I0308 03:53:20.691302 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/acaa7c53-f877-480c-8f36-58af35e0e305-default-certificate\") pod \"router-default-79f8cd6fdd-zmrb9\" (UID: \"acaa7c53-f877-480c-8f36-58af35e0e305\") " pod="openshift-ingress/router-default-79f8cd6fdd-zmrb9" Mar 08 03:53:20.691971 master-0 kubenswrapper[18592]: I0308 03:53:20.691317 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qpkv\" (UniqueName: \"kubernetes.io/projected/acaa7c53-f877-480c-8f36-58af35e0e305-kube-api-access-6qpkv\") pod \"router-default-79f8cd6fdd-zmrb9\" (UID: \"acaa7c53-f877-480c-8f36-58af35e0e305\") " pod="openshift-ingress/router-default-79f8cd6fdd-zmrb9" Mar 08 03:53:20.691971 master-0 kubenswrapper[18592]: I0308 03:53:20.691336 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/ec003aa0-e60e-4c9b-8110-48502405d3a7-tls-certificates\") pod \"prometheus-operator-admission-webhook-8464df8497-kkrx6\" (UID: \"ec003aa0-e60e-4c9b-8110-48502405d3a7\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-kkrx6" Mar 08 03:53:20.691971 master-0 kubenswrapper[18592]: I0308 03:53:20.691418 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5q9jp\" (UniqueName: \"kubernetes.io/projected/a9a3f03a-e376-415a-acf1-bdff171ff9b9-kube-api-access-5q9jp\") pod \"network-check-source-7c67b67d47-ghjxr\" (UID: \"a9a3f03a-e376-415a-acf1-bdff171ff9b9\") " pod="openshift-network-diagnostics/network-check-source-7c67b67d47-ghjxr" Mar 08 03:53:20.691971 master-0 kubenswrapper[18592]: I0308 03:53:20.691436 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/acaa7c53-f877-480c-8f36-58af35e0e305-metrics-certs\") pod \"router-default-79f8cd6fdd-zmrb9\" (UID: 
\"acaa7c53-f877-480c-8f36-58af35e0e305\") " pod="openshift-ingress/router-default-79f8cd6fdd-zmrb9" Mar 08 03:53:20.695911 master-0 kubenswrapper[18592]: I0308 03:53:20.692701 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-xs7l6"] Mar 08 03:53:20.750023 master-0 kubenswrapper[18592]: I0308 03:53:20.749733 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-6wkth" podStartSLOduration=1.749716395 podStartE2EDuration="1.749716395s" podCreationTimestamp="2026-03-08 03:53:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:53:20.719305857 +0000 UTC m=+12.818060207" watchObservedRunningTime="2026-03-08 03:53:20.749716395 +0000 UTC m=+12.848470745" Mar 08 03:53:20.762027 master-0 kubenswrapper[18592]: I0308 03:53:20.761979 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4h8qm" Mar 08 03:53:20.793846 master-0 kubenswrapper[18592]: I0308 03:53:20.792952 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/acaa7c53-f877-480c-8f36-58af35e0e305-default-certificate\") pod \"router-default-79f8cd6fdd-zmrb9\" (UID: \"acaa7c53-f877-480c-8f36-58af35e0e305\") " pod="openshift-ingress/router-default-79f8cd6fdd-zmrb9" Mar 08 03:53:20.793846 master-0 kubenswrapper[18592]: I0308 03:53:20.793007 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e51a412e-9068-4720-aec4-f07e8fc465c9-cert\") pod \"ingress-canary-xs7l6\" (UID: \"e51a412e-9068-4720-aec4-f07e8fc465c9\") " pod="openshift-ingress-canary/ingress-canary-xs7l6" Mar 08 03:53:20.793846 master-0 kubenswrapper[18592]: I0308 
03:53:20.793025 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qpkv\" (UniqueName: \"kubernetes.io/projected/acaa7c53-f877-480c-8f36-58af35e0e305-kube-api-access-6qpkv\") pod \"router-default-79f8cd6fdd-zmrb9\" (UID: \"acaa7c53-f877-480c-8f36-58af35e0e305\") " pod="openshift-ingress/router-default-79f8cd6fdd-zmrb9" Mar 08 03:53:20.793846 master-0 kubenswrapper[18592]: I0308 03:53:20.793045 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/ec003aa0-e60e-4c9b-8110-48502405d3a7-tls-certificates\") pod \"prometheus-operator-admission-webhook-8464df8497-kkrx6\" (UID: \"ec003aa0-e60e-4c9b-8110-48502405d3a7\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-kkrx6" Mar 08 03:53:20.798762 master-0 kubenswrapper[18592]: I0308 03:53:20.798612 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5q9jp\" (UniqueName: \"kubernetes.io/projected/a9a3f03a-e376-415a-acf1-bdff171ff9b9-kube-api-access-5q9jp\") pod \"network-check-source-7c67b67d47-ghjxr\" (UID: \"a9a3f03a-e376-415a-acf1-bdff171ff9b9\") " pod="openshift-network-diagnostics/network-check-source-7c67b67d47-ghjxr" Mar 08 03:53:20.798762 master-0 kubenswrapper[18592]: I0308 03:53:20.798683 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/acaa7c53-f877-480c-8f36-58af35e0e305-metrics-certs\") pod \"router-default-79f8cd6fdd-zmrb9\" (UID: \"acaa7c53-f877-480c-8f36-58af35e0e305\") " pod="openshift-ingress/router-default-79f8cd6fdd-zmrb9" Mar 08 03:53:20.798762 master-0 kubenswrapper[18592]: I0308 03:53:20.798726 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvqmp\" (UniqueName: 
\"kubernetes.io/projected/e51a412e-9068-4720-aec4-f07e8fc465c9-kube-api-access-lvqmp\") pod \"ingress-canary-xs7l6\" (UID: \"e51a412e-9068-4720-aec4-f07e8fc465c9\") " pod="openshift-ingress-canary/ingress-canary-xs7l6" Mar 08 03:53:20.798762 master-0 kubenswrapper[18592]: I0308 03:53:20.798766 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/acaa7c53-f877-480c-8f36-58af35e0e305-stats-auth\") pod \"router-default-79f8cd6fdd-zmrb9\" (UID: \"acaa7c53-f877-480c-8f36-58af35e0e305\") " pod="openshift-ingress/router-default-79f8cd6fdd-zmrb9" Mar 08 03:53:20.798981 master-0 kubenswrapper[18592]: I0308 03:53:20.798810 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/acaa7c53-f877-480c-8f36-58af35e0e305-service-ca-bundle\") pod \"router-default-79f8cd6fdd-zmrb9\" (UID: \"acaa7c53-f877-480c-8f36-58af35e0e305\") " pod="openshift-ingress/router-default-79f8cd6fdd-zmrb9" Mar 08 03:53:20.799716 master-0 kubenswrapper[18592]: I0308 03:53:20.799593 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/acaa7c53-f877-480c-8f36-58af35e0e305-default-certificate\") pod \"router-default-79f8cd6fdd-zmrb9\" (UID: \"acaa7c53-f877-480c-8f36-58af35e0e305\") " pod="openshift-ingress/router-default-79f8cd6fdd-zmrb9" Mar 08 03:53:20.799716 master-0 kubenswrapper[18592]: I0308 03:53:20.799677 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/acaa7c53-f877-480c-8f36-58af35e0e305-service-ca-bundle\") pod \"router-default-79f8cd6fdd-zmrb9\" (UID: \"acaa7c53-f877-480c-8f36-58af35e0e305\") " pod="openshift-ingress/router-default-79f8cd6fdd-zmrb9" Mar 08 03:53:20.799792 master-0 kubenswrapper[18592]: I0308 03:53:20.799771 18592 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/ec003aa0-e60e-4c9b-8110-48502405d3a7-tls-certificates\") pod \"prometheus-operator-admission-webhook-8464df8497-kkrx6\" (UID: \"ec003aa0-e60e-4c9b-8110-48502405d3a7\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-kkrx6" Mar 08 03:53:20.806857 master-0 kubenswrapper[18592]: I0308 03:53:20.803290 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/acaa7c53-f877-480c-8f36-58af35e0e305-metrics-certs\") pod \"router-default-79f8cd6fdd-zmrb9\" (UID: \"acaa7c53-f877-480c-8f36-58af35e0e305\") " pod="openshift-ingress/router-default-79f8cd6fdd-zmrb9" Mar 08 03:53:20.811847 master-0 kubenswrapper[18592]: I0308 03:53:20.808246 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/acaa7c53-f877-480c-8f36-58af35e0e305-stats-auth\") pod \"router-default-79f8cd6fdd-zmrb9\" (UID: \"acaa7c53-f877-480c-8f36-58af35e0e305\") " pod="openshift-ingress/router-default-79f8cd6fdd-zmrb9" Mar 08 03:53:20.825841 master-0 kubenswrapper[18592]: I0308 03:53:20.825425 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qpkv\" (UniqueName: \"kubernetes.io/projected/acaa7c53-f877-480c-8f36-58af35e0e305-kube-api-access-6qpkv\") pod \"router-default-79f8cd6fdd-zmrb9\" (UID: \"acaa7c53-f877-480c-8f36-58af35e0e305\") " pod="openshift-ingress/router-default-79f8cd6fdd-zmrb9" Mar 08 03:53:20.853845 master-0 kubenswrapper[18592]: I0308 03:53:20.848337 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5q9jp\" (UniqueName: \"kubernetes.io/projected/a9a3f03a-e376-415a-acf1-bdff171ff9b9-kube-api-access-5q9jp\") pod \"network-check-source-7c67b67d47-ghjxr\" (UID: \"a9a3f03a-e376-415a-acf1-bdff171ff9b9\") " 
pod="openshift-network-diagnostics/network-check-source-7c67b67d47-ghjxr" Mar 08 03:53:20.900542 master-0 kubenswrapper[18592]: I0308 03:53:20.900472 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvqmp\" (UniqueName: \"kubernetes.io/projected/e51a412e-9068-4720-aec4-f07e8fc465c9-kube-api-access-lvqmp\") pod \"ingress-canary-xs7l6\" (UID: \"e51a412e-9068-4720-aec4-f07e8fc465c9\") " pod="openshift-ingress-canary/ingress-canary-xs7l6" Mar 08 03:53:20.900542 master-0 kubenswrapper[18592]: I0308 03:53:20.900546 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e51a412e-9068-4720-aec4-f07e8fc465c9-cert\") pod \"ingress-canary-xs7l6\" (UID: \"e51a412e-9068-4720-aec4-f07e8fc465c9\") " pod="openshift-ingress-canary/ingress-canary-xs7l6" Mar 08 03:53:20.903519 master-0 kubenswrapper[18592]: I0308 03:53:20.903478 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e51a412e-9068-4720-aec4-f07e8fc465c9-cert\") pod \"ingress-canary-xs7l6\" (UID: \"e51a412e-9068-4720-aec4-f07e8fc465c9\") " pod="openshift-ingress-canary/ingress-canary-xs7l6" Mar 08 03:53:20.914785 master-0 kubenswrapper[18592]: I0308 03:53:20.914753 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvqmp\" (UniqueName: \"kubernetes.io/projected/e51a412e-9068-4720-aec4-f07e8fc465c9-kube-api-access-lvqmp\") pod \"ingress-canary-xs7l6\" (UID: \"e51a412e-9068-4720-aec4-f07e8fc465c9\") " pod="openshift-ingress-canary/ingress-canary-xs7l6" Mar 08 03:53:20.973438 master-0 kubenswrapper[18592]: I0308 03:53:20.973325 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-kkrx6" Mar 08 03:53:21.003195 master-0 kubenswrapper[18592]: I0308 03:53:21.002781 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-79f8cd6fdd-zmrb9" Mar 08 03:53:21.019891 master-0 kubenswrapper[18592]: I0308 03:53:21.018994 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7c67b67d47-ghjxr" Mar 08 03:53:21.020743 master-0 kubenswrapper[18592]: I0308 03:53:21.020420 18592 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 03:53:21.050783 master-0 kubenswrapper[18592]: I0308 03:53:21.050719 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-xs7l6" Mar 08 03:53:21.115969 master-0 kubenswrapper[18592]: I0308 03:53:21.110092 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:53:21.115969 master-0 kubenswrapper[18592]: I0308 03:53:21.110244 18592 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 08 03:53:21.115969 master-0 kubenswrapper[18592]: I0308 03:53:21.110253 18592 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 08 03:53:21.135847 master-0 kubenswrapper[18592]: I0308 03:53:21.133557 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-67b55db9c7-4qgpb" Mar 08 03:53:21.153861 master-0 kubenswrapper[18592]: I0308 03:53:21.153201 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-67b55db9c7-4qgpb" Mar 08 03:53:21.180844 master-0 kubenswrapper[18592]: I0308 03:53:21.180138 18592 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:53:21.453348 master-0 kubenswrapper[18592]: I0308 03:53:21.453306 18592 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 08 03:53:21.542727 master-0 kubenswrapper[18592]: I0308 03:53:21.542620 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-kkrx6"] Mar 08 03:53:21.551103 master-0 kubenswrapper[18592]: W0308 03:53:21.551032 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec003aa0_e60e_4c9b_8110_48502405d3a7.slice/crio-453fdd0348565dd9384451a6b5a72d653a72167d89db49c6103ebb3a1be62b96 WatchSource:0}: Error finding container 453fdd0348565dd9384451a6b5a72d653a72167d89db49c6103ebb3a1be62b96: Status 404 returned error can't find the container with id 453fdd0348565dd9384451a6b5a72d653a72167d89db49c6103ebb3a1be62b96 Mar 08 03:53:21.564627 master-0 kubenswrapper[18592]: I0308 03:53:21.564081 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-p8nq8" Mar 08 03:53:21.616512 master-0 kubenswrapper[18592]: I0308 03:53:21.612204 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7c67b67d47-ghjxr"] Mar 08 03:53:21.642987 master-0 kubenswrapper[18592]: I0308 03:53:21.641426 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-79f8cd6fdd-zmrb9" event={"ID":"acaa7c53-f877-480c-8f36-58af35e0e305","Type":"ContainerStarted","Data":"55b78679973c2ebe8075120bc23251da8ce51bc5704ec0a88ab5e9c09e4d99cc"} Mar 08 03:53:21.644020 master-0 kubenswrapper[18592]: I0308 03:53:21.643978 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-kkrx6" event={"ID":"ec003aa0-e60e-4c9b-8110-48502405d3a7","Type":"ContainerStarted","Data":"453fdd0348565dd9384451a6b5a72d653a72167d89db49c6103ebb3a1be62b96"} Mar 08 03:53:21.646039 master-0 kubenswrapper[18592]: I0308 03:53:21.646005 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7c67b67d47-ghjxr" event={"ID":"a9a3f03a-e376-415a-acf1-bdff171ff9b9","Type":"ContainerStarted","Data":"a5e4d4f08498cbe8dd180ddf3318cf21560885b0f538e5b79c21289354d739c0"} Mar 08 03:53:21.646638 master-0 kubenswrapper[18592]: I0308 03:53:21.646569 18592 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 08 03:53:21.662681 master-0 kubenswrapper[18592]: I0308 03:53:21.662344 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-xs7l6"] Mar 08 03:53:21.675240 master-0 kubenswrapper[18592]: W0308 03:53:21.675172 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode51a412e_9068_4720_aec4_f07e8fc465c9.slice/crio-2c07b0fa0d5188a55363f9700365bfcc2cc337ba90c234e1f8d2a557d23a2c09 WatchSource:0}: Error finding container 2c07b0fa0d5188a55363f9700365bfcc2cc337ba90c234e1f8d2a557d23a2c09: Status 404 returned error can't find the container with id 2c07b0fa0d5188a55363f9700365bfcc2cc337ba90c234e1f8d2a557d23a2c09 Mar 08 03:53:21.810895 master-0 kubenswrapper[18592]: I0308 03:53:21.809961 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4h8qm" Mar 08 03:53:21.890284 master-0 kubenswrapper[18592]: I0308 03:53:21.890233 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4h8qm" Mar 08 03:53:22.318156 master-0 kubenswrapper[18592]: I0308 03:53:22.318072 18592 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lwt58" Mar 08 03:53:22.662046 master-0 kubenswrapper[18592]: I0308 03:53:22.661931 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-xs7l6" event={"ID":"e51a412e-9068-4720-aec4-f07e8fc465c9","Type":"ContainerStarted","Data":"d4ab0e2899bb29248d9da04915077a851ed70354fce4f8a0c98d599336c9f5b0"} Mar 08 03:53:22.662046 master-0 kubenswrapper[18592]: I0308 03:53:22.661989 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-xs7l6" event={"ID":"e51a412e-9068-4720-aec4-f07e8fc465c9","Type":"ContainerStarted","Data":"2c07b0fa0d5188a55363f9700365bfcc2cc337ba90c234e1f8d2a557d23a2c09"} Mar 08 03:53:22.665183 master-0 kubenswrapper[18592]: I0308 03:53:22.665133 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7c67b67d47-ghjxr" event={"ID":"a9a3f03a-e376-415a-acf1-bdff171ff9b9","Type":"ContainerStarted","Data":"9f055cfef9ac49279aa9dccd08109bca5fea81c0deeffeb7f94a566d3f3bb1ce"} Mar 08 03:53:22.721832 master-0 kubenswrapper[18592]: I0308 03:53:22.721747 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-xs7l6" podStartSLOduration=2.721730474 podStartE2EDuration="2.721730474s" podCreationTimestamp="2026-03-08 03:53:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:53:22.684364417 +0000 UTC m=+14.783118767" watchObservedRunningTime="2026-03-08 03:53:22.721730474 +0000 UTC m=+14.820484824" Mar 08 03:53:22.743538 master-0 kubenswrapper[18592]: I0308 03:53:22.741679 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-7c67b67d47-ghjxr" podStartSLOduration=421.74166294 
podStartE2EDuration="7m1.74166294s" podCreationTimestamp="2026-03-08 03:46:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:53:22.740291079 +0000 UTC m=+14.839045429" watchObservedRunningTime="2026-03-08 03:53:22.74166294 +0000 UTC m=+14.840417290" Mar 08 03:53:23.156338 master-0 kubenswrapper[18592]: I0308 03:53:23.156277 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-6c7fb6b958-mr9k6"] Mar 08 03:53:23.156988 master-0 kubenswrapper[18592]: I0308 03:53:23.156963 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-6c7fb6b958-mr9k6" Mar 08 03:53:23.162843 master-0 kubenswrapper[18592]: I0308 03:53:23.160446 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 08 03:53:23.162843 master-0 kubenswrapper[18592]: I0308 03:53:23.160492 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 08 03:53:23.162843 master-0 kubenswrapper[18592]: I0308 03:53:23.160577 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 08 03:53:23.162843 master-0 kubenswrapper[18592]: I0308 03:53:23.160695 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 08 03:53:23.162843 master-0 kubenswrapper[18592]: I0308 03:53:23.161038 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 08 03:53:23.182499 master-0 kubenswrapper[18592]: I0308 03:53:23.180320 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-6c7fb6b958-mr9k6"] Mar 08 03:53:23.255844 master-0 
kubenswrapper[18592]: I0308 03:53:23.255407 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqrw9\" (UniqueName: \"kubernetes.io/projected/48ab3c8e-a2bd-4380-9e8d-a41d515a989d-kube-api-access-mqrw9\") pod \"console-operator-6c7fb6b958-mr9k6\" (UID: \"48ab3c8e-a2bd-4380-9e8d-a41d515a989d\") " pod="openshift-console-operator/console-operator-6c7fb6b958-mr9k6" Mar 08 03:53:23.255844 master-0 kubenswrapper[18592]: I0308 03:53:23.255505 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/48ab3c8e-a2bd-4380-9e8d-a41d515a989d-trusted-ca\") pod \"console-operator-6c7fb6b958-mr9k6\" (UID: \"48ab3c8e-a2bd-4380-9e8d-a41d515a989d\") " pod="openshift-console-operator/console-operator-6c7fb6b958-mr9k6" Mar 08 03:53:23.255844 master-0 kubenswrapper[18592]: I0308 03:53:23.255529 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48ab3c8e-a2bd-4380-9e8d-a41d515a989d-config\") pod \"console-operator-6c7fb6b958-mr9k6\" (UID: \"48ab3c8e-a2bd-4380-9e8d-a41d515a989d\") " pod="openshift-console-operator/console-operator-6c7fb6b958-mr9k6" Mar 08 03:53:23.255844 master-0 kubenswrapper[18592]: I0308 03:53:23.255553 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48ab3c8e-a2bd-4380-9e8d-a41d515a989d-serving-cert\") pod \"console-operator-6c7fb6b958-mr9k6\" (UID: \"48ab3c8e-a2bd-4380-9e8d-a41d515a989d\") " pod="openshift-console-operator/console-operator-6c7fb6b958-mr9k6" Mar 08 03:53:23.357519 master-0 kubenswrapper[18592]: I0308 03:53:23.357441 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqrw9\" (UniqueName: 
\"kubernetes.io/projected/48ab3c8e-a2bd-4380-9e8d-a41d515a989d-kube-api-access-mqrw9\") pod \"console-operator-6c7fb6b958-mr9k6\" (UID: \"48ab3c8e-a2bd-4380-9e8d-a41d515a989d\") " pod="openshift-console-operator/console-operator-6c7fb6b958-mr9k6" Mar 08 03:53:23.357899 master-0 kubenswrapper[18592]: I0308 03:53:23.357543 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/48ab3c8e-a2bd-4380-9e8d-a41d515a989d-trusted-ca\") pod \"console-operator-6c7fb6b958-mr9k6\" (UID: \"48ab3c8e-a2bd-4380-9e8d-a41d515a989d\") " pod="openshift-console-operator/console-operator-6c7fb6b958-mr9k6" Mar 08 03:53:23.357899 master-0 kubenswrapper[18592]: I0308 03:53:23.357566 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48ab3c8e-a2bd-4380-9e8d-a41d515a989d-config\") pod \"console-operator-6c7fb6b958-mr9k6\" (UID: \"48ab3c8e-a2bd-4380-9e8d-a41d515a989d\") " pod="openshift-console-operator/console-operator-6c7fb6b958-mr9k6" Mar 08 03:53:23.360933 master-0 kubenswrapper[18592]: I0308 03:53:23.357929 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48ab3c8e-a2bd-4380-9e8d-a41d515a989d-serving-cert\") pod \"console-operator-6c7fb6b958-mr9k6\" (UID: \"48ab3c8e-a2bd-4380-9e8d-a41d515a989d\") " pod="openshift-console-operator/console-operator-6c7fb6b958-mr9k6" Mar 08 03:53:23.360933 master-0 kubenswrapper[18592]: E0308 03:53:23.358085 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/48ab3c8e-a2bd-4380-9e8d-a41d515a989d-trusted-ca podName:48ab3c8e-a2bd-4380-9e8d-a41d515a989d nodeName:}" failed. No retries permitted until 2026-03-08 03:53:23.858062471 +0000 UTC m=+15.956816821 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "trusted-ca" (UniqueName: "kubernetes.io/configmap/48ab3c8e-a2bd-4380-9e8d-a41d515a989d-trusted-ca") pod "console-operator-6c7fb6b958-mr9k6" (UID: "48ab3c8e-a2bd-4380-9e8d-a41d515a989d") : configmap references non-existent config key: ca-bundle.crt Mar 08 03:53:23.360933 master-0 kubenswrapper[18592]: I0308 03:53:23.358729 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48ab3c8e-a2bd-4380-9e8d-a41d515a989d-config\") pod \"console-operator-6c7fb6b958-mr9k6\" (UID: \"48ab3c8e-a2bd-4380-9e8d-a41d515a989d\") " pod="openshift-console-operator/console-operator-6c7fb6b958-mr9k6" Mar 08 03:53:23.366436 master-0 kubenswrapper[18592]: I0308 03:53:23.365331 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48ab3c8e-a2bd-4380-9e8d-a41d515a989d-serving-cert\") pod \"console-operator-6c7fb6b958-mr9k6\" (UID: \"48ab3c8e-a2bd-4380-9e8d-a41d515a989d\") " pod="openshift-console-operator/console-operator-6c7fb6b958-mr9k6" Mar 08 03:53:23.395470 master-0 kubenswrapper[18592]: I0308 03:53:23.395430 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqrw9\" (UniqueName: \"kubernetes.io/projected/48ab3c8e-a2bd-4380-9e8d-a41d515a989d-kube-api-access-mqrw9\") pod \"console-operator-6c7fb6b958-mr9k6\" (UID: \"48ab3c8e-a2bd-4380-9e8d-a41d515a989d\") " pod="openshift-console-operator/console-operator-6c7fb6b958-mr9k6" Mar 08 03:53:23.481255 master-0 kubenswrapper[18592]: I0308 03:53:23.481132 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:53:23.866749 master-0 kubenswrapper[18592]: I0308 03:53:23.866645 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/48ab3c8e-a2bd-4380-9e8d-a41d515a989d-trusted-ca\") pod \"console-operator-6c7fb6b958-mr9k6\" (UID: \"48ab3c8e-a2bd-4380-9e8d-a41d515a989d\") " pod="openshift-console-operator/console-operator-6c7fb6b958-mr9k6" Mar 08 03:53:23.867300 master-0 kubenswrapper[18592]: E0308 03:53:23.866895 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/48ab3c8e-a2bd-4380-9e8d-a41d515a989d-trusted-ca podName:48ab3c8e-a2bd-4380-9e8d-a41d515a989d nodeName:}" failed. No retries permitted until 2026-03-08 03:53:24.866871766 +0000 UTC m=+16.965626126 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "trusted-ca" (UniqueName: "kubernetes.io/configmap/48ab3c8e-a2bd-4380-9e8d-a41d515a989d-trusted-ca") pod "console-operator-6c7fb6b958-mr9k6" (UID: "48ab3c8e-a2bd-4380-9e8d-a41d515a989d") : configmap references non-existent config key: ca-bundle.crt Mar 08 03:53:23.876771 master-0 kubenswrapper[18592]: I0308 03:53:23.876726 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-nn467"] Mar 08 03:53:23.879569 master-0 kubenswrapper[18592]: I0308 03:53:23.879531 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-nn467" Mar 08 03:53:23.881503 master-0 kubenswrapper[18592]: I0308 03:53:23.881464 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 08 03:53:23.882681 master-0 kubenswrapper[18592]: I0308 03:53:23.882656 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 08 03:53:23.968520 master-0 kubenswrapper[18592]: I0308 03:53:23.968469 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrjd4\" (UniqueName: \"kubernetes.io/projected/491fe54e-cac5-4ea2-b745-c5fc9bad3ca0-kube-api-access-qrjd4\") pod \"machine-config-server-nn467\" (UID: \"491fe54e-cac5-4ea2-b745-c5fc9bad3ca0\") " pod="openshift-machine-config-operator/machine-config-server-nn467" Mar 08 03:53:23.968691 master-0 kubenswrapper[18592]: I0308 03:53:23.968558 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/491fe54e-cac5-4ea2-b745-c5fc9bad3ca0-certs\") pod \"machine-config-server-nn467\" (UID: \"491fe54e-cac5-4ea2-b745-c5fc9bad3ca0\") " pod="openshift-machine-config-operator/machine-config-server-nn467" Mar 08 03:53:23.968691 master-0 kubenswrapper[18592]: I0308 03:53:23.968593 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/491fe54e-cac5-4ea2-b745-c5fc9bad3ca0-node-bootstrap-token\") pod \"machine-config-server-nn467\" (UID: \"491fe54e-cac5-4ea2-b745-c5fc9bad3ca0\") " pod="openshift-machine-config-operator/machine-config-server-nn467" Mar 08 03:53:24.069563 master-0 kubenswrapper[18592]: I0308 03:53:24.069500 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-qrjd4\" (UniqueName: \"kubernetes.io/projected/491fe54e-cac5-4ea2-b745-c5fc9bad3ca0-kube-api-access-qrjd4\") pod \"machine-config-server-nn467\" (UID: \"491fe54e-cac5-4ea2-b745-c5fc9bad3ca0\") " pod="openshift-machine-config-operator/machine-config-server-nn467" Mar 08 03:53:24.069757 master-0 kubenswrapper[18592]: I0308 03:53:24.069627 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/491fe54e-cac5-4ea2-b745-c5fc9bad3ca0-certs\") pod \"machine-config-server-nn467\" (UID: \"491fe54e-cac5-4ea2-b745-c5fc9bad3ca0\") " pod="openshift-machine-config-operator/machine-config-server-nn467" Mar 08 03:53:24.069757 master-0 kubenswrapper[18592]: I0308 03:53:24.069663 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/491fe54e-cac5-4ea2-b745-c5fc9bad3ca0-node-bootstrap-token\") pod \"machine-config-server-nn467\" (UID: \"491fe54e-cac5-4ea2-b745-c5fc9bad3ca0\") " pod="openshift-machine-config-operator/machine-config-server-nn467" Mar 08 03:53:24.074453 master-0 kubenswrapper[18592]: I0308 03:53:24.074431 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/491fe54e-cac5-4ea2-b745-c5fc9bad3ca0-node-bootstrap-token\") pod \"machine-config-server-nn467\" (UID: \"491fe54e-cac5-4ea2-b745-c5fc9bad3ca0\") " pod="openshift-machine-config-operator/machine-config-server-nn467" Mar 08 03:53:24.075133 master-0 kubenswrapper[18592]: I0308 03:53:24.075097 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/491fe54e-cac5-4ea2-b745-c5fc9bad3ca0-certs\") pod \"machine-config-server-nn467\" (UID: \"491fe54e-cac5-4ea2-b745-c5fc9bad3ca0\") " pod="openshift-machine-config-operator/machine-config-server-nn467" Mar 08 03:53:24.100026 master-0 kubenswrapper[18592]: 
I0308 03:53:24.099987 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrjd4\" (UniqueName: \"kubernetes.io/projected/491fe54e-cac5-4ea2-b745-c5fc9bad3ca0-kube-api-access-qrjd4\") pod \"machine-config-server-nn467\" (UID: \"491fe54e-cac5-4ea2-b745-c5fc9bad3ca0\") " pod="openshift-machine-config-operator/machine-config-server-nn467" Mar 08 03:53:24.216535 master-0 kubenswrapper[18592]: I0308 03:53:24.216501 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-nn467" Mar 08 03:53:24.879299 master-0 kubenswrapper[18592]: I0308 03:53:24.879246 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/48ab3c8e-a2bd-4380-9e8d-a41d515a989d-trusted-ca\") pod \"console-operator-6c7fb6b958-mr9k6\" (UID: \"48ab3c8e-a2bd-4380-9e8d-a41d515a989d\") " pod="openshift-console-operator/console-operator-6c7fb6b958-mr9k6" Mar 08 03:53:24.879884 master-0 kubenswrapper[18592]: E0308 03:53:24.879405 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/48ab3c8e-a2bd-4380-9e8d-a41d515a989d-trusted-ca podName:48ab3c8e-a2bd-4380-9e8d-a41d515a989d nodeName:}" failed. No retries permitted until 2026-03-08 03:53:26.879392168 +0000 UTC m=+18.978146508 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "trusted-ca" (UniqueName: "kubernetes.io/configmap/48ab3c8e-a2bd-4380-9e8d-a41d515a989d-trusted-ca") pod "console-operator-6c7fb6b958-mr9k6" (UID: "48ab3c8e-a2bd-4380-9e8d-a41d515a989d") : configmap references non-existent config key: ca-bundle.crt Mar 08 03:53:25.006009 master-0 kubenswrapper[18592]: W0308 03:53:25.004897 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod491fe54e_cac5_4ea2_b745_c5fc9bad3ca0.slice/crio-e3683442d6fd2a0cb8df4fce92d4bc8696a85d0de5d31f06c46146bf34233a37 WatchSource:0}: Error finding container e3683442d6fd2a0cb8df4fce92d4bc8696a85d0de5d31f06c46146bf34233a37: Status 404 returned error can't find the container with id e3683442d6fd2a0cb8df4fce92d4bc8696a85d0de5d31f06c46146bf34233a37 Mar 08 03:53:25.278114 master-0 kubenswrapper[18592]: I0308 03:53:25.277966 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:53:25.278400 master-0 kubenswrapper[18592]: I0308 03:53:25.278163 18592 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 08 03:53:25.309707 master-0 kubenswrapper[18592]: I0308 03:53:25.309672 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jc6rf" Mar 08 03:53:25.707001 master-0 kubenswrapper[18592]: I0308 03:53:25.706903 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-nn467" event={"ID":"491fe54e-cac5-4ea2-b745-c5fc9bad3ca0","Type":"ContainerStarted","Data":"a4216e4c539f78b5bad2470d34d191e4da24b5c1090ee50b5ab1cf4138402270"} Mar 08 03:53:25.707467 master-0 kubenswrapper[18592]: I0308 03:53:25.707425 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-nn467" 
event={"ID":"491fe54e-cac5-4ea2-b745-c5fc9bad3ca0","Type":"ContainerStarted","Data":"e3683442d6fd2a0cb8df4fce92d4bc8696a85d0de5d31f06c46146bf34233a37"} Mar 08 03:53:25.709685 master-0 kubenswrapper[18592]: I0308 03:53:25.709642 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-79f8cd6fdd-zmrb9" event={"ID":"acaa7c53-f877-480c-8f36-58af35e0e305","Type":"ContainerStarted","Data":"8f3536d03fe9387bce461651e89775257b36f3feec28c60c952ede78d5773070"} Mar 08 03:53:25.712387 master-0 kubenswrapper[18592]: I0308 03:53:25.712318 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-kkrx6" event={"ID":"ec003aa0-e60e-4c9b-8110-48502405d3a7","Type":"ContainerStarted","Data":"a6bf97a1df882eb823b70a1002418a6af8c03ece41359f5366c5361cfdc612a2"} Mar 08 03:53:25.760914 master-0 kubenswrapper[18592]: I0308 03:53:25.760785 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-nn467" podStartSLOduration=2.7607620539999997 podStartE2EDuration="2.760762054s" podCreationTimestamp="2026-03-08 03:53:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:53:25.759781721 +0000 UTC m=+17.858536081" watchObservedRunningTime="2026-03-08 03:53:25.760762054 +0000 UTC m=+17.859516424" Mar 08 03:53:25.847137 master-0 kubenswrapper[18592]: I0308 03:53:25.847036 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-kkrx6" podStartSLOduration=316.398832591 podStartE2EDuration="5m19.847014062s" podCreationTimestamp="2026-03-08 03:48:06 +0000 UTC" firstStartedPulling="2026-03-08 03:53:21.560970263 +0000 UTC m=+13.659724623" lastFinishedPulling="2026-03-08 03:53:25.009151744 +0000 UTC m=+17.107906094" 
observedRunningTime="2026-03-08 03:53:25.813622986 +0000 UTC m=+17.912377346" watchObservedRunningTime="2026-03-08 03:53:25.847014062 +0000 UTC m=+17.945768422" Mar 08 03:53:25.847760 master-0 kubenswrapper[18592]: I0308 03:53:25.847712 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-79f8cd6fdd-zmrb9" podStartSLOduration=313.867104411 podStartE2EDuration="5m17.847705867s" podCreationTimestamp="2026-03-08 03:48:08 +0000 UTC" firstStartedPulling="2026-03-08 03:53:21.02037825 +0000 UTC m=+13.119132590" lastFinishedPulling="2026-03-08 03:53:25.000979686 +0000 UTC m=+17.099734046" observedRunningTime="2026-03-08 03:53:25.843618633 +0000 UTC m=+17.942372993" watchObservedRunningTime="2026-03-08 03:53:25.847705867 +0000 UTC m=+17.946460227" Mar 08 03:53:26.004233 master-0 kubenswrapper[18592]: I0308 03:53:26.004093 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-79f8cd6fdd-zmrb9" Mar 08 03:53:26.007951 master-0 kubenswrapper[18592]: I0308 03:53:26.007905 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-79f8cd6fdd-zmrb9" Mar 08 03:53:26.719629 master-0 kubenswrapper[18592]: I0308 03:53:26.719540 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-79f8cd6fdd-zmrb9" Mar 08 03:53:26.719908 master-0 kubenswrapper[18592]: I0308 03:53:26.719722 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-kkrx6" Mar 08 03:53:26.724398 master-0 kubenswrapper[18592]: I0308 03:53:26.724319 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-79f8cd6fdd-zmrb9" Mar 08 03:53:26.730148 master-0 kubenswrapper[18592]: I0308 03:53:26.729895 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-kkrx6" Mar 08 03:53:26.911913 master-0 kubenswrapper[18592]: I0308 03:53:26.911851 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/48ab3c8e-a2bd-4380-9e8d-a41d515a989d-trusted-ca\") pod \"console-operator-6c7fb6b958-mr9k6\" (UID: \"48ab3c8e-a2bd-4380-9e8d-a41d515a989d\") " pod="openshift-console-operator/console-operator-6c7fb6b958-mr9k6" Mar 08 03:53:26.912161 master-0 kubenswrapper[18592]: E0308 03:53:26.912122 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/48ab3c8e-a2bd-4380-9e8d-a41d515a989d-trusted-ca podName:48ab3c8e-a2bd-4380-9e8d-a41d515a989d nodeName:}" failed. No retries permitted until 2026-03-08 03:53:30.912099019 +0000 UTC m=+23.010853379 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "trusted-ca" (UniqueName: "kubernetes.io/configmap/48ab3c8e-a2bd-4380-9e8d-a41d515a989d-trusted-ca") pod "console-operator-6c7fb6b958-mr9k6" (UID: "48ab3c8e-a2bd-4380-9e8d-a41d515a989d") : configmap references non-existent config key: ca-bundle.crt Mar 08 03:53:27.545485 master-0 kubenswrapper[18592]: I0308 03:53:27.545406 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lwt58" Mar 08 03:53:27.605436 master-0 kubenswrapper[18592]: I0308 03:53:27.605351 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lwt58" Mar 08 03:53:27.660431 master-0 kubenswrapper[18592]: I0308 03:53:27.659754 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5ff8674d55-lv2h9"] Mar 08 03:53:27.660708 master-0 kubenswrapper[18592]: I0308 03:53:27.660618 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5ff8674d55-lv2h9" Mar 08 03:53:27.662137 master-0 kubenswrapper[18592]: I0308 03:53:27.662089 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-xbmc7" Mar 08 03:53:27.662419 master-0 kubenswrapper[18592]: I0308 03:53:27.662379 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config" Mar 08 03:53:27.662946 master-0 kubenswrapper[18592]: I0308 03:53:27.662889 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls" Mar 08 03:53:27.664044 master-0 kubenswrapper[18592]: I0308 03:53:27.663989 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca" Mar 08 03:53:27.732160 master-0 kubenswrapper[18592]: I0308 03:53:27.725508 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x487n\" (UniqueName: \"kubernetes.io/projected/46b636ff-fb55-4e68-9836-04e46bd462ee-kube-api-access-x487n\") pod \"prometheus-operator-5ff8674d55-lv2h9\" (UID: \"46b636ff-fb55-4e68-9836-04e46bd462ee\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-lv2h9" Mar 08 03:53:27.732160 master-0 kubenswrapper[18592]: I0308 03:53:27.725595 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/46b636ff-fb55-4e68-9836-04e46bd462ee-metrics-client-ca\") pod \"prometheus-operator-5ff8674d55-lv2h9\" (UID: \"46b636ff-fb55-4e68-9836-04e46bd462ee\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-lv2h9" Mar 08 03:53:27.732160 master-0 kubenswrapper[18592]: I0308 03:53:27.725927 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/46b636ff-fb55-4e68-9836-04e46bd462ee-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5ff8674d55-lv2h9\" (UID: \"46b636ff-fb55-4e68-9836-04e46bd462ee\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-lv2h9" Mar 08 03:53:27.732160 master-0 kubenswrapper[18592]: I0308 03:53:27.726053 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/46b636ff-fb55-4e68-9836-04e46bd462ee-prometheus-operator-tls\") pod \"prometheus-operator-5ff8674d55-lv2h9\" (UID: \"46b636ff-fb55-4e68-9836-04e46bd462ee\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-lv2h9" Mar 08 03:53:27.747467 master-0 kubenswrapper[18592]: I0308 03:53:27.747368 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5ff8674d55-lv2h9"] Mar 08 03:53:27.828161 master-0 kubenswrapper[18592]: I0308 03:53:27.828076 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/46b636ff-fb55-4e68-9836-04e46bd462ee-metrics-client-ca\") pod \"prometheus-operator-5ff8674d55-lv2h9\" (UID: \"46b636ff-fb55-4e68-9836-04e46bd462ee\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-lv2h9" Mar 08 03:53:27.828968 master-0 kubenswrapper[18592]: I0308 03:53:27.828443 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/46b636ff-fb55-4e68-9836-04e46bd462ee-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5ff8674d55-lv2h9\" (UID: \"46b636ff-fb55-4e68-9836-04e46bd462ee\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-lv2h9" Mar 08 03:53:27.828968 master-0 kubenswrapper[18592]: I0308 03:53:27.828908 18592 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/46b636ff-fb55-4e68-9836-04e46bd462ee-prometheus-operator-tls\") pod \"prometheus-operator-5ff8674d55-lv2h9\" (UID: \"46b636ff-fb55-4e68-9836-04e46bd462ee\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-lv2h9" Mar 08 03:53:27.828968 master-0 kubenswrapper[18592]: I0308 03:53:27.828948 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/46b636ff-fb55-4e68-9836-04e46bd462ee-metrics-client-ca\") pod \"prometheus-operator-5ff8674d55-lv2h9\" (UID: \"46b636ff-fb55-4e68-9836-04e46bd462ee\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-lv2h9" Mar 08 03:53:27.829225 master-0 kubenswrapper[18592]: E0308 03:53:27.829047 18592 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Mar 08 03:53:27.829225 master-0 kubenswrapper[18592]: E0308 03:53:27.829163 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46b636ff-fb55-4e68-9836-04e46bd462ee-prometheus-operator-tls podName:46b636ff-fb55-4e68-9836-04e46bd462ee nodeName:}" failed. No retries permitted until 2026-03-08 03:53:28.329128422 +0000 UTC m=+20.427882852 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/46b636ff-fb55-4e68-9836-04e46bd462ee-prometheus-operator-tls") pod "prometheus-operator-5ff8674d55-lv2h9" (UID: "46b636ff-fb55-4e68-9836-04e46bd462ee") : secret "prometheus-operator-tls" not found Mar 08 03:53:27.829358 master-0 kubenswrapper[18592]: I0308 03:53:27.829259 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x487n\" (UniqueName: \"kubernetes.io/projected/46b636ff-fb55-4e68-9836-04e46bd462ee-kube-api-access-x487n\") pod \"prometheus-operator-5ff8674d55-lv2h9\" (UID: \"46b636ff-fb55-4e68-9836-04e46bd462ee\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-lv2h9" Mar 08 03:53:27.833738 master-0 kubenswrapper[18592]: I0308 03:53:27.833683 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/46b636ff-fb55-4e68-9836-04e46bd462ee-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5ff8674d55-lv2h9\" (UID: \"46b636ff-fb55-4e68-9836-04e46bd462ee\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-lv2h9" Mar 08 03:53:28.235226 master-0 kubenswrapper[18592]: I0308 03:53:28.233872 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9c95709c-c3cb-46fb-afe7-626c8013f3c6-kube-api-access\") pod \"installer-1-retry-1-master-0\" (UID: \"9c95709c-c3cb-46fb-afe7-626c8013f3c6\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 08 03:53:28.235226 master-0 kubenswrapper[18592]: E0308 03:53:28.234112 18592 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 08 03:53:28.235226 master-0 kubenswrapper[18592]: E0308 03:53:28.234141 18592 projected.go:194] Error preparing data for 
projected volume kube-api-access for pod openshift-kube-apiserver/installer-1-retry-1-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 08 03:53:28.235226 master-0 kubenswrapper[18592]: E0308 03:53:28.234206 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9c95709c-c3cb-46fb-afe7-626c8013f3c6-kube-api-access podName:9c95709c-c3cb-46fb-afe7-626c8013f3c6 nodeName:}" failed. No retries permitted until 2026-03-08 03:53:44.234183518 +0000 UTC m=+36.332937908 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/9c95709c-c3cb-46fb-afe7-626c8013f3c6-kube-api-access") pod "installer-1-retry-1-master-0" (UID: "9c95709c-c3cb-46fb-afe7-626c8013f3c6") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 08 03:53:28.269377 master-0 kubenswrapper[18592]: I0308 03:53:28.269305 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-p8nq8"
Mar 08 03:53:28.331023 master-0 kubenswrapper[18592]: I0308 03:53:28.330957 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-p8nq8"
Mar 08 03:53:28.334853 master-0 kubenswrapper[18592]: I0308 03:53:28.334784 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/46b636ff-fb55-4e68-9836-04e46bd462ee-prometheus-operator-tls\") pod \"prometheus-operator-5ff8674d55-lv2h9\" (UID: \"46b636ff-fb55-4e68-9836-04e46bd462ee\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-lv2h9"
Mar 08 03:53:28.359132 master-0 kubenswrapper[18592]: I0308 03:53:28.359063 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/46b636ff-fb55-4e68-9836-04e46bd462ee-prometheus-operator-tls\") pod \"prometheus-operator-5ff8674d55-lv2h9\" (UID: \"46b636ff-fb55-4e68-9836-04e46bd462ee\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-lv2h9"
Mar 08 03:53:28.382849 master-0 kubenswrapper[18592]: I0308 03:53:28.379898 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-4-master-0"]
Mar 08 03:53:28.382849 master-0 kubenswrapper[18592]: I0308 03:53:28.381116 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-4-master-0"
Mar 08 03:53:28.382849 master-0 kubenswrapper[18592]: I0308 03:53:28.382088 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-4-master-0"]
Mar 08 03:53:28.387877 master-0 kubenswrapper[18592]: I0308 03:53:28.385279 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x487n\" (UniqueName: \"kubernetes.io/projected/46b636ff-fb55-4e68-9836-04e46bd462ee-kube-api-access-x487n\") pod \"prometheus-operator-5ff8674d55-lv2h9\" (UID: \"46b636ff-fb55-4e68-9836-04e46bd462ee\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-lv2h9"
Mar 08 03:53:28.391850 master-0 kubenswrapper[18592]: I0308 03:53:28.388346 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt"
Mar 08 03:53:28.391850 master-0 kubenswrapper[18592]: I0308 03:53:28.388354 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-7rcx9"
Mar 08 03:53:28.452106 master-0 kubenswrapper[18592]: I0308 03:53:28.445747 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e5dd61e1-5034-4d59-b752-9a4f6adb92d8-var-lock\") pod \"installer-4-master-0\" (UID: \"e5dd61e1-5034-4d59-b752-9a4f6adb92d8\") " pod="openshift-kube-controller-manager/installer-4-master-0"
Mar 08 03:53:28.452106 master-0 kubenswrapper[18592]: I0308 03:53:28.445876 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e5dd61e1-5034-4d59-b752-9a4f6adb92d8-kube-api-access\") pod \"installer-4-master-0\" (UID: \"e5dd61e1-5034-4d59-b752-9a4f6adb92d8\") " pod="openshift-kube-controller-manager/installer-4-master-0"
Mar 08 03:53:28.452106 master-0 kubenswrapper[18592]: I0308 03:53:28.445902 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e5dd61e1-5034-4d59-b752-9a4f6adb92d8-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"e5dd61e1-5034-4d59-b752-9a4f6adb92d8\") " pod="openshift-kube-controller-manager/installer-4-master-0"
Mar 08 03:53:28.546630 master-0 kubenswrapper[18592]: I0308 03:53:28.546575 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e5dd61e1-5034-4d59-b752-9a4f6adb92d8-kube-api-access\") pod \"installer-4-master-0\" (UID: \"e5dd61e1-5034-4d59-b752-9a4f6adb92d8\") " pod="openshift-kube-controller-manager/installer-4-master-0"
Mar 08 03:53:28.546630 master-0 kubenswrapper[18592]: I0308 03:53:28.546634 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e5dd61e1-5034-4d59-b752-9a4f6adb92d8-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"e5dd61e1-5034-4d59-b752-9a4f6adb92d8\") " pod="openshift-kube-controller-manager/installer-4-master-0"
Mar 08 03:53:28.547462 master-0 kubenswrapper[18592]: I0308 03:53:28.546690 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e5dd61e1-5034-4d59-b752-9a4f6adb92d8-var-lock\") pod \"installer-4-master-0\" (UID: \"e5dd61e1-5034-4d59-b752-9a4f6adb92d8\") " pod="openshift-kube-controller-manager/installer-4-master-0"
Mar 08 03:53:28.547462 master-0 kubenswrapper[18592]: I0308 03:53:28.546761 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e5dd61e1-5034-4d59-b752-9a4f6adb92d8-var-lock\") pod \"installer-4-master-0\" (UID: \"e5dd61e1-5034-4d59-b752-9a4f6adb92d8\") " pod="openshift-kube-controller-manager/installer-4-master-0"
Mar 08 03:53:28.547462 master-0 kubenswrapper[18592]: I0308 03:53:28.546807 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e5dd61e1-5034-4d59-b752-9a4f6adb92d8-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"e5dd61e1-5034-4d59-b752-9a4f6adb92d8\") " pod="openshift-kube-controller-manager/installer-4-master-0"
Mar 08 03:53:28.564674 master-0 kubenswrapper[18592]: I0308 03:53:28.564618 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e5dd61e1-5034-4d59-b752-9a4f6adb92d8-kube-api-access\") pod \"installer-4-master-0\" (UID: \"e5dd61e1-5034-4d59-b752-9a4f6adb92d8\") " pod="openshift-kube-controller-manager/installer-4-master-0"
Mar 08 03:53:28.588596 master-0 kubenswrapper[18592]: I0308 03:53:28.588548 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5ff8674d55-lv2h9"
Mar 08 03:53:28.771813 master-0 kubenswrapper[18592]: I0308 03:53:28.771733 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-4-master-0"
Mar 08 03:53:29.069345 master-0 kubenswrapper[18592]: I0308 03:53:29.069298 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5ff8674d55-lv2h9"]
Mar 08 03:53:29.072132 master-0 kubenswrapper[18592]: W0308 03:53:29.072067 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46b636ff_fb55_4e68_9836_04e46bd462ee.slice/crio-9df9fa6b3a07d69ed586d48479eb7f1b9bbed31d641fd07d2e1ff34b993a6c45 WatchSource:0}: Error finding container 9df9fa6b3a07d69ed586d48479eb7f1b9bbed31d641fd07d2e1ff34b993a6c45: Status 404 returned error can't find the container with id 9df9fa6b3a07d69ed586d48479eb7f1b9bbed31d641fd07d2e1ff34b993a6c45
Mar 08 03:53:29.414288 master-0 kubenswrapper[18592]: I0308 03:53:29.413818 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-4-master-0"]
Mar 08 03:53:29.425304 master-0 kubenswrapper[18592]: W0308 03:53:29.425236 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pode5dd61e1_5034_4d59_b752_9a4f6adb92d8.slice/crio-34dd4de04e0481f8dceb9d2e1dc60475fecde9f20ac0b059e7cfbc6cbaa160ff WatchSource:0}: Error finding container 34dd4de04e0481f8dceb9d2e1dc60475fecde9f20ac0b059e7cfbc6cbaa160ff: Status 404 returned error can't find the container with id 34dd4de04e0481f8dceb9d2e1dc60475fecde9f20ac0b059e7cfbc6cbaa160ff
Mar 08 03:53:29.750734 master-0 kubenswrapper[18592]: I0308 03:53:29.750658 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5ff8674d55-lv2h9" event={"ID":"46b636ff-fb55-4e68-9836-04e46bd462ee","Type":"ContainerStarted","Data":"9df9fa6b3a07d69ed586d48479eb7f1b9bbed31d641fd07d2e1ff34b993a6c45"}
Mar 08 03:53:29.758302 master-0 kubenswrapper[18592]: I0308 03:53:29.752687 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-4-master-0" event={"ID":"e5dd61e1-5034-4d59-b752-9a4f6adb92d8","Type":"ContainerStarted","Data":"34dd4de04e0481f8dceb9d2e1dc60475fecde9f20ac0b059e7cfbc6cbaa160ff"}
Mar 08 03:53:30.764256 master-0 kubenswrapper[18592]: I0308 03:53:30.764198 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-4-master-0" event={"ID":"e5dd61e1-5034-4d59-b752-9a4f6adb92d8","Type":"ContainerStarted","Data":"923c74cb0eb187b16aa30ad8f198a5424dc4e0bd386f1790150cfb8a6ba477c4"}
Mar 08 03:53:30.787664 master-0 kubenswrapper[18592]: I0308 03:53:30.787602 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-4-master-0" podStartSLOduration=2.787586926 podStartE2EDuration="2.787586926s" podCreationTimestamp="2026-03-08 03:53:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:53:30.785570631 +0000 UTC m=+22.884324981" watchObservedRunningTime="2026-03-08 03:53:30.787586926 +0000 UTC m=+22.886341276"
Mar 08 03:53:30.928057 master-0 kubenswrapper[18592]: I0308 03:53:30.927918 18592 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"]
Mar 08 03:53:30.928247 master-0 kubenswrapper[18592]: I0308 03:53:30.928167 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podUID="f417e14665db2ffffa887ce21c9ff0ed" containerName="startup-monitor" containerID="cri-o://0bbd3b73d51b06514693db13893aa6ce69354b9ab4f18d355441678c9479dc95" gracePeriod=5
Mar 08 03:53:30.998467 master-0 kubenswrapper[18592]: I0308 03:53:30.998365 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/48ab3c8e-a2bd-4380-9e8d-a41d515a989d-trusted-ca\") pod \"console-operator-6c7fb6b958-mr9k6\" (UID: \"48ab3c8e-a2bd-4380-9e8d-a41d515a989d\") " pod="openshift-console-operator/console-operator-6c7fb6b958-mr9k6"
Mar 08 03:53:30.998668 master-0 kubenswrapper[18592]: E0308 03:53:30.998565 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/48ab3c8e-a2bd-4380-9e8d-a41d515a989d-trusted-ca podName:48ab3c8e-a2bd-4380-9e8d-a41d515a989d nodeName:}" failed. No retries permitted until 2026-03-08 03:53:38.998544513 +0000 UTC m=+31.097298863 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "trusted-ca" (UniqueName: "kubernetes.io/configmap/48ab3c8e-a2bd-4380-9e8d-a41d515a989d-trusted-ca") pod "console-operator-6c7fb6b958-mr9k6" (UID: "48ab3c8e-a2bd-4380-9e8d-a41d515a989d") : configmap references non-existent config key: ca-bundle.crt
Mar 08 03:53:31.773643 master-0 kubenswrapper[18592]: I0308 03:53:31.773569 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5ff8674d55-lv2h9" event={"ID":"46b636ff-fb55-4e68-9836-04e46bd462ee","Type":"ContainerStarted","Data":"fe4358bdb24bef0948bf77f233ef05a526d1f4b81668805dc2b85820ef96e545"}
Mar 08 03:53:31.774261 master-0 kubenswrapper[18592]: I0308 03:53:31.773681 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5ff8674d55-lv2h9" event={"ID":"46b636ff-fb55-4e68-9836-04e46bd462ee","Type":"ContainerStarted","Data":"c1b832f68434ecd6d08c642a0d756fcaaaac024a60503ec2e84e83d4b71add11"}
Mar 08 03:53:31.797109 master-0 kubenswrapper[18592]: I0308 03:53:31.797031 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5ff8674d55-lv2h9" podStartSLOduration=2.586486792 podStartE2EDuration="4.797013609s" podCreationTimestamp="2026-03-08 03:53:27 +0000 UTC" firstStartedPulling="2026-03-08 03:53:29.080071192 +0000 UTC m=+21.178825542" lastFinishedPulling="2026-03-08 03:53:31.290598009 +0000 UTC m=+23.389352359" observedRunningTime="2026-03-08 03:53:31.794934331 +0000 UTC m=+23.893688671" watchObservedRunningTime="2026-03-08 03:53:31.797013609 +0000 UTC m=+23.895767979"
Mar 08 03:53:34.018866 master-0 kubenswrapper[18592]: I0308 03:53:34.018802 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-f2sxm"]
Mar 08 03:53:34.019391 master-0 kubenswrapper[18592]: E0308 03:53:34.019030 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f417e14665db2ffffa887ce21c9ff0ed" containerName="startup-monitor"
Mar 08 03:53:34.019391 master-0 kubenswrapper[18592]: I0308 03:53:34.019042 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="f417e14665db2ffffa887ce21c9ff0ed" containerName="startup-monitor"
Mar 08 03:53:34.019391 master-0 kubenswrapper[18592]: I0308 03:53:34.019168 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="f417e14665db2ffffa887ce21c9ff0ed" containerName="startup-monitor"
Mar 08 03:53:34.019924 master-0 kubenswrapper[18592]: I0308 03:53:34.019900 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-f2sxm"
Mar 08 03:53:34.021651 master-0 kubenswrapper[18592]: I0308 03:53:34.021608 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config"
Mar 08 03:53:34.021651 master-0 kubenswrapper[18592]: I0308 03:53:34.021630 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls"
Mar 08 03:53:34.052655 master-0 kubenswrapper[18592]: I0308 03:53:34.052609 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-74cc79fd76-gtrmb"]
Mar 08 03:53:34.053560 master-0 kubenswrapper[18592]: I0308 03:53:34.053536 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-68b88f8cb5-22jdz"]
Mar 08 03:53:34.054421 master-0 kubenswrapper[18592]: I0308 03:53:34.054377 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-gtrmb"
Mar 08 03:53:34.058168 master-0 kubenswrapper[18592]: I0308 03:53:34.058134 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls"
Mar 08 03:53:34.066497 master-0 kubenswrapper[18592]: I0308 03:53:34.064972 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-22jdz"
Mar 08 03:53:34.066497 master-0 kubenswrapper[18592]: I0308 03:53:34.065148 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config"
Mar 08 03:53:34.066713 master-0 kubenswrapper[18592]: I0308 03:53:34.066626 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config"
Mar 08 03:53:34.066748 master-0 kubenswrapper[18592]: I0308 03:53:34.066711 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls"
Mar 08 03:53:34.071436 master-0 kubenswrapper[18592]: I0308 03:53:34.071397 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap"
Mar 08 03:53:34.085996 master-0 kubenswrapper[18592]: I0308 03:53:34.085756 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-74cc79fd76-gtrmb"]
Mar 08 03:53:34.085996 master-0 kubenswrapper[18592]: I0308 03:53:34.085805 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-68b88f8cb5-22jdz"]
Mar 08 03:53:34.144012 master-0 kubenswrapper[18592]: I0308 03:53:34.143897 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/10d13e6c-631d-4753-b564-fd88ceb7d358-root\") pod \"node-exporter-f2sxm\" (UID: \"10d13e6c-631d-4753-b564-fd88ceb7d358\") " pod="openshift-monitoring/node-exporter-f2sxm"
Mar 08 03:53:34.144012 master-0 kubenswrapper[18592]: I0308 03:53:34.143946 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fa83f817-2611-4894-9bad-d9c8640520b3-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-74cc79fd76-gtrmb\" (UID: \"fa83f817-2611-4894-9bad-d9c8640520b3\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-gtrmb"
Mar 08 03:53:34.144237 master-0 kubenswrapper[18592]: I0308 03:53:34.144038 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/10d13e6c-631d-4753-b564-fd88ceb7d358-metrics-client-ca\") pod \"node-exporter-f2sxm\" (UID: \"10d13e6c-631d-4753-b564-fd88ceb7d358\") " pod="openshift-monitoring/node-exporter-f2sxm"
Mar 08 03:53:34.144237 master-0 kubenswrapper[18592]: I0308 03:53:34.144059 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fa83f817-2611-4894-9bad-d9c8640520b3-metrics-client-ca\") pod \"openshift-state-metrics-74cc79fd76-gtrmb\" (UID: \"fa83f817-2611-4894-9bad-d9c8640520b3\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-gtrmb"
Mar 08 03:53:34.144237 master-0 kubenswrapper[18592]: I0308 03:53:34.144080 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/93ebbf2d-6b34-40ae-9f2e-f861e8a20183-volume-directive-shadow\") pod \"kube-state-metrics-68b88f8cb5-22jdz\" (UID: \"93ebbf2d-6b34-40ae-9f2e-f861e8a20183\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-22jdz"
Mar 08 03:53:34.144237 master-0 kubenswrapper[18592]: I0308 03:53:34.144100 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/10d13e6c-631d-4753-b564-fd88ceb7d358-sys\") pod \"node-exporter-f2sxm\" (UID: \"10d13e6c-631d-4753-b564-fd88ceb7d358\") " pod="openshift-monitoring/node-exporter-f2sxm"
Mar 08 03:53:34.147775 master-0 kubenswrapper[18592]: I0308 03:53:34.145809 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hh6bp\" (UniqueName: \"kubernetes.io/projected/93ebbf2d-6b34-40ae-9f2e-f861e8a20183-kube-api-access-hh6bp\") pod \"kube-state-metrics-68b88f8cb5-22jdz\" (UID: \"93ebbf2d-6b34-40ae-9f2e-f861e8a20183\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-22jdz"
Mar 08 03:53:34.147775 master-0 kubenswrapper[18592]: I0308 03:53:34.145868 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/93ebbf2d-6b34-40ae-9f2e-f861e8a20183-kube-state-metrics-tls\") pod \"kube-state-metrics-68b88f8cb5-22jdz\" (UID: \"93ebbf2d-6b34-40ae-9f2e-f861e8a20183\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-22jdz"
Mar 08 03:53:34.147775 master-0 kubenswrapper[18592]: I0308 03:53:34.145887 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/10d13e6c-631d-4753-b564-fd88ceb7d358-node-exporter-wtmp\") pod \"node-exporter-f2sxm\" (UID: \"10d13e6c-631d-4753-b564-fd88ceb7d358\") " pod="openshift-monitoring/node-exporter-f2sxm"
Mar 08 03:53:34.147775 master-0 kubenswrapper[18592]: I0308 03:53:34.145968 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/10d13e6c-631d-4753-b564-fd88ceb7d358-node-exporter-textfile\") pod \"node-exporter-f2sxm\" (UID: \"10d13e6c-631d-4753-b564-fd88ceb7d358\") " pod="openshift-monitoring/node-exporter-f2sxm"
Mar 08 03:53:34.147775 master-0 kubenswrapper[18592]: I0308 03:53:34.146040 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pp5mm\" (UniqueName: \"kubernetes.io/projected/10d13e6c-631d-4753-b564-fd88ceb7d358-kube-api-access-pp5mm\") pod \"node-exporter-f2sxm\" (UID: \"10d13e6c-631d-4753-b564-fd88ceb7d358\") " pod="openshift-monitoring/node-exporter-f2sxm"
Mar 08 03:53:34.147775 master-0 kubenswrapper[18592]: I0308 03:53:34.146075 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/93ebbf2d-6b34-40ae-9f2e-f861e8a20183-metrics-client-ca\") pod \"kube-state-metrics-68b88f8cb5-22jdz\" (UID: \"93ebbf2d-6b34-40ae-9f2e-f861e8a20183\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-22jdz"
Mar 08 03:53:34.147775 master-0 kubenswrapper[18592]: I0308 03:53:34.146099 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfv9h\" (UniqueName: \"kubernetes.io/projected/fa83f817-2611-4894-9bad-d9c8640520b3-kube-api-access-sfv9h\") pod \"openshift-state-metrics-74cc79fd76-gtrmb\" (UID: \"fa83f817-2611-4894-9bad-d9c8640520b3\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-gtrmb"
Mar 08 03:53:34.147775 master-0 kubenswrapper[18592]: I0308 03:53:34.146146 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/93ebbf2d-6b34-40ae-9f2e-f861e8a20183-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-68b88f8cb5-22jdz\" (UID: \"93ebbf2d-6b34-40ae-9f2e-f861e8a20183\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-22jdz"
Mar 08 03:53:34.147775 master-0 kubenswrapper[18592]: I0308 03:53:34.146182 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/10d13e6c-631d-4753-b564-fd88ceb7d358-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-f2sxm\" (UID: \"10d13e6c-631d-4753-b564-fd88ceb7d358\") " pod="openshift-monitoring/node-exporter-f2sxm"
Mar 08 03:53:34.147775 master-0 kubenswrapper[18592]: I0308 03:53:34.146240 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/fa83f817-2611-4894-9bad-d9c8640520b3-openshift-state-metrics-tls\") pod \"openshift-state-metrics-74cc79fd76-gtrmb\" (UID: \"fa83f817-2611-4894-9bad-d9c8640520b3\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-gtrmb"
Mar 08 03:53:34.147775 master-0 kubenswrapper[18592]: I0308 03:53:34.146263 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/93ebbf2d-6b34-40ae-9f2e-f861e8a20183-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-68b88f8cb5-22jdz\" (UID: \"93ebbf2d-6b34-40ae-9f2e-f861e8a20183\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-22jdz"
Mar 08 03:53:34.147775 master-0 kubenswrapper[18592]: I0308 03:53:34.146306 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/10d13e6c-631d-4753-b564-fd88ceb7d358-node-exporter-tls\") pod \"node-exporter-f2sxm\" (UID: \"10d13e6c-631d-4753-b564-fd88ceb7d358\") " pod="openshift-monitoring/node-exporter-f2sxm"
Mar 08 03:53:34.247905 master-0 kubenswrapper[18592]: I0308 03:53:34.247837 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/fa83f817-2611-4894-9bad-d9c8640520b3-openshift-state-metrics-tls\") pod \"openshift-state-metrics-74cc79fd76-gtrmb\" (UID: \"fa83f817-2611-4894-9bad-d9c8640520b3\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-gtrmb"
Mar 08 03:53:34.248139 master-0 kubenswrapper[18592]: I0308 03:53:34.247916 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/93ebbf2d-6b34-40ae-9f2e-f861e8a20183-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-68b88f8cb5-22jdz\" (UID: \"93ebbf2d-6b34-40ae-9f2e-f861e8a20183\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-22jdz"
Mar 08 03:53:34.248139 master-0 kubenswrapper[18592]: I0308 03:53:34.247950 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/10d13e6c-631d-4753-b564-fd88ceb7d358-node-exporter-tls\") pod \"node-exporter-f2sxm\" (UID: \"10d13e6c-631d-4753-b564-fd88ceb7d358\") " pod="openshift-monitoring/node-exporter-f2sxm"
Mar 08 03:53:34.248139 master-0 kubenswrapper[18592]: I0308 03:53:34.248001 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/10d13e6c-631d-4753-b564-fd88ceb7d358-root\") pod \"node-exporter-f2sxm\" (UID: \"10d13e6c-631d-4753-b564-fd88ceb7d358\") " pod="openshift-monitoring/node-exporter-f2sxm"
Mar 08 03:53:34.248139 master-0 kubenswrapper[18592]: I0308 03:53:34.248028 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fa83f817-2611-4894-9bad-d9c8640520b3-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-74cc79fd76-gtrmb\" (UID: \"fa83f817-2611-4894-9bad-d9c8640520b3\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-gtrmb"
Mar 08 03:53:34.248139 master-0 kubenswrapper[18592]: I0308 03:53:34.248106 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/10d13e6c-631d-4753-b564-fd88ceb7d358-root\") pod \"node-exporter-f2sxm\" (UID: \"10d13e6c-631d-4753-b564-fd88ceb7d358\") " pod="openshift-monitoring/node-exporter-f2sxm"
Mar 08 03:53:34.248319 master-0 kubenswrapper[18592]: I0308 03:53:34.248275 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/10d13e6c-631d-4753-b564-fd88ceb7d358-metrics-client-ca\") pod \"node-exporter-f2sxm\" (UID: \"10d13e6c-631d-4753-b564-fd88ceb7d358\") " pod="openshift-monitoring/node-exporter-f2sxm"
Mar 08 03:53:34.248357 master-0 kubenswrapper[18592]: I0308 03:53:34.248334 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fa83f817-2611-4894-9bad-d9c8640520b3-metrics-client-ca\") pod \"openshift-state-metrics-74cc79fd76-gtrmb\" (UID: \"fa83f817-2611-4894-9bad-d9c8640520b3\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-gtrmb"
Mar 08 03:53:34.248393 master-0 kubenswrapper[18592]: I0308 03:53:34.248382 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/93ebbf2d-6b34-40ae-9f2e-f861e8a20183-volume-directive-shadow\") pod \"kube-state-metrics-68b88f8cb5-22jdz\" (UID: \"93ebbf2d-6b34-40ae-9f2e-f861e8a20183\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-22jdz"
Mar 08 03:53:34.248584 master-0 kubenswrapper[18592]: I0308 03:53:34.248550 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/10d13e6c-631d-4753-b564-fd88ceb7d358-sys\") pod \"node-exporter-f2sxm\" (UID: \"10d13e6c-631d-4753-b564-fd88ceb7d358\") " pod="openshift-monitoring/node-exporter-f2sxm"
Mar 08 03:53:34.248621 master-0 kubenswrapper[18592]: I0308 03:53:34.248604 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hh6bp\" (UniqueName: \"kubernetes.io/projected/93ebbf2d-6b34-40ae-9f2e-f861e8a20183-kube-api-access-hh6bp\") pod \"kube-state-metrics-68b88f8cb5-22jdz\" (UID: \"93ebbf2d-6b34-40ae-9f2e-f861e8a20183\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-22jdz"
Mar 08 03:53:34.248652 master-0 kubenswrapper[18592]: I0308 03:53:34.248639 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/93ebbf2d-6b34-40ae-9f2e-f861e8a20183-kube-state-metrics-tls\") pod \"kube-state-metrics-68b88f8cb5-22jdz\" (UID: \"93ebbf2d-6b34-40ae-9f2e-f861e8a20183\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-22jdz"
Mar 08 03:53:34.248688 master-0 kubenswrapper[18592]: I0308 03:53:34.248665 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/10d13e6c-631d-4753-b564-fd88ceb7d358-node-exporter-wtmp\") pod \"node-exporter-f2sxm\" (UID: \"10d13e6c-631d-4753-b564-fd88ceb7d358\") " pod="openshift-monitoring/node-exporter-f2sxm"
Mar 08 03:53:34.248719 master-0 kubenswrapper[18592]: I0308 03:53:34.248674 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/10d13e6c-631d-4753-b564-fd88ceb7d358-sys\") pod \"node-exporter-f2sxm\" (UID: \"10d13e6c-631d-4753-b564-fd88ceb7d358\") " pod="openshift-monitoring/node-exporter-f2sxm"
Mar 08 03:53:34.248719 master-0 kubenswrapper[18592]: I0308 03:53:34.248700 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/10d13e6c-631d-4753-b564-fd88ceb7d358-node-exporter-textfile\") pod \"node-exporter-f2sxm\" (UID: \"10d13e6c-631d-4753-b564-fd88ceb7d358\") " pod="openshift-monitoring/node-exporter-f2sxm"
Mar 08 03:53:34.248776 master-0 kubenswrapper[18592]: I0308 03:53:34.248747 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pp5mm\" (UniqueName: \"kubernetes.io/projected/10d13e6c-631d-4753-b564-fd88ceb7d358-kube-api-access-pp5mm\") pod \"node-exporter-f2sxm\" (UID: \"10d13e6c-631d-4753-b564-fd88ceb7d358\") " pod="openshift-monitoring/node-exporter-f2sxm"
Mar 08 03:53:34.248809 master-0 kubenswrapper[18592]: I0308 03:53:34.248785 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/93ebbf2d-6b34-40ae-9f2e-f861e8a20183-metrics-client-ca\") pod \"kube-state-metrics-68b88f8cb5-22jdz\" (UID: \"93ebbf2d-6b34-40ae-9f2e-f861e8a20183\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-22jdz"
Mar 08 03:53:34.248809 master-0 kubenswrapper[18592]: I0308 03:53:34.248793 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/93ebbf2d-6b34-40ae-9f2e-f861e8a20183-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-68b88f8cb5-22jdz\" (UID: \"93ebbf2d-6b34-40ae-9f2e-f861e8a20183\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-22jdz"
Mar 08 03:53:34.248889 master-0 kubenswrapper[18592]: I0308 03:53:34.248804 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfv9h\" (UniqueName: \"kubernetes.io/projected/fa83f817-2611-4894-9bad-d9c8640520b3-kube-api-access-sfv9h\") pod \"openshift-state-metrics-74cc79fd76-gtrmb\" (UID: \"fa83f817-2611-4894-9bad-d9c8640520b3\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-gtrmb"
Mar 08 03:53:34.248889 master-0 kubenswrapper[18592]: I0308 03:53:34.248866 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/93ebbf2d-6b34-40ae-9f2e-f861e8a20183-volume-directive-shadow\") pod \"kube-state-metrics-68b88f8cb5-22jdz\" (UID: \"93ebbf2d-6b34-40ae-9f2e-f861e8a20183\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-22jdz"
Mar 08 03:53:34.248944 master-0 kubenswrapper[18592]: I0308 03:53:34.248887 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/93ebbf2d-6b34-40ae-9f2e-f861e8a20183-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-68b88f8cb5-22jdz\" (UID: \"93ebbf2d-6b34-40ae-9f2e-f861e8a20183\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-22jdz"
Mar 08 03:53:34.248944 master-0 kubenswrapper[18592]: I0308 03:53:34.248924 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/10d13e6c-631d-4753-b564-fd88ceb7d358-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-f2sxm\" (UID: \"10d13e6c-631d-4753-b564-fd88ceb7d358\") " pod="openshift-monitoring/node-exporter-f2sxm"
Mar 08 03:53:34.249003 master-0 kubenswrapper[18592]: I0308 03:53:34.248979 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/10d13e6c-631d-4753-b564-fd88ceb7d358-metrics-client-ca\") pod \"node-exporter-f2sxm\" (UID: \"10d13e6c-631d-4753-b564-fd88ceb7d358\") " pod="openshift-monitoring/node-exporter-f2sxm"
Mar 08 03:53:34.249031 master-0 kubenswrapper[18592]: I0308 03:53:34.249005 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/10d13e6c-631d-4753-b564-fd88ceb7d358-node-exporter-textfile\") pod \"node-exporter-f2sxm\" (UID: \"10d13e6c-631d-4753-b564-fd88ceb7d358\") " pod="openshift-monitoring/node-exporter-f2sxm"
Mar 08 03:53:34.249135 master-0 kubenswrapper[18592]: I0308 03:53:34.249112 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/10d13e6c-631d-4753-b564-fd88ceb7d358-node-exporter-wtmp\") pod \"node-exporter-f2sxm\" (UID: \"10d13e6c-631d-4753-b564-fd88ceb7d358\") " pod="openshift-monitoring/node-exporter-f2sxm"
Mar 08 03:53:34.249514 master-0 kubenswrapper[18592]: I0308 03:53:34.249474 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fa83f817-2611-4894-9bad-d9c8640520b3-metrics-client-ca\") pod \"openshift-state-metrics-74cc79fd76-gtrmb\" (UID: \"fa83f817-2611-4894-9bad-d9c8640520b3\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-gtrmb"
Mar 08 03:53:34.249604 master-0 kubenswrapper[18592]: I0308 03:53:34.249582 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/93ebbf2d-6b34-40ae-9f2e-f861e8a20183-metrics-client-ca\") pod \"kube-state-metrics-68b88f8cb5-22jdz\" (UID: \"93ebbf2d-6b34-40ae-9f2e-f861e8a20183\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-22jdz"
Mar 08 03:53:34.251793 master-0 kubenswrapper[18592]: I0308 03:53:34.251716 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fa83f817-2611-4894-9bad-d9c8640520b3-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-74cc79fd76-gtrmb\" (UID: \"fa83f817-2611-4894-9bad-d9c8640520b3\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-gtrmb"
Mar 08 03:53:34.253523 master-0 kubenswrapper[18592]: I0308 03:53:34.252817 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/10d13e6c-631d-4753-b564-fd88ceb7d358-node-exporter-tls\") pod
\"node-exporter-f2sxm\" (UID: \"10d13e6c-631d-4753-b564-fd88ceb7d358\") " pod="openshift-monitoring/node-exporter-f2sxm" Mar 08 03:53:34.253711 master-0 kubenswrapper[18592]: I0308 03:53:34.253688 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/10d13e6c-631d-4753-b564-fd88ceb7d358-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-f2sxm\" (UID: \"10d13e6c-631d-4753-b564-fd88ceb7d358\") " pod="openshift-monitoring/node-exporter-f2sxm" Mar 08 03:53:34.253771 master-0 kubenswrapper[18592]: I0308 03:53:34.253741 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/fa83f817-2611-4894-9bad-d9c8640520b3-openshift-state-metrics-tls\") pod \"openshift-state-metrics-74cc79fd76-gtrmb\" (UID: \"fa83f817-2611-4894-9bad-d9c8640520b3\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-gtrmb" Mar 08 03:53:34.253993 master-0 kubenswrapper[18592]: I0308 03:53:34.253973 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/93ebbf2d-6b34-40ae-9f2e-f861e8a20183-kube-state-metrics-tls\") pod \"kube-state-metrics-68b88f8cb5-22jdz\" (UID: \"93ebbf2d-6b34-40ae-9f2e-f861e8a20183\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-22jdz" Mar 08 03:53:34.254032 master-0 kubenswrapper[18592]: I0308 03:53:34.253996 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/93ebbf2d-6b34-40ae-9f2e-f861e8a20183-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-68b88f8cb5-22jdz\" (UID: \"93ebbf2d-6b34-40ae-9f2e-f861e8a20183\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-22jdz" Mar 08 03:53:34.266479 master-0 kubenswrapper[18592]: I0308 03:53:34.266439 18592 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfv9h\" (UniqueName: \"kubernetes.io/projected/fa83f817-2611-4894-9bad-d9c8640520b3-kube-api-access-sfv9h\") pod \"openshift-state-metrics-74cc79fd76-gtrmb\" (UID: \"fa83f817-2611-4894-9bad-d9c8640520b3\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-gtrmb" Mar 08 03:53:34.267224 master-0 kubenswrapper[18592]: I0308 03:53:34.267191 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hh6bp\" (UniqueName: \"kubernetes.io/projected/93ebbf2d-6b34-40ae-9f2e-f861e8a20183-kube-api-access-hh6bp\") pod \"kube-state-metrics-68b88f8cb5-22jdz\" (UID: \"93ebbf2d-6b34-40ae-9f2e-f861e8a20183\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-22jdz" Mar 08 03:53:34.267530 master-0 kubenswrapper[18592]: I0308 03:53:34.267495 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pp5mm\" (UniqueName: \"kubernetes.io/projected/10d13e6c-631d-4753-b564-fd88ceb7d358-kube-api-access-pp5mm\") pod \"node-exporter-f2sxm\" (UID: \"10d13e6c-631d-4753-b564-fd88ceb7d358\") " pod="openshift-monitoring/node-exporter-f2sxm" Mar 08 03:53:34.333896 master-0 kubenswrapper[18592]: I0308 03:53:34.333849 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-f2sxm" Mar 08 03:53:34.350863 master-0 kubenswrapper[18592]: W0308 03:53:34.350781 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10d13e6c_631d_4753_b564_fd88ceb7d358.slice/crio-207ca36e2406ae608643c17c45d3506b6b5f9b7717ec0380948e27dc8bc871e7 WatchSource:0}: Error finding container 207ca36e2406ae608643c17c45d3506b6b5f9b7717ec0380948e27dc8bc871e7: Status 404 returned error can't find the container with id 207ca36e2406ae608643c17c45d3506b6b5f9b7717ec0380948e27dc8bc871e7 Mar 08 03:53:34.386661 master-0 kubenswrapper[18592]: I0308 03:53:34.386550 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-gtrmb" Mar 08 03:53:34.407187 master-0 kubenswrapper[18592]: I0308 03:53:34.406311 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-22jdz" Mar 08 03:53:34.826932 master-0 kubenswrapper[18592]: I0308 03:53:34.826900 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-74cc79fd76-gtrmb"] Mar 08 03:53:34.839675 master-0 kubenswrapper[18592]: W0308 03:53:34.835516 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa83f817_2611_4894_9bad_d9c8640520b3.slice/crio-9810bf726145fb419b57df233de4ca93bb7a3aa9c8db5142900f81f171ea11b5 WatchSource:0}: Error finding container 9810bf726145fb419b57df233de4ca93bb7a3aa9c8db5142900f81f171ea11b5: Status 404 returned error can't find the container with id 9810bf726145fb419b57df233de4ca93bb7a3aa9c8db5142900f81f171ea11b5 Mar 08 03:53:34.839675 master-0 kubenswrapper[18592]: I0308 03:53:34.836160 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-f2sxm" 
event={"ID":"10d13e6c-631d-4753-b564-fd88ceb7d358","Type":"ContainerStarted","Data":"207ca36e2406ae608643c17c45d3506b6b5f9b7717ec0380948e27dc8bc871e7"} Mar 08 03:53:34.903208 master-0 kubenswrapper[18592]: I0308 03:53:34.901444 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-68b88f8cb5-22jdz"] Mar 08 03:53:34.919673 master-0 kubenswrapper[18592]: W0308 03:53:34.919638 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93ebbf2d_6b34_40ae_9f2e_f861e8a20183.slice/crio-d8f5fade324c01fb738e3942810446f0fd019bbbab2c3924c2e426937ecc66c3 WatchSource:0}: Error finding container d8f5fade324c01fb738e3942810446f0fd019bbbab2c3924c2e426937ecc66c3: Status 404 returned error can't find the container with id d8f5fade324c01fb738e3942810446f0fd019bbbab2c3924c2e426937ecc66c3 Mar 08 03:53:35.103579 master-0 kubenswrapper[18592]: I0308 03:53:35.102048 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 08 03:53:35.111176 master-0 kubenswrapper[18592]: I0308 03:53:35.103757 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:53:35.111176 master-0 kubenswrapper[18592]: I0308 03:53:35.105661 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-metric" Mar 08 03:53:35.111176 master-0 kubenswrapper[18592]: I0308 03:53:35.105854 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-dockercfg-65bbw" Mar 08 03:53:35.111176 master-0 kubenswrapper[18592]: I0308 03:53:35.106019 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-web" Mar 08 03:53:35.111176 master-0 kubenswrapper[18592]: I0308 03:53:35.106201 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"alertmanager-trusted-ca-bundle" Mar 08 03:53:35.111176 master-0 kubenswrapper[18592]: I0308 03:53:35.106245 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-web-config" Mar 08 03:53:35.111176 master-0 kubenswrapper[18592]: I0308 03:53:35.106981 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-generated" Mar 08 03:53:35.111176 master-0 kubenswrapper[18592]: I0308 03:53:35.107115 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy" Mar 08 03:53:35.111176 master-0 kubenswrapper[18592]: I0308 03:53:35.107210 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls-assets-0" Mar 08 03:53:35.111176 master-0 kubenswrapper[18592]: I0308 03:53:35.107311 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls" Mar 08 03:53:35.127243 master-0 kubenswrapper[18592]: I0308 03:53:35.125906 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-monitoring/alertmanager-main-0"] Mar 08 03:53:35.168063 master-0 kubenswrapper[18592]: I0308 03:53:35.160841 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cacb9582-2132-4543-8a31-7b100ba4dd2f-config-out\") pod \"alertmanager-main-0\" (UID: \"cacb9582-2132-4543-8a31-7b100ba4dd2f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:53:35.168063 master-0 kubenswrapper[18592]: I0308 03:53:35.160876 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzp7p\" (UniqueName: \"kubernetes.io/projected/cacb9582-2132-4543-8a31-7b100ba4dd2f-kube-api-access-lzp7p\") pod \"alertmanager-main-0\" (UID: \"cacb9582-2132-4543-8a31-7b100ba4dd2f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:53:35.168063 master-0 kubenswrapper[18592]: I0308 03:53:35.160904 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cacb9582-2132-4543-8a31-7b100ba4dd2f-web-config\") pod \"alertmanager-main-0\" (UID: \"cacb9582-2132-4543-8a31-7b100ba4dd2f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:53:35.168063 master-0 kubenswrapper[18592]: I0308 03:53:35.160925 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/cacb9582-2132-4543-8a31-7b100ba4dd2f-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"cacb9582-2132-4543-8a31-7b100ba4dd2f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:53:35.168063 master-0 kubenswrapper[18592]: I0308 03:53:35.161006 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/cacb9582-2132-4543-8a31-7b100ba4dd2f-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"cacb9582-2132-4543-8a31-7b100ba4dd2f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:53:35.168063 master-0 kubenswrapper[18592]: I0308 03:53:35.161053 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/cacb9582-2132-4543-8a31-7b100ba4dd2f-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"cacb9582-2132-4543-8a31-7b100ba4dd2f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:53:35.168063 master-0 kubenswrapper[18592]: I0308 03:53:35.161138 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cacb9582-2132-4543-8a31-7b100ba4dd2f-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"cacb9582-2132-4543-8a31-7b100ba4dd2f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:53:35.168063 master-0 kubenswrapper[18592]: I0308 03:53:35.161166 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cacb9582-2132-4543-8a31-7b100ba4dd2f-tls-assets\") pod \"alertmanager-main-0\" (UID: \"cacb9582-2132-4543-8a31-7b100ba4dd2f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:53:35.168063 master-0 kubenswrapper[18592]: I0308 03:53:35.161190 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cacb9582-2132-4543-8a31-7b100ba4dd2f-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"cacb9582-2132-4543-8a31-7b100ba4dd2f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:53:35.168063 master-0 
kubenswrapper[18592]: I0308 03:53:35.161281 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/cacb9582-2132-4543-8a31-7b100ba4dd2f-config-volume\") pod \"alertmanager-main-0\" (UID: \"cacb9582-2132-4543-8a31-7b100ba4dd2f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:53:35.168063 master-0 kubenswrapper[18592]: I0308 03:53:35.161328 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/cacb9582-2132-4543-8a31-7b100ba4dd2f-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"cacb9582-2132-4543-8a31-7b100ba4dd2f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:53:35.168063 master-0 kubenswrapper[18592]: I0308 03:53:35.161416 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cacb9582-2132-4543-8a31-7b100ba4dd2f-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"cacb9582-2132-4543-8a31-7b100ba4dd2f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:53:35.263186 master-0 kubenswrapper[18592]: I0308 03:53:35.263122 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cacb9582-2132-4543-8a31-7b100ba4dd2f-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"cacb9582-2132-4543-8a31-7b100ba4dd2f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:53:35.263186 master-0 kubenswrapper[18592]: I0308 03:53:35.263196 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cacb9582-2132-4543-8a31-7b100ba4dd2f-config-out\") pod 
\"alertmanager-main-0\" (UID: \"cacb9582-2132-4543-8a31-7b100ba4dd2f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:53:35.263618 master-0 kubenswrapper[18592]: I0308 03:53:35.263214 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzp7p\" (UniqueName: \"kubernetes.io/projected/cacb9582-2132-4543-8a31-7b100ba4dd2f-kube-api-access-lzp7p\") pod \"alertmanager-main-0\" (UID: \"cacb9582-2132-4543-8a31-7b100ba4dd2f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:53:35.263618 master-0 kubenswrapper[18592]: I0308 03:53:35.263254 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cacb9582-2132-4543-8a31-7b100ba4dd2f-web-config\") pod \"alertmanager-main-0\" (UID: \"cacb9582-2132-4543-8a31-7b100ba4dd2f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:53:35.263618 master-0 kubenswrapper[18592]: I0308 03:53:35.263276 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/cacb9582-2132-4543-8a31-7b100ba4dd2f-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"cacb9582-2132-4543-8a31-7b100ba4dd2f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:53:35.263618 master-0 kubenswrapper[18592]: I0308 03:53:35.263293 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cacb9582-2132-4543-8a31-7b100ba4dd2f-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"cacb9582-2132-4543-8a31-7b100ba4dd2f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:53:35.263618 master-0 kubenswrapper[18592]: I0308 03:53:35.263331 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: 
\"kubernetes.io/secret/cacb9582-2132-4543-8a31-7b100ba4dd2f-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"cacb9582-2132-4543-8a31-7b100ba4dd2f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:53:35.263618 master-0 kubenswrapper[18592]: I0308 03:53:35.263364 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cacb9582-2132-4543-8a31-7b100ba4dd2f-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"cacb9582-2132-4543-8a31-7b100ba4dd2f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:53:35.263618 master-0 kubenswrapper[18592]: I0308 03:53:35.263383 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cacb9582-2132-4543-8a31-7b100ba4dd2f-tls-assets\") pod \"alertmanager-main-0\" (UID: \"cacb9582-2132-4543-8a31-7b100ba4dd2f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:53:35.263618 master-0 kubenswrapper[18592]: I0308 03:53:35.263428 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cacb9582-2132-4543-8a31-7b100ba4dd2f-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"cacb9582-2132-4543-8a31-7b100ba4dd2f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:53:35.263618 master-0 kubenswrapper[18592]: I0308 03:53:35.263448 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/cacb9582-2132-4543-8a31-7b100ba4dd2f-config-volume\") pod \"alertmanager-main-0\" (UID: \"cacb9582-2132-4543-8a31-7b100ba4dd2f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:53:35.263618 master-0 kubenswrapper[18592]: I0308 03:53:35.263489 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/cacb9582-2132-4543-8a31-7b100ba4dd2f-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"cacb9582-2132-4543-8a31-7b100ba4dd2f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:53:35.264914 master-0 kubenswrapper[18592]: I0308 03:53:35.264039 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/cacb9582-2132-4543-8a31-7b100ba4dd2f-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"cacb9582-2132-4543-8a31-7b100ba4dd2f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:53:35.265766 master-0 kubenswrapper[18592]: E0308 03:53:35.265726 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cacb9582-2132-4543-8a31-7b100ba4dd2f-alertmanager-trusted-ca-bundle podName:cacb9582-2132-4543-8a31-7b100ba4dd2f nodeName:}" failed. No retries permitted until 2026-03-08 03:53:35.765699249 +0000 UTC m=+27.864453659 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/cacb9582-2132-4543-8a31-7b100ba4dd2f-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "cacb9582-2132-4543-8a31-7b100ba4dd2f") : configmap references non-existent config key: ca-bundle.crt Mar 08 03:53:35.267283 master-0 kubenswrapper[18592]: I0308 03:53:35.267235 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cacb9582-2132-4543-8a31-7b100ba4dd2f-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"cacb9582-2132-4543-8a31-7b100ba4dd2f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:53:35.269287 master-0 kubenswrapper[18592]: I0308 03:53:35.269254 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cacb9582-2132-4543-8a31-7b100ba4dd2f-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"cacb9582-2132-4543-8a31-7b100ba4dd2f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:53:35.269886 master-0 kubenswrapper[18592]: I0308 03:53:35.269853 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cacb9582-2132-4543-8a31-7b100ba4dd2f-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"cacb9582-2132-4543-8a31-7b100ba4dd2f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:53:35.271097 master-0 kubenswrapper[18592]: I0308 03:53:35.271065 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/cacb9582-2132-4543-8a31-7b100ba4dd2f-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"cacb9582-2132-4543-8a31-7b100ba4dd2f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 
03:53:35.272221 master-0 kubenswrapper[18592]: I0308 03:53:35.272191 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cacb9582-2132-4543-8a31-7b100ba4dd2f-tls-assets\") pod \"alertmanager-main-0\" (UID: \"cacb9582-2132-4543-8a31-7b100ba4dd2f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:53:35.272623 master-0 kubenswrapper[18592]: I0308 03:53:35.272595 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/cacb9582-2132-4543-8a31-7b100ba4dd2f-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"cacb9582-2132-4543-8a31-7b100ba4dd2f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:53:35.272796 master-0 kubenswrapper[18592]: I0308 03:53:35.272770 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cacb9582-2132-4543-8a31-7b100ba4dd2f-config-out\") pod \"alertmanager-main-0\" (UID: \"cacb9582-2132-4543-8a31-7b100ba4dd2f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:53:35.274519 master-0 kubenswrapper[18592]: I0308 03:53:35.274489 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cacb9582-2132-4543-8a31-7b100ba4dd2f-web-config\") pod \"alertmanager-main-0\" (UID: \"cacb9582-2132-4543-8a31-7b100ba4dd2f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:53:35.275916 master-0 kubenswrapper[18592]: I0308 03:53:35.275886 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/cacb9582-2132-4543-8a31-7b100ba4dd2f-config-volume\") pod \"alertmanager-main-0\" (UID: \"cacb9582-2132-4543-8a31-7b100ba4dd2f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:53:35.277976 master-0 
kubenswrapper[18592]: I0308 03:53:35.277922 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzp7p\" (UniqueName: \"kubernetes.io/projected/cacb9582-2132-4543-8a31-7b100ba4dd2f-kube-api-access-lzp7p\") pod \"alertmanager-main-0\" (UID: \"cacb9582-2132-4543-8a31-7b100ba4dd2f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:53:35.777896 master-0 kubenswrapper[18592]: I0308 03:53:35.777145 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cacb9582-2132-4543-8a31-7b100ba4dd2f-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"cacb9582-2132-4543-8a31-7b100ba4dd2f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:53:35.777896 master-0 kubenswrapper[18592]: E0308 03:53:35.777359 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cacb9582-2132-4543-8a31-7b100ba4dd2f-alertmanager-trusted-ca-bundle podName:cacb9582-2132-4543-8a31-7b100ba4dd2f nodeName:}" failed. No retries permitted until 2026-03-08 03:53:36.777334657 +0000 UTC m=+28.876089007 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/cacb9582-2132-4543-8a31-7b100ba4dd2f-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "cacb9582-2132-4543-8a31-7b100ba4dd2f") : configmap references non-existent config key: ca-bundle.crt Mar 08 03:53:35.849564 master-0 kubenswrapper[18592]: I0308 03:53:35.848708 18592 generic.go:334] "Generic (PLEG): container finished" podID="10d13e6c-631d-4753-b564-fd88ceb7d358" containerID="0af2d01e0f9703f49f2dd0053a9f71549e68cd4e68efa740a0f1c3832f0cacd9" exitCode=0 Mar 08 03:53:35.849564 master-0 kubenswrapper[18592]: I0308 03:53:35.848767 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-f2sxm" event={"ID":"10d13e6c-631d-4753-b564-fd88ceb7d358","Type":"ContainerDied","Data":"0af2d01e0f9703f49f2dd0053a9f71549e68cd4e68efa740a0f1c3832f0cacd9"} Mar 08 03:53:35.851354 master-0 kubenswrapper[18592]: I0308 03:53:35.851320 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-22jdz" event={"ID":"93ebbf2d-6b34-40ae-9f2e-f861e8a20183","Type":"ContainerStarted","Data":"d8f5fade324c01fb738e3942810446f0fd019bbbab2c3924c2e426937ecc66c3"} Mar 08 03:53:35.854717 master-0 kubenswrapper[18592]: I0308 03:53:35.854643 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-gtrmb" event={"ID":"fa83f817-2611-4894-9bad-d9c8640520b3","Type":"ContainerStarted","Data":"2efbaa9727f5b65e95bbfb1c1f12dc8a1c825fb9fa714f15c78ef9f70696a866"} Mar 08 03:53:35.854717 master-0 kubenswrapper[18592]: I0308 03:53:35.854663 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-gtrmb" event={"ID":"fa83f817-2611-4894-9bad-d9c8640520b3","Type":"ContainerStarted","Data":"2939022f2ae348e4111b1f204d3edb17b8361aafb41191b19d38a380566ad363"} Mar 08 03:53:35.854717 master-0 
kubenswrapper[18592]: I0308 03:53:35.854673 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-gtrmb" event={"ID":"fa83f817-2611-4894-9bad-d9c8640520b3","Type":"ContainerStarted","Data":"9810bf726145fb419b57df233de4ca93bb7a3aa9c8db5142900f81f171ea11b5"} Mar 08 03:53:36.120994 master-0 kubenswrapper[18592]: I0308 03:53:36.120954 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-7cfbff6469-pjfrs"] Mar 08 03:53:36.127343 master-0 kubenswrapper[18592]: I0308 03:53:36.127313 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-7cfbff6469-pjfrs" Mar 08 03:53:36.132772 master-0 kubenswrapper[18592]: I0308 03:53:36.132140 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-rules" Mar 08 03:53:36.133544 master-0 kubenswrapper[18592]: I0308 03:53:36.133511 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-web" Mar 08 03:53:36.133889 master-0 kubenswrapper[18592]: I0308 03:53:36.133870 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-metrics" Mar 08 03:53:36.135784 master-0 kubenswrapper[18592]: I0308 03:53:36.135761 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy" Mar 08 03:53:36.136132 master-0 kubenswrapper[18592]: I0308 03:53:36.136115 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-grpc-tls-3ipfg9gfac918" Mar 08 03:53:36.136261 master-0 kubenswrapper[18592]: I0308 03:53:36.136237 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-tls" Mar 08 03:53:36.136374 master-0 kubenswrapper[18592]: I0308 
03:53:36.136360 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-dockercfg-cg9nh"
Mar 08 03:53:36.167432 master-0 kubenswrapper[18592]: I0308 03:53:36.167392 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-7cfbff6469-pjfrs"]
Mar 08 03:53:36.284890 master-0 kubenswrapper[18592]: I0308 03:53:36.284759 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-668bq\" (UniqueName: \"kubernetes.io/projected/04bf080c-cf49-4717-abdf-f247a4cdbf46-kube-api-access-668bq\") pod \"thanos-querier-7cfbff6469-pjfrs\" (UID: \"04bf080c-cf49-4717-abdf-f247a4cdbf46\") " pod="openshift-monitoring/thanos-querier-7cfbff6469-pjfrs"
Mar 08 03:53:36.284890 master-0 kubenswrapper[18592]: I0308 03:53:36.284815 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/04bf080c-cf49-4717-abdf-f247a4cdbf46-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7cfbff6469-pjfrs\" (UID: \"04bf080c-cf49-4717-abdf-f247a4cdbf46\") " pod="openshift-monitoring/thanos-querier-7cfbff6469-pjfrs"
Mar 08 03:53:36.284890 master-0 kubenswrapper[18592]: I0308 03:53:36.284883 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/04bf080c-cf49-4717-abdf-f247a4cdbf46-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7cfbff6469-pjfrs\" (UID: \"04bf080c-cf49-4717-abdf-f247a4cdbf46\") " pod="openshift-monitoring/thanos-querier-7cfbff6469-pjfrs"
Mar 08 03:53:36.285177 master-0 kubenswrapper[18592]: I0308 03:53:36.284927 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\"
(UniqueName: \"kubernetes.io/secret/04bf080c-cf49-4717-abdf-f247a4cdbf46-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7cfbff6469-pjfrs\" (UID: \"04bf080c-cf49-4717-abdf-f247a4cdbf46\") " pod="openshift-monitoring/thanos-querier-7cfbff6469-pjfrs"
Mar 08 03:53:36.285177 master-0 kubenswrapper[18592]: I0308 03:53:36.284947 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/04bf080c-cf49-4717-abdf-f247a4cdbf46-secret-grpc-tls\") pod \"thanos-querier-7cfbff6469-pjfrs\" (UID: \"04bf080c-cf49-4717-abdf-f247a4cdbf46\") " pod="openshift-monitoring/thanos-querier-7cfbff6469-pjfrs"
Mar 08 03:53:36.285177 master-0 kubenswrapper[18592]: I0308 03:53:36.284968 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/04bf080c-cf49-4717-abdf-f247a4cdbf46-secret-thanos-querier-tls\") pod \"thanos-querier-7cfbff6469-pjfrs\" (UID: \"04bf080c-cf49-4717-abdf-f247a4cdbf46\") " pod="openshift-monitoring/thanos-querier-7cfbff6469-pjfrs"
Mar 08 03:53:36.285177 master-0 kubenswrapper[18592]: I0308 03:53:36.284990 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/04bf080c-cf49-4717-abdf-f247a4cdbf46-metrics-client-ca\") pod \"thanos-querier-7cfbff6469-pjfrs\" (UID: \"04bf080c-cf49-4717-abdf-f247a4cdbf46\") " pod="openshift-monitoring/thanos-querier-7cfbff6469-pjfrs"
Mar 08 03:53:36.285177 master-0 kubenswrapper[18592]: I0308 03:53:36.285040 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/04bf080c-cf49-4717-abdf-f247a4cdbf46-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7cfbff6469-pjfrs\" (UID:
\"04bf080c-cf49-4717-abdf-f247a4cdbf46\") " pod="openshift-monitoring/thanos-querier-7cfbff6469-pjfrs"
Mar 08 03:53:36.387846 master-0 kubenswrapper[18592]: I0308 03:53:36.387056 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/04bf080c-cf49-4717-abdf-f247a4cdbf46-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7cfbff6469-pjfrs\" (UID: \"04bf080c-cf49-4717-abdf-f247a4cdbf46\") " pod="openshift-monitoring/thanos-querier-7cfbff6469-pjfrs"
Mar 08 03:53:36.388059 master-0 kubenswrapper[18592]: I0308 03:53:36.387934 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/04bf080c-cf49-4717-abdf-f247a4cdbf46-secret-grpc-tls\") pod \"thanos-querier-7cfbff6469-pjfrs\" (UID: \"04bf080c-cf49-4717-abdf-f247a4cdbf46\") " pod="openshift-monitoring/thanos-querier-7cfbff6469-pjfrs"
Mar 08 03:53:36.388059 master-0 kubenswrapper[18592]: I0308 03:53:36.387975 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/04bf080c-cf49-4717-abdf-f247a4cdbf46-secret-thanos-querier-tls\") pod \"thanos-querier-7cfbff6469-pjfrs\" (UID: \"04bf080c-cf49-4717-abdf-f247a4cdbf46\") " pod="openshift-monitoring/thanos-querier-7cfbff6469-pjfrs"
Mar 08 03:53:36.388059 master-0 kubenswrapper[18592]: I0308 03:53:36.388010 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/04bf080c-cf49-4717-abdf-f247a4cdbf46-metrics-client-ca\") pod \"thanos-querier-7cfbff6469-pjfrs\" (UID: \"04bf080c-cf49-4717-abdf-f247a4cdbf46\") " pod="openshift-monitoring/thanos-querier-7cfbff6469-pjfrs"
Mar 08 03:53:36.388162 master-0 kubenswrapper[18592]: I0308 03:53:36.388097 18592 reconciler_common.go:218] "operationExecutor.MountVolume started
for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/04bf080c-cf49-4717-abdf-f247a4cdbf46-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7cfbff6469-pjfrs\" (UID: \"04bf080c-cf49-4717-abdf-f247a4cdbf46\") " pod="openshift-monitoring/thanos-querier-7cfbff6469-pjfrs"
Mar 08 03:53:36.388577 master-0 kubenswrapper[18592]: I0308 03:53:36.388524 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-668bq\" (UniqueName: \"kubernetes.io/projected/04bf080c-cf49-4717-abdf-f247a4cdbf46-kube-api-access-668bq\") pod \"thanos-querier-7cfbff6469-pjfrs\" (UID: \"04bf080c-cf49-4717-abdf-f247a4cdbf46\") " pod="openshift-monitoring/thanos-querier-7cfbff6469-pjfrs"
Mar 08 03:53:36.388629 master-0 kubenswrapper[18592]: I0308 03:53:36.388593 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/04bf080c-cf49-4717-abdf-f247a4cdbf46-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7cfbff6469-pjfrs\" (UID: \"04bf080c-cf49-4717-abdf-f247a4cdbf46\") " pod="openshift-monitoring/thanos-querier-7cfbff6469-pjfrs"
Mar 08 03:53:36.388663 master-0 kubenswrapper[18592]: I0308 03:53:36.388653 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/04bf080c-cf49-4717-abdf-f247a4cdbf46-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7cfbff6469-pjfrs\" (UID: \"04bf080c-cf49-4717-abdf-f247a4cdbf46\") " pod="openshift-monitoring/thanos-querier-7cfbff6469-pjfrs"
Mar 08 03:53:36.389357 master-0 kubenswrapper[18592]: I0308 03:53:36.389321 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/04bf080c-cf49-4717-abdf-f247a4cdbf46-metrics-client-ca\") pod
\"thanos-querier-7cfbff6469-pjfrs\" (UID: \"04bf080c-cf49-4717-abdf-f247a4cdbf46\") " pod="openshift-monitoring/thanos-querier-7cfbff6469-pjfrs"
Mar 08 03:53:36.391616 master-0 kubenswrapper[18592]: I0308 03:53:36.391563 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/04bf080c-cf49-4717-abdf-f247a4cdbf46-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7cfbff6469-pjfrs\" (UID: \"04bf080c-cf49-4717-abdf-f247a4cdbf46\") " pod="openshift-monitoring/thanos-querier-7cfbff6469-pjfrs"
Mar 08 03:53:36.393207 master-0 kubenswrapper[18592]: I0308 03:53:36.393155 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/04bf080c-cf49-4717-abdf-f247a4cdbf46-secret-thanos-querier-tls\") pod \"thanos-querier-7cfbff6469-pjfrs\" (UID: \"04bf080c-cf49-4717-abdf-f247a4cdbf46\") " pod="openshift-monitoring/thanos-querier-7cfbff6469-pjfrs"
Mar 08 03:53:36.394659 master-0 kubenswrapper[18592]: I0308 03:53:36.393596 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/04bf080c-cf49-4717-abdf-f247a4cdbf46-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7cfbff6469-pjfrs\" (UID: \"04bf080c-cf49-4717-abdf-f247a4cdbf46\") " pod="openshift-monitoring/thanos-querier-7cfbff6469-pjfrs"
Mar 08 03:53:36.394659 master-0 kubenswrapper[18592]: I0308 03:53:36.394641 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/04bf080c-cf49-4717-abdf-f247a4cdbf46-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7cfbff6469-pjfrs\" (UID: \"04bf080c-cf49-4717-abdf-f247a4cdbf46\") " pod="openshift-monitoring/thanos-querier-7cfbff6469-pjfrs"
Mar 08 03:53:36.397252 master-0
kubenswrapper[18592]: I0308 03:53:36.397201 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/04bf080c-cf49-4717-abdf-f247a4cdbf46-secret-grpc-tls\") pod \"thanos-querier-7cfbff6469-pjfrs\" (UID: \"04bf080c-cf49-4717-abdf-f247a4cdbf46\") " pod="openshift-monitoring/thanos-querier-7cfbff6469-pjfrs"
Mar 08 03:53:36.397520 master-0 kubenswrapper[18592]: I0308 03:53:36.397483 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/04bf080c-cf49-4717-abdf-f247a4cdbf46-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7cfbff6469-pjfrs\" (UID: \"04bf080c-cf49-4717-abdf-f247a4cdbf46\") " pod="openshift-monitoring/thanos-querier-7cfbff6469-pjfrs"
Mar 08 03:53:36.414528 master-0 kubenswrapper[18592]: I0308 03:53:36.414503 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-668bq\" (UniqueName: \"kubernetes.io/projected/04bf080c-cf49-4717-abdf-f247a4cdbf46-kube-api-access-668bq\") pod \"thanos-querier-7cfbff6469-pjfrs\" (UID: \"04bf080c-cf49-4717-abdf-f247a4cdbf46\") " pod="openshift-monitoring/thanos-querier-7cfbff6469-pjfrs"
Mar 08 03:53:36.471170 master-0 kubenswrapper[18592]: I0308 03:53:36.471139 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-7cfbff6469-pjfrs"
Mar 08 03:53:36.602161 master-0 kubenswrapper[18592]: I0308 03:53:36.602125 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_f417e14665db2ffffa887ce21c9ff0ed/startup-monitor/0.log"
Mar 08 03:53:36.602591 master-0 kubenswrapper[18592]: I0308 03:53:36.602247 18592 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 08 03:53:36.692261 master-0 kubenswrapper[18592]: I0308 03:53:36.692212 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-resource-dir\") pod \"f417e14665db2ffffa887ce21c9ff0ed\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") "
Mar 08 03:53:36.692474 master-0 kubenswrapper[18592]: I0308 03:53:36.692296 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-manifests\") pod \"f417e14665db2ffffa887ce21c9ff0ed\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") "
Mar 08 03:53:36.692474 master-0 kubenswrapper[18592]: I0308 03:53:36.692337 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-log\") pod \"f417e14665db2ffffa887ce21c9ff0ed\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") "
Mar 08 03:53:36.692474 master-0 kubenswrapper[18592]: I0308 03:53:36.692367 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-manifests" (OuterVolumeSpecName: "manifests") pod "f417e14665db2ffffa887ce21c9ff0ed" (UID: "f417e14665db2ffffa887ce21c9ff0ed"). InnerVolumeSpecName "manifests".
PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 03:53:36.692474 master-0 kubenswrapper[18592]: I0308 03:53:36.692372 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-pod-resource-dir\") pod \"f417e14665db2ffffa887ce21c9ff0ed\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") "
Mar 08 03:53:36.692653 master-0 kubenswrapper[18592]: I0308 03:53:36.692473 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-lock\") pod \"f417e14665db2ffffa887ce21c9ff0ed\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") "
Mar 08 03:53:36.692653 master-0 kubenswrapper[18592]: I0308 03:53:36.692384 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f417e14665db2ffffa887ce21c9ff0ed" (UID: "f417e14665db2ffffa887ce21c9ff0ed"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 03:53:36.692653 master-0 kubenswrapper[18592]: I0308 03:53:36.692423 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-log" (OuterVolumeSpecName: "var-log") pod "f417e14665db2ffffa887ce21c9ff0ed" (UID: "f417e14665db2ffffa887ce21c9ff0ed"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 03:53:36.692653 master-0 kubenswrapper[18592]: I0308 03:53:36.692640 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-lock" (OuterVolumeSpecName: "var-lock") pod "f417e14665db2ffffa887ce21c9ff0ed" (UID: "f417e14665db2ffffa887ce21c9ff0ed"). InnerVolumeSpecName "var-lock".
PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 03:53:36.693075 master-0 kubenswrapper[18592]: I0308 03:53:36.693044 18592 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-resource-dir\") on node \"master-0\" DevicePath \"\""
Mar 08 03:53:36.693144 master-0 kubenswrapper[18592]: I0308 03:53:36.693064 18592 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-manifests\") on node \"master-0\" DevicePath \"\""
Mar 08 03:53:36.693144 master-0 kubenswrapper[18592]: I0308 03:53:36.693094 18592 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-log\") on node \"master-0\" DevicePath \"\""
Mar 08 03:53:36.693144 master-0 kubenswrapper[18592]: I0308 03:53:36.693103 18592 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 08 03:53:36.698387 master-0 kubenswrapper[18592]: I0308 03:53:36.698326 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f417e14665db2ffffa887ce21c9ff0ed" (UID: "f417e14665db2ffffa887ce21c9ff0ed"). InnerVolumeSpecName "pod-resource-dir".
PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 03:53:36.794451 master-0 kubenswrapper[18592]: I0308 03:53:36.794394 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cacb9582-2132-4543-8a31-7b100ba4dd2f-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"cacb9582-2132-4543-8a31-7b100ba4dd2f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 03:53:36.794642 master-0 kubenswrapper[18592]: I0308 03:53:36.794540 18592 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-pod-resource-dir\") on node \"master-0\" DevicePath \"\""
Mar 08 03:53:36.794642 master-0 kubenswrapper[18592]: E0308 03:53:36.794585 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cacb9582-2132-4543-8a31-7b100ba4dd2f-alertmanager-trusted-ca-bundle podName:cacb9582-2132-4543-8a31-7b100ba4dd2f nodeName:}" failed. No retries permitted until 2026-03-08 03:53:38.794556948 +0000 UTC m=+30.893311298 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/cacb9582-2132-4543-8a31-7b100ba4dd2f-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "cacb9582-2132-4543-8a31-7b100ba4dd2f") : configmap references non-existent config key: ca-bundle.crt
Mar 08 03:53:36.863426 master-0 kubenswrapper[18592]: I0308 03:53:36.863318 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_f417e14665db2ffffa887ce21c9ff0ed/startup-monitor/0.log"
Mar 08 03:53:36.863426 master-0 kubenswrapper[18592]: I0308 03:53:36.863362 18592 generic.go:334] "Generic (PLEG): container finished" podID="f417e14665db2ffffa887ce21c9ff0ed" containerID="0bbd3b73d51b06514693db13893aa6ce69354b9ab4f18d355441678c9479dc95" exitCode=137
Mar 08 03:53:36.863606 master-0 kubenswrapper[18592]: I0308 03:53:36.863437 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 08 03:53:36.863606 master-0 kubenswrapper[18592]: I0308 03:53:36.863489 18592 scope.go:117] "RemoveContainer" containerID="0bbd3b73d51b06514693db13893aa6ce69354b9ab4f18d355441678c9479dc95"
Mar 08 03:53:36.866621 master-0 kubenswrapper[18592]: I0308 03:53:36.866581 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-f2sxm" event={"ID":"10d13e6c-631d-4753-b564-fd88ceb7d358","Type":"ContainerStarted","Data":"0ba49d42860605c3f16c10c8eb70ca5b038e1361e26f6256d1c3082d171ebc49"}
Mar 08 03:53:36.866678 master-0 kubenswrapper[18592]: I0308 03:53:36.866625 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-f2sxm" event={"ID":"10d13e6c-631d-4753-b564-fd88ceb7d358","Type":"ContainerStarted","Data":"38a94adb701c6f2ea2eefe3faf3d2f3623e01706d15205f4cacd5a35ed6fb66d"}
Mar 08 03:53:36.913248 master-0 kubenswrapper[18592]: I0308
03:53:36.908027 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-f2sxm" podStartSLOduration=2.789052047 podStartE2EDuration="3.907962828s" podCreationTimestamp="2026-03-08 03:53:33 +0000 UTC" firstStartedPulling="2026-03-08 03:53:34.352461822 +0000 UTC m=+26.451216172" lastFinishedPulling="2026-03-08 03:53:35.471372603 +0000 UTC m=+27.570126953" observedRunningTime="2026-03-08 03:53:36.88933266 +0000 UTC m=+28.988087010" watchObservedRunningTime="2026-03-08 03:53:36.907962828 +0000 UTC m=+29.006717178"
Mar 08 03:53:36.917545 master-0 kubenswrapper[18592]: I0308 03:53:36.917505 18592 kubelet.go:2706] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" mirrorPodUID="cc55376d-d8f5-4ebf-8597-01f615f8dce7"
Mar 08 03:53:36.947054 master-0 kubenswrapper[18592]: I0308 03:53:36.943321 18592 scope.go:117] "RemoveContainer" containerID="0bbd3b73d51b06514693db13893aa6ce69354b9ab4f18d355441678c9479dc95"
Mar 08 03:53:36.947054 master-0 kubenswrapper[18592]: E0308 03:53:36.944376 18592 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0bbd3b73d51b06514693db13893aa6ce69354b9ab4f18d355441678c9479dc95\": container with ID starting with 0bbd3b73d51b06514693db13893aa6ce69354b9ab4f18d355441678c9479dc95 not found: ID does not exist" containerID="0bbd3b73d51b06514693db13893aa6ce69354b9ab4f18d355441678c9479dc95"
Mar 08 03:53:36.947054 master-0 kubenswrapper[18592]: I0308 03:53:36.944421 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bbd3b73d51b06514693db13893aa6ce69354b9ab4f18d355441678c9479dc95"} err="failed to get container status \"0bbd3b73d51b06514693db13893aa6ce69354b9ab4f18d355441678c9479dc95\": rpc error: code = NotFound desc = could not find container \"0bbd3b73d51b06514693db13893aa6ce69354b9ab4f18d355441678c9479dc95\":
container with ID starting with 0bbd3b73d51b06514693db13893aa6ce69354b9ab4f18d355441678c9479dc95 not found: ID does not exist"
Mar 08 03:53:37.450279 master-0 kubenswrapper[18592]: I0308 03:53:37.450217 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-7cfbff6469-pjfrs"]
Mar 08 03:53:37.882197 master-0 kubenswrapper[18592]: I0308 03:53:37.882112 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-gtrmb" event={"ID":"fa83f817-2611-4894-9bad-d9c8640520b3","Type":"ContainerStarted","Data":"c4c19d227548cbeca2643379ad02773c304f56d5f79a87d3c319418933880dd5"}
Mar 08 03:53:37.886050 master-0 kubenswrapper[18592]: I0308 03:53:37.885970 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7cfbff6469-pjfrs" event={"ID":"04bf080c-cf49-4717-abdf-f247a4cdbf46","Type":"ContainerStarted","Data":"8aabb1b7fdbc830a41195726531a85f27db0b5943d8b430aece2d9208bdcf94d"}
Mar 08 03:53:37.890965 master-0 kubenswrapper[18592]: I0308 03:53:37.890878 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-22jdz" event={"ID":"93ebbf2d-6b34-40ae-9f2e-f861e8a20183","Type":"ContainerStarted","Data":"37187caa7062c52638d80c8dc15ac02ff463f79f86bcd7bf881d7299a2b36795"}
Mar 08 03:53:37.890965 master-0 kubenswrapper[18592]: I0308 03:53:37.890954 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-22jdz" event={"ID":"93ebbf2d-6b34-40ae-9f2e-f861e8a20183","Type":"ContainerStarted","Data":"025d897deedc5608fffeb35249f05e0114f46e99e855d3ff3f717c89d3bfcbc6"}
Mar 08 03:53:37.891266 master-0 kubenswrapper[18592]: I0308 03:53:37.890977 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-22jdz"
event={"ID":"93ebbf2d-6b34-40ae-9f2e-f861e8a20183","Type":"ContainerStarted","Data":"54ec32f7c083d2abd7e7e1e00111760f11aa44451c6e583dffb7c66ccb67b463"}
Mar 08 03:53:37.915106 master-0 kubenswrapper[18592]: I0308 03:53:37.915013 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-gtrmb" podStartSLOduration=2.112750008 podStartE2EDuration="3.914983065s" podCreationTimestamp="2026-03-08 03:53:34 +0000 UTC" firstStartedPulling="2026-03-08 03:53:35.160785373 +0000 UTC m=+27.259539723" lastFinishedPulling="2026-03-08 03:53:36.96301843 +0000 UTC m=+29.061772780" observedRunningTime="2026-03-08 03:53:37.912485947 +0000 UTC m=+30.011240327" watchObservedRunningTime="2026-03-08 03:53:37.914983065 +0000 UTC m=+30.013737455"
Mar 08 03:53:37.947938 master-0 kubenswrapper[18592]: I0308 03:53:37.944932 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-22jdz" podStartSLOduration=1.907924472 podStartE2EDuration="3.94490642s" podCreationTimestamp="2026-03-08 03:53:34 +0000 UTC" firstStartedPulling="2026-03-08 03:53:34.921685842 +0000 UTC m=+27.020440192" lastFinishedPulling="2026-03-08 03:53:36.95866778 +0000 UTC m=+29.057422140" observedRunningTime="2026-03-08 03:53:37.94402211 +0000 UTC m=+30.042776500" watchObservedRunningTime="2026-03-08 03:53:37.94490642 +0000 UTC m=+30.043660810"
Mar 08 03:53:38.156912 master-0 kubenswrapper[18592]: I0308 03:53:38.156688 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f417e14665db2ffffa887ce21c9ff0ed" path="/var/lib/kubelet/pods/f417e14665db2ffffa887ce21c9ff0ed/volumes"
Mar 08 03:53:38.157286 master-0 kubenswrapper[18592]: I0308 03:53:38.157229 18592 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podUID=""
Mar 08 03:53:38.188352 master-0 kubenswrapper[18592]: I0308 03:53:38.187953 18592
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"]
Mar 08 03:53:38.188352 master-0 kubenswrapper[18592]: I0308 03:53:38.188016 18592 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" mirrorPodUID="cc55376d-d8f5-4ebf-8597-01f615f8dce7"
Mar 08 03:53:38.188352 master-0 kubenswrapper[18592]: I0308 03:53:38.188600 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"]
Mar 08 03:53:38.188352 master-0 kubenswrapper[18592]: I0308 03:53:38.188644 18592 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" mirrorPodUID="cc55376d-d8f5-4ebf-8597-01f615f8dce7"
Mar 08 03:53:38.827395 master-0 kubenswrapper[18592]: I0308 03:53:38.826962 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cacb9582-2132-4543-8a31-7b100ba4dd2f-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"cacb9582-2132-4543-8a31-7b100ba4dd2f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 03:53:38.828241 master-0 kubenswrapper[18592]: E0308 03:53:38.827528 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cacb9582-2132-4543-8a31-7b100ba4dd2f-alertmanager-trusted-ca-bundle podName:cacb9582-2132-4543-8a31-7b100ba4dd2f nodeName:}" failed. No retries permitted until 2026-03-08 03:53:42.827508044 +0000 UTC m=+34.926262404 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/cacb9582-2132-4543-8a31-7b100ba4dd2f-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "cacb9582-2132-4543-8a31-7b100ba4dd2f") : configmap references non-existent config key: ca-bundle.crt
Mar 08 03:53:39.030648 master-0 kubenswrapper[18592]: I0308 03:53:39.030525 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/48ab3c8e-a2bd-4380-9e8d-a41d515a989d-trusted-ca\") pod \"console-operator-6c7fb6b958-mr9k6\" (UID: \"48ab3c8e-a2bd-4380-9e8d-a41d515a989d\") " pod="openshift-console-operator/console-operator-6c7fb6b958-mr9k6"
Mar 08 03:53:39.031497 master-0 kubenswrapper[18592]: E0308 03:53:39.031444 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/48ab3c8e-a2bd-4380-9e8d-a41d515a989d-trusted-ca podName:48ab3c8e-a2bd-4380-9e8d-a41d515a989d nodeName:}" failed. No retries permitted until 2026-03-08 03:53:55.031427939 +0000 UTC m=+47.130182289 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "trusted-ca" (UniqueName: "kubernetes.io/configmap/48ab3c8e-a2bd-4380-9e8d-a41d515a989d-trusted-ca") pod "console-operator-6c7fb6b958-mr9k6" (UID: "48ab3c8e-a2bd-4380-9e8d-a41d515a989d") : configmap references non-existent config key: ca-bundle.crt
Mar 08 03:53:39.438570 master-0 kubenswrapper[18592]: I0308 03:53:39.437192 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-7c975945c4-kbb6q"]
Mar 08 03:53:39.438570 master-0 kubenswrapper[18592]: I0308 03:53:39.438198 18592 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/metrics-server-7c975945c4-kbb6q"
Mar 08 03:53:39.446237 master-0 kubenswrapper[18592]: I0308 03:53:39.446190 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-55li3r6nupslu"
Mar 08 03:53:39.446509 master-0 kubenswrapper[18592]: I0308 03:53:39.446463 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle"
Mar 08 03:53:39.448753 master-0 kubenswrapper[18592]: I0308 03:53:39.446649 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles"
Mar 08 03:53:39.448753 master-0 kubenswrapper[18592]: I0308 03:53:39.446723 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs"
Mar 08 03:53:39.448753 master-0 kubenswrapper[18592]: I0308 03:53:39.446674 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-dockercfg-gnr8f"
Mar 08 03:53:39.448753 master-0 kubenswrapper[18592]: I0308 03:53:39.447152 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls"
Mar 08 03:53:39.502542 master-0 kubenswrapper[18592]: I0308 03:53:39.501899 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7c975945c4-kbb6q"]
Mar 08 03:53:39.537126 master-0 kubenswrapper[18592]: I0308 03:53:39.537047 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/934ced6d-5bc3-4215-8b97-059f77762cfd-secret-metrics-client-certs\") pod \"metrics-server-7c975945c4-kbb6q\" (UID: \"934ced6d-5bc3-4215-8b97-059f77762cfd\") " pod="openshift-monitoring/metrics-server-7c975945c4-kbb6q"
Mar 08 03:53:39.537300 master-0 kubenswrapper[18592]: I0308 03:53:39.537146 18592
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/934ced6d-5bc3-4215-8b97-059f77762cfd-secret-metrics-server-tls\") pod \"metrics-server-7c975945c4-kbb6q\" (UID: \"934ced6d-5bc3-4215-8b97-059f77762cfd\") " pod="openshift-monitoring/metrics-server-7c975945c4-kbb6q"
Mar 08 03:53:39.537300 master-0 kubenswrapper[18592]: I0308 03:53:39.537240 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/934ced6d-5bc3-4215-8b97-059f77762cfd-client-ca-bundle\") pod \"metrics-server-7c975945c4-kbb6q\" (UID: \"934ced6d-5bc3-4215-8b97-059f77762cfd\") " pod="openshift-monitoring/metrics-server-7c975945c4-kbb6q"
Mar 08 03:53:39.537380 master-0 kubenswrapper[18592]: I0308 03:53:39.537364 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/934ced6d-5bc3-4215-8b97-059f77762cfd-metrics-server-audit-profiles\") pod \"metrics-server-7c975945c4-kbb6q\" (UID: \"934ced6d-5bc3-4215-8b97-059f77762cfd\") " pod="openshift-monitoring/metrics-server-7c975945c4-kbb6q"
Mar 08 03:53:39.537437 master-0 kubenswrapper[18592]: I0308 03:53:39.537406 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/934ced6d-5bc3-4215-8b97-059f77762cfd-audit-log\") pod \"metrics-server-7c975945c4-kbb6q\" (UID: \"934ced6d-5bc3-4215-8b97-059f77762cfd\") " pod="openshift-monitoring/metrics-server-7c975945c4-kbb6q"
Mar 08 03:53:39.537480 master-0 kubenswrapper[18592]: I0308 03:53:39.537459 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName:
\"kubernetes.io/configmap/934ced6d-5bc3-4215-8b97-059f77762cfd-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7c975945c4-kbb6q\" (UID: \"934ced6d-5bc3-4215-8b97-059f77762cfd\") " pod="openshift-monitoring/metrics-server-7c975945c4-kbb6q" Mar 08 03:53:39.537521 master-0 kubenswrapper[18592]: I0308 03:53:39.537495 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkjbd\" (UniqueName: \"kubernetes.io/projected/934ced6d-5bc3-4215-8b97-059f77762cfd-kube-api-access-hkjbd\") pod \"metrics-server-7c975945c4-kbb6q\" (UID: \"934ced6d-5bc3-4215-8b97-059f77762cfd\") " pod="openshift-monitoring/metrics-server-7c975945c4-kbb6q" Mar 08 03:53:39.638624 master-0 kubenswrapper[18592]: I0308 03:53:39.638363 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/934ced6d-5bc3-4215-8b97-059f77762cfd-metrics-server-audit-profiles\") pod \"metrics-server-7c975945c4-kbb6q\" (UID: \"934ced6d-5bc3-4215-8b97-059f77762cfd\") " pod="openshift-monitoring/metrics-server-7c975945c4-kbb6q" Mar 08 03:53:39.638860 master-0 kubenswrapper[18592]: I0308 03:53:39.638721 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/934ced6d-5bc3-4215-8b97-059f77762cfd-audit-log\") pod \"metrics-server-7c975945c4-kbb6q\" (UID: \"934ced6d-5bc3-4215-8b97-059f77762cfd\") " pod="openshift-monitoring/metrics-server-7c975945c4-kbb6q" Mar 08 03:53:39.638860 master-0 kubenswrapper[18592]: I0308 03:53:39.638772 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/934ced6d-5bc3-4215-8b97-059f77762cfd-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7c975945c4-kbb6q\" (UID: \"934ced6d-5bc3-4215-8b97-059f77762cfd\") " 
pod="openshift-monitoring/metrics-server-7c975945c4-kbb6q" Mar 08 03:53:39.638860 master-0 kubenswrapper[18592]: I0308 03:53:39.638811 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkjbd\" (UniqueName: \"kubernetes.io/projected/934ced6d-5bc3-4215-8b97-059f77762cfd-kube-api-access-hkjbd\") pod \"metrics-server-7c975945c4-kbb6q\" (UID: \"934ced6d-5bc3-4215-8b97-059f77762cfd\") " pod="openshift-monitoring/metrics-server-7c975945c4-kbb6q" Mar 08 03:53:39.639002 master-0 kubenswrapper[18592]: I0308 03:53:39.638916 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/934ced6d-5bc3-4215-8b97-059f77762cfd-secret-metrics-client-certs\") pod \"metrics-server-7c975945c4-kbb6q\" (UID: \"934ced6d-5bc3-4215-8b97-059f77762cfd\") " pod="openshift-monitoring/metrics-server-7c975945c4-kbb6q" Mar 08 03:53:39.639002 master-0 kubenswrapper[18592]: I0308 03:53:39.638952 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/934ced6d-5bc3-4215-8b97-059f77762cfd-secret-metrics-server-tls\") pod \"metrics-server-7c975945c4-kbb6q\" (UID: \"934ced6d-5bc3-4215-8b97-059f77762cfd\") " pod="openshift-monitoring/metrics-server-7c975945c4-kbb6q" Mar 08 03:53:39.639095 master-0 kubenswrapper[18592]: I0308 03:53:39.639008 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/934ced6d-5bc3-4215-8b97-059f77762cfd-client-ca-bundle\") pod \"metrics-server-7c975945c4-kbb6q\" (UID: \"934ced6d-5bc3-4215-8b97-059f77762cfd\") " pod="openshift-monitoring/metrics-server-7c975945c4-kbb6q" Mar 08 03:53:39.639945 master-0 kubenswrapper[18592]: I0308 03:53:39.639851 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: 
\"kubernetes.io/empty-dir/934ced6d-5bc3-4215-8b97-059f77762cfd-audit-log\") pod \"metrics-server-7c975945c4-kbb6q\" (UID: \"934ced6d-5bc3-4215-8b97-059f77762cfd\") " pod="openshift-monitoring/metrics-server-7c975945c4-kbb6q" Mar 08 03:53:39.640618 master-0 kubenswrapper[18592]: I0308 03:53:39.640534 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/934ced6d-5bc3-4215-8b97-059f77762cfd-metrics-server-audit-profiles\") pod \"metrics-server-7c975945c4-kbb6q\" (UID: \"934ced6d-5bc3-4215-8b97-059f77762cfd\") " pod="openshift-monitoring/metrics-server-7c975945c4-kbb6q" Mar 08 03:53:39.642333 master-0 kubenswrapper[18592]: I0308 03:53:39.642255 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/934ced6d-5bc3-4215-8b97-059f77762cfd-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7c975945c4-kbb6q\" (UID: \"934ced6d-5bc3-4215-8b97-059f77762cfd\") " pod="openshift-monitoring/metrics-server-7c975945c4-kbb6q" Mar 08 03:53:39.644199 master-0 kubenswrapper[18592]: I0308 03:53:39.644142 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/934ced6d-5bc3-4215-8b97-059f77762cfd-client-ca-bundle\") pod \"metrics-server-7c975945c4-kbb6q\" (UID: \"934ced6d-5bc3-4215-8b97-059f77762cfd\") " pod="openshift-monitoring/metrics-server-7c975945c4-kbb6q" Mar 08 03:53:39.645420 master-0 kubenswrapper[18592]: I0308 03:53:39.644958 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/934ced6d-5bc3-4215-8b97-059f77762cfd-secret-metrics-server-tls\") pod \"metrics-server-7c975945c4-kbb6q\" (UID: \"934ced6d-5bc3-4215-8b97-059f77762cfd\") " pod="openshift-monitoring/metrics-server-7c975945c4-kbb6q" Mar 08 03:53:39.645420 master-0 
kubenswrapper[18592]: I0308 03:53:39.645121 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/934ced6d-5bc3-4215-8b97-059f77762cfd-secret-metrics-client-certs\") pod \"metrics-server-7c975945c4-kbb6q\" (UID: \"934ced6d-5bc3-4215-8b97-059f77762cfd\") " pod="openshift-monitoring/metrics-server-7c975945c4-kbb6q" Mar 08 03:53:39.659925 master-0 kubenswrapper[18592]: I0308 03:53:39.659864 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkjbd\" (UniqueName: \"kubernetes.io/projected/934ced6d-5bc3-4215-8b97-059f77762cfd-kube-api-access-hkjbd\") pod \"metrics-server-7c975945c4-kbb6q\" (UID: \"934ced6d-5bc3-4215-8b97-059f77762cfd\") " pod="openshift-monitoring/metrics-server-7c975945c4-kbb6q" Mar 08 03:53:39.795997 master-0 kubenswrapper[18592]: I0308 03:53:39.795884 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-64589489d-z2spc"] Mar 08 03:53:39.796756 master-0 kubenswrapper[18592]: I0308 03:53:39.796737 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-64589489d-z2spc" Mar 08 03:53:39.798849 master-0 kubenswrapper[18592]: I0308 03:53:39.798803 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"default-dockercfg-c6d9k" Mar 08 03:53:39.801748 master-0 kubenswrapper[18592]: I0308 03:53:39.801711 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"monitoring-plugin-cert" Mar 08 03:53:39.814426 master-0 kubenswrapper[18592]: I0308 03:53:39.814339 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-7c975945c4-kbb6q" Mar 08 03:53:39.820935 master-0 kubenswrapper[18592]: I0308 03:53:39.820890 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-64589489d-z2spc"] Mar 08 03:53:39.842081 master-0 kubenswrapper[18592]: I0308 03:53:39.842011 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8b79990f-516d-4eb7-bc3f-bf63ff11f105-monitoring-plugin-cert\") pod \"monitoring-plugin-64589489d-z2spc\" (UID: \"8b79990f-516d-4eb7-bc3f-bf63ff11f105\") " pod="openshift-monitoring/monitoring-plugin-64589489d-z2spc" Mar 08 03:53:39.944287 master-0 kubenswrapper[18592]: I0308 03:53:39.943869 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8b79990f-516d-4eb7-bc3f-bf63ff11f105-monitoring-plugin-cert\") pod \"monitoring-plugin-64589489d-z2spc\" (UID: \"8b79990f-516d-4eb7-bc3f-bf63ff11f105\") " pod="openshift-monitoring/monitoring-plugin-64589489d-z2spc" Mar 08 03:53:39.953533 master-0 kubenswrapper[18592]: I0308 03:53:39.953472 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8b79990f-516d-4eb7-bc3f-bf63ff11f105-monitoring-plugin-cert\") pod \"monitoring-plugin-64589489d-z2spc\" (UID: \"8b79990f-516d-4eb7-bc3f-bf63ff11f105\") " pod="openshift-monitoring/monitoring-plugin-64589489d-z2spc" Mar 08 03:53:40.121571 master-0 kubenswrapper[18592]: I0308 03:53:40.121353 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-64589489d-z2spc" Mar 08 03:53:40.534361 master-0 kubenswrapper[18592]: I0308 03:53:40.534286 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 08 03:53:40.544771 master-0 kubenswrapper[18592]: I0308 03:53:40.544720 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:53:40.549237 master-0 kubenswrapper[18592]: I0308 03:53:40.548470 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s" Mar 08 03:53:40.549237 master-0 kubenswrapper[18592]: I0308 03:53:40.548517 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-kube-rbac-proxy-web" Mar 08 03:53:40.549237 master-0 kubenswrapper[18592]: I0308 03:53:40.548561 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls" Mar 08 03:53:40.549237 master-0 kubenswrapper[18592]: I0308 03:53:40.548614 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-grpc-tls-44sfif47rohlm" Mar 08 03:53:40.549237 master-0 kubenswrapper[18592]: I0308 03:53:40.548666 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-web-config" Mar 08 03:53:40.549237 master-0 kubenswrapper[18592]: I0308 03:53:40.548789 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-prometheus-http-client-file" Mar 08 03:53:40.549237 master-0 kubenswrapper[18592]: I0308 03:53:40.548936 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"serving-certs-ca-bundle" Mar 08 03:53:40.549237 master-0 kubenswrapper[18592]: I0308 03:53:40.548947 18592 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-monitoring"/"prometheus-k8s-tls-assets-0" Mar 08 03:53:40.549237 master-0 kubenswrapper[18592]: I0308 03:53:40.549058 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-dockercfg-snc9w" Mar 08 03:53:40.549237 master-0 kubenswrapper[18592]: I0308 03:53:40.549114 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-sidecar-tls" Mar 08 03:53:40.550074 master-0 kubenswrapper[18592]: I0308 03:53:40.549712 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-trusted-ca-bundle" Mar 08 03:53:40.553802 master-0 kubenswrapper[18592]: I0308 03:53:40.551443 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-k8s-rulefiles-0" Mar 08 03:53:40.553802 master-0 kubenswrapper[18592]: I0308 03:53:40.552348 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-rbac-proxy" Mar 08 03:53:40.631100 master-0 kubenswrapper[18592]: I0308 03:53:40.630147 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 08 03:53:40.660681 master-0 kubenswrapper[18592]: I0308 03:53:40.660549 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a26c661f-f843-45c5-85f0-2c2f72cbf580-config\") pod \"prometheus-k8s-0\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:53:40.660681 master-0 kubenswrapper[18592]: I0308 03:53:40.660593 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a26c661f-f843-45c5-85f0-2c2f72cbf580-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " 
pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:53:40.660681 master-0 kubenswrapper[18592]: I0308 03:53:40.660618 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a26c661f-f843-45c5-85f0-2c2f72cbf580-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:53:40.660944 master-0 kubenswrapper[18592]: I0308 03:53:40.660717 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a26c661f-f843-45c5-85f0-2c2f72cbf580-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:53:40.660944 master-0 kubenswrapper[18592]: I0308 03:53:40.660795 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a26c661f-f843-45c5-85f0-2c2f72cbf580-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:53:40.660944 master-0 kubenswrapper[18592]: I0308 03:53:40.660838 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a26c661f-f843-45c5-85f0-2c2f72cbf580-config-out\") pod \"prometheus-k8s-0\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:53:40.660944 master-0 kubenswrapper[18592]: I0308 03:53:40.660855 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/a26c661f-f843-45c5-85f0-2c2f72cbf580-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:53:40.660944 master-0 kubenswrapper[18592]: I0308 03:53:40.660876 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/a26c661f-f843-45c5-85f0-2c2f72cbf580-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:53:40.660944 master-0 kubenswrapper[18592]: I0308 03:53:40.660903 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a26c661f-f843-45c5-85f0-2c2f72cbf580-web-config\") pod \"prometheus-k8s-0\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:53:40.660944 master-0 kubenswrapper[18592]: I0308 03:53:40.660919 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/a26c661f-f843-45c5-85f0-2c2f72cbf580-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:53:40.661142 master-0 kubenswrapper[18592]: I0308 03:53:40.660987 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a26c661f-f843-45c5-85f0-2c2f72cbf580-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:53:40.661142 master-0 
kubenswrapper[18592]: I0308 03:53:40.661025 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a26c661f-f843-45c5-85f0-2c2f72cbf580-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:53:40.662522 master-0 kubenswrapper[18592]: I0308 03:53:40.662038 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a26c661f-f843-45c5-85f0-2c2f72cbf580-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:53:40.662522 master-0 kubenswrapper[18592]: I0308 03:53:40.662080 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a26c661f-f843-45c5-85f0-2c2f72cbf580-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:53:40.662522 master-0 kubenswrapper[18592]: I0308 03:53:40.662100 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a26c661f-f843-45c5-85f0-2c2f72cbf580-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:53:40.663895 master-0 kubenswrapper[18592]: I0308 03:53:40.663875 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a26c661f-f843-45c5-85f0-2c2f72cbf580-prometheus-trusted-ca-bundle\") pod 
\"prometheus-k8s-0\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:53:40.664015 master-0 kubenswrapper[18592]: I0308 03:53:40.663999 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a26c661f-f843-45c5-85f0-2c2f72cbf580-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:53:40.664066 master-0 kubenswrapper[18592]: I0308 03:53:40.664038 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sk8dj\" (UniqueName: \"kubernetes.io/projected/a26c661f-f843-45c5-85f0-2c2f72cbf580-kube-api-access-sk8dj\") pod \"prometheus-k8s-0\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:53:40.765864 master-0 kubenswrapper[18592]: I0308 03:53:40.764990 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a26c661f-f843-45c5-85f0-2c2f72cbf580-config\") pod \"prometheus-k8s-0\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:53:40.765864 master-0 kubenswrapper[18592]: I0308 03:53:40.765051 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a26c661f-f843-45c5-85f0-2c2f72cbf580-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:53:40.765864 master-0 kubenswrapper[18592]: I0308 03:53:40.765072 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: 
\"kubernetes.io/secret/a26c661f-f843-45c5-85f0-2c2f72cbf580-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:53:40.765864 master-0 kubenswrapper[18592]: I0308 03:53:40.765102 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a26c661f-f843-45c5-85f0-2c2f72cbf580-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:53:40.765864 master-0 kubenswrapper[18592]: I0308 03:53:40.765125 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a26c661f-f843-45c5-85f0-2c2f72cbf580-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:53:40.765864 master-0 kubenswrapper[18592]: I0308 03:53:40.765142 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a26c661f-f843-45c5-85f0-2c2f72cbf580-config-out\") pod \"prometheus-k8s-0\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:53:40.765864 master-0 kubenswrapper[18592]: I0308 03:53:40.765161 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/a26c661f-f843-45c5-85f0-2c2f72cbf580-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:53:40.765864 master-0 kubenswrapper[18592]: I0308 03:53:40.765180 18592 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/a26c661f-f843-45c5-85f0-2c2f72cbf580-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:53:40.765864 master-0 kubenswrapper[18592]: I0308 03:53:40.765200 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a26c661f-f843-45c5-85f0-2c2f72cbf580-web-config\") pod \"prometheus-k8s-0\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:53:40.765864 master-0 kubenswrapper[18592]: I0308 03:53:40.765215 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/a26c661f-f843-45c5-85f0-2c2f72cbf580-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:53:40.765864 master-0 kubenswrapper[18592]: I0308 03:53:40.765252 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a26c661f-f843-45c5-85f0-2c2f72cbf580-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:53:40.765864 master-0 kubenswrapper[18592]: I0308 03:53:40.765278 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a26c661f-f843-45c5-85f0-2c2f72cbf580-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:53:40.765864 master-0 kubenswrapper[18592]: I0308 
03:53:40.765300 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a26c661f-f843-45c5-85f0-2c2f72cbf580-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:53:40.765864 master-0 kubenswrapper[18592]: I0308 03:53:40.765321 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a26c661f-f843-45c5-85f0-2c2f72cbf580-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:53:40.765864 master-0 kubenswrapper[18592]: I0308 03:53:40.765337 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a26c661f-f843-45c5-85f0-2c2f72cbf580-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:53:40.765864 master-0 kubenswrapper[18592]: I0308 03:53:40.765355 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a26c661f-f843-45c5-85f0-2c2f72cbf580-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:53:40.765864 master-0 kubenswrapper[18592]: I0308 03:53:40.765384 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a26c661f-f843-45c5-85f0-2c2f72cbf580-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " 
pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:53:40.765864 master-0 kubenswrapper[18592]: I0308 03:53:40.765405 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sk8dj\" (UniqueName: \"kubernetes.io/projected/a26c661f-f843-45c5-85f0-2c2f72cbf580-kube-api-access-sk8dj\") pod \"prometheus-k8s-0\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:53:40.769077 master-0 kubenswrapper[18592]: I0308 03:53:40.768671 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a26c661f-f843-45c5-85f0-2c2f72cbf580-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:53:40.769077 master-0 kubenswrapper[18592]: I0308 03:53:40.769016 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/a26c661f-f843-45c5-85f0-2c2f72cbf580-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:53:40.770128 master-0 kubenswrapper[18592]: I0308 03:53:40.769637 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a26c661f-f843-45c5-85f0-2c2f72cbf580-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:53:40.774501 master-0 kubenswrapper[18592]: I0308 03:53:40.770386 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a26c661f-f843-45c5-85f0-2c2f72cbf580-web-config\") pod \"prometheus-k8s-0\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " 
pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:53:40.774501 master-0 kubenswrapper[18592]: I0308 03:53:40.772191 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a26c661f-f843-45c5-85f0-2c2f72cbf580-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:53:40.776364 master-0 kubenswrapper[18592]: E0308 03:53:40.775177 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a26c661f-f843-45c5-85f0-2c2f72cbf580-prometheus-trusted-ca-bundle podName:a26c661f-f843-45c5-85f0-2c2f72cbf580 nodeName:}" failed. No retries permitted until 2026-03-08 03:53:41.275153164 +0000 UTC m=+33.373907514 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/a26c661f-f843-45c5-85f0-2c2f72cbf580-prometheus-trusted-ca-bundle") pod "prometheus-k8s-0" (UID: "a26c661f-f843-45c5-85f0-2c2f72cbf580") : configmap references non-existent config key: ca-bundle.crt Mar 08 03:53:40.776364 master-0 kubenswrapper[18592]: I0308 03:53:40.776157 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a26c661f-f843-45c5-85f0-2c2f72cbf580-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:53:40.778021 master-0 kubenswrapper[18592]: I0308 03:53:40.777516 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a26c661f-f843-45c5-85f0-2c2f72cbf580-config-out\") pod \"prometheus-k8s-0\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:53:40.780100 master-0 kubenswrapper[18592]: I0308 03:53:40.780071 18592 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a26c661f-f843-45c5-85f0-2c2f72cbf580-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:53:40.780155 master-0 kubenswrapper[18592]: I0308 03:53:40.780075 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a26c661f-f843-45c5-85f0-2c2f72cbf580-config\") pod \"prometheus-k8s-0\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:53:40.784461 master-0 kubenswrapper[18592]: I0308 03:53:40.784414 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a26c661f-f843-45c5-85f0-2c2f72cbf580-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:53:40.786409 master-0 kubenswrapper[18592]: I0308 03:53:40.786373 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/a26c661f-f843-45c5-85f0-2c2f72cbf580-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:53:40.787003 master-0 kubenswrapper[18592]: I0308 03:53:40.786965 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a26c661f-f843-45c5-85f0-2c2f72cbf580-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:53:40.788547 master-0 kubenswrapper[18592]: I0308 03:53:40.788079 18592 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/a26c661f-f843-45c5-85f0-2c2f72cbf580-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:53:40.788547 master-0 kubenswrapper[18592]: I0308 03:53:40.788232 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a26c661f-f843-45c5-85f0-2c2f72cbf580-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:53:40.788547 master-0 kubenswrapper[18592]: I0308 03:53:40.788510 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a26c661f-f843-45c5-85f0-2c2f72cbf580-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:53:40.789166 master-0 kubenswrapper[18592]: I0308 03:53:40.789127 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sk8dj\" (UniqueName: \"kubernetes.io/projected/a26c661f-f843-45c5-85f0-2c2f72cbf580-kube-api-access-sk8dj\") pod \"prometheus-k8s-0\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:53:40.792737 master-0 kubenswrapper[18592]: I0308 03:53:40.792700 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a26c661f-f843-45c5-85f0-2c2f72cbf580-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:53:40.919903 
master-0 kubenswrapper[18592]: I0308 03:53:40.919236 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7cfbff6469-pjfrs" event={"ID":"04bf080c-cf49-4717-abdf-f247a4cdbf46","Type":"ContainerStarted","Data":"6f6852a6f3f16bd46213703f28035908c256ffa54154652846d5af026b42fe05"} Mar 08 03:53:40.919903 master-0 kubenswrapper[18592]: I0308 03:53:40.919304 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7cfbff6469-pjfrs" event={"ID":"04bf080c-cf49-4717-abdf-f247a4cdbf46","Type":"ContainerStarted","Data":"f006337103c76b6e3ccd04c622d21fe03fbe0c9d6c05da63fd353154665d898f"} Mar 08 03:53:40.981452 master-0 kubenswrapper[18592]: I0308 03:53:40.981387 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-64589489d-z2spc"] Mar 08 03:53:40.984885 master-0 kubenswrapper[18592]: W0308 03:53:40.984840 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b79990f_516d_4eb7_bc3f_bf63ff11f105.slice/crio-8d02c443badcea8331c0106623002893e3b340d2ac1e2616b51a592e4da212cf WatchSource:0}: Error finding container 8d02c443badcea8331c0106623002893e3b340d2ac1e2616b51a592e4da212cf: Status 404 returned error can't find the container with id 8d02c443badcea8331c0106623002893e3b340d2ac1e2616b51a592e4da212cf Mar 08 03:53:41.033909 master-0 kubenswrapper[18592]: I0308 03:53:41.032221 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7c975945c4-kbb6q"] Mar 08 03:53:41.375073 master-0 kubenswrapper[18592]: I0308 03:53:41.374925 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a26c661f-f843-45c5-85f0-2c2f72cbf580-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " 
pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:53:41.375475 master-0 kubenswrapper[18592]: E0308 03:53:41.375307 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a26c661f-f843-45c5-85f0-2c2f72cbf580-prometheus-trusted-ca-bundle podName:a26c661f-f843-45c5-85f0-2c2f72cbf580 nodeName:}" failed. No retries permitted until 2026-03-08 03:53:42.375259762 +0000 UTC m=+34.474014142 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "prometheus-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/a26c661f-f843-45c5-85f0-2c2f72cbf580-prometheus-trusted-ca-bundle") pod "prometheus-k8s-0" (UID: "a26c661f-f843-45c5-85f0-2c2f72cbf580") : configmap references non-existent config key: ca-bundle.crt Mar 08 03:53:41.930586 master-0 kubenswrapper[18592]: I0308 03:53:41.930303 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7cfbff6469-pjfrs" event={"ID":"04bf080c-cf49-4717-abdf-f247a4cdbf46","Type":"ContainerStarted","Data":"20eafa2b4c3f544482f7fc886e885f2e7272ec9ccb5c2ef4d26554382c4db543"} Mar 08 03:53:41.932618 master-0 kubenswrapper[18592]: I0308 03:53:41.932290 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7c975945c4-kbb6q" event={"ID":"934ced6d-5bc3-4215-8b97-059f77762cfd","Type":"ContainerStarted","Data":"36770416d265394c23e04d11dfc4e034fa53da977752b4a4a35b8e70c5dc17a0"} Mar 08 03:53:41.935242 master-0 kubenswrapper[18592]: I0308 03:53:41.935205 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-64589489d-z2spc" event={"ID":"8b79990f-516d-4eb7-bc3f-bf63ff11f105","Type":"ContainerStarted","Data":"8d02c443badcea8331c0106623002893e3b340d2ac1e2616b51a592e4da212cf"} Mar 08 03:53:42.394502 master-0 kubenswrapper[18592]: I0308 03:53:42.394438 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/a26c661f-f843-45c5-85f0-2c2f72cbf580-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:53:42.394888 master-0 kubenswrapper[18592]: E0308 03:53:42.394770 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a26c661f-f843-45c5-85f0-2c2f72cbf580-prometheus-trusted-ca-bundle podName:a26c661f-f843-45c5-85f0-2c2f72cbf580 nodeName:}" failed. No retries permitted until 2026-03-08 03:53:44.394734916 +0000 UTC m=+36.493489306 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "prometheus-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/a26c661f-f843-45c5-85f0-2c2f72cbf580-prometheus-trusted-ca-bundle") pod "prometheus-k8s-0" (UID: "a26c661f-f843-45c5-85f0-2c2f72cbf580") : configmap references non-existent config key: ca-bundle.crt Mar 08 03:53:42.899638 master-0 kubenswrapper[18592]: I0308 03:53:42.899326 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cacb9582-2132-4543-8a31-7b100ba4dd2f-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"cacb9582-2132-4543-8a31-7b100ba4dd2f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:53:42.899638 master-0 kubenswrapper[18592]: E0308 03:53:42.899569 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cacb9582-2132-4543-8a31-7b100ba4dd2f-alertmanager-trusted-ca-bundle podName:cacb9582-2132-4543-8a31-7b100ba4dd2f nodeName:}" failed. No retries permitted until 2026-03-08 03:53:50.89953431 +0000 UTC m=+42.998288690 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/cacb9582-2132-4543-8a31-7b100ba4dd2f-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "cacb9582-2132-4543-8a31-7b100ba4dd2f") : configmap references non-existent config key: ca-bundle.crt Mar 08 03:53:43.954035 master-0 kubenswrapper[18592]: I0308 03:53:43.953965 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7c975945c4-kbb6q" event={"ID":"934ced6d-5bc3-4215-8b97-059f77762cfd","Type":"ContainerStarted","Data":"78d2b7afd3a547f82b74029e0cf4b4fce98ec47902393123369af01872cad487"} Mar 08 03:53:43.956054 master-0 kubenswrapper[18592]: I0308 03:53:43.956014 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-64589489d-z2spc" event={"ID":"8b79990f-516d-4eb7-bc3f-bf63ff11f105","Type":"ContainerStarted","Data":"20539338c672f599a91886d18a3b55e9e0c112d0239d27eb4a4688a49e30fc0f"} Mar 08 03:53:43.956415 master-0 kubenswrapper[18592]: I0308 03:53:43.956384 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/monitoring-plugin-64589489d-z2spc" Mar 08 03:53:43.958742 master-0 kubenswrapper[18592]: I0308 03:53:43.958703 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7cfbff6469-pjfrs" event={"ID":"04bf080c-cf49-4717-abdf-f247a4cdbf46","Type":"ContainerStarted","Data":"6e5541306993d15c2778b0a8cb9c27814367ea6b9ace7661a2e2f36f262d1ffc"} Mar 08 03:53:43.958935 master-0 kubenswrapper[18592]: I0308 03:53:43.958908 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7cfbff6469-pjfrs" event={"ID":"04bf080c-cf49-4717-abdf-f247a4cdbf46","Type":"ContainerStarted","Data":"3702a1b62c796bc4e704d6e3aa2e3a4b41dbb6f775c6ec1c5ab1db77d0f1b3ef"} Mar 08 03:53:43.959076 master-0 kubenswrapper[18592]: I0308 03:53:43.959055 18592 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/thanos-querier-7cfbff6469-pjfrs" Mar 08 03:53:43.959185 master-0 kubenswrapper[18592]: I0308 03:53:43.959161 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7cfbff6469-pjfrs" event={"ID":"04bf080c-cf49-4717-abdf-f247a4cdbf46","Type":"ContainerStarted","Data":"2e73e4135b2c3e21037e2435602c39e57677183a8f685f6ddf78ff71800cdc15"} Mar 08 03:53:43.966236 master-0 kubenswrapper[18592]: I0308 03:53:43.966199 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-64589489d-z2spc" Mar 08 03:53:43.983296 master-0 kubenswrapper[18592]: I0308 03:53:43.983228 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-7c975945c4-kbb6q" podStartSLOduration=3.053905275 podStartE2EDuration="4.983207788s" podCreationTimestamp="2026-03-08 03:53:39 +0000 UTC" firstStartedPulling="2026-03-08 03:53:41.044663443 +0000 UTC m=+33.143417803" lastFinishedPulling="2026-03-08 03:53:42.973965956 +0000 UTC m=+35.072720316" observedRunningTime="2026-03-08 03:53:43.98123968 +0000 UTC m=+36.079994070" watchObservedRunningTime="2026-03-08 03:53:43.983207788 +0000 UTC m=+36.081962148" Mar 08 03:53:44.007031 master-0 kubenswrapper[18592]: I0308 03:53:44.006935 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-64589489d-z2spc" podStartSLOduration=3.03162732 podStartE2EDuration="5.00691277s" podCreationTimestamp="2026-03-08 03:53:39 +0000 UTC" firstStartedPulling="2026-03-08 03:53:40.99175789 +0000 UTC m=+33.090512250" lastFinishedPulling="2026-03-08 03:53:42.96704332 +0000 UTC m=+35.065797700" observedRunningTime="2026-03-08 03:53:44.004395809 +0000 UTC m=+36.103150169" watchObservedRunningTime="2026-03-08 03:53:44.00691277 +0000 UTC m=+36.105667120" Mar 08 03:53:44.049203 master-0 
kubenswrapper[18592]: I0308 03:53:44.049115 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-7cfbff6469-pjfrs" podStartSLOduration=2.544095739 podStartE2EDuration="8.049095128s" podCreationTimestamp="2026-03-08 03:53:36 +0000 UTC" firstStartedPulling="2026-03-08 03:53:37.462075021 +0000 UTC m=+29.560829371" lastFinishedPulling="2026-03-08 03:53:42.9670744 +0000 UTC m=+35.065828760" observedRunningTime="2026-03-08 03:53:44.044787835 +0000 UTC m=+36.143542195" watchObservedRunningTime="2026-03-08 03:53:44.049095128 +0000 UTC m=+36.147849488" Mar 08 03:53:44.325724 master-0 kubenswrapper[18592]: I0308 03:53:44.325543 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9c95709c-c3cb-46fb-afe7-626c8013f3c6-kube-api-access\") pod \"installer-1-retry-1-master-0\" (UID: \"9c95709c-c3cb-46fb-afe7-626c8013f3c6\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 08 03:53:44.326004 master-0 kubenswrapper[18592]: E0308 03:53:44.325937 18592 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 08 03:53:44.326004 master-0 kubenswrapper[18592]: E0308 03:53:44.325988 18592 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-1-retry-1-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 08 03:53:44.326162 master-0 kubenswrapper[18592]: E0308 03:53:44.326087 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9c95709c-c3cb-46fb-afe7-626c8013f3c6-kube-api-access podName:9c95709c-c3cb-46fb-afe7-626c8013f3c6 nodeName:}" failed. No retries permitted until 2026-03-08 03:54:16.326052863 +0000 UTC m=+68.424807263 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/9c95709c-c3cb-46fb-afe7-626c8013f3c6-kube-api-access") pod "installer-1-retry-1-master-0" (UID: "9c95709c-c3cb-46fb-afe7-626c8013f3c6") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 08 03:53:44.427404 master-0 kubenswrapper[18592]: I0308 03:53:44.427322 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a26c661f-f843-45c5-85f0-2c2f72cbf580-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:53:44.427702 master-0 kubenswrapper[18592]: E0308 03:53:44.427659 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a26c661f-f843-45c5-85f0-2c2f72cbf580-prometheus-trusted-ca-bundle podName:a26c661f-f843-45c5-85f0-2c2f72cbf580 nodeName:}" failed. No retries permitted until 2026-03-08 03:53:48.427631305 +0000 UTC m=+40.526385695 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "prometheus-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/a26c661f-f843-45c5-85f0-2c2f72cbf580-prometheus-trusted-ca-bundle") pod "prometheus-k8s-0" (UID: "a26c661f-f843-45c5-85f0-2c2f72cbf580") : configmap references non-existent config key: ca-bundle.crt Mar 08 03:53:46.479958 master-0 kubenswrapper[18592]: I0308 03:53:46.479887 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-7cfbff6469-pjfrs" Mar 08 03:53:48.493521 master-0 kubenswrapper[18592]: I0308 03:53:48.493414 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a26c661f-f843-45c5-85f0-2c2f72cbf580-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:53:48.494547 master-0 kubenswrapper[18592]: E0308 03:53:48.493712 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a26c661f-f843-45c5-85f0-2c2f72cbf580-prometheus-trusted-ca-bundle podName:a26c661f-f843-45c5-85f0-2c2f72cbf580 nodeName:}" failed. No retries permitted until 2026-03-08 03:53:56.493686822 +0000 UTC m=+48.592441212 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "prometheus-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/a26c661f-f843-45c5-85f0-2c2f72cbf580-prometheus-trusted-ca-bundle") pod "prometheus-k8s-0" (UID: "a26c661f-f843-45c5-85f0-2c2f72cbf580") : configmap references non-existent config key: ca-bundle.crt Mar 08 03:53:50.938502 master-0 kubenswrapper[18592]: I0308 03:53:50.938410 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cacb9582-2132-4543-8a31-7b100ba4dd2f-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"cacb9582-2132-4543-8a31-7b100ba4dd2f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:53:50.939512 master-0 kubenswrapper[18592]: E0308 03:53:50.938734 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cacb9582-2132-4543-8a31-7b100ba4dd2f-alertmanager-trusted-ca-bundle podName:cacb9582-2132-4543-8a31-7b100ba4dd2f nodeName:}" failed. No retries permitted until 2026-03-08 03:54:06.9386971 +0000 UTC m=+59.037451490 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/cacb9582-2132-4543-8a31-7b100ba4dd2f-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "cacb9582-2132-4543-8a31-7b100ba4dd2f") : configmap references non-existent config key: ca-bundle.crt Mar 08 03:53:55.106499 master-0 kubenswrapper[18592]: I0308 03:53:55.106415 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/48ab3c8e-a2bd-4380-9e8d-a41d515a989d-trusted-ca\") pod \"console-operator-6c7fb6b958-mr9k6\" (UID: \"48ab3c8e-a2bd-4380-9e8d-a41d515a989d\") " pod="openshift-console-operator/console-operator-6c7fb6b958-mr9k6" Mar 08 03:53:55.107271 master-0 kubenswrapper[18592]: E0308 03:53:55.106656 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/48ab3c8e-a2bd-4380-9e8d-a41d515a989d-trusted-ca podName:48ab3c8e-a2bd-4380-9e8d-a41d515a989d nodeName:}" failed. No retries permitted until 2026-03-08 03:54:27.106622036 +0000 UTC m=+79.205376426 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "trusted-ca" (UniqueName: "kubernetes.io/configmap/48ab3c8e-a2bd-4380-9e8d-a41d515a989d-trusted-ca") pod "console-operator-6c7fb6b958-mr9k6" (UID: "48ab3c8e-a2bd-4380-9e8d-a41d515a989d") : configmap references non-existent config key: ca-bundle.crt Mar 08 03:53:55.755625 master-0 kubenswrapper[18592]: I0308 03:53:55.755544 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-3-master-0"] Mar 08 03:53:55.756324 master-0 kubenswrapper[18592]: I0308 03:53:55.756284 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0" Mar 08 03:53:55.759271 master-0 kubenswrapper[18592]: I0308 03:53:55.759219 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 08 03:53:55.759418 master-0 kubenswrapper[18592]: I0308 03:53:55.759275 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-ccz75" Mar 08 03:53:55.778801 master-0 kubenswrapper[18592]: I0308 03:53:55.778742 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-3-master-0"] Mar 08 03:53:55.821009 master-0 kubenswrapper[18592]: I0308 03:53:55.820929 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/55240670-0b16-485e-b5cf-f3e7bc4431f5-kube-api-access\") pod \"installer-3-master-0\" (UID: \"55240670-0b16-485e-b5cf-f3e7bc4431f5\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 08 03:53:55.821293 master-0 kubenswrapper[18592]: I0308 03:53:55.821088 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/55240670-0b16-485e-b5cf-f3e7bc4431f5-var-lock\") pod \"installer-3-master-0\" (UID: \"55240670-0b16-485e-b5cf-f3e7bc4431f5\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 08 03:53:55.821293 master-0 kubenswrapper[18592]: I0308 03:53:55.821155 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/55240670-0b16-485e-b5cf-f3e7bc4431f5-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"55240670-0b16-485e-b5cf-f3e7bc4431f5\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 08 03:53:55.923332 master-0 kubenswrapper[18592]: I0308 03:53:55.923217 18592 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/55240670-0b16-485e-b5cf-f3e7bc4431f5-kube-api-access\") pod \"installer-3-master-0\" (UID: \"55240670-0b16-485e-b5cf-f3e7bc4431f5\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 08 03:53:55.923678 master-0 kubenswrapper[18592]: I0308 03:53:55.923355 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/55240670-0b16-485e-b5cf-f3e7bc4431f5-var-lock\") pod \"installer-3-master-0\" (UID: \"55240670-0b16-485e-b5cf-f3e7bc4431f5\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 08 03:53:55.923678 master-0 kubenswrapper[18592]: I0308 03:53:55.923402 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/55240670-0b16-485e-b5cf-f3e7bc4431f5-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"55240670-0b16-485e-b5cf-f3e7bc4431f5\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 08 03:53:55.923678 master-0 kubenswrapper[18592]: I0308 03:53:55.923471 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/55240670-0b16-485e-b5cf-f3e7bc4431f5-var-lock\") pod \"installer-3-master-0\" (UID: \"55240670-0b16-485e-b5cf-f3e7bc4431f5\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 08 03:53:55.923678 master-0 kubenswrapper[18592]: I0308 03:53:55.923498 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/55240670-0b16-485e-b5cf-f3e7bc4431f5-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"55240670-0b16-485e-b5cf-f3e7bc4431f5\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 08 03:53:55.938848 master-0 kubenswrapper[18592]: I0308 03:53:55.938784 18592 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/55240670-0b16-485e-b5cf-f3e7bc4431f5-kube-api-access\") pod \"installer-3-master-0\" (UID: \"55240670-0b16-485e-b5cf-f3e7bc4431f5\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 08 03:53:56.076529 master-0 kubenswrapper[18592]: I0308 03:53:56.076346 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0" Mar 08 03:53:56.559092 master-0 kubenswrapper[18592]: I0308 03:53:56.558939 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a26c661f-f843-45c5-85f0-2c2f72cbf580-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:53:56.560977 master-0 kubenswrapper[18592]: E0308 03:53:56.559964 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a26c661f-f843-45c5-85f0-2c2f72cbf580-prometheus-trusted-ca-bundle podName:a26c661f-f843-45c5-85f0-2c2f72cbf580 nodeName:}" failed. No retries permitted until 2026-03-08 03:54:12.559157678 +0000 UTC m=+64.657912068 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "prometheus-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/a26c661f-f843-45c5-85f0-2c2f72cbf580-prometheus-trusted-ca-bundle") pod "prometheus-k8s-0" (UID: "a26c661f-f843-45c5-85f0-2c2f72cbf580") : configmap references non-existent config key: ca-bundle.crt Mar 08 03:53:56.615995 master-0 kubenswrapper[18592]: I0308 03:53:56.615945 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-3-master-0"] Mar 08 03:53:56.622960 master-0 kubenswrapper[18592]: W0308 03:53:56.622878 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod55240670_0b16_485e_b5cf_f3e7bc4431f5.slice/crio-0d6f4735df790157d7454c6cb3428443ea59d8eeac7fa002c3b3fcdd9602dcd4 WatchSource:0}: Error finding container 0d6f4735df790157d7454c6cb3428443ea59d8eeac7fa002c3b3fcdd9602dcd4: Status 404 returned error can't find the container with id 0d6f4735df790157d7454c6cb3428443ea59d8eeac7fa002c3b3fcdd9602dcd4 Mar 08 03:53:57.067424 master-0 kubenswrapper[18592]: I0308 03:53:57.067345 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-3-master-0" event={"ID":"55240670-0b16-485e-b5cf-f3e7bc4431f5","Type":"ContainerStarted","Data":"8993f2eb590c5e6fb3bab3a6e1230e8a4a446a446cec7cc0e36dedb8eff9076c"} Mar 08 03:53:57.067424 master-0 kubenswrapper[18592]: I0308 03:53:57.067405 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-3-master-0" event={"ID":"55240670-0b16-485e-b5cf-f3e7bc4431f5","Type":"ContainerStarted","Data":"0d6f4735df790157d7454c6cb3428443ea59d8eeac7fa002c3b3fcdd9602dcd4"} Mar 08 03:53:57.090605 master-0 kubenswrapper[18592]: I0308 03:53:57.090448 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-3-master-0" podStartSLOduration=2.090395141 podStartE2EDuration="2.090395141s" podCreationTimestamp="2026-03-08 03:53:55 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:53:57.089392626 +0000 UTC m=+49.188147016" watchObservedRunningTime="2026-03-08 03:53:57.090395141 +0000 UTC m=+49.189149521" Mar 08 03:53:59.814797 master-0 kubenswrapper[18592]: I0308 03:53:59.814722 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-7c975945c4-kbb6q" Mar 08 03:53:59.815609 master-0 kubenswrapper[18592]: I0308 03:53:59.814814 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/metrics-server-7c975945c4-kbb6q" Mar 08 03:54:02.772187 master-0 kubenswrapper[18592]: I0308 03:54:02.772104 18592 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 08 03:54:02.773094 master-0 kubenswrapper[18592]: I0308 03:54:02.772507 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="a12e67e5b53279c862df229026c8d16c" containerName="kube-controller-manager" containerID="cri-o://9d1a3af9468d450b8ce515e818a31e6bfe522f30f01bccb1080ebaabf3f6d3f1" gracePeriod=30 Mar 08 03:54:02.773094 master-0 kubenswrapper[18592]: I0308 03:54:02.772585 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="a12e67e5b53279c862df229026c8d16c" containerName="kube-controller-manager-recovery-controller" containerID="cri-o://6979155324a9775c0f334fc4aa6afa070463810c3191479ea2bb2dbfe2843ea3" gracePeriod=30 Mar 08 03:54:02.773094 master-0 kubenswrapper[18592]: I0308 03:54:02.772693 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" 
podUID="a12e67e5b53279c862df229026c8d16c" containerName="kube-controller-manager-cert-syncer" containerID="cri-o://d92fdcd0bd88e0c579bd858a04d7e6e266a7f72aec3885543e0de2cee51140ac" gracePeriod=30 Mar 08 03:54:02.773094 master-0 kubenswrapper[18592]: I0308 03:54:02.772766 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="a12e67e5b53279c862df229026c8d16c" containerName="cluster-policy-controller" containerID="cri-o://f5cec83dc05dfae95933e7d5e4646a470fd6b2150eeed3507d1b115fc1dfcb34" gracePeriod=30 Mar 08 03:54:02.776301 master-0 kubenswrapper[18592]: I0308 03:54:02.776239 18592 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 08 03:54:02.776698 master-0 kubenswrapper[18592]: E0308 03:54:02.776657 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a12e67e5b53279c862df229026c8d16c" containerName="kube-controller-manager-recovery-controller" Mar 08 03:54:02.776698 master-0 kubenswrapper[18592]: I0308 03:54:02.776693 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="a12e67e5b53279c862df229026c8d16c" containerName="kube-controller-manager-recovery-controller" Mar 08 03:54:02.776795 master-0 kubenswrapper[18592]: E0308 03:54:02.776721 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a12e67e5b53279c862df229026c8d16c" containerName="cluster-policy-controller" Mar 08 03:54:02.776795 master-0 kubenswrapper[18592]: I0308 03:54:02.776735 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="a12e67e5b53279c862df229026c8d16c" containerName="cluster-policy-controller" Mar 08 03:54:02.776795 master-0 kubenswrapper[18592]: E0308 03:54:02.776773 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a12e67e5b53279c862df229026c8d16c" containerName="kube-controller-manager" Mar 08 03:54:02.776795 master-0 kubenswrapper[18592]: 
I0308 03:54:02.776786 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="a12e67e5b53279c862df229026c8d16c" containerName="kube-controller-manager" Mar 08 03:54:02.776980 master-0 kubenswrapper[18592]: E0308 03:54:02.776819 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a12e67e5b53279c862df229026c8d16c" containerName="kube-controller-manager-cert-syncer" Mar 08 03:54:02.776980 master-0 kubenswrapper[18592]: I0308 03:54:02.776879 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="a12e67e5b53279c862df229026c8d16c" containerName="kube-controller-manager-cert-syncer" Mar 08 03:54:02.777116 master-0 kubenswrapper[18592]: I0308 03:54:02.777076 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="a12e67e5b53279c862df229026c8d16c" containerName="kube-controller-manager" Mar 08 03:54:02.777170 master-0 kubenswrapper[18592]: I0308 03:54:02.777128 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="a12e67e5b53279c862df229026c8d16c" containerName="kube-controller-manager-recovery-controller" Mar 08 03:54:02.777170 master-0 kubenswrapper[18592]: I0308 03:54:02.777145 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="a12e67e5b53279c862df229026c8d16c" containerName="cluster-policy-controller" Mar 08 03:54:02.777252 master-0 kubenswrapper[18592]: I0308 03:54:02.777171 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="a12e67e5b53279c862df229026c8d16c" containerName="kube-controller-manager-cert-syncer" Mar 08 03:54:02.865114 master-0 kubenswrapper[18592]: I0308 03:54:02.865058 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/0580c83f64e952a7a614903b6fdf6965-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"0580c83f64e952a7a614903b6fdf6965\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:54:02.865367 
master-0 kubenswrapper[18592]: I0308 03:54:02.865280 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/0580c83f64e952a7a614903b6fdf6965-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"0580c83f64e952a7a614903b6fdf6965\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:54:02.966228 master-0 kubenswrapper[18592]: I0308 03:54:02.966142 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/0580c83f64e952a7a614903b6fdf6965-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"0580c83f64e952a7a614903b6fdf6965\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:54:02.966597 master-0 kubenswrapper[18592]: I0308 03:54:02.966334 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/0580c83f64e952a7a614903b6fdf6965-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"0580c83f64e952a7a614903b6fdf6965\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:54:02.966597 master-0 kubenswrapper[18592]: I0308 03:54:02.966428 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/0580c83f64e952a7a614903b6fdf6965-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"0580c83f64e952a7a614903b6fdf6965\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:54:02.966597 master-0 kubenswrapper[18592]: I0308 03:54:02.966488 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/0580c83f64e952a7a614903b6fdf6965-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"0580c83f64e952a7a614903b6fdf6965\") " 
pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:54:03.070534 master-0 kubenswrapper[18592]: I0308 03:54:03.070308 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_a12e67e5b53279c862df229026c8d16c/kube-controller-manager-cert-syncer/0.log" Mar 08 03:54:03.071615 master-0 kubenswrapper[18592]: I0308 03:54:03.071576 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:54:03.075561 master-0 kubenswrapper[18592]: I0308 03:54:03.075505 18592 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="a12e67e5b53279c862df229026c8d16c" podUID="0580c83f64e952a7a614903b6fdf6965" Mar 08 03:54:03.119672 master-0 kubenswrapper[18592]: I0308 03:54:03.119621 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_a12e67e5b53279c862df229026c8d16c/kube-controller-manager-cert-syncer/0.log" Mar 08 03:54:03.121022 master-0 kubenswrapper[18592]: I0308 03:54:03.120969 18592 generic.go:334] "Generic (PLEG): container finished" podID="a12e67e5b53279c862df229026c8d16c" containerID="6979155324a9775c0f334fc4aa6afa070463810c3191479ea2bb2dbfe2843ea3" exitCode=0 Mar 08 03:54:03.121022 master-0 kubenswrapper[18592]: I0308 03:54:03.121013 18592 generic.go:334] "Generic (PLEG): container finished" podID="a12e67e5b53279c862df229026c8d16c" containerID="d92fdcd0bd88e0c579bd858a04d7e6e266a7f72aec3885543e0de2cee51140ac" exitCode=2 Mar 08 03:54:03.121158 master-0 kubenswrapper[18592]: I0308 03:54:03.121030 18592 generic.go:334] "Generic (PLEG): container finished" podID="a12e67e5b53279c862df229026c8d16c" containerID="f5cec83dc05dfae95933e7d5e4646a470fd6b2150eeed3507d1b115fc1dfcb34" exitCode=0 
Mar 08 03:54:03.121158 master-0 kubenswrapper[18592]: I0308 03:54:03.121048 18592 generic.go:334] "Generic (PLEG): container finished" podID="a12e67e5b53279c862df229026c8d16c" containerID="9d1a3af9468d450b8ce515e818a31e6bfe522f30f01bccb1080ebaabf3f6d3f1" exitCode=0 Mar 08 03:54:03.121158 master-0 kubenswrapper[18592]: I0308 03:54:03.121061 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:54:03.121158 master-0 kubenswrapper[18592]: I0308 03:54:03.121082 18592 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9af5ebd3eee3c3de99e27a671d715ba12c7da929014abc4a9a4424a8fb8aad4e" Mar 08 03:54:03.123423 master-0 kubenswrapper[18592]: I0308 03:54:03.123362 18592 generic.go:334] "Generic (PLEG): container finished" podID="e5dd61e1-5034-4d59-b752-9a4f6adb92d8" containerID="923c74cb0eb187b16aa30ad8f198a5424dc4e0bd386f1790150cfb8a6ba477c4" exitCode=0 Mar 08 03:54:03.123518 master-0 kubenswrapper[18592]: I0308 03:54:03.123433 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-4-master-0" event={"ID":"e5dd61e1-5034-4d59-b752-9a4f6adb92d8","Type":"ContainerDied","Data":"923c74cb0eb187b16aa30ad8f198a5424dc4e0bd386f1790150cfb8a6ba477c4"} Mar 08 03:54:03.124403 master-0 kubenswrapper[18592]: I0308 03:54:03.124349 18592 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="a12e67e5b53279c862df229026c8d16c" podUID="0580c83f64e952a7a614903b6fdf6965" Mar 08 03:54:03.169959 master-0 kubenswrapper[18592]: I0308 03:54:03.169873 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/a12e67e5b53279c862df229026c8d16c-resource-dir\") pod \"a12e67e5b53279c862df229026c8d16c\" (UID: 
\"a12e67e5b53279c862df229026c8d16c\") " Mar 08 03:54:03.170205 master-0 kubenswrapper[18592]: I0308 03:54:03.170018 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a12e67e5b53279c862df229026c8d16c-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "a12e67e5b53279c862df229026c8d16c" (UID: "a12e67e5b53279c862df229026c8d16c"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:54:03.170205 master-0 kubenswrapper[18592]: I0308 03:54:03.170041 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/a12e67e5b53279c862df229026c8d16c-cert-dir\") pod \"a12e67e5b53279c862df229026c8d16c\" (UID: \"a12e67e5b53279c862df229026c8d16c\") " Mar 08 03:54:03.170205 master-0 kubenswrapper[18592]: I0308 03:54:03.170092 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a12e67e5b53279c862df229026c8d16c-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "a12e67e5b53279c862df229026c8d16c" (UID: "a12e67e5b53279c862df229026c8d16c"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:54:03.170702 master-0 kubenswrapper[18592]: I0308 03:54:03.170666 18592 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/a12e67e5b53279c862df229026c8d16c-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 03:54:03.170702 master-0 kubenswrapper[18592]: I0308 03:54:03.170689 18592 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/a12e67e5b53279c862df229026c8d16c-cert-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 03:54:03.449720 master-0 kubenswrapper[18592]: I0308 03:54:03.449643 18592 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="a12e67e5b53279c862df229026c8d16c" podUID="0580c83f64e952a7a614903b6fdf6965" Mar 08 03:54:04.155383 master-0 kubenswrapper[18592]: I0308 03:54:04.155328 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a12e67e5b53279c862df229026c8d16c" path="/var/lib/kubelet/pods/a12e67e5b53279c862df229026c8d16c/volumes" Mar 08 03:54:04.483954 master-0 kubenswrapper[18592]: I0308 03:54:04.483913 18592 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-4-master-0" Mar 08 03:54:04.595592 master-0 kubenswrapper[18592]: I0308 03:54:04.595501 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e5dd61e1-5034-4d59-b752-9a4f6adb92d8-kube-api-access\") pod \"e5dd61e1-5034-4d59-b752-9a4f6adb92d8\" (UID: \"e5dd61e1-5034-4d59-b752-9a4f6adb92d8\") " Mar 08 03:54:04.595962 master-0 kubenswrapper[18592]: I0308 03:54:04.595748 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e5dd61e1-5034-4d59-b752-9a4f6adb92d8-kubelet-dir\") pod \"e5dd61e1-5034-4d59-b752-9a4f6adb92d8\" (UID: \"e5dd61e1-5034-4d59-b752-9a4f6adb92d8\") " Mar 08 03:54:04.595962 master-0 kubenswrapper[18592]: I0308 03:54:04.595913 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e5dd61e1-5034-4d59-b752-9a4f6adb92d8-var-lock\") pod \"e5dd61e1-5034-4d59-b752-9a4f6adb92d8\" (UID: \"e5dd61e1-5034-4d59-b752-9a4f6adb92d8\") " Mar 08 03:54:04.595962 master-0 kubenswrapper[18592]: I0308 03:54:04.595906 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e5dd61e1-5034-4d59-b752-9a4f6adb92d8-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e5dd61e1-5034-4d59-b752-9a4f6adb92d8" (UID: "e5dd61e1-5034-4d59-b752-9a4f6adb92d8"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:54:04.596199 master-0 kubenswrapper[18592]: I0308 03:54:04.596102 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e5dd61e1-5034-4d59-b752-9a4f6adb92d8-var-lock" (OuterVolumeSpecName: "var-lock") pod "e5dd61e1-5034-4d59-b752-9a4f6adb92d8" (UID: "e5dd61e1-5034-4d59-b752-9a4f6adb92d8"). 
InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:54:04.596499 master-0 kubenswrapper[18592]: I0308 03:54:04.596458 18592 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e5dd61e1-5034-4d59-b752-9a4f6adb92d8-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 03:54:04.596540 master-0 kubenswrapper[18592]: I0308 03:54:04.596497 18592 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e5dd61e1-5034-4d59-b752-9a4f6adb92d8-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 08 03:54:04.600519 master-0 kubenswrapper[18592]: I0308 03:54:04.600462 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5dd61e1-5034-4d59-b752-9a4f6adb92d8-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e5dd61e1-5034-4d59-b752-9a4f6adb92d8" (UID: "e5dd61e1-5034-4d59-b752-9a4f6adb92d8"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:54:04.699028 master-0 kubenswrapper[18592]: I0308 03:54:04.698837 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e5dd61e1-5034-4d59-b752-9a4f6adb92d8-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 08 03:54:05.140445 master-0 kubenswrapper[18592]: I0308 03:54:05.140238 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-4-master-0" event={"ID":"e5dd61e1-5034-4d59-b752-9a4f6adb92d8","Type":"ContainerDied","Data":"34dd4de04e0481f8dceb9d2e1dc60475fecde9f20ac0b059e7cfbc6cbaa160ff"} Mar 08 03:54:05.140445 master-0 kubenswrapper[18592]: I0308 03:54:05.140302 18592 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34dd4de04e0481f8dceb9d2e1dc60475fecde9f20ac0b059e7cfbc6cbaa160ff" Mar 08 03:54:05.140445 master-0 kubenswrapper[18592]: I0308 03:54:05.140313 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-4-master-0" Mar 08 03:54:07.037182 master-0 kubenswrapper[18592]: I0308 03:54:07.037072 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cacb9582-2132-4543-8a31-7b100ba4dd2f-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"cacb9582-2132-4543-8a31-7b100ba4dd2f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:54:07.038058 master-0 kubenswrapper[18592]: E0308 03:54:07.037454 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cacb9582-2132-4543-8a31-7b100ba4dd2f-alertmanager-trusted-ca-bundle podName:cacb9582-2132-4543-8a31-7b100ba4dd2f nodeName:}" failed. No retries permitted until 2026-03-08 03:54:39.037427913 +0000 UTC m=+91.136182303 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/cacb9582-2132-4543-8a31-7b100ba4dd2f-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "cacb9582-2132-4543-8a31-7b100ba4dd2f") : configmap references non-existent config key: ca-bundle.crt Mar 08 03:54:08.547740 master-0 kubenswrapper[18592]: I0308 03:54:08.547697 18592 scope.go:117] "RemoveContainer" containerID="b00978d6151280d243ba1f6c8276b934ba5c5276b57bc3800284f048820f905f" Mar 08 03:54:08.577275 master-0 kubenswrapper[18592]: I0308 03:54:08.577224 18592 scope.go:117] "RemoveContainer" containerID="20197cef49bb05fb75f2e7eda65c3e92dc7a4af95343b25ff91e78b1d42be6fb" Mar 08 03:54:12.567314 master-0 kubenswrapper[18592]: I0308 03:54:12.567218 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a26c661f-f843-45c5-85f0-2c2f72cbf580-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:54:12.568216 master-0 kubenswrapper[18592]: E0308 03:54:12.567526 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a26c661f-f843-45c5-85f0-2c2f72cbf580-prometheus-trusted-ca-bundle podName:a26c661f-f843-45c5-85f0-2c2f72cbf580 nodeName:}" failed. No retries permitted until 2026-03-08 03:54:44.567488347 +0000 UTC m=+96.666242727 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "prometheus-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/a26c661f-f843-45c5-85f0-2c2f72cbf580-prometheus-trusted-ca-bundle") pod "prometheus-k8s-0" (UID: "a26c661f-f843-45c5-85f0-2c2f72cbf580") : configmap references non-existent config key: ca-bundle.crt Mar 08 03:54:15.142709 master-0 kubenswrapper[18592]: I0308 03:54:15.142622 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:54:15.163639 master-0 kubenswrapper[18592]: I0308 03:54:15.163499 18592 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="e0c76075-1c7d-4a66-8f5f-b4d59f470e30" Mar 08 03:54:15.163639 master-0 kubenswrapper[18592]: I0308 03:54:15.163614 18592 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="e0c76075-1c7d-4a66-8f5f-b4d59f470e30" Mar 08 03:54:15.189394 master-0 kubenswrapper[18592]: I0308 03:54:15.186240 18592 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:54:15.190611 master-0 kubenswrapper[18592]: I0308 03:54:15.190538 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 08 03:54:15.197869 master-0 kubenswrapper[18592]: I0308 03:54:15.197483 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 08 03:54:15.206101 master-0 kubenswrapper[18592]: I0308 03:54:15.205923 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:54:15.218745 master-0 kubenswrapper[18592]: I0308 03:54:15.215258 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 08 03:54:15.244899 master-0 kubenswrapper[18592]: W0308 03:54:15.244790 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0580c83f64e952a7a614903b6fdf6965.slice/crio-d1ef459c8cc0e71b033062ff571e95ba6d294f407e26adcf19ee5af5b38b60b5 WatchSource:0}: Error finding container d1ef459c8cc0e71b033062ff571e95ba6d294f407e26adcf19ee5af5b38b60b5: Status 404 returned error can't find the container with id d1ef459c8cc0e71b033062ff571e95ba6d294f407e26adcf19ee5af5b38b60b5 Mar 08 03:54:15.629659 master-0 kubenswrapper[18592]: I0308 03:54:15.629587 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"0580c83f64e952a7a614903b6fdf6965","Type":"ContainerStarted","Data":"c527e941996e05aa6d3486c73a6d0bb7b7ce8bc9a0ef0c0e159892bbc6674ce3"} Mar 08 03:54:15.629659 master-0 kubenswrapper[18592]: I0308 03:54:15.629653 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"0580c83f64e952a7a614903b6fdf6965","Type":"ContainerStarted","Data":"d1ef459c8cc0e71b033062ff571e95ba6d294f407e26adcf19ee5af5b38b60b5"} Mar 08 03:54:16.330572 master-0 kubenswrapper[18592]: I0308 03:54:16.330503 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9c95709c-c3cb-46fb-afe7-626c8013f3c6-kube-api-access\") pod \"installer-1-retry-1-master-0\" (UID: \"9c95709c-c3cb-46fb-afe7-626c8013f3c6\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 08 03:54:16.340744 
master-0 kubenswrapper[18592]: I0308 03:54:16.336053 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9c95709c-c3cb-46fb-afe7-626c8013f3c6-kube-api-access\") pod \"installer-1-retry-1-master-0\" (UID: \"9c95709c-c3cb-46fb-afe7-626c8013f3c6\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 08 03:54:16.532485 master-0 kubenswrapper[18592]: I0308 03:54:16.532355 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9c95709c-c3cb-46fb-afe7-626c8013f3c6-kube-api-access\") pod \"9c95709c-c3cb-46fb-afe7-626c8013f3c6\" (UID: \"9c95709c-c3cb-46fb-afe7-626c8013f3c6\") " Mar 08 03:54:16.537594 master-0 kubenswrapper[18592]: I0308 03:54:16.537520 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c95709c-c3cb-46fb-afe7-626c8013f3c6-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "9c95709c-c3cb-46fb-afe7-626c8013f3c6" (UID: "9c95709c-c3cb-46fb-afe7-626c8013f3c6"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:54:16.633884 master-0 kubenswrapper[18592]: I0308 03:54:16.633705 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9c95709c-c3cb-46fb-afe7-626c8013f3c6-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 08 03:54:16.641605 master-0 kubenswrapper[18592]: I0308 03:54:16.641559 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"0580c83f64e952a7a614903b6fdf6965","Type":"ContainerStarted","Data":"7f62bc4b6acd53da006a6081ef3e761680d62fc9f72141ffe7169f424211ea5e"} Mar 08 03:54:16.641605 master-0 kubenswrapper[18592]: I0308 03:54:16.641600 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"0580c83f64e952a7a614903b6fdf6965","Type":"ContainerStarted","Data":"a202a6ef0655d9c023cb7f6ed3d9196a81b1bf82cb7639b6b968c5efc0a0ade5"} Mar 08 03:54:16.641605 master-0 kubenswrapper[18592]: I0308 03:54:16.641609 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"0580c83f64e952a7a614903b6fdf6965","Type":"ContainerStarted","Data":"ade4a3d46dcebb1e326fada73ac4cc99f5151e1191d54ff04a063230922fd053"} Mar 08 03:54:16.662030 master-0 kubenswrapper[18592]: I0308 03:54:16.661946 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podStartSLOduration=1.6619268680000001 podStartE2EDuration="1.661926868s" podCreationTimestamp="2026-03-08 03:54:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:54:16.660047052 +0000 UTC m=+68.758801422" watchObservedRunningTime="2026-03-08 03:54:16.661926868 +0000 
UTC m=+68.760681228" Mar 08 03:54:19.824351 master-0 kubenswrapper[18592]: I0308 03:54:19.824295 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-7c975945c4-kbb6q" Mar 08 03:54:19.831809 master-0 kubenswrapper[18592]: I0308 03:54:19.831739 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-7c975945c4-kbb6q" Mar 08 03:54:25.206631 master-0 kubenswrapper[18592]: I0308 03:54:25.206244 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:54:25.207552 master-0 kubenswrapper[18592]: I0308 03:54:25.206752 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:54:25.207552 master-0 kubenswrapper[18592]: I0308 03:54:25.206778 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:54:25.207552 master-0 kubenswrapper[18592]: I0308 03:54:25.206897 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:54:25.213101 master-0 kubenswrapper[18592]: I0308 03:54:25.213080 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:54:25.214627 master-0 kubenswrapper[18592]: I0308 03:54:25.214604 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:54:25.735075 master-0 kubenswrapper[18592]: I0308 03:54:25.735027 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:54:25.737071 master-0 kubenswrapper[18592]: I0308 03:54:25.737032 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:54:27.117347 master-0 kubenswrapper[18592]: I0308 03:54:27.117267 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/48ab3c8e-a2bd-4380-9e8d-a41d515a989d-trusted-ca\") pod \"console-operator-6c7fb6b958-mr9k6\" (UID: \"48ab3c8e-a2bd-4380-9e8d-a41d515a989d\") " pod="openshift-console-operator/console-operator-6c7fb6b958-mr9k6" Mar 08 03:54:27.118306 master-0 kubenswrapper[18592]: E0308 03:54:27.117468 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/48ab3c8e-a2bd-4380-9e8d-a41d515a989d-trusted-ca podName:48ab3c8e-a2bd-4380-9e8d-a41d515a989d nodeName:}" failed. No retries permitted until 2026-03-08 03:55:31.117439674 +0000 UTC m=+143.216194064 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "trusted-ca" (UniqueName: "kubernetes.io/configmap/48ab3c8e-a2bd-4380-9e8d-a41d515a989d-trusted-ca") pod "console-operator-6c7fb6b958-mr9k6" (UID: "48ab3c8e-a2bd-4380-9e8d-a41d515a989d") : configmap references non-existent config key: ca-bundle.crt Mar 08 03:54:35.138605 master-0 kubenswrapper[18592]: I0308 03:54:35.138483 18592 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Mar 08 03:54:35.139597 master-0 kubenswrapper[18592]: E0308 03:54:35.138957 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5dd61e1-5034-4d59-b752-9a4f6adb92d8" containerName="installer" Mar 08 03:54:35.139597 master-0 kubenswrapper[18592]: I0308 03:54:35.138980 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5dd61e1-5034-4d59-b752-9a4f6adb92d8" containerName="installer" Mar 08 03:54:35.139597 master-0 kubenswrapper[18592]: I0308 03:54:35.139200 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5dd61e1-5034-4d59-b752-9a4f6adb92d8" containerName="installer" Mar 08 03:54:35.139935 master-0 kubenswrapper[18592]: I0308 03:54:35.139878 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:54:35.140221 master-0 kubenswrapper[18592]: I0308 03:54:35.140158 18592 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Mar 08 03:54:35.140661 master-0 kubenswrapper[18592]: I0308 03:54:35.140606 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver" containerID="cri-o://80a13278743d26b7b1321c7095283277668741654b1e182af894d61a0ac675ff" gracePeriod=15 Mar 08 03:54:35.140739 master-0 kubenswrapper[18592]: I0308 03:54:35.140643 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-check-endpoints" containerID="cri-o://0a6172e8ccc8a4efe2658181eb18b7f6b4fbfb74c1d8665ad23817e21967ec14" gracePeriod=15 Mar 08 03:54:35.140805 master-0 kubenswrapper[18592]: I0308 03:54:35.140705 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-cert-syncer" containerID="cri-o://77d9f19c7fff32bc633b77d809c0704eaf44b3aee7eeaf009773338793ad2dd5" gracePeriod=15 Mar 08 03:54:35.140904 master-0 kubenswrapper[18592]: I0308 03:54:35.140730 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://5bc596a566a004204d8781e6880a298269208812a64f684e8f90b164a5a846fe" gracePeriod=15 Mar 08 03:54:35.140973 master-0 kubenswrapper[18592]: I0308 03:54:35.140743 18592 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://865460c774c2766f8b86ebf8237c6f8af6ae97a526279d303aebe43f358dbff8" gracePeriod=15 Mar 08 03:54:35.162217 master-0 kubenswrapper[18592]: I0308 03:54:35.161070 18592 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Mar 08 03:54:35.162217 master-0 kubenswrapper[18592]: E0308 03:54:35.161452 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-cert-regeneration-controller" Mar 08 03:54:35.162217 master-0 kubenswrapper[18592]: I0308 03:54:35.161475 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-cert-regeneration-controller" Mar 08 03:54:35.162217 master-0 kubenswrapper[18592]: E0308 03:54:35.161503 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-insecure-readyz" Mar 08 03:54:35.162217 master-0 kubenswrapper[18592]: I0308 03:54:35.161516 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-insecure-readyz" Mar 08 03:54:35.162217 master-0 kubenswrapper[18592]: E0308 03:54:35.161538 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver" Mar 08 03:54:35.162217 master-0 kubenswrapper[18592]: I0308 03:54:35.161551 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver" Mar 08 03:54:35.162217 master-0 kubenswrapper[18592]: E0308 03:54:35.161572 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" 
containerName="kube-apiserver-cert-syncer" Mar 08 03:54:35.162217 master-0 kubenswrapper[18592]: I0308 03:54:35.161585 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-cert-syncer" Mar 08 03:54:35.162217 master-0 kubenswrapper[18592]: E0308 03:54:35.161608 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-check-endpoints" Mar 08 03:54:35.162217 master-0 kubenswrapper[18592]: I0308 03:54:35.161620 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-check-endpoints" Mar 08 03:54:35.162217 master-0 kubenswrapper[18592]: E0308 03:54:35.161632 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="setup" Mar 08 03:54:35.162217 master-0 kubenswrapper[18592]: I0308 03:54:35.161644 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="setup" Mar 08 03:54:35.162217 master-0 kubenswrapper[18592]: I0308 03:54:35.162040 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-insecure-readyz" Mar 08 03:54:35.162217 master-0 kubenswrapper[18592]: I0308 03:54:35.162071 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver" Mar 08 03:54:35.162217 master-0 kubenswrapper[18592]: I0308 03:54:35.162094 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-cert-syncer" Mar 08 03:54:35.162217 master-0 kubenswrapper[18592]: I0308 03:54:35.162121 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-check-endpoints" Mar 08 
03:54:35.162217 master-0 kubenswrapper[18592]: I0308 03:54:35.162141 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-cert-regeneration-controller" Mar 08 03:54:35.162217 master-0 kubenswrapper[18592]: I0308 03:54:35.162160 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-check-endpoints" Mar 08 03:54:35.163138 master-0 kubenswrapper[18592]: E0308 03:54:35.162409 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-check-endpoints" Mar 08 03:54:35.163138 master-0 kubenswrapper[18592]: I0308 03:54:35.162426 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-check-endpoints" Mar 08 03:54:35.202802 master-0 kubenswrapper[18592]: E0308 03:54:35.202558 18592 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-master-0.189ac16c841ac06b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-master-0,UID:cdcecc61ff5eeb08bd2a3ac12599e4f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Killing,Message:Stopping container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:54:35.140685931 +0000 UTC m=+87.239440311,LastTimestamp:2026-03-08 03:54:35.140685931 +0000 UTC m=+87.239440311,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:54:35.258157 master-0 kubenswrapper[18592]: I0308 03:54:35.258103 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:54:35.258449 master-0 kubenswrapper[18592]: I0308 03:54:35.258420 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"077dd10388b9e3e48a07382126e86621\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:54:35.258604 master-0 kubenswrapper[18592]: I0308 03:54:35.258580 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:54:35.258797 master-0 kubenswrapper[18592]: I0308 03:54:35.258771 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"077dd10388b9e3e48a07382126e86621\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:54:35.259232 master-0 kubenswrapper[18592]: I0308 03:54:35.259199 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" 
(UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:54:35.263343 master-0 kubenswrapper[18592]: I0308 03:54:35.263279 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"077dd10388b9e3e48a07382126e86621\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:54:35.263507 master-0 kubenswrapper[18592]: I0308 03:54:35.263468 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:54:35.263670 master-0 kubenswrapper[18592]: I0308 03:54:35.263620 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:54:35.282254 master-0 kubenswrapper[18592]: E0308 03:54:35.282186 18592 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:54:35.365728 master-0 kubenswrapper[18592]: I0308 03:54:35.365623 18592 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:54:35.366041 master-0 kubenswrapper[18592]: I0308 03:54:35.365854 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:54:35.366041 master-0 kubenswrapper[18592]: I0308 03:54:35.365946 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:54:35.366203 master-0 kubenswrapper[18592]: I0308 03:54:35.366076 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:54:35.366203 master-0 kubenswrapper[18592]: I0308 03:54:35.366081 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:54:35.366203 master-0 kubenswrapper[18592]: I0308 03:54:35.366127 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:54:35.366386 master-0 kubenswrapper[18592]: I0308 03:54:35.366235 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"077dd10388b9e3e48a07382126e86621\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:54:35.366386 master-0 kubenswrapper[18592]: I0308 03:54:35.366349 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"077dd10388b9e3e48a07382126e86621\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:54:35.366386 master-0 kubenswrapper[18592]: I0308 03:54:35.366367 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:54:35.366759 master-0 kubenswrapper[18592]: I0308 03:54:35.366428 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-cert-dir\") pod \"kube-apiserver-master-0\" (UID: 
\"077dd10388b9e3e48a07382126e86621\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:54:35.366759 master-0 kubenswrapper[18592]: I0308 03:54:35.366484 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:54:35.366759 master-0 kubenswrapper[18592]: I0308 03:54:35.366529 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:54:35.366759 master-0 kubenswrapper[18592]: I0308 03:54:35.366574 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"077dd10388b9e3e48a07382126e86621\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:54:35.366759 master-0 kubenswrapper[18592]: I0308 03:54:35.366671 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:54:35.367150 master-0 kubenswrapper[18592]: I0308 03:54:35.366758 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-audit-dir\") pod 
\"kube-apiserver-master-0\" (UID: \"077dd10388b9e3e48a07382126e86621\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:54:35.367150 master-0 kubenswrapper[18592]: I0308 03:54:35.366778 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"077dd10388b9e3e48a07382126e86621\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:54:35.583609 master-0 kubenswrapper[18592]: I0308 03:54:35.583531 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:54:35.808960 master-0 kubenswrapper[18592]: I0308 03:54:35.808905 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"899242a15b2bdf3b4a04fb323647ca94","Type":"ContainerStarted","Data":"df513c5d488ca480c7cdf43c760132dd2a2dcdb7f7ca0b6dc288fddc78147003"} Mar 08 03:54:35.811387 master-0 kubenswrapper[18592]: I0308 03:54:35.811303 18592 generic.go:334] "Generic (PLEG): container finished" podID="55240670-0b16-485e-b5cf-f3e7bc4431f5" containerID="8993f2eb590c5e6fb3bab3a6e1230e8a4a446a446cec7cc0e36dedb8eff9076c" exitCode=0 Mar 08 03:54:35.811535 master-0 kubenswrapper[18592]: I0308 03:54:35.811467 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-3-master-0" event={"ID":"55240670-0b16-485e-b5cf-f3e7bc4431f5","Type":"ContainerDied","Data":"8993f2eb590c5e6fb3bab3a6e1230e8a4a446a446cec7cc0e36dedb8eff9076c"} Mar 08 03:54:35.813412 master-0 kubenswrapper[18592]: I0308 03:54:35.813341 18592 status_manager.go:851] "Failed to get status for pod" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get 
\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:54:35.814510 master-0 kubenswrapper[18592]: I0308 03:54:35.814443 18592 status_manager.go:851] "Failed to get status for pod" podUID="55240670-0b16-485e-b5cf-f3e7bc4431f5" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:54:35.816778 master-0 kubenswrapper[18592]: I0308 03:54:35.816722 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_cdcecc61ff5eeb08bd2a3ac12599e4f9/kube-apiserver-check-endpoints/0.log" Mar 08 03:54:35.818352 master-0 kubenswrapper[18592]: I0308 03:54:35.818314 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_cdcecc61ff5eeb08bd2a3ac12599e4f9/kube-apiserver-cert-syncer/0.log" Mar 08 03:54:35.819048 master-0 kubenswrapper[18592]: I0308 03:54:35.819009 18592 generic.go:334] "Generic (PLEG): container finished" podID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerID="0a6172e8ccc8a4efe2658181eb18b7f6b4fbfb74c1d8665ad23817e21967ec14" exitCode=0 Mar 08 03:54:35.819048 master-0 kubenswrapper[18592]: I0308 03:54:35.819035 18592 generic.go:334] "Generic (PLEG): container finished" podID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerID="5bc596a566a004204d8781e6880a298269208812a64f684e8f90b164a5a846fe" exitCode=0 Mar 08 03:54:35.819048 master-0 kubenswrapper[18592]: I0308 03:54:35.819048 18592 generic.go:334] "Generic (PLEG): container finished" podID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerID="865460c774c2766f8b86ebf8237c6f8af6ae97a526279d303aebe43f358dbff8" exitCode=0 Mar 08 03:54:35.819284 master-0 kubenswrapper[18592]: I0308 03:54:35.819058 18592 
generic.go:334] "Generic (PLEG): container finished" podID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerID="77d9f19c7fff32bc633b77d809c0704eaf44b3aee7eeaf009773338793ad2dd5" exitCode=2 Mar 08 03:54:35.819284 master-0 kubenswrapper[18592]: I0308 03:54:35.819069 18592 scope.go:117] "RemoveContainer" containerID="222e8ca389049069b4efae8be97f8ff91fe671c190224c8b6f05f39079d825cf" Mar 08 03:54:36.831285 master-0 kubenswrapper[18592]: I0308 03:54:36.831187 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_cdcecc61ff5eeb08bd2a3ac12599e4f9/kube-apiserver-cert-syncer/0.log" Mar 08 03:54:36.834236 master-0 kubenswrapper[18592]: I0308 03:54:36.834156 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"899242a15b2bdf3b4a04fb323647ca94","Type":"ContainerStarted","Data":"457057cf98087638be7d0746060df09cfb1327961dd59ed1c045db49966856a9"} Mar 08 03:54:36.835700 master-0 kubenswrapper[18592]: I0308 03:54:36.835454 18592 status_manager.go:851] "Failed to get status for pod" podUID="55240670-0b16-485e-b5cf-f3e7bc4431f5" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:54:36.835700 master-0 kubenswrapper[18592]: E0308 03:54:36.835477 18592 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:54:37.252002 master-0 kubenswrapper[18592]: I0308 03:54:37.251320 18592 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0" Mar 08 03:54:37.253258 master-0 kubenswrapper[18592]: I0308 03:54:37.252598 18592 status_manager.go:851] "Failed to get status for pod" podUID="55240670-0b16-485e-b5cf-f3e7bc4431f5" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:54:37.306238 master-0 kubenswrapper[18592]: I0308 03:54:37.303616 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/55240670-0b16-485e-b5cf-f3e7bc4431f5-var-lock\") pod \"55240670-0b16-485e-b5cf-f3e7bc4431f5\" (UID: \"55240670-0b16-485e-b5cf-f3e7bc4431f5\") " Mar 08 03:54:37.306238 master-0 kubenswrapper[18592]: I0308 03:54:37.303697 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/55240670-0b16-485e-b5cf-f3e7bc4431f5-kube-api-access\") pod \"55240670-0b16-485e-b5cf-f3e7bc4431f5\" (UID: \"55240670-0b16-485e-b5cf-f3e7bc4431f5\") " Mar 08 03:54:37.306238 master-0 kubenswrapper[18592]: I0308 03:54:37.303724 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/55240670-0b16-485e-b5cf-f3e7bc4431f5-kubelet-dir\") pod \"55240670-0b16-485e-b5cf-f3e7bc4431f5\" (UID: \"55240670-0b16-485e-b5cf-f3e7bc4431f5\") " Mar 08 03:54:37.306238 master-0 kubenswrapper[18592]: I0308 03:54:37.303752 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/55240670-0b16-485e-b5cf-f3e7bc4431f5-var-lock" (OuterVolumeSpecName: "var-lock") pod "55240670-0b16-485e-b5cf-f3e7bc4431f5" (UID: "55240670-0b16-485e-b5cf-f3e7bc4431f5"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:54:37.306238 master-0 kubenswrapper[18592]: I0308 03:54:37.304168 18592 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/55240670-0b16-485e-b5cf-f3e7bc4431f5-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 08 03:54:37.306238 master-0 kubenswrapper[18592]: I0308 03:54:37.304216 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/55240670-0b16-485e-b5cf-f3e7bc4431f5-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "55240670-0b16-485e-b5cf-f3e7bc4431f5" (UID: "55240670-0b16-485e-b5cf-f3e7bc4431f5"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:54:37.317603 master-0 kubenswrapper[18592]: I0308 03:54:37.314097 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55240670-0b16-485e-b5cf-f3e7bc4431f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "55240670-0b16-485e-b5cf-f3e7bc4431f5" (UID: "55240670-0b16-485e-b5cf-f3e7bc4431f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:54:37.407355 master-0 kubenswrapper[18592]: I0308 03:54:37.407286 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/55240670-0b16-485e-b5cf-f3e7bc4431f5-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 08 03:54:37.407355 master-0 kubenswrapper[18592]: I0308 03:54:37.407336 18592 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/55240670-0b16-485e-b5cf-f3e7bc4431f5-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 03:54:37.513958 master-0 kubenswrapper[18592]: I0308 03:54:37.513928 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_cdcecc61ff5eeb08bd2a3ac12599e4f9/kube-apiserver-cert-syncer/0.log" Mar 08 03:54:37.514930 master-0 kubenswrapper[18592]: I0308 03:54:37.514899 18592 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:54:37.515971 master-0 kubenswrapper[18592]: I0308 03:54:37.515935 18592 status_manager.go:851] "Failed to get status for pod" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:54:37.516493 master-0 kubenswrapper[18592]: I0308 03:54:37.516464 18592 status_manager.go:851] "Failed to get status for pod" podUID="55240670-0b16-485e-b5cf-f3e7bc4431f5" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:54:37.609257 master-0 kubenswrapper[18592]: I0308 03:54:37.609161 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-cert-dir\") pod \"cdcecc61ff5eeb08bd2a3ac12599e4f9\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " Mar 08 03:54:37.609538 master-0 kubenswrapper[18592]: I0308 03:54:37.609365 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-audit-dir\") pod \"cdcecc61ff5eeb08bd2a3ac12599e4f9\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " Mar 08 03:54:37.609538 master-0 kubenswrapper[18592]: I0308 03:54:37.609363 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "cdcecc61ff5eeb08bd2a3ac12599e4f9" (UID: "cdcecc61ff5eeb08bd2a3ac12599e4f9"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:54:37.609538 master-0 kubenswrapper[18592]: I0308 03:54:37.609495 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "cdcecc61ff5eeb08bd2a3ac12599e4f9" (UID: "cdcecc61ff5eeb08bd2a3ac12599e4f9"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:54:37.609811 master-0 kubenswrapper[18592]: I0308 03:54:37.609622 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-resource-dir\") pod \"cdcecc61ff5eeb08bd2a3ac12599e4f9\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " Mar 08 03:54:37.609811 master-0 kubenswrapper[18592]: I0308 03:54:37.609732 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "cdcecc61ff5eeb08bd2a3ac12599e4f9" (UID: "cdcecc61ff5eeb08bd2a3ac12599e4f9"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:54:37.610322 master-0 kubenswrapper[18592]: I0308 03:54:37.610269 18592 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-cert-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 03:54:37.610322 master-0 kubenswrapper[18592]: I0308 03:54:37.610308 18592 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-audit-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 03:54:37.610322 master-0 kubenswrapper[18592]: I0308 03:54:37.610323 18592 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 03:54:37.847773 master-0 kubenswrapper[18592]: I0308 03:54:37.847617 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-3-master-0" event={"ID":"55240670-0b16-485e-b5cf-f3e7bc4431f5","Type":"ContainerDied","Data":"0d6f4735df790157d7454c6cb3428443ea59d8eeac7fa002c3b3fcdd9602dcd4"} Mar 08 03:54:37.847773 master-0 kubenswrapper[18592]: I0308 03:54:37.847648 18592 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0" Mar 08 03:54:37.847773 master-0 kubenswrapper[18592]: I0308 03:54:37.847679 18592 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d6f4735df790157d7454c6cb3428443ea59d8eeac7fa002c3b3fcdd9602dcd4" Mar 08 03:54:37.852392 master-0 kubenswrapper[18592]: I0308 03:54:37.852332 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_cdcecc61ff5eeb08bd2a3ac12599e4f9/kube-apiserver-cert-syncer/0.log" Mar 08 03:54:37.853545 master-0 kubenswrapper[18592]: I0308 03:54:37.853498 18592 generic.go:334] "Generic (PLEG): container finished" podID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerID="80a13278743d26b7b1321c7095283277668741654b1e182af894d61a0ac675ff" exitCode=0 Mar 08 03:54:37.853707 master-0 kubenswrapper[18592]: I0308 03:54:37.853642 18592 scope.go:117] "RemoveContainer" containerID="0a6172e8ccc8a4efe2658181eb18b7f6b4fbfb74c1d8665ad23817e21967ec14" Mar 08 03:54:37.853783 master-0 kubenswrapper[18592]: I0308 03:54:37.853723 18592 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:54:37.855126 master-0 kubenswrapper[18592]: E0308 03:54:37.855046 18592 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:54:37.887262 master-0 kubenswrapper[18592]: I0308 03:54:37.887181 18592 scope.go:117] "RemoveContainer" containerID="5bc596a566a004204d8781e6880a298269208812a64f684e8f90b164a5a846fe" Mar 08 03:54:37.889765 master-0 kubenswrapper[18592]: I0308 03:54:37.889672 18592 status_manager.go:851] "Failed to get status for pod" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:54:37.890695 master-0 kubenswrapper[18592]: I0308 03:54:37.890618 18592 status_manager.go:851] "Failed to get status for pod" podUID="55240670-0b16-485e-b5cf-f3e7bc4431f5" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:54:37.901072 master-0 kubenswrapper[18592]: I0308 03:54:37.895033 18592 status_manager.go:851] "Failed to get status for pod" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:54:37.901072 master-0 kubenswrapper[18592]: I0308 03:54:37.896251 18592 status_manager.go:851] "Failed to 
get status for pod" podUID="55240670-0b16-485e-b5cf-f3e7bc4431f5" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:54:37.938032 master-0 kubenswrapper[18592]: I0308 03:54:37.937981 18592 scope.go:117] "RemoveContainer" containerID="865460c774c2766f8b86ebf8237c6f8af6ae97a526279d303aebe43f358dbff8" Mar 08 03:54:37.965611 master-0 kubenswrapper[18592]: I0308 03:54:37.965571 18592 scope.go:117] "RemoveContainer" containerID="77d9f19c7fff32bc633b77d809c0704eaf44b3aee7eeaf009773338793ad2dd5" Mar 08 03:54:37.984168 master-0 kubenswrapper[18592]: I0308 03:54:37.983479 18592 scope.go:117] "RemoveContainer" containerID="80a13278743d26b7b1321c7095283277668741654b1e182af894d61a0ac675ff" Mar 08 03:54:38.003166 master-0 kubenswrapper[18592]: I0308 03:54:38.003104 18592 scope.go:117] "RemoveContainer" containerID="1570887c60156b6fbbdb4d53007ec6f0d11589a7feaf962ad0cf0545fdd489d2" Mar 08 03:54:38.026861 master-0 kubenswrapper[18592]: I0308 03:54:38.026776 18592 scope.go:117] "RemoveContainer" containerID="0a6172e8ccc8a4efe2658181eb18b7f6b4fbfb74c1d8665ad23817e21967ec14" Mar 08 03:54:38.027547 master-0 kubenswrapper[18592]: E0308 03:54:38.027497 18592 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a6172e8ccc8a4efe2658181eb18b7f6b4fbfb74c1d8665ad23817e21967ec14\": container with ID starting with 0a6172e8ccc8a4efe2658181eb18b7f6b4fbfb74c1d8665ad23817e21967ec14 not found: ID does not exist" containerID="0a6172e8ccc8a4efe2658181eb18b7f6b4fbfb74c1d8665ad23817e21967ec14" Mar 08 03:54:38.027640 master-0 kubenswrapper[18592]: I0308 03:54:38.027541 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a6172e8ccc8a4efe2658181eb18b7f6b4fbfb74c1d8665ad23817e21967ec14"} err="failed 
to get container status \"0a6172e8ccc8a4efe2658181eb18b7f6b4fbfb74c1d8665ad23817e21967ec14\": rpc error: code = NotFound desc = could not find container \"0a6172e8ccc8a4efe2658181eb18b7f6b4fbfb74c1d8665ad23817e21967ec14\": container with ID starting with 0a6172e8ccc8a4efe2658181eb18b7f6b4fbfb74c1d8665ad23817e21967ec14 not found: ID does not exist" Mar 08 03:54:38.027640 master-0 kubenswrapper[18592]: I0308 03:54:38.027574 18592 scope.go:117] "RemoveContainer" containerID="5bc596a566a004204d8781e6880a298269208812a64f684e8f90b164a5a846fe" Mar 08 03:54:38.028920 master-0 kubenswrapper[18592]: E0308 03:54:38.028857 18592 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bc596a566a004204d8781e6880a298269208812a64f684e8f90b164a5a846fe\": container with ID starting with 5bc596a566a004204d8781e6880a298269208812a64f684e8f90b164a5a846fe not found: ID does not exist" containerID="5bc596a566a004204d8781e6880a298269208812a64f684e8f90b164a5a846fe" Mar 08 03:54:38.029079 master-0 kubenswrapper[18592]: I0308 03:54:38.028900 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bc596a566a004204d8781e6880a298269208812a64f684e8f90b164a5a846fe"} err="failed to get container status \"5bc596a566a004204d8781e6880a298269208812a64f684e8f90b164a5a846fe\": rpc error: code = NotFound desc = could not find container \"5bc596a566a004204d8781e6880a298269208812a64f684e8f90b164a5a846fe\": container with ID starting with 5bc596a566a004204d8781e6880a298269208812a64f684e8f90b164a5a846fe not found: ID does not exist" Mar 08 03:54:38.029079 master-0 kubenswrapper[18592]: I0308 03:54:38.028956 18592 scope.go:117] "RemoveContainer" containerID="865460c774c2766f8b86ebf8237c6f8af6ae97a526279d303aebe43f358dbff8" Mar 08 03:54:38.029851 master-0 kubenswrapper[18592]: E0308 03:54:38.029471 18592 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"865460c774c2766f8b86ebf8237c6f8af6ae97a526279d303aebe43f358dbff8\": container with ID starting with 865460c774c2766f8b86ebf8237c6f8af6ae97a526279d303aebe43f358dbff8 not found: ID does not exist" containerID="865460c774c2766f8b86ebf8237c6f8af6ae97a526279d303aebe43f358dbff8" Mar 08 03:54:38.029851 master-0 kubenswrapper[18592]: I0308 03:54:38.029529 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"865460c774c2766f8b86ebf8237c6f8af6ae97a526279d303aebe43f358dbff8"} err="failed to get container status \"865460c774c2766f8b86ebf8237c6f8af6ae97a526279d303aebe43f358dbff8\": rpc error: code = NotFound desc = could not find container \"865460c774c2766f8b86ebf8237c6f8af6ae97a526279d303aebe43f358dbff8\": container with ID starting with 865460c774c2766f8b86ebf8237c6f8af6ae97a526279d303aebe43f358dbff8 not found: ID does not exist" Mar 08 03:54:38.029851 master-0 kubenswrapper[18592]: I0308 03:54:38.029553 18592 scope.go:117] "RemoveContainer" containerID="77d9f19c7fff32bc633b77d809c0704eaf44b3aee7eeaf009773338793ad2dd5" Mar 08 03:54:38.030239 master-0 kubenswrapper[18592]: E0308 03:54:38.030174 18592 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77d9f19c7fff32bc633b77d809c0704eaf44b3aee7eeaf009773338793ad2dd5\": container with ID starting with 77d9f19c7fff32bc633b77d809c0704eaf44b3aee7eeaf009773338793ad2dd5 not found: ID does not exist" containerID="77d9f19c7fff32bc633b77d809c0704eaf44b3aee7eeaf009773338793ad2dd5" Mar 08 03:54:38.030239 master-0 kubenswrapper[18592]: I0308 03:54:38.030216 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77d9f19c7fff32bc633b77d809c0704eaf44b3aee7eeaf009773338793ad2dd5"} err="failed to get container status \"77d9f19c7fff32bc633b77d809c0704eaf44b3aee7eeaf009773338793ad2dd5\": rpc error: code = NotFound desc = could not find container 
\"77d9f19c7fff32bc633b77d809c0704eaf44b3aee7eeaf009773338793ad2dd5\": container with ID starting with 77d9f19c7fff32bc633b77d809c0704eaf44b3aee7eeaf009773338793ad2dd5 not found: ID does not exist" Mar 08 03:54:38.030239 master-0 kubenswrapper[18592]: I0308 03:54:38.030240 18592 scope.go:117] "RemoveContainer" containerID="80a13278743d26b7b1321c7095283277668741654b1e182af894d61a0ac675ff" Mar 08 03:54:38.030885 master-0 kubenswrapper[18592]: E0308 03:54:38.030779 18592 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80a13278743d26b7b1321c7095283277668741654b1e182af894d61a0ac675ff\": container with ID starting with 80a13278743d26b7b1321c7095283277668741654b1e182af894d61a0ac675ff not found: ID does not exist" containerID="80a13278743d26b7b1321c7095283277668741654b1e182af894d61a0ac675ff" Mar 08 03:54:38.030885 master-0 kubenswrapper[18592]: I0308 03:54:38.030818 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80a13278743d26b7b1321c7095283277668741654b1e182af894d61a0ac675ff"} err="failed to get container status \"80a13278743d26b7b1321c7095283277668741654b1e182af894d61a0ac675ff\": rpc error: code = NotFound desc = could not find container \"80a13278743d26b7b1321c7095283277668741654b1e182af894d61a0ac675ff\": container with ID starting with 80a13278743d26b7b1321c7095283277668741654b1e182af894d61a0ac675ff not found: ID does not exist" Mar 08 03:54:38.030885 master-0 kubenswrapper[18592]: I0308 03:54:38.030871 18592 scope.go:117] "RemoveContainer" containerID="1570887c60156b6fbbdb4d53007ec6f0d11589a7feaf962ad0cf0545fdd489d2" Mar 08 03:54:38.031269 master-0 kubenswrapper[18592]: E0308 03:54:38.031225 18592 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1570887c60156b6fbbdb4d53007ec6f0d11589a7feaf962ad0cf0545fdd489d2\": container with ID starting with 
1570887c60156b6fbbdb4d53007ec6f0d11589a7feaf962ad0cf0545fdd489d2 not found: ID does not exist" containerID="1570887c60156b6fbbdb4d53007ec6f0d11589a7feaf962ad0cf0545fdd489d2" Mar 08 03:54:38.031269 master-0 kubenswrapper[18592]: I0308 03:54:38.031256 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1570887c60156b6fbbdb4d53007ec6f0d11589a7feaf962ad0cf0545fdd489d2"} err="failed to get container status \"1570887c60156b6fbbdb4d53007ec6f0d11589a7feaf962ad0cf0545fdd489d2\": rpc error: code = NotFound desc = could not find container \"1570887c60156b6fbbdb4d53007ec6f0d11589a7feaf962ad0cf0545fdd489d2\": container with ID starting with 1570887c60156b6fbbdb4d53007ec6f0d11589a7feaf962ad0cf0545fdd489d2 not found: ID does not exist" Mar 08 03:54:38.161908 master-0 kubenswrapper[18592]: I0308 03:54:38.161725 18592 status_manager.go:851] "Failed to get status for pod" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:54:38.162356 master-0 kubenswrapper[18592]: I0308 03:54:38.162312 18592 status_manager.go:851] "Failed to get status for pod" podUID="55240670-0b16-485e-b5cf-f3e7bc4431f5" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:54:38.168969 master-0 kubenswrapper[18592]: I0308 03:54:38.168890 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" path="/var/lib/kubelet/pods/cdcecc61ff5eeb08bd2a3ac12599e4f9/volumes" Mar 08 03:54:38.754081 master-0 kubenswrapper[18592]: E0308 03:54:38.753638 18592 event.go:368] "Unable to write event (may retry 
after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-master-0.189ac16c841ac06b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-master-0,UID:cdcecc61ff5eeb08bd2a3ac12599e4f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Killing,Message:Stopping container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:54:35.140685931 +0000 UTC m=+87.239440311,LastTimestamp:2026-03-08 03:54:35.140685931 +0000 UTC m=+87.239440311,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:54:39.135947 master-0 kubenswrapper[18592]: I0308 03:54:39.135815 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cacb9582-2132-4543-8a31-7b100ba4dd2f-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"cacb9582-2132-4543-8a31-7b100ba4dd2f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:54:39.136735 master-0 kubenswrapper[18592]: E0308 03:54:39.136630 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cacb9582-2132-4543-8a31-7b100ba4dd2f-alertmanager-trusted-ca-bundle podName:cacb9582-2132-4543-8a31-7b100ba4dd2f nodeName:}" failed. No retries permitted until 2026-03-08 03:55:43.136606715 +0000 UTC m=+155.235361075 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/cacb9582-2132-4543-8a31-7b100ba4dd2f-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "cacb9582-2132-4543-8a31-7b100ba4dd2f") : configmap references non-existent config key: ca-bundle.crt Mar 08 03:54:40.904143 master-0 kubenswrapper[18592]: E0308 03:54:40.904046 18592 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:54:40.905172 master-0 kubenswrapper[18592]: E0308 03:54:40.904644 18592 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:54:40.905545 master-0 kubenswrapper[18592]: E0308 03:54:40.905429 18592 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:54:40.906584 master-0 kubenswrapper[18592]: E0308 03:54:40.906522 18592 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:54:40.907209 master-0 kubenswrapper[18592]: E0308 03:54:40.907159 18592 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:54:40.907209 master-0 kubenswrapper[18592]: I0308 03:54:40.907196 18592 
controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 08 03:54:40.907924 master-0 kubenswrapper[18592]: E0308 03:54:40.907812 18592 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="200ms" Mar 08 03:54:41.109534 master-0 kubenswrapper[18592]: E0308 03:54:41.109442 18592 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="400ms" Mar 08 03:54:41.511289 master-0 kubenswrapper[18592]: E0308 03:54:41.511184 18592 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="800ms" Mar 08 03:54:42.313665 master-0 kubenswrapper[18592]: E0308 03:54:42.313561 18592 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="1.6s" Mar 08 03:54:43.915554 master-0 kubenswrapper[18592]: E0308 03:54:43.915390 18592 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="3.2s" Mar 08 03:54:44.624048 master-0 kubenswrapper[18592]: I0308 03:54:44.623955 18592 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a26c661f-f843-45c5-85f0-2c2f72cbf580-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:54:44.624327 master-0 kubenswrapper[18592]: E0308 03:54:44.624247 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a26c661f-f843-45c5-85f0-2c2f72cbf580-prometheus-trusted-ca-bundle podName:a26c661f-f843-45c5-85f0-2c2f72cbf580 nodeName:}" failed. No retries permitted until 2026-03-08 03:55:48.624222086 +0000 UTC m=+160.722976466 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "prometheus-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/a26c661f-f843-45c5-85f0-2c2f72cbf580-prometheus-trusted-ca-bundle") pod "prometheus-k8s-0" (UID: "a26c661f-f843-45c5-85f0-2c2f72cbf580") : configmap references non-existent config key: ca-bundle.crt Mar 08 03:54:45.947956 master-0 kubenswrapper[18592]: I0308 03:54:45.946598 18592 generic.go:334] "Generic (PLEG): container finished" podID="a1a56802af72ce1aac6b5077f1695ac0" containerID="255dd70f3aa78d8d4e9fb681404034a533a64980f735eecd5cf5d8b6ad4838a5" exitCode=1 Mar 08 03:54:45.947956 master-0 kubenswrapper[18592]: I0308 03:54:45.946640 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"a1a56802af72ce1aac6b5077f1695ac0","Type":"ContainerDied","Data":"255dd70f3aa78d8d4e9fb681404034a533a64980f735eecd5cf5d8b6ad4838a5"} Mar 08 03:54:45.947956 master-0 kubenswrapper[18592]: I0308 03:54:45.946676 18592 scope.go:117] "RemoveContainer" containerID="f5a0acfb3a3f4f285f366c3abcb3f9d3bebb3626e4a976de0dab27a634745185" Mar 08 03:54:45.947956 master-0 kubenswrapper[18592]: I0308 03:54:45.947198 18592 scope.go:117] "RemoveContainer" 
containerID="255dd70f3aa78d8d4e9fb681404034a533a64980f735eecd5cf5d8b6ad4838a5" Mar 08 03:54:45.949530 master-0 kubenswrapper[18592]: I0308 03:54:45.949377 18592 status_manager.go:851] "Failed to get status for pod" podUID="a1a56802af72ce1aac6b5077f1695ac0" pod="kube-system/bootstrap-kube-scheduler-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/kube-system/pods/bootstrap-kube-scheduler-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:54:45.950425 master-0 kubenswrapper[18592]: I0308 03:54:45.950002 18592 status_manager.go:851] "Failed to get status for pod" podUID="55240670-0b16-485e-b5cf-f3e7bc4431f5" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:54:46.955154 master-0 kubenswrapper[18592]: I0308 03:54:46.955080 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"a1a56802af72ce1aac6b5077f1695ac0","Type":"ContainerStarted","Data":"f7171701edb795064e29edd4a52aeb0af591e01a8efb0166607b6c1961305d36"} Mar 08 03:54:46.956392 master-0 kubenswrapper[18592]: I0308 03:54:46.956308 18592 status_manager.go:851] "Failed to get status for pod" podUID="55240670-0b16-485e-b5cf-f3e7bc4431f5" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:54:46.957260 master-0 kubenswrapper[18592]: I0308 03:54:46.957186 18592 status_manager.go:851] "Failed to get status for pod" podUID="a1a56802af72ce1aac6b5077f1695ac0" pod="kube-system/bootstrap-kube-scheduler-master-0" err="Get 
\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/kube-system/pods/bootstrap-kube-scheduler-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:54:47.117243 master-0 kubenswrapper[18592]: E0308 03:54:47.117099 18592 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="6.4s" Mar 08 03:54:48.150042 master-0 kubenswrapper[18592]: I0308 03:54:48.149655 18592 status_manager.go:851] "Failed to get status for pod" podUID="55240670-0b16-485e-b5cf-f3e7bc4431f5" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:54:48.152143 master-0 kubenswrapper[18592]: I0308 03:54:48.152059 18592 status_manager.go:851] "Failed to get status for pod" podUID="a1a56802af72ce1aac6b5077f1695ac0" pod="kube-system/bootstrap-kube-scheduler-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/kube-system/pods/bootstrap-kube-scheduler-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:54:48.756547 master-0 kubenswrapper[18592]: E0308 03:54:48.756236 18592 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-master-0.189ac16c841ac06b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-master-0,UID:cdcecc61ff5eeb08bd2a3ac12599e4f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Killing,Message:Stopping container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:54:35.140685931 +0000 UTC m=+87.239440311,LastTimestamp:2026-03-08 03:54:35.140685931 +0000 UTC m=+87.239440311,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:54:49.143240 master-0 kubenswrapper[18592]: I0308 03:54:49.143112 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:54:49.144493 master-0 kubenswrapper[18592]: I0308 03:54:49.144441 18592 status_manager.go:851] "Failed to get status for pod" podUID="a1a56802af72ce1aac6b5077f1695ac0" pod="kube-system/bootstrap-kube-scheduler-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/kube-system/pods/bootstrap-kube-scheduler-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:54:49.145436 master-0 kubenswrapper[18592]: I0308 03:54:49.145300 18592 status_manager.go:851] "Failed to get status for pod" podUID="55240670-0b16-485e-b5cf-f3e7bc4431f5" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:54:49.165731 master-0 kubenswrapper[18592]: I0308 03:54:49.165672 18592 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="7cf79b50-4b45-4fa5-a6f3-5c2de91d6bed" Mar 08 03:54:49.166277 master-0 kubenswrapper[18592]: I0308 03:54:49.166222 18592 
mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="7cf79b50-4b45-4fa5-a6f3-5c2de91d6bed" Mar 08 03:54:49.167340 master-0 kubenswrapper[18592]: E0308 03:54:49.167283 18592 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:54:49.167790 master-0 kubenswrapper[18592]: I0308 03:54:49.167760 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:54:49.190131 master-0 kubenswrapper[18592]: W0308 03:54:49.190092 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod077dd10388b9e3e48a07382126e86621.slice/crio-7be6477a966049755f430c8a824ad9c2ca097dcd7559ae6f1ff469df59bc4bc2 WatchSource:0}: Error finding container 7be6477a966049755f430c8a824ad9c2ca097dcd7559ae6f1ff469df59bc4bc2: Status 404 returned error can't find the container with id 7be6477a966049755f430c8a824ad9c2ca097dcd7559ae6f1ff469df59bc4bc2 Mar 08 03:54:49.979995 master-0 kubenswrapper[18592]: I0308 03:54:49.979911 18592 generic.go:334] "Generic (PLEG): container finished" podID="077dd10388b9e3e48a07382126e86621" containerID="a35597610df1fae247e29fc70e7550b17b768468fb6d442a71528493ce8f3635" exitCode=0 Mar 08 03:54:49.979995 master-0 kubenswrapper[18592]: I0308 03:54:49.979972 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"077dd10388b9e3e48a07382126e86621","Type":"ContainerDied","Data":"a35597610df1fae247e29fc70e7550b17b768468fb6d442a71528493ce8f3635"} Mar 08 03:54:49.980501 master-0 kubenswrapper[18592]: I0308 03:54:49.980011 18592 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"077dd10388b9e3e48a07382126e86621","Type":"ContainerStarted","Data":"7be6477a966049755f430c8a824ad9c2ca097dcd7559ae6f1ff469df59bc4bc2"} Mar 08 03:54:49.980501 master-0 kubenswrapper[18592]: I0308 03:54:49.980378 18592 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="7cf79b50-4b45-4fa5-a6f3-5c2de91d6bed" Mar 08 03:54:49.980501 master-0 kubenswrapper[18592]: I0308 03:54:49.980402 18592 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="7cf79b50-4b45-4fa5-a6f3-5c2de91d6bed" Mar 08 03:54:49.981472 master-0 kubenswrapper[18592]: I0308 03:54:49.981402 18592 status_manager.go:851] "Failed to get status for pod" podUID="a1a56802af72ce1aac6b5077f1695ac0" pod="kube-system/bootstrap-kube-scheduler-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/kube-system/pods/bootstrap-kube-scheduler-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:54:49.981696 master-0 kubenswrapper[18592]: E0308 03:54:49.981640 18592 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:54:49.982347 master-0 kubenswrapper[18592]: I0308 03:54:49.982284 18592 status_manager.go:851] "Failed to get status for pod" podUID="55240670-0b16-485e-b5cf-f3e7bc4431f5" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:54:50.993447 master-0 kubenswrapper[18592]: I0308 03:54:50.992999 18592 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"077dd10388b9e3e48a07382126e86621","Type":"ContainerStarted","Data":"3d937ddeff4a562c97267b5e3e7157a51ad8f928d1db69fb1e607bd363c3cf25"} Mar 08 03:54:50.993447 master-0 kubenswrapper[18592]: I0308 03:54:50.993073 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"077dd10388b9e3e48a07382126e86621","Type":"ContainerStarted","Data":"b4a05e47377b25ea4d9a9af26e5631afe65dee09a108205893226dbc248e594c"} Mar 08 03:54:50.993447 master-0 kubenswrapper[18592]: I0308 03:54:50.993097 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"077dd10388b9e3e48a07382126e86621","Type":"ContainerStarted","Data":"101e91519fa54cd23690f6a36dea77798470079f4382e04ac20e58f00a7180da"} Mar 08 03:54:52.002862 master-0 kubenswrapper[18592]: I0308 03:54:52.002798 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"077dd10388b9e3e48a07382126e86621","Type":"ContainerStarted","Data":"4d0d18c78b2f0279d72c1060fc00718b3f77f330ad1bd6f44039320acf418c8d"} Mar 08 03:54:52.003418 master-0 kubenswrapper[18592]: I0308 03:54:52.003400 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"077dd10388b9e3e48a07382126e86621","Type":"ContainerStarted","Data":"da6fb8324087a24afc43afc026ff79a419163b34f188875acd9bb6ca7456a181"} Mar 08 03:54:52.003483 master-0 kubenswrapper[18592]: I0308 03:54:52.003250 18592 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="7cf79b50-4b45-4fa5-a6f3-5c2de91d6bed" Mar 08 03:54:52.003544 master-0 kubenswrapper[18592]: I0308 03:54:52.003532 18592 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="7cf79b50-4b45-4fa5-a6f3-5c2de91d6bed" 
Mar 08 03:54:52.003695 master-0 kubenswrapper[18592]: I0308 03:54:52.003543 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:54:54.168518 master-0 kubenswrapper[18592]: I0308 03:54:54.168378 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:54:54.168518 master-0 kubenswrapper[18592]: I0308 03:54:54.168453 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:54:54.188694 master-0 kubenswrapper[18592]: I0308 03:54:54.188640 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:54:57.042353 master-0 kubenswrapper[18592]: I0308 03:54:57.040413 18592 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:54:58.042415 master-0 kubenswrapper[18592]: I0308 03:54:58.042345 18592 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="7cf79b50-4b45-4fa5-a6f3-5c2de91d6bed" Mar 08 03:54:58.042415 master-0 kubenswrapper[18592]: I0308 03:54:58.042398 18592 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="7cf79b50-4b45-4fa5-a6f3-5c2de91d6bed" Mar 08 03:54:58.046094 master-0 kubenswrapper[18592]: I0308 03:54:58.046044 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:54:58.158540 master-0 kubenswrapper[18592]: I0308 03:54:58.158476 18592 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-0" oldPodUID="077dd10388b9e3e48a07382126e86621" podUID="8266feeb-c6af-4652-9055-79274b93e9bb" Mar 08 
03:54:59.049530 master-0 kubenswrapper[18592]: I0308 03:54:59.049472 18592 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="7cf79b50-4b45-4fa5-a6f3-5c2de91d6bed" Mar 08 03:54:59.049530 master-0 kubenswrapper[18592]: I0308 03:54:59.049509 18592 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="7cf79b50-4b45-4fa5-a6f3-5c2de91d6bed" Mar 08 03:54:59.053788 master-0 kubenswrapper[18592]: I0308 03:54:59.053716 18592 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-0" oldPodUID="077dd10388b9e3e48a07382126e86621" podUID="8266feeb-c6af-4652-9055-79274b93e9bb" Mar 08 03:55:07.492156 master-0 kubenswrapper[18592]: I0308 03:55:07.492084 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"kube-root-ca.crt" Mar 08 03:55:07.810415 master-0 kubenswrapper[18592]: I0308 03:55:07.810308 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 08 03:55:07.881077 master-0 kubenswrapper[18592]: I0308 03:55:07.881027 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"baremetal-kube-rbac-proxy" Mar 08 03:55:08.260026 master-0 kubenswrapper[18592]: I0308 03:55:08.259954 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy" Mar 08 03:55:08.286715 master-0 kubenswrapper[18592]: I0308 03:55:08.286673 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-web-config" Mar 08 03:55:08.473365 master-0 kubenswrapper[18592]: I0308 03:55:08.473311 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"kube-root-ca.crt" Mar 08 03:55:08.669143 master-0 
kubenswrapper[18592]: I0308 03:55:08.668983 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-dockercfg-rnqsm"
Mar 08 03:55:08.926097 master-0 kubenswrapper[18592]: I0308 03:55:08.925929 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 08 03:55:08.952020 master-0 kubenswrapper[18592]: I0308 03:55:08.951960 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Mar 08 03:55:09.016793 master-0 kubenswrapper[18592]: I0308 03:55:09.016712 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"service-ca-bundle"
Mar 08 03:55:09.041000 master-0 kubenswrapper[18592]: I0308 03:55:09.040936 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 08 03:55:09.149375 master-0 kubenswrapper[18592]: I0308 03:55:09.149302 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 08 03:55:09.189727 master-0 kubenswrapper[18592]: I0308 03:55:09.189667 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 08 03:55:09.428632 master-0 kubenswrapper[18592]: I0308 03:55:09.428551 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Mar 08 03:55:09.625076 master-0 kubenswrapper[18592]: I0308 03:55:09.624911 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-7dkp2"
Mar 08 03:55:09.682995 master-0 kubenswrapper[18592]: I0308 03:55:09.682928 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Mar 08 03:55:09.817465 master-0 kubenswrapper[18592]: I0308 03:55:09.817393 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-hdmbr"
Mar 08 03:55:09.972060 master-0 kubenswrapper[18592]: I0308 03:55:09.971943 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 08 03:55:10.052868 master-0 kubenswrapper[18592]: I0308 03:55:10.052771 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Mar 08 03:55:10.131103 master-0 kubenswrapper[18592]: I0308 03:55:10.130972 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy-cluster-autoscaler-operator"
Mar 08 03:55:10.193256 master-0 kubenswrapper[18592]: I0308 03:55:10.193198 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle"
Mar 08 03:55:10.201947 master-0 kubenswrapper[18592]: I0308 03:55:10.201893 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Mar 08 03:55:10.215306 master-0 kubenswrapper[18592]: I0308 03:55:10.215230 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config"
Mar 08 03:55:10.258432 master-0 kubenswrapper[18592]: I0308 03:55:10.258295 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 08 03:55:10.296587 master-0 kubenswrapper[18592]: I0308 03:55:10.296525 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-metric"
Mar 08 03:55:10.389548 master-0 kubenswrapper[18592]: I0308 03:55:10.389465 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls-assets-0"
Mar 08 03:55:10.476415 master-0 kubenswrapper[18592]: I0308 03:55:10.476347 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 08 03:55:10.647597 master-0 kubenswrapper[18592]: I0308 03:55:10.647428 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Mar 08 03:55:10.655026 master-0 kubenswrapper[18592]: I0308 03:55:10.654959 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-g8pwp"
Mar 08 03:55:10.788973 master-0 kubenswrapper[18592]: I0308 03:55:10.788898 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Mar 08 03:55:10.888632 master-0 kubenswrapper[18592]: I0308 03:55:10.888526 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cluster-cloud-controller-manager-dockercfg-lnbcj"
Mar 08 03:55:11.029716 master-0 kubenswrapper[18592]: I0308 03:55:11.029654 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-k8s-rulefiles-0"
Mar 08 03:55:11.206655 master-0 kubenswrapper[18592]: I0308 03:55:11.206507 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 08 03:55:11.221562 master-0 kubenswrapper[18592]: I0308 03:55:11.221497 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Mar 08 03:55:11.221727 master-0 kubenswrapper[18592]: I0308 03:55:11.221575 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 08 03:55:11.276447 master-0 kubenswrapper[18592]: I0308 03:55:11.276364 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Mar 08 03:55:11.385918 master-0 kubenswrapper[18592]: I0308 03:55:11.385723 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 08 03:55:11.390586 master-0 kubenswrapper[18592]: I0308 03:55:11.390515 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"monitoring-plugin-cert"
Mar 08 03:55:11.431500 master-0 kubenswrapper[18592]: I0308 03:55:11.431428 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 08 03:55:11.509962 master-0 kubenswrapper[18592]: I0308 03:55:11.509879 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 08 03:55:11.559321 master-0 kubenswrapper[18592]: I0308 03:55:11.559246 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-web-config"
Mar 08 03:55:11.717997 master-0 kubenswrapper[18592]: I0308 03:55:11.717904 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"catalogd-trusted-ca-bundle"
Mar 08 03:55:11.720493 master-0 kubenswrapper[18592]: I0308 03:55:11.720447 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 08 03:55:11.756591 master-0 kubenswrapper[18592]: I0308 03:55:11.756509 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Mar 08 03:55:11.771054 master-0 kubenswrapper[18592]: I0308 03:55:11.770900 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles"
Mar 08 03:55:11.791591 master-0 kubenswrapper[18592]: I0308 03:55:11.791511 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 08 03:55:11.873573 master-0 kubenswrapper[18592]: I0308 03:55:11.873487 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls"
Mar 08 03:55:12.200137 master-0 kubenswrapper[18592]: I0308 03:55:12.200068 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-55li3r6nupslu"
Mar 08 03:55:12.231180 master-0 kubenswrapper[18592]: I0308 03:55:12.231063 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"openshift-service-ca.crt"
Mar 08 03:55:12.239678 master-0 kubenswrapper[18592]: I0308 03:55:12.239631 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 08 03:55:12.383671 master-0 kubenswrapper[18592]: I0308 03:55:12.383609 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config"
Mar 08 03:55:12.414710 master-0 kubenswrapper[18592]: I0308 03:55:12.414647 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 08 03:55:12.439424 master-0 kubenswrapper[18592]: I0308 03:55:12.439380 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Mar 08 03:55:12.449753 master-0 kubenswrapper[18592]: I0308 03:55:12.449671 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 08 03:55:12.493585 master-0 kubenswrapper[18592]: I0308 03:55:12.493476 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 08 03:55:12.735395 master-0 kubenswrapper[18592]: I0308 03:55:12.735336 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-n8nt7"
Mar 08 03:55:12.759290 master-0 kubenswrapper[18592]: I0308 03:55:12.759175 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-27jjw"
Mar 08 03:55:12.788412 master-0 kubenswrapper[18592]: I0308 03:55:12.788356 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Mar 08 03:55:12.834504 master-0 kubenswrapper[18592]: I0308 03:55:12.834439 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 08 03:55:12.853338 master-0 kubenswrapper[18592]: I0308 03:55:12.853271 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 08 03:55:12.893045 master-0 kubenswrapper[18592]: I0308 03:55:12.892964 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 08 03:55:12.899802 master-0 kubenswrapper[18592]: I0308 03:55:12.899760 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 08 03:55:12.902145 master-0 kubenswrapper[18592]: I0308 03:55:12.902095 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 08 03:55:12.924855 master-0 kubenswrapper[18592]: I0308 03:55:12.924767 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls"
Mar 08 03:55:12.974850 master-0 kubenswrapper[18592]: I0308 03:55:12.974755 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-catalogd"/"catalogserver-cert"
Mar 08 03:55:12.990130 master-0 kubenswrapper[18592]: I0308 03:55:12.990070 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Mar 08 03:55:13.006628 master-0 kubenswrapper[18592]: I0308 03:55:13.006543 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca"
Mar 08 03:55:13.009100 master-0 kubenswrapper[18592]: I0308 03:55:13.009051 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls"
Mar 08 03:55:13.010701 master-0 kubenswrapper[18592]: I0308 03:55:13.010575 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"kube-root-ca.crt"
Mar 08 03:55:13.061314 master-0 kubenswrapper[18592]: I0308 03:55:13.061269 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Mar 08 03:55:13.098640 master-0 kubenswrapper[18592]: I0308 03:55:13.098542 18592 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 08 03:55:13.168939 master-0 kubenswrapper[18592]: I0308 03:55:13.168864 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 08 03:55:13.302862 master-0 kubenswrapper[18592]: I0308 03:55:13.302666 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Mar 08 03:55:13.311450 master-0 kubenswrapper[18592]: I0308 03:55:13.311391 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Mar 08 03:55:13.389176 master-0 kubenswrapper[18592]: I0308 03:55:13.389037 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 08 03:55:13.428013 master-0 kubenswrapper[18592]: I0308 03:55:13.427917 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls"
Mar 08 03:55:13.473122 master-0 kubenswrapper[18592]: I0308 03:55:13.473079 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 08 03:55:13.553194 master-0 kubenswrapper[18592]: I0308 03:55:13.553056 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 08 03:55:13.567220 master-0 kubenswrapper[18592]: I0308 03:55:13.567113 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs"
Mar 08 03:55:13.688052 master-0 kubenswrapper[18592]: I0308 03:55:13.687967 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 08 03:55:13.697092 master-0 kubenswrapper[18592]: I0308 03:55:13.697039 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 08 03:55:13.829514 master-0 kubenswrapper[18592]: I0308 03:55:13.829347 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 08 03:55:13.850703 master-0 kubenswrapper[18592]: I0308 03:55:13.850592 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt"
Mar 08 03:55:13.863387 master-0 kubenswrapper[18592]: I0308 03:55:13.863335 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt"
Mar 08 03:55:13.967749 master-0 kubenswrapper[18592]: I0308 03:55:13.967663 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 08 03:55:13.996792 master-0 kubenswrapper[18592]: I0308 03:55:13.996724 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls"
Mar 08 03:55:14.005441 master-0 kubenswrapper[18592]: I0308 03:55:14.005382 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"openshift-insights-serving-cert"
Mar 08 03:55:14.015673 master-0 kubenswrapper[18592]: I0308 03:55:14.015580 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 08 03:55:14.036207 master-0 kubenswrapper[18592]: I0308 03:55:14.036144 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-serving-cert"
Mar 08 03:55:14.036815 master-0 kubenswrapper[18592]: I0308 03:55:14.036778 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Mar 08 03:55:14.065350 master-0 kubenswrapper[18592]: I0308 03:55:14.065291 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt"
Mar 08 03:55:14.116304 master-0 kubenswrapper[18592]: I0308 03:55:14.116138 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-dockercfg-cg9nh"
Mar 08 03:55:14.205027 master-0 kubenswrapper[18592]: I0308 03:55:14.204961 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 08 03:55:14.265369 master-0 kubenswrapper[18592]: I0308 03:55:14.265305 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 08 03:55:14.305758 master-0 kubenswrapper[18592]: I0308 03:55:14.305710 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 08 03:55:14.336316 master-0 kubenswrapper[18592]: I0308 03:55:14.336280 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 08 03:55:14.370356 master-0 kubenswrapper[18592]: I0308 03:55:14.370229 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Mar 08 03:55:14.520766 master-0 kubenswrapper[18592]: I0308 03:55:14.520707 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-hbsn4"
Mar 08 03:55:14.528846 master-0 kubenswrapper[18592]: I0308 03:55:14.528736 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 08 03:55:14.539299 master-0 kubenswrapper[18592]: I0308 03:55:14.539232 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-tls"
Mar 08 03:55:14.580389 master-0 kubenswrapper[18592]: I0308 03:55:14.580301 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Mar 08 03:55:14.733093 master-0 kubenswrapper[18592]: I0308 03:55:14.733045 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-rbac-proxy"
Mar 08 03:55:14.773203 master-0 kubenswrapper[18592]: I0308 03:55:14.773138 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 08 03:55:14.903480 master-0 kubenswrapper[18592]: I0308 03:55:14.903392 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt"
Mar 08 03:55:14.918539 master-0 kubenswrapper[18592]: I0308 03:55:14.918498 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Mar 08 03:55:15.066721 master-0 kubenswrapper[18592]: I0308 03:55:15.066595 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Mar 08 03:55:15.104594 master-0 kubenswrapper[18592]: I0308 03:55:15.104536 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls-assets-0"
Mar 08 03:55:15.213868 master-0 kubenswrapper[18592]: I0308 03:55:15.211006 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Mar 08 03:55:15.214674 master-0 kubenswrapper[18592]: I0308 03:55:15.214604 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-92fqc"
Mar 08 03:55:15.236709 master-0 kubenswrapper[18592]: I0308 03:55:15.236667 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 08 03:55:15.285424 master-0 kubenswrapper[18592]: I0308 03:55:15.285388 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Mar 08 03:55:15.286919 master-0 kubenswrapper[18592]: I0308 03:55:15.286886 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"default-dockercfg-c6d9k"
Mar 08 03:55:15.297173 master-0 kubenswrapper[18592]: I0308 03:55:15.297097 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 08 03:55:15.309709 master-0 kubenswrapper[18592]: I0308 03:55:15.309676 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 08 03:55:15.316635 master-0 kubenswrapper[18592]: I0308 03:55:15.316574 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert"
Mar 08 03:55:15.337629 master-0 kubenswrapper[18592]: I0308 03:55:15.337512 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 08 03:55:15.459772 master-0 kubenswrapper[18592]: I0308 03:55:15.459704 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-generated"
Mar 08 03:55:15.483863 master-0 kubenswrapper[18592]: I0308 03:55:15.483770 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 08 03:55:15.494762 master-0 kubenswrapper[18592]: I0308 03:55:15.494712 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"openshift-service-ca.crt"
Mar 08 03:55:15.497372 master-0 kubenswrapper[18592]: I0308 03:55:15.497301 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 08 03:55:15.556239 master-0 kubenswrapper[18592]: I0308 03:55:15.556194 18592 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Mar 08 03:55:15.580602 master-0 kubenswrapper[18592]: I0308 03:55:15.580542 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 08 03:55:15.594916 master-0 kubenswrapper[18592]: I0308 03:55:15.594796 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 08 03:55:15.599258 master-0 kubenswrapper[18592]: I0308 03:55:15.599221 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt"
Mar 08 03:55:15.692994 master-0 kubenswrapper[18592]: I0308 03:55:15.692904 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls"
Mar 08 03:55:15.719861 master-0 kubenswrapper[18592]: I0308 03:55:15.719771 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 08 03:55:15.740638 master-0 kubenswrapper[18592]: I0308 03:55:15.740536 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Mar 08 03:55:15.746669 master-0 kubenswrapper[18592]: I0308 03:55:15.746627 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 08 03:55:15.758667 master-0 kubenswrapper[18592]: I0308 03:55:15.758619 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 08 03:55:15.805810 master-0 kubenswrapper[18592]: I0308 03:55:15.805750 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-grpc-tls-44sfif47rohlm"
Mar 08 03:55:15.857285 master-0 kubenswrapper[18592]: I0308 03:55:15.857095 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt"
Mar 08 03:55:15.937981 master-0 kubenswrapper[18592]: I0308 03:55:15.937915 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 08 03:55:15.976025 master-0 kubenswrapper[18592]: I0308 03:55:15.975965 18592 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Mar 08 03:55:16.014987 master-0 kubenswrapper[18592]: I0308 03:55:16.014904 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert"
Mar 08 03:55:16.029601 master-0 kubenswrapper[18592]: I0308 03:55:16.029559 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-dockercfg-gnr8f"
Mar 08 03:55:16.081418 master-0 kubenswrapper[18592]: I0308 03:55:16.081347 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 08 03:55:16.140174 master-0 kubenswrapper[18592]: I0308 03:55:16.140023 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 08 03:55:16.168600 master-0 kubenswrapper[18592]: I0308 03:55:16.168525 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 08 03:55:16.283564 master-0 kubenswrapper[18592]: I0308 03:55:16.283476 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 08 03:55:16.452719 master-0 kubenswrapper[18592]: I0308 03:55:16.452625 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Mar 08 03:55:16.531910 master-0 kubenswrapper[18592]: I0308 03:55:16.531394 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls"
Mar 08 03:55:16.587047 master-0 kubenswrapper[18592]: I0308 03:55:16.586953 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 08 03:55:16.755744 master-0 kubenswrapper[18592]: I0308 03:55:16.755577 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 08 03:55:16.772032 master-0 kubenswrapper[18592]: I0308 03:55:16.771963 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-rules"
Mar 08 03:55:16.797764 master-0 kubenswrapper[18592]: I0308 03:55:16.797689 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"trusted-ca-bundle"
Mar 08 03:55:16.836089 master-0 kubenswrapper[18592]: I0308 03:55:16.836016 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 08 03:55:16.854680 master-0 kubenswrapper[18592]: I0308 03:55:16.854622 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 08 03:55:16.871733 master-0 kubenswrapper[18592]: I0308 03:55:16.871675 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Mar 08 03:55:16.909301 master-0 kubenswrapper[18592]: I0308 03:55:16.909211 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Mar 08 03:55:16.911415 master-0 kubenswrapper[18592]: I0308 03:55:16.911369 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-config"
Mar 08 03:55:16.953559 master-0 kubenswrapper[18592]: I0308 03:55:16.953484 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Mar 08 03:55:16.974515 master-0 kubenswrapper[18592]: I0308 03:55:16.974463 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"serving-certs-ca-bundle"
Mar 08 03:55:17.001959 master-0 kubenswrapper[18592]: I0308 03:55:17.001901 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 08 03:55:17.088595 master-0 kubenswrapper[18592]: I0308 03:55:17.088437 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt"
Mar 08 03:55:17.165505 master-0 kubenswrapper[18592]: I0308 03:55:17.165434 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 08 03:55:17.171961 master-0 kubenswrapper[18592]: I0308 03:55:17.171664 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 08 03:55:17.259551 master-0 kubenswrapper[18592]: I0308 03:55:17.259504 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 08 03:55:17.271877 master-0 kubenswrapper[18592]: I0308 03:55:17.271795 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config"
Mar 08 03:55:17.287456 master-0 kubenswrapper[18592]: I0308 03:55:17.287368 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 08 03:55:17.302857 master-0 kubenswrapper[18592]: I0308 03:55:17.302774 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Mar 08 03:55:17.309917 master-0 kubenswrapper[18592]: I0308 03:55:17.309858 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config"
Mar 08 03:55:17.401369 master-0 kubenswrapper[18592]: I0308 03:55:17.400313 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-trusted-ca-bundle"
Mar 08 03:55:17.423622 master-0 kubenswrapper[18592]: I0308 03:55:17.423573 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Mar 08 03:55:17.439933 master-0 kubenswrapper[18592]: I0308 03:55:17.439212 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 08 03:55:17.453161 master-0 kubenswrapper[18592]: I0308 03:55:17.453109 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap"
Mar 08 03:55:17.535081 master-0 kubenswrapper[18592]: I0308 03:55:17.535006 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 08 03:55:17.572858 master-0 kubenswrapper[18592]: I0308 03:55:17.570041 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Mar 08 03:55:17.616751 master-0 kubenswrapper[18592]: I0308 03:55:17.616682 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 08 03:55:17.699442 master-0 kubenswrapper[18592]: I0308 03:55:17.698863 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Mar 08 03:55:17.757924 master-0 kubenswrapper[18592]: I0308 03:55:17.753298 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 08 03:55:17.757924 master-0 kubenswrapper[18592]: I0308 03:55:17.755401 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Mar 08 03:55:17.796841 master-0 kubenswrapper[18592]: I0308 03:55:17.788606 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 08 03:55:17.908869 master-0 kubenswrapper[18592]: I0308 03:55:17.908749 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Mar 08 03:55:17.954437 master-0 kubenswrapper[18592]: I0308 03:55:17.954264 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 08 03:55:18.013819 master-0 kubenswrapper[18592]: I0308 03:55:18.013741 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"cco-trusted-ca"
Mar 08 03:55:18.023777 master-0 kubenswrapper[18592]: I0308 03:55:18.023708 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"cluster-baremetal-operator-images"
Mar 08 03:55:18.063627 master-0 kubenswrapper[18592]: I0308 03:55:18.063543 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca"
Mar 08 03:55:18.080844 master-0 kubenswrapper[18592]: I0308 03:55:18.080758 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 08 03:55:18.109525 master-0 kubenswrapper[18592]: I0308 03:55:18.109429 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 08 03:55:18.176669 master-0 kubenswrapper[18592]: I0308 03:55:18.176591 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-7cjnr"
Mar 08 03:55:18.186734 master-0 kubenswrapper[18592]: I0308 03:55:18.186668 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"operator-controller-trusted-ca-bundle"
Mar 08 03:55:18.208233 master-0 kubenswrapper[18592]: I0308 03:55:18.208002 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 08 03:55:18.292625 master-0 kubenswrapper[18592]: I0308 03:55:18.292551 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Mar 08 03:55:18.296563 master-0 kubenswrapper[18592]: I0308 03:55:18.296509 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-prometheus-http-client-file"
Mar 08 03:55:18.311847 master-0 kubenswrapper[18592]: I0308 03:55:18.311790 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Mar 08 03:55:18.330191 master-0 kubenswrapper[18592]: I0308 03:55:18.330104 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 08 03:55:18.352649 master-0 kubenswrapper[18592]: I0308 03:55:18.352547 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-slxzk"
Mar 08 03:55:18.536307 master-0 kubenswrapper[18592]: I0308 03:55:18.536198 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"openshift-service-ca.crt"
Mar 08 03:55:18.582271 master-0 kubenswrapper[18592]: I0308 03:55:18.582195 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 08 03:55:18.583842 master-0 kubenswrapper[18592]: I0308 03:55:18.583779 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Mar 08 03:55:18.600394 master-0 kubenswrapper[18592]: I0308 03:55:18.600355 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-xbmc7"
Mar 08 03:55:18.637863 master-0 kubenswrapper[18592]: I0308 03:55:18.632774 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-cert"
Mar 08 03:55:18.665474 master-0 kubenswrapper[18592]: I0308 03:55:18.665439 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 08 03:55:18.672566 master-0 kubenswrapper[18592]: I0308 03:55:18.672551 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt"
Mar 08 03:55:18.708930 master-0 kubenswrapper[18592]: I0308 03:55:18.708863 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-qfnls"
Mar 08 03:55:18.737815 master-0 kubenswrapper[18592]: I0308 03:55:18.737743 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 08 03:55:18.747689 master-0 kubenswrapper[18592]: I0308 03:55:18.747595 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-grpc-tls-3ipfg9gfac918"
Mar 08 03:55:18.851320 master-0 kubenswrapper[18592]: I0308 03:55:18.850316 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-web"
Mar 08 03:55:18.895121 master-0 kubenswrapper[18592]: I0308 03:55:18.895036 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Mar 08 03:55:18.902403 master-0 kubenswrapper[18592]: I0308 03:55:18.902354 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Mar 08 03:55:18.939810 master-0 kubenswrapper[18592]: I0308 03:55:18.939742 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Mar 08 03:55:18.977752 master-0 kubenswrapper[18592]: I0308 03:55:18.977646 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls"
Mar 08 03:55:18.999574 master-0 kubenswrapper[18592]: I0308 03:55:18.999518 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"alertmanager-trusted-ca-bundle"
Mar 08 03:55:19.039598 master-0 kubenswrapper[18592]: I0308 03:55:19.039563 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 08 03:55:19.040948 master-0 kubenswrapper[18592]: I0308 03:55:19.040887 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 08 03:55:19.067496 master-0 kubenswrapper[18592]: I0308 03:55:19.067435 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Mar 08 03:55:19.086315 master-0 kubenswrapper[18592]: I0308 03:55:19.086185 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 08 03:55:19.086817 master-0 kubenswrapper[18592]: I0308 03:55:19.086775 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-dockercfg-4rdlv"
Mar 08 03:55:19.174208 master-0 kubenswrapper[18592]: I0308 03:55:19.174035 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 08 03:55:19.236704 master-0 kubenswrapper[18592]: I0308 03:55:19.236249 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-web"
Mar 08 03:55:19.247301 master-0 kubenswrapper[18592]: I0308 03:55:19.247222 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Mar 08 03:55:19.247752 master-0 kubenswrapper[18592]: I0308 03:55:19.247718 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Mar 08 03:55:19.284548 master-0 kubenswrapper[18592]: I0308 03:55:19.284498 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config"
Mar 08 03:55:19.302676 master-0 kubenswrapper[18592]: I0308 03:55:19.302617 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Mar 08 03:55:19.353080 master-0 kubenswrapper[18592]: I0308 03:55:19.353026 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 08 03:55:19.449922 master-0 kubenswrapper[18592]: I0308 03:55:19.449407 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 08 03:55:19.521456 master-0 kubenswrapper[18592]: I0308 03:55:19.521380 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 08 03:55:19.536638 master-0 kubenswrapper[18592]: I0308 03:55:19.536565 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 08 03:55:19.650283 master-0 kubenswrapper[18592]: I0308 03:55:19.650205 18592 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 08 03:55:19.659955 master-0 kubenswrapper[18592]: I0308 03:55:19.659896 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Mar 08 03:55:19.660109 master-0 kubenswrapper[18592]: I0308 03:55:19.659981 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Mar 08 03:55:19.674091 master-0 kubenswrapper[18592]: I0308 03:55:19.674029 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:55:19.702770 master-0 kubenswrapper[18592]: I0308 03:55:19.702563 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-master-0" podStartSLOduration=22.702540166 podStartE2EDuration="22.702540166s" podCreationTimestamp="2026-03-08 03:54:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-08 03:55:19.695711831 +0000 UTC m=+131.794466221" watchObservedRunningTime="2026-03-08 03:55:19.702540166 +0000 UTC m=+131.801294546" Mar 08 03:55:19.727269 master-0 kubenswrapper[18592]: I0308 03:55:19.727212 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 08 03:55:19.778582 master-0 kubenswrapper[18592]: I0308 03:55:19.778546 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 08 03:55:19.829851 master-0 kubenswrapper[18592]: I0308 03:55:19.826491 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 08 03:55:19.862370 master-0 kubenswrapper[18592]: I0308 03:55:19.862297 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-hvbwb" Mar 08 03:55:19.922158 master-0 kubenswrapper[18592]: I0308 03:55:19.922077 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s" Mar 08 03:55:19.964646 master-0 kubenswrapper[18592]: I0308 03:55:19.964516 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images" Mar 08 03:55:20.050664 master-0 kubenswrapper[18592]: I0308 03:55:20.050594 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 08 03:55:20.065177 master-0 kubenswrapper[18592]: I0308 03:55:20.065134 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 08 03:55:20.102684 master-0 kubenswrapper[18592]: I0308 03:55:20.102602 18592 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-operator"/"kube-root-ca.crt" Mar 08 03:55:20.184443 master-0 kubenswrapper[18592]: I0308 03:55:20.184375 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 08 03:55:20.185183 master-0 kubenswrapper[18592]: I0308 03:55:20.185139 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 08 03:55:20.202731 master-0 kubenswrapper[18592]: I0308 03:55:20.202685 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 08 03:55:20.347355 master-0 kubenswrapper[18592]: I0308 03:55:20.347256 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy" Mar 08 03:55:20.349360 master-0 kubenswrapper[18592]: I0308 03:55:20.349311 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 08 03:55:20.364568 master-0 kubenswrapper[18592]: I0308 03:55:20.364547 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-sidecar-tls" Mar 08 03:55:20.387674 master-0 kubenswrapper[18592]: I0308 03:55:20.387587 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 08 03:55:20.545279 master-0 kubenswrapper[18592]: I0308 03:55:20.545205 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls" Mar 08 03:55:20.709046 master-0 kubenswrapper[18592]: I0308 03:55:20.708976 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 08 03:55:20.751238 master-0 kubenswrapper[18592]: I0308 03:55:20.751136 18592 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 08 03:55:20.763723 master-0 kubenswrapper[18592]: I0308 03:55:20.763650 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-tls" Mar 08 03:55:20.771309 master-0 kubenswrapper[18592]: I0308 03:55:20.771267 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 08 03:55:20.884998 master-0 kubenswrapper[18592]: I0308 03:55:20.884953 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 08 03:55:20.947285 master-0 kubenswrapper[18592]: I0308 03:55:20.947220 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 08 03:55:21.262198 master-0 kubenswrapper[18592]: I0308 03:55:21.262136 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 08 03:55:21.289560 master-0 kubenswrapper[18592]: I0308 03:55:21.289131 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 08 03:55:21.290564 master-0 kubenswrapper[18592]: I0308 03:55:21.290180 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 08 03:55:21.361306 master-0 kubenswrapper[18592]: I0308 03:55:21.361241 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 08 03:55:21.405087 master-0 kubenswrapper[18592]: I0308 03:55:21.404973 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 08 03:55:21.454048 master-0 kubenswrapper[18592]: I0308 03:55:21.453983 18592 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-insights"/"openshift-service-ca.crt" Mar 08 03:55:21.462162 master-0 kubenswrapper[18592]: I0308 03:55:21.462111 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 08 03:55:21.502794 master-0 kubenswrapper[18592]: I0308 03:55:21.502737 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls" Mar 08 03:55:21.533577 master-0 kubenswrapper[18592]: I0308 03:55:21.533473 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 08 03:55:21.541233 master-0 kubenswrapper[18592]: I0308 03:55:21.541193 18592 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 08 03:55:21.644791 master-0 kubenswrapper[18592]: I0308 03:55:21.644712 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-serving-cert" Mar 08 03:55:21.671143 master-0 kubenswrapper[18592]: I0308 03:55:21.671070 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 08 03:55:21.729140 master-0 kubenswrapper[18592]: I0308 03:55:21.729081 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt" Mar 08 03:55:21.738719 master-0 kubenswrapper[18592]: I0308 03:55:21.738686 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt" Mar 08 03:55:21.788303 master-0 kubenswrapper[18592]: I0308 03:55:21.788181 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 08 03:55:21.905862 master-0 kubenswrapper[18592]: I0308 03:55:21.905766 18592 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 08 03:55:21.911024 master-0 kubenswrapper[18592]: I0308 03:55:21.910945 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-webhook-server-cert" Mar 08 03:55:22.006931 master-0 kubenswrapper[18592]: I0308 03:55:22.006878 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 08 03:55:22.023731 master-0 kubenswrapper[18592]: I0308 03:55:22.023669 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 08 03:55:22.109448 master-0 kubenswrapper[18592]: I0308 03:55:22.109283 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy" Mar 08 03:55:22.222958 master-0 kubenswrapper[18592]: I0308 03:55:22.222889 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 08 03:55:22.300346 master-0 kubenswrapper[18592]: I0308 03:55:22.300253 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 08 03:55:22.393716 master-0 kubenswrapper[18592]: I0308 03:55:22.393568 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 08 03:55:22.419951 master-0 kubenswrapper[18592]: I0308 03:55:22.419885 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 08 03:55:22.640446 master-0 kubenswrapper[18592]: I0308 03:55:22.640366 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-metrics" Mar 08 03:55:22.683906 master-0 kubenswrapper[18592]: I0308 03:55:22.683713 18592 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 08 03:55:22.782624 master-0 kubenswrapper[18592]: I0308 03:55:22.782536 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-kube-rbac-proxy-web" Mar 08 03:55:22.913606 master-0 kubenswrapper[18592]: I0308 03:55:22.913533 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 08 03:55:23.066225 master-0 kubenswrapper[18592]: I0308 03:55:23.066166 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 08 03:55:25.089690 master-0 kubenswrapper[18592]: I0308 03:55:25.089567 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"kube-root-ca.crt" Mar 08 03:55:25.107585 master-0 kubenswrapper[18592]: I0308 03:55:25.107548 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 08 03:55:25.137511 master-0 kubenswrapper[18592]: I0308 03:55:25.137455 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 08 03:55:26.207748 master-0 kubenswrapper[18592]: E0308 03:55:26.207654 18592 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[trusted-ca], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-console-operator/console-operator-6c7fb6b958-mr9k6" podUID="48ab3c8e-a2bd-4380-9e8d-a41d515a989d" Mar 08 03:55:26.288648 master-0 kubenswrapper[18592]: I0308 03:55:26.288565 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-6c7fb6b958-mr9k6" Mar 08 03:55:31.098137 master-0 kubenswrapper[18592]: I0308 03:55:31.095142 18592 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Mar 08 03:55:31.098137 master-0 kubenswrapper[18592]: I0308 03:55:31.095394 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podUID="899242a15b2bdf3b4a04fb323647ca94" containerName="startup-monitor" containerID="cri-o://457057cf98087638be7d0746060df09cfb1327961dd59ed1c045db49966856a9" gracePeriod=5 Mar 08 03:55:31.139854 master-0 kubenswrapper[18592]: I0308 03:55:31.139770 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/48ab3c8e-a2bd-4380-9e8d-a41d515a989d-trusted-ca\") pod \"console-operator-6c7fb6b958-mr9k6\" (UID: \"48ab3c8e-a2bd-4380-9e8d-a41d515a989d\") " pod="openshift-console-operator/console-operator-6c7fb6b958-mr9k6" Mar 08 03:55:31.140154 master-0 kubenswrapper[18592]: E0308 03:55:31.140097 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/48ab3c8e-a2bd-4380-9e8d-a41d515a989d-trusted-ca podName:48ab3c8e-a2bd-4380-9e8d-a41d515a989d nodeName:}" failed. No retries permitted until 2026-03-08 03:57:33.140072762 +0000 UTC m=+265.238827152 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "trusted-ca" (UniqueName: "kubernetes.io/configmap/48ab3c8e-a2bd-4380-9e8d-a41d515a989d-trusted-ca") pod "console-operator-6c7fb6b958-mr9k6" (UID: "48ab3c8e-a2bd-4380-9e8d-a41d515a989d") : configmap references non-existent config key: ca-bundle.crt Mar 08 03:55:36.405002 master-0 kubenswrapper[18592]: I0308 03:55:36.404816 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_899242a15b2bdf3b4a04fb323647ca94/startup-monitor/0.log" Mar 08 03:55:36.405002 master-0 kubenswrapper[18592]: I0308 03:55:36.404968 18592 generic.go:334] "Generic (PLEG): container finished" podID="899242a15b2bdf3b4a04fb323647ca94" containerID="457057cf98087638be7d0746060df09cfb1327961dd59ed1c045db49966856a9" exitCode=137 Mar 08 03:55:36.699989 master-0 kubenswrapper[18592]: I0308 03:55:36.699900 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_899242a15b2bdf3b4a04fb323647ca94/startup-monitor/0.log" Mar 08 03:55:36.700208 master-0 kubenswrapper[18592]: I0308 03:55:36.700046 18592 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:55:36.746053 master-0 kubenswrapper[18592]: I0308 03:55:36.745971 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-pod-resource-dir\") pod \"899242a15b2bdf3b4a04fb323647ca94\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " Mar 08 03:55:36.746368 master-0 kubenswrapper[18592]: I0308 03:55:36.746254 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-var-lock\") pod \"899242a15b2bdf3b4a04fb323647ca94\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " Mar 08 03:55:36.746462 master-0 kubenswrapper[18592]: I0308 03:55:36.746348 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-var-lock" (OuterVolumeSpecName: "var-lock") pod "899242a15b2bdf3b4a04fb323647ca94" (UID: "899242a15b2bdf3b4a04fb323647ca94"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:55:36.746534 master-0 kubenswrapper[18592]: I0308 03:55:36.746494 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-var-log\") pod \"899242a15b2bdf3b4a04fb323647ca94\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " Mar 08 03:55:36.746627 master-0 kubenswrapper[18592]: I0308 03:55:36.746596 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-resource-dir\") pod \"899242a15b2bdf3b4a04fb323647ca94\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " Mar 08 03:55:36.746792 master-0 kubenswrapper[18592]: I0308 03:55:36.746724 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-manifests\") pod \"899242a15b2bdf3b4a04fb323647ca94\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " Mar 08 03:55:36.746792 master-0 kubenswrapper[18592]: I0308 03:55:36.746743 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-var-log" (OuterVolumeSpecName: "var-log") pod "899242a15b2bdf3b4a04fb323647ca94" (UID: "899242a15b2bdf3b4a04fb323647ca94"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:55:36.746974 master-0 kubenswrapper[18592]: I0308 03:55:36.746869 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "899242a15b2bdf3b4a04fb323647ca94" (UID: "899242a15b2bdf3b4a04fb323647ca94"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:55:36.747041 master-0 kubenswrapper[18592]: I0308 03:55:36.747010 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-manifests" (OuterVolumeSpecName: "manifests") pod "899242a15b2bdf3b4a04fb323647ca94" (UID: "899242a15b2bdf3b4a04fb323647ca94"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:55:36.747612 master-0 kubenswrapper[18592]: I0308 03:55:36.747550 18592 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-var-log\") on node \"master-0\" DevicePath \"\"" Mar 08 03:55:36.747686 master-0 kubenswrapper[18592]: I0308 03:55:36.747609 18592 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 03:55:36.747686 master-0 kubenswrapper[18592]: I0308 03:55:36.747643 18592 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-manifests\") on node \"master-0\" DevicePath \"\"" Mar 08 03:55:36.747686 master-0 kubenswrapper[18592]: I0308 03:55:36.747668 18592 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 08 03:55:36.754245 master-0 kubenswrapper[18592]: I0308 03:55:36.754170 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "899242a15b2bdf3b4a04fb323647ca94" (UID: "899242a15b2bdf3b4a04fb323647ca94"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:55:36.848754 master-0 kubenswrapper[18592]: I0308 03:55:36.848664 18592 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-pod-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 03:55:37.417295 master-0 kubenswrapper[18592]: I0308 03:55:37.416688 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_899242a15b2bdf3b4a04fb323647ca94/startup-monitor/0.log" Mar 08 03:55:37.417295 master-0 kubenswrapper[18592]: I0308 03:55:37.417258 18592 scope.go:117] "RemoveContainer" containerID="457057cf98087638be7d0746060df09cfb1327961dd59ed1c045db49966856a9" Mar 08 03:55:37.417295 master-0 kubenswrapper[18592]: I0308 03:55:37.417288 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:55:38.127518 master-0 kubenswrapper[18592]: E0308 03:55:38.127424 18592 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[alertmanager-trusted-ca-bundle], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-monitoring/alertmanager-main-0" podUID="cacb9582-2132-4543-8a31-7b100ba4dd2f" Mar 08 03:55:38.157095 master-0 kubenswrapper[18592]: I0308 03:55:38.157011 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="899242a15b2bdf3b4a04fb323647ca94" path="/var/lib/kubelet/pods/899242a15b2bdf3b4a04fb323647ca94/volumes" Mar 08 03:55:38.431966 master-0 kubenswrapper[18592]: I0308 03:55:38.431809 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:55:43.151422 master-0 kubenswrapper[18592]: I0308 03:55:43.151337 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cacb9582-2132-4543-8a31-7b100ba4dd2f-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"cacb9582-2132-4543-8a31-7b100ba4dd2f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:55:43.152460 master-0 kubenswrapper[18592]: E0308 03:55:43.151603 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cacb9582-2132-4543-8a31-7b100ba4dd2f-alertmanager-trusted-ca-bundle podName:cacb9582-2132-4543-8a31-7b100ba4dd2f nodeName:}" failed. No retries permitted until 2026-03-08 03:57:45.151571554 +0000 UTC m=+277.250325944 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/cacb9582-2132-4543-8a31-7b100ba4dd2f-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "cacb9582-2132-4543-8a31-7b100ba4dd2f") : configmap references non-existent config key: ca-bundle.crt Mar 08 03:55:43.622815 master-0 kubenswrapper[18592]: E0308 03:55:43.622731 18592 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[prometheus-trusted-ca-bundle], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-monitoring/prometheus-k8s-0" podUID="a26c661f-f843-45c5-85f0-2c2f72cbf580" Mar 08 03:55:44.488575 master-0 kubenswrapper[18592]: I0308 03:55:44.488502 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:55:48.648745 master-0 kubenswrapper[18592]: I0308 03:55:48.648600 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a26c661f-f843-45c5-85f0-2c2f72cbf580-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:55:48.649905 master-0 kubenswrapper[18592]: E0308 03:55:48.648921 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a26c661f-f843-45c5-85f0-2c2f72cbf580-prometheus-trusted-ca-bundle podName:a26c661f-f843-45c5-85f0-2c2f72cbf580 nodeName:}" failed. No retries permitted until 2026-03-08 03:57:50.648884859 +0000 UTC m=+282.747639269 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "prometheus-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/a26c661f-f843-45c5-85f0-2c2f72cbf580-prometheus-trusted-ca-bundle") pod "prometheus-k8s-0" (UID: "a26c661f-f843-45c5-85f0-2c2f72cbf580") : configmap references non-existent config key: ca-bundle.crt Mar 08 03:55:52.606048 master-0 kubenswrapper[18592]: I0308 03:55:52.605950 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls" Mar 08 03:56:06.923908 master-0 kubenswrapper[18592]: I0308 03:56:06.923808 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-n77ps"] Mar 08 03:56:06.925100 master-0 kubenswrapper[18592]: E0308 03:56:06.924342 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="899242a15b2bdf3b4a04fb323647ca94" containerName="startup-monitor" Mar 08 03:56:06.925100 master-0 kubenswrapper[18592]: I0308 03:56:06.924370 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="899242a15b2bdf3b4a04fb323647ca94" containerName="startup-monitor" 
Mar 08 03:56:06.925100 master-0 kubenswrapper[18592]: E0308 03:56:06.924406 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55240670-0b16-485e-b5cf-f3e7bc4431f5" containerName="installer"
Mar 08 03:56:06.925100 master-0 kubenswrapper[18592]: I0308 03:56:06.924419 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="55240670-0b16-485e-b5cf-f3e7bc4431f5" containerName="installer"
Mar 08 03:56:06.925100 master-0 kubenswrapper[18592]: I0308 03:56:06.924653 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="899242a15b2bdf3b4a04fb323647ca94" containerName="startup-monitor"
Mar 08 03:56:06.925100 master-0 kubenswrapper[18592]: I0308 03:56:06.924684 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="55240670-0b16-485e-b5cf-f3e7bc4431f5" containerName="installer"
Mar 08 03:56:06.925434 master-0 kubenswrapper[18592]: I0308 03:56:06.925395 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-n77ps"
Mar 08 03:56:06.927667 master-0 kubenswrapper[18592]: I0308 03:56:06.927609 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Mar 08 03:56:06.928176 master-0 kubenswrapper[18592]: I0308 03:56:06.928089 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4cbgb"
Mar 08 03:56:06.977810 master-0 kubenswrapper[18592]: I0308 03:56:06.977752 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/83000169-b43a-41b4-9e1c-43d99dade81a-serviceca\") pod \"node-ca-n77ps\" (UID: \"83000169-b43a-41b4-9e1c-43d99dade81a\") " pod="openshift-image-registry/node-ca-n77ps"
Mar 08 03:56:06.978047 master-0 kubenswrapper[18592]: I0308 03:56:06.977892 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcmtv\" (UniqueName: \"kubernetes.io/projected/83000169-b43a-41b4-9e1c-43d99dade81a-kube-api-access-tcmtv\") pod \"node-ca-n77ps\" (UID: \"83000169-b43a-41b4-9e1c-43d99dade81a\") " pod="openshift-image-registry/node-ca-n77ps"
Mar 08 03:56:06.978047 master-0 kubenswrapper[18592]: I0308 03:56:06.977921 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/83000169-b43a-41b4-9e1c-43d99dade81a-host\") pod \"node-ca-n77ps\" (UID: \"83000169-b43a-41b4-9e1c-43d99dade81a\") " pod="openshift-image-registry/node-ca-n77ps"
Mar 08 03:56:07.079541 master-0 kubenswrapper[18592]: I0308 03:56:07.079466 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/83000169-b43a-41b4-9e1c-43d99dade81a-serviceca\") pod \"node-ca-n77ps\" (UID: \"83000169-b43a-41b4-9e1c-43d99dade81a\") " pod="openshift-image-registry/node-ca-n77ps"
Mar 08 03:56:07.079729 master-0 kubenswrapper[18592]: I0308 03:56:07.079613 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcmtv\" (UniqueName: \"kubernetes.io/projected/83000169-b43a-41b4-9e1c-43d99dade81a-kube-api-access-tcmtv\") pod \"node-ca-n77ps\" (UID: \"83000169-b43a-41b4-9e1c-43d99dade81a\") " pod="openshift-image-registry/node-ca-n77ps"
Mar 08 03:56:07.079729 master-0 kubenswrapper[18592]: I0308 03:56:07.079650 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/83000169-b43a-41b4-9e1c-43d99dade81a-host\") pod \"node-ca-n77ps\" (UID: \"83000169-b43a-41b4-9e1c-43d99dade81a\") " pod="openshift-image-registry/node-ca-n77ps"
Mar 08 03:56:07.079791 master-0 kubenswrapper[18592]: I0308 03:56:07.079740 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/83000169-b43a-41b4-9e1c-43d99dade81a-host\") pod \"node-ca-n77ps\" (UID: \"83000169-b43a-41b4-9e1c-43d99dade81a\") " pod="openshift-image-registry/node-ca-n77ps"
Mar 08 03:56:07.080385 master-0 kubenswrapper[18592]: I0308 03:56:07.080350 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/83000169-b43a-41b4-9e1c-43d99dade81a-serviceca\") pod \"node-ca-n77ps\" (UID: \"83000169-b43a-41b4-9e1c-43d99dade81a\") " pod="openshift-image-registry/node-ca-n77ps"
Mar 08 03:56:07.097904 master-0 kubenswrapper[18592]: I0308 03:56:07.097851 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcmtv\" (UniqueName: \"kubernetes.io/projected/83000169-b43a-41b4-9e1c-43d99dade81a-kube-api-access-tcmtv\") pod \"node-ca-n77ps\" (UID: \"83000169-b43a-41b4-9e1c-43d99dade81a\") " pod="openshift-image-registry/node-ca-n77ps"
Mar 08 03:56:07.356955 master-0 kubenswrapper[18592]: I0308 03:56:07.356903 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-n77ps"
Mar 08 03:56:07.391543 master-0 kubenswrapper[18592]: W0308 03:56:07.391474 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83000169_b43a_41b4_9e1c_43d99dade81a.slice/crio-6d671cd8b01195f92832f795dcd959fc40564518c37c9a7d0dce48bc3bda3459 WatchSource:0}: Error finding container 6d671cd8b01195f92832f795dcd959fc40564518c37c9a7d0dce48bc3bda3459: Status 404 returned error can't find the container with id 6d671cd8b01195f92832f795dcd959fc40564518c37c9a7d0dce48bc3bda3459
Mar 08 03:56:07.688949 master-0 kubenswrapper[18592]: I0308 03:56:07.688895 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-n77ps" event={"ID":"83000169-b43a-41b4-9e1c-43d99dade81a","Type":"ContainerStarted","Data":"6d671cd8b01195f92832f795dcd959fc40564518c37c9a7d0dce48bc3bda3459"}
Mar 08 03:56:10.728502 master-0 kubenswrapper[18592]: I0308 03:56:10.728370 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-n77ps" event={"ID":"83000169-b43a-41b4-9e1c-43d99dade81a","Type":"ContainerStarted","Data":"9d509c2df0ff63c4f4e3caf583c50f105cc02441de79a8baec70602643b63f14"}
Mar 08 03:56:10.756513 master-0 kubenswrapper[18592]: I0308 03:56:10.756393 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-n77ps" podStartSLOduration=2.710368878 podStartE2EDuration="4.756369452s" podCreationTimestamp="2026-03-08 03:56:06 +0000 UTC" firstStartedPulling="2026-03-08 03:56:07.393887266 +0000 UTC m=+179.492641646" lastFinishedPulling="2026-03-08 03:56:09.43988787 +0000 UTC m=+181.538642220" observedRunningTime="2026-03-08 03:56:10.753141179 +0000 UTC m=+182.851895569" watchObservedRunningTime="2026-03-08 03:56:10.756369452 +0000 UTC m=+182.855123832"
Mar 08 03:56:57.573270 master-0 kubenswrapper[18592]: I0308 03:56:57.573205 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-59shv"]
Mar 08 03:56:57.574213 master-0 kubenswrapper[18592]: I0308 03:56:57.574133 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-59shv"
Mar 08 03:56:57.576091 master-0 kubenswrapper[18592]: I0308 03:56:57.576041 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-sysctl-allowlist"
Mar 08 03:56:57.576492 master-0 kubenswrapper[18592]: I0308 03:56:57.576448 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-bdg6m"
Mar 08 03:56:57.648798 master-0 kubenswrapper[18592]: I0308 03:56:57.648734 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/ec90b25a-dd6c-43cc-bf66-4f32c7dd2c92-ready\") pod \"cni-sysctl-allowlist-ds-59shv\" (UID: \"ec90b25a-dd6c-43cc-bf66-4f32c7dd2c92\") " pod="openshift-multus/cni-sysctl-allowlist-ds-59shv"
Mar 08 03:56:57.648798 master-0 kubenswrapper[18592]: I0308 03:56:57.648793 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ec90b25a-dd6c-43cc-bf66-4f32c7dd2c92-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-59shv\" (UID: \"ec90b25a-dd6c-43cc-bf66-4f32c7dd2c92\") " pod="openshift-multus/cni-sysctl-allowlist-ds-59shv"
Mar 08 03:56:57.649069 master-0 kubenswrapper[18592]: I0308 03:56:57.648896 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpbws\" (UniqueName: \"kubernetes.io/projected/ec90b25a-dd6c-43cc-bf66-4f32c7dd2c92-kube-api-access-jpbws\") pod \"cni-sysctl-allowlist-ds-59shv\" (UID: \"ec90b25a-dd6c-43cc-bf66-4f32c7dd2c92\") " pod="openshift-multus/cni-sysctl-allowlist-ds-59shv"
Mar 08 03:56:57.649069 master-0 kubenswrapper[18592]: I0308 03:56:57.649018 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ec90b25a-dd6c-43cc-bf66-4f32c7dd2c92-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-59shv\" (UID: \"ec90b25a-dd6c-43cc-bf66-4f32c7dd2c92\") " pod="openshift-multus/cni-sysctl-allowlist-ds-59shv"
Mar 08 03:56:57.750592 master-0 kubenswrapper[18592]: I0308 03:56:57.750536 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpbws\" (UniqueName: \"kubernetes.io/projected/ec90b25a-dd6c-43cc-bf66-4f32c7dd2c92-kube-api-access-jpbws\") pod \"cni-sysctl-allowlist-ds-59shv\" (UID: \"ec90b25a-dd6c-43cc-bf66-4f32c7dd2c92\") " pod="openshift-multus/cni-sysctl-allowlist-ds-59shv"
Mar 08 03:56:57.750815 master-0 kubenswrapper[18592]: I0308 03:56:57.750620 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ec90b25a-dd6c-43cc-bf66-4f32c7dd2c92-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-59shv\" (UID: \"ec90b25a-dd6c-43cc-bf66-4f32c7dd2c92\") " pod="openshift-multus/cni-sysctl-allowlist-ds-59shv"
Mar 08 03:56:57.750815 master-0 kubenswrapper[18592]: I0308 03:56:57.750724 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/ec90b25a-dd6c-43cc-bf66-4f32c7dd2c92-ready\") pod \"cni-sysctl-allowlist-ds-59shv\" (UID: \"ec90b25a-dd6c-43cc-bf66-4f32c7dd2c92\") " pod="openshift-multus/cni-sysctl-allowlist-ds-59shv"
Mar 08 03:56:57.750815 master-0 kubenswrapper[18592]: I0308 03:56:57.750763 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ec90b25a-dd6c-43cc-bf66-4f32c7dd2c92-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-59shv\" (UID: \"ec90b25a-dd6c-43cc-bf66-4f32c7dd2c92\") " pod="openshift-multus/cni-sysctl-allowlist-ds-59shv"
Mar 08 03:56:57.751534 master-0 kubenswrapper[18592]: I0308 03:56:57.751504 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ec90b25a-dd6c-43cc-bf66-4f32c7dd2c92-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-59shv\" (UID: \"ec90b25a-dd6c-43cc-bf66-4f32c7dd2c92\") " pod="openshift-multus/cni-sysctl-allowlist-ds-59shv"
Mar 08 03:56:57.751933 master-0 kubenswrapper[18592]: I0308 03:56:57.751906 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/ec90b25a-dd6c-43cc-bf66-4f32c7dd2c92-ready\") pod \"cni-sysctl-allowlist-ds-59shv\" (UID: \"ec90b25a-dd6c-43cc-bf66-4f32c7dd2c92\") " pod="openshift-multus/cni-sysctl-allowlist-ds-59shv"
Mar 08 03:56:57.765447 master-0 kubenswrapper[18592]: I0308 03:56:57.760959 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ec90b25a-dd6c-43cc-bf66-4f32c7dd2c92-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-59shv\" (UID: \"ec90b25a-dd6c-43cc-bf66-4f32c7dd2c92\") " pod="openshift-multus/cni-sysctl-allowlist-ds-59shv"
Mar 08 03:56:57.779386 master-0 kubenswrapper[18592]: I0308 03:56:57.779354 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpbws\" (UniqueName: \"kubernetes.io/projected/ec90b25a-dd6c-43cc-bf66-4f32c7dd2c92-kube-api-access-jpbws\") pod \"cni-sysctl-allowlist-ds-59shv\" (UID: \"ec90b25a-dd6c-43cc-bf66-4f32c7dd2c92\") " pod="openshift-multus/cni-sysctl-allowlist-ds-59shv"
Mar 08 03:56:57.902267 master-0 kubenswrapper[18592]: I0308 03:56:57.902158 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-59shv"
Mar 08 03:56:58.120317 master-0 kubenswrapper[18592]: I0308 03:56:58.120255 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-59shv" event={"ID":"ec90b25a-dd6c-43cc-bf66-4f32c7dd2c92","Type":"ContainerStarted","Data":"41c532754087c54c36fae5eaa692f11c0f5b8f1ec20e03e63ca2b9d9a8e42274"}
Mar 08 03:56:59.131150 master-0 kubenswrapper[18592]: I0308 03:56:59.131073 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-59shv" event={"ID":"ec90b25a-dd6c-43cc-bf66-4f32c7dd2c92","Type":"ContainerStarted","Data":"c25b8ce9fda56070aad9cad01b6e5b307e4cd5a99282419041c17435cb609eb5"}
Mar 08 03:56:59.131797 master-0 kubenswrapper[18592]: I0308 03:56:59.131693 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-multus/cni-sysctl-allowlist-ds-59shv"
Mar 08 03:56:59.160469 master-0 kubenswrapper[18592]: I0308 03:56:59.160401 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/cni-sysctl-allowlist-ds-59shv" podStartSLOduration=2.160383557 podStartE2EDuration="2.160383557s" podCreationTimestamp="2026-03-08 03:56:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:56:59.159076633 +0000 UTC m=+231.257831003" watchObservedRunningTime="2026-03-08 03:56:59.160383557 +0000 UTC m=+231.259137927"
Mar 08 03:56:59.175669 master-0 kubenswrapper[18592]: I0308 03:56:59.175621 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-59shv"
Mar 08 03:56:59.573798 master-0 kubenswrapper[18592]: I0308 03:56:59.573751 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-59shv"]
Mar 08 03:57:01.149151 master-0 kubenswrapper[18592]: I0308 03:57:01.149045 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/cni-sysctl-allowlist-ds-59shv" podUID="ec90b25a-dd6c-43cc-bf66-4f32c7dd2c92" containerName="kube-multus-additional-cni-plugins" containerID="cri-o://c25b8ce9fda56070aad9cad01b6e5b307e4cd5a99282419041c17435cb609eb5" gracePeriod=30
Mar 08 03:57:05.896076 master-0 kubenswrapper[18592]: I0308 03:57:05.895987 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-d9b6b47f9-85srg"]
Mar 08 03:57:05.897415 master-0 kubenswrapper[18592]: I0308 03:57:05.897360 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-d9b6b47f9-85srg"
Mar 08 03:57:05.900690 master-0 kubenswrapper[18592]: I0308 03:57:05.900620 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 08 03:57:05.900930 master-0 kubenswrapper[18592]: I0308 03:57:05.900634 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 08 03:57:05.903412 master-0 kubenswrapper[18592]: I0308 03:57:05.903346 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 08 03:57:05.903958 master-0 kubenswrapper[18592]: I0308 03:57:05.903917 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 08 03:57:05.904196 master-0 kubenswrapper[18592]: I0308 03:57:05.904156 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 08 03:57:05.906299 master-0 kubenswrapper[18592]: I0308 03:57:05.906244 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-5s872"
Mar 08 03:57:05.906883 master-0 kubenswrapper[18592]: I0308 03:57:05.906816 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Mar 08 03:57:05.909334 master-0 kubenswrapper[18592]: I0308 03:57:05.909261 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 08 03:57:05.909563 master-0 kubenswrapper[18592]: I0308 03:57:05.909510 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 08 03:57:05.910202 master-0 kubenswrapper[18592]: I0308 03:57:05.910139 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 08 03:57:05.910377 master-0 kubenswrapper[18592]: I0308 03:57:05.910322 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 08 03:57:05.926182 master-0 kubenswrapper[18592]: I0308 03:57:05.926071 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 08 03:57:05.937118 master-0 kubenswrapper[18592]: I0308 03:57:05.937013 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-d9b6b47f9-85srg"]
Mar 08 03:57:05.938666 master-0 kubenswrapper[18592]: I0308 03:57:05.938592 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Mar 08 03:57:05.948857 master-0 kubenswrapper[18592]: I0308 03:57:05.947582 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Mar 08 03:57:05.994785 master-0 kubenswrapper[18592]: I0308 03:57:05.994729 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-system-router-certs\") pod \"oauth-openshift-d9b6b47f9-85srg\" (UID: \"8ae7ed78-1761-4974-977a-1bc16c87bd91\") " pod="openshift-authentication/oauth-openshift-d9b6b47f9-85srg"
Mar 08 03:57:05.994785 master-0 kubenswrapper[18592]: I0308 03:57:05.994782 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-d9b6b47f9-85srg\" (UID: \"8ae7ed78-1761-4974-977a-1bc16c87bd91\") " pod="openshift-authentication/oauth-openshift-d9b6b47f9-85srg"
Mar 08 03:57:05.995123 master-0 kubenswrapper[18592]: I0308 03:57:05.994818 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8ae7ed78-1761-4974-977a-1bc16c87bd91-audit-policies\") pod \"oauth-openshift-d9b6b47f9-85srg\" (UID: \"8ae7ed78-1761-4974-977a-1bc16c87bd91\") " pod="openshift-authentication/oauth-openshift-d9b6b47f9-85srg"
Mar 08 03:57:05.995123 master-0 kubenswrapper[18592]: I0308 03:57:05.994854 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-user-template-login\") pod \"oauth-openshift-d9b6b47f9-85srg\" (UID: \"8ae7ed78-1761-4974-977a-1bc16c87bd91\") " pod="openshift-authentication/oauth-openshift-d9b6b47f9-85srg"
Mar 08 03:57:05.995123 master-0 kubenswrapper[18592]: I0308 03:57:05.994877 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-system-service-ca\") pod \"oauth-openshift-d9b6b47f9-85srg\" (UID: \"8ae7ed78-1761-4974-977a-1bc16c87bd91\") " pod="openshift-authentication/oauth-openshift-d9b6b47f9-85srg"
Mar 08 03:57:05.995123 master-0 kubenswrapper[18592]: I0308 03:57:05.994900 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-system-serving-cert\") pod \"oauth-openshift-d9b6b47f9-85srg\" (UID: \"8ae7ed78-1761-4974-977a-1bc16c87bd91\") " pod="openshift-authentication/oauth-openshift-d9b6b47f9-85srg"
Mar 08 03:57:05.995123 master-0 kubenswrapper[18592]: I0308 03:57:05.994925 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-d9b6b47f9-85srg\" (UID: \"8ae7ed78-1761-4974-977a-1bc16c87bd91\") " pod="openshift-authentication/oauth-openshift-d9b6b47f9-85srg"
Mar 08 03:57:05.995123 master-0 kubenswrapper[18592]: I0308 03:57:05.994950 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-user-template-error\") pod \"oauth-openshift-d9b6b47f9-85srg\" (UID: \"8ae7ed78-1761-4974-977a-1bc16c87bd91\") " pod="openshift-authentication/oauth-openshift-d9b6b47f9-85srg"
Mar 08 03:57:05.995123 master-0 kubenswrapper[18592]: I0308 03:57:05.994968 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-d9b6b47f9-85srg\" (UID: \"8ae7ed78-1761-4974-977a-1bc16c87bd91\") " pod="openshift-authentication/oauth-openshift-d9b6b47f9-85srg"
Mar 08 03:57:05.995123 master-0 kubenswrapper[18592]: I0308 03:57:05.995030 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-system-cliconfig\") pod \"oauth-openshift-d9b6b47f9-85srg\" (UID: \"8ae7ed78-1761-4974-977a-1bc16c87bd91\") " pod="openshift-authentication/oauth-openshift-d9b6b47f9-85srg"
Mar 08 03:57:05.995123 master-0 kubenswrapper[18592]: I0308 03:57:05.995121 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7skln\" (UniqueName: \"kubernetes.io/projected/8ae7ed78-1761-4974-977a-1bc16c87bd91-kube-api-access-7skln\") pod \"oauth-openshift-d9b6b47f9-85srg\" (UID: \"8ae7ed78-1761-4974-977a-1bc16c87bd91\") " pod="openshift-authentication/oauth-openshift-d9b6b47f9-85srg"
Mar 08 03:57:05.995669 master-0 kubenswrapper[18592]: I0308 03:57:05.995211 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8ae7ed78-1761-4974-977a-1bc16c87bd91-audit-dir\") pod \"oauth-openshift-d9b6b47f9-85srg\" (UID: \"8ae7ed78-1761-4974-977a-1bc16c87bd91\") " pod="openshift-authentication/oauth-openshift-d9b6b47f9-85srg"
Mar 08 03:57:05.995669 master-0 kubenswrapper[18592]: I0308 03:57:05.995241 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-system-session\") pod \"oauth-openshift-d9b6b47f9-85srg\" (UID: \"8ae7ed78-1761-4974-977a-1bc16c87bd91\") " pod="openshift-authentication/oauth-openshift-d9b6b47f9-85srg"
Mar 08 03:57:06.096705 master-0 kubenswrapper[18592]: I0308 03:57:06.096653 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-d9b6b47f9-85srg\" (UID: \"8ae7ed78-1761-4974-977a-1bc16c87bd91\") " pod="openshift-authentication/oauth-openshift-d9b6b47f9-85srg"
Mar 08 03:57:06.096705 master-0 kubenswrapper[18592]: I0308 03:57:06.096705 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-user-template-error\") pod \"oauth-openshift-d9b6b47f9-85srg\" (UID: \"8ae7ed78-1761-4974-977a-1bc16c87bd91\") " pod="openshift-authentication/oauth-openshift-d9b6b47f9-85srg"
Mar 08 03:57:06.097031 master-0 kubenswrapper[18592]: I0308 03:57:06.096728 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-d9b6b47f9-85srg\" (UID: \"8ae7ed78-1761-4974-977a-1bc16c87bd91\") " pod="openshift-authentication/oauth-openshift-d9b6b47f9-85srg"
Mar 08 03:57:06.097031 master-0 kubenswrapper[18592]: I0308 03:57:06.096749 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-system-cliconfig\") pod \"oauth-openshift-d9b6b47f9-85srg\" (UID: \"8ae7ed78-1761-4974-977a-1bc16c87bd91\") " pod="openshift-authentication/oauth-openshift-d9b6b47f9-85srg"
Mar 08 03:57:06.099141 master-0 kubenswrapper[18592]: I0308 03:57:06.099082 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7skln\" (UniqueName: \"kubernetes.io/projected/8ae7ed78-1761-4974-977a-1bc16c87bd91-kube-api-access-7skln\") pod \"oauth-openshift-d9b6b47f9-85srg\" (UID: \"8ae7ed78-1761-4974-977a-1bc16c87bd91\") " pod="openshift-authentication/oauth-openshift-d9b6b47f9-85srg"
Mar 08 03:57:06.099282 master-0 kubenswrapper[18592]: I0308 03:57:06.099226 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8ae7ed78-1761-4974-977a-1bc16c87bd91-audit-dir\") pod \"oauth-openshift-d9b6b47f9-85srg\" (UID: \"8ae7ed78-1761-4974-977a-1bc16c87bd91\") " pod="openshift-authentication/oauth-openshift-d9b6b47f9-85srg"
Mar 08 03:57:06.099282 master-0 kubenswrapper[18592]: I0308 03:57:06.099272 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-system-session\") pod \"oauth-openshift-d9b6b47f9-85srg\" (UID: \"8ae7ed78-1761-4974-977a-1bc16c87bd91\") " pod="openshift-authentication/oauth-openshift-d9b6b47f9-85srg"
Mar 08 03:57:06.099426 master-0 kubenswrapper[18592]: I0308 03:57:06.099369 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-system-router-certs\") pod \"oauth-openshift-d9b6b47f9-85srg\" (UID: \"8ae7ed78-1761-4974-977a-1bc16c87bd91\") " pod="openshift-authentication/oauth-openshift-d9b6b47f9-85srg"
Mar 08 03:57:06.099495 master-0 kubenswrapper[18592]: I0308 03:57:06.099436 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-d9b6b47f9-85srg\" (UID: \"8ae7ed78-1761-4974-977a-1bc16c87bd91\") " pod="openshift-authentication/oauth-openshift-d9b6b47f9-85srg"
Mar 08 03:57:06.099495 master-0 kubenswrapper[18592]: I0308 03:57:06.099486 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8ae7ed78-1761-4974-977a-1bc16c87bd91-audit-policies\") pod \"oauth-openshift-d9b6b47f9-85srg\" (UID: \"8ae7ed78-1761-4974-977a-1bc16c87bd91\") " pod="openshift-authentication/oauth-openshift-d9b6b47f9-85srg"
Mar 08 03:57:06.099614 master-0 kubenswrapper[18592]: I0308 03:57:06.099548 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-user-template-login\") pod \"oauth-openshift-d9b6b47f9-85srg\" (UID: \"8ae7ed78-1761-4974-977a-1bc16c87bd91\") " pod="openshift-authentication/oauth-openshift-d9b6b47f9-85srg"
Mar 08 03:57:06.099614 master-0 kubenswrapper[18592]: E0308 03:57:06.099554 18592 configmap.go:193] Couldn't get configMap openshift-authentication/v4-0-config-system-cliconfig: configmap "v4-0-config-system-cliconfig" not found
Mar 08 03:57:06.099614 master-0 kubenswrapper[18592]: I0308 03:57:06.099593 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-system-service-ca\") pod \"oauth-openshift-d9b6b47f9-85srg\" (UID: \"8ae7ed78-1761-4974-977a-1bc16c87bd91\") " pod="openshift-authentication/oauth-openshift-d9b6b47f9-85srg"
Mar 08 03:57:06.099777 master-0 kubenswrapper[18592]: E0308 03:57:06.099654 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-system-cliconfig podName:8ae7ed78-1761-4974-977a-1bc16c87bd91 nodeName:}" failed. No retries permitted until 2026-03-08 03:57:06.599624666 +0000 UTC m=+238.698379046 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "v4-0-config-system-cliconfig" (UniqueName: "kubernetes.io/configmap/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-system-cliconfig") pod "oauth-openshift-d9b6b47f9-85srg" (UID: "8ae7ed78-1761-4974-977a-1bc16c87bd91") : configmap "v4-0-config-system-cliconfig" not found
Mar 08 03:57:06.099777 master-0 kubenswrapper[18592]: I0308 03:57:06.099696 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-system-serving-cert\") pod \"oauth-openshift-d9b6b47f9-85srg\" (UID: \"8ae7ed78-1761-4974-977a-1bc16c87bd91\") " pod="openshift-authentication/oauth-openshift-d9b6b47f9-85srg"
Mar 08 03:57:06.100534 master-0 kubenswrapper[18592]: I0308 03:57:06.100483 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-system-service-ca\") pod \"oauth-openshift-d9b6b47f9-85srg\" (UID: \"8ae7ed78-1761-4974-977a-1bc16c87bd91\") " pod="openshift-authentication/oauth-openshift-d9b6b47f9-85srg"
Mar 08 03:57:06.100937 master-0 kubenswrapper[18592]: I0308 03:57:06.100792 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-d9b6b47f9-85srg\" (UID: \"8ae7ed78-1761-4974-977a-1bc16c87bd91\") " pod="openshift-authentication/oauth-openshift-d9b6b47f9-85srg"
Mar 08 03:57:06.101289 master-0 kubenswrapper[18592]: I0308 03:57:06.101232 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8ae7ed78-1761-4974-977a-1bc16c87bd91-audit-dir\") pod \"oauth-openshift-d9b6b47f9-85srg\" (UID: \"8ae7ed78-1761-4974-977a-1bc16c87bd91\") " pod="openshift-authentication/oauth-openshift-d9b6b47f9-85srg"
Mar 08 03:57:06.101379 master-0 kubenswrapper[18592]: E0308 03:57:06.101363 18592 secret.go:189] Couldn't get secret openshift-authentication/v4-0-config-system-session: secret "v4-0-config-system-session" not found
Mar 08 03:57:06.101452 master-0 kubenswrapper[18592]: E0308 03:57:06.101420 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-system-session podName:8ae7ed78-1761-4974-977a-1bc16c87bd91 nodeName:}" failed. No retries permitted until 2026-03-08 03:57:06.601400877 +0000 UTC m=+238.700155267 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "v4-0-config-system-session" (UniqueName: "kubernetes.io/secret/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-system-session") pod "oauth-openshift-d9b6b47f9-85srg" (UID: "8ae7ed78-1761-4974-977a-1bc16c87bd91") : secret "v4-0-config-system-session" not found
Mar 08 03:57:06.101520 master-0 kubenswrapper[18592]: I0308 03:57:06.101487 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8ae7ed78-1761-4974-977a-1bc16c87bd91-audit-policies\") pod \"oauth-openshift-d9b6b47f9-85srg\" (UID: \"8ae7ed78-1761-4974-977a-1bc16c87bd91\") " pod="openshift-authentication/oauth-openshift-d9b6b47f9-85srg"
Mar 08 03:57:06.104436 master-0 kubenswrapper[18592]: I0308 03:57:06.104379 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-user-template-error\") pod \"oauth-openshift-d9b6b47f9-85srg\" (UID: \"8ae7ed78-1761-4974-977a-1bc16c87bd91\") " pod="openshift-authentication/oauth-openshift-d9b6b47f9-85srg"
Mar 08 03:57:06.105399 master-0 kubenswrapper[18592]: I0308 03:57:06.105340 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-system-serving-cert\") pod \"oauth-openshift-d9b6b47f9-85srg\" (UID: \"8ae7ed78-1761-4974-977a-1bc16c87bd91\") " pod="openshift-authentication/oauth-openshift-d9b6b47f9-85srg"
Mar 08 03:57:06.105509 master-0 kubenswrapper[18592]: I0308 03:57:06.105466 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-user-template-login\") pod \"oauth-openshift-d9b6b47f9-85srg\" (UID: \"8ae7ed78-1761-4974-977a-1bc16c87bd91\") " pod="openshift-authentication/oauth-openshift-d9b6b47f9-85srg"
Mar 08 03:57:06.106413 master-0 kubenswrapper[18592]: I0308 03:57:06.106352 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-d9b6b47f9-85srg\" (UID: \"8ae7ed78-1761-4974-977a-1bc16c87bd91\") " pod="openshift-authentication/oauth-openshift-d9b6b47f9-85srg"
Mar 08 03:57:06.106731 master-0 kubenswrapper[18592]: I0308 03:57:06.106694 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-system-router-certs\") pod \"oauth-openshift-d9b6b47f9-85srg\" (UID: \"8ae7ed78-1761-4974-977a-1bc16c87bd91\") " 
pod="openshift-authentication/oauth-openshift-d9b6b47f9-85srg" Mar 08 03:57:06.111757 master-0 kubenswrapper[18592]: I0308 03:57:06.111704 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-d9b6b47f9-85srg\" (UID: \"8ae7ed78-1761-4974-977a-1bc16c87bd91\") " pod="openshift-authentication/oauth-openshift-d9b6b47f9-85srg" Mar 08 03:57:06.118531 master-0 kubenswrapper[18592]: I0308 03:57:06.118488 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7skln\" (UniqueName: \"kubernetes.io/projected/8ae7ed78-1761-4974-977a-1bc16c87bd91-kube-api-access-7skln\") pod \"oauth-openshift-d9b6b47f9-85srg\" (UID: \"8ae7ed78-1761-4974-977a-1bc16c87bd91\") " pod="openshift-authentication/oauth-openshift-d9b6b47f9-85srg" Mar 08 03:57:06.614386 master-0 kubenswrapper[18592]: I0308 03:57:06.614311 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-system-session\") pod \"oauth-openshift-d9b6b47f9-85srg\" (UID: \"8ae7ed78-1761-4974-977a-1bc16c87bd91\") " pod="openshift-authentication/oauth-openshift-d9b6b47f9-85srg" Mar 08 03:57:06.615130 master-0 kubenswrapper[18592]: E0308 03:57:06.614596 18592 secret.go:189] Couldn't get secret openshift-authentication/v4-0-config-system-session: secret "v4-0-config-system-session" not found Mar 08 03:57:06.615422 master-0 kubenswrapper[18592]: E0308 03:57:06.615374 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-system-session podName:8ae7ed78-1761-4974-977a-1bc16c87bd91 nodeName:}" failed. 
No retries permitted until 2026-03-08 03:57:07.615352245 +0000 UTC m=+239.714106605 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "v4-0-config-system-session" (UniqueName: "kubernetes.io/secret/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-system-session") pod "oauth-openshift-d9b6b47f9-85srg" (UID: "8ae7ed78-1761-4974-977a-1bc16c87bd91") : secret "v4-0-config-system-session" not found Mar 08 03:57:06.615650 master-0 kubenswrapper[18592]: I0308 03:57:06.615605 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-system-cliconfig\") pod \"oauth-openshift-d9b6b47f9-85srg\" (UID: \"8ae7ed78-1761-4974-977a-1bc16c87bd91\") " pod="openshift-authentication/oauth-openshift-d9b6b47f9-85srg" Mar 08 03:57:06.615754 master-0 kubenswrapper[18592]: E0308 03:57:06.615713 18592 configmap.go:193] Couldn't get configMap openshift-authentication/v4-0-config-system-cliconfig: configmap "v4-0-config-system-cliconfig" not found Mar 08 03:57:06.615934 master-0 kubenswrapper[18592]: E0308 03:57:06.615780 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-system-cliconfig podName:8ae7ed78-1761-4974-977a-1bc16c87bd91 nodeName:}" failed. No retries permitted until 2026-03-08 03:57:07.615769825 +0000 UTC m=+239.714524185 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "v4-0-config-system-cliconfig" (UniqueName: "kubernetes.io/configmap/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-system-cliconfig") pod "oauth-openshift-d9b6b47f9-85srg" (UID: "8ae7ed78-1761-4974-977a-1bc16c87bd91") : configmap "v4-0-config-system-cliconfig" not found Mar 08 03:57:07.397157 master-0 kubenswrapper[18592]: I0308 03:57:07.397083 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-56bbfd46b8-t2wdj"] Mar 08 03:57:07.398711 master-0 kubenswrapper[18592]: I0308 03:57:07.398671 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-56bbfd46b8-t2wdj" Mar 08 03:57:07.402788 master-0 kubenswrapper[18592]: I0308 03:57:07.402576 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-6m6zv" Mar 08 03:57:07.421746 master-0 kubenswrapper[18592]: I0308 03:57:07.421705 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-56bbfd46b8-t2wdj"] Mar 08 03:57:07.431392 master-0 kubenswrapper[18592]: I0308 03:57:07.431337 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7e85a392-3be2-43a2-ab45-cdc9fdf26555-webhook-certs\") pod \"multus-admission-controller-56bbfd46b8-t2wdj\" (UID: \"7e85a392-3be2-43a2-ab45-cdc9fdf26555\") " pod="openshift-multus/multus-admission-controller-56bbfd46b8-t2wdj" Mar 08 03:57:07.431659 master-0 kubenswrapper[18592]: I0308 03:57:07.431623 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcvt5\" (UniqueName: \"kubernetes.io/projected/7e85a392-3be2-43a2-ab45-cdc9fdf26555-kube-api-access-bcvt5\") pod \"multus-admission-controller-56bbfd46b8-t2wdj\" (UID: \"7e85a392-3be2-43a2-ab45-cdc9fdf26555\") " 
pod="openshift-multus/multus-admission-controller-56bbfd46b8-t2wdj" Mar 08 03:57:07.533417 master-0 kubenswrapper[18592]: I0308 03:57:07.533356 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7e85a392-3be2-43a2-ab45-cdc9fdf26555-webhook-certs\") pod \"multus-admission-controller-56bbfd46b8-t2wdj\" (UID: \"7e85a392-3be2-43a2-ab45-cdc9fdf26555\") " pod="openshift-multus/multus-admission-controller-56bbfd46b8-t2wdj" Mar 08 03:57:07.533620 master-0 kubenswrapper[18592]: I0308 03:57:07.533461 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcvt5\" (UniqueName: \"kubernetes.io/projected/7e85a392-3be2-43a2-ab45-cdc9fdf26555-kube-api-access-bcvt5\") pod \"multus-admission-controller-56bbfd46b8-t2wdj\" (UID: \"7e85a392-3be2-43a2-ab45-cdc9fdf26555\") " pod="openshift-multus/multus-admission-controller-56bbfd46b8-t2wdj" Mar 08 03:57:07.536621 master-0 kubenswrapper[18592]: I0308 03:57:07.536575 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7e85a392-3be2-43a2-ab45-cdc9fdf26555-webhook-certs\") pod \"multus-admission-controller-56bbfd46b8-t2wdj\" (UID: \"7e85a392-3be2-43a2-ab45-cdc9fdf26555\") " pod="openshift-multus/multus-admission-controller-56bbfd46b8-t2wdj" Mar 08 03:57:07.550958 master-0 kubenswrapper[18592]: I0308 03:57:07.550921 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcvt5\" (UniqueName: \"kubernetes.io/projected/7e85a392-3be2-43a2-ab45-cdc9fdf26555-kube-api-access-bcvt5\") pod \"multus-admission-controller-56bbfd46b8-t2wdj\" (UID: \"7e85a392-3be2-43a2-ab45-cdc9fdf26555\") " pod="openshift-multus/multus-admission-controller-56bbfd46b8-t2wdj" Mar 08 03:57:07.635253 master-0 kubenswrapper[18592]: I0308 03:57:07.635176 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-system-cliconfig\") pod \"oauth-openshift-d9b6b47f9-85srg\" (UID: \"8ae7ed78-1761-4974-977a-1bc16c87bd91\") " pod="openshift-authentication/oauth-openshift-d9b6b47f9-85srg" Mar 08 03:57:07.635253 master-0 kubenswrapper[18592]: I0308 03:57:07.635256 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-system-session\") pod \"oauth-openshift-d9b6b47f9-85srg\" (UID: \"8ae7ed78-1761-4974-977a-1bc16c87bd91\") " pod="openshift-authentication/oauth-openshift-d9b6b47f9-85srg" Mar 08 03:57:07.635796 master-0 kubenswrapper[18592]: E0308 03:57:07.635748 18592 configmap.go:193] Couldn't get configMap openshift-authentication/v4-0-config-system-cliconfig: configmap "v4-0-config-system-cliconfig" not found Mar 08 03:57:07.636135 master-0 kubenswrapper[18592]: E0308 03:57:07.636108 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-system-cliconfig podName:8ae7ed78-1761-4974-977a-1bc16c87bd91 nodeName:}" failed. No retries permitted until 2026-03-08 03:57:09.636077048 +0000 UTC m=+241.734831438 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "v4-0-config-system-cliconfig" (UniqueName: "kubernetes.io/configmap/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-system-cliconfig") pod "oauth-openshift-d9b6b47f9-85srg" (UID: "8ae7ed78-1761-4974-977a-1bc16c87bd91") : configmap "v4-0-config-system-cliconfig" not found Mar 08 03:57:07.638591 master-0 kubenswrapper[18592]: I0308 03:57:07.638525 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-system-session\") pod \"oauth-openshift-d9b6b47f9-85srg\" (UID: \"8ae7ed78-1761-4974-977a-1bc16c87bd91\") " pod="openshift-authentication/oauth-openshift-d9b6b47f9-85srg" Mar 08 03:57:07.721666 master-0 kubenswrapper[18592]: I0308 03:57:07.721606 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-56bbfd46b8-t2wdj" Mar 08 03:57:07.911193 master-0 kubenswrapper[18592]: E0308 03:57:07.910838 18592 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c25b8ce9fda56070aad9cad01b6e5b307e4cd5a99282419041c17435cb609eb5" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 08 03:57:07.913298 master-0 kubenswrapper[18592]: E0308 03:57:07.912246 18592 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c25b8ce9fda56070aad9cad01b6e5b307e4cd5a99282419041c17435cb609eb5" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 08 03:57:07.913476 master-0 kubenswrapper[18592]: E0308 03:57:07.913430 18592 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: 
container is stopping, stdout: , stderr: , exit code -1" containerID="c25b8ce9fda56070aad9cad01b6e5b307e4cd5a99282419041c17435cb609eb5" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 08 03:57:07.913532 master-0 kubenswrapper[18592]: E0308 03:57:07.913475 18592 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-59shv" podUID="ec90b25a-dd6c-43cc-bf66-4f32c7dd2c92" containerName="kube-multus-additional-cni-plugins" Mar 08 03:57:08.235264 master-0 kubenswrapper[18592]: I0308 03:57:08.235215 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-56bbfd46b8-t2wdj"] Mar 08 03:57:09.230135 master-0 kubenswrapper[18592]: I0308 03:57:09.230033 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-56bbfd46b8-t2wdj" event={"ID":"7e85a392-3be2-43a2-ab45-cdc9fdf26555","Type":"ContainerStarted","Data":"dd638703bc0e09819dabc538870a7969ee38b1fd1f139ef3335d2db3d408d97f"} Mar 08 03:57:09.230135 master-0 kubenswrapper[18592]: I0308 03:57:09.230109 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-56bbfd46b8-t2wdj" event={"ID":"7e85a392-3be2-43a2-ab45-cdc9fdf26555","Type":"ContainerStarted","Data":"667ce5c9f383bfaca69c155a8c45dea30123f5fd3372249e3090a4bcfa362341"} Mar 08 03:57:09.230135 master-0 kubenswrapper[18592]: I0308 03:57:09.230130 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-56bbfd46b8-t2wdj" event={"ID":"7e85a392-3be2-43a2-ab45-cdc9fdf26555","Type":"ContainerStarted","Data":"1be176d78762a74eef4a314407e501000b7543c84295bb437d43c152d36756ee"} Mar 08 03:57:09.264178 master-0 kubenswrapper[18592]: I0308 03:57:09.264049 18592 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-multus/multus-admission-controller-56bbfd46b8-t2wdj" podStartSLOduration=2.264013884 podStartE2EDuration="2.264013884s" podCreationTimestamp="2026-03-08 03:57:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:57:09.255294126 +0000 UTC m=+241.354048496" watchObservedRunningTime="2026-03-08 03:57:09.264013884 +0000 UTC m=+241.362768274" Mar 08 03:57:09.326530 master-0 kubenswrapper[18592]: I0308 03:57:09.326456 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/multus-admission-controller-8d675b596-j8pv6"] Mar 08 03:57:09.326898 master-0 kubenswrapper[18592]: I0308 03:57:09.326773 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/multus-admission-controller-8d675b596-j8pv6" podUID="1eb851be-f157-48ea-9a39-1361b68d2639" containerName="multus-admission-controller" containerID="cri-o://5b490ed7d49134874203db2b969a8e813e09c1e907556c2039144c5be0ec90cb" gracePeriod=30 Mar 08 03:57:09.327377 master-0 kubenswrapper[18592]: I0308 03:57:09.327329 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/multus-admission-controller-8d675b596-j8pv6" podUID="1eb851be-f157-48ea-9a39-1361b68d2639" containerName="kube-rbac-proxy" containerID="cri-o://b6a9bf08942a2e11233d13bcdf99f3b818825bc97027c58eaa8bb54d09fc4200" gracePeriod=30 Mar 08 03:57:09.677721 master-0 kubenswrapper[18592]: I0308 03:57:09.677548 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-system-cliconfig\") pod \"oauth-openshift-d9b6b47f9-85srg\" (UID: \"8ae7ed78-1761-4974-977a-1bc16c87bd91\") " pod="openshift-authentication/oauth-openshift-d9b6b47f9-85srg" Mar 08 03:57:09.678107 master-0 kubenswrapper[18592]: E0308 
03:57:09.678065 18592 configmap.go:193] Couldn't get configMap openshift-authentication/v4-0-config-system-cliconfig: configmap "v4-0-config-system-cliconfig" not found Mar 08 03:57:09.678222 master-0 kubenswrapper[18592]: E0308 03:57:09.678183 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-system-cliconfig podName:8ae7ed78-1761-4974-977a-1bc16c87bd91 nodeName:}" failed. No retries permitted until 2026-03-08 03:57:13.678148799 +0000 UTC m=+245.776903189 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "v4-0-config-system-cliconfig" (UniqueName: "kubernetes.io/configmap/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-system-cliconfig") pod "oauth-openshift-d9b6b47f9-85srg" (UID: "8ae7ed78-1761-4974-977a-1bc16c87bd91") : configmap "v4-0-config-system-cliconfig" not found Mar 08 03:57:10.242519 master-0 kubenswrapper[18592]: I0308 03:57:10.242442 18592 generic.go:334] "Generic (PLEG): container finished" podID="1eb851be-f157-48ea-9a39-1361b68d2639" containerID="b6a9bf08942a2e11233d13bcdf99f3b818825bc97027c58eaa8bb54d09fc4200" exitCode=0 Mar 08 03:57:10.243713 master-0 kubenswrapper[18592]: I0308 03:57:10.243625 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-8d675b596-j8pv6" event={"ID":"1eb851be-f157-48ea-9a39-1361b68d2639","Type":"ContainerDied","Data":"b6a9bf08942a2e11233d13bcdf99f3b818825bc97027c58eaa8bb54d09fc4200"} Mar 08 03:57:13.750282 master-0 kubenswrapper[18592]: I0308 03:57:13.750177 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-system-cliconfig\") pod \"oauth-openshift-d9b6b47f9-85srg\" (UID: \"8ae7ed78-1761-4974-977a-1bc16c87bd91\") " pod="openshift-authentication/oauth-openshift-d9b6b47f9-85srg" Mar 08 03:57:13.751478 master-0 
kubenswrapper[18592]: I0308 03:57:13.751433 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-system-cliconfig\") pod \"oauth-openshift-d9b6b47f9-85srg\" (UID: \"8ae7ed78-1761-4974-977a-1bc16c87bd91\") " pod="openshift-authentication/oauth-openshift-d9b6b47f9-85srg" Mar 08 03:57:14.036046 master-0 kubenswrapper[18592]: I0308 03:57:14.035879 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-5s872" Mar 08 03:57:14.043994 master-0 kubenswrapper[18592]: I0308 03:57:14.043931 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-d9b6b47f9-85srg" Mar 08 03:57:14.608494 master-0 kubenswrapper[18592]: I0308 03:57:14.608421 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-d9b6b47f9-85srg"] Mar 08 03:57:15.289048 master-0 kubenswrapper[18592]: I0308 03:57:15.288946 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-d9b6b47f9-85srg" event={"ID":"8ae7ed78-1761-4974-977a-1bc16c87bd91","Type":"ContainerStarted","Data":"51dc84185ae3a0e3c7461c9a48c0fb92df72f2c82283f36cd7e1f9243733cc12"} Mar 08 03:57:17.330723 master-0 kubenswrapper[18592]: I0308 03:57:17.330644 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-d9b6b47f9-85srg" event={"ID":"8ae7ed78-1761-4974-977a-1bc16c87bd91","Type":"ContainerStarted","Data":"83104ae7907744071db87e84bc9a9478d626ea2deb92206d324e81616d00d3f4"} Mar 08 03:57:17.335177 master-0 kubenswrapper[18592]: I0308 03:57:17.335125 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-d9b6b47f9-85srg" Mar 08 03:57:17.371259 master-0 kubenswrapper[18592]: I0308 
03:57:17.371141 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-d9b6b47f9-85srg" podStartSLOduration=10.149713726 podStartE2EDuration="12.371114017s" podCreationTimestamp="2026-03-08 03:57:05 +0000 UTC" firstStartedPulling="2026-03-08 03:57:14.607478394 +0000 UTC m=+246.706232794" lastFinishedPulling="2026-03-08 03:57:16.828878735 +0000 UTC m=+248.927633085" observedRunningTime="2026-03-08 03:57:17.362923371 +0000 UTC m=+249.461677741" watchObservedRunningTime="2026-03-08 03:57:17.371114017 +0000 UTC m=+249.469868407" Mar 08 03:57:17.491928 master-0 kubenswrapper[18592]: I0308 03:57:17.491495 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-d9b6b47f9-85srg" Mar 08 03:57:17.905471 master-0 kubenswrapper[18592]: E0308 03:57:17.905414 18592 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c25b8ce9fda56070aad9cad01b6e5b307e4cd5a99282419041c17435cb609eb5" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 08 03:57:17.910671 master-0 kubenswrapper[18592]: E0308 03:57:17.910620 18592 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c25b8ce9fda56070aad9cad01b6e5b307e4cd5a99282419041c17435cb609eb5" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 08 03:57:17.913360 master-0 kubenswrapper[18592]: E0308 03:57:17.913227 18592 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c25b8ce9fda56070aad9cad01b6e5b307e4cd5a99282419041c17435cb609eb5" cmd=["/bin/bash","-c","test -f 
/ready/ready"] Mar 08 03:57:17.913360 master-0 kubenswrapper[18592]: E0308 03:57:17.913306 18592 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-59shv" podUID="ec90b25a-dd6c-43cc-bf66-4f32c7dd2c92" containerName="kube-multus-additional-cni-plugins" Mar 08 03:57:17.933005 master-0 kubenswrapper[18592]: I0308 03:57:17.929164 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6999cc9685-kprrt"] Mar 08 03:57:17.933005 master-0 kubenswrapper[18592]: I0308 03:57:17.929459 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6999cc9685-kprrt" podUID="a6c4695c-da78-46b6-8f92-ca93c5ebb96b" containerName="controller-manager" containerID="cri-o://a7de4e022aef3ade6de907c6671db640f64c6ee4620c8dd001c2eff19b497698" gracePeriod=30 Mar 08 03:57:17.949614 master-0 kubenswrapper[18592]: I0308 03:57:17.949558 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cb98fbc8c-xnx9t"] Mar 08 03:57:17.949882 master-0 kubenswrapper[18592]: I0308 03:57:17.949793 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5cb98fbc8c-xnx9t" podUID="3f7d2cef-b17b-43ba-a222-9e6e8d8352e2" containerName="route-controller-manager" containerID="cri-o://435ec6619f140faf62f3c471feef1a5ed855198061a6c789c631830099974cc7" gracePeriod=30 Mar 08 03:57:18.338435 master-0 kubenswrapper[18592]: I0308 03:57:18.338088 18592 generic.go:334] "Generic (PLEG): container finished" podID="3f7d2cef-b17b-43ba-a222-9e6e8d8352e2" containerID="435ec6619f140faf62f3c471feef1a5ed855198061a6c789c631830099974cc7" exitCode=0 Mar 08 03:57:18.338435 master-0 
kubenswrapper[18592]: I0308 03:57:18.338178 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5cb98fbc8c-xnx9t" event={"ID":"3f7d2cef-b17b-43ba-a222-9e6e8d8352e2","Type":"ContainerDied","Data":"435ec6619f140faf62f3c471feef1a5ed855198061a6c789c631830099974cc7"} Mar 08 03:57:18.338435 master-0 kubenswrapper[18592]: I0308 03:57:18.338213 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5cb98fbc8c-xnx9t" event={"ID":"3f7d2cef-b17b-43ba-a222-9e6e8d8352e2","Type":"ContainerDied","Data":"c4262d5f7cd90d77070e291ffede65804485b27ce848841d5c9b49cfb475af2e"} Mar 08 03:57:18.338435 master-0 kubenswrapper[18592]: I0308 03:57:18.338229 18592 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4262d5f7cd90d77070e291ffede65804485b27ce848841d5c9b49cfb475af2e" Mar 08 03:57:18.341169 master-0 kubenswrapper[18592]: I0308 03:57:18.341142 18592 generic.go:334] "Generic (PLEG): container finished" podID="a6c4695c-da78-46b6-8f92-ca93c5ebb96b" containerID="a7de4e022aef3ade6de907c6671db640f64c6ee4620c8dd001c2eff19b497698" exitCode=0 Mar 08 03:57:18.341927 master-0 kubenswrapper[18592]: I0308 03:57:18.341893 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6999cc9685-kprrt" event={"ID":"a6c4695c-da78-46b6-8f92-ca93c5ebb96b","Type":"ContainerDied","Data":"a7de4e022aef3ade6de907c6671db640f64c6ee4620c8dd001c2eff19b497698"} Mar 08 03:57:18.402928 master-0 kubenswrapper[18592]: I0308 03:57:18.401691 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5cb98fbc8c-xnx9t" Mar 08 03:57:18.459633 master-0 kubenswrapper[18592]: I0308 03:57:18.459607 18592 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6999cc9685-kprrt" Mar 08 03:57:18.575729 master-0 kubenswrapper[18592]: I0308 03:57:18.575668 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a6c4695c-da78-46b6-8f92-ca93c5ebb96b-client-ca\") pod \"a6c4695c-da78-46b6-8f92-ca93c5ebb96b\" (UID: \"a6c4695c-da78-46b6-8f92-ca93c5ebb96b\") " Mar 08 03:57:18.576088 master-0 kubenswrapper[18592]: I0308 03:57:18.576035 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a6c4695c-da78-46b6-8f92-ca93c5ebb96b-serving-cert\") pod \"a6c4695c-da78-46b6-8f92-ca93c5ebb96b\" (UID: \"a6c4695c-da78-46b6-8f92-ca93c5ebb96b\") " Mar 08 03:57:18.576228 master-0 kubenswrapper[18592]: I0308 03:57:18.576207 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f7d2cef-b17b-43ba-a222-9e6e8d8352e2-config\") pod \"3f7d2cef-b17b-43ba-a222-9e6e8d8352e2\" (UID: \"3f7d2cef-b17b-43ba-a222-9e6e8d8352e2\") " Mar 08 03:57:18.576228 master-0 kubenswrapper[18592]: I0308 03:57:18.576214 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6c4695c-da78-46b6-8f92-ca93c5ebb96b-client-ca" (OuterVolumeSpecName: "client-ca") pod "a6c4695c-da78-46b6-8f92-ca93c5ebb96b" (UID: "a6c4695c-da78-46b6-8f92-ca93c5ebb96b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:57:18.576678 master-0 kubenswrapper[18592]: I0308 03:57:18.576637 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f7d2cef-b17b-43ba-a222-9e6e8d8352e2-config" (OuterVolumeSpecName: "config") pod "3f7d2cef-b17b-43ba-a222-9e6e8d8352e2" (UID: "3f7d2cef-b17b-43ba-a222-9e6e8d8352e2"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:57:18.576757 master-0 kubenswrapper[18592]: I0308 03:57:18.576681 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a6c4695c-da78-46b6-8f92-ca93c5ebb96b-proxy-ca-bundles\") pod \"a6c4695c-da78-46b6-8f92-ca93c5ebb96b\" (UID: \"a6c4695c-da78-46b6-8f92-ca93c5ebb96b\") " Mar 08 03:57:18.576757 master-0 kubenswrapper[18592]: I0308 03:57:18.576727 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6c4695c-da78-46b6-8f92-ca93c5ebb96b-config\") pod \"a6c4695c-da78-46b6-8f92-ca93c5ebb96b\" (UID: \"a6c4695c-da78-46b6-8f92-ca93c5ebb96b\") " Mar 08 03:57:18.576916 master-0 kubenswrapper[18592]: I0308 03:57:18.576768 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3f7d2cef-b17b-43ba-a222-9e6e8d8352e2-client-ca\") pod \"3f7d2cef-b17b-43ba-a222-9e6e8d8352e2\" (UID: \"3f7d2cef-b17b-43ba-a222-9e6e8d8352e2\") " Mar 08 03:57:18.577091 master-0 kubenswrapper[18592]: I0308 03:57:18.577046 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6c4695c-da78-46b6-8f92-ca93c5ebb96b-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "a6c4695c-da78-46b6-8f92-ca93c5ebb96b" (UID: "a6c4695c-da78-46b6-8f92-ca93c5ebb96b"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:57:18.577273 master-0 kubenswrapper[18592]: I0308 03:57:18.577234 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f7d2cef-b17b-43ba-a222-9e6e8d8352e2-client-ca" (OuterVolumeSpecName: "client-ca") pod "3f7d2cef-b17b-43ba-a222-9e6e8d8352e2" (UID: "3f7d2cef-b17b-43ba-a222-9e6e8d8352e2"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:57:18.577364 master-0 kubenswrapper[18592]: I0308 03:57:18.577301 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bd7d5\" (UniqueName: \"kubernetes.io/projected/a6c4695c-da78-46b6-8f92-ca93c5ebb96b-kube-api-access-bd7d5\") pod \"a6c4695c-da78-46b6-8f92-ca93c5ebb96b\" (UID: \"a6c4695c-da78-46b6-8f92-ca93c5ebb96b\") " Mar 08 03:57:18.577364 master-0 kubenswrapper[18592]: I0308 03:57:18.577325 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pb87l\" (UniqueName: \"kubernetes.io/projected/3f7d2cef-b17b-43ba-a222-9e6e8d8352e2-kube-api-access-pb87l\") pod \"3f7d2cef-b17b-43ba-a222-9e6e8d8352e2\" (UID: \"3f7d2cef-b17b-43ba-a222-9e6e8d8352e2\") " Mar 08 03:57:18.577629 master-0 kubenswrapper[18592]: I0308 03:57:18.577564 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6c4695c-da78-46b6-8f92-ca93c5ebb96b-config" (OuterVolumeSpecName: "config") pod "a6c4695c-da78-46b6-8f92-ca93c5ebb96b" (UID: "a6c4695c-da78-46b6-8f92-ca93c5ebb96b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:57:18.577705 master-0 kubenswrapper[18592]: I0308 03:57:18.577588 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f7d2cef-b17b-43ba-a222-9e6e8d8352e2-serving-cert\") pod \"3f7d2cef-b17b-43ba-a222-9e6e8d8352e2\" (UID: \"3f7d2cef-b17b-43ba-a222-9e6e8d8352e2\") " Mar 08 03:57:18.578366 master-0 kubenswrapper[18592]: I0308 03:57:18.578326 18592 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a6c4695c-da78-46b6-8f92-ca93c5ebb96b-client-ca\") on node \"master-0\" DevicePath \"\"" Mar 08 03:57:18.578366 master-0 kubenswrapper[18592]: I0308 03:57:18.578360 18592 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f7d2cef-b17b-43ba-a222-9e6e8d8352e2-config\") on node \"master-0\" DevicePath \"\"" Mar 08 03:57:18.578556 master-0 kubenswrapper[18592]: I0308 03:57:18.578374 18592 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a6c4695c-da78-46b6-8f92-ca93c5ebb96b-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\"" Mar 08 03:57:18.578556 master-0 kubenswrapper[18592]: I0308 03:57:18.578387 18592 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6c4695c-da78-46b6-8f92-ca93c5ebb96b-config\") on node \"master-0\" DevicePath \"\"" Mar 08 03:57:18.578556 master-0 kubenswrapper[18592]: I0308 03:57:18.578399 18592 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3f7d2cef-b17b-43ba-a222-9e6e8d8352e2-client-ca\") on node \"master-0\" DevicePath \"\"" Mar 08 03:57:18.578897 master-0 kubenswrapper[18592]: I0308 03:57:18.578854 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/a6c4695c-da78-46b6-8f92-ca93c5ebb96b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a6c4695c-da78-46b6-8f92-ca93c5ebb96b" (UID: "a6c4695c-da78-46b6-8f92-ca93c5ebb96b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:57:18.579466 master-0 kubenswrapper[18592]: I0308 03:57:18.579411 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f7d2cef-b17b-43ba-a222-9e6e8d8352e2-kube-api-access-pb87l" (OuterVolumeSpecName: "kube-api-access-pb87l") pod "3f7d2cef-b17b-43ba-a222-9e6e8d8352e2" (UID: "3f7d2cef-b17b-43ba-a222-9e6e8d8352e2"). InnerVolumeSpecName "kube-api-access-pb87l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:57:18.579567 master-0 kubenswrapper[18592]: I0308 03:57:18.579502 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6c4695c-da78-46b6-8f92-ca93c5ebb96b-kube-api-access-bd7d5" (OuterVolumeSpecName: "kube-api-access-bd7d5") pod "a6c4695c-da78-46b6-8f92-ca93c5ebb96b" (UID: "a6c4695c-da78-46b6-8f92-ca93c5ebb96b"). InnerVolumeSpecName "kube-api-access-bd7d5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:57:18.580528 master-0 kubenswrapper[18592]: I0308 03:57:18.580488 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f7d2cef-b17b-43ba-a222-9e6e8d8352e2-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3f7d2cef-b17b-43ba-a222-9e6e8d8352e2" (UID: "3f7d2cef-b17b-43ba-a222-9e6e8d8352e2"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:57:18.679670 master-0 kubenswrapper[18592]: I0308 03:57:18.679505 18592 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f7d2cef-b17b-43ba-a222-9e6e8d8352e2-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 08 03:57:18.679670 master-0 kubenswrapper[18592]: I0308 03:57:18.679542 18592 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a6c4695c-da78-46b6-8f92-ca93c5ebb96b-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 08 03:57:18.679670 master-0 kubenswrapper[18592]: I0308 03:57:18.679552 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bd7d5\" (UniqueName: \"kubernetes.io/projected/a6c4695c-da78-46b6-8f92-ca93c5ebb96b-kube-api-access-bd7d5\") on node \"master-0\" DevicePath \"\"" Mar 08 03:57:18.679670 master-0 kubenswrapper[18592]: I0308 03:57:18.679562 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pb87l\" (UniqueName: \"kubernetes.io/projected/3f7d2cef-b17b-43ba-a222-9e6e8d8352e2-kube-api-access-pb87l\") on node \"master-0\" DevicePath \"\"" Mar 08 03:57:19.272368 master-0 kubenswrapper[18592]: I0308 03:57:19.272136 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-64b4c7cbf8-n5l2s"] Mar 08 03:57:19.272921 master-0 kubenswrapper[18592]: E0308 03:57:19.272526 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f7d2cef-b17b-43ba-a222-9e6e8d8352e2" containerName="route-controller-manager" Mar 08 03:57:19.272921 master-0 kubenswrapper[18592]: I0308 03:57:19.272544 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f7d2cef-b17b-43ba-a222-9e6e8d8352e2" containerName="route-controller-manager" Mar 08 03:57:19.272921 master-0 kubenswrapper[18592]: E0308 03:57:19.272580 18592 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a6c4695c-da78-46b6-8f92-ca93c5ebb96b" containerName="controller-manager" Mar 08 03:57:19.272921 master-0 kubenswrapper[18592]: I0308 03:57:19.272586 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6c4695c-da78-46b6-8f92-ca93c5ebb96b" containerName="controller-manager" Mar 08 03:57:19.272921 master-0 kubenswrapper[18592]: I0308 03:57:19.272759 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f7d2cef-b17b-43ba-a222-9e6e8d8352e2" containerName="route-controller-manager" Mar 08 03:57:19.272921 master-0 kubenswrapper[18592]: I0308 03:57:19.272809 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6c4695c-da78-46b6-8f92-ca93c5ebb96b" containerName="controller-manager" Mar 08 03:57:19.273632 master-0 kubenswrapper[18592]: I0308 03:57:19.273523 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-64b4c7cbf8-n5l2s" Mar 08 03:57:19.287576 master-0 kubenswrapper[18592]: I0308 03:57:19.285417 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-595b48c567-w5hbb"] Mar 08 03:57:19.293088 master-0 kubenswrapper[18592]: I0308 03:57:19.290466 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-595b48c567-w5hbb" Mar 08 03:57:19.296601 master-0 kubenswrapper[18592]: I0308 03:57:19.296287 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-595b48c567-w5hbb"] Mar 08 03:57:19.332751 master-0 kubenswrapper[18592]: I0308 03:57:19.303020 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-64b4c7cbf8-n5l2s"] Mar 08 03:57:19.348928 master-0 kubenswrapper[18592]: I0308 03:57:19.348879 18592 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5cb98fbc8c-xnx9t" Mar 08 03:57:19.348928 master-0 kubenswrapper[18592]: I0308 03:57:19.348899 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6999cc9685-kprrt" event={"ID":"a6c4695c-da78-46b6-8f92-ca93c5ebb96b","Type":"ContainerDied","Data":"4025d1a6cb66b179d453ec8f3c19442902ea80a04085eeeda4fa9c48c774a80e"} Mar 08 03:57:19.349616 master-0 kubenswrapper[18592]: I0308 03:57:19.348945 18592 scope.go:117] "RemoveContainer" containerID="a7de4e022aef3ade6de907c6671db640f64c6ee4620c8dd001c2eff19b497698" Mar 08 03:57:19.349616 master-0 kubenswrapper[18592]: I0308 03:57:19.349018 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6999cc9685-kprrt" Mar 08 03:57:19.394501 master-0 kubenswrapper[18592]: I0308 03:57:19.394385 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae4cf373-137f-4ed4-a276-2109a68f3616-serving-cert\") pod \"route-controller-manager-595b48c567-w5hbb\" (UID: \"ae4cf373-137f-4ed4-a276-2109a68f3616\") " pod="openshift-route-controller-manager/route-controller-manager-595b48c567-w5hbb" Mar 08 03:57:19.394501 master-0 kubenswrapper[18592]: I0308 03:57:19.394448 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-594ld\" (UniqueName: \"kubernetes.io/projected/738cecb2-995e-4486-ab1e-05af4df24de0-kube-api-access-594ld\") pod \"controller-manager-64b4c7cbf8-n5l2s\" (UID: \"738cecb2-995e-4486-ab1e-05af4df24de0\") " pod="openshift-controller-manager/controller-manager-64b4c7cbf8-n5l2s" Mar 08 03:57:19.394501 master-0 kubenswrapper[18592]: I0308 03:57:19.394485 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/738cecb2-995e-4486-ab1e-05af4df24de0-client-ca\") pod \"controller-manager-64b4c7cbf8-n5l2s\" (UID: \"738cecb2-995e-4486-ab1e-05af4df24de0\") " pod="openshift-controller-manager/controller-manager-64b4c7cbf8-n5l2s" Mar 08 03:57:19.394782 master-0 kubenswrapper[18592]: I0308 03:57:19.394655 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nnvb\" (UniqueName: \"kubernetes.io/projected/ae4cf373-137f-4ed4-a276-2109a68f3616-kube-api-access-4nnvb\") pod \"route-controller-manager-595b48c567-w5hbb\" (UID: \"ae4cf373-137f-4ed4-a276-2109a68f3616\") " pod="openshift-route-controller-manager/route-controller-manager-595b48c567-w5hbb" Mar 08 03:57:19.395007 master-0 kubenswrapper[18592]: I0308 03:57:19.394966 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae4cf373-137f-4ed4-a276-2109a68f3616-config\") pod \"route-controller-manager-595b48c567-w5hbb\" (UID: \"ae4cf373-137f-4ed4-a276-2109a68f3616\") " pod="openshift-route-controller-manager/route-controller-manager-595b48c567-w5hbb" Mar 08 03:57:19.395448 master-0 kubenswrapper[18592]: I0308 03:57:19.395411 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ae4cf373-137f-4ed4-a276-2109a68f3616-client-ca\") pod \"route-controller-manager-595b48c567-w5hbb\" (UID: \"ae4cf373-137f-4ed4-a276-2109a68f3616\") " pod="openshift-route-controller-manager/route-controller-manager-595b48c567-w5hbb" Mar 08 03:57:19.395969 master-0 kubenswrapper[18592]: I0308 03:57:19.395930 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/738cecb2-995e-4486-ab1e-05af4df24de0-serving-cert\") pod 
\"controller-manager-64b4c7cbf8-n5l2s\" (UID: \"738cecb2-995e-4486-ab1e-05af4df24de0\") " pod="openshift-controller-manager/controller-manager-64b4c7cbf8-n5l2s" Mar 08 03:57:19.396047 master-0 kubenswrapper[18592]: I0308 03:57:19.395995 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/738cecb2-995e-4486-ab1e-05af4df24de0-config\") pod \"controller-manager-64b4c7cbf8-n5l2s\" (UID: \"738cecb2-995e-4486-ab1e-05af4df24de0\") " pod="openshift-controller-manager/controller-manager-64b4c7cbf8-n5l2s" Mar 08 03:57:19.396117 master-0 kubenswrapper[18592]: I0308 03:57:19.396087 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/738cecb2-995e-4486-ab1e-05af4df24de0-proxy-ca-bundles\") pod \"controller-manager-64b4c7cbf8-n5l2s\" (UID: \"738cecb2-995e-4486-ab1e-05af4df24de0\") " pod="openshift-controller-manager/controller-manager-64b4c7cbf8-n5l2s" Mar 08 03:57:19.430926 master-0 kubenswrapper[18592]: I0308 03:57:19.430726 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cb98fbc8c-xnx9t"] Mar 08 03:57:19.440809 master-0 kubenswrapper[18592]: I0308 03:57:19.440379 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cb98fbc8c-xnx9t"] Mar 08 03:57:19.448219 master-0 kubenswrapper[18592]: I0308 03:57:19.448183 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6999cc9685-kprrt"] Mar 08 03:57:19.455553 master-0 kubenswrapper[18592]: I0308 03:57:19.455482 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6999cc9685-kprrt"] Mar 08 03:57:19.497587 master-0 kubenswrapper[18592]: I0308 03:57:19.497520 18592 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ae4cf373-137f-4ed4-a276-2109a68f3616-client-ca\") pod \"route-controller-manager-595b48c567-w5hbb\" (UID: \"ae4cf373-137f-4ed4-a276-2109a68f3616\") " pod="openshift-route-controller-manager/route-controller-manager-595b48c567-w5hbb" Mar 08 03:57:19.497867 master-0 kubenswrapper[18592]: I0308 03:57:19.497681 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/738cecb2-995e-4486-ab1e-05af4df24de0-serving-cert\") pod \"controller-manager-64b4c7cbf8-n5l2s\" (UID: \"738cecb2-995e-4486-ab1e-05af4df24de0\") " pod="openshift-controller-manager/controller-manager-64b4c7cbf8-n5l2s" Mar 08 03:57:19.497867 master-0 kubenswrapper[18592]: I0308 03:57:19.497704 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/738cecb2-995e-4486-ab1e-05af4df24de0-config\") pod \"controller-manager-64b4c7cbf8-n5l2s\" (UID: \"738cecb2-995e-4486-ab1e-05af4df24de0\") " pod="openshift-controller-manager/controller-manager-64b4c7cbf8-n5l2s" Mar 08 03:57:19.497867 master-0 kubenswrapper[18592]: I0308 03:57:19.497741 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/738cecb2-995e-4486-ab1e-05af4df24de0-proxy-ca-bundles\") pod \"controller-manager-64b4c7cbf8-n5l2s\" (UID: \"738cecb2-995e-4486-ab1e-05af4df24de0\") " pod="openshift-controller-manager/controller-manager-64b4c7cbf8-n5l2s" Mar 08 03:57:19.497867 master-0 kubenswrapper[18592]: I0308 03:57:19.497794 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae4cf373-137f-4ed4-a276-2109a68f3616-serving-cert\") pod \"route-controller-manager-595b48c567-w5hbb\" (UID: 
\"ae4cf373-137f-4ed4-a276-2109a68f3616\") " pod="openshift-route-controller-manager/route-controller-manager-595b48c567-w5hbb" Mar 08 03:57:19.497867 master-0 kubenswrapper[18592]: I0308 03:57:19.497838 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-594ld\" (UniqueName: \"kubernetes.io/projected/738cecb2-995e-4486-ab1e-05af4df24de0-kube-api-access-594ld\") pod \"controller-manager-64b4c7cbf8-n5l2s\" (UID: \"738cecb2-995e-4486-ab1e-05af4df24de0\") " pod="openshift-controller-manager/controller-manager-64b4c7cbf8-n5l2s" Mar 08 03:57:19.498212 master-0 kubenswrapper[18592]: I0308 03:57:19.497880 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/738cecb2-995e-4486-ab1e-05af4df24de0-client-ca\") pod \"controller-manager-64b4c7cbf8-n5l2s\" (UID: \"738cecb2-995e-4486-ab1e-05af4df24de0\") " pod="openshift-controller-manager/controller-manager-64b4c7cbf8-n5l2s" Mar 08 03:57:19.498212 master-0 kubenswrapper[18592]: I0308 03:57:19.497899 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae4cf373-137f-4ed4-a276-2109a68f3616-config\") pod \"route-controller-manager-595b48c567-w5hbb\" (UID: \"ae4cf373-137f-4ed4-a276-2109a68f3616\") " pod="openshift-route-controller-manager/route-controller-manager-595b48c567-w5hbb" Mar 08 03:57:19.498212 master-0 kubenswrapper[18592]: I0308 03:57:19.497919 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nnvb\" (UniqueName: \"kubernetes.io/projected/ae4cf373-137f-4ed4-a276-2109a68f3616-kube-api-access-4nnvb\") pod \"route-controller-manager-595b48c567-w5hbb\" (UID: \"ae4cf373-137f-4ed4-a276-2109a68f3616\") " pod="openshift-route-controller-manager/route-controller-manager-595b48c567-w5hbb" Mar 08 03:57:19.502944 master-0 kubenswrapper[18592]: I0308 03:57:19.499420 18592 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ae4cf373-137f-4ed4-a276-2109a68f3616-client-ca\") pod \"route-controller-manager-595b48c567-w5hbb\" (UID: \"ae4cf373-137f-4ed4-a276-2109a68f3616\") " pod="openshift-route-controller-manager/route-controller-manager-595b48c567-w5hbb" Mar 08 03:57:19.502944 master-0 kubenswrapper[18592]: I0308 03:57:19.500620 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/738cecb2-995e-4486-ab1e-05af4df24de0-client-ca\") pod \"controller-manager-64b4c7cbf8-n5l2s\" (UID: \"738cecb2-995e-4486-ab1e-05af4df24de0\") " pod="openshift-controller-manager/controller-manager-64b4c7cbf8-n5l2s" Mar 08 03:57:19.502944 master-0 kubenswrapper[18592]: I0308 03:57:19.501502 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/738cecb2-995e-4486-ab1e-05af4df24de0-proxy-ca-bundles\") pod \"controller-manager-64b4c7cbf8-n5l2s\" (UID: \"738cecb2-995e-4486-ab1e-05af4df24de0\") " pod="openshift-controller-manager/controller-manager-64b4c7cbf8-n5l2s" Mar 08 03:57:19.502944 master-0 kubenswrapper[18592]: I0308 03:57:19.502603 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae4cf373-137f-4ed4-a276-2109a68f3616-config\") pod \"route-controller-manager-595b48c567-w5hbb\" (UID: \"ae4cf373-137f-4ed4-a276-2109a68f3616\") " pod="openshift-route-controller-manager/route-controller-manager-595b48c567-w5hbb" Mar 08 03:57:19.506554 master-0 kubenswrapper[18592]: I0308 03:57:19.506227 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/738cecb2-995e-4486-ab1e-05af4df24de0-config\") pod \"controller-manager-64b4c7cbf8-n5l2s\" (UID: \"738cecb2-995e-4486-ab1e-05af4df24de0\") " 
pod="openshift-controller-manager/controller-manager-64b4c7cbf8-n5l2s" Mar 08 03:57:19.507121 master-0 kubenswrapper[18592]: I0308 03:57:19.506768 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae4cf373-137f-4ed4-a276-2109a68f3616-serving-cert\") pod \"route-controller-manager-595b48c567-w5hbb\" (UID: \"ae4cf373-137f-4ed4-a276-2109a68f3616\") " pod="openshift-route-controller-manager/route-controller-manager-595b48c567-w5hbb" Mar 08 03:57:19.511010 master-0 kubenswrapper[18592]: I0308 03:57:19.510815 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/738cecb2-995e-4486-ab1e-05af4df24de0-serving-cert\") pod \"controller-manager-64b4c7cbf8-n5l2s\" (UID: \"738cecb2-995e-4486-ab1e-05af4df24de0\") " pod="openshift-controller-manager/controller-manager-64b4c7cbf8-n5l2s" Mar 08 03:57:19.522705 master-0 kubenswrapper[18592]: I0308 03:57:19.522614 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-594ld\" (UniqueName: \"kubernetes.io/projected/738cecb2-995e-4486-ab1e-05af4df24de0-kube-api-access-594ld\") pod \"controller-manager-64b4c7cbf8-n5l2s\" (UID: \"738cecb2-995e-4486-ab1e-05af4df24de0\") " pod="openshift-controller-manager/controller-manager-64b4c7cbf8-n5l2s" Mar 08 03:57:19.532912 master-0 kubenswrapper[18592]: I0308 03:57:19.532537 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nnvb\" (UniqueName: \"kubernetes.io/projected/ae4cf373-137f-4ed4-a276-2109a68f3616-kube-api-access-4nnvb\") pod \"route-controller-manager-595b48c567-w5hbb\" (UID: \"ae4cf373-137f-4ed4-a276-2109a68f3616\") " pod="openshift-route-controller-manager/route-controller-manager-595b48c567-w5hbb" Mar 08 03:57:19.667220 master-0 kubenswrapper[18592]: I0308 03:57:19.667143 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-64b4c7cbf8-n5l2s" Mar 08 03:57:19.686097 master-0 kubenswrapper[18592]: I0308 03:57:19.686017 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-595b48c567-w5hbb" Mar 08 03:57:20.152191 master-0 kubenswrapper[18592]: I0308 03:57:20.152052 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f7d2cef-b17b-43ba-a222-9e6e8d8352e2" path="/var/lib/kubelet/pods/3f7d2cef-b17b-43ba-a222-9e6e8d8352e2/volumes" Mar 08 03:57:20.152853 master-0 kubenswrapper[18592]: I0308 03:57:20.152793 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6c4695c-da78-46b6-8f92-ca93c5ebb96b" path="/var/lib/kubelet/pods/a6c4695c-da78-46b6-8f92-ca93c5ebb96b/volumes" Mar 08 03:57:20.202251 master-0 kubenswrapper[18592]: I0308 03:57:20.202205 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-64b4c7cbf8-n5l2s"] Mar 08 03:57:20.207557 master-0 kubenswrapper[18592]: W0308 03:57:20.207498 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod738cecb2_995e_4486_ab1e_05af4df24de0.slice/crio-3147b51b4fb03c9f9abb2dd7a18758880be80d4b9f0c59b287d7743bcda2c0b1 WatchSource:0}: Error finding container 3147b51b4fb03c9f9abb2dd7a18758880be80d4b9f0c59b287d7743bcda2c0b1: Status 404 returned error can't find the container with id 3147b51b4fb03c9f9abb2dd7a18758880be80d4b9f0c59b287d7743bcda2c0b1 Mar 08 03:57:20.266478 master-0 kubenswrapper[18592]: I0308 03:57:20.266378 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-595b48c567-w5hbb"] Mar 08 03:57:20.272997 master-0 kubenswrapper[18592]: W0308 03:57:20.272930 18592 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae4cf373_137f_4ed4_a276_2109a68f3616.slice/crio-7df7563610286c707019ae54d58cb8ad8cc0f8eda2df84850232e4b228131efe WatchSource:0}: Error finding container 7df7563610286c707019ae54d58cb8ad8cc0f8eda2df84850232e4b228131efe: Status 404 returned error can't find the container with id 7df7563610286c707019ae54d58cb8ad8cc0f8eda2df84850232e4b228131efe Mar 08 03:57:20.357183 master-0 kubenswrapper[18592]: I0308 03:57:20.357111 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-595b48c567-w5hbb" event={"ID":"ae4cf373-137f-4ed4-a276-2109a68f3616","Type":"ContainerStarted","Data":"7df7563610286c707019ae54d58cb8ad8cc0f8eda2df84850232e4b228131efe"} Mar 08 03:57:20.358291 master-0 kubenswrapper[18592]: I0308 03:57:20.358235 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64b4c7cbf8-n5l2s" event={"ID":"738cecb2-995e-4486-ab1e-05af4df24de0","Type":"ContainerStarted","Data":"3147b51b4fb03c9f9abb2dd7a18758880be80d4b9f0c59b287d7743bcda2c0b1"} Mar 08 03:57:21.369386 master-0 kubenswrapper[18592]: I0308 03:57:21.369311 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-595b48c567-w5hbb" event={"ID":"ae4cf373-137f-4ed4-a276-2109a68f3616","Type":"ContainerStarted","Data":"62b7489256e68bf5d0a04c5a110369ae1f66f911ae5ef4890aafd69da4f57420"} Mar 08 03:57:21.369942 master-0 kubenswrapper[18592]: I0308 03:57:21.369774 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-595b48c567-w5hbb" Mar 08 03:57:21.371908 master-0 kubenswrapper[18592]: I0308 03:57:21.371875 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64b4c7cbf8-n5l2s" 
event={"ID":"738cecb2-995e-4486-ab1e-05af4df24de0","Type":"ContainerStarted","Data":"faeb18f642a82417511abd4d045c306f37d81c662abebaeb263059eebc3803d7"} Mar 08 03:57:21.373196 master-0 kubenswrapper[18592]: I0308 03:57:21.372095 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-64b4c7cbf8-n5l2s" Mar 08 03:57:21.376860 master-0 kubenswrapper[18592]: I0308 03:57:21.376803 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-595b48c567-w5hbb" Mar 08 03:57:21.377298 master-0 kubenswrapper[18592]: I0308 03:57:21.377274 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-64b4c7cbf8-n5l2s" Mar 08 03:57:21.396537 master-0 kubenswrapper[18592]: I0308 03:57:21.396456 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-595b48c567-w5hbb" podStartSLOduration=3.396432725 podStartE2EDuration="3.396432725s" podCreationTimestamp="2026-03-08 03:57:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:57:21.395722798 +0000 UTC m=+253.494477188" watchObservedRunningTime="2026-03-08 03:57:21.396432725 +0000 UTC m=+253.495187075" Mar 08 03:57:21.458835 master-0 kubenswrapper[18592]: I0308 03:57:21.458695 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-64b4c7cbf8-n5l2s" podStartSLOduration=4.45866205 podStartE2EDuration="4.45866205s" podCreationTimestamp="2026-03-08 03:57:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:57:21.421007732 +0000 UTC m=+253.519762082" 
watchObservedRunningTime="2026-03-08 03:57:21.45866205 +0000 UTC m=+253.557416430" Mar 08 03:57:24.561816 master-0 kubenswrapper[18592]: I0308 03:57:24.561747 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-5cbd49d755-2mptg"] Mar 08 03:57:24.562930 master-0 kubenswrapper[18592]: I0308 03:57:24.562688 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cbd49d755-2mptg" Mar 08 03:57:24.570017 master-0 kubenswrapper[18592]: I0308 03:57:24.569950 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 08 03:57:24.570209 master-0 kubenswrapper[18592]: I0308 03:57:24.569948 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 08 03:57:24.584288 master-0 kubenswrapper[18592]: I0308 03:57:24.584242 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cbd49d755-2mptg"] Mar 08 03:57:24.714808 master-0 kubenswrapper[18592]: I0308 03:57:24.714646 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/3e0c5b08-ed06-4356-bb62-7f3258f763e9-nginx-conf\") pod \"networking-console-plugin-5cbd49d755-2mptg\" (UID: \"3e0c5b08-ed06-4356-bb62-7f3258f763e9\") " pod="openshift-network-console/networking-console-plugin-5cbd49d755-2mptg" Mar 08 03:57:24.714808 master-0 kubenswrapper[18592]: I0308 03:57:24.714863 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3e0c5b08-ed06-4356-bb62-7f3258f763e9-networking-console-plugin-cert\") pod \"networking-console-plugin-5cbd49d755-2mptg\" (UID: 
\"3e0c5b08-ed06-4356-bb62-7f3258f763e9\") " pod="openshift-network-console/networking-console-plugin-5cbd49d755-2mptg" Mar 08 03:57:24.816921 master-0 kubenswrapper[18592]: I0308 03:57:24.816727 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/3e0c5b08-ed06-4356-bb62-7f3258f763e9-nginx-conf\") pod \"networking-console-plugin-5cbd49d755-2mptg\" (UID: \"3e0c5b08-ed06-4356-bb62-7f3258f763e9\") " pod="openshift-network-console/networking-console-plugin-5cbd49d755-2mptg" Mar 08 03:57:24.816921 master-0 kubenswrapper[18592]: I0308 03:57:24.816818 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3e0c5b08-ed06-4356-bb62-7f3258f763e9-networking-console-plugin-cert\") pod \"networking-console-plugin-5cbd49d755-2mptg\" (UID: \"3e0c5b08-ed06-4356-bb62-7f3258f763e9\") " pod="openshift-network-console/networking-console-plugin-5cbd49d755-2mptg" Mar 08 03:57:24.817272 master-0 kubenswrapper[18592]: E0308 03:57:24.817016 18592 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Mar 08 03:57:24.817272 master-0 kubenswrapper[18592]: E0308 03:57:24.817080 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e0c5b08-ed06-4356-bb62-7f3258f763e9-networking-console-plugin-cert podName:3e0c5b08-ed06-4356-bb62-7f3258f763e9 nodeName:}" failed. No retries permitted until 2026-03-08 03:57:25.31706318 +0000 UTC m=+257.415817540 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/3e0c5b08-ed06-4356-bb62-7f3258f763e9-networking-console-plugin-cert") pod "networking-console-plugin-5cbd49d755-2mptg" (UID: "3e0c5b08-ed06-4356-bb62-7f3258f763e9") : secret "networking-console-plugin-cert" not found Mar 08 03:57:24.817875 master-0 kubenswrapper[18592]: I0308 03:57:24.817786 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/3e0c5b08-ed06-4356-bb62-7f3258f763e9-nginx-conf\") pod \"networking-console-plugin-5cbd49d755-2mptg\" (UID: \"3e0c5b08-ed06-4356-bb62-7f3258f763e9\") " pod="openshift-network-console/networking-console-plugin-5cbd49d755-2mptg" Mar 08 03:57:25.324777 master-0 kubenswrapper[18592]: I0308 03:57:25.324136 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3e0c5b08-ed06-4356-bb62-7f3258f763e9-networking-console-plugin-cert\") pod \"networking-console-plugin-5cbd49d755-2mptg\" (UID: \"3e0c5b08-ed06-4356-bb62-7f3258f763e9\") " pod="openshift-network-console/networking-console-plugin-5cbd49d755-2mptg" Mar 08 03:57:25.328467 master-0 kubenswrapper[18592]: I0308 03:57:25.328400 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3e0c5b08-ed06-4356-bb62-7f3258f763e9-networking-console-plugin-cert\") pod \"networking-console-plugin-5cbd49d755-2mptg\" (UID: \"3e0c5b08-ed06-4356-bb62-7f3258f763e9\") " pod="openshift-network-console/networking-console-plugin-5cbd49d755-2mptg" Mar 08 03:57:25.499797 master-0 kubenswrapper[18592]: I0308 03:57:25.496278 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cbd49d755-2mptg" Mar 08 03:57:25.993166 master-0 kubenswrapper[18592]: I0308 03:57:25.993099 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cbd49d755-2mptg"] Mar 08 03:57:26.418319 master-0 kubenswrapper[18592]: I0308 03:57:26.418168 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cbd49d755-2mptg" event={"ID":"3e0c5b08-ed06-4356-bb62-7f3258f763e9","Type":"ContainerStarted","Data":"efd74d5f8d485ff0191df0b806b9838aab7bc2021309bd7a739fdcad01792ece"} Mar 08 03:57:27.906610 master-0 kubenswrapper[18592]: E0308 03:57:27.906450 18592 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c25b8ce9fda56070aad9cad01b6e5b307e4cd5a99282419041c17435cb609eb5" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 08 03:57:27.909212 master-0 kubenswrapper[18592]: E0308 03:57:27.909136 18592 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c25b8ce9fda56070aad9cad01b6e5b307e4cd5a99282419041c17435cb609eb5" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 08 03:57:27.911248 master-0 kubenswrapper[18592]: E0308 03:57:27.911028 18592 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c25b8ce9fda56070aad9cad01b6e5b307e4cd5a99282419041c17435cb609eb5" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 08 03:57:27.911248 master-0 kubenswrapper[18592]: E0308 03:57:27.911090 18592 prober.go:104] "Probe errored" err="rpc error: 
code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-59shv" podUID="ec90b25a-dd6c-43cc-bf66-4f32c7dd2c92" containerName="kube-multus-additional-cni-plugins" Mar 08 03:57:28.436061 master-0 kubenswrapper[18592]: I0308 03:57:28.435981 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cbd49d755-2mptg" event={"ID":"3e0c5b08-ed06-4356-bb62-7f3258f763e9","Type":"ContainerStarted","Data":"f9cf3fad705592e31fdb1264cc1d0aac9007db55fb3fc9889d372169232a2227"} Mar 08 03:57:28.465041 master-0 kubenswrapper[18592]: I0308 03:57:28.463301 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-5cbd49d755-2mptg" podStartSLOduration=2.986595083 podStartE2EDuration="4.463276079s" podCreationTimestamp="2026-03-08 03:57:24 +0000 UTC" firstStartedPulling="2026-03-08 03:57:26.00275533 +0000 UTC m=+258.101509690" lastFinishedPulling="2026-03-08 03:57:27.479436326 +0000 UTC m=+259.578190686" observedRunningTime="2026-03-08 03:57:28.458425593 +0000 UTC m=+260.557179973" watchObservedRunningTime="2026-03-08 03:57:28.463276079 +0000 UTC m=+260.562030469" Mar 08 03:57:29.291688 master-0 kubenswrapper[18592]: E0308 03:57:29.291613 18592 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[trusted-ca], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-console-operator/console-operator-6c7fb6b958-mr9k6" podUID="48ab3c8e-a2bd-4380-9e8d-a41d515a989d" Mar 08 03:57:29.446868 master-0 kubenswrapper[18592]: I0308 03:57:29.446740 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-6c7fb6b958-mr9k6" Mar 08 03:57:31.328251 master-0 kubenswrapper[18592]: I0308 03:57:31.328145 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-59shv_ec90b25a-dd6c-43cc-bf66-4f32c7dd2c92/kube-multus-additional-cni-plugins/0.log" Mar 08 03:57:31.329944 master-0 kubenswrapper[18592]: I0308 03:57:31.329431 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-59shv" Mar 08 03:57:31.333629 master-0 kubenswrapper[18592]: I0308 03:57:31.333570 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ec90b25a-dd6c-43cc-bf66-4f32c7dd2c92-tuning-conf-dir\") pod \"ec90b25a-dd6c-43cc-bf66-4f32c7dd2c92\" (UID: \"ec90b25a-dd6c-43cc-bf66-4f32c7dd2c92\") " Mar 08 03:57:31.333806 master-0 kubenswrapper[18592]: I0308 03:57:31.333660 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ec90b25a-dd6c-43cc-bf66-4f32c7dd2c92-cni-sysctl-allowlist\") pod \"ec90b25a-dd6c-43cc-bf66-4f32c7dd2c92\" (UID: \"ec90b25a-dd6c-43cc-bf66-4f32c7dd2c92\") " Mar 08 03:57:31.333806 master-0 kubenswrapper[18592]: I0308 03:57:31.333727 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jpbws\" (UniqueName: \"kubernetes.io/projected/ec90b25a-dd6c-43cc-bf66-4f32c7dd2c92-kube-api-access-jpbws\") pod \"ec90b25a-dd6c-43cc-bf66-4f32c7dd2c92\" (UID: \"ec90b25a-dd6c-43cc-bf66-4f32c7dd2c92\") " Mar 08 03:57:31.333806 master-0 kubenswrapper[18592]: I0308 03:57:31.333731 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ec90b25a-dd6c-43cc-bf66-4f32c7dd2c92-tuning-conf-dir" (OuterVolumeSpecName: "tuning-conf-dir") pod 
"ec90b25a-dd6c-43cc-bf66-4f32c7dd2c92" (UID: "ec90b25a-dd6c-43cc-bf66-4f32c7dd2c92"). InnerVolumeSpecName "tuning-conf-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:57:31.333806 master-0 kubenswrapper[18592]: I0308 03:57:31.333809 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/ec90b25a-dd6c-43cc-bf66-4f32c7dd2c92-ready\") pod \"ec90b25a-dd6c-43cc-bf66-4f32c7dd2c92\" (UID: \"ec90b25a-dd6c-43cc-bf66-4f32c7dd2c92\") " Mar 08 03:57:31.334569 master-0 kubenswrapper[18592]: I0308 03:57:31.334448 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec90b25a-dd6c-43cc-bf66-4f32c7dd2c92-ready" (OuterVolumeSpecName: "ready") pod "ec90b25a-dd6c-43cc-bf66-4f32c7dd2c92" (UID: "ec90b25a-dd6c-43cc-bf66-4f32c7dd2c92"). InnerVolumeSpecName "ready". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 03:57:31.334569 master-0 kubenswrapper[18592]: I0308 03:57:31.334506 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec90b25a-dd6c-43cc-bf66-4f32c7dd2c92-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "ec90b25a-dd6c-43cc-bf66-4f32c7dd2c92" (UID: "ec90b25a-dd6c-43cc-bf66-4f32c7dd2c92"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:57:31.334817 master-0 kubenswrapper[18592]: I0308 03:57:31.334699 18592 reconciler_common.go:293] "Volume detached for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/ec90b25a-dd6c-43cc-bf66-4f32c7dd2c92-ready\") on node \"master-0\" DevicePath \"\"" Mar 08 03:57:31.334817 master-0 kubenswrapper[18592]: I0308 03:57:31.334763 18592 reconciler_common.go:293] "Volume detached for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ec90b25a-dd6c-43cc-bf66-4f32c7dd2c92-tuning-conf-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 03:57:31.334817 master-0 kubenswrapper[18592]: I0308 03:57:31.334798 18592 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ec90b25a-dd6c-43cc-bf66-4f32c7dd2c92-cni-sysctl-allowlist\") on node \"master-0\" DevicePath \"\"" Mar 08 03:57:31.339372 master-0 kubenswrapper[18592]: I0308 03:57:31.339314 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec90b25a-dd6c-43cc-bf66-4f32c7dd2c92-kube-api-access-jpbws" (OuterVolumeSpecName: "kube-api-access-jpbws") pod "ec90b25a-dd6c-43cc-bf66-4f32c7dd2c92" (UID: "ec90b25a-dd6c-43cc-bf66-4f32c7dd2c92"). InnerVolumeSpecName "kube-api-access-jpbws". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:57:31.435598 master-0 kubenswrapper[18592]: I0308 03:57:31.435451 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jpbws\" (UniqueName: \"kubernetes.io/projected/ec90b25a-dd6c-43cc-bf66-4f32c7dd2c92-kube-api-access-jpbws\") on node \"master-0\" DevicePath \"\"" Mar 08 03:57:31.467783 master-0 kubenswrapper[18592]: I0308 03:57:31.467695 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-59shv_ec90b25a-dd6c-43cc-bf66-4f32c7dd2c92/kube-multus-additional-cni-plugins/0.log" Mar 08 03:57:31.467783 master-0 kubenswrapper[18592]: I0308 03:57:31.467770 18592 generic.go:334] "Generic (PLEG): container finished" podID="ec90b25a-dd6c-43cc-bf66-4f32c7dd2c92" containerID="c25b8ce9fda56070aad9cad01b6e5b307e4cd5a99282419041c17435cb609eb5" exitCode=137 Mar 08 03:57:31.468164 master-0 kubenswrapper[18592]: I0308 03:57:31.467810 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-59shv" event={"ID":"ec90b25a-dd6c-43cc-bf66-4f32c7dd2c92","Type":"ContainerDied","Data":"c25b8ce9fda56070aad9cad01b6e5b307e4cd5a99282419041c17435cb609eb5"} Mar 08 03:57:31.468164 master-0 kubenswrapper[18592]: I0308 03:57:31.467860 18592 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-59shv" Mar 08 03:57:31.468164 master-0 kubenswrapper[18592]: I0308 03:57:31.467894 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-59shv" event={"ID":"ec90b25a-dd6c-43cc-bf66-4f32c7dd2c92","Type":"ContainerDied","Data":"41c532754087c54c36fae5eaa692f11c0f5b8f1ec20e03e63ca2b9d9a8e42274"} Mar 08 03:57:31.468164 master-0 kubenswrapper[18592]: I0308 03:57:31.467938 18592 scope.go:117] "RemoveContainer" containerID="c25b8ce9fda56070aad9cad01b6e5b307e4cd5a99282419041c17435cb609eb5" Mar 08 03:57:31.506754 master-0 kubenswrapper[18592]: I0308 03:57:31.506658 18592 scope.go:117] "RemoveContainer" containerID="c25b8ce9fda56070aad9cad01b6e5b307e4cd5a99282419041c17435cb609eb5" Mar 08 03:57:31.511132 master-0 kubenswrapper[18592]: E0308 03:57:31.510794 18592 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c25b8ce9fda56070aad9cad01b6e5b307e4cd5a99282419041c17435cb609eb5\": container with ID starting with c25b8ce9fda56070aad9cad01b6e5b307e4cd5a99282419041c17435cb609eb5 not found: ID does not exist" containerID="c25b8ce9fda56070aad9cad01b6e5b307e4cd5a99282419041c17435cb609eb5" Mar 08 03:57:31.511132 master-0 kubenswrapper[18592]: I0308 03:57:31.510918 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c25b8ce9fda56070aad9cad01b6e5b307e4cd5a99282419041c17435cb609eb5"} err="failed to get container status \"c25b8ce9fda56070aad9cad01b6e5b307e4cd5a99282419041c17435cb609eb5\": rpc error: code = NotFound desc = could not find container \"c25b8ce9fda56070aad9cad01b6e5b307e4cd5a99282419041c17435cb609eb5\": container with ID starting with c25b8ce9fda56070aad9cad01b6e5b307e4cd5a99282419041c17435cb609eb5 not found: ID does not exist" Mar 08 03:57:31.524690 master-0 kubenswrapper[18592]: I0308 03:57:31.524616 18592 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-59shv"] Mar 08 03:57:31.533404 master-0 kubenswrapper[18592]: I0308 03:57:31.533349 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-59shv"] Mar 08 03:57:32.158561 master-0 kubenswrapper[18592]: I0308 03:57:32.158468 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec90b25a-dd6c-43cc-bf66-4f32c7dd2c92" path="/var/lib/kubelet/pods/ec90b25a-dd6c-43cc-bf66-4f32c7dd2c92/volumes" Mar 08 03:57:33.162854 master-0 kubenswrapper[18592]: I0308 03:57:33.162768 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/48ab3c8e-a2bd-4380-9e8d-a41d515a989d-trusted-ca\") pod \"console-operator-6c7fb6b958-mr9k6\" (UID: \"48ab3c8e-a2bd-4380-9e8d-a41d515a989d\") " pod="openshift-console-operator/console-operator-6c7fb6b958-mr9k6" Mar 08 03:57:33.164140 master-0 kubenswrapper[18592]: I0308 03:57:33.164099 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/48ab3c8e-a2bd-4380-9e8d-a41d515a989d-trusted-ca\") pod \"console-operator-6c7fb6b958-mr9k6\" (UID: \"48ab3c8e-a2bd-4380-9e8d-a41d515a989d\") " pod="openshift-console-operator/console-operator-6c7fb6b958-mr9k6" Mar 08 03:57:33.348381 master-0 kubenswrapper[18592]: I0308 03:57:33.348305 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-6c7fb6b958-mr9k6" Mar 08 03:57:33.830760 master-0 kubenswrapper[18592]: I0308 03:57:33.830685 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-6c7fb6b958-mr9k6"] Mar 08 03:57:34.318117 master-0 kubenswrapper[18592]: I0308 03:57:34.318023 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-4-master-0"] Mar 08 03:57:34.319119 master-0 kubenswrapper[18592]: E0308 03:57:34.318588 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec90b25a-dd6c-43cc-bf66-4f32c7dd2c92" containerName="kube-multus-additional-cni-plugins" Mar 08 03:57:34.319119 master-0 kubenswrapper[18592]: I0308 03:57:34.318682 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec90b25a-dd6c-43cc-bf66-4f32c7dd2c92" containerName="kube-multus-additional-cni-plugins" Mar 08 03:57:34.319119 master-0 kubenswrapper[18592]: I0308 03:57:34.319033 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec90b25a-dd6c-43cc-bf66-4f32c7dd2c92" containerName="kube-multus-additional-cni-plugins" Mar 08 03:57:34.320184 master-0 kubenswrapper[18592]: I0308 03:57:34.320127 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-4-master-0" Mar 08 03:57:34.364011 master-0 kubenswrapper[18592]: I0308 03:57:34.326901 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-ccz75" Mar 08 03:57:34.364011 master-0 kubenswrapper[18592]: I0308 03:57:34.327422 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 08 03:57:34.364011 master-0 kubenswrapper[18592]: I0308 03:57:34.333907 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-4-master-0"] Mar 08 03:57:34.391558 master-0 kubenswrapper[18592]: I0308 03:57:34.391492 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cde83292-7f24-4a84-aa85-e917c0f33a02-var-lock\") pod \"installer-4-master-0\" (UID: \"cde83292-7f24-4a84-aa85-e917c0f33a02\") " pod="openshift-kube-apiserver/installer-4-master-0" Mar 08 03:57:34.391802 master-0 kubenswrapper[18592]: I0308 03:57:34.391752 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cde83292-7f24-4a84-aa85-e917c0f33a02-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"cde83292-7f24-4a84-aa85-e917c0f33a02\") " pod="openshift-kube-apiserver/installer-4-master-0" Mar 08 03:57:34.392015 master-0 kubenswrapper[18592]: I0308 03:57:34.391977 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cde83292-7f24-4a84-aa85-e917c0f33a02-kube-api-access\") pod \"installer-4-master-0\" (UID: \"cde83292-7f24-4a84-aa85-e917c0f33a02\") " pod="openshift-kube-apiserver/installer-4-master-0" Mar 08 03:57:34.493639 master-0 kubenswrapper[18592]: I0308 03:57:34.493542 18592 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cde83292-7f24-4a84-aa85-e917c0f33a02-var-lock\") pod \"installer-4-master-0\" (UID: \"cde83292-7f24-4a84-aa85-e917c0f33a02\") " pod="openshift-kube-apiserver/installer-4-master-0" Mar 08 03:57:34.494126 master-0 kubenswrapper[18592]: I0308 03:57:34.493811 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cde83292-7f24-4a84-aa85-e917c0f33a02-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"cde83292-7f24-4a84-aa85-e917c0f33a02\") " pod="openshift-kube-apiserver/installer-4-master-0" Mar 08 03:57:34.494126 master-0 kubenswrapper[18592]: I0308 03:57:34.493907 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cde83292-7f24-4a84-aa85-e917c0f33a02-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"cde83292-7f24-4a84-aa85-e917c0f33a02\") " pod="openshift-kube-apiserver/installer-4-master-0" Mar 08 03:57:34.494126 master-0 kubenswrapper[18592]: I0308 03:57:34.493947 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cde83292-7f24-4a84-aa85-e917c0f33a02-kube-api-access\") pod \"installer-4-master-0\" (UID: \"cde83292-7f24-4a84-aa85-e917c0f33a02\") " pod="openshift-kube-apiserver/installer-4-master-0" Mar 08 03:57:34.494126 master-0 kubenswrapper[18592]: I0308 03:57:34.493809 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cde83292-7f24-4a84-aa85-e917c0f33a02-var-lock\") pod \"installer-4-master-0\" (UID: \"cde83292-7f24-4a84-aa85-e917c0f33a02\") " pod="openshift-kube-apiserver/installer-4-master-0" Mar 08 03:57:34.502412 master-0 kubenswrapper[18592]: I0308 03:57:34.502346 18592 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-console-operator/console-operator-6c7fb6b958-mr9k6" event={"ID":"48ab3c8e-a2bd-4380-9e8d-a41d515a989d","Type":"ContainerStarted","Data":"a011d8a07609960ae6bd04d3a02dc127f73314b57939c0edec941ddc27e30c27"} Mar 08 03:57:34.515868 master-0 kubenswrapper[18592]: I0308 03:57:34.515789 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cde83292-7f24-4a84-aa85-e917c0f33a02-kube-api-access\") pod \"installer-4-master-0\" (UID: \"cde83292-7f24-4a84-aa85-e917c0f33a02\") " pod="openshift-kube-apiserver/installer-4-master-0" Mar 08 03:57:34.679560 master-0 kubenswrapper[18592]: I0308 03:57:34.679372 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-4-master-0" Mar 08 03:57:34.954503 master-0 kubenswrapper[18592]: I0308 03:57:34.954436 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/installer-2-master-0"] Mar 08 03:57:34.955723 master-0 kubenswrapper[18592]: I0308 03:57:34.955695 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-2-master-0" Mar 08 03:57:34.959195 master-0 kubenswrapper[18592]: I0308 03:57:34.959136 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd"/"installer-sa-dockercfg-gkczp" Mar 08 03:57:34.962231 master-0 kubenswrapper[18592]: I0308 03:57:34.962193 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-2-master-0"] Mar 08 03:57:34.968162 master-0 kubenswrapper[18592]: I0308 03:57:34.968105 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd"/"kube-root-ca.crt" Mar 08 03:57:35.002233 master-0 kubenswrapper[18592]: I0308 03:57:35.002182 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d84e0373-988e-47db-be73-5690d18beba3-kube-api-access\") pod \"installer-2-master-0\" (UID: \"d84e0373-988e-47db-be73-5690d18beba3\") " pod="openshift-etcd/installer-2-master-0" Mar 08 03:57:35.002233 master-0 kubenswrapper[18592]: I0308 03:57:35.002228 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d84e0373-988e-47db-be73-5690d18beba3-var-lock\") pod \"installer-2-master-0\" (UID: \"d84e0373-988e-47db-be73-5690d18beba3\") " pod="openshift-etcd/installer-2-master-0" Mar 08 03:57:35.002233 master-0 kubenswrapper[18592]: I0308 03:57:35.002246 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d84e0373-988e-47db-be73-5690d18beba3-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"d84e0373-988e-47db-be73-5690d18beba3\") " pod="openshift-etcd/installer-2-master-0" Mar 08 03:57:35.104299 master-0 kubenswrapper[18592]: I0308 03:57:35.104203 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d84e0373-988e-47db-be73-5690d18beba3-kube-api-access\") pod \"installer-2-master-0\" (UID: \"d84e0373-988e-47db-be73-5690d18beba3\") " pod="openshift-etcd/installer-2-master-0" Mar 08 03:57:35.104796 master-0 kubenswrapper[18592]: I0308 03:57:35.104665 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d84e0373-988e-47db-be73-5690d18beba3-var-lock\") pod \"installer-2-master-0\" (UID: \"d84e0373-988e-47db-be73-5690d18beba3\") " pod="openshift-etcd/installer-2-master-0" Mar 08 03:57:35.104796 master-0 kubenswrapper[18592]: I0308 03:57:35.104725 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d84e0373-988e-47db-be73-5690d18beba3-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"d84e0373-988e-47db-be73-5690d18beba3\") " pod="openshift-etcd/installer-2-master-0" Mar 08 03:57:35.105142 master-0 kubenswrapper[18592]: I0308 03:57:35.105106 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d84e0373-988e-47db-be73-5690d18beba3-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"d84e0373-988e-47db-be73-5690d18beba3\") " pod="openshift-etcd/installer-2-master-0" Mar 08 03:57:35.105257 master-0 kubenswrapper[18592]: I0308 03:57:35.105204 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d84e0373-988e-47db-be73-5690d18beba3-var-lock\") pod \"installer-2-master-0\" (UID: \"d84e0373-988e-47db-be73-5690d18beba3\") " pod="openshift-etcd/installer-2-master-0" Mar 08 03:57:35.132031 master-0 kubenswrapper[18592]: I0308 03:57:35.131992 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/d84e0373-988e-47db-be73-5690d18beba3-kube-api-access\") pod \"installer-2-master-0\" (UID: \"d84e0373-988e-47db-be73-5690d18beba3\") " pod="openshift-etcd/installer-2-master-0" Mar 08 03:57:35.132910 master-0 kubenswrapper[18592]: I0308 03:57:35.132855 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-4-master-0"] Mar 08 03:57:35.136251 master-0 kubenswrapper[18592]: W0308 03:57:35.136134 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podcde83292_7f24_4a84_aa85_e917c0f33a02.slice/crio-15abe6185497f8aa588730ebfa1b7ed190ce63e7b82ce2dced2372dab045ecc2 WatchSource:0}: Error finding container 15abe6185497f8aa588730ebfa1b7ed190ce63e7b82ce2dced2372dab045ecc2: Status 404 returned error can't find the container with id 15abe6185497f8aa588730ebfa1b7ed190ce63e7b82ce2dced2372dab045ecc2 Mar 08 03:57:35.283404 master-0 kubenswrapper[18592]: I0308 03:57:35.283359 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-2-master-0" Mar 08 03:57:35.512759 master-0 kubenswrapper[18592]: I0308 03:57:35.512345 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-4-master-0" event={"ID":"cde83292-7f24-4a84-aa85-e917c0f33a02","Type":"ContainerStarted","Data":"cd4bd59e144a886da1266aa75a6e84b10456106610dd8827a7a1300c5f80d968"} Mar 08 03:57:35.512759 master-0 kubenswrapper[18592]: I0308 03:57:35.512410 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-4-master-0" event={"ID":"cde83292-7f24-4a84-aa85-e917c0f33a02","Type":"ContainerStarted","Data":"15abe6185497f8aa588730ebfa1b7ed190ce63e7b82ce2dced2372dab045ecc2"} Mar 08 03:57:35.549207 master-0 kubenswrapper[18592]: I0308 03:57:35.548962 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-4-master-0" podStartSLOduration=1.5489331339999999 podStartE2EDuration="1.548933134s" podCreationTimestamp="2026-03-08 03:57:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:57:35.547435997 +0000 UTC m=+267.646190337" watchObservedRunningTime="2026-03-08 03:57:35.548933134 +0000 UTC m=+267.647687514" Mar 08 03:57:35.827551 master-0 kubenswrapper[18592]: I0308 03:57:35.827513 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-2-master-0"] Mar 08 03:57:36.348016 master-0 kubenswrapper[18592]: W0308 03:57:36.347975 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podd84e0373_988e_47db_be73_5690d18beba3.slice/crio-ebd9f6f4acf9d718bfd8eaa682a4bf6f0d977e7f880a973028282e846abc73ff WatchSource:0}: Error finding container ebd9f6f4acf9d718bfd8eaa682a4bf6f0d977e7f880a973028282e846abc73ff: Status 404 returned error can't find the container with id 
ebd9f6f4acf9d718bfd8eaa682a4bf6f0d977e7f880a973028282e846abc73ff Mar 08 03:57:36.523721 master-0 kubenswrapper[18592]: I0308 03:57:36.523657 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-master-0" event={"ID":"d84e0373-988e-47db-be73-5690d18beba3","Type":"ContainerStarted","Data":"ebd9f6f4acf9d718bfd8eaa682a4bf6f0d977e7f880a973028282e846abc73ff"} Mar 08 03:57:37.246482 master-0 kubenswrapper[18592]: I0308 03:57:37.246417 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-84f57b9877-n666k"] Mar 08 03:57:37.247851 master-0 kubenswrapper[18592]: I0308 03:57:37.247805 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-84f57b9877-n666k" Mar 08 03:57:37.250403 master-0 kubenswrapper[18592]: I0308 03:57:37.250356 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-n6zd8" Mar 08 03:57:37.250503 master-0 kubenswrapper[18592]: I0308 03:57:37.250434 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 08 03:57:37.250579 master-0 kubenswrapper[18592]: I0308 03:57:37.250532 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 08 03:57:37.267858 master-0 kubenswrapper[18592]: I0308 03:57:37.267795 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-84f57b9877-n666k"] Mar 08 03:57:37.341587 master-0 kubenswrapper[18592]: I0308 03:57:37.341434 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfwvv\" (UniqueName: \"kubernetes.io/projected/f8fc59f5-7a53-4075-8005-8fdb2b45ccb5-kube-api-access-lfwvv\") pod \"downloads-84f57b9877-n666k\" (UID: \"f8fc59f5-7a53-4075-8005-8fdb2b45ccb5\") " pod="openshift-console/downloads-84f57b9877-n666k" Mar 08 03:57:37.443142 
master-0 kubenswrapper[18592]: I0308 03:57:37.443058 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfwvv\" (UniqueName: \"kubernetes.io/projected/f8fc59f5-7a53-4075-8005-8fdb2b45ccb5-kube-api-access-lfwvv\") pod \"downloads-84f57b9877-n666k\" (UID: \"f8fc59f5-7a53-4075-8005-8fdb2b45ccb5\") " pod="openshift-console/downloads-84f57b9877-n666k" Mar 08 03:57:37.476533 master-0 kubenswrapper[18592]: I0308 03:57:37.476463 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfwvv\" (UniqueName: \"kubernetes.io/projected/f8fc59f5-7a53-4075-8005-8fdb2b45ccb5-kube-api-access-lfwvv\") pod \"downloads-84f57b9877-n666k\" (UID: \"f8fc59f5-7a53-4075-8005-8fdb2b45ccb5\") " pod="openshift-console/downloads-84f57b9877-n666k" Mar 08 03:57:37.537524 master-0 kubenswrapper[18592]: I0308 03:57:37.537311 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-master-0" event={"ID":"d84e0373-988e-47db-be73-5690d18beba3","Type":"ContainerStarted","Data":"868a23a98a2c6aca96a779a449124f3a6f973b2f70f8e065c579b59214082095"} Mar 08 03:57:37.543391 master-0 kubenswrapper[18592]: I0308 03:57:37.543349 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-6c7fb6b958-mr9k6" event={"ID":"48ab3c8e-a2bd-4380-9e8d-a41d515a989d","Type":"ContainerStarted","Data":"e660c7adff8defba69d6fca83f25e18d22a1b67aae31982eaa0eb024e6dc5410"} Mar 08 03:57:37.544272 master-0 kubenswrapper[18592]: I0308 03:57:37.543949 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-6c7fb6b958-mr9k6" Mar 08 03:57:37.560131 master-0 kubenswrapper[18592]: I0308 03:57:37.560074 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-6c7fb6b958-mr9k6" Mar 08 03:57:37.571573 master-0 kubenswrapper[18592]: I0308 03:57:37.571526 
18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-84f57b9877-n666k"
Mar 08 03:57:37.579077 master-0 kubenswrapper[18592]: I0308 03:57:37.578971 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/installer-2-master-0" podStartSLOduration=3.578939386 podStartE2EDuration="3.578939386s" podCreationTimestamp="2026-03-08 03:57:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:57:37.564602024 +0000 UTC m=+269.663356414" watchObservedRunningTime="2026-03-08 03:57:37.578939386 +0000 UTC m=+269.677693766"
Mar 08 03:57:37.593038 master-0 kubenswrapper[18592]: I0308 03:57:37.592946 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-6c7fb6b958-mr9k6" podStartSLOduration=252.025701404 podStartE2EDuration="4m14.59292423s" podCreationTimestamp="2026-03-08 03:53:23 +0000 UTC" firstStartedPulling="2026-03-08 03:57:33.827445604 +0000 UTC m=+265.926199954" lastFinishedPulling="2026-03-08 03:57:36.39466839 +0000 UTC m=+268.493422780" observedRunningTime="2026-03-08 03:57:37.592359596 +0000 UTC m=+269.691113976" watchObservedRunningTime="2026-03-08 03:57:37.59292423 +0000 UTC m=+269.691678610"
Mar 08 03:57:38.119859 master-0 kubenswrapper[18592]: I0308 03:57:38.119781 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-84f57b9877-n666k"]
Mar 08 03:57:38.128954 master-0 kubenswrapper[18592]: W0308 03:57:38.125183 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8fc59f5_7a53_4075_8005_8fdb2b45ccb5.slice/crio-3af6cac74a39e203a896a24dc0a262c402a79cbea4c60319d5e01a2c4d392687 WatchSource:0}: Error finding container 3af6cac74a39e203a896a24dc0a262c402a79cbea4c60319d5e01a2c4d392687: Status 404 returned error can't find the container with id 3af6cac74a39e203a896a24dc0a262c402a79cbea4c60319d5e01a2c4d392687
Mar 08 03:57:38.553866 master-0 kubenswrapper[18592]: I0308 03:57:38.553762 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-84f57b9877-n666k" event={"ID":"f8fc59f5-7a53-4075-8005-8fdb2b45ccb5","Type":"ContainerStarted","Data":"3af6cac74a39e203a896a24dc0a262c402a79cbea4c60319d5e01a2c4d392687"}
Mar 08 03:57:39.564655 master-0 kubenswrapper[18592]: I0308 03:57:39.564163 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-8d675b596-j8pv6_1eb851be-f157-48ea-9a39-1361b68d2639/multus-admission-controller/0.log"
Mar 08 03:57:39.564655 master-0 kubenswrapper[18592]: I0308 03:57:39.564285 18592 generic.go:334] "Generic (PLEG): container finished" podID="1eb851be-f157-48ea-9a39-1361b68d2639" containerID="5b490ed7d49134874203db2b969a8e813e09c1e907556c2039144c5be0ec90cb" exitCode=137
Mar 08 03:57:39.565149 master-0 kubenswrapper[18592]: I0308 03:57:39.564776 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-8d675b596-j8pv6" event={"ID":"1eb851be-f157-48ea-9a39-1361b68d2639","Type":"ContainerDied","Data":"5b490ed7d49134874203db2b969a8e813e09c1e907556c2039144c5be0ec90cb"}
Mar 08 03:57:39.849557 master-0 kubenswrapper[18592]: I0308 03:57:39.849526 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-8d675b596-j8pv6_1eb851be-f157-48ea-9a39-1361b68d2639/multus-admission-controller/0.log"
Mar 08 03:57:39.849808 master-0 kubenswrapper[18592]: I0308 03:57:39.849796 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-8d675b596-j8pv6"
Mar 08 03:57:39.854218 master-0 kubenswrapper[18592]: I0308 03:57:39.854158 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-d9b6b47f9-85srg"]
Mar 08 03:57:39.881679 master-0 kubenswrapper[18592]: I0308 03:57:39.881640 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1eb851be-f157-48ea-9a39-1361b68d2639-webhook-certs\") pod \"1eb851be-f157-48ea-9a39-1361b68d2639\" (UID: \"1eb851be-f157-48ea-9a39-1361b68d2639\") "
Mar 08 03:57:39.882145 master-0 kubenswrapper[18592]: I0308 03:57:39.882129 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqhzl\" (UniqueName: \"kubernetes.io/projected/1eb851be-f157-48ea-9a39-1361b68d2639-kube-api-access-nqhzl\") pod \"1eb851be-f157-48ea-9a39-1361b68d2639\" (UID: \"1eb851be-f157-48ea-9a39-1361b68d2639\") "
Mar 08 03:57:39.892847 master-0 kubenswrapper[18592]: I0308 03:57:39.891158 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1eb851be-f157-48ea-9a39-1361b68d2639-kube-api-access-nqhzl" (OuterVolumeSpecName: "kube-api-access-nqhzl") pod "1eb851be-f157-48ea-9a39-1361b68d2639" (UID: "1eb851be-f157-48ea-9a39-1361b68d2639"). InnerVolumeSpecName "kube-api-access-nqhzl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 03:57:39.899125 master-0 kubenswrapper[18592]: I0308 03:57:39.899040 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1eb851be-f157-48ea-9a39-1361b68d2639-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "1eb851be-f157-48ea-9a39-1361b68d2639" (UID: "1eb851be-f157-48ea-9a39-1361b68d2639"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 03:57:39.983324 master-0 kubenswrapper[18592]: I0308 03:57:39.983280 18592 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1eb851be-f157-48ea-9a39-1361b68d2639-webhook-certs\") on node \"master-0\" DevicePath \"\""
Mar 08 03:57:39.983324 master-0 kubenswrapper[18592]: I0308 03:57:39.983315 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqhzl\" (UniqueName: \"kubernetes.io/projected/1eb851be-f157-48ea-9a39-1361b68d2639-kube-api-access-nqhzl\") on node \"master-0\" DevicePath \"\""
Mar 08 03:57:40.351977 master-0 kubenswrapper[18592]: I0308 03:57:40.351923 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-744db48f96-lgsd4"]
Mar 08 03:57:40.353431 master-0 kubenswrapper[18592]: E0308 03:57:40.353381 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1eb851be-f157-48ea-9a39-1361b68d2639" containerName="multus-admission-controller"
Mar 08 03:57:40.353431 master-0 kubenswrapper[18592]: I0308 03:57:40.353427 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="1eb851be-f157-48ea-9a39-1361b68d2639" containerName="multus-admission-controller"
Mar 08 03:57:40.353538 master-0 kubenswrapper[18592]: E0308 03:57:40.353442 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1eb851be-f157-48ea-9a39-1361b68d2639" containerName="kube-rbac-proxy"
Mar 08 03:57:40.353538 master-0 kubenswrapper[18592]: I0308 03:57:40.353449 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="1eb851be-f157-48ea-9a39-1361b68d2639" containerName="kube-rbac-proxy"
Mar 08 03:57:40.353629 master-0 kubenswrapper[18592]: I0308 03:57:40.353605 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="1eb851be-f157-48ea-9a39-1361b68d2639" containerName="kube-rbac-proxy"
Mar 08 03:57:40.353739 master-0 kubenswrapper[18592]: I0308 03:57:40.353629 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="1eb851be-f157-48ea-9a39-1361b68d2639" containerName="multus-admission-controller"
Mar 08 03:57:40.354125 master-0 kubenswrapper[18592]: I0308 03:57:40.354098 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-744db48f96-lgsd4"
Mar 08 03:57:40.357021 master-0 kubenswrapper[18592]: I0308 03:57:40.356665 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 08 03:57:40.357021 master-0 kubenswrapper[18592]: I0308 03:57:40.356889 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-4nfrl"
Mar 08 03:57:40.357021 master-0 kubenswrapper[18592]: I0308 03:57:40.357004 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Mar 08 03:57:40.357273 master-0 kubenswrapper[18592]: I0308 03:57:40.357049 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Mar 08 03:57:40.359837 master-0 kubenswrapper[18592]: I0308 03:57:40.359533 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Mar 08 03:57:40.362430 master-0 kubenswrapper[18592]: I0308 03:57:40.361997 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Mar 08 03:57:40.373785 master-0 kubenswrapper[18592]: I0308 03:57:40.372327 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-744db48f96-lgsd4"]
Mar 08 03:57:40.391501 master-0 kubenswrapper[18592]: I0308 03:57:40.391100 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/935ab7fb-b097-41c3-8926-8343eb29e7fc-console-serving-cert\") pod \"console-744db48f96-lgsd4\" (UID: \"935ab7fb-b097-41c3-8926-8343eb29e7fc\") " pod="openshift-console/console-744db48f96-lgsd4"
Mar 08 03:57:40.391501 master-0 kubenswrapper[18592]: I0308 03:57:40.391228 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/935ab7fb-b097-41c3-8926-8343eb29e7fc-console-oauth-config\") pod \"console-744db48f96-lgsd4\" (UID: \"935ab7fb-b097-41c3-8926-8343eb29e7fc\") " pod="openshift-console/console-744db48f96-lgsd4"
Mar 08 03:57:40.391501 master-0 kubenswrapper[18592]: I0308 03:57:40.391350 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/935ab7fb-b097-41c3-8926-8343eb29e7fc-oauth-serving-cert\") pod \"console-744db48f96-lgsd4\" (UID: \"935ab7fb-b097-41c3-8926-8343eb29e7fc\") " pod="openshift-console/console-744db48f96-lgsd4"
Mar 08 03:57:40.391816 master-0 kubenswrapper[18592]: I0308 03:57:40.391392 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mc4rd\" (UniqueName: \"kubernetes.io/projected/935ab7fb-b097-41c3-8926-8343eb29e7fc-kube-api-access-mc4rd\") pod \"console-744db48f96-lgsd4\" (UID: \"935ab7fb-b097-41c3-8926-8343eb29e7fc\") " pod="openshift-console/console-744db48f96-lgsd4"
Mar 08 03:57:40.391816 master-0 kubenswrapper[18592]: I0308 03:57:40.391559 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/935ab7fb-b097-41c3-8926-8343eb29e7fc-service-ca\") pod \"console-744db48f96-lgsd4\" (UID: \"935ab7fb-b097-41c3-8926-8343eb29e7fc\") " pod="openshift-console/console-744db48f96-lgsd4"
Mar 08 03:57:40.391816 master-0 kubenswrapper[18592]: I0308 03:57:40.391747 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/935ab7fb-b097-41c3-8926-8343eb29e7fc-console-config\") pod \"console-744db48f96-lgsd4\" (UID: \"935ab7fb-b097-41c3-8926-8343eb29e7fc\") " pod="openshift-console/console-744db48f96-lgsd4"
Mar 08 03:57:40.494213 master-0 kubenswrapper[18592]: I0308 03:57:40.494158 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/935ab7fb-b097-41c3-8926-8343eb29e7fc-console-serving-cert\") pod \"console-744db48f96-lgsd4\" (UID: \"935ab7fb-b097-41c3-8926-8343eb29e7fc\") " pod="openshift-console/console-744db48f96-lgsd4"
Mar 08 03:57:40.494213 master-0 kubenswrapper[18592]: I0308 03:57:40.494215 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/935ab7fb-b097-41c3-8926-8343eb29e7fc-console-oauth-config\") pod \"console-744db48f96-lgsd4\" (UID: \"935ab7fb-b097-41c3-8926-8343eb29e7fc\") " pod="openshift-console/console-744db48f96-lgsd4"
Mar 08 03:57:40.494437 master-0 kubenswrapper[18592]: I0308 03:57:40.494237 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/935ab7fb-b097-41c3-8926-8343eb29e7fc-oauth-serving-cert\") pod \"console-744db48f96-lgsd4\" (UID: \"935ab7fb-b097-41c3-8926-8343eb29e7fc\") " pod="openshift-console/console-744db48f96-lgsd4"
Mar 08 03:57:40.494437 master-0 kubenswrapper[18592]: I0308 03:57:40.494402 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mc4rd\" (UniqueName: \"kubernetes.io/projected/935ab7fb-b097-41c3-8926-8343eb29e7fc-kube-api-access-mc4rd\") pod \"console-744db48f96-lgsd4\" (UID: \"935ab7fb-b097-41c3-8926-8343eb29e7fc\") " pod="openshift-console/console-744db48f96-lgsd4"
Mar 08 03:57:40.495621 master-0 kubenswrapper[18592]: I0308 03:57:40.495580 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/935ab7fb-b097-41c3-8926-8343eb29e7fc-oauth-serving-cert\") pod \"console-744db48f96-lgsd4\" (UID: \"935ab7fb-b097-41c3-8926-8343eb29e7fc\") " pod="openshift-console/console-744db48f96-lgsd4"
Mar 08 03:57:40.495766 master-0 kubenswrapper[18592]: I0308 03:57:40.495740 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/935ab7fb-b097-41c3-8926-8343eb29e7fc-service-ca\") pod \"console-744db48f96-lgsd4\" (UID: \"935ab7fb-b097-41c3-8926-8343eb29e7fc\") " pod="openshift-console/console-744db48f96-lgsd4"
Mar 08 03:57:40.495841 master-0 kubenswrapper[18592]: I0308 03:57:40.495806 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/935ab7fb-b097-41c3-8926-8343eb29e7fc-console-config\") pod \"console-744db48f96-lgsd4\" (UID: \"935ab7fb-b097-41c3-8926-8343eb29e7fc\") " pod="openshift-console/console-744db48f96-lgsd4"
Mar 08 03:57:40.496732 master-0 kubenswrapper[18592]: I0308 03:57:40.496704 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/935ab7fb-b097-41c3-8926-8343eb29e7fc-console-config\") pod \"console-744db48f96-lgsd4\" (UID: \"935ab7fb-b097-41c3-8926-8343eb29e7fc\") " pod="openshift-console/console-744db48f96-lgsd4"
Mar 08 03:57:40.497943 master-0 kubenswrapper[18592]: I0308 03:57:40.497890 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/935ab7fb-b097-41c3-8926-8343eb29e7fc-console-serving-cert\") pod \"console-744db48f96-lgsd4\" (UID: \"935ab7fb-b097-41c3-8926-8343eb29e7fc\") " pod="openshift-console/console-744db48f96-lgsd4"
Mar 08 03:57:40.498005 master-0 kubenswrapper[18592]: I0308 03:57:40.497944 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/935ab7fb-b097-41c3-8926-8343eb29e7fc-service-ca\") pod \"console-744db48f96-lgsd4\" (UID: \"935ab7fb-b097-41c3-8926-8343eb29e7fc\") " pod="openshift-console/console-744db48f96-lgsd4"
Mar 08 03:57:40.500337 master-0 kubenswrapper[18592]: I0308 03:57:40.500317 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/935ab7fb-b097-41c3-8926-8343eb29e7fc-console-oauth-config\") pod \"console-744db48f96-lgsd4\" (UID: \"935ab7fb-b097-41c3-8926-8343eb29e7fc\") " pod="openshift-console/console-744db48f96-lgsd4"
Mar 08 03:57:40.511584 master-0 kubenswrapper[18592]: I0308 03:57:40.511564 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mc4rd\" (UniqueName: \"kubernetes.io/projected/935ab7fb-b097-41c3-8926-8343eb29e7fc-kube-api-access-mc4rd\") pod \"console-744db48f96-lgsd4\" (UID: \"935ab7fb-b097-41c3-8926-8343eb29e7fc\") " pod="openshift-console/console-744db48f96-lgsd4"
Mar 08 03:57:40.573952 master-0 kubenswrapper[18592]: I0308 03:57:40.573761 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-8d675b596-j8pv6_1eb851be-f157-48ea-9a39-1361b68d2639/multus-admission-controller/0.log"
Mar 08 03:57:40.573952 master-0 kubenswrapper[18592]: I0308 03:57:40.573817 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-8d675b596-j8pv6" event={"ID":"1eb851be-f157-48ea-9a39-1361b68d2639","Type":"ContainerDied","Data":"48d46b7645a64ea18f3fa334445c914bbcaaadce3a50f149dedad680b9f63699"}
Mar 08 03:57:40.573952 master-0 kubenswrapper[18592]: I0308 03:57:40.573873 18592 scope.go:117] "RemoveContainer" containerID="b6a9bf08942a2e11233d13bcdf99f3b818825bc97027c58eaa8bb54d09fc4200"
Mar 08 03:57:40.574430 master-0 kubenswrapper[18592]: I0308 03:57:40.573950 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-8d675b596-j8pv6"
Mar 08 03:57:40.617561 master-0 kubenswrapper[18592]: I0308 03:57:40.617516 18592 scope.go:117] "RemoveContainer" containerID="5b490ed7d49134874203db2b969a8e813e09c1e907556c2039144c5be0ec90cb"
Mar 08 03:57:40.629083 master-0 kubenswrapper[18592]: I0308 03:57:40.629031 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/multus-admission-controller-8d675b596-j8pv6"]
Mar 08 03:57:40.633929 master-0 kubenswrapper[18592]: I0308 03:57:40.633890 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/multus-admission-controller-8d675b596-j8pv6"]
Mar 08 03:57:40.674512 master-0 kubenswrapper[18592]: I0308 03:57:40.674368 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-744db48f96-lgsd4"
Mar 08 03:57:41.132194 master-0 kubenswrapper[18592]: I0308 03:57:41.132133 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-744db48f96-lgsd4"]
Mar 08 03:57:41.433654 master-0 kubenswrapper[18592]: E0308 03:57:41.433494 18592 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[alertmanager-trusted-ca-bundle], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-monitoring/alertmanager-main-0" podUID="cacb9582-2132-4543-8a31-7b100ba4dd2f"
Mar 08 03:57:41.584123 master-0 kubenswrapper[18592]: I0308 03:57:41.584033 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-744db48f96-lgsd4" event={"ID":"935ab7fb-b097-41c3-8926-8343eb29e7fc","Type":"ContainerStarted","Data":"26efdafcb9fa82a0a97f09fa91567fd66e01ffff3d441f8948c444e1410e6016"}
Mar 08 03:57:41.585540 master-0 kubenswrapper[18592]: I0308 03:57:41.585497 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Mar 08 03:57:42.151529 master-0 kubenswrapper[18592]: I0308 03:57:42.151467 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1eb851be-f157-48ea-9a39-1361b68d2639" path="/var/lib/kubelet/pods/1eb851be-f157-48ea-9a39-1361b68d2639/volumes"
Mar 08 03:57:42.975512 master-0 kubenswrapper[18592]: I0308 03:57:42.975460 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-7748864899-8p6h5"]
Mar 08 03:57:42.976548 master-0 kubenswrapper[18592]: I0308 03:57:42.976504 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7748864899-8p6h5"
Mar 08 03:57:42.995942 master-0 kubenswrapper[18592]: I0308 03:57:42.995684 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Mar 08 03:57:43.002849 master-0 kubenswrapper[18592]: I0308 03:57:43.002025 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7748864899-8p6h5"]
Mar 08 03:57:43.140856 master-0 kubenswrapper[18592]: I0308 03:57:43.140795 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5bc0469d-1ae9-4606-ba99-7a333b66af37-console-oauth-config\") pod \"console-7748864899-8p6h5\" (UID: \"5bc0469d-1ae9-4606-ba99-7a333b66af37\") " pod="openshift-console/console-7748864899-8p6h5"
Mar 08 03:57:43.140856 master-0 kubenswrapper[18592]: I0308 03:57:43.140861 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5bc0469d-1ae9-4606-ba99-7a333b66af37-console-serving-cert\") pod \"console-7748864899-8p6h5\" (UID: \"5bc0469d-1ae9-4606-ba99-7a333b66af37\") " pod="openshift-console/console-7748864899-8p6h5"
Mar 08 03:57:43.141119 master-0 kubenswrapper[18592]: I0308 03:57:43.140887 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zh2mj\" (UniqueName: \"kubernetes.io/projected/5bc0469d-1ae9-4606-ba99-7a333b66af37-kube-api-access-zh2mj\") pod \"console-7748864899-8p6h5\" (UID: \"5bc0469d-1ae9-4606-ba99-7a333b66af37\") " pod="openshift-console/console-7748864899-8p6h5"
Mar 08 03:57:43.141119 master-0 kubenswrapper[18592]: I0308 03:57:43.140910 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5bc0469d-1ae9-4606-ba99-7a333b66af37-service-ca\") pod \"console-7748864899-8p6h5\" (UID: \"5bc0469d-1ae9-4606-ba99-7a333b66af37\") " pod="openshift-console/console-7748864899-8p6h5"
Mar 08 03:57:43.141119 master-0 kubenswrapper[18592]: I0308 03:57:43.140935 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5bc0469d-1ae9-4606-ba99-7a333b66af37-trusted-ca-bundle\") pod \"console-7748864899-8p6h5\" (UID: \"5bc0469d-1ae9-4606-ba99-7a333b66af37\") " pod="openshift-console/console-7748864899-8p6h5"
Mar 08 03:57:43.141234 master-0 kubenswrapper[18592]: I0308 03:57:43.141152 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5bc0469d-1ae9-4606-ba99-7a333b66af37-oauth-serving-cert\") pod \"console-7748864899-8p6h5\" (UID: \"5bc0469d-1ae9-4606-ba99-7a333b66af37\") " pod="openshift-console/console-7748864899-8p6h5"
Mar 08 03:57:43.141351 master-0 kubenswrapper[18592]: I0308 03:57:43.141326 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5bc0469d-1ae9-4606-ba99-7a333b66af37-console-config\") pod \"console-7748864899-8p6h5\" (UID: \"5bc0469d-1ae9-4606-ba99-7a333b66af37\") " pod="openshift-console/console-7748864899-8p6h5"
Mar 08 03:57:43.242647 master-0 kubenswrapper[18592]: I0308 03:57:43.242450 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5bc0469d-1ae9-4606-ba99-7a333b66af37-console-config\") pod \"console-7748864899-8p6h5\" (UID: \"5bc0469d-1ae9-4606-ba99-7a333b66af37\") " pod="openshift-console/console-7748864899-8p6h5"
Mar 08 03:57:43.242960 master-0 kubenswrapper[18592]: I0308 03:57:43.242737 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5bc0469d-1ae9-4606-ba99-7a333b66af37-console-oauth-config\") pod \"console-7748864899-8p6h5\" (UID: \"5bc0469d-1ae9-4606-ba99-7a333b66af37\") " pod="openshift-console/console-7748864899-8p6h5"
Mar 08 03:57:43.242960 master-0 kubenswrapper[18592]: I0308 03:57:43.242773 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5bc0469d-1ae9-4606-ba99-7a333b66af37-console-serving-cert\") pod \"console-7748864899-8p6h5\" (UID: \"5bc0469d-1ae9-4606-ba99-7a333b66af37\") " pod="openshift-console/console-7748864899-8p6h5"
Mar 08 03:57:43.242960 master-0 kubenswrapper[18592]: I0308 03:57:43.242800 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zh2mj\" (UniqueName: \"kubernetes.io/projected/5bc0469d-1ae9-4606-ba99-7a333b66af37-kube-api-access-zh2mj\") pod \"console-7748864899-8p6h5\" (UID: \"5bc0469d-1ae9-4606-ba99-7a333b66af37\") " pod="openshift-console/console-7748864899-8p6h5"
Mar 08 03:57:43.242960 master-0 kubenswrapper[18592]: I0308 03:57:43.242857 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5bc0469d-1ae9-4606-ba99-7a333b66af37-service-ca\") pod \"console-7748864899-8p6h5\" (UID: \"5bc0469d-1ae9-4606-ba99-7a333b66af37\") " pod="openshift-console/console-7748864899-8p6h5"
Mar 08 03:57:43.242960 master-0 kubenswrapper[18592]: I0308 03:57:43.242893 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5bc0469d-1ae9-4606-ba99-7a333b66af37-trusted-ca-bundle\") pod \"console-7748864899-8p6h5\" (UID: \"5bc0469d-1ae9-4606-ba99-7a333b66af37\") " pod="openshift-console/console-7748864899-8p6h5"
Mar 08 03:57:43.242960 master-0 kubenswrapper[18592]: I0308 03:57:43.242935 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5bc0469d-1ae9-4606-ba99-7a333b66af37-oauth-serving-cert\") pod \"console-7748864899-8p6h5\" (UID: \"5bc0469d-1ae9-4606-ba99-7a333b66af37\") " pod="openshift-console/console-7748864899-8p6h5"
Mar 08 03:57:43.244558 master-0 kubenswrapper[18592]: I0308 03:57:43.244488 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5bc0469d-1ae9-4606-ba99-7a333b66af37-oauth-serving-cert\") pod \"console-7748864899-8p6h5\" (UID: \"5bc0469d-1ae9-4606-ba99-7a333b66af37\") " pod="openshift-console/console-7748864899-8p6h5"
Mar 08 03:57:43.245249 master-0 kubenswrapper[18592]: I0308 03:57:43.245210 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5bc0469d-1ae9-4606-ba99-7a333b66af37-trusted-ca-bundle\") pod \"console-7748864899-8p6h5\" (UID: \"5bc0469d-1ae9-4606-ba99-7a333b66af37\") " pod="openshift-console/console-7748864899-8p6h5"
Mar 08 03:57:43.245312 master-0 kubenswrapper[18592]: I0308 03:57:43.245220 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5bc0469d-1ae9-4606-ba99-7a333b66af37-service-ca\") pod \"console-7748864899-8p6h5\" (UID: \"5bc0469d-1ae9-4606-ba99-7a333b66af37\") " pod="openshift-console/console-7748864899-8p6h5"
Mar 08 03:57:43.245908 master-0 kubenswrapper[18592]: I0308 03:57:43.245655 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5bc0469d-1ae9-4606-ba99-7a333b66af37-console-config\") pod \"console-7748864899-8p6h5\" (UID: \"5bc0469d-1ae9-4606-ba99-7a333b66af37\") " pod="openshift-console/console-7748864899-8p6h5"
Mar 08 03:57:43.249593 master-0 kubenswrapper[18592]: I0308 03:57:43.249563 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5bc0469d-1ae9-4606-ba99-7a333b66af37-console-serving-cert\") pod \"console-7748864899-8p6h5\" (UID: \"5bc0469d-1ae9-4606-ba99-7a333b66af37\") " pod="openshift-console/console-7748864899-8p6h5"
Mar 08 03:57:43.250394 master-0 kubenswrapper[18592]: I0308 03:57:43.250342 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5bc0469d-1ae9-4606-ba99-7a333b66af37-console-oauth-config\") pod \"console-7748864899-8p6h5\" (UID: \"5bc0469d-1ae9-4606-ba99-7a333b66af37\") " pod="openshift-console/console-7748864899-8p6h5"
Mar 08 03:57:43.258232 master-0 kubenswrapper[18592]: I0308 03:57:43.257608 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zh2mj\" (UniqueName: \"kubernetes.io/projected/5bc0469d-1ae9-4606-ba99-7a333b66af37-kube-api-access-zh2mj\") pod \"console-7748864899-8p6h5\" (UID: \"5bc0469d-1ae9-4606-ba99-7a333b66af37\") " pod="openshift-console/console-7748864899-8p6h5"
Mar 08 03:57:43.301055 master-0 kubenswrapper[18592]: I0308 03:57:43.300975 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7748864899-8p6h5"
Mar 08 03:57:45.173709 master-0 kubenswrapper[18592]: I0308 03:57:45.173528 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cacb9582-2132-4543-8a31-7b100ba4dd2f-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"cacb9582-2132-4543-8a31-7b100ba4dd2f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 03:57:45.175902 master-0 kubenswrapper[18592]: I0308 03:57:45.175800 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cacb9582-2132-4543-8a31-7b100ba4dd2f-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"cacb9582-2132-4543-8a31-7b100ba4dd2f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 03:57:45.188938 master-0 kubenswrapper[18592]: I0308 03:57:45.188867 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-dockercfg-65bbw"
Mar 08 03:57:45.197312 master-0 kubenswrapper[18592]: I0308 03:57:45.197260 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Mar 08 03:57:45.319564 master-0 kubenswrapper[18592]: I0308 03:57:45.319481 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7748864899-8p6h5"]
Mar 08 03:57:45.320992 master-0 kubenswrapper[18592]: W0308 03:57:45.320943 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5bc0469d_1ae9_4606_ba99_7a333b66af37.slice/crio-4fa7e88b1d76afea6d135ea7ac69bfcebd04384bc46d8431bd99191c0769b3f0 WatchSource:0}: Error finding container 4fa7e88b1d76afea6d135ea7ac69bfcebd04384bc46d8431bd99191c0769b3f0: Status 404 returned error can't find the container with id 4fa7e88b1d76afea6d135ea7ac69bfcebd04384bc46d8431bd99191c0769b3f0
Mar 08 03:57:45.643669 master-0 kubenswrapper[18592]: I0308 03:57:45.643514 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-744db48f96-lgsd4" event={"ID":"935ab7fb-b097-41c3-8926-8343eb29e7fc","Type":"ContainerStarted","Data":"576ba1799553100c4affee23b8bf267786a47451fbe5e13058a6627861ec622f"}
Mar 08 03:57:45.645861 master-0 kubenswrapper[18592]: I0308 03:57:45.645719 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7748864899-8p6h5" event={"ID":"5bc0469d-1ae9-4606-ba99-7a333b66af37","Type":"ContainerStarted","Data":"e72150628b7e9311091e98f72ec77d7eee04d74ad6ce896529075ce288be841e"}
Mar 08 03:57:45.645861 master-0 kubenswrapper[18592]: I0308 03:57:45.645879 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7748864899-8p6h5" event={"ID":"5bc0469d-1ae9-4606-ba99-7a333b66af37","Type":"ContainerStarted","Data":"4fa7e88b1d76afea6d135ea7ac69bfcebd04384bc46d8431bd99191c0769b3f0"}
Mar 08 03:57:45.675137 master-0 kubenswrapper[18592]: I0308 03:57:45.675020 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-744db48f96-lgsd4" podStartSLOduration=1.9655817679999998 podStartE2EDuration="5.674984846s" podCreationTimestamp="2026-03-08 03:57:40 +0000 UTC" firstStartedPulling="2026-03-08 03:57:41.137131154 +0000 UTC m=+273.235885504" lastFinishedPulling="2026-03-08 03:57:44.846534232 +0000 UTC m=+276.945288582" observedRunningTime="2026-03-08 03:57:45.671480131 +0000 UTC m=+277.770234511" watchObservedRunningTime="2026-03-08 03:57:45.674984846 +0000 UTC m=+277.773739186"
Mar 08 03:57:45.697618 master-0 kubenswrapper[18592]: I0308 03:57:45.697556 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Mar 08 03:57:45.706330 master-0 kubenswrapper[18592]: I0308 03:57:45.706242 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7748864899-8p6h5" podStartSLOduration=3.706217721 podStartE2EDuration="3.706217721s" podCreationTimestamp="2026-03-08 03:57:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:57:45.704725975 +0000 UTC m=+277.803480355" watchObservedRunningTime="2026-03-08 03:57:45.706217721 +0000 UTC m=+277.804972091"
Mar 08 03:57:46.656279 master-0 kubenswrapper[18592]: I0308 03:57:46.656198 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cacb9582-2132-4543-8a31-7b100ba4dd2f","Type":"ContainerStarted","Data":"3d93dc780d604df774e75a0674240fd776184772216c04f04a16c061ca8f1739"}
Mar 08 03:57:47.490122 master-0 kubenswrapper[18592]: E0308 03:57:47.490056 18592 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[prometheus-trusted-ca-bundle], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-monitoring/prometheus-k8s-0" podUID="a26c661f-f843-45c5-85f0-2c2f72cbf580"
Mar 08 03:57:47.663766 master-0 kubenswrapper[18592]: I0308 03:57:47.663707 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Mar 08 03:57:48.671363 master-0 kubenswrapper[18592]: I0308 03:57:48.671236 18592 generic.go:334] "Generic (PLEG): container finished" podID="cacb9582-2132-4543-8a31-7b100ba4dd2f" containerID="296f822f770e0810353c5af5937b6f7a98bd88f53e04415bc2c63dbf385c5929" exitCode=0
Mar 08 03:57:48.671363 master-0 kubenswrapper[18592]: I0308 03:57:48.671283 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cacb9582-2132-4543-8a31-7b100ba4dd2f","Type":"ContainerDied","Data":"296f822f770e0810353c5af5937b6f7a98bd88f53e04415bc2c63dbf385c5929"}
Mar 08 03:57:50.614903 master-0 kubenswrapper[18592]: I0308 03:57:50.613872 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-4-master-0"]
Mar 08 03:57:50.614903 master-0 kubenswrapper[18592]: I0308 03:57:50.614076 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/installer-4-master-0" podUID="cde83292-7f24-4a84-aa85-e917c0f33a02" containerName="installer" containerID="cri-o://cd4bd59e144a886da1266aa75a6e84b10456106610dd8827a7a1300c5f80d968" gracePeriod=30
Mar 08 03:57:50.669271 master-0 kubenswrapper[18592]: I0308 03:57:50.669207 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a26c661f-f843-45c5-85f0-2c2f72cbf580-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 08 03:57:50.671197 master-0 kubenswrapper[18592]: I0308 03:57:50.670246 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a26c661f-f843-45c5-85f0-2c2f72cbf580-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 08 03:57:50.675934 master-0 kubenswrapper[18592]: I0308 03:57:50.675889 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-744db48f96-lgsd4"
Mar 08 03:57:50.676023 master-0 kubenswrapper[18592]: I0308 03:57:50.675950 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-744db48f96-lgsd4"
Mar 08 03:57:50.677684 master-0 kubenswrapper[18592]: I0308 03:57:50.677627 18592 patch_prober.go:28] interesting pod/console-744db48f96-lgsd4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" start-of-body=
Mar 08 03:57:50.677752 master-0 kubenswrapper[18592]: I0308 03:57:50.677704 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-744db48f96-lgsd4" podUID="935ab7fb-b097-41c3-8926-8343eb29e7fc" containerName="console" probeResult="failure" output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused"
Mar 08 03:57:50.967008 master-0 kubenswrapper[18592]: I0308 03:57:50.966936 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-dockercfg-snc9w"
Mar 08 03:57:50.975608 master-0 kubenswrapper[18592]: I0308 03:57:50.975579 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Mar 08 03:57:51.747423 master-0 kubenswrapper[18592]: I0308 03:57:51.747364 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Mar 08 03:57:51.756461 master-0 kubenswrapper[18592]: W0308 03:57:51.756399 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda26c661f_f843_45c5_85f0_2c2f72cbf580.slice/crio-e71332738e2ae39fa0b5c7bf8bf8ed3118528a8aa66a7bb7d76359721eba9710 WatchSource:0}: Error finding container e71332738e2ae39fa0b5c7bf8bf8ed3118528a8aa66a7bb7d76359721eba9710: Status 404 returned error can't find the container with id e71332738e2ae39fa0b5c7bf8bf8ed3118528a8aa66a7bb7d76359721eba9710
Mar 08 03:57:52.706963 master-0 kubenswrapper[18592]: I0308 03:57:52.706880 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a26c661f-f843-45c5-85f0-2c2f72cbf580","Type":"ContainerStarted","Data":"e71332738e2ae39fa0b5c7bf8bf8ed3118528a8aa66a7bb7d76359721eba9710"}
Mar 08 03:57:53.304586 master-0 kubenswrapper[18592]: I0308 03:57:53.304525 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7748864899-8p6h5"
Mar 08 03:57:53.304586 master-0 kubenswrapper[18592]: I0308 03:57:53.304588 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-7748864899-8p6h5"
Mar 08 03:57:53.305530 master-0 kubenswrapper[18592]: I0308 03:57:53.305479 18592 patch_prober.go:28] interesting pod/console-7748864899-8p6h5 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body=
Mar 08 03:57:53.305593 master-0 kubenswrapper[18592]: I0308 03:57:53.305538 18592 prober.go:107] "Probe failed" probeType="Startup"
pod="openshift-console/console-7748864899-8p6h5" podUID="5bc0469d-1ae9-4606-ba99-7a333b66af37" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" Mar 08 03:57:53.669007 master-0 kubenswrapper[18592]: I0308 03:57:53.668927 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-5-master-0"] Mar 08 03:57:53.669924 master-0 kubenswrapper[18592]: I0308 03:57:53.669839 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-5-master-0" Mar 08 03:57:53.686929 master-0 kubenswrapper[18592]: I0308 03:57:53.686785 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-5-master-0"] Mar 08 03:57:53.722600 master-0 kubenswrapper[18592]: I0308 03:57:53.721540 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/57550d28-9d40-436b-b20f-da41f30a0e6b-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"57550d28-9d40-436b-b20f-da41f30a0e6b\") " pod="openshift-kube-apiserver/installer-5-master-0" Mar 08 03:57:53.722600 master-0 kubenswrapper[18592]: I0308 03:57:53.721627 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/57550d28-9d40-436b-b20f-da41f30a0e6b-var-lock\") pod \"installer-5-master-0\" (UID: \"57550d28-9d40-436b-b20f-da41f30a0e6b\") " pod="openshift-kube-apiserver/installer-5-master-0" Mar 08 03:57:53.722600 master-0 kubenswrapper[18592]: I0308 03:57:53.721647 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/57550d28-9d40-436b-b20f-da41f30a0e6b-kube-api-access\") pod \"installer-5-master-0\" (UID: 
\"57550d28-9d40-436b-b20f-da41f30a0e6b\") " pod="openshift-kube-apiserver/installer-5-master-0" Mar 08 03:57:53.723035 master-0 kubenswrapper[18592]: I0308 03:57:53.723001 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a26c661f-f843-45c5-85f0-2c2f72cbf580","Type":"ContainerStarted","Data":"a3114d26662f332759817838953335b2c8cca6aba781706cddff99047422be36"} Mar 08 03:57:53.824320 master-0 kubenswrapper[18592]: I0308 03:57:53.822962 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/57550d28-9d40-436b-b20f-da41f30a0e6b-var-lock\") pod \"installer-5-master-0\" (UID: \"57550d28-9d40-436b-b20f-da41f30a0e6b\") " pod="openshift-kube-apiserver/installer-5-master-0" Mar 08 03:57:53.824320 master-0 kubenswrapper[18592]: I0308 03:57:53.823067 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/57550d28-9d40-436b-b20f-da41f30a0e6b-kube-api-access\") pod \"installer-5-master-0\" (UID: \"57550d28-9d40-436b-b20f-da41f30a0e6b\") " pod="openshift-kube-apiserver/installer-5-master-0" Mar 08 03:57:53.824320 master-0 kubenswrapper[18592]: I0308 03:57:53.823306 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/57550d28-9d40-436b-b20f-da41f30a0e6b-var-lock\") pod \"installer-5-master-0\" (UID: \"57550d28-9d40-436b-b20f-da41f30a0e6b\") " pod="openshift-kube-apiserver/installer-5-master-0" Mar 08 03:57:53.824320 master-0 kubenswrapper[18592]: I0308 03:57:53.823521 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/57550d28-9d40-436b-b20f-da41f30a0e6b-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"57550d28-9d40-436b-b20f-da41f30a0e6b\") " pod="openshift-kube-apiserver/installer-5-master-0" Mar 
08 03:57:53.824320 master-0 kubenswrapper[18592]: I0308 03:57:53.823861 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/57550d28-9d40-436b-b20f-da41f30a0e6b-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"57550d28-9d40-436b-b20f-da41f30a0e6b\") " pod="openshift-kube-apiserver/installer-5-master-0" Mar 08 03:57:53.841125 master-0 kubenswrapper[18592]: I0308 03:57:53.840633 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/57550d28-9d40-436b-b20f-da41f30a0e6b-kube-api-access\") pod \"installer-5-master-0\" (UID: \"57550d28-9d40-436b-b20f-da41f30a0e6b\") " pod="openshift-kube-apiserver/installer-5-master-0" Mar 08 03:57:54.020428 master-0 kubenswrapper[18592]: I0308 03:57:54.020363 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-5-master-0" Mar 08 03:57:54.424436 master-0 kubenswrapper[18592]: I0308 03:57:54.423540 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-5-master-0"] Mar 08 03:57:54.735528 master-0 kubenswrapper[18592]: I0308 03:57:54.735360 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cacb9582-2132-4543-8a31-7b100ba4dd2f","Type":"ContainerStarted","Data":"94dbcdb89aa0f263874521912d133035369312397d1ad983f2732352b74fb994"} Mar 08 03:57:54.735528 master-0 kubenswrapper[18592]: I0308 03:57:54.735431 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cacb9582-2132-4543-8a31-7b100ba4dd2f","Type":"ContainerStarted","Data":"762bc926182fe44b4af81804c22991bc1dff6e23202c8882e8000e3e2aaad012"} Mar 08 03:57:54.735528 master-0 kubenswrapper[18592]: I0308 03:57:54.735443 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cacb9582-2132-4543-8a31-7b100ba4dd2f","Type":"ContainerStarted","Data":"d23a898658aeb28f11810d92075f16eb074d90b8b3a0a2f14b7f1f5e6e3c3074"} Mar 08 03:57:54.735528 master-0 kubenswrapper[18592]: I0308 03:57:54.735454 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cacb9582-2132-4543-8a31-7b100ba4dd2f","Type":"ContainerStarted","Data":"5b6b052f8bf0820cfe48407dc8704560b66216dfe0ff3c4511019bfcdbea1ef2"} Mar 08 03:57:54.735528 master-0 kubenswrapper[18592]: I0308 03:57:54.735464 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cacb9582-2132-4543-8a31-7b100ba4dd2f","Type":"ContainerStarted","Data":"34335e826503ea10aadf683d38cad805f434f5b33127bda531c7acdcb51fa3a5"} Mar 08 03:57:54.735528 master-0 kubenswrapper[18592]: I0308 03:57:54.735476 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cacb9582-2132-4543-8a31-7b100ba4dd2f","Type":"ContainerStarted","Data":"8ffa2ebcaa54a41e98873119e7f53fb354c1c0439475fce800c91d93f862c5b3"} Mar 08 03:57:54.736662 master-0 kubenswrapper[18592]: I0308 03:57:54.736612 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-5-master-0" event={"ID":"57550d28-9d40-436b-b20f-da41f30a0e6b","Type":"ContainerStarted","Data":"5943e770bf612f37cde4c4f29602d84e71717a59f303c6cb668defe2ccdd5f53"} Mar 08 03:57:54.738282 master-0 kubenswrapper[18592]: I0308 03:57:54.738226 18592 generic.go:334] "Generic (PLEG): container finished" podID="a26c661f-f843-45c5-85f0-2c2f72cbf580" containerID="a3114d26662f332759817838953335b2c8cca6aba781706cddff99047422be36" exitCode=0 Mar 08 03:57:54.738282 master-0 kubenswrapper[18592]: I0308 03:57:54.738268 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"a26c661f-f843-45c5-85f0-2c2f72cbf580","Type":"ContainerDied","Data":"a3114d26662f332759817838953335b2c8cca6aba781706cddff99047422be36"} Mar 08 03:57:54.768448 master-0 kubenswrapper[18592]: I0308 03:57:54.768365 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=251.838415694 podStartE2EDuration="4m19.768337736s" podCreationTimestamp="2026-03-08 03:53:35 +0000 UTC" firstStartedPulling="2026-03-08 03:57:45.707172384 +0000 UTC m=+277.805926744" lastFinishedPulling="2026-03-08 03:57:53.637094426 +0000 UTC m=+285.735848786" observedRunningTime="2026-03-08 03:57:54.768222303 +0000 UTC m=+286.866976673" watchObservedRunningTime="2026-03-08 03:57:54.768337736 +0000 UTC m=+286.867092116" Mar 08 03:57:55.746086 master-0 kubenswrapper[18592]: I0308 03:57:55.746031 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-5-master-0" event={"ID":"57550d28-9d40-436b-b20f-da41f30a0e6b","Type":"ContainerStarted","Data":"cb5e37fbfe5cc1bcf2f3f419a6b9f123db459cab245005e63a2065425117f690"} Mar 08 03:57:58.776617 master-0 kubenswrapper[18592]: I0308 03:57:58.776554 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a26c661f-f843-45c5-85f0-2c2f72cbf580","Type":"ContainerStarted","Data":"4e242806264dd4548803e4e95071aa00ff9cec6da04585217fd9296886017b56"} Mar 08 03:57:58.777070 master-0 kubenswrapper[18592]: I0308 03:57:58.776620 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a26c661f-f843-45c5-85f0-2c2f72cbf580","Type":"ContainerStarted","Data":"45ff740367d4cd8aed88382d0c17b62ac56a016c1d16c630e57980524661978f"} Mar 08 03:57:58.777070 master-0 kubenswrapper[18592]: I0308 03:57:58.776637 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"a26c661f-f843-45c5-85f0-2c2f72cbf580","Type":"ContainerStarted","Data":"0b1556d958a969007bd0a7dc73cb738262531fc4d85517dab18227f342d41488"} Mar 08 03:57:58.777070 master-0 kubenswrapper[18592]: I0308 03:57:58.776650 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a26c661f-f843-45c5-85f0-2c2f72cbf580","Type":"ContainerStarted","Data":"d994098b937be725aa395b7493095859b38e1ecb7ea832e2ee45597ef7cb9baf"} Mar 08 03:57:58.777070 master-0 kubenswrapper[18592]: I0308 03:57:58.776666 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a26c661f-f843-45c5-85f0-2c2f72cbf580","Type":"ContainerStarted","Data":"9af3f273ab2e4954fd8c62788ec7531287e1944e6d561bcb5c4debba6106c3d2"} Mar 08 03:57:59.800031 master-0 kubenswrapper[18592]: I0308 03:57:59.799857 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a26c661f-f843-45c5-85f0-2c2f72cbf580","Type":"ContainerStarted","Data":"72eeeb9e8301aec92bcbf0c03c9b41cbd449ca5585e0010e8907c03068bd39b1"} Mar 08 03:57:59.865014 master-0 kubenswrapper[18592]: I0308 03:57:59.861682 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=256.583741565 podStartE2EDuration="4m19.861657425s" podCreationTimestamp="2026-03-08 03:53:40 +0000 UTC" firstStartedPulling="2026-03-08 03:57:54.74091928 +0000 UTC m=+286.839673630" lastFinishedPulling="2026-03-08 03:57:58.01883514 +0000 UTC m=+290.117589490" observedRunningTime="2026-03-08 03:57:59.860971787 +0000 UTC m=+291.959726167" watchObservedRunningTime="2026-03-08 03:57:59.861657425 +0000 UTC m=+291.960411805" Mar 08 03:57:59.868214 master-0 kubenswrapper[18592]: I0308 03:57:59.868133 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-5-master-0" 
podStartSLOduration=6.868107006 podStartE2EDuration="6.868107006s" podCreationTimestamp="2026-03-08 03:57:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:57:55.766310906 +0000 UTC m=+287.865065256" watchObservedRunningTime="2026-03-08 03:57:59.868107006 +0000 UTC m=+291.966861386" Mar 08 03:58:00.675729 master-0 kubenswrapper[18592]: I0308 03:58:00.675679 18592 patch_prober.go:28] interesting pod/console-744db48f96-lgsd4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" start-of-body= Mar 08 03:58:00.675998 master-0 kubenswrapper[18592]: I0308 03:58:00.675736 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-744db48f96-lgsd4" podUID="935ab7fb-b097-41c3-8926-8343eb29e7fc" containerName="console" probeResult="failure" output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" Mar 08 03:58:00.977234 master-0 kubenswrapper[18592]: I0308 03:58:00.976464 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:58:03.303030 master-0 kubenswrapper[18592]: I0308 03:58:03.302918 18592 patch_prober.go:28] interesting pod/console-7748864899-8p6h5 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body= Mar 08 03:58:03.304028 master-0 kubenswrapper[18592]: I0308 03:58:03.303060 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-7748864899-8p6h5" podUID="5bc0469d-1ae9-4606-ba99-7a333b66af37" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: 
connection refused" Mar 08 03:58:04.896712 master-0 kubenswrapper[18592]: I0308 03:58:04.896645 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-d9b6b47f9-85srg" podUID="8ae7ed78-1761-4974-977a-1bc16c87bd91" containerName="oauth-openshift" containerID="cri-o://83104ae7907744071db87e84bc9a9478d626ea2deb92206d324e81616d00d3f4" gracePeriod=15 Mar 08 03:58:05.514014 master-0 kubenswrapper[18592]: I0308 03:58:05.513975 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-d9b6b47f9-85srg" Mar 08 03:58:05.563146 master-0 kubenswrapper[18592]: I0308 03:58:05.563096 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-6fd77597b5-w649n"] Mar 08 03:58:05.563398 master-0 kubenswrapper[18592]: E0308 03:58:05.563355 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ae7ed78-1761-4974-977a-1bc16c87bd91" containerName="oauth-openshift" Mar 08 03:58:05.563398 master-0 kubenswrapper[18592]: I0308 03:58:05.563370 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ae7ed78-1761-4974-977a-1bc16c87bd91" containerName="oauth-openshift" Mar 08 03:58:05.563533 master-0 kubenswrapper[18592]: I0308 03:58:05.563513 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ae7ed78-1761-4974-977a-1bc16c87bd91" containerName="oauth-openshift" Mar 08 03:58:05.564242 master-0 kubenswrapper[18592]: I0308 03:58:05.563990 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-6fd77597b5-w649n" Mar 08 03:58:05.598625 master-0 kubenswrapper[18592]: I0308 03:58:05.598560 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6fd77597b5-w649n"] Mar 08 03:58:05.639514 master-0 kubenswrapper[18592]: I0308 03:58:05.639427 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-system-router-certs\") pod \"8ae7ed78-1761-4974-977a-1bc16c87bd91\" (UID: \"8ae7ed78-1761-4974-977a-1bc16c87bd91\") " Mar 08 03:58:05.639514 master-0 kubenswrapper[18592]: I0308 03:58:05.639513 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8ae7ed78-1761-4974-977a-1bc16c87bd91-audit-policies\") pod \"8ae7ed78-1761-4974-977a-1bc16c87bd91\" (UID: \"8ae7ed78-1761-4974-977a-1bc16c87bd91\") " Mar 08 03:58:05.639514 master-0 kubenswrapper[18592]: I0308 03:58:05.639557 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-user-template-login\") pod \"8ae7ed78-1761-4974-977a-1bc16c87bd91\" (UID: \"8ae7ed78-1761-4974-977a-1bc16c87bd91\") " Mar 08 03:58:05.640002 master-0 kubenswrapper[18592]: I0308 03:58:05.639597 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-system-ocp-branding-template\") pod \"8ae7ed78-1761-4974-977a-1bc16c87bd91\" (UID: \"8ae7ed78-1761-4974-977a-1bc16c87bd91\") " Mar 08 03:58:05.640002 master-0 kubenswrapper[18592]: I0308 03:58:05.639621 18592 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-system-trusted-ca-bundle\") pod \"8ae7ed78-1761-4974-977a-1bc16c87bd91\" (UID: \"8ae7ed78-1761-4974-977a-1bc16c87bd91\") " Mar 08 03:58:05.640002 master-0 kubenswrapper[18592]: I0308 03:58:05.639654 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8ae7ed78-1761-4974-977a-1bc16c87bd91-audit-dir\") pod \"8ae7ed78-1761-4974-977a-1bc16c87bd91\" (UID: \"8ae7ed78-1761-4974-977a-1bc16c87bd91\") " Mar 08 03:58:05.640002 master-0 kubenswrapper[18592]: I0308 03:58:05.639692 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-system-service-ca\") pod \"8ae7ed78-1761-4974-977a-1bc16c87bd91\" (UID: \"8ae7ed78-1761-4974-977a-1bc16c87bd91\") " Mar 08 03:58:05.640002 master-0 kubenswrapper[18592]: I0308 03:58:05.639715 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-system-cliconfig\") pod \"8ae7ed78-1761-4974-977a-1bc16c87bd91\" (UID: \"8ae7ed78-1761-4974-977a-1bc16c87bd91\") " Mar 08 03:58:05.640002 master-0 kubenswrapper[18592]: I0308 03:58:05.639732 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-system-session\") pod \"8ae7ed78-1761-4974-977a-1bc16c87bd91\" (UID: \"8ae7ed78-1761-4974-977a-1bc16c87bd91\") " Mar 08 03:58:05.640002 master-0 kubenswrapper[18592]: I0308 03:58:05.639748 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-7skln\" (UniqueName: \"kubernetes.io/projected/8ae7ed78-1761-4974-977a-1bc16c87bd91-kube-api-access-7skln\") pod \"8ae7ed78-1761-4974-977a-1bc16c87bd91\" (UID: \"8ae7ed78-1761-4974-977a-1bc16c87bd91\") " Mar 08 03:58:05.640002 master-0 kubenswrapper[18592]: I0308 03:58:05.639777 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-user-template-error\") pod \"8ae7ed78-1761-4974-977a-1bc16c87bd91\" (UID: \"8ae7ed78-1761-4974-977a-1bc16c87bd91\") " Mar 08 03:58:05.640002 master-0 kubenswrapper[18592]: I0308 03:58:05.639796 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-user-template-provider-selection\") pod \"8ae7ed78-1761-4974-977a-1bc16c87bd91\" (UID: \"8ae7ed78-1761-4974-977a-1bc16c87bd91\") " Mar 08 03:58:05.640002 master-0 kubenswrapper[18592]: I0308 03:58:05.639835 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-system-serving-cert\") pod \"8ae7ed78-1761-4974-977a-1bc16c87bd91\" (UID: \"8ae7ed78-1761-4974-977a-1bc16c87bd91\") " Mar 08 03:58:05.640465 master-0 kubenswrapper[18592]: I0308 03:58:05.640440 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8ae7ed78-1761-4974-977a-1bc16c87bd91-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "8ae7ed78-1761-4974-977a-1bc16c87bd91" (UID: "8ae7ed78-1761-4974-977a-1bc16c87bd91"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:58:05.642723 master-0 kubenswrapper[18592]: I0308 03:58:05.642697 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "8ae7ed78-1761-4974-977a-1bc16c87bd91" (UID: "8ae7ed78-1761-4974-977a-1bc16c87bd91"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:58:05.643051 master-0 kubenswrapper[18592]: I0308 03:58:05.642988 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "8ae7ed78-1761-4974-977a-1bc16c87bd91" (UID: "8ae7ed78-1761-4974-977a-1bc16c87bd91"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:58:05.643051 master-0 kubenswrapper[18592]: I0308 03:58:05.643009 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ae7ed78-1761-4974-977a-1bc16c87bd91-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "8ae7ed78-1761-4974-977a-1bc16c87bd91" (UID: "8ae7ed78-1761-4974-977a-1bc16c87bd91"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:58:05.643147 master-0 kubenswrapper[18592]: I0308 03:58:05.642544 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "8ae7ed78-1761-4974-977a-1bc16c87bd91" (UID: "8ae7ed78-1761-4974-977a-1bc16c87bd91"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:58:05.643356 master-0 kubenswrapper[18592]: I0308 03:58:05.643328 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "8ae7ed78-1761-4974-977a-1bc16c87bd91" (UID: "8ae7ed78-1761-4974-977a-1bc16c87bd91"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:58:05.644002 master-0 kubenswrapper[18592]: I0308 03:58:05.643958 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "8ae7ed78-1761-4974-977a-1bc16c87bd91" (UID: "8ae7ed78-1761-4974-977a-1bc16c87bd91"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:58:05.644387 master-0 kubenswrapper[18592]: I0308 03:58:05.644344 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "8ae7ed78-1761-4974-977a-1bc16c87bd91" (UID: "8ae7ed78-1761-4974-977a-1bc16c87bd91"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:58:05.645317 master-0 kubenswrapper[18592]: I0308 03:58:05.645272 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "8ae7ed78-1761-4974-977a-1bc16c87bd91" (UID: "8ae7ed78-1761-4974-977a-1bc16c87bd91"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:58:05.646386 master-0 kubenswrapper[18592]: I0308 03:58:05.646360 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "8ae7ed78-1761-4974-977a-1bc16c87bd91" (UID: "8ae7ed78-1761-4974-977a-1bc16c87bd91"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:58:05.647064 master-0 kubenswrapper[18592]: I0308 03:58:05.647037 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "8ae7ed78-1761-4974-977a-1bc16c87bd91" (UID: "8ae7ed78-1761-4974-977a-1bc16c87bd91"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:58:05.654565 master-0 kubenswrapper[18592]: I0308 03:58:05.654506 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "8ae7ed78-1761-4974-977a-1bc16c87bd91" (UID: "8ae7ed78-1761-4974-977a-1bc16c87bd91"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:58:05.655228 master-0 kubenswrapper[18592]: I0308 03:58:05.655072 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ae7ed78-1761-4974-977a-1bc16c87bd91-kube-api-access-7skln" (OuterVolumeSpecName: "kube-api-access-7skln") pod "8ae7ed78-1761-4974-977a-1bc16c87bd91" (UID: "8ae7ed78-1761-4974-977a-1bc16c87bd91"). 
InnerVolumeSpecName "kube-api-access-7skln". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:58:05.741457 master-0 kubenswrapper[18592]: I0308 03:58:05.741322 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/836c1be1-26de-4840-8ba6-9d34a751aebc-v4-0-config-system-session\") pod \"oauth-openshift-6fd77597b5-w649n\" (UID: \"836c1be1-26de-4840-8ba6-9d34a751aebc\") " pod="openshift-authentication/oauth-openshift-6fd77597b5-w649n" Mar 08 03:58:05.741658 master-0 kubenswrapper[18592]: I0308 03:58:05.741417 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/836c1be1-26de-4840-8ba6-9d34a751aebc-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6fd77597b5-w649n\" (UID: \"836c1be1-26de-4840-8ba6-9d34a751aebc\") " pod="openshift-authentication/oauth-openshift-6fd77597b5-w649n" Mar 08 03:58:05.741658 master-0 kubenswrapper[18592]: I0308 03:58:05.741646 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/836c1be1-26de-4840-8ba6-9d34a751aebc-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6fd77597b5-w649n\" (UID: \"836c1be1-26de-4840-8ba6-9d34a751aebc\") " pod="openshift-authentication/oauth-openshift-6fd77597b5-w649n" Mar 08 03:58:05.741733 master-0 kubenswrapper[18592]: I0308 03:58:05.741700 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/836c1be1-26de-4840-8ba6-9d34a751aebc-audit-dir\") pod \"oauth-openshift-6fd77597b5-w649n\" (UID: \"836c1be1-26de-4840-8ba6-9d34a751aebc\") " pod="openshift-authentication/oauth-openshift-6fd77597b5-w649n" Mar 08 
03:58:05.741733 master-0 kubenswrapper[18592]: I0308 03:58:05.741726 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/836c1be1-26de-4840-8ba6-9d34a751aebc-v4-0-config-user-template-error\") pod \"oauth-openshift-6fd77597b5-w649n\" (UID: \"836c1be1-26de-4840-8ba6-9d34a751aebc\") " pod="openshift-authentication/oauth-openshift-6fd77597b5-w649n" Mar 08 03:58:05.741872 master-0 kubenswrapper[18592]: I0308 03:58:05.741835 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/836c1be1-26de-4840-8ba6-9d34a751aebc-audit-policies\") pod \"oauth-openshift-6fd77597b5-w649n\" (UID: \"836c1be1-26de-4840-8ba6-9d34a751aebc\") " pod="openshift-authentication/oauth-openshift-6fd77597b5-w649n" Mar 08 03:58:05.741985 master-0 kubenswrapper[18592]: I0308 03:58:05.741964 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/836c1be1-26de-4840-8ba6-9d34a751aebc-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6fd77597b5-w649n\" (UID: \"836c1be1-26de-4840-8ba6-9d34a751aebc\") " pod="openshift-authentication/oauth-openshift-6fd77597b5-w649n" Mar 08 03:58:05.742025 master-0 kubenswrapper[18592]: I0308 03:58:05.742011 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/836c1be1-26de-4840-8ba6-9d34a751aebc-v4-0-config-user-template-login\") pod \"oauth-openshift-6fd77597b5-w649n\" (UID: \"836c1be1-26de-4840-8ba6-9d34a751aebc\") " pod="openshift-authentication/oauth-openshift-6fd77597b5-w649n" Mar 08 03:58:05.742060 master-0 kubenswrapper[18592]: I0308 03:58:05.742031 18592 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnqh4\" (UniqueName: \"kubernetes.io/projected/836c1be1-26de-4840-8ba6-9d34a751aebc-kube-api-access-nnqh4\") pod \"oauth-openshift-6fd77597b5-w649n\" (UID: \"836c1be1-26de-4840-8ba6-9d34a751aebc\") " pod="openshift-authentication/oauth-openshift-6fd77597b5-w649n" Mar 08 03:58:05.742313 master-0 kubenswrapper[18592]: I0308 03:58:05.742257 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/836c1be1-26de-4840-8ba6-9d34a751aebc-v4-0-config-system-router-certs\") pod \"oauth-openshift-6fd77597b5-w649n\" (UID: \"836c1be1-26de-4840-8ba6-9d34a751aebc\") " pod="openshift-authentication/oauth-openshift-6fd77597b5-w649n" Mar 08 03:58:05.742411 master-0 kubenswrapper[18592]: I0308 03:58:05.742386 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/836c1be1-26de-4840-8ba6-9d34a751aebc-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6fd77597b5-w649n\" (UID: \"836c1be1-26de-4840-8ba6-9d34a751aebc\") " pod="openshift-authentication/oauth-openshift-6fd77597b5-w649n" Mar 08 03:58:05.742442 master-0 kubenswrapper[18592]: I0308 03:58:05.742426 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/836c1be1-26de-4840-8ba6-9d34a751aebc-v4-0-config-system-service-ca\") pod \"oauth-openshift-6fd77597b5-w649n\" (UID: \"836c1be1-26de-4840-8ba6-9d34a751aebc\") " pod="openshift-authentication/oauth-openshift-6fd77597b5-w649n" Mar 08 03:58:05.742565 master-0 kubenswrapper[18592]: I0308 03:58:05.742466 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/836c1be1-26de-4840-8ba6-9d34a751aebc-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6fd77597b5-w649n\" (UID: \"836c1be1-26de-4840-8ba6-9d34a751aebc\") " pod="openshift-authentication/oauth-openshift-6fd77597b5-w649n" Mar 08 03:58:05.742636 master-0 kubenswrapper[18592]: I0308 03:58:05.742611 18592 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-user-template-login\") on node \"master-0\" DevicePath \"\"" Mar 08 03:58:05.742669 master-0 kubenswrapper[18592]: I0308 03:58:05.742645 18592 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-system-ocp-branding-template\") on node \"master-0\" DevicePath \"\"" Mar 08 03:58:05.742701 master-0 kubenswrapper[18592]: I0308 03:58:05.742669 18592 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-system-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 03:58:05.742701 master-0 kubenswrapper[18592]: I0308 03:58:05.742693 18592 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8ae7ed78-1761-4974-977a-1bc16c87bd91-audit-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 03:58:05.742754 master-0 kubenswrapper[18592]: I0308 03:58:05.742721 18592 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-system-service-ca\") on node \"master-0\" DevicePath \"\"" Mar 08 03:58:05.742754 master-0 kubenswrapper[18592]: I0308 03:58:05.742740 18592 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-system-session\") on node \"master-0\" DevicePath \"\"" Mar 08 03:58:05.742814 master-0 kubenswrapper[18592]: I0308 03:58:05.742760 18592 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-system-cliconfig\") on node \"master-0\" DevicePath \"\"" Mar 08 03:58:05.742814 master-0 kubenswrapper[18592]: I0308 03:58:05.742780 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7skln\" (UniqueName: \"kubernetes.io/projected/8ae7ed78-1761-4974-977a-1bc16c87bd91-kube-api-access-7skln\") on node \"master-0\" DevicePath \"\"" Mar 08 03:58:05.742814 master-0 kubenswrapper[18592]: I0308 03:58:05.742798 18592 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-user-template-error\") on node \"master-0\" DevicePath \"\"" Mar 08 03:58:05.742917 master-0 kubenswrapper[18592]: I0308 03:58:05.742846 18592 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-user-template-provider-selection\") on node \"master-0\" DevicePath \"\"" Mar 08 03:58:05.742917 master-0 kubenswrapper[18592]: I0308 03:58:05.742867 18592 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-system-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 08 03:58:05.742917 master-0 kubenswrapper[18592]: I0308 03:58:05.742887 18592 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" 
(UniqueName: \"kubernetes.io/secret/8ae7ed78-1761-4974-977a-1bc16c87bd91-v4-0-config-system-router-certs\") on node \"master-0\" DevicePath \"\"" Mar 08 03:58:05.742917 master-0 kubenswrapper[18592]: I0308 03:58:05.742906 18592 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8ae7ed78-1761-4974-977a-1bc16c87bd91-audit-policies\") on node \"master-0\" DevicePath \"\"" Mar 08 03:58:05.843488 master-0 kubenswrapper[18592]: I0308 03:58:05.843426 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/836c1be1-26de-4840-8ba6-9d34a751aebc-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6fd77597b5-w649n\" (UID: \"836c1be1-26de-4840-8ba6-9d34a751aebc\") " pod="openshift-authentication/oauth-openshift-6fd77597b5-w649n" Mar 08 03:58:05.843677 master-0 kubenswrapper[18592]: I0308 03:58:05.843596 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnqh4\" (UniqueName: \"kubernetes.io/projected/836c1be1-26de-4840-8ba6-9d34a751aebc-kube-api-access-nnqh4\") pod \"oauth-openshift-6fd77597b5-w649n\" (UID: \"836c1be1-26de-4840-8ba6-9d34a751aebc\") " pod="openshift-authentication/oauth-openshift-6fd77597b5-w649n" Mar 08 03:58:05.843677 master-0 kubenswrapper[18592]: I0308 03:58:05.843619 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/836c1be1-26de-4840-8ba6-9d34a751aebc-v4-0-config-user-template-login\") pod \"oauth-openshift-6fd77597b5-w649n\" (UID: \"836c1be1-26de-4840-8ba6-9d34a751aebc\") " pod="openshift-authentication/oauth-openshift-6fd77597b5-w649n" Mar 08 03:58:05.844132 master-0 kubenswrapper[18592]: I0308 03:58:05.844092 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/836c1be1-26de-4840-8ba6-9d34a751aebc-v4-0-config-system-router-certs\") pod \"oauth-openshift-6fd77597b5-w649n\" (UID: \"836c1be1-26de-4840-8ba6-9d34a751aebc\") " pod="openshift-authentication/oauth-openshift-6fd77597b5-w649n" Mar 08 03:58:05.844182 master-0 kubenswrapper[18592]: I0308 03:58:05.844170 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/836c1be1-26de-4840-8ba6-9d34a751aebc-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6fd77597b5-w649n\" (UID: \"836c1be1-26de-4840-8ba6-9d34a751aebc\") " pod="openshift-authentication/oauth-openshift-6fd77597b5-w649n" Mar 08 03:58:05.844446 master-0 kubenswrapper[18592]: I0308 03:58:05.844363 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/836c1be1-26de-4840-8ba6-9d34a751aebc-v4-0-config-system-service-ca\") pod \"oauth-openshift-6fd77597b5-w649n\" (UID: \"836c1be1-26de-4840-8ba6-9d34a751aebc\") " pod="openshift-authentication/oauth-openshift-6fd77597b5-w649n" Mar 08 03:58:05.844484 master-0 kubenswrapper[18592]: I0308 03:58:05.844450 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/836c1be1-26de-4840-8ba6-9d34a751aebc-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6fd77597b5-w649n\" (UID: \"836c1be1-26de-4840-8ba6-9d34a751aebc\") " pod="openshift-authentication/oauth-openshift-6fd77597b5-w649n" Mar 08 03:58:05.844541 master-0 kubenswrapper[18592]: I0308 03:58:05.844516 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/836c1be1-26de-4840-8ba6-9d34a751aebc-v4-0-config-system-session\") pod \"oauth-openshift-6fd77597b5-w649n\" (UID: 
\"836c1be1-26de-4840-8ba6-9d34a751aebc\") " pod="openshift-authentication/oauth-openshift-6fd77597b5-w649n" Mar 08 03:58:05.844638 master-0 kubenswrapper[18592]: I0308 03:58:05.844619 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/836c1be1-26de-4840-8ba6-9d34a751aebc-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6fd77597b5-w649n\" (UID: \"836c1be1-26de-4840-8ba6-9d34a751aebc\") " pod="openshift-authentication/oauth-openshift-6fd77597b5-w649n" Mar 08 03:58:05.844675 master-0 kubenswrapper[18592]: I0308 03:58:05.844665 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/836c1be1-26de-4840-8ba6-9d34a751aebc-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6fd77597b5-w649n\" (UID: \"836c1be1-26de-4840-8ba6-9d34a751aebc\") " pod="openshift-authentication/oauth-openshift-6fd77597b5-w649n" Mar 08 03:58:05.844711 master-0 kubenswrapper[18592]: I0308 03:58:05.844692 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/836c1be1-26de-4840-8ba6-9d34a751aebc-audit-dir\") pod \"oauth-openshift-6fd77597b5-w649n\" (UID: \"836c1be1-26de-4840-8ba6-9d34a751aebc\") " pod="openshift-authentication/oauth-openshift-6fd77597b5-w649n" Mar 08 03:58:05.844743 master-0 kubenswrapper[18592]: I0308 03:58:05.844714 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/836c1be1-26de-4840-8ba6-9d34a751aebc-v4-0-config-user-template-error\") pod \"oauth-openshift-6fd77597b5-w649n\" (UID: \"836c1be1-26de-4840-8ba6-9d34a751aebc\") " pod="openshift-authentication/oauth-openshift-6fd77597b5-w649n" Mar 08 03:58:05.844777 master-0 kubenswrapper[18592]: I0308 
03:58:05.844754 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/836c1be1-26de-4840-8ba6-9d34a751aebc-audit-policies\") pod \"oauth-openshift-6fd77597b5-w649n\" (UID: \"836c1be1-26de-4840-8ba6-9d34a751aebc\") " pod="openshift-authentication/oauth-openshift-6fd77597b5-w649n" Mar 08 03:58:05.844923 master-0 kubenswrapper[18592]: I0308 03:58:05.844902 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/836c1be1-26de-4840-8ba6-9d34a751aebc-audit-dir\") pod \"oauth-openshift-6fd77597b5-w649n\" (UID: \"836c1be1-26de-4840-8ba6-9d34a751aebc\") " pod="openshift-authentication/oauth-openshift-6fd77597b5-w649n" Mar 08 03:58:05.845079 master-0 kubenswrapper[18592]: I0308 03:58:05.845055 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/836c1be1-26de-4840-8ba6-9d34a751aebc-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6fd77597b5-w649n\" (UID: \"836c1be1-26de-4840-8ba6-9d34a751aebc\") " pod="openshift-authentication/oauth-openshift-6fd77597b5-w649n" Mar 08 03:58:05.845365 master-0 kubenswrapper[18592]: I0308 03:58:05.845324 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/836c1be1-26de-4840-8ba6-9d34a751aebc-v4-0-config-system-service-ca\") pod \"oauth-openshift-6fd77597b5-w649n\" (UID: \"836c1be1-26de-4840-8ba6-9d34a751aebc\") " pod="openshift-authentication/oauth-openshift-6fd77597b5-w649n" Mar 08 03:58:05.845814 master-0 kubenswrapper[18592]: I0308 03:58:05.845785 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/836c1be1-26de-4840-8ba6-9d34a751aebc-audit-policies\") pod \"oauth-openshift-6fd77597b5-w649n\" (UID: 
\"836c1be1-26de-4840-8ba6-9d34a751aebc\") " pod="openshift-authentication/oauth-openshift-6fd77597b5-w649n" Mar 08 03:58:05.845885 master-0 kubenswrapper[18592]: I0308 03:58:05.845816 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/836c1be1-26de-4840-8ba6-9d34a751aebc-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6fd77597b5-w649n\" (UID: \"836c1be1-26de-4840-8ba6-9d34a751aebc\") " pod="openshift-authentication/oauth-openshift-6fd77597b5-w649n" Mar 08 03:58:05.847806 master-0 kubenswrapper[18592]: I0308 03:58:05.847770 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/836c1be1-26de-4840-8ba6-9d34a751aebc-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6fd77597b5-w649n\" (UID: \"836c1be1-26de-4840-8ba6-9d34a751aebc\") " pod="openshift-authentication/oauth-openshift-6fd77597b5-w649n" Mar 08 03:58:05.848282 master-0 kubenswrapper[18592]: I0308 03:58:05.848250 18592 generic.go:334] "Generic (PLEG): container finished" podID="8ae7ed78-1761-4974-977a-1bc16c87bd91" containerID="83104ae7907744071db87e84bc9a9478d626ea2deb92206d324e81616d00d3f4" exitCode=0 Mar 08 03:58:05.848334 master-0 kubenswrapper[18592]: I0308 03:58:05.848286 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-d9b6b47f9-85srg" event={"ID":"8ae7ed78-1761-4974-977a-1bc16c87bd91","Type":"ContainerDied","Data":"83104ae7907744071db87e84bc9a9478d626ea2deb92206d324e81616d00d3f4"} Mar 08 03:58:05.848334 master-0 kubenswrapper[18592]: I0308 03:58:05.848316 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-d9b6b47f9-85srg" event={"ID":"8ae7ed78-1761-4974-977a-1bc16c87bd91","Type":"ContainerDied","Data":"51dc84185ae3a0e3c7461c9a48c0fb92df72f2c82283f36cd7e1f9243733cc12"} Mar 08 03:58:05.848334 
master-0 kubenswrapper[18592]: I0308 03:58:05.848340 18592 scope.go:117] "RemoveContainer" containerID="83104ae7907744071db87e84bc9a9478d626ea2deb92206d324e81616d00d3f4" Mar 08 03:58:05.848488 master-0 kubenswrapper[18592]: I0308 03:58:05.848334 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/836c1be1-26de-4840-8ba6-9d34a751aebc-v4-0-config-system-session\") pod \"oauth-openshift-6fd77597b5-w649n\" (UID: \"836c1be1-26de-4840-8ba6-9d34a751aebc\") " pod="openshift-authentication/oauth-openshift-6fd77597b5-w649n" Mar 08 03:58:05.848573 master-0 kubenswrapper[18592]: I0308 03:58:05.848547 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-d9b6b47f9-85srg" Mar 08 03:58:05.849740 master-0 kubenswrapper[18592]: I0308 03:58:05.849706 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/836c1be1-26de-4840-8ba6-9d34a751aebc-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6fd77597b5-w649n\" (UID: \"836c1be1-26de-4840-8ba6-9d34a751aebc\") " pod="openshift-authentication/oauth-openshift-6fd77597b5-w649n" Mar 08 03:58:05.850767 master-0 kubenswrapper[18592]: I0308 03:58:05.849958 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/836c1be1-26de-4840-8ba6-9d34a751aebc-v4-0-config-user-template-login\") pod \"oauth-openshift-6fd77597b5-w649n\" (UID: \"836c1be1-26de-4840-8ba6-9d34a751aebc\") " pod="openshift-authentication/oauth-openshift-6fd77597b5-w649n" Mar 08 03:58:05.852132 master-0 kubenswrapper[18592]: I0308 03:58:05.851884 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/836c1be1-26de-4840-8ba6-9d34a751aebc-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6fd77597b5-w649n\" (UID: \"836c1be1-26de-4840-8ba6-9d34a751aebc\") " pod="openshift-authentication/oauth-openshift-6fd77597b5-w649n" Mar 08 03:58:05.853774 master-0 kubenswrapper[18592]: I0308 03:58:05.853705 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/836c1be1-26de-4840-8ba6-9d34a751aebc-v4-0-config-system-router-certs\") pod \"oauth-openshift-6fd77597b5-w649n\" (UID: \"836c1be1-26de-4840-8ba6-9d34a751aebc\") " pod="openshift-authentication/oauth-openshift-6fd77597b5-w649n" Mar 08 03:58:05.869070 master-0 kubenswrapper[18592]: I0308 03:58:05.869029 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnqh4\" (UniqueName: \"kubernetes.io/projected/836c1be1-26de-4840-8ba6-9d34a751aebc-kube-api-access-nnqh4\") pod \"oauth-openshift-6fd77597b5-w649n\" (UID: \"836c1be1-26de-4840-8ba6-9d34a751aebc\") " pod="openshift-authentication/oauth-openshift-6fd77597b5-w649n" Mar 08 03:58:05.873393 master-0 kubenswrapper[18592]: I0308 03:58:05.873330 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/836c1be1-26de-4840-8ba6-9d34a751aebc-v4-0-config-user-template-error\") pod \"oauth-openshift-6fd77597b5-w649n\" (UID: \"836c1be1-26de-4840-8ba6-9d34a751aebc\") " pod="openshift-authentication/oauth-openshift-6fd77597b5-w649n" Mar 08 03:58:05.888266 master-0 kubenswrapper[18592]: I0308 03:58:05.888121 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-6fd77597b5-w649n" Mar 08 03:58:05.929852 master-0 kubenswrapper[18592]: I0308 03:58:05.928661 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-d9b6b47f9-85srg"] Mar 08 03:58:05.935145 master-0 kubenswrapper[18592]: I0308 03:58:05.935088 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-d9b6b47f9-85srg"] Mar 08 03:58:06.154994 master-0 kubenswrapper[18592]: I0308 03:58:06.154001 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ae7ed78-1761-4974-977a-1bc16c87bd91" path="/var/lib/kubelet/pods/8ae7ed78-1761-4974-977a-1bc16c87bd91/volumes" Mar 08 03:58:07.646614 master-0 kubenswrapper[18592]: I0308 03:58:07.646568 18592 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-etcd/etcd-master-0"] Mar 08 03:58:07.647244 master-0 kubenswrapper[18592]: I0308 03:58:07.646858 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcdctl" containerID="cri-o://b968b6dde7d2ded374cd8ae315cb70a664d6c49c41163b10766b7ed997cf628a" gracePeriod=30 Mar 08 03:58:07.647244 master-0 kubenswrapper[18592]: I0308 03:58:07.646981 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-rev" containerID="cri-o://5d6ee3d775aef6ab7485b93f260e08787c53fee03078cb5743281f3a00c6731a" gracePeriod=30 Mar 08 03:58:07.647244 master-0 kubenswrapper[18592]: I0308 03:58:07.647016 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-readyz" containerID="cri-o://743b6d0d3328cb1e5fd90f39085d9403830aebc1de828659a1d9c0fc9660f4a2" gracePeriod=30 Mar 08 
03:58:07.647244 master-0 kubenswrapper[18592]: I0308 03:58:07.647046 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-metrics" containerID="cri-o://a8391451b1644c11ad363666bf4d456fe86930894f61c4a9474dc40e3b26d78b" gracePeriod=30 Mar 08 03:58:07.647244 master-0 kubenswrapper[18592]: I0308 03:58:07.647078 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd" containerID="cri-o://afc2ac57ed877bb9156ca731d8cd2f853ddb9f606dc1ae3cba22d206076d25c5" gracePeriod=30 Mar 08 03:58:07.650167 master-0 kubenswrapper[18592]: I0308 03:58:07.650123 18592 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-master-0"] Mar 08 03:58:07.650508 master-0 kubenswrapper[18592]: E0308 03:58:07.650485 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-metrics" Mar 08 03:58:07.650508 master-0 kubenswrapper[18592]: I0308 03:58:07.650507 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-metrics" Mar 08 03:58:07.650605 master-0 kubenswrapper[18592]: E0308 03:58:07.650519 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd" Mar 08 03:58:07.650605 master-0 kubenswrapper[18592]: I0308 03:58:07.650555 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd" Mar 08 03:58:07.650605 master-0 kubenswrapper[18592]: E0308 03:58:07.650575 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-rev" Mar 08 03:58:07.650605 master-0 kubenswrapper[18592]: I0308 03:58:07.650586 18592 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-rev" Mar 08 03:58:07.650765 master-0 kubenswrapper[18592]: E0308 03:58:07.650604 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="setup" Mar 08 03:58:07.650765 master-0 kubenswrapper[18592]: I0308 03:58:07.650640 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="setup" Mar 08 03:58:07.650765 master-0 kubenswrapper[18592]: E0308 03:58:07.650657 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-ensure-env-vars" Mar 08 03:58:07.650765 master-0 kubenswrapper[18592]: I0308 03:58:07.650666 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-ensure-env-vars" Mar 08 03:58:07.650765 master-0 kubenswrapper[18592]: E0308 03:58:07.650680 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcdctl" Mar 08 03:58:07.650765 master-0 kubenswrapper[18592]: I0308 03:58:07.650712 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcdctl" Mar 08 03:58:07.650765 master-0 kubenswrapper[18592]: E0308 03:58:07.650737 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-resources-copy" Mar 08 03:58:07.650765 master-0 kubenswrapper[18592]: I0308 03:58:07.650745 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-resources-copy" Mar 08 03:58:07.650765 master-0 kubenswrapper[18592]: E0308 03:58:07.650761 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-readyz" Mar 08 03:58:07.650765 master-0 
kubenswrapper[18592]: I0308 03:58:07.650770 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-readyz" Mar 08 03:58:07.651240 master-0 kubenswrapper[18592]: I0308 03:58:07.651002 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-resources-copy" Mar 08 03:58:07.651240 master-0 kubenswrapper[18592]: I0308 03:58:07.651032 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcdctl" Mar 08 03:58:07.651240 master-0 kubenswrapper[18592]: I0308 03:58:07.651046 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="setup" Mar 08 03:58:07.651240 master-0 kubenswrapper[18592]: I0308 03:58:07.651062 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd" Mar 08 03:58:07.651240 master-0 kubenswrapper[18592]: I0308 03:58:07.651076 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-rev" Mar 08 03:58:07.651240 master-0 kubenswrapper[18592]: I0308 03:58:07.651094 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-readyz" Mar 08 03:58:07.651240 master-0 kubenswrapper[18592]: I0308 03:58:07.651106 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-metrics" Mar 08 03:58:07.651240 master-0 kubenswrapper[18592]: I0308 03:58:07.651113 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-ensure-env-vars" Mar 08 03:58:07.776628 master-0 kubenswrapper[18592]: I0308 03:58:07.776560 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-cert-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 08 03:58:07.776628 master-0 kubenswrapper[18592]: I0308 03:58:07.776604 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-log-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 08 03:58:07.776957 master-0 kubenswrapper[18592]: I0308 03:58:07.776665 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-static-pod-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 08 03:58:07.776957 master-0 kubenswrapper[18592]: I0308 03:58:07.776884 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-resource-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 08 03:58:07.777044 master-0 kubenswrapper[18592]: I0308 03:58:07.776966 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-data-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 08 03:58:07.777100 master-0 kubenswrapper[18592]: I0308 03:58:07.777076 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: 
\"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-usr-local-bin\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 08 03:58:07.878349 master-0 kubenswrapper[18592]: I0308 03:58:07.878293 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-usr-local-bin\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 08 03:58:07.878565 master-0 kubenswrapper[18592]: I0308 03:58:07.878383 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-cert-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 08 03:58:07.878565 master-0 kubenswrapper[18592]: I0308 03:58:07.878409 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-log-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 08 03:58:07.878565 master-0 kubenswrapper[18592]: I0308 03:58:07.878413 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-usr-local-bin\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 08 03:58:07.878565 master-0 kubenswrapper[18592]: I0308 03:58:07.878480 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-static-pod-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 08 
03:58:07.878747 master-0 kubenswrapper[18592]: I0308 03:58:07.878553 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-cert-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 08 03:58:07.878747 master-0 kubenswrapper[18592]: I0308 03:58:07.878635 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-resource-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 08 03:58:07.878747 master-0 kubenswrapper[18592]: I0308 03:58:07.878603 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-log-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 08 03:58:07.878747 master-0 kubenswrapper[18592]: I0308 03:58:07.878700 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-data-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 08 03:58:07.878747 master-0 kubenswrapper[18592]: I0308 03:58:07.878702 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-resource-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 08 03:58:07.878747 master-0 kubenswrapper[18592]: I0308 03:58:07.878704 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: 
\"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-static-pod-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 08 03:58:07.878747 master-0 kubenswrapper[18592]: I0308 03:58:07.878735 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-data-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 08 03:58:08.746341 master-0 kubenswrapper[18592]: I0308 03:58:08.746297 18592 scope.go:117] "RemoveContainer" containerID="435ec6619f140faf62f3c471feef1a5ed855198061a6c789c631830099974cc7" Mar 08 03:58:10.676276 master-0 kubenswrapper[18592]: I0308 03:58:10.676116 18592 patch_prober.go:28] interesting pod/console-744db48f96-lgsd4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" start-of-body= Mar 08 03:58:10.676276 master-0 kubenswrapper[18592]: I0308 03:58:10.676208 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-744db48f96-lgsd4" podUID="935ab7fb-b097-41c3-8926-8343eb29e7fc" containerName="console" probeResult="failure" output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" Mar 08 03:58:13.302033 master-0 kubenswrapper[18592]: I0308 03:58:13.301944 18592 patch_prober.go:28] interesting pod/console-7748864899-8p6h5 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body= Mar 08 03:58:13.302610 master-0 kubenswrapper[18592]: I0308 03:58:13.302080 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-7748864899-8p6h5" 
podUID="5bc0469d-1ae9-4606-ba99-7a333b66af37" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" Mar 08 03:58:16.758365 master-0 kubenswrapper[18592]: I0308 03:58:16.758322 18592 scope.go:117] "RemoveContainer" containerID="83104ae7907744071db87e84bc9a9478d626ea2deb92206d324e81616d00d3f4" Mar 08 03:58:16.759461 master-0 kubenswrapper[18592]: E0308 03:58:16.759400 18592 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83104ae7907744071db87e84bc9a9478d626ea2deb92206d324e81616d00d3f4\": container with ID starting with 83104ae7907744071db87e84bc9a9478d626ea2deb92206d324e81616d00d3f4 not found: ID does not exist" containerID="83104ae7907744071db87e84bc9a9478d626ea2deb92206d324e81616d00d3f4" Mar 08 03:58:16.759551 master-0 kubenswrapper[18592]: I0308 03:58:16.759464 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83104ae7907744071db87e84bc9a9478d626ea2deb92206d324e81616d00d3f4"} err="failed to get container status \"83104ae7907744071db87e84bc9a9478d626ea2deb92206d324e81616d00d3f4\": rpc error: code = NotFound desc = could not find container \"83104ae7907744071db87e84bc9a9478d626ea2deb92206d324e81616d00d3f4\": container with ID starting with 83104ae7907744071db87e84bc9a9478d626ea2deb92206d324e81616d00d3f4 not found: ID does not exist" Mar 08 03:58:16.768792 master-0 kubenswrapper[18592]: I0308 03:58:16.768742 18592 scope.go:117] "RemoveContainer" containerID="28bcae4d70566beaa13732bd5095c7d8d6a2ad6f8be2ed4c2e4b067a051fc9f1" Mar 08 03:58:16.853723 master-0 kubenswrapper[18592]: I0308 03:58:16.852141 18592 kubelet.go:1505] "Image garbage collection succeeded" Mar 08 03:58:16.936748 master-0 kubenswrapper[18592]: I0308 03:58:16.936663 18592 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcd-rev/0.log" Mar 08 03:58:16.937527 master-0 kubenswrapper[18592]: I0308 03:58:16.937439 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcd-metrics/0.log" Mar 08 03:58:16.951380 master-0 kubenswrapper[18592]: I0308 03:58:16.951325 18592 generic.go:334] "Generic (PLEG): container finished" podID="8e52bef89f4b50e4590a1719bcc5d7e5" containerID="5d6ee3d775aef6ab7485b93f260e08787c53fee03078cb5743281f3a00c6731a" exitCode=2 Mar 08 03:58:16.951380 master-0 kubenswrapper[18592]: I0308 03:58:16.951367 18592 generic.go:334] "Generic (PLEG): container finished" podID="8e52bef89f4b50e4590a1719bcc5d7e5" containerID="743b6d0d3328cb1e5fd90f39085d9403830aebc1de828659a1d9c0fc9660f4a2" exitCode=0 Mar 08 03:58:16.951380 master-0 kubenswrapper[18592]: I0308 03:58:16.951375 18592 generic.go:334] "Generic (PLEG): container finished" podID="8e52bef89f4b50e4590a1719bcc5d7e5" containerID="a8391451b1644c11ad363666bf4d456fe86930894f61c4a9474dc40e3b26d78b" exitCode=2 Mar 08 03:58:16.955136 master-0 kubenswrapper[18592]: I0308 03:58:16.955096 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-4-master-0_cde83292-7f24-4a84-aa85-e917c0f33a02/installer/0.log" Mar 08 03:58:16.955482 master-0 kubenswrapper[18592]: I0308 03:58:16.955153 18592 generic.go:334] "Generic (PLEG): container finished" podID="cde83292-7f24-4a84-aa85-e917c0f33a02" containerID="cd4bd59e144a886da1266aa75a6e84b10456106610dd8827a7a1300c5f80d968" exitCode=1 Mar 08 03:58:16.955482 master-0 kubenswrapper[18592]: I0308 03:58:16.955218 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-4-master-0" event={"ID":"cde83292-7f24-4a84-aa85-e917c0f33a02","Type":"ContainerDied","Data":"cd4bd59e144a886da1266aa75a6e84b10456106610dd8827a7a1300c5f80d968"} Mar 08 03:58:17.293170 master-0 
kubenswrapper[18592]: I0308 03:58:17.293118 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-4-master-0_cde83292-7f24-4a84-aa85-e917c0f33a02/installer/0.log" Mar 08 03:58:17.293560 master-0 kubenswrapper[18592]: I0308 03:58:17.293533 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-4-master-0" Mar 08 03:58:17.469338 master-0 kubenswrapper[18592]: I0308 03:58:17.469272 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cde83292-7f24-4a84-aa85-e917c0f33a02-var-lock\") pod \"cde83292-7f24-4a84-aa85-e917c0f33a02\" (UID: \"cde83292-7f24-4a84-aa85-e917c0f33a02\") " Mar 08 03:58:17.469618 master-0 kubenswrapper[18592]: I0308 03:58:17.469434 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cde83292-7f24-4a84-aa85-e917c0f33a02-kube-api-access\") pod \"cde83292-7f24-4a84-aa85-e917c0f33a02\" (UID: \"cde83292-7f24-4a84-aa85-e917c0f33a02\") " Mar 08 03:58:17.469742 master-0 kubenswrapper[18592]: I0308 03:58:17.469644 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cde83292-7f24-4a84-aa85-e917c0f33a02-kubelet-dir\") pod \"cde83292-7f24-4a84-aa85-e917c0f33a02\" (UID: \"cde83292-7f24-4a84-aa85-e917c0f33a02\") " Mar 08 03:58:17.470008 master-0 kubenswrapper[18592]: I0308 03:58:17.469774 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cde83292-7f24-4a84-aa85-e917c0f33a02-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "cde83292-7f24-4a84-aa85-e917c0f33a02" (UID: "cde83292-7f24-4a84-aa85-e917c0f33a02"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:58:17.470155 master-0 kubenswrapper[18592]: I0308 03:58:17.469953 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cde83292-7f24-4a84-aa85-e917c0f33a02-var-lock" (OuterVolumeSpecName: "var-lock") pod "cde83292-7f24-4a84-aa85-e917c0f33a02" (UID: "cde83292-7f24-4a84-aa85-e917c0f33a02"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:58:17.471496 master-0 kubenswrapper[18592]: I0308 03:58:17.471427 18592 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cde83292-7f24-4a84-aa85-e917c0f33a02-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 03:58:17.471496 master-0 kubenswrapper[18592]: I0308 03:58:17.471477 18592 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cde83292-7f24-4a84-aa85-e917c0f33a02-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 08 03:58:17.473425 master-0 kubenswrapper[18592]: I0308 03:58:17.473346 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cde83292-7f24-4a84-aa85-e917c0f33a02-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "cde83292-7f24-4a84-aa85-e917c0f33a02" (UID: "cde83292-7f24-4a84-aa85-e917c0f33a02"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:58:17.573490 master-0 kubenswrapper[18592]: I0308 03:58:17.573394 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cde83292-7f24-4a84-aa85-e917c0f33a02-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 08 03:58:17.966976 master-0 kubenswrapper[18592]: I0308 03:58:17.966921 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-4-master-0_cde83292-7f24-4a84-aa85-e917c0f33a02/installer/0.log" Mar 08 03:58:17.967481 master-0 kubenswrapper[18592]: I0308 03:58:17.966989 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-4-master-0" event={"ID":"cde83292-7f24-4a84-aa85-e917c0f33a02","Type":"ContainerDied","Data":"15abe6185497f8aa588730ebfa1b7ed190ce63e7b82ce2dced2372dab045ecc2"} Mar 08 03:58:17.967481 master-0 kubenswrapper[18592]: I0308 03:58:17.967024 18592 scope.go:117] "RemoveContainer" containerID="cd4bd59e144a886da1266aa75a6e84b10456106610dd8827a7a1300c5f80d968" Mar 08 03:58:17.967481 master-0 kubenswrapper[18592]: I0308 03:58:17.967075 18592 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-4-master-0" Mar 08 03:58:18.977146 master-0 kubenswrapper[18592]: I0308 03:58:18.977074 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-84f57b9877-n666k" event={"ID":"f8fc59f5-7a53-4075-8005-8fdb2b45ccb5","Type":"ContainerStarted","Data":"4d1fce2f5f424add19cea6f685c9fc536fa6ac0de3f5673eb8882cc7005d8964"} Mar 08 03:58:18.978158 master-0 kubenswrapper[18592]: I0308 03:58:18.977295 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-84f57b9877-n666k" Mar 08 03:58:18.980004 master-0 kubenswrapper[18592]: I0308 03:58:18.979942 18592 patch_prober.go:28] interesting pod/downloads-84f57b9877-n666k container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.128.0.95:8080/\": dial tcp 10.128.0.95:8080: connect: connection refused" start-of-body= Mar 08 03:58:18.980149 master-0 kubenswrapper[18592]: I0308 03:58:18.980030 18592 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-84f57b9877-n666k" podUID="f8fc59f5-7a53-4075-8005-8fdb2b45ccb5" containerName="download-server" probeResult="failure" output="Get \"http://10.128.0.95:8080/\": dial tcp 10.128.0.95:8080: connect: connection refused" Mar 08 03:58:19.988061 master-0 kubenswrapper[18592]: I0308 03:58:19.987991 18592 patch_prober.go:28] interesting pod/downloads-84f57b9877-n666k container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.128.0.95:8080/\": dial tcp 10.128.0.95:8080: connect: connection refused" start-of-body= Mar 08 03:58:19.988674 master-0 kubenswrapper[18592]: I0308 03:58:19.988073 18592 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-84f57b9877-n666k" podUID="f8fc59f5-7a53-4075-8005-8fdb2b45ccb5" containerName="download-server" probeResult="failure" output="Get 
\"http://10.128.0.95:8080/\": dial tcp 10.128.0.95:8080: connect: connection refused" Mar 08 03:58:20.289475 master-0 kubenswrapper[18592]: E0308 03:58:20.289290 18592 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T03:58:10Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T03:58:10Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T03:58:10Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T03:58:10Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"runc\\\"}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 08 03:58:20.686414 master-0 kubenswrapper[18592]: I0308 03:58:20.686227 18592 patch_prober.go:28] interesting pod/console-744db48f96-lgsd4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" start-of-body= Mar 08 03:58:20.686414 master-0 kubenswrapper[18592]: I0308 03:58:20.686340 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-744db48f96-lgsd4" podUID="935ab7fb-b097-41c3-8926-8343eb29e7fc" containerName="console" 
probeResult="failure" output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" Mar 08 03:58:21.524066 master-0 kubenswrapper[18592]: E0308 03:58:21.523949 18592 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 03:58:22.007538 master-0 kubenswrapper[18592]: I0308 03:58:22.007460 18592 generic.go:334] "Generic (PLEG): container finished" podID="d84e0373-988e-47db-be73-5690d18beba3" containerID="868a23a98a2c6aca96a779a449124f3a6f973b2f70f8e065c579b59214082095" exitCode=0 Mar 08 03:58:22.007538 master-0 kubenswrapper[18592]: I0308 03:58:22.007516 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-master-0" event={"ID":"d84e0373-988e-47db-be73-5690d18beba3","Type":"ContainerDied","Data":"868a23a98a2c6aca96a779a449124f3a6f973b2f70f8e065c579b59214082095"} Mar 08 03:58:23.021313 master-0 kubenswrapper[18592]: I0308 03:58:23.021147 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_0580c83f64e952a7a614903b6fdf6965/kube-controller-manager/0.log" Mar 08 03:58:23.021313 master-0 kubenswrapper[18592]: I0308 03:58:23.021242 18592 generic.go:334] "Generic (PLEG): container finished" podID="0580c83f64e952a7a614903b6fdf6965" containerID="c527e941996e05aa6d3486c73a6d0bb7b7ce8bc9a0ef0c0e159892bbc6674ce3" exitCode=1 Mar 08 03:58:23.022115 master-0 kubenswrapper[18592]: I0308 03:58:23.021340 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"0580c83f64e952a7a614903b6fdf6965","Type":"ContainerDied","Data":"c527e941996e05aa6d3486c73a6d0bb7b7ce8bc9a0ef0c0e159892bbc6674ce3"} Mar 08 03:58:23.022211 master-0 
kubenswrapper[18592]: I0308 03:58:23.022166 18592 scope.go:117] "RemoveContainer" containerID="c527e941996e05aa6d3486c73a6d0bb7b7ce8bc9a0ef0c0e159892bbc6674ce3" Mar 08 03:58:23.025990 master-0 kubenswrapper[18592]: I0308 03:58:23.025935 18592 generic.go:334] "Generic (PLEG): container finished" podID="a1a56802af72ce1aac6b5077f1695ac0" containerID="f7171701edb795064e29edd4a52aeb0af591e01a8efb0166607b6c1961305d36" exitCode=1 Mar 08 03:58:23.026064 master-0 kubenswrapper[18592]: I0308 03:58:23.026028 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"a1a56802af72ce1aac6b5077f1695ac0","Type":"ContainerDied","Data":"f7171701edb795064e29edd4a52aeb0af591e01a8efb0166607b6c1961305d36"} Mar 08 03:58:23.026119 master-0 kubenswrapper[18592]: I0308 03:58:23.026101 18592 scope.go:117] "RemoveContainer" containerID="255dd70f3aa78d8d4e9fb681404034a533a64980f735eecd5cf5d8b6ad4838a5" Mar 08 03:58:23.026655 master-0 kubenswrapper[18592]: I0308 03:58:23.026611 18592 scope.go:117] "RemoveContainer" containerID="f7171701edb795064e29edd4a52aeb0af591e01a8efb0166607b6c1961305d36" Mar 08 03:58:23.027041 master-0 kubenswrapper[18592]: E0308 03:58:23.026949 18592 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-scheduler\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-scheduler pod=bootstrap-kube-scheduler-master-0_kube-system(a1a56802af72ce1aac6b5077f1695ac0)\"" pod="kube-system/bootstrap-kube-scheduler-master-0" podUID="a1a56802af72ce1aac6b5077f1695ac0" Mar 08 03:58:23.303242 master-0 kubenswrapper[18592]: I0308 03:58:23.303124 18592 patch_prober.go:28] interesting pod/console-7748864899-8p6h5 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body= Mar 08 03:58:23.303242 master-0 kubenswrapper[18592]: I0308 
03:58:23.303184 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-7748864899-8p6h5" podUID="5bc0469d-1ae9-4606-ba99-7a333b66af37" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" Mar 08 03:58:23.466444 master-0 kubenswrapper[18592]: I0308 03:58:23.466389 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-2-master-0" Mar 08 03:58:23.481115 master-0 kubenswrapper[18592]: I0308 03:58:23.481029 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d84e0373-988e-47db-be73-5690d18beba3-var-lock\") pod \"d84e0373-988e-47db-be73-5690d18beba3\" (UID: \"d84e0373-988e-47db-be73-5690d18beba3\") " Mar 08 03:58:23.481115 master-0 kubenswrapper[18592]: I0308 03:58:23.481113 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d84e0373-988e-47db-be73-5690d18beba3-kube-api-access\") pod \"d84e0373-988e-47db-be73-5690d18beba3\" (UID: \"d84e0373-988e-47db-be73-5690d18beba3\") " Mar 08 03:58:23.481399 master-0 kubenswrapper[18592]: I0308 03:58:23.481162 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d84e0373-988e-47db-be73-5690d18beba3-kubelet-dir\") pod \"d84e0373-988e-47db-be73-5690d18beba3\" (UID: \"d84e0373-988e-47db-be73-5690d18beba3\") " Mar 08 03:58:23.481399 master-0 kubenswrapper[18592]: I0308 03:58:23.481148 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d84e0373-988e-47db-be73-5690d18beba3-var-lock" (OuterVolumeSpecName: "var-lock") pod "d84e0373-988e-47db-be73-5690d18beba3" (UID: "d84e0373-988e-47db-be73-5690d18beba3"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:58:23.481399 master-0 kubenswrapper[18592]: I0308 03:58:23.481314 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d84e0373-988e-47db-be73-5690d18beba3-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d84e0373-988e-47db-be73-5690d18beba3" (UID: "d84e0373-988e-47db-be73-5690d18beba3"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:58:23.481599 master-0 kubenswrapper[18592]: I0308 03:58:23.481523 18592 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d84e0373-988e-47db-be73-5690d18beba3-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 08 03:58:23.481599 master-0 kubenswrapper[18592]: I0308 03:58:23.481541 18592 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d84e0373-988e-47db-be73-5690d18beba3-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 03:58:23.486115 master-0 kubenswrapper[18592]: I0308 03:58:23.485994 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d84e0373-988e-47db-be73-5690d18beba3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d84e0373-988e-47db-be73-5690d18beba3" (UID: "d84e0373-988e-47db-be73-5690d18beba3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:58:23.583688 master-0 kubenswrapper[18592]: I0308 03:58:23.583530 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d84e0373-988e-47db-be73-5690d18beba3-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 08 03:58:24.053500 master-0 kubenswrapper[18592]: I0308 03:58:24.053420 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_0580c83f64e952a7a614903b6fdf6965/kube-controller-manager/0.log" Mar 08 03:58:24.054374 master-0 kubenswrapper[18592]: I0308 03:58:24.053670 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"0580c83f64e952a7a614903b6fdf6965","Type":"ContainerStarted","Data":"b86660b89db34f60351c03ded37334fd4a438f92a5c1c1cdc379852f41cf84f8"} Mar 08 03:58:24.062296 master-0 kubenswrapper[18592]: I0308 03:58:24.062219 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-master-0" event={"ID":"d84e0373-988e-47db-be73-5690d18beba3","Type":"ContainerDied","Data":"ebd9f6f4acf9d718bfd8eaa682a4bf6f0d977e7f880a973028282e846abc73ff"} Mar 08 03:58:24.062296 master-0 kubenswrapper[18592]: I0308 03:58:24.062279 18592 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ebd9f6f4acf9d718bfd8eaa682a4bf6f0d977e7f880a973028282e846abc73ff" Mar 08 03:58:24.062296 master-0 kubenswrapper[18592]: I0308 03:58:24.062249 18592 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-2-master-0" Mar 08 03:58:24.202999 master-0 kubenswrapper[18592]: E0308 03:58:24.202923 18592 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-podd84e0373_988e_47db_be73_5690d18beba3.slice\": RecentStats: unable to find data in memory cache]" Mar 08 03:58:25.207070 master-0 kubenswrapper[18592]: I0308 03:58:25.206971 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:58:25.207985 master-0 kubenswrapper[18592]: I0308 03:58:25.207606 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:58:25.214743 master-0 kubenswrapper[18592]: I0308 03:58:25.214673 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:58:27.589124 master-0 kubenswrapper[18592]: I0308 03:58:27.589044 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-84f57b9877-n666k" Mar 08 03:58:30.305713 master-0 kubenswrapper[18592]: E0308 03:58:30.305659 18592 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": the server was unable to return a response in the time allotted, but may still be processing the request (get nodes master-0)" Mar 08 03:58:30.676570 master-0 kubenswrapper[18592]: I0308 03:58:30.676265 18592 patch_prober.go:28] interesting pod/console-744db48f96-lgsd4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" start-of-body= Mar 08 03:58:30.676570 master-0 kubenswrapper[18592]: I0308 03:58:30.676374 18592 
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-744db48f96-lgsd4" podUID="935ab7fb-b097-41c3-8926-8343eb29e7fc" containerName="console" probeResult="failure" output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" Mar 08 03:58:31.525012 master-0 kubenswrapper[18592]: E0308 03:58:31.524784 18592 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 03:58:33.302604 master-0 kubenswrapper[18592]: I0308 03:58:33.302493 18592 patch_prober.go:28] interesting pod/console-7748864899-8p6h5 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body= Mar 08 03:58:33.303563 master-0 kubenswrapper[18592]: I0308 03:58:33.302641 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-7748864899-8p6h5" podUID="5bc0469d-1ae9-4606-ba99-7a333b66af37" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" Mar 08 03:58:35.212629 master-0 kubenswrapper[18592]: I0308 03:58:35.212542 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:58:38.143483 master-0 kubenswrapper[18592]: I0308 03:58:38.143356 18592 scope.go:117] "RemoveContainer" containerID="f7171701edb795064e29edd4a52aeb0af591e01a8efb0166607b6c1961305d36" Mar 08 03:58:38.199201 master-0 kubenswrapper[18592]: I0308 03:58:38.199023 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcd-rev/0.log" Mar 08 
03:58:38.201392 master-0 kubenswrapper[18592]: I0308 03:58:38.201339 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcd-metrics/0.log" Mar 08 03:58:38.202717 master-0 kubenswrapper[18592]: I0308 03:58:38.202671 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcd/0.log" Mar 08 03:58:38.203818 master-0 kubenswrapper[18592]: I0308 03:58:38.203517 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcdctl/0.log" Mar 08 03:58:38.205791 master-0 kubenswrapper[18592]: I0308 03:58:38.205735 18592 generic.go:334] "Generic (PLEG): container finished" podID="8e52bef89f4b50e4590a1719bcc5d7e5" containerID="afc2ac57ed877bb9156ca731d8cd2f853ddb9f606dc1ae3cba22d206076d25c5" exitCode=137 Mar 08 03:58:38.205869 master-0 kubenswrapper[18592]: I0308 03:58:38.205790 18592 generic.go:334] "Generic (PLEG): container finished" podID="8e52bef89f4b50e4590a1719bcc5d7e5" containerID="b968b6dde7d2ded374cd8ae315cb70a664d6c49c41163b10766b7ed997cf628a" exitCode=137 Mar 08 03:58:38.280802 master-0 kubenswrapper[18592]: I0308 03:58:38.280727 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcd-rev/0.log" Mar 08 03:58:38.282337 master-0 kubenswrapper[18592]: I0308 03:58:38.282281 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcd-metrics/0.log" Mar 08 03:58:38.283492 master-0 kubenswrapper[18592]: I0308 03:58:38.283447 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcd/0.log" Mar 08 03:58:38.284057 master-0 kubenswrapper[18592]: I0308 03:58:38.284016 18592 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcdctl/0.log" Mar 08 03:58:38.285689 master-0 kubenswrapper[18592]: I0308 03:58:38.285645 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0" Mar 08 03:58:38.459603 master-0 kubenswrapper[18592]: I0308 03:58:38.459515 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-resource-dir\") pod \"8e52bef89f4b50e4590a1719bcc5d7e5\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " Mar 08 03:58:38.459923 master-0 kubenswrapper[18592]: I0308 03:58:38.459677 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "8e52bef89f4b50e4590a1719bcc5d7e5" (UID: "8e52bef89f4b50e4590a1719bcc5d7e5"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:58:38.459923 master-0 kubenswrapper[18592]: I0308 03:58:38.459805 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-usr-local-bin\") pod \"8e52bef89f4b50e4590a1719bcc5d7e5\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " Mar 08 03:58:38.459923 master-0 kubenswrapper[18592]: I0308 03:58:38.459897 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-static-pod-dir\") pod \"8e52bef89f4b50e4590a1719bcc5d7e5\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " Mar 08 03:58:38.460153 master-0 kubenswrapper[18592]: I0308 03:58:38.459949 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-cert-dir\") pod \"8e52bef89f4b50e4590a1719bcc5d7e5\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " Mar 08 03:58:38.460153 master-0 kubenswrapper[18592]: I0308 03:58:38.459996 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-usr-local-bin" (OuterVolumeSpecName: "usr-local-bin") pod "8e52bef89f4b50e4590a1719bcc5d7e5" (UID: "8e52bef89f4b50e4590a1719bcc5d7e5"). InnerVolumeSpecName "usr-local-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:58:38.460153 master-0 kubenswrapper[18592]: I0308 03:58:38.460009 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-static-pod-dir" (OuterVolumeSpecName: "static-pod-dir") pod "8e52bef89f4b50e4590a1719bcc5d7e5" (UID: "8e52bef89f4b50e4590a1719bcc5d7e5"). InnerVolumeSpecName "static-pod-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:58:38.460153 master-0 kubenswrapper[18592]: I0308 03:58:38.460095 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-data-dir\") pod \"8e52bef89f4b50e4590a1719bcc5d7e5\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " Mar 08 03:58:38.460409 master-0 kubenswrapper[18592]: I0308 03:58:38.460177 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-log-dir\") pod \"8e52bef89f4b50e4590a1719bcc5d7e5\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " Mar 08 03:58:38.460409 master-0 kubenswrapper[18592]: I0308 03:58:38.460159 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "8e52bef89f4b50e4590a1719bcc5d7e5" (UID: "8e52bef89f4b50e4590a1719bcc5d7e5"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:58:38.460409 master-0 kubenswrapper[18592]: I0308 03:58:38.460360 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-data-dir" (OuterVolumeSpecName: "data-dir") pod "8e52bef89f4b50e4590a1719bcc5d7e5" (UID: "8e52bef89f4b50e4590a1719bcc5d7e5"). InnerVolumeSpecName "data-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:58:38.460606 master-0 kubenswrapper[18592]: I0308 03:58:38.460481 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-log-dir" (OuterVolumeSpecName: "log-dir") pod "8e52bef89f4b50e4590a1719bcc5d7e5" (UID: "8e52bef89f4b50e4590a1719bcc5d7e5"). InnerVolumeSpecName "log-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:58:38.461450 master-0 kubenswrapper[18592]: I0308 03:58:38.461333 18592 reconciler_common.go:293] "Volume detached for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-log-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 03:58:38.461450 master-0 kubenswrapper[18592]: I0308 03:58:38.461416 18592 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 03:58:38.461667 master-0 kubenswrapper[18592]: I0308 03:58:38.461480 18592 reconciler_common.go:293] "Volume detached for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-usr-local-bin\") on node \"master-0\" DevicePath \"\"" Mar 08 03:58:38.461667 master-0 kubenswrapper[18592]: I0308 03:58:38.461502 18592 reconciler_common.go:293] "Volume detached for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-static-pod-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 03:58:38.461667 master-0 kubenswrapper[18592]: I0308 03:58:38.461522 18592 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-cert-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 03:58:38.461667 master-0 kubenswrapper[18592]: I0308 03:58:38.461620 18592 reconciler_common.go:293] "Volume detached for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-data-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 03:58:39.219184 master-0 kubenswrapper[18592]: I0308 03:58:39.219127 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcd-rev/0.log" Mar 08 03:58:39.220920 master-0 kubenswrapper[18592]: I0308 
03:58:39.220872 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcd-metrics/0.log" Mar 08 03:58:39.222080 master-0 kubenswrapper[18592]: I0308 03:58:39.222044 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcd/0.log" Mar 08 03:58:39.222714 master-0 kubenswrapper[18592]: I0308 03:58:39.222676 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcdctl/0.log" Mar 08 03:58:39.224768 master-0 kubenswrapper[18592]: I0308 03:58:39.224733 18592 scope.go:117] "RemoveContainer" containerID="5d6ee3d775aef6ab7485b93f260e08787c53fee03078cb5743281f3a00c6731a" Mar 08 03:58:39.225238 master-0 kubenswrapper[18592]: I0308 03:58:39.225200 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0" Mar 08 03:58:39.228895 master-0 kubenswrapper[18592]: I0308 03:58:39.227490 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"a1a56802af72ce1aac6b5077f1695ac0","Type":"ContainerStarted","Data":"e04f902e64611088fc5b2a33ae22c063c66ae87560c79f3c3e32ced196e50876"} Mar 08 03:58:39.256256 master-0 kubenswrapper[18592]: I0308 03:58:39.256193 18592 scope.go:117] "RemoveContainer" containerID="743b6d0d3328cb1e5fd90f39085d9403830aebc1de828659a1d9c0fc9660f4a2" Mar 08 03:58:39.280635 master-0 kubenswrapper[18592]: I0308 03:58:39.280574 18592 scope.go:117] "RemoveContainer" containerID="a8391451b1644c11ad363666bf4d456fe86930894f61c4a9474dc40e3b26d78b" Mar 08 03:58:39.317745 master-0 kubenswrapper[18592]: I0308 03:58:39.316796 18592 scope.go:117] "RemoveContainer" containerID="afc2ac57ed877bb9156ca731d8cd2f853ddb9f606dc1ae3cba22d206076d25c5" Mar 08 03:58:39.353061 master-0 kubenswrapper[18592]: I0308 03:58:39.352794 18592 scope.go:117] 
"RemoveContainer" containerID="b968b6dde7d2ded374cd8ae315cb70a664d6c49c41163b10766b7ed997cf628a" Mar 08 03:58:39.378590 master-0 kubenswrapper[18592]: I0308 03:58:39.378549 18592 scope.go:117] "RemoveContainer" containerID="3df151c3da265182304d84afb0b3bc1e42416ef6485b53e3bd88733c8055b421" Mar 08 03:58:39.403353 master-0 kubenswrapper[18592]: I0308 03:58:39.403292 18592 scope.go:117] "RemoveContainer" containerID="338e07cf4149947d0b4bb7aee072ff8d4da6cb3eeb924ae9f2fa6dc0d8d523b1" Mar 08 03:58:39.427764 master-0 kubenswrapper[18592]: I0308 03:58:39.427715 18592 scope.go:117] "RemoveContainer" containerID="74cf1dbcbe0d060e62a1cff77950d3cf19f4f4c11ebaceeae2f072445a583ffa" Mar 08 03:58:40.159040 master-0 kubenswrapper[18592]: I0308 03:58:40.158909 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" path="/var/lib/kubelet/pods/8e52bef89f4b50e4590a1719bcc5d7e5/volumes" Mar 08 03:58:40.307806 master-0 kubenswrapper[18592]: E0308 03:58:40.307673 18592 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 03:58:40.675777 master-0 kubenswrapper[18592]: I0308 03:58:40.675687 18592 patch_prober.go:28] interesting pod/console-744db48f96-lgsd4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" start-of-body= Mar 08 03:58:40.676102 master-0 kubenswrapper[18592]: I0308 03:58:40.675777 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-744db48f96-lgsd4" podUID="935ab7fb-b097-41c3-8926-8343eb29e7fc" containerName="console" probeResult="failure" output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" 
Mar 08 03:58:41.526117 master-0 kubenswrapper[18592]: E0308 03:58:41.525676 18592 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 03:58:41.685073 master-0 kubenswrapper[18592]: E0308 03:58:41.684856 18592 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{etcd-master-0.189ac19dfe776277 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0,UID:8e52bef89f4b50e4590a1719bcc5d7e5,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Killing,Message:Stopping container etcd-rev,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:58:07.646974583 +0000 UTC m=+299.745728933,LastTimestamp:2026-03-08 03:58:07.646974583 +0000 UTC m=+299.745728933,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:58:43.303123 master-0 kubenswrapper[18592]: I0308 03:58:43.303063 18592 patch_prober.go:28] interesting pod/console-7748864899-8p6h5 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body= Mar 08 03:58:43.304236 master-0 kubenswrapper[18592]: I0308 03:58:43.304188 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-7748864899-8p6h5" podUID="5bc0469d-1ae9-4606-ba99-7a333b66af37" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" 
Mar 08 03:58:47.142511 master-0 kubenswrapper[18592]: I0308 03:58:47.142412 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0" Mar 08 03:58:47.174653 master-0 kubenswrapper[18592]: I0308 03:58:47.174595 18592 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="3ecf3d00-32ee-49db-bcf4-1bdb6f372c65" Mar 08 03:58:47.174653 master-0 kubenswrapper[18592]: I0308 03:58:47.174642 18592 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="3ecf3d00-32ee-49db-bcf4-1bdb6f372c65" Mar 08 03:58:50.309340 master-0 kubenswrapper[18592]: E0308 03:58:50.309230 18592 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": the server was unable to return a response in the time allotted, but may still be processing the request (get nodes master-0)" Mar 08 03:58:50.677040 master-0 kubenswrapper[18592]: I0308 03:58:50.675786 18592 patch_prober.go:28] interesting pod/console-744db48f96-lgsd4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" start-of-body= Mar 08 03:58:50.677040 master-0 kubenswrapper[18592]: I0308 03:58:50.675912 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-744db48f96-lgsd4" podUID="935ab7fb-b097-41c3-8926-8343eb29e7fc" containerName="console" probeResult="failure" output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" Mar 08 03:58:50.979001 master-0 kubenswrapper[18592]: I0308 03:58:50.978878 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:58:51.025177 master-0 kubenswrapper[18592]: I0308 03:58:51.025108 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:58:51.419260 master-0 kubenswrapper[18592]: I0308 03:58:51.419066 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:58:51.527415 master-0 kubenswrapper[18592]: E0308 03:58:51.527324 18592 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 03:58:53.303191 master-0 kubenswrapper[18592]: I0308 03:58:53.303075 18592 patch_prober.go:28] interesting pod/console-7748864899-8p6h5 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body= Mar 08 03:58:53.304050 master-0 kubenswrapper[18592]: I0308 03:58:53.303210 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-7748864899-8p6h5" podUID="5bc0469d-1ae9-4606-ba99-7a333b66af37" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" Mar 08 03:58:54.403854 master-0 kubenswrapper[18592]: I0308 03:58:54.403720 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-ggzm8_164586b1-f133-4427-8ab6-eb0839b79738/approver/1.log" Mar 08 03:58:54.405010 master-0 kubenswrapper[18592]: I0308 03:58:54.404581 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-ggzm8_164586b1-f133-4427-8ab6-eb0839b79738/approver/0.log" Mar 08 03:58:54.405214 master-0 kubenswrapper[18592]: I0308 03:58:54.405128 18592 generic.go:334] "Generic (PLEG): container finished" podID="164586b1-f133-4427-8ab6-eb0839b79738" 
containerID="8ebdb5e799974ba85edc25e5ce7cb1526623500db5c19cf0e1e303e5992b5514" exitCode=1 Mar 08 03:58:54.405333 master-0 kubenswrapper[18592]: I0308 03:58:54.405213 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-ggzm8" event={"ID":"164586b1-f133-4427-8ab6-eb0839b79738","Type":"ContainerDied","Data":"8ebdb5e799974ba85edc25e5ce7cb1526623500db5c19cf0e1e303e5992b5514"} Mar 08 03:58:54.405333 master-0 kubenswrapper[18592]: I0308 03:58:54.405295 18592 scope.go:117] "RemoveContainer" containerID="eca6f5647fbdf9b3ef8c7044a7fb91cd16de860543c74991829e340da4a238fe" Mar 08 03:58:54.406660 master-0 kubenswrapper[18592]: I0308 03:58:54.406403 18592 scope.go:117] "RemoveContainer" containerID="8ebdb5e799974ba85edc25e5ce7cb1526623500db5c19cf0e1e303e5992b5514" Mar 08 03:58:55.417744 master-0 kubenswrapper[18592]: I0308 03:58:55.417638 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-ggzm8_164586b1-f133-4427-8ab6-eb0839b79738/approver/1.log" Mar 08 03:58:55.418738 master-0 kubenswrapper[18592]: I0308 03:58:55.418200 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-ggzm8" event={"ID":"164586b1-f133-4427-8ab6-eb0839b79738","Type":"ContainerStarted","Data":"c2a8e8284d8f484d6e2bb2173337c926781157f3f3f28c54a1f2664affe1bc4f"} Mar 08 03:59:00.310173 master-0 kubenswrapper[18592]: E0308 03:59:00.310056 18592 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 03:59:00.310173 master-0 kubenswrapper[18592]: E0308 03:59:00.310127 18592 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 08 03:59:00.677022 master-0 
kubenswrapper[18592]: I0308 03:59:00.676387 18592 patch_prober.go:28] interesting pod/console-744db48f96-lgsd4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" start-of-body= Mar 08 03:59:00.677022 master-0 kubenswrapper[18592]: I0308 03:59:00.676483 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-744db48f96-lgsd4" podUID="935ab7fb-b097-41c3-8926-8343eb29e7fc" containerName="console" probeResult="failure" output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" Mar 08 03:59:01.528746 master-0 kubenswrapper[18592]: E0308 03:59:01.528640 18592 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 03:59:01.528746 master-0 kubenswrapper[18592]: I0308 03:59:01.528723 18592 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 08 03:59:03.303117 master-0 kubenswrapper[18592]: I0308 03:59:03.302959 18592 patch_prober.go:28] interesting pod/console-7748864899-8p6h5 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body= Mar 08 03:59:03.303117 master-0 kubenswrapper[18592]: I0308 03:59:03.303057 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-7748864899-8p6h5" podUID="5bc0469d-1ae9-4606-ba99-7a333b66af37" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" Mar 08 03:59:08.147527 master-0 
kubenswrapper[18592]: I0308 03:59:08.147459 18592 status_manager.go:851] "Failed to get status for pod" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" pod="openshift-etcd/etcd-master-0" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods etcd-master-0)" Mar 08 03:59:10.676163 master-0 kubenswrapper[18592]: I0308 03:59:10.676082 18592 patch_prober.go:28] interesting pod/console-744db48f96-lgsd4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" start-of-body= Mar 08 03:59:10.676963 master-0 kubenswrapper[18592]: I0308 03:59:10.676167 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-744db48f96-lgsd4" podUID="935ab7fb-b097-41c3-8926-8343eb29e7fc" containerName="console" probeResult="failure" output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" Mar 08 03:59:11.529369 master-0 kubenswrapper[18592]: E0308 03:59:11.529273 18592 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="200ms" Mar 08 03:59:13.303204 master-0 kubenswrapper[18592]: I0308 03:59:13.303132 18592 patch_prober.go:28] interesting pod/console-7748864899-8p6h5 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body= Mar 08 03:59:13.304195 master-0 kubenswrapper[18592]: I0308 03:59:13.304071 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-7748864899-8p6h5" podUID="5bc0469d-1ae9-4606-ba99-7a333b66af37" containerName="console" 
probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" Mar 08 03:59:15.688631 master-0 kubenswrapper[18592]: E0308 03:59:15.688435 18592 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{etcd-master-0.189ac19dfe77f17a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0,UID:8e52bef89f4b50e4590a1719bcc5d7e5,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Killing,Message:Stopping container etcd-readyz,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:58:07.647011194 +0000 UTC m=+299.745765544,LastTimestamp:2026-03-08 03:58:07.647011194 +0000 UTC m=+299.745765544,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:59:16.998840 master-0 kubenswrapper[18592]: I0308 03:59:16.998749 18592 scope.go:117] "RemoveContainer" containerID="d92fdcd0bd88e0c579bd858a04d7e6e266a7f72aec3885543e0de2cee51140ac" Mar 08 03:59:17.017643 master-0 kubenswrapper[18592]: I0308 03:59:17.017575 18592 scope.go:117] "RemoveContainer" containerID="9d1a3af9468d450b8ce515e818a31e6bfe522f30f01bccb1080ebaabf3f6d3f1" Mar 08 03:59:17.048020 master-0 kubenswrapper[18592]: I0308 03:59:17.047954 18592 scope.go:117] "RemoveContainer" containerID="f5cec83dc05dfae95933e7d5e4646a470fd6b2150eeed3507d1b115fc1dfcb34" Mar 08 03:59:17.075312 master-0 kubenswrapper[18592]: I0308 03:59:17.075251 18592 scope.go:117] "RemoveContainer" containerID="6979155324a9775c0f334fc4aa6afa070463810c3191479ea2bb2dbfe2843ea3" Mar 08 03:59:17.572886 master-0 kubenswrapper[18592]: E0308 03:59:17.572792 18592 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 08 
03:59:17.572886 master-0 kubenswrapper[18592]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-6fd77597b5-w649n_openshift-authentication_836c1be1-26de-4840-8ba6-9d34a751aebc_0(426a721a967ea04526dcde412716eb651ee1158db31cb34288b0c7a5584ede00): error adding pod openshift-authentication_oauth-openshift-6fd77597b5-w649n to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"426a721a967ea04526dcde412716eb651ee1158db31cb34288b0c7a5584ede00" Netns:"/var/run/netns/77dce124-f673-40bb-b1e8-99b550c74226" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-6fd77597b5-w649n;K8S_POD_INFRA_CONTAINER_ID=426a721a967ea04526dcde412716eb651ee1158db31cb34288b0c7a5584ede00;K8S_POD_UID=836c1be1-26de-4840-8ba6-9d34a751aebc" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-6fd77597b5-w649n] networking: Multus: [openshift-authentication/oauth-openshift-6fd77597b5-w649n/836c1be1-26de-4840-8ba6-9d34a751aebc]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod oauth-openshift-6fd77597b5-w649n in out of cluster comm: SetNetworkStatus: failed to update the pod oauth-openshift-6fd77597b5-w649n in out of cluster comm: status update failed for pod /: the server was unable to return a response in the time allotted, but may still be processing the request (get pods oauth-openshift-6fd77597b5-w649n) Mar 08 03:59:17.572886 master-0 kubenswrapper[18592]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 08 03:59:17.572886 master-0 kubenswrapper[18592]: > Mar 08 03:59:17.573408 master-0 kubenswrapper[18592]: E0308 03:59:17.572925 18592 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 08 03:59:17.573408 master-0 kubenswrapper[18592]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-6fd77597b5-w649n_openshift-authentication_836c1be1-26de-4840-8ba6-9d34a751aebc_0(426a721a967ea04526dcde412716eb651ee1158db31cb34288b0c7a5584ede00): error adding pod openshift-authentication_oauth-openshift-6fd77597b5-w649n to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"426a721a967ea04526dcde412716eb651ee1158db31cb34288b0c7a5584ede00" Netns:"/var/run/netns/77dce124-f673-40bb-b1e8-99b550c74226" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-6fd77597b5-w649n;K8S_POD_INFRA_CONTAINER_ID=426a721a967ea04526dcde412716eb651ee1158db31cb34288b0c7a5584ede00;K8S_POD_UID=836c1be1-26de-4840-8ba6-9d34a751aebc" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-6fd77597b5-w649n] networking: Multus: [openshift-authentication/oauth-openshift-6fd77597b5-w649n/836c1be1-26de-4840-8ba6-9d34a751aebc]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod oauth-openshift-6fd77597b5-w649n in out of cluster comm: SetNetworkStatus: failed to update the pod oauth-openshift-6fd77597b5-w649n in out of cluster comm: status update failed for pod /: 
the server was unable to return a response in the time allotted, but may still be processing the request (get pods oauth-openshift-6fd77597b5-w649n) Mar 08 03:59:17.573408 master-0 kubenswrapper[18592]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 08 03:59:17.573408 master-0 kubenswrapper[18592]: > pod="openshift-authentication/oauth-openshift-6fd77597b5-w649n" Mar 08 03:59:17.573408 master-0 kubenswrapper[18592]: E0308 03:59:17.572958 18592 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 08 03:59:17.573408 master-0 kubenswrapper[18592]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-6fd77597b5-w649n_openshift-authentication_836c1be1-26de-4840-8ba6-9d34a751aebc_0(426a721a967ea04526dcde412716eb651ee1158db31cb34288b0c7a5584ede00): error adding pod openshift-authentication_oauth-openshift-6fd77597b5-w649n to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"426a721a967ea04526dcde412716eb651ee1158db31cb34288b0c7a5584ede00" Netns:"/var/run/netns/77dce124-f673-40bb-b1e8-99b550c74226" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-6fd77597b5-w649n;K8S_POD_INFRA_CONTAINER_ID=426a721a967ea04526dcde412716eb651ee1158db31cb34288b0c7a5584ede00;K8S_POD_UID=836c1be1-26de-4840-8ba6-9d34a751aebc" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-6fd77597b5-w649n] networking: Multus: [openshift-authentication/oauth-openshift-6fd77597b5-w649n/836c1be1-26de-4840-8ba6-9d34a751aebc]: 
error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod oauth-openshift-6fd77597b5-w649n in out of cluster comm: SetNetworkStatus: failed to update the pod oauth-openshift-6fd77597b5-w649n in out of cluster comm: status update failed for pod /: the server was unable to return a response in the time allotted, but may still be processing the request (get pods oauth-openshift-6fd77597b5-w649n) Mar 08 03:59:17.573408 master-0 kubenswrapper[18592]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 08 03:59:17.573408 master-0 kubenswrapper[18592]: > pod="openshift-authentication/oauth-openshift-6fd77597b5-w649n" Mar 08 03:59:17.573408 master-0 kubenswrapper[18592]: E0308 03:59:17.573041 18592 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"oauth-openshift-6fd77597b5-w649n_openshift-authentication(836c1be1-26de-4840-8ba6-9d34a751aebc)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"oauth-openshift-6fd77597b5-w649n_openshift-authentication(836c1be1-26de-4840-8ba6-9d34a751aebc)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-6fd77597b5-w649n_openshift-authentication_836c1be1-26de-4840-8ba6-9d34a751aebc_0(426a721a967ea04526dcde412716eb651ee1158db31cb34288b0c7a5584ede00): error adding pod openshift-authentication_oauth-openshift-6fd77597b5-w649n to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"426a721a967ea04526dcde412716eb651ee1158db31cb34288b0c7a5584ede00\\\" 
Netns:\\\"/var/run/netns/77dce124-f673-40bb-b1e8-99b550c74226\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-6fd77597b5-w649n;K8S_POD_INFRA_CONTAINER_ID=426a721a967ea04526dcde412716eb651ee1158db31cb34288b0c7a5584ede00;K8S_POD_UID=836c1be1-26de-4840-8ba6-9d34a751aebc\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-6fd77597b5-w649n] networking: Multus: [openshift-authentication/oauth-openshift-6fd77597b5-w649n/836c1be1-26de-4840-8ba6-9d34a751aebc]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod oauth-openshift-6fd77597b5-w649n in out of cluster comm: SetNetworkStatus: failed to update the pod oauth-openshift-6fd77597b5-w649n in out of cluster comm: status update failed for pod /: the server was unable to return a response in the time allotted, but may still be processing the request (get pods oauth-openshift-6fd77597b5-w649n)\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-authentication/oauth-openshift-6fd77597b5-w649n" podUID="836c1be1-26de-4840-8ba6-9d34a751aebc" Mar 08 03:59:17.608977 master-0 kubenswrapper[18592]: I0308 03:59:17.607813 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6fd77597b5-w649n" Mar 08 03:59:17.608977 master-0 kubenswrapper[18592]: I0308 03:59:17.608585 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-6fd77597b5-w649n" Mar 08 03:59:20.471122 master-0 kubenswrapper[18592]: E0308 03:59:20.470773 18592 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T03:59:10Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T03:59:10Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T03:59:10Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T03:59:10Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3e7365fa46219476560dd59d3a82f041546a33f0935c57eb4f3274ab3118ef0b\\\"],\\\"sizeBytes\\\":2895821940},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:ae042a5d32eb2f18d537f2068849e665b55df7d8360daedaaeea98bd2a79e769\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:d077bbabe6cb885ed229119008480493e8364e4bfddaa00b099f68c52b016e6b\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1733328350},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:82f121f9d021a9843b9458f9f222c40f292f2c21dcfcf00f05daacaca8a949c0\\\"],\\\"sizeBytes\\\":1637445817},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:063b8972231e65eb43f6545ba37804f68138dc54d97b91a652a1c5bc7dc76aa5\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:cf682d23b2857e455609879a0867d171a221c18e2cec995dd79570b77c5a4705\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1272201949},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:381e96959e3c3b08a3e2715e6024697ae14af31bd0378b49f583e984b3b9a192\\\"],\\\"sizeBytes\\\":1238047254},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:e0c034ae18daa01af8d073f8cc24ae4af87883c664304910eab1167fdfd60c0b\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:ef0c6b9e405f7a452211e063ce07ded04ccbe38b53860bfd71b5a7cd5072830a\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1229556414},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:79984dfbdf9aeae3985c7fd7515e12328775c0e7fc4782929d0998f4dd2a87c6\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:7be89499615ec913d0fe40ca89682080a3f1181a066dbc501c877cc7ccbcc9ae\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1220167376},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c9330c756dd6ab107e9a4b671bc52742c90d5be11a8380d8b710e2bd4e0ed43c\\\"],\\\"sizeBytes\\\":992610645},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\\\"],\\\"sizeBytes\\\":943837171},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ff40e33e63d6c1f4e4393d5506e38def25ba20582d980fec8b81f81c867ceeec\\\"],\\\"sizeBytes\\\":918278686},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:042e6a37747405da54cf91543d44408c9531327a2cce653c41ca851aa7c896d8\\\"],\\\"sizeBytes\\\":880378279},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e207c762b7802ee0e54507d21ed1f25b19eddc511a4b824934c16c163193be6a\\\"],\\\"sizeBytes\\\":876146500},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:41dbd66e9a886c1fd7a99752f358c6125a209e83c0dd37b35730baae58d82ee8\\\"],\\\"sizeBytes\\\":862633255},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2508a5f66e509e813cb09
825b5456be91b4cdd4d02f470f22a33de42c753f2b7\\\"],\\\"sizeBytes\\\":862197440},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9bfcd8017eede3fb66fa3f5b47c27508b787d38455689154461f0e6a5dc303ff\\\"],\\\"sizeBytes\\\":772939850},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9c946fdc5a4cd16ff998c17844780e7efc38f7f38b97a8a40d75cd77b318ddef\\\"],\\\"sizeBytes\\\":687947017},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0c03cb25dc6f6a865529ebc979e8d7d08492b28fd3fb93beddf30e1cb06f1245\\\"],\\\"sizeBytes\\\":683169303},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3f34dc492c80a3dee4643cc2291044750ac51e6e919b973de8723fa8b70bde70\\\"],\\\"sizeBytes\\\":677929075},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:db06a0e0308b2e541c7bb2d11517431abb31133b2ce6cb6c34ecf5ef4188a4e8\\\"],\\\"sizeBytes\\\":633876767},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a149ed17b20a7577fceacfc5198f8b7b3edf314ee22f77bd6ab87f06a3aa17f3\\\"],\\\"sizeBytes\\\":621647686},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3cdb019b6769514c0e92ef92da73e914fbcf6254cc919677ee077c93ce324de0\\\"],\\\"sizeBytes\\\":605698200},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1ec9d3dbcc6f9817c0f6d09f64c0d98c91b03afbb1fcb3c1e1718aca900754b\\\"],\\\"sizeBytes\\\":589379637},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1575be013a898f153cbf012aeaf28ce720022f934dc05bdffbe479e30999d460\\\"],\\\"sizeBytes\\\":582153879},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:eb82e437a701ce83b70e56be8477d987da67578714dda3d9fa6628804b1b56f5\\\"],\\\"sizeBytes\\\":558210153},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d470dba32064cc62b2ab29303d6e00612304548262eaa2f4e5b40a00a26f71ce\\\"],\\\"sizeBytes\\\":557426734},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:28f33d62fd0b94c5ea0ebcd7a4216848c8dd671a38d901ce98f4c399b700e1c7\\\"],\\\"sizeBytes\\\":548751793},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cc20748723f55f960cfb6328d1591880bbd1b3452155633996d4f41fc7c5f46b\\\"],\\\"sizeBytes\\\":529324693},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ac6f0695d3386e6d601f4ae507940981352fa3ad884b0fed6fb25698c5e6f916\\\"],\\\"sizeBytes\\\":528946249},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6088910bdc1583b275fab261e3234c0b63b4cc16d01bcea697b6a7f6db13bdf3\\\"],\\\"sizeBytes\\\":518384455},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-release@sha256:14bd3c04daa885009785d48f4973e2890751a7ec116cc14d17627245cda54d7b\\\"],\\\"sizeBytes\\\":517997625},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5500329ab50804678fb8a90b96bf2a469bca16b620fb6dd2f5f5a17106e94898\\\"],\\\"sizeBytes\\\":514980169},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9bd0b71d620cf0acbfcd1b58797dc30050bd167cb6b7a7f62c8333dd370c76d5\\\"],\\\"sizeBytes\\\":513581866},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9bd818e37e1f9dbe5393c557b89e81010d68171408e0e4157a3d92ae0ca1c953\\\"],\\\"sizeBytes\\\":513220825},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d601c8437b4d8bbe2da0f3b08f1bd8693f5a4ef6d835377ec029c79d9dca5dab\\\"],\\\"sizeBytes\\\":512273539},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ca868abfecbf9a9c414a4c79e57c4c55e62c8a6796f899ba59dde86c4cf4bb\\\"],\\\"sizeBytes\\\":512235767},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b47d2b146e833bc1612a652136f43afcf1ba30f32cbd0a2f06ca9fc80d969f0\\\"],\\\"sizeBytes\\\":511226810},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:834063dd26fb3d2489e193489198a0d5fbe9c775a0e3017
3e5fcef6994fbf0f6\\\"],\\\"sizeBytes\\\":511164376},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee46e13e26156c904e5784e2d64511021ed0974a169ccd6476b05bff1c44ec56\\\"],\\\"sizeBytes\\\":508888174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7220d16ea511c0f0410cf45db45aaafcc64847c9cb5732ad1eff39ceb482cdba\\\"],\\\"sizeBytes\\\":508544235},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:526c5c02a8fa86a2fa83a7087d4a5c4b1c4072c0f3906163494cc3b3c1295e9b\\\"],\\\"sizeBytes\\\":507967997},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4010a8f9d932615336227e2fd43325d4fa9025dca4bebe032106efea733fcfc3\\\"],\\\"sizeBytes\\\":506479655},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:76b719f5bd541eb1a8bae124d650896b533e7bc3107be536e598b3ab4e135282\\\"],\\\"sizeBytes\\\":506394574},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5de69354d08184ecd6144facc1461777674674e8304971216d4cf1a5025472b9\\\"],\\\"sizeBytes\\\":505344964},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a324f47cf789c0480fa4bcb0812152abc3cd844318bab193108fe4349eed609\\\"],\\\"sizeBytes\\\":505242594},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b8cb5e0caeca0fb02f3e8c72b7ddf1c49e3c602e42e119ba30c60525f1db1821\\\"],\\\"sizeBytes\\\":504658657},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d11f13e867f4df046ca6789bb7273da5d0c08895b3dea00949c8a5458f9e22f9\\\"],\\\"sizeBytes\\\":504623546},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2c8f904c1084450856b501d40bbc9246265fe34a2b70efec23541e3285da7f88\\\"],\\\"sizeBytes\\\":502712961},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:76bdc35338c4d0f5e5b9448fb73e3578656f908a962286692e12a0372ec721d5\\\"],\\\"sizeBytes\\\":495994161},{\\\"names\\\":[\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:ff2db11ce277288befab25ddb86177e832842d2edb5607a2da8f252a030e1cfc\\\"],\\\"sizeBytes\\\":495064829},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9b2e765b795c30c910c331c85226e5db0d56463b6c81d79ded739cba76e2b032\\\"],\\\"sizeBytes\\\":487151732}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"runc\\\"}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 03:59:20.676359 master-0 kubenswrapper[18592]: I0308 03:59:20.676314 18592 patch_prober.go:28] interesting pod/console-744db48f96-lgsd4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" start-of-body= Mar 08 03:59:20.676670 master-0 kubenswrapper[18592]: I0308 03:59:20.676364 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-744db48f96-lgsd4" podUID="935ab7fb-b097-41c3-8926-8343eb29e7fc" containerName="console" probeResult="failure" output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" Mar 08 03:59:21.179133 master-0 kubenswrapper[18592]: E0308 03:59:21.178211 18592 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0" Mar 08 03:59:21.179133 master-0 kubenswrapper[18592]: I0308 03:59:21.178995 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-master-0" Mar 08 03:59:21.215312 master-0 kubenswrapper[18592]: W0308 03:59:21.215239 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29c709c82970b529e7b9b895aa92ef05.slice/crio-b1e73ff0a3d72b31ef4c63583bf068c3fdbe908c95a3c78f112609180d032ee7 WatchSource:0}: Error finding container b1e73ff0a3d72b31ef4c63583bf068c3fdbe908c95a3c78f112609180d032ee7: Status 404 returned error can't find the container with id b1e73ff0a3d72b31ef4c63583bf068c3fdbe908c95a3c78f112609180d032ee7 Mar 08 03:59:21.650615 master-0 kubenswrapper[18592]: I0308 03:59:21.650531 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"29c709c82970b529e7b9b895aa92ef05","Type":"ContainerStarted","Data":"6aa937e809119a59f8109f26d7ef51d082c2b290c602dc49e4fcdb1f2538529e"} Mar 08 03:59:21.650615 master-0 kubenswrapper[18592]: I0308 03:59:21.650614 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"29c709c82970b529e7b9b895aa92ef05","Type":"ContainerStarted","Data":"b1e73ff0a3d72b31ef4c63583bf068c3fdbe908c95a3c78f112609180d032ee7"} Mar 08 03:59:21.653405 master-0 kubenswrapper[18592]: I0308 03:59:21.651203 18592 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="3ecf3d00-32ee-49db-bcf4-1bdb6f372c65" Mar 08 03:59:21.653405 master-0 kubenswrapper[18592]: I0308 03:59:21.651240 18592 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="3ecf3d00-32ee-49db-bcf4-1bdb6f372c65" Mar 08 03:59:21.730667 master-0 kubenswrapper[18592]: E0308 03:59:21.730373 18592 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting 
headers)" interval="400ms" Mar 08 03:59:22.663457 master-0 kubenswrapper[18592]: I0308 03:59:22.663368 18592 generic.go:334] "Generic (PLEG): container finished" podID="29c709c82970b529e7b9b895aa92ef05" containerID="6aa937e809119a59f8109f26d7ef51d082c2b290c602dc49e4fcdb1f2538529e" exitCode=0 Mar 08 03:59:22.663457 master-0 kubenswrapper[18592]: I0308 03:59:22.663445 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"29c709c82970b529e7b9b895aa92ef05","Type":"ContainerDied","Data":"6aa937e809119a59f8109f26d7ef51d082c2b290c602dc49e4fcdb1f2538529e"} Mar 08 03:59:23.303103 master-0 kubenswrapper[18592]: I0308 03:59:23.302988 18592 patch_prober.go:28] interesting pod/console-7748864899-8p6h5 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body= Mar 08 03:59:23.303355 master-0 kubenswrapper[18592]: I0308 03:59:23.303111 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-7748864899-8p6h5" podUID="5bc0469d-1ae9-4606-ba99-7a333b66af37" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" Mar 08 03:59:23.678202 master-0 kubenswrapper[18592]: I0308 03:59:23.677971 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-5-master-0_57550d28-9d40-436b-b20f-da41f30a0e6b/installer/0.log" Mar 08 03:59:23.678202 master-0 kubenswrapper[18592]: I0308 03:59:23.678071 18592 generic.go:334] "Generic (PLEG): container finished" podID="57550d28-9d40-436b-b20f-da41f30a0e6b" containerID="cb5e37fbfe5cc1bcf2f3f419a6b9f123db459cab245005e63a2065425117f690" exitCode=1 Mar 08 03:59:23.678202 master-0 kubenswrapper[18592]: I0308 03:59:23.678117 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/installer-5-master-0" event={"ID":"57550d28-9d40-436b-b20f-da41f30a0e6b","Type":"ContainerDied","Data":"cb5e37fbfe5cc1bcf2f3f419a6b9f123db459cab245005e63a2065425117f690"} Mar 08 03:59:25.134944 master-0 kubenswrapper[18592]: I0308 03:59:25.134885 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-5-master-0_57550d28-9d40-436b-b20f-da41f30a0e6b/installer/0.log" Mar 08 03:59:25.135635 master-0 kubenswrapper[18592]: I0308 03:59:25.134962 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-5-master-0" Mar 08 03:59:25.312994 master-0 kubenswrapper[18592]: I0308 03:59:25.312914 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/57550d28-9d40-436b-b20f-da41f30a0e6b-kubelet-dir\") pod \"57550d28-9d40-436b-b20f-da41f30a0e6b\" (UID: \"57550d28-9d40-436b-b20f-da41f30a0e6b\") " Mar 08 03:59:25.313617 master-0 kubenswrapper[18592]: I0308 03:59:25.313018 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/57550d28-9d40-436b-b20f-da41f30a0e6b-var-lock\") pod \"57550d28-9d40-436b-b20f-da41f30a0e6b\" (UID: \"57550d28-9d40-436b-b20f-da41f30a0e6b\") " Mar 08 03:59:25.313617 master-0 kubenswrapper[18592]: I0308 03:59:25.313107 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/57550d28-9d40-436b-b20f-da41f30a0e6b-kube-api-access\") pod \"57550d28-9d40-436b-b20f-da41f30a0e6b\" (UID: \"57550d28-9d40-436b-b20f-da41f30a0e6b\") " Mar 08 03:59:25.313617 master-0 kubenswrapper[18592]: I0308 03:59:25.313103 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57550d28-9d40-436b-b20f-da41f30a0e6b-kubelet-dir" (OuterVolumeSpecName: 
"kubelet-dir") pod "57550d28-9d40-436b-b20f-da41f30a0e6b" (UID: "57550d28-9d40-436b-b20f-da41f30a0e6b"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:59:25.313617 master-0 kubenswrapper[18592]: I0308 03:59:25.313195 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57550d28-9d40-436b-b20f-da41f30a0e6b-var-lock" (OuterVolumeSpecName: "var-lock") pod "57550d28-9d40-436b-b20f-da41f30a0e6b" (UID: "57550d28-9d40-436b-b20f-da41f30a0e6b"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:59:25.313939 master-0 kubenswrapper[18592]: I0308 03:59:25.313903 18592 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/57550d28-9d40-436b-b20f-da41f30a0e6b-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 03:59:25.314006 master-0 kubenswrapper[18592]: I0308 03:59:25.313948 18592 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/57550d28-9d40-436b-b20f-da41f30a0e6b-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 08 03:59:25.318188 master-0 kubenswrapper[18592]: I0308 03:59:25.318104 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57550d28-9d40-436b-b20f-da41f30a0e6b-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "57550d28-9d40-436b-b20f-da41f30a0e6b" (UID: "57550d28-9d40-436b-b20f-da41f30a0e6b"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:59:25.415351 master-0 kubenswrapper[18592]: I0308 03:59:25.415281 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/57550d28-9d40-436b-b20f-da41f30a0e6b-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 08 03:59:25.699722 master-0 kubenswrapper[18592]: I0308 03:59:25.699641 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-5-master-0_57550d28-9d40-436b-b20f-da41f30a0e6b/installer/0.log" Mar 08 03:59:25.700256 master-0 kubenswrapper[18592]: I0308 03:59:25.699736 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-5-master-0" event={"ID":"57550d28-9d40-436b-b20f-da41f30a0e6b","Type":"ContainerDied","Data":"5943e770bf612f37cde4c4f29602d84e71717a59f303c6cb668defe2ccdd5f53"} Mar 08 03:59:25.700256 master-0 kubenswrapper[18592]: I0308 03:59:25.699790 18592 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5943e770bf612f37cde4c4f29602d84e71717a59f303c6cb668defe2ccdd5f53" Mar 08 03:59:25.700256 master-0 kubenswrapper[18592]: I0308 03:59:25.699879 18592 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-5-master-0" Mar 08 03:59:27.725945 master-0 kubenswrapper[18592]: I0308 03:59:27.725796 18592 generic.go:334] "Generic (PLEG): container finished" podID="ee586416-6f56-4ea4-ad62-95de1e6df23b" containerID="b993a7c9605bb38752ad78b483fef1e87627a44d0b8204e7dbbc52680443a98d" exitCode=0 Mar 08 03:59:27.725945 master-0 kubenswrapper[18592]: I0308 03:59:27.725880 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-8f89dfddd-4mr6p" event={"ID":"ee586416-6f56-4ea4-ad62-95de1e6df23b","Type":"ContainerDied","Data":"b993a7c9605bb38752ad78b483fef1e87627a44d0b8204e7dbbc52680443a98d"} Mar 08 03:59:27.727043 master-0 kubenswrapper[18592]: I0308 03:59:27.726035 18592 scope.go:117] "RemoveContainer" containerID="ee2bfb125e22b7f5901652b9d324e5701d25b6ae22870a7e30683877ccc3b4cb" Mar 08 03:59:27.727043 master-0 kubenswrapper[18592]: I0308 03:59:27.726818 18592 scope.go:117] "RemoveContainer" containerID="b993a7c9605bb38752ad78b483fef1e87627a44d0b8204e7dbbc52680443a98d" Mar 08 03:59:27.727278 master-0 kubenswrapper[18592]: E0308 03:59:27.727218 18592 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"insights-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=insights-operator pod=insights-operator-8f89dfddd-4mr6p_openshift-insights(ee586416-6f56-4ea4-ad62-95de1e6df23b)\"" pod="openshift-insights/insights-operator-8f89dfddd-4mr6p" podUID="ee586416-6f56-4ea4-ad62-95de1e6df23b" Mar 08 03:59:30.472092 master-0 kubenswrapper[18592]: E0308 03:59:30.471800 18592 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 03:59:30.676444 master-0 kubenswrapper[18592]: I0308 03:59:30.676349 18592 
patch_prober.go:28] interesting pod/console-744db48f96-lgsd4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" start-of-body= Mar 08 03:59:30.676741 master-0 kubenswrapper[18592]: I0308 03:59:30.676495 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-744db48f96-lgsd4" podUID="935ab7fb-b097-41c3-8926-8343eb29e7fc" containerName="console" probeResult="failure" output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" Mar 08 03:59:32.131370 master-0 kubenswrapper[18592]: E0308 03:59:32.131299 18592 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="800ms" Mar 08 03:59:33.302752 master-0 kubenswrapper[18592]: I0308 03:59:33.302649 18592 patch_prober.go:28] interesting pod/console-7748864899-8p6h5 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body= Mar 08 03:59:33.303585 master-0 kubenswrapper[18592]: I0308 03:59:33.302756 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-7748864899-8p6h5" podUID="5bc0469d-1ae9-4606-ba99-7a333b66af37" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" Mar 08 03:59:40.472740 master-0 kubenswrapper[18592]: E0308 03:59:40.472644 18592 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": 
net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 03:59:40.676581 master-0 kubenswrapper[18592]: I0308 03:59:40.676500 18592 patch_prober.go:28] interesting pod/console-744db48f96-lgsd4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" start-of-body= Mar 08 03:59:40.676887 master-0 kubenswrapper[18592]: I0308 03:59:40.676619 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-744db48f96-lgsd4" podUID="935ab7fb-b097-41c3-8926-8343eb29e7fc" containerName="console" probeResult="failure" output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" Mar 08 03:59:42.146031 master-0 kubenswrapper[18592]: I0308 03:59:42.145968 18592 scope.go:117] "RemoveContainer" containerID="b993a7c9605bb38752ad78b483fef1e87627a44d0b8204e7dbbc52680443a98d" Mar 08 03:59:42.866929 master-0 kubenswrapper[18592]: I0308 03:59:42.866780 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-8f89dfddd-4mr6p" event={"ID":"ee586416-6f56-4ea4-ad62-95de1e6df23b","Type":"ContainerStarted","Data":"79c3878ca5ec702bd04b79fb49bba2be6016b0025cef22be477ef3ddf3eccc0f"} Mar 08 03:59:42.933279 master-0 kubenswrapper[18592]: E0308 03:59:42.933188 18592 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="1.6s" Mar 08 03:59:43.302343 master-0 kubenswrapper[18592]: I0308 03:59:43.302248 18592 patch_prober.go:28] interesting pod/console-7748864899-8p6h5 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: 
connect: connection refused" start-of-body= Mar 08 03:59:43.302343 master-0 kubenswrapper[18592]: I0308 03:59:43.302332 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-7748864899-8p6h5" podUID="5bc0469d-1ae9-4606-ba99-7a333b66af37" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" Mar 08 03:59:49.692405 master-0 kubenswrapper[18592]: E0308 03:59:49.692201 18592 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{etcd-master-0.189ac19dfe786a6b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0,UID:8e52bef89f4b50e4590a1719bcc5d7e5,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Killing,Message:Stopping container etcd-metrics,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:58:07.647042155 +0000 UTC m=+299.745796505,LastTimestamp:2026-03-08 03:58:07.647042155 +0000 UTC m=+299.745796505,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:59:50.473969 master-0 kubenswrapper[18592]: E0308 03:59:50.473878 18592 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 03:59:50.676261 master-0 kubenswrapper[18592]: I0308 03:59:50.676137 18592 patch_prober.go:28] interesting pod/console-744db48f96-lgsd4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.96:8443/health\": dial tcp 
10.128.0.96:8443: connect: connection refused" start-of-body= Mar 08 03:59:50.676261 master-0 kubenswrapper[18592]: I0308 03:59:50.676228 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-744db48f96-lgsd4" podUID="935ab7fb-b097-41c3-8926-8343eb29e7fc" containerName="console" probeResult="failure" output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" Mar 08 03:59:53.302621 master-0 kubenswrapper[18592]: I0308 03:59:53.302497 18592 patch_prober.go:28] interesting pod/console-7748864899-8p6h5 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body= Mar 08 03:59:53.302621 master-0 kubenswrapper[18592]: I0308 03:59:53.302607 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-7748864899-8p6h5" podUID="5bc0469d-1ae9-4606-ba99-7a333b66af37" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" Mar 08 03:59:53.965998 master-0 kubenswrapper[18592]: I0308 03:59:53.965874 18592 generic.go:334] "Generic (PLEG): container finished" podID="54ad284e-d40e-4e69-b898-f5093952a0e6" containerID="1a31a3069861ce06f33609b07d5ca3abb641a6a3e5a27333ce4ca305d8846e91" exitCode=0 Mar 08 03:59:53.965998 master-0 kubenswrapper[18592]: I0308 03:59:53.965899 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-64bf9778cb-9sw2d" event={"ID":"54ad284e-d40e-4e69-b898-f5093952a0e6","Type":"ContainerDied","Data":"1a31a3069861ce06f33609b07d5ca3abb641a6a3e5a27333ce4ca305d8846e91"} Mar 08 03:59:53.965998 master-0 kubenswrapper[18592]: I0308 03:59:53.965988 18592 scope.go:117] "RemoveContainer" containerID="791aee9d23f28d5b9bc6bbbcd3f26705c245a61021bebb20a57835608ad72cab" Mar 08 03:59:53.966923 master-0 
kubenswrapper[18592]: I0308 03:59:53.966863 18592 scope.go:117] "RemoveContainer" containerID="1a31a3069861ce06f33609b07d5ca3abb641a6a3e5a27333ce4ca305d8846e91" Mar 08 03:59:54.470185 master-0 kubenswrapper[18592]: I0308 03:59:54.470106 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-64bf9778cb-9sw2d" Mar 08 03:59:54.534233 master-0 kubenswrapper[18592]: E0308 03:59:54.534115 18592 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="3.2s" Mar 08 03:59:54.977640 master-0 kubenswrapper[18592]: I0308 03:59:54.977529 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-64bf9778cb-9sw2d" event={"ID":"54ad284e-d40e-4e69-b898-f5093952a0e6","Type":"ContainerStarted","Data":"09245991d8a87f579bcdccdc86d0c865cdc5351214a3a90a8e445185807c88d3"} Mar 08 03:59:54.978041 master-0 kubenswrapper[18592]: I0308 03:59:54.977880 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-64bf9778cb-9sw2d" Mar 08 03:59:54.980134 master-0 kubenswrapper[18592]: I0308 03:59:54.980058 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-64bf9778cb-9sw2d" Mar 08 03:59:55.655058 master-0 kubenswrapper[18592]: E0308 03:59:55.654936 18592 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0" Mar 08 03:59:55.987539 master-0 kubenswrapper[18592]: I0308 03:59:55.987477 18592 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="3ecf3d00-32ee-49db-bcf4-1bdb6f372c65" Mar 08 
03:59:55.987539 master-0 kubenswrapper[18592]: I0308 03:59:55.987514 18592 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="3ecf3d00-32ee-49db-bcf4-1bdb6f372c65" Mar 08 04:00:00.474632 master-0 kubenswrapper[18592]: E0308 04:00:00.474546 18592 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 04:00:00.474632 master-0 kubenswrapper[18592]: E0308 04:00:00.474599 18592 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 08 04:00:00.676526 master-0 kubenswrapper[18592]: I0308 04:00:00.676444 18592 patch_prober.go:28] interesting pod/console-744db48f96-lgsd4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" start-of-body= Mar 08 04:00:00.676778 master-0 kubenswrapper[18592]: I0308 04:00:00.676540 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-744db48f96-lgsd4" podUID="935ab7fb-b097-41c3-8926-8343eb29e7fc" containerName="console" probeResult="failure" output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" Mar 08 04:00:03.061585 master-0 kubenswrapper[18592]: I0308 04:00:03.061502 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-7f8b8b6f4c-8h6fj_1b69fbf6-1ca5-413e-bffd-965730bcec1b/manager/1.log" Mar 08 04:00:03.062916 master-0 kubenswrapper[18592]: I0308 04:00:03.062820 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-7f8b8b6f4c-8h6fj_1b69fbf6-1ca5-413e-bffd-965730bcec1b/manager/0.log" Mar 08 04:00:03.063503 master-0 
kubenswrapper[18592]: I0308 04:00:03.063449 18592 generic.go:334] "Generic (PLEG): container finished" podID="1b69fbf6-1ca5-413e-bffd-965730bcec1b" containerID="0a736e2d8048f2ceaf18442229490b10930da59bbbcadc3ca87a4ce073f730b4" exitCode=1 Mar 08 04:00:03.063813 master-0 kubenswrapper[18592]: I0308 04:00:03.063666 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-8h6fj" event={"ID":"1b69fbf6-1ca5-413e-bffd-965730bcec1b","Type":"ContainerDied","Data":"0a736e2d8048f2ceaf18442229490b10930da59bbbcadc3ca87a4ce073f730b4"} Mar 08 04:00:03.064033 master-0 kubenswrapper[18592]: I0308 04:00:03.064006 18592 scope.go:117] "RemoveContainer" containerID="55aa7553b7b737c589cdd0270a8ec23cc64ce136f8130219ce1dabd7e976b992" Mar 08 04:00:03.065737 master-0 kubenswrapper[18592]: I0308 04:00:03.065676 18592 scope.go:117] "RemoveContainer" containerID="0a736e2d8048f2ceaf18442229490b10930da59bbbcadc3ca87a4ce073f730b4" Mar 08 04:00:03.302717 master-0 kubenswrapper[18592]: I0308 04:00:03.302650 18592 patch_prober.go:28] interesting pod/console-7748864899-8p6h5 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body= Mar 08 04:00:03.302914 master-0 kubenswrapper[18592]: I0308 04:00:03.302732 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-7748864899-8p6h5" podUID="5bc0469d-1ae9-4606-ba99-7a333b66af37" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" Mar 08 04:00:04.074705 master-0 kubenswrapper[18592]: I0308 04:00:04.074631 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-7f8b8b6f4c-8h6fj_1b69fbf6-1ca5-413e-bffd-965730bcec1b/manager/1.log" Mar 08 04:00:04.075554 master-0 
kubenswrapper[18592]: I0308 04:00:04.075273 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-8h6fj" event={"ID":"1b69fbf6-1ca5-413e-bffd-965730bcec1b","Type":"ContainerStarted","Data":"36fc1db8d84020810d8981920a46db4de4096aab1b9629488cfd75412815fcb2"} Mar 08 04:00:04.075917 master-0 kubenswrapper[18592]: I0308 04:00:04.075745 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-8h6fj" Mar 08 04:00:06.099449 master-0 kubenswrapper[18592]: I0308 04:00:06.099266 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7c8df9b496-f89bv_33ed331b-89e9-45f8-ab3c-4533a77cc7b6/cluster-cloud-controller-manager/0.log" Mar 08 04:00:06.099449 master-0 kubenswrapper[18592]: I0308 04:00:06.099363 18592 generic.go:334] "Generic (PLEG): container finished" podID="33ed331b-89e9-45f8-ab3c-4533a77cc7b6" containerID="91401e988995822ef2518e319b542e37936c2f35288b0309adc7d3f08edb68f5" exitCode=1 Mar 08 04:00:06.100529 master-0 kubenswrapper[18592]: I0308 04:00:06.099435 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-f89bv" event={"ID":"33ed331b-89e9-45f8-ab3c-4533a77cc7b6","Type":"ContainerDied","Data":"91401e988995822ef2518e319b542e37936c2f35288b0309adc7d3f08edb68f5"} Mar 08 04:00:06.100642 master-0 kubenswrapper[18592]: I0308 04:00:06.100517 18592 scope.go:117] "RemoveContainer" containerID="91401e988995822ef2518e319b542e37936c2f35288b0309adc7d3f08edb68f5" Mar 08 04:00:07.117988 master-0 kubenswrapper[18592]: I0308 04:00:07.117905 18592 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7c8df9b496-f89bv_33ed331b-89e9-45f8-ab3c-4533a77cc7b6/cluster-cloud-controller-manager/0.log" Mar 08 04:00:07.118871 master-0 kubenswrapper[18592]: I0308 04:00:07.118004 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-f89bv" event={"ID":"33ed331b-89e9-45f8-ab3c-4533a77cc7b6","Type":"ContainerStarted","Data":"d18a1d160a0c6dd2007b6b4111209f49cc35bc15d23ec79f67fdf5a7c94bf99e"} Mar 08 04:00:07.736173 master-0 kubenswrapper[18592]: E0308 04:00:07.735790 18592 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="6.4s" Mar 08 04:00:08.151233 master-0 kubenswrapper[18592]: I0308 04:00:08.151040 18592 status_manager.go:851] "Failed to get status for pod" podUID="a26c661f-f843-45c5-85f0-2c2f72cbf580" pod="openshift-monitoring/prometheus-k8s-0" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods prometheus-k8s-0)" Mar 08 04:00:09.147997 master-0 kubenswrapper[18592]: I0308 04:00:09.147783 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-6598bfb6c4-75qmb_2f59fe81-deee-4ced-ae9d-f17752c82c4b/manager/1.log" Mar 08 04:00:09.149604 master-0 kubenswrapper[18592]: I0308 04:00:09.149566 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-6598bfb6c4-75qmb_2f59fe81-deee-4ced-ae9d-f17752c82c4b/manager/0.log" Mar 08 04:00:09.149749 master-0 kubenswrapper[18592]: I0308 04:00:09.149638 18592 generic.go:334] "Generic 
(PLEG): container finished" podID="2f59fe81-deee-4ced-ae9d-f17752c82c4b" containerID="b9cf2b977eb9896bb3772a53fc657d6a23f43414b986d9aa544cc9a2d1e44724" exitCode=1 Mar 08 04:00:09.149847 master-0 kubenswrapper[18592]: I0308 04:00:09.149739 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-75qmb" event={"ID":"2f59fe81-deee-4ced-ae9d-f17752c82c4b","Type":"ContainerDied","Data":"b9cf2b977eb9896bb3772a53fc657d6a23f43414b986d9aa544cc9a2d1e44724"} Mar 08 04:00:09.149934 master-0 kubenswrapper[18592]: I0308 04:00:09.149863 18592 scope.go:117] "RemoveContainer" containerID="3059f49f388319ee646920103084d28d8b0077750e77df3225c9bad4053dd550" Mar 08 04:00:09.150455 master-0 kubenswrapper[18592]: I0308 04:00:09.150416 18592 scope.go:117] "RemoveContainer" containerID="b9cf2b977eb9896bb3772a53fc657d6a23f43414b986d9aa544cc9a2d1e44724" Mar 08 04:00:09.154429 master-0 kubenswrapper[18592]: I0308 04:00:09.154365 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7c8df9b496-f89bv_33ed331b-89e9-45f8-ab3c-4533a77cc7b6/config-sync-controllers/0.log" Mar 08 04:00:09.155198 master-0 kubenswrapper[18592]: I0308 04:00:09.155142 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7c8df9b496-f89bv_33ed331b-89e9-45f8-ab3c-4533a77cc7b6/cluster-cloud-controller-manager/0.log" Mar 08 04:00:09.155275 master-0 kubenswrapper[18592]: I0308 04:00:09.155230 18592 generic.go:334] "Generic (PLEG): container finished" podID="33ed331b-89e9-45f8-ab3c-4533a77cc7b6" containerID="310c1bcf18c66c58fd78fe7a8197fd35c5d130edf5caca232ed46868d02e501d" exitCode=1 Mar 08 04:00:09.155497 master-0 kubenswrapper[18592]: I0308 04:00:09.155277 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-f89bv" event={"ID":"33ed331b-89e9-45f8-ab3c-4533a77cc7b6","Type":"ContainerDied","Data":"310c1bcf18c66c58fd78fe7a8197fd35c5d130edf5caca232ed46868d02e501d"} Mar 08 04:00:09.156041 master-0 kubenswrapper[18592]: I0308 04:00:09.155988 18592 scope.go:117] "RemoveContainer" containerID="310c1bcf18c66c58fd78fe7a8197fd35c5d130edf5caca232ed46868d02e501d" Mar 08 04:00:09.746586 master-0 kubenswrapper[18592]: I0308 04:00:09.746470 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-75qmb" Mar 08 04:00:10.167166 master-0 kubenswrapper[18592]: I0308 04:00:10.166989 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-6598bfb6c4-75qmb_2f59fe81-deee-4ced-ae9d-f17752c82c4b/manager/1.log" Mar 08 04:00:10.168035 master-0 kubenswrapper[18592]: I0308 04:00:10.167905 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-75qmb" event={"ID":"2f59fe81-deee-4ced-ae9d-f17752c82c4b","Type":"ContainerStarted","Data":"d1d10be09cbaee162bf1aee0adc6baa1854a775f2f1bc51912d80fe72edd152f"} Mar 08 04:00:10.168161 master-0 kubenswrapper[18592]: I0308 04:00:10.168125 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-75qmb" Mar 08 04:00:10.172255 master-0 kubenswrapper[18592]: I0308 04:00:10.172185 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7c8df9b496-f89bv_33ed331b-89e9-45f8-ab3c-4533a77cc7b6/config-sync-controllers/0.log" Mar 08 04:00:10.173102 master-0 kubenswrapper[18592]: I0308 04:00:10.173031 18592 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7c8df9b496-f89bv_33ed331b-89e9-45f8-ab3c-4533a77cc7b6/cluster-cloud-controller-manager/0.log" Mar 08 04:00:10.173304 master-0 kubenswrapper[18592]: I0308 04:00:10.173149 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-f89bv" event={"ID":"33ed331b-89e9-45f8-ab3c-4533a77cc7b6","Type":"ContainerStarted","Data":"9a3b80792cae2bdb86d315899befec45f2d22e7748c21f67313ccbc009325414"} Mar 08 04:00:10.676310 master-0 kubenswrapper[18592]: I0308 04:00:10.676231 18592 patch_prober.go:28] interesting pod/console-744db48f96-lgsd4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" start-of-body= Mar 08 04:00:10.676597 master-0 kubenswrapper[18592]: I0308 04:00:10.676329 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-744db48f96-lgsd4" podUID="935ab7fb-b097-41c3-8926-8343eb29e7fc" containerName="console" probeResult="failure" output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" Mar 08 04:00:11.186064 master-0 kubenswrapper[18592]: I0308 04:00:11.185975 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-h4qlp_9ec89e27-4360-48f2-a7ca-5d823bda4510/snapshot-controller/1.log" Mar 08 04:00:11.187817 master-0 kubenswrapper[18592]: I0308 04:00:11.187727 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-h4qlp_9ec89e27-4360-48f2-a7ca-5d823bda4510/snapshot-controller/0.log" Mar 08 04:00:11.187998 master-0 kubenswrapper[18592]: I0308 04:00:11.187859 18592 generic.go:334] "Generic 
(PLEG): container finished" podID="9ec89e27-4360-48f2-a7ca-5d823bda4510" containerID="bc802833ef70245be653ae91aa731daa7eab1a05e8fd9b4edd25c8e6a8279edb" exitCode=1 Mar 08 04:00:11.188077 master-0 kubenswrapper[18592]: I0308 04:00:11.187979 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-h4qlp" event={"ID":"9ec89e27-4360-48f2-a7ca-5d823bda4510","Type":"ContainerDied","Data":"bc802833ef70245be653ae91aa731daa7eab1a05e8fd9b4edd25c8e6a8279edb"} Mar 08 04:00:11.188155 master-0 kubenswrapper[18592]: I0308 04:00:11.188072 18592 scope.go:117] "RemoveContainer" containerID="e1cf094994e913e66c5a9e6e155292c3e34468235cb173dcf1919a0eed0dd4ca" Mar 08 04:00:11.188902 master-0 kubenswrapper[18592]: I0308 04:00:11.188816 18592 scope.go:117] "RemoveContainer" containerID="bc802833ef70245be653ae91aa731daa7eab1a05e8fd9b4edd25c8e6a8279edb" Mar 08 04:00:12.200435 master-0 kubenswrapper[18592]: I0308 04:00:12.200349 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-h4qlp_9ec89e27-4360-48f2-a7ca-5d823bda4510/snapshot-controller/1.log" Mar 08 04:00:12.201397 master-0 kubenswrapper[18592]: I0308 04:00:12.200477 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-h4qlp" event={"ID":"9ec89e27-4360-48f2-a7ca-5d823bda4510","Type":"ContainerStarted","Data":"bd7c785d8a016e94e8bed926d46c88e1665bdf7a861bc764c57c6fa4c6daf951"} Mar 08 04:00:13.303243 master-0 kubenswrapper[18592]: I0308 04:00:13.303151 18592 patch_prober.go:28] interesting pod/console-7748864899-8p6h5 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body= Mar 08 04:00:13.304291 master-0 kubenswrapper[18592]: I0308 04:00:13.303244 18592 prober.go:107] "Probe 
failed" probeType="Startup" pod="openshift-console/console-7748864899-8p6h5" podUID="5bc0469d-1ae9-4606-ba99-7a333b66af37" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" Mar 08 04:00:16.370030 master-0 kubenswrapper[18592]: I0308 04:00:16.369798 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-8h6fj" Mar 08 04:00:18.476098 master-0 kubenswrapper[18592]: E0308 04:00:18.476010 18592 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 08 04:00:18.476098 master-0 kubenswrapper[18592]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-6fd77597b5-w649n_openshift-authentication_836c1be1-26de-4840-8ba6-9d34a751aebc_0(4810207b1449626586b757d8d61897e145318806d9bd189808d6880e118545bd): error adding pod openshift-authentication_oauth-openshift-6fd77597b5-w649n to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"4810207b1449626586b757d8d61897e145318806d9bd189808d6880e118545bd" Netns:"/var/run/netns/0036ba8b-f6c7-40ea-b839-ef7713ba39d0" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-6fd77597b5-w649n;K8S_POD_INFRA_CONTAINER_ID=4810207b1449626586b757d8d61897e145318806d9bd189808d6880e118545bd;K8S_POD_UID=836c1be1-26de-4840-8ba6-9d34a751aebc" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-6fd77597b5-w649n] networking: Multus: [openshift-authentication/oauth-openshift-6fd77597b5-w649n/836c1be1-26de-4840-8ba6-9d34a751aebc]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod oauth-openshift-6fd77597b5-w649n in out of cluster comm: SetNetworkStatus: failed to update the pod 
oauth-openshift-6fd77597b5-w649n in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-6fd77597b5-w649n?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 08 04:00:18.476098 master-0 kubenswrapper[18592]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 08 04:00:18.476098 master-0 kubenswrapper[18592]: > Mar 08 04:00:18.476731 master-0 kubenswrapper[18592]: E0308 04:00:18.476133 18592 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 08 04:00:18.476731 master-0 kubenswrapper[18592]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-6fd77597b5-w649n_openshift-authentication_836c1be1-26de-4840-8ba6-9d34a751aebc_0(4810207b1449626586b757d8d61897e145318806d9bd189808d6880e118545bd): error adding pod openshift-authentication_oauth-openshift-6fd77597b5-w649n to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"4810207b1449626586b757d8d61897e145318806d9bd189808d6880e118545bd" Netns:"/var/run/netns/0036ba8b-f6c7-40ea-b839-ef7713ba39d0" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-6fd77597b5-w649n;K8S_POD_INFRA_CONTAINER_ID=4810207b1449626586b757d8d61897e145318806d9bd189808d6880e118545bd;K8S_POD_UID=836c1be1-26de-4840-8ba6-9d34a751aebc" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-6fd77597b5-w649n] networking: Multus: 
[openshift-authentication/oauth-openshift-6fd77597b5-w649n/836c1be1-26de-4840-8ba6-9d34a751aebc]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod oauth-openshift-6fd77597b5-w649n in out of cluster comm: SetNetworkStatus: failed to update the pod oauth-openshift-6fd77597b5-w649n in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-6fd77597b5-w649n?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 08 04:00:18.476731 master-0 kubenswrapper[18592]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 08 04:00:18.476731 master-0 kubenswrapper[18592]: > pod="openshift-authentication/oauth-openshift-6fd77597b5-w649n" Mar 08 04:00:18.476731 master-0 kubenswrapper[18592]: E0308 04:00:18.476169 18592 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 08 04:00:18.476731 master-0 kubenswrapper[18592]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-6fd77597b5-w649n_openshift-authentication_836c1be1-26de-4840-8ba6-9d34a751aebc_0(4810207b1449626586b757d8d61897e145318806d9bd189808d6880e118545bd): error adding pod openshift-authentication_oauth-openshift-6fd77597b5-w649n to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"4810207b1449626586b757d8d61897e145318806d9bd189808d6880e118545bd" Netns:"/var/run/netns/0036ba8b-f6c7-40ea-b839-ef7713ba39d0" IfName:"eth0" 
Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-6fd77597b5-w649n;K8S_POD_INFRA_CONTAINER_ID=4810207b1449626586b757d8d61897e145318806d9bd189808d6880e118545bd;K8S_POD_UID=836c1be1-26de-4840-8ba6-9d34a751aebc" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-6fd77597b5-w649n] networking: Multus: [openshift-authentication/oauth-openshift-6fd77597b5-w649n/836c1be1-26de-4840-8ba6-9d34a751aebc]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod oauth-openshift-6fd77597b5-w649n in out of cluster comm: SetNetworkStatus: failed to update the pod oauth-openshift-6fd77597b5-w649n in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-6fd77597b5-w649n?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 08 04:00:18.476731 master-0 kubenswrapper[18592]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 08 04:00:18.476731 master-0 kubenswrapper[18592]: > pod="openshift-authentication/oauth-openshift-6fd77597b5-w649n" Mar 08 04:00:18.476731 master-0 kubenswrapper[18592]: E0308 04:00:18.476274 18592 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"oauth-openshift-6fd77597b5-w649n_openshift-authentication(836c1be1-26de-4840-8ba6-9d34a751aebc)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"oauth-openshift-6fd77597b5-w649n_openshift-authentication(836c1be1-26de-4840-8ba6-9d34a751aebc)\\\": rpc error: code = Unknown desc = 
failed to create pod network sandbox k8s_oauth-openshift-6fd77597b5-w649n_openshift-authentication_836c1be1-26de-4840-8ba6-9d34a751aebc_0(4810207b1449626586b757d8d61897e145318806d9bd189808d6880e118545bd): error adding pod openshift-authentication_oauth-openshift-6fd77597b5-w649n to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"4810207b1449626586b757d8d61897e145318806d9bd189808d6880e118545bd\\\" Netns:\\\"/var/run/netns/0036ba8b-f6c7-40ea-b839-ef7713ba39d0\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-6fd77597b5-w649n;K8S_POD_INFRA_CONTAINER_ID=4810207b1449626586b757d8d61897e145318806d9bd189808d6880e118545bd;K8S_POD_UID=836c1be1-26de-4840-8ba6-9d34a751aebc\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-6fd77597b5-w649n] networking: Multus: [openshift-authentication/oauth-openshift-6fd77597b5-w649n/836c1be1-26de-4840-8ba6-9d34a751aebc]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod oauth-openshift-6fd77597b5-w649n in out of cluster comm: SetNetworkStatus: failed to update the pod oauth-openshift-6fd77597b5-w649n in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-6fd77597b5-w649n?timeout=1m0s\\\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)\\n': StdinData: 
{\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-authentication/oauth-openshift-6fd77597b5-w649n" podUID="836c1be1-26de-4840-8ba6-9d34a751aebc" Mar 08 04:00:19.269522 master-0 kubenswrapper[18592]: I0308 04:00:19.269456 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6fd77597b5-w649n" Mar 08 04:00:19.270907 master-0 kubenswrapper[18592]: I0308 04:00:19.270872 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6fd77597b5-w649n" Mar 08 04:00:19.751390 master-0 kubenswrapper[18592]: I0308 04:00:19.751320 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-75qmb" Mar 08 04:00:20.499253 master-0 kubenswrapper[18592]: E0308 04:00:20.498863 18592 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T04:00:10Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T04:00:10Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T04:00:10Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T04:00:10Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3e7365fa46219476560dd59d3a82f041546a33f0935c57eb4f3274ab3118ef0b\\\"],\\\"sizeBytes\\\":2895821940},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:ae042a5d32eb2f18d537f2068849e665b55df7d8360daedaaeea98bd2a79e769\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:d077bbabe6cb885ed229119008480493e8364e4bfddaa00b099f68c52b016e6b\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1733328350},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:82f121f9d021a9843b9458f9f222c40f292f2c21dcfcf00f05daacaca8a949c0\\\"],\\\"sizeBytes\\\":1637445817},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:063b8972231e65eb43f6545ba37804f68138dc54d97b91a652a1c5bc7dc76aa5\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:cf682d23b2857e455609879a0867d171a221c18e2cec995dd79570b77c5a4705\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1272201949},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:381e96959e3c3b08a3e2715e6024697ae14af31bd0378b49f583e984b3b9a192\\\"],\\\"sizeBytes\\\":1238047254},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:e0c034ae18daa01af8d073f8cc24ae4af87883c664304910eab1167fdfd60c0b\\\",\\
\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:ef0c6b9e405f7a452211e063ce07ded04ccbe38b53860bfd71b5a7cd5072830a\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1229556414},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:79984dfbdf9aeae3985c7fd7515e12328775c0e7fc4782929d0998f4dd2a87c6\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:7be89499615ec913d0fe40ca89682080a3f1181a066dbc501c877cc7ccbcc9ae\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1220167376},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c9330c756dd6ab107e9a4b671bc52742c90d5be11a8380d8b710e2bd4e0ed43c\\\"],\\\"sizeBytes\\\":992610645},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\\\"],\\\"sizeBytes\\\":943837171},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ff40e33e63d6c1f4e4393d5506e38def25ba20582d980fec8b81f81c867ceeec\\\"],\\\"sizeBytes\\\":918278686},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:042e6a37747405da54cf91543d44408c9531327a2cce653c41ca851aa7c896d8\\\"],\\\"sizeBytes\\\":880378279},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e207c762b7802ee0e54507d21ed1f25b19eddc511a4b824934c16c163193be6a\\\"],\\\"sizeBytes\\\":876146500},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:41dbd66e9a886c1fd7a99752f358c6125a209e83c0dd37b35730baae58d82ee8\\\"],\\\"sizeBytes\\\":862633255},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2508a5f66e509e813cb09825b5456be91b4cdd4d02f470f22a33de42c753f2b7\\\"],\\\"sizeBytes\\\":862197440},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9bfcd8017eede3fb66fa3f5b47c27508b787d38455689154461f0e6a5dc303ff\\\"],\\\"sizeBytes\\\":772939850},{\\\"names\\\":[\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9c946fdc5a4cd16ff998c17844780e7efc38f7f38b97a8a40d75cd77b318ddef\\\"],\\\"sizeBytes\\\":687947017},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0c03cb25dc6f6a865529ebc979e8d7d08492b28fd3fb93beddf30e1cb06f1245\\\"],\\\"sizeBytes\\\":683169303},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3f34dc492c80a3dee4643cc2291044750ac51e6e919b973de8723fa8b70bde70\\\"],\\\"sizeBytes\\\":677929075},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:db06a0e0308b2e541c7bb2d11517431abb31133b2ce6cb6c34ecf5ef4188a4e8\\\"],\\\"sizeBytes\\\":633876767},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a149ed17b20a7577fceacfc5198f8b7b3edf314ee22f77bd6ab87f06a3aa17f3\\\"],\\\"sizeBytes\\\":621647686},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3cdb019b6769514c0e92ef92da73e914fbcf6254cc919677ee077c93ce324de0\\\"],\\\"sizeBytes\\\":605698200},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1ec9d3dbcc6f9817c0f6d09f64c0d98c91b03afbb1fcb3c1e1718aca900754b\\\"],\\\"sizeBytes\\\":589379637},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1575be013a898f153cbf012aeaf28ce720022f934dc05bdffbe479e30999d460\\\"],\\\"sizeBytes\\\":582153879},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:eb82e437a701ce83b70e56be8477d987da67578714dda3d9fa6628804b1b56f5\\\"],\\\"sizeBytes\\\":558210153},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d470dba32064cc62b2ab29303d6e00612304548262eaa2f4e5b40a00a26f71ce\\\"],\\\"sizeBytes\\\":557426734},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:28f33d62fd0b94c5ea0ebcd7a4216848c8dd671a38d901ce98f4c399b700e1c7\\\"],\\\"sizeBytes\\\":548751793},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cc20748723f55f960cfb6328d1591880bbd1b345
2155633996d4f41fc7c5f46b\\\"],\\\"sizeBytes\\\":529324693},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ac6f0695d3386e6d601f4ae507940981352fa3ad884b0fed6fb25698c5e6f916\\\"],\\\"sizeBytes\\\":528946249},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6088910bdc1583b275fab261e3234c0b63b4cc16d01bcea697b6a7f6db13bdf3\\\"],\\\"sizeBytes\\\":518384455},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-release@sha256:14bd3c04daa885009785d48f4973e2890751a7ec116cc14d17627245cda54d7b\\\"],\\\"sizeBytes\\\":517997625},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5500329ab50804678fb8a90b96bf2a469bca16b620fb6dd2f5f5a17106e94898\\\"],\\\"sizeBytes\\\":514980169},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9bd0b71d620cf0acbfcd1b58797dc30050bd167cb6b7a7f62c8333dd370c76d5\\\"],\\\"sizeBytes\\\":513581866},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9bd818e37e1f9dbe5393c557b89e81010d68171408e0e4157a3d92ae0ca1c953\\\"],\\\"sizeBytes\\\":513220825},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d601c8437b4d8bbe2da0f3b08f1bd8693f5a4ef6d835377ec029c79d9dca5dab\\\"],\\\"sizeBytes\\\":512273539},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ca868abfecbf9a9c414a4c79e57c4c55e62c8a6796f899ba59dde86c4cf4bb\\\"],\\\"sizeBytes\\\":512235767},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b47d2b146e833bc1612a652136f43afcf1ba30f32cbd0a2f06ca9fc80d969f0\\\"],\\\"sizeBytes\\\":511226810},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:834063dd26fb3d2489e193489198a0d5fbe9c775a0e30173e5fcef6994fbf0f6\\\"],\\\"sizeBytes\\\":511164376},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee46e13e26156c904e5784e2d64511021ed0974a169ccd6476b05bff1c44ec56\\\"],\\\"sizeBytes\\\":508888174},{\\\"names\\\":[\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:7220d16ea511c0f0410cf45db45aaafcc64847c9cb5732ad1eff39ceb482cdba\\\"],\\\"sizeBytes\\\":508544235},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:526c5c02a8fa86a2fa83a7087d4a5c4b1c4072c0f3906163494cc3b3c1295e9b\\\"],\\\"sizeBytes\\\":507967997},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4010a8f9d932615336227e2fd43325d4fa9025dca4bebe032106efea733fcfc3\\\"],\\\"sizeBytes\\\":506479655},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:76b719f5bd541eb1a8bae124d650896b533e7bc3107be536e598b3ab4e135282\\\"],\\\"sizeBytes\\\":506394574},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5de69354d08184ecd6144facc1461777674674e8304971216d4cf1a5025472b9\\\"],\\\"sizeBytes\\\":505344964},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a324f47cf789c0480fa4bcb0812152abc3cd844318bab193108fe4349eed609\\\"],\\\"sizeBytes\\\":505242594},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b8cb5e0caeca0fb02f3e8c72b7ddf1c49e3c602e42e119ba30c60525f1db1821\\\"],\\\"sizeBytes\\\":504658657},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d11f13e867f4df046ca6789bb7273da5d0c08895b3dea00949c8a5458f9e22f9\\\"],\\\"sizeBytes\\\":504623546},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2c8f904c1084450856b501d40bbc9246265fe34a2b70efec23541e3285da7f88\\\"],\\\"sizeBytes\\\":502712961},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:76bdc35338c4d0f5e5b9448fb73e3578656f908a962286692e12a0372ec721d5\\\"],\\\"sizeBytes\\\":495994161},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ff2db11ce277288befab25ddb86177e832842d2edb5607a2da8f252a030e1cfc\\\"],\\\"sizeBytes\\\":495064829},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9b2e765b795c30c910c331c85226e5db0d56463b6c81d79ded739cba76e2b032\\
\"],\\\"sizeBytes\\\":487151732}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"runc\\\"}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": context deadline exceeded" Mar 08 04:00:20.677954 master-0 kubenswrapper[18592]: I0308 04:00:20.676745 18592 patch_prober.go:28] interesting pod/console-744db48f96-lgsd4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" start-of-body= Mar 08 04:00:20.678328 master-0 kubenswrapper[18592]: I0308 04:00:20.678277 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-744db48f96-lgsd4" podUID="935ab7fb-b097-41c3-8926-8343eb29e7fc" containerName="console" probeResult="failure" output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" Mar 08 04:00:23.302176 master-0 kubenswrapper[18592]: I0308 04:00:23.302052 18592 patch_prober.go:28] interesting pod/console-7748864899-8p6h5 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body= Mar 08 04:00:23.302176 master-0 kubenswrapper[18592]: I0308 04:00:23.302125 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-7748864899-8p6h5" podUID="5bc0469d-1ae9-4606-ba99-7a333b66af37" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" Mar 08 04:00:23.696413 
master-0 kubenswrapper[18592]: E0308 04:00:23.696235 18592 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{etcd-master-0.189ac19dfe78dfeb openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0,UID:8e52bef89f4b50e4590a1719bcc5d7e5,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Killing,Message:Stopping container etcd,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:58:07.647072235 +0000 UTC m=+299.745826585,LastTimestamp:2026-03-08 03:58:07.647072235 +0000 UTC m=+299.745826585,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 04:00:24.137107 master-0 kubenswrapper[18592]: E0308 04:00:24.136887 18592 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Mar 08 04:00:29.990561 master-0 kubenswrapper[18592]: E0308 04:00:29.990343 18592 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0" Mar 08 04:00:30.370475 master-0 kubenswrapper[18592]: I0308 04:00:30.370407 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"29c709c82970b529e7b9b895aa92ef05","Type":"ContainerStarted","Data":"9923fd23c7ea45f90654feec1a1fded127dc87f5e3f9ef70940165a34b17180c"} Mar 08 04:00:30.371174 master-0 kubenswrapper[18592]: I0308 04:00:30.371132 18592 kubelet.go:1909] "Trying to delete pod" 
pod="openshift-etcd/etcd-master-0" podUID="3ecf3d00-32ee-49db-bcf4-1bdb6f372c65" Mar 08 04:00:30.371339 master-0 kubenswrapper[18592]: I0308 04:00:30.371176 18592 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="3ecf3d00-32ee-49db-bcf4-1bdb6f372c65" Mar 08 04:00:30.500500 master-0 kubenswrapper[18592]: E0308 04:00:30.500336 18592 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 04:00:30.675905 master-0 kubenswrapper[18592]: I0308 04:00:30.675755 18592 patch_prober.go:28] interesting pod/console-744db48f96-lgsd4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" start-of-body= Mar 08 04:00:30.676253 master-0 kubenswrapper[18592]: I0308 04:00:30.675906 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-744db48f96-lgsd4" podUID="935ab7fb-b097-41c3-8926-8343eb29e7fc" containerName="console" probeResult="failure" output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" Mar 08 04:00:31.383806 master-0 kubenswrapper[18592]: I0308 04:00:31.383729 18592 generic.go:334] "Generic (PLEG): container finished" podID="29c709c82970b529e7b9b895aa92ef05" containerID="9923fd23c7ea45f90654feec1a1fded127dc87f5e3f9ef70940165a34b17180c" exitCode=0 Mar 08 04:00:31.383806 master-0 kubenswrapper[18592]: I0308 04:00:31.383799 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"29c709c82970b529e7b9b895aa92ef05","Type":"ContainerDied","Data":"9923fd23c7ea45f90654feec1a1fded127dc87f5e3f9ef70940165a34b17180c"} Mar 08 04:00:32.398949 master-0 kubenswrapper[18592]: I0308 
04:00:32.398900 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_0580c83f64e952a7a614903b6fdf6965/kube-controller-manager/0.log" Mar 08 04:00:32.399963 master-0 kubenswrapper[18592]: I0308 04:00:32.399897 18592 generic.go:334] "Generic (PLEG): container finished" podID="0580c83f64e952a7a614903b6fdf6965" containerID="ade4a3d46dcebb1e326fada73ac4cc99f5151e1191d54ff04a063230922fd053" exitCode=0 Mar 08 04:00:32.400133 master-0 kubenswrapper[18592]: I0308 04:00:32.400002 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"0580c83f64e952a7a614903b6fdf6965","Type":"ContainerDied","Data":"ade4a3d46dcebb1e326fada73ac4cc99f5151e1191d54ff04a063230922fd053"} Mar 08 04:00:32.401128 master-0 kubenswrapper[18592]: I0308 04:00:32.401098 18592 scope.go:117] "RemoveContainer" containerID="ade4a3d46dcebb1e326fada73ac4cc99f5151e1191d54ff04a063230922fd053" Mar 08 04:00:33.302574 master-0 kubenswrapper[18592]: I0308 04:00:33.302503 18592 patch_prober.go:28] interesting pod/console-7748864899-8p6h5 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body= Mar 08 04:00:33.303034 master-0 kubenswrapper[18592]: I0308 04:00:33.302982 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-7748864899-8p6h5" podUID="5bc0469d-1ae9-4606-ba99-7a333b66af37" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" Mar 08 04:00:33.417060 master-0 kubenswrapper[18592]: I0308 04:00:33.416963 18592 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_0580c83f64e952a7a614903b6fdf6965/kube-controller-manager/0.log" Mar 08 04:00:33.417060 master-0 kubenswrapper[18592]: I0308 04:00:33.417053 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"0580c83f64e952a7a614903b6fdf6965","Type":"ContainerStarted","Data":"b2e2f03048172100d18502fe8929d9dbf909b8a6fed9417d23d6e4cec64856d5"} Mar 08 04:00:34.429737 master-0 kubenswrapper[18592]: I0308 04:00:34.429679 18592 generic.go:334] "Generic (PLEG): container finished" podID="738cecb2-995e-4486-ab1e-05af4df24de0" containerID="faeb18f642a82417511abd4d045c306f37d81c662abebaeb263059eebc3803d7" exitCode=0 Mar 08 04:00:34.430631 master-0 kubenswrapper[18592]: I0308 04:00:34.429741 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64b4c7cbf8-n5l2s" event={"ID":"738cecb2-995e-4486-ab1e-05af4df24de0","Type":"ContainerDied","Data":"faeb18f642a82417511abd4d045c306f37d81c662abebaeb263059eebc3803d7"} Mar 08 04:00:34.431468 master-0 kubenswrapper[18592]: I0308 04:00:34.431438 18592 scope.go:117] "RemoveContainer" containerID="faeb18f642a82417511abd4d045c306f37d81c662abebaeb263059eebc3803d7" Mar 08 04:00:35.207208 master-0 kubenswrapper[18592]: I0308 04:00:35.207080 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 04:00:35.207208 master-0 kubenswrapper[18592]: I0308 04:00:35.207206 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 04:00:35.442281 master-0 kubenswrapper[18592]: I0308 04:00:35.442210 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64b4c7cbf8-n5l2s" 
event={"ID":"738cecb2-995e-4486-ab1e-05af4df24de0","Type":"ContainerStarted","Data":"c74c92181ff516fdd83a866df2d6556df2df948003fcbd924c6a3a2fa67a3050"} Mar 08 04:00:35.443771 master-0 kubenswrapper[18592]: I0308 04:00:35.442647 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-64b4c7cbf8-n5l2s" Mar 08 04:00:35.450994 master-0 kubenswrapper[18592]: I0308 04:00:35.450938 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-64b4c7cbf8-n5l2s" Mar 08 04:00:38.207566 master-0 kubenswrapper[18592]: I0308 04:00:38.207497 18592 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 08 04:00:38.208500 master-0 kubenswrapper[18592]: I0308 04:00:38.208462 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="0580c83f64e952a7a614903b6fdf6965" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 08 04:00:40.500763 master-0 kubenswrapper[18592]: E0308 04:00:40.500646 18592 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": context deadline exceeded" Mar 08 04:00:40.676490 master-0 kubenswrapper[18592]: I0308 04:00:40.676382 18592 patch_prober.go:28] interesting pod/console-744db48f96-lgsd4 container/console namespace/openshift-console: Startup probe 
status=failure output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" start-of-body= Mar 08 04:00:40.676712 master-0 kubenswrapper[18592]: I0308 04:00:40.676483 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-744db48f96-lgsd4" podUID="935ab7fb-b097-41c3-8926-8343eb29e7fc" containerName="console" probeResult="failure" output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" Mar 08 04:00:41.137733 master-0 kubenswrapper[18592]: E0308 04:00:41.137635 18592 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" interval="7s" Mar 08 04:00:41.501190 master-0 kubenswrapper[18592]: I0308 04:00:41.501112 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-h4qlp_9ec89e27-4360-48f2-a7ca-5d823bda4510/snapshot-controller/2.log" Mar 08 04:00:41.501945 master-0 kubenswrapper[18592]: I0308 04:00:41.501813 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-h4qlp_9ec89e27-4360-48f2-a7ca-5d823bda4510/snapshot-controller/1.log" Mar 08 04:00:41.501945 master-0 kubenswrapper[18592]: I0308 04:00:41.501907 18592 generic.go:334] "Generic (PLEG): container finished" podID="9ec89e27-4360-48f2-a7ca-5d823bda4510" containerID="bd7c785d8a016e94e8bed926d46c88e1665bdf7a861bc764c57c6fa4c6daf951" exitCode=1 Mar 08 04:00:41.502169 master-0 kubenswrapper[18592]: I0308 04:00:41.502046 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-h4qlp" 
event={"ID":"9ec89e27-4360-48f2-a7ca-5d823bda4510","Type":"ContainerDied","Data":"bd7c785d8a016e94e8bed926d46c88e1665bdf7a861bc764c57c6fa4c6daf951"} Mar 08 04:00:41.502341 master-0 kubenswrapper[18592]: I0308 04:00:41.502304 18592 scope.go:117] "RemoveContainer" containerID="bc802833ef70245be653ae91aa731daa7eab1a05e8fd9b4edd25c8e6a8279edb" Mar 08 04:00:41.503750 master-0 kubenswrapper[18592]: I0308 04:00:41.503646 18592 scope.go:117] "RemoveContainer" containerID="bd7c785d8a016e94e8bed926d46c88e1665bdf7a861bc764c57c6fa4c6daf951" Mar 08 04:00:41.504275 master-0 kubenswrapper[18592]: E0308 04:00:41.504202 18592 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=snapshot-controller pod=csi-snapshot-controller-7577d6f48-h4qlp_openshift-cluster-storage-operator(9ec89e27-4360-48f2-a7ca-5d823bda4510)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-h4qlp" podUID="9ec89e27-4360-48f2-a7ca-5d823bda4510" Mar 08 04:00:41.507425 master-0 kubenswrapper[18592]: I0308 04:00:41.507365 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-6686554ddc-7bcsk_139881ee-6cfa-4a7e-b002-63cece048d16/control-plane-machine-set-operator/0.log" Mar 08 04:00:41.507625 master-0 kubenswrapper[18592]: I0308 04:00:41.507458 18592 generic.go:334] "Generic (PLEG): container finished" podID="139881ee-6cfa-4a7e-b002-63cece048d16" containerID="ee626e564b55f5d917bbbc7e14f501fa99cbf93b071cbbb29f2046199512a0ec" exitCode=1 Mar 08 04:00:41.507625 master-0 kubenswrapper[18592]: I0308 04:00:41.507504 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-7bcsk" event={"ID":"139881ee-6cfa-4a7e-b002-63cece048d16","Type":"ContainerDied","Data":"ee626e564b55f5d917bbbc7e14f501fa99cbf93b071cbbb29f2046199512a0ec"} Mar 08 
04:00:41.508358 master-0 kubenswrapper[18592]: I0308 04:00:41.508302 18592 scope.go:117] "RemoveContainer" containerID="ee626e564b55f5d917bbbc7e14f501fa99cbf93b071cbbb29f2046199512a0ec" Mar 08 04:00:42.518777 master-0 kubenswrapper[18592]: I0308 04:00:42.518707 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-6686554ddc-7bcsk_139881ee-6cfa-4a7e-b002-63cece048d16/control-plane-machine-set-operator/0.log" Mar 08 04:00:42.519300 master-0 kubenswrapper[18592]: I0308 04:00:42.518921 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-7bcsk" event={"ID":"139881ee-6cfa-4a7e-b002-63cece048d16","Type":"ContainerStarted","Data":"ea1a87945285362a8ff85544c8ddda3bac6559e3d70b5f87920343c5df18efa8"} Mar 08 04:00:42.521453 master-0 kubenswrapper[18592]: I0308 04:00:42.521437 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-h4qlp_9ec89e27-4360-48f2-a7ca-5d823bda4510/snapshot-controller/2.log" Mar 08 04:00:43.302699 master-0 kubenswrapper[18592]: I0308 04:00:43.302615 18592 patch_prober.go:28] interesting pod/console-7748864899-8p6h5 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body= Mar 08 04:00:43.302983 master-0 kubenswrapper[18592]: I0308 04:00:43.302698 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-7748864899-8p6h5" podUID="5bc0469d-1ae9-4606-ba99-7a333b66af37" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" Mar 08 04:00:44.550554 master-0 kubenswrapper[18592]: I0308 04:00:44.550250 18592 generic.go:334] "Generic (PLEG): container finished" 
podID="7e5935ea-8d95-45e3-b836-c7892953ef3d" containerID="2437868a78df876a1b3a4d8757e1788d355f9ffb34efa554d9cc67df46e10738" exitCode=0 Mar 08 04:00:44.550554 master-0 kubenswrapper[18592]: I0308 04:00:44.550320 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-bjjfh" event={"ID":"7e5935ea-8d95-45e3-b836-c7892953ef3d","Type":"ContainerDied","Data":"2437868a78df876a1b3a4d8757e1788d355f9ffb34efa554d9cc67df46e10738"} Mar 08 04:00:44.550554 master-0 kubenswrapper[18592]: I0308 04:00:44.550381 18592 scope.go:117] "RemoveContainer" containerID="7fe9302ada8235a3afd5b8f3fc53b3d920a5fbae69778891c3722690a5eb8590" Mar 08 04:00:44.552188 master-0 kubenswrapper[18592]: I0308 04:00:44.551482 18592 scope.go:117] "RemoveContainer" containerID="2437868a78df876a1b3a4d8757e1788d355f9ffb34efa554d9cc67df46e10738" Mar 08 04:00:45.563586 master-0 kubenswrapper[18592]: I0308 04:00:45.563493 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-bjjfh" event={"ID":"7e5935ea-8d95-45e3-b836-c7892953ef3d","Type":"ContainerStarted","Data":"65c226a33e10a3471a0d0efe53b130c8084b8d1ef1a25a048dcdd416efd3b490"} Mar 08 04:00:47.588683 master-0 kubenswrapper[18592]: I0308 04:00:47.588612 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-machine-approver_machine-approver-754bdc9f9d-z4sdd_127c3f92-8283-4179-9e40-a12dcabaaa12/machine-approver-controller/0.log" Mar 08 04:00:47.589581 master-0 kubenswrapper[18592]: I0308 04:00:47.589388 18592 generic.go:334] "Generic (PLEG): container finished" podID="127c3f92-8283-4179-9e40-a12dcabaaa12" containerID="8eae55733094f3b0776d6b5120bc7a09315bf36094aa0d778ddab4301f0dda90" exitCode=255 Mar 08 04:00:47.589581 master-0 kubenswrapper[18592]: I0308 04:00:47.589451 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-z4sdd" 
event={"ID":"127c3f92-8283-4179-9e40-a12dcabaaa12","Type":"ContainerDied","Data":"8eae55733094f3b0776d6b5120bc7a09315bf36094aa0d778ddab4301f0dda90"} Mar 08 04:00:47.590453 master-0 kubenswrapper[18592]: I0308 04:00:47.590399 18592 scope.go:117] "RemoveContainer" containerID="8eae55733094f3b0776d6b5120bc7a09315bf36094aa0d778ddab4301f0dda90" Mar 08 04:00:48.206671 master-0 kubenswrapper[18592]: I0308 04:00:48.206524 18592 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 08 04:00:48.207026 master-0 kubenswrapper[18592]: I0308 04:00:48.206671 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="0580c83f64e952a7a614903b6fdf6965" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 08 04:00:48.601521 master-0 kubenswrapper[18592]: I0308 04:00:48.601360 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-machine-approver_machine-approver-754bdc9f9d-z4sdd_127c3f92-8283-4179-9e40-a12dcabaaa12/machine-approver-controller/0.log" Mar 08 04:00:48.602396 master-0 kubenswrapper[18592]: I0308 04:00:48.602029 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-z4sdd" event={"ID":"127c3f92-8283-4179-9e40-a12dcabaaa12","Type":"ContainerStarted","Data":"ddb1e95ef886df86aa8972a27018c1b8970414c6a1f2b5df0041f62281dcb026"} Mar 08 04:00:49.614327 master-0 kubenswrapper[18592]: I0308 04:00:49.614138 18592 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-jghp5_d831cb23-7411-4072-8273-c167d9afca28/cluster-baremetal-operator/1.log" Mar 08 04:00:49.616733 master-0 kubenswrapper[18592]: I0308 04:00:49.616680 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-jghp5_d831cb23-7411-4072-8273-c167d9afca28/cluster-baremetal-operator/0.log" Mar 08 04:00:49.616909 master-0 kubenswrapper[18592]: I0308 04:00:49.616762 18592 generic.go:334] "Generic (PLEG): container finished" podID="d831cb23-7411-4072-8273-c167d9afca28" containerID="d31110b071a8e480e8de5e3d670b44efd0f87c9bdf6fe27693931341b4131afe" exitCode=1 Mar 08 04:00:49.616909 master-0 kubenswrapper[18592]: I0308 04:00:49.616813 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-jghp5" event={"ID":"d831cb23-7411-4072-8273-c167d9afca28","Type":"ContainerDied","Data":"d31110b071a8e480e8de5e3d670b44efd0f87c9bdf6fe27693931341b4131afe"} Mar 08 04:00:49.617068 master-0 kubenswrapper[18592]: I0308 04:00:49.616917 18592 scope.go:117] "RemoveContainer" containerID="712603a1b97b084eebc58893e05cde574b9f0f2e5360a98b0fe0e6acfea60707" Mar 08 04:00:49.617635 master-0 kubenswrapper[18592]: I0308 04:00:49.617575 18592 scope.go:117] "RemoveContainer" containerID="d31110b071a8e480e8de5e3d670b44efd0f87c9bdf6fe27693931341b4131afe" Mar 08 04:00:50.501504 master-0 kubenswrapper[18592]: E0308 04:00:50.501347 18592 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 04:00:50.629322 master-0 kubenswrapper[18592]: I0308 04:00:50.629225 18592 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-jghp5_d831cb23-7411-4072-8273-c167d9afca28/cluster-baremetal-operator/1.log" Mar 08 04:00:50.630379 master-0 kubenswrapper[18592]: I0308 04:00:50.629762 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-jghp5" event={"ID":"d831cb23-7411-4072-8273-c167d9afca28","Type":"ContainerStarted","Data":"6db7bba8b22fb8aa1570ff3e56f6b75b965e844f4559f957eeceeaa7d80a64c0"} Mar 08 04:00:50.675816 master-0 kubenswrapper[18592]: I0308 04:00:50.675725 18592 patch_prober.go:28] interesting pod/console-744db48f96-lgsd4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" start-of-body= Mar 08 04:00:50.676119 master-0 kubenswrapper[18592]: I0308 04:00:50.675856 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-744db48f96-lgsd4" podUID="935ab7fb-b097-41c3-8926-8343eb29e7fc" containerName="console" probeResult="failure" output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" Mar 08 04:00:53.143280 master-0 kubenswrapper[18592]: I0308 04:00:53.143164 18592 scope.go:117] "RemoveContainer" containerID="bd7c785d8a016e94e8bed926d46c88e1665bdf7a861bc764c57c6fa4c6daf951" Mar 08 04:00:53.302807 master-0 kubenswrapper[18592]: I0308 04:00:53.302691 18592 patch_prober.go:28] interesting pod/console-7748864899-8p6h5 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body= Mar 08 04:00:53.303030 master-0 kubenswrapper[18592]: I0308 04:00:53.302816 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-7748864899-8p6h5" podUID="5bc0469d-1ae9-4606-ba99-7a333b66af37" 
containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" Mar 08 04:00:53.667667 master-0 kubenswrapper[18592]: I0308 04:00:53.667569 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-h4qlp_9ec89e27-4360-48f2-a7ca-5d823bda4510/snapshot-controller/2.log" Mar 08 04:00:53.667667 master-0 kubenswrapper[18592]: I0308 04:00:53.667665 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-h4qlp" event={"ID":"9ec89e27-4360-48f2-a7ca-5d823bda4510","Type":"ContainerStarted","Data":"d35e35bce6407453877c8123c5425d614cf068904ff2cbaabddff838a6a4b536"} Mar 08 04:00:57.699916 master-0 kubenswrapper[18592]: E0308 04:00:57.699752 18592 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event=< Mar 08 04:00:57.699916 master-0 kubenswrapper[18592]: &Event{ObjectMeta:{console-744db48f96-lgsd4.189ac19a0b049b8c openshift-console 14598 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-console,Name:console-744db48f96-lgsd4,UID:935ab7fb-b097-41c3-8926-8343eb29e7fc,APIVersion:v1,ResourceVersion:14344,FieldPath:spec.containers{console},},Reason:ProbeError,Message:Startup probe error: Get "https://10.128.0.96:8443/health": dial tcp 10.128.0.96:8443: connect: connection refused Mar 08 04:00:57.699916 master-0 kubenswrapper[18592]: body: Mar 08 04:00:57.699916 master-0 kubenswrapper[18592]: ,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:57:50 +0000 UTC,LastTimestamp:2026-03-08 03:58:10.676177719 +0000 UTC m=+302.774932099,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,} Mar 08 04:00:57.699916 master-0 kubenswrapper[18592]: > Mar 08 04:00:58.139821 master-0 kubenswrapper[18592]: E0308 04:00:58.139516 18592 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Mar 08 04:00:58.208455 master-0 kubenswrapper[18592]: I0308 04:00:58.208328 18592 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 08 04:00:58.209041 master-0 kubenswrapper[18592]: I0308 04:00:58.208456 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="0580c83f64e952a7a614903b6fdf6965" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 08 04:00:58.209041 master-0 kubenswrapper[18592]: I0308 04:00:58.208545 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 04:00:58.209945 master-0 kubenswrapper[18592]: I0308 04:00:58.209881 18592 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"b2e2f03048172100d18502fe8929d9dbf909b8a6fed9417d23d6e4cec64856d5"} 
pod="openshift-kube-controller-manager/kube-controller-manager-master-0" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 08 04:00:58.210145 master-0 kubenswrapper[18592]: I0308 04:00:58.210087 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="0580c83f64e952a7a614903b6fdf6965" containerName="cluster-policy-controller" containerID="cri-o://b2e2f03048172100d18502fe8929d9dbf909b8a6fed9417d23d6e4cec64856d5" gracePeriod=30 Mar 08 04:00:58.718857 master-0 kubenswrapper[18592]: I0308 04:00:58.718754 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_0580c83f64e952a7a614903b6fdf6965/cluster-policy-controller/1.log" Mar 08 04:00:58.722862 master-0 kubenswrapper[18592]: I0308 04:00:58.722749 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_0580c83f64e952a7a614903b6fdf6965/kube-controller-manager/0.log" Mar 08 04:00:58.723088 master-0 kubenswrapper[18592]: I0308 04:00:58.722929 18592 generic.go:334] "Generic (PLEG): container finished" podID="0580c83f64e952a7a614903b6fdf6965" containerID="b2e2f03048172100d18502fe8929d9dbf909b8a6fed9417d23d6e4cec64856d5" exitCode=255 Mar 08 04:00:58.723088 master-0 kubenswrapper[18592]: I0308 04:00:58.723010 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"0580c83f64e952a7a614903b6fdf6965","Type":"ContainerDied","Data":"b2e2f03048172100d18502fe8929d9dbf909b8a6fed9417d23d6e4cec64856d5"} Mar 08 04:00:58.723088 master-0 kubenswrapper[18592]: I0308 04:00:58.723083 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" 
event={"ID":"0580c83f64e952a7a614903b6fdf6965","Type":"ContainerStarted","Data":"856e6d6f220b377eb08036d501a1ca7d92cb2b4e922c6f058e4ce28c5b94addf"} Mar 08 04:00:58.723421 master-0 kubenswrapper[18592]: I0308 04:00:58.723113 18592 scope.go:117] "RemoveContainer" containerID="ade4a3d46dcebb1e326fada73ac4cc99f5151e1191d54ff04a063230922fd053" Mar 08 04:00:59.736759 master-0 kubenswrapper[18592]: I0308 04:00:59.736562 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_0580c83f64e952a7a614903b6fdf6965/cluster-policy-controller/1.log" Mar 08 04:00:59.740502 master-0 kubenswrapper[18592]: I0308 04:00:59.740431 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_0580c83f64e952a7a614903b6fdf6965/kube-controller-manager/0.log" Mar 08 04:01:00.501794 master-0 kubenswrapper[18592]: E0308 04:01:00.501632 18592 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 04:01:00.501794 master-0 kubenswrapper[18592]: E0308 04:01:00.501729 18592 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 08 04:01:00.676223 master-0 kubenswrapper[18592]: I0308 04:01:00.676144 18592 patch_prober.go:28] interesting pod/console-744db48f96-lgsd4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" start-of-body= Mar 08 04:01:00.676471 master-0 kubenswrapper[18592]: I0308 04:01:00.676218 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-744db48f96-lgsd4" podUID="935ab7fb-b097-41c3-8926-8343eb29e7fc" 
containerName="console" probeResult="failure" output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" Mar 08 04:01:03.302859 master-0 kubenswrapper[18592]: I0308 04:01:03.302715 18592 patch_prober.go:28] interesting pod/console-7748864899-8p6h5 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body= Mar 08 04:01:03.304057 master-0 kubenswrapper[18592]: I0308 04:01:03.302908 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-7748864899-8p6h5" podUID="5bc0469d-1ae9-4606-ba99-7a333b66af37" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" Mar 08 04:01:04.375563 master-0 kubenswrapper[18592]: E0308 04:01:04.375456 18592 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0" Mar 08 04:01:04.789300 master-0 kubenswrapper[18592]: I0308 04:01:04.789218 18592 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="3ecf3d00-32ee-49db-bcf4-1bdb6f372c65" Mar 08 04:01:04.789300 master-0 kubenswrapper[18592]: I0308 04:01:04.789273 18592 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="3ecf3d00-32ee-49db-bcf4-1bdb6f372c65" Mar 08 04:01:05.206896 master-0 kubenswrapper[18592]: I0308 04:01:05.206785 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 04:01:05.212067 master-0 kubenswrapper[18592]: I0308 04:01:05.207699 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 
04:01:08.153293 master-0 kubenswrapper[18592]: I0308 04:01:08.153221 18592 status_manager.go:851] "Failed to get status for pod" podUID="164586b1-f133-4427-8ab6-eb0839b79738" pod="openshift-network-node-identity/network-node-identity-ggzm8" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods network-node-identity-ggzm8)" Mar 08 04:01:08.208988 master-0 kubenswrapper[18592]: I0308 04:01:08.208892 18592 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 08 04:01:08.209263 master-0 kubenswrapper[18592]: I0308 04:01:08.208996 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="0580c83f64e952a7a614903b6fdf6965" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 08 04:01:10.675965 master-0 kubenswrapper[18592]: I0308 04:01:10.675790 18592 patch_prober.go:28] interesting pod/console-744db48f96-lgsd4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" start-of-body= Mar 08 04:01:10.677041 master-0 kubenswrapper[18592]: I0308 04:01:10.675971 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-744db48f96-lgsd4" podUID="935ab7fb-b097-41c3-8926-8343eb29e7fc" containerName="console" probeResult="failure" output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection 
refused" Mar 08 04:01:13.302677 master-0 kubenswrapper[18592]: I0308 04:01:13.302584 18592 patch_prober.go:28] interesting pod/console-7748864899-8p6h5 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body= Mar 08 04:01:13.303598 master-0 kubenswrapper[18592]: I0308 04:01:13.302676 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-7748864899-8p6h5" podUID="5bc0469d-1ae9-4606-ba99-7a333b66af37" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" Mar 08 04:01:15.141181 master-0 kubenswrapper[18592]: E0308 04:01:15.141021 18592 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Mar 08 04:01:18.207135 master-0 kubenswrapper[18592]: I0308 04:01:18.207037 18592 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 08 04:01:18.208251 master-0 kubenswrapper[18592]: I0308 04:01:18.207135 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="0580c83f64e952a7a614903b6fdf6965" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 
08 04:01:19.993621 master-0 kubenswrapper[18592]: E0308 04:01:19.993554 18592 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 08 04:01:19.993621 master-0 kubenswrapper[18592]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-6fd77597b5-w649n_openshift-authentication_836c1be1-26de-4840-8ba6-9d34a751aebc_0(992946af31867424dbf7df9751fe49bb08be6d61612fc24fd707f2795d193206): error adding pod openshift-authentication_oauth-openshift-6fd77597b5-w649n to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"992946af31867424dbf7df9751fe49bb08be6d61612fc24fd707f2795d193206" Netns:"/var/run/netns/f2db3fa7-4996-4d95-be4a-16dc678e9c5f" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-6fd77597b5-w649n;K8S_POD_INFRA_CONTAINER_ID=992946af31867424dbf7df9751fe49bb08be6d61612fc24fd707f2795d193206;K8S_POD_UID=836c1be1-26de-4840-8ba6-9d34a751aebc" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-6fd77597b5-w649n] networking: Multus: [openshift-authentication/oauth-openshift-6fd77597b5-w649n/836c1be1-26de-4840-8ba6-9d34a751aebc]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod oauth-openshift-6fd77597b5-w649n in out of cluster comm: SetNetworkStatus: failed to update the pod oauth-openshift-6fd77597b5-w649n in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-6fd77597b5-w649n?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 08 04:01:19.993621 master-0 kubenswrapper[18592]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 08 04:01:19.993621 master-0 kubenswrapper[18592]: > Mar 08 04:01:19.994578 master-0 kubenswrapper[18592]: E0308 04:01:19.993633 18592 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 08 04:01:19.994578 master-0 kubenswrapper[18592]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-6fd77597b5-w649n_openshift-authentication_836c1be1-26de-4840-8ba6-9d34a751aebc_0(992946af31867424dbf7df9751fe49bb08be6d61612fc24fd707f2795d193206): error adding pod openshift-authentication_oauth-openshift-6fd77597b5-w649n to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"992946af31867424dbf7df9751fe49bb08be6d61612fc24fd707f2795d193206" Netns:"/var/run/netns/f2db3fa7-4996-4d95-be4a-16dc678e9c5f" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-6fd77597b5-w649n;K8S_POD_INFRA_CONTAINER_ID=992946af31867424dbf7df9751fe49bb08be6d61612fc24fd707f2795d193206;K8S_POD_UID=836c1be1-26de-4840-8ba6-9d34a751aebc" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-6fd77597b5-w649n] networking: Multus: [openshift-authentication/oauth-openshift-6fd77597b5-w649n/836c1be1-26de-4840-8ba6-9d34a751aebc]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod oauth-openshift-6fd77597b5-w649n in out of cluster comm: SetNetworkStatus: failed to update the pod oauth-openshift-6fd77597b5-w649n in out of cluster comm: status update failed for pod /: 
Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-6fd77597b5-w649n?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 08 04:01:19.994578 master-0 kubenswrapper[18592]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 08 04:01:19.994578 master-0 kubenswrapper[18592]: > pod="openshift-authentication/oauth-openshift-6fd77597b5-w649n" Mar 08 04:01:19.994578 master-0 kubenswrapper[18592]: E0308 04:01:19.993652 18592 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 08 04:01:19.994578 master-0 kubenswrapper[18592]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-6fd77597b5-w649n_openshift-authentication_836c1be1-26de-4840-8ba6-9d34a751aebc_0(992946af31867424dbf7df9751fe49bb08be6d61612fc24fd707f2795d193206): error adding pod openshift-authentication_oauth-openshift-6fd77597b5-w649n to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"992946af31867424dbf7df9751fe49bb08be6d61612fc24fd707f2795d193206" Netns:"/var/run/netns/f2db3fa7-4996-4d95-be4a-16dc678e9c5f" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-6fd77597b5-w649n;K8S_POD_INFRA_CONTAINER_ID=992946af31867424dbf7df9751fe49bb08be6d61612fc24fd707f2795d193206;K8S_POD_UID=836c1be1-26de-4840-8ba6-9d34a751aebc" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-6fd77597b5-w649n] networking: Multus: 
[openshift-authentication/oauth-openshift-6fd77597b5-w649n/836c1be1-26de-4840-8ba6-9d34a751aebc]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod oauth-openshift-6fd77597b5-w649n in out of cluster comm: SetNetworkStatus: failed to update the pod oauth-openshift-6fd77597b5-w649n in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-6fd77597b5-w649n?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 08 04:01:19.994578 master-0 kubenswrapper[18592]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 08 04:01:19.994578 master-0 kubenswrapper[18592]: > pod="openshift-authentication/oauth-openshift-6fd77597b5-w649n" Mar 08 04:01:19.994578 master-0 kubenswrapper[18592]: E0308 04:01:19.993708 18592 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"oauth-openshift-6fd77597b5-w649n_openshift-authentication(836c1be1-26de-4840-8ba6-9d34a751aebc)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"oauth-openshift-6fd77597b5-w649n_openshift-authentication(836c1be1-26de-4840-8ba6-9d34a751aebc)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-6fd77597b5-w649n_openshift-authentication_836c1be1-26de-4840-8ba6-9d34a751aebc_0(992946af31867424dbf7df9751fe49bb08be6d61612fc24fd707f2795d193206): error adding pod openshift-authentication_oauth-openshift-6fd77597b5-w649n to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" 
name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"992946af31867424dbf7df9751fe49bb08be6d61612fc24fd707f2795d193206\\\" Netns:\\\"/var/run/netns/f2db3fa7-4996-4d95-be4a-16dc678e9c5f\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-6fd77597b5-w649n;K8S_POD_INFRA_CONTAINER_ID=992946af31867424dbf7df9751fe49bb08be6d61612fc24fd707f2795d193206;K8S_POD_UID=836c1be1-26de-4840-8ba6-9d34a751aebc\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-6fd77597b5-w649n] networking: Multus: [openshift-authentication/oauth-openshift-6fd77597b5-w649n/836c1be1-26de-4840-8ba6-9d34a751aebc]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod oauth-openshift-6fd77597b5-w649n in out of cluster comm: SetNetworkStatus: failed to update the pod oauth-openshift-6fd77597b5-w649n in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-6fd77597b5-w649n?timeout=1m0s\\\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-authentication/oauth-openshift-6fd77597b5-w649n" podUID="836c1be1-26de-4840-8ba6-9d34a751aebc" Mar 08 04:01:20.676032 master-0 kubenswrapper[18592]: I0308 04:01:20.675974 18592 patch_prober.go:28] interesting pod/console-744db48f96-lgsd4 
container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" start-of-body= Mar 08 04:01:20.676215 master-0 kubenswrapper[18592]: I0308 04:01:20.676052 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-744db48f96-lgsd4" podUID="935ab7fb-b097-41c3-8926-8343eb29e7fc" containerName="console" probeResult="failure" output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" Mar 08 04:01:20.797465 master-0 kubenswrapper[18592]: E0308 04:01:20.797203 18592 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T04:01:10Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T04:01:10Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T04:01:10Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T04:01:10Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3e7365fa46219476560dd59d3a82f041546a33f0935c57eb4f3274ab3118ef0b\\\"],\\\"sizeBytes\\\":2895821940},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:ae042a5d32eb2f18d537f2068849e665b55df7d8360daedaaeea98bd2a79e769\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:d077bbabe6cb885ed229119008480493e8364e4bfddaa00b099f68c52b016e6b\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1733328350},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:82f121f9d021a9843b9458f9f222c40f292f2c21dcfc
f00f05daacaca8a949c0\\\"],\\\"sizeBytes\\\":1637445817},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:063b8972231e65eb43f6545ba37804f68138dc54d97b91a652a1c5bc7dc76aa5\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:cf682d23b2857e455609879a0867d171a221c18e2cec995dd79570b77c5a4705\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1272201949},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:381e96959e3c3b08a3e2715e6024697ae14af31bd0378b49f583e984b3b9a192\\\"],\\\"sizeBytes\\\":1238047254},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:e0c034ae18daa01af8d073f8cc24ae4af87883c664304910eab1167fdfd60c0b\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:ef0c6b9e405f7a452211e063ce07ded04ccbe38b53860bfd71b5a7cd5072830a\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1229556414},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:79984dfbdf9aeae3985c7fd7515e12328775c0e7fc4782929d0998f4dd2a87c6\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:7be89499615ec913d0fe40ca89682080a3f1181a066dbc501c877cc7ccbcc9ae\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1220167376},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c9330c756dd6ab107e9a4b671bc52742c90d5be11a8380d8b710e2bd4e0ed43c\\\"],\\\"sizeBytes\\\":992610645},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\\\"],\\\"sizeBytes\\\":943837171},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ff40e33e63d6c1f4e4393d5506e38def25ba20582d980fec8b81f81c867ceeec\\\"],\\\"sizeBytes\\\":918278686},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:042e6a37747405da54cf91543d44408c9531327a2cce653c41ca851
aa7c896d8\\\"],\\\"sizeBytes\\\":880378279},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e207c762b7802ee0e54507d21ed1f25b19eddc511a4b824934c16c163193be6a\\\"],\\\"sizeBytes\\\":876146500},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:41dbd66e9a886c1fd7a99752f358c6125a209e83c0dd37b35730baae58d82ee8\\\"],\\\"sizeBytes\\\":862633255},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2508a5f66e509e813cb09825b5456be91b4cdd4d02f470f22a33de42c753f2b7\\\"],\\\"sizeBytes\\\":862197440},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9bfcd8017eede3fb66fa3f5b47c27508b787d38455689154461f0e6a5dc303ff\\\"],\\\"sizeBytes\\\":772939850},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9c946fdc5a4cd16ff998c17844780e7efc38f7f38b97a8a40d75cd77b318ddef\\\"],\\\"sizeBytes\\\":687947017},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0c03cb25dc6f6a865529ebc979e8d7d08492b28fd3fb93beddf30e1cb06f1245\\\"],\\\"sizeBytes\\\":683169303},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3f34dc492c80a3dee4643cc2291044750ac51e6e919b973de8723fa8b70bde70\\\"],\\\"sizeBytes\\\":677929075},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:db06a0e0308b2e541c7bb2d11517431abb31133b2ce6cb6c34ecf5ef4188a4e8\\\"],\\\"sizeBytes\\\":633876767},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a149ed17b20a7577fceacfc5198f8b7b3edf314ee22f77bd6ab87f06a3aa17f3\\\"],\\\"sizeBytes\\\":621647686},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3cdb019b6769514c0e92ef92da73e914fbcf6254cc919677ee077c93ce324de0\\\"],\\\"sizeBytes\\\":605698200},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1ec9d3dbcc6f9817c0f6d09f64c0d98c91b03afbb1fcb3c1e1718aca900754b\\\"],\\\"sizeBytes\\\":589379637},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:1575be013a898f153cbf012aeaf28ce720022f934dc05bdffbe479e30999d460\\\"],\\\"sizeBytes\\\":582153879},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:eb82e437a701ce83b70e56be8477d987da67578714dda3d9fa6628804b1b56f5\\\"],\\\"sizeBytes\\\":558210153},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d470dba32064cc62b2ab29303d6e00612304548262eaa2f4e5b40a00a26f71ce\\\"],\\\"sizeBytes\\\":557426734},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:28f33d62fd0b94c5ea0ebcd7a4216848c8dd671a38d901ce98f4c399b700e1c7\\\"],\\\"sizeBytes\\\":548751793},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cc20748723f55f960cfb6328d1591880bbd1b3452155633996d4f41fc7c5f46b\\\"],\\\"sizeBytes\\\":529324693},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ac6f0695d3386e6d601f4ae507940981352fa3ad884b0fed6fb25698c5e6f916\\\"],\\\"sizeBytes\\\":528946249},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6088910bdc1583b275fab261e3234c0b63b4cc16d01bcea697b6a7f6db13bdf3\\\"],\\\"sizeBytes\\\":518384455},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-release@sha256:14bd3c04daa885009785d48f4973e2890751a7ec116cc14d17627245cda54d7b\\\"],\\\"sizeBytes\\\":517997625},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5500329ab50804678fb8a90b96bf2a469bca16b620fb6dd2f5f5a17106e94898\\\"],\\\"sizeBytes\\\":514980169},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9bd0b71d620cf0acbfcd1b58797dc30050bd167cb6b7a7f62c8333dd370c76d5\\\"],\\\"sizeBytes\\\":513581866},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9bd818e37e1f9dbe5393c557b89e81010d68171408e0e4157a3d92ae0ca1c953\\\"],\\\"sizeBytes\\\":513220825},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d601c8437b4d8bbe2da0f3b08f1bd8693f5a4ef6d835377ec029c79d9dca5dab\\\"],\\\"sizeByt
es\\\":512273539},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ca868abfecbf9a9c414a4c79e57c4c55e62c8a6796f899ba59dde86c4cf4bb\\\"],\\\"sizeBytes\\\":512235767},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b47d2b146e833bc1612a652136f43afcf1ba30f32cbd0a2f06ca9fc80d969f0\\\"],\\\"sizeBytes\\\":511226810},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:834063dd26fb3d2489e193489198a0d5fbe9c775a0e30173e5fcef6994fbf0f6\\\"],\\\"sizeBytes\\\":511164376},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee46e13e26156c904e5784e2d64511021ed0974a169ccd6476b05bff1c44ec56\\\"],\\\"sizeBytes\\\":508888174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7220d16ea511c0f0410cf45db45aaafcc64847c9cb5732ad1eff39ceb482cdba\\\"],\\\"sizeBytes\\\":508544235},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:526c5c02a8fa86a2fa83a7087d4a5c4b1c4072c0f3906163494cc3b3c1295e9b\\\"],\\\"sizeBytes\\\":507967997},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4010a8f9d932615336227e2fd43325d4fa9025dca4bebe032106efea733fcfc3\\\"],\\\"sizeBytes\\\":506479655},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:76b719f5bd541eb1a8bae124d650896b533e7bc3107be536e598b3ab4e135282\\\"],\\\"sizeBytes\\\":506394574},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5de69354d08184ecd6144facc1461777674674e8304971216d4cf1a5025472b9\\\"],\\\"sizeBytes\\\":505344964},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a324f47cf789c0480fa4bcb0812152abc3cd844318bab193108fe4349eed609\\\"],\\\"sizeBytes\\\":505242594},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b8cb5e0caeca0fb02f3e8c72b7ddf1c49e3c602e42e119ba30c60525f1db1821\\\"],\\\"sizeBytes\\\":504658657},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d11f1
3e867f4df046ca6789bb7273da5d0c08895b3dea00949c8a5458f9e22f9\\\"],\\\"sizeBytes\\\":504623546},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2c8f904c1084450856b501d40bbc9246265fe34a2b70efec23541e3285da7f88\\\"],\\\"sizeBytes\\\":502712961},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:76bdc35338c4d0f5e5b9448fb73e3578656f908a962286692e12a0372ec721d5\\\"],\\\"sizeBytes\\\":495994161},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ff2db11ce277288befab25ddb86177e832842d2edb5607a2da8f252a030e1cfc\\\"],\\\"sizeBytes\\\":495064829},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9b2e765b795c30c910c331c85226e5db0d56463b6c81d79ded739cba76e2b032\\\"],\\\"sizeBytes\\\":487151732}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"runc\\\"}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 04:01:20.928299 master-0 kubenswrapper[18592]: I0308 04:01:20.928113 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6fd77597b5-w649n" Mar 08 04:01:20.928947 master-0 kubenswrapper[18592]: I0308 04:01:20.928904 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-6fd77597b5-w649n" Mar 08 04:01:23.302196 master-0 kubenswrapper[18592]: I0308 04:01:23.302132 18592 patch_prober.go:28] interesting pod/console-7748864899-8p6h5 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body= Mar 08 04:01:23.302763 master-0 kubenswrapper[18592]: I0308 04:01:23.302222 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-7748864899-8p6h5" podUID="5bc0469d-1ae9-4606-ba99-7a333b66af37" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" Mar 08 04:01:23.957695 master-0 kubenswrapper[18592]: I0308 04:01:23.957598 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-h4qlp_9ec89e27-4360-48f2-a7ca-5d823bda4510/snapshot-controller/3.log" Mar 08 04:01:23.958604 master-0 kubenswrapper[18592]: I0308 04:01:23.958549 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-h4qlp_9ec89e27-4360-48f2-a7ca-5d823bda4510/snapshot-controller/2.log" Mar 08 04:01:23.958727 master-0 kubenswrapper[18592]: I0308 04:01:23.958622 18592 generic.go:334] "Generic (PLEG): container finished" podID="9ec89e27-4360-48f2-a7ca-5d823bda4510" containerID="d35e35bce6407453877c8123c5425d614cf068904ff2cbaabddff838a6a4b536" exitCode=1 Mar 08 04:01:23.958727 master-0 kubenswrapper[18592]: I0308 04:01:23.958663 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-h4qlp" event={"ID":"9ec89e27-4360-48f2-a7ca-5d823bda4510","Type":"ContainerDied","Data":"d35e35bce6407453877c8123c5425d614cf068904ff2cbaabddff838a6a4b536"} 
Mar 08 04:01:23.958727 master-0 kubenswrapper[18592]: I0308 04:01:23.958718 18592 scope.go:117] "RemoveContainer" containerID="bd7c785d8a016e94e8bed926d46c88e1665bdf7a861bc764c57c6fa4c6daf951" Mar 08 04:01:23.959820 master-0 kubenswrapper[18592]: I0308 04:01:23.959739 18592 scope.go:117] "RemoveContainer" containerID="d35e35bce6407453877c8123c5425d614cf068904ff2cbaabddff838a6a4b536" Mar 08 04:01:23.960371 master-0 kubenswrapper[18592]: E0308 04:01:23.960288 18592 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=snapshot-controller pod=csi-snapshot-controller-7577d6f48-h4qlp_openshift-cluster-storage-operator(9ec89e27-4360-48f2-a7ca-5d823bda4510)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-h4qlp" podUID="9ec89e27-4360-48f2-a7ca-5d823bda4510" Mar 08 04:01:24.969420 master-0 kubenswrapper[18592]: I0308 04:01:24.969305 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-h4qlp_9ec89e27-4360-48f2-a7ca-5d823bda4510/snapshot-controller/3.log" Mar 08 04:01:28.207168 master-0 kubenswrapper[18592]: I0308 04:01:28.207068 18592 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 08 04:01:28.208111 master-0 kubenswrapper[18592]: I0308 04:01:28.207180 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="0580c83f64e952a7a614903b6fdf6965" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: 
request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 08 04:01:28.208111 master-0 kubenswrapper[18592]: I0308 04:01:28.207253 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 04:01:28.208516 master-0 kubenswrapper[18592]: I0308 04:01:28.208327 18592 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"856e6d6f220b377eb08036d501a1ca7d92cb2b4e922c6f058e4ce28c5b94addf"} pod="openshift-kube-controller-manager/kube-controller-manager-master-0" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 08 04:01:28.208654 master-0 kubenswrapper[18592]: I0308 04:01:28.208522 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="0580c83f64e952a7a614903b6fdf6965" containerName="cluster-policy-controller" containerID="cri-o://856e6d6f220b377eb08036d501a1ca7d92cb2b4e922c6f058e4ce28c5b94addf" gracePeriod=30 Mar 08 04:01:29.008119 master-0 kubenswrapper[18592]: I0308 04:01:29.008012 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_0580c83f64e952a7a614903b6fdf6965/cluster-policy-controller/2.log" Mar 08 04:01:29.009149 master-0 kubenswrapper[18592]: I0308 04:01:29.009075 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_0580c83f64e952a7a614903b6fdf6965/cluster-policy-controller/1.log" Mar 08 04:01:29.013478 master-0 kubenswrapper[18592]: I0308 04:01:29.013371 18592 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_0580c83f64e952a7a614903b6fdf6965/kube-controller-manager/0.log" Mar 08 04:01:29.013897 master-0 kubenswrapper[18592]: I0308 04:01:29.013692 18592 generic.go:334] "Generic (PLEG): container finished" podID="0580c83f64e952a7a614903b6fdf6965" containerID="856e6d6f220b377eb08036d501a1ca7d92cb2b4e922c6f058e4ce28c5b94addf" exitCode=255 Mar 08 04:01:29.013897 master-0 kubenswrapper[18592]: I0308 04:01:29.013818 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"0580c83f64e952a7a614903b6fdf6965","Type":"ContainerDied","Data":"856e6d6f220b377eb08036d501a1ca7d92cb2b4e922c6f058e4ce28c5b94addf"} Mar 08 04:01:29.014196 master-0 kubenswrapper[18592]: I0308 04:01:29.013907 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"0580c83f64e952a7a614903b6fdf6965","Type":"ContainerStarted","Data":"8a427b90a89c6374fb3a2672156c087d21ba1d11bd55939844dd05cb8406e569"} Mar 08 04:01:29.014196 master-0 kubenswrapper[18592]: I0308 04:01:29.013948 18592 scope.go:117] "RemoveContainer" containerID="b2e2f03048172100d18502fe8929d9dbf909b8a6fed9417d23d6e4cec64856d5" Mar 08 04:01:30.026928 master-0 kubenswrapper[18592]: I0308 04:01:30.026803 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_0580c83f64e952a7a614903b6fdf6965/cluster-policy-controller/2.log" Mar 08 04:01:30.030216 master-0 kubenswrapper[18592]: I0308 04:01:30.030148 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_0580c83f64e952a7a614903b6fdf6965/kube-controller-manager/0.log" Mar 08 04:01:30.676023 master-0 kubenswrapper[18592]: I0308 04:01:30.675896 18592 patch_prober.go:28] interesting 
pod/console-744db48f96-lgsd4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" start-of-body= Mar 08 04:01:30.676023 master-0 kubenswrapper[18592]: I0308 04:01:30.676023 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-744db48f96-lgsd4" podUID="935ab7fb-b097-41c3-8926-8343eb29e7fc" containerName="console" probeResult="failure" output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" Mar 08 04:01:30.798503 master-0 kubenswrapper[18592]: E0308 04:01:30.798404 18592 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 04:01:31.703943 master-0 kubenswrapper[18592]: E0308 04:01:31.703686 18592 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{console-744db48f96-lgsd4.189ac19a0b056f45 openshift-console 14599 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-console,Name:console-744db48f96-lgsd4,UID:935ab7fb-b097-41c3-8926-8343eb29e7fc,APIVersion:v1,ResourceVersion:14344,FieldPath:spec.containers{console},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:57:50 +0000 UTC,LastTimestamp:2026-03-08 03:58:10.676243001 +0000 UTC m=+302.774997381,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 04:01:32.142800 master-0 kubenswrapper[18592]: E0308 04:01:32.142613 18592 controller.go:145] "Failed to ensure lease exists, will retry" err="the server was unable to return a response in the time allotted, but may still be processing the request (get leases.coordination.k8s.io master-0)" interval="7s" Mar 08 04:01:33.302798 master-0 kubenswrapper[18592]: I0308 04:01:33.302703 18592 patch_prober.go:28] interesting pod/console-7748864899-8p6h5 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body= Mar 08 04:01:33.303694 master-0 kubenswrapper[18592]: I0308 04:01:33.302803 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-7748864899-8p6h5" podUID="5bc0469d-1ae9-4606-ba99-7a333b66af37" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" Mar 08 04:01:35.143291 master-0 kubenswrapper[18592]: I0308 04:01:35.143218 18592 scope.go:117] "RemoveContainer" containerID="d35e35bce6407453877c8123c5425d614cf068904ff2cbaabddff838a6a4b536" Mar 08 04:01:35.144251 master-0 kubenswrapper[18592]: E0308 04:01:35.143595 18592 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=snapshot-controller pod=csi-snapshot-controller-7577d6f48-h4qlp_openshift-cluster-storage-operator(9ec89e27-4360-48f2-a7ca-5d823bda4510)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-h4qlp" podUID="9ec89e27-4360-48f2-a7ca-5d823bda4510" Mar 08 04:01:35.206721 master-0 kubenswrapper[18592]: I0308 04:01:35.206608 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 04:01:35.207130 master-0 kubenswrapper[18592]: I0308 04:01:35.207029 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 04:01:38.207253 master-0 kubenswrapper[18592]: I0308 04:01:38.207145 18592 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 08 04:01:38.207253 master-0 kubenswrapper[18592]: I0308 04:01:38.207231 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="0580c83f64e952a7a614903b6fdf6965" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 08 04:01:38.793054 master-0 kubenswrapper[18592]: E0308 04:01:38.792910 18592 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0" Mar 08 04:01:40.117917 master-0 kubenswrapper[18592]: I0308 04:01:40.117813 18592 generic.go:334] "Generic (PLEG): container finished" podID="29c709c82970b529e7b9b895aa92ef05" containerID="eb43ba7b20ea7f1de138b5068cce2d258037383212c774dc5ce6a0563d09a7ae" exitCode=0 Mar 08 04:01:40.118748 master-0 kubenswrapper[18592]: I0308 04:01:40.118004 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" 
event={"ID":"29c709c82970b529e7b9b895aa92ef05","Type":"ContainerDied","Data":"eb43ba7b20ea7f1de138b5068cce2d258037383212c774dc5ce6a0563d09a7ae"} Mar 08 04:01:40.119351 master-0 kubenswrapper[18592]: I0308 04:01:40.119321 18592 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="3ecf3d00-32ee-49db-bcf4-1bdb6f372c65" Mar 08 04:01:40.119523 master-0 kubenswrapper[18592]: I0308 04:01:40.119500 18592 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="3ecf3d00-32ee-49db-bcf4-1bdb6f372c65" Mar 08 04:01:40.676030 master-0 kubenswrapper[18592]: I0308 04:01:40.675952 18592 patch_prober.go:28] interesting pod/console-744db48f96-lgsd4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" start-of-body= Mar 08 04:01:40.676509 master-0 kubenswrapper[18592]: I0308 04:01:40.676044 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-744db48f96-lgsd4" podUID="935ab7fb-b097-41c3-8926-8343eb29e7fc" containerName="console" probeResult="failure" output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" Mar 08 04:01:40.799789 master-0 kubenswrapper[18592]: E0308 04:01:40.799660 18592 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 04:01:43.144545 master-0 kubenswrapper[18592]: I0308 04:01:43.144445 18592 generic.go:334] "Generic (PLEG): container finished" podID="ee586416-6f56-4ea4-ad62-95de1e6df23b" containerID="79c3878ca5ec702bd04b79fb49bba2be6016b0025cef22be477ef3ddf3eccc0f" exitCode=0 Mar 08 04:01:43.144545 master-0 kubenswrapper[18592]: I0308 04:01:43.144505 18592 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-8f89dfddd-4mr6p" event={"ID":"ee586416-6f56-4ea4-ad62-95de1e6df23b","Type":"ContainerDied","Data":"79c3878ca5ec702bd04b79fb49bba2be6016b0025cef22be477ef3ddf3eccc0f"} Mar 08 04:01:43.145404 master-0 kubenswrapper[18592]: I0308 04:01:43.144594 18592 scope.go:117] "RemoveContainer" containerID="b993a7c9605bb38752ad78b483fef1e87627a44d0b8204e7dbbc52680443a98d" Mar 08 04:01:43.145404 master-0 kubenswrapper[18592]: I0308 04:01:43.145246 18592 scope.go:117] "RemoveContainer" containerID="79c3878ca5ec702bd04b79fb49bba2be6016b0025cef22be477ef3ddf3eccc0f" Mar 08 04:01:43.145754 master-0 kubenswrapper[18592]: E0308 04:01:43.145694 18592 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"insights-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=insights-operator pod=insights-operator-8f89dfddd-4mr6p_openshift-insights(ee586416-6f56-4ea4-ad62-95de1e6df23b)\"" pod="openshift-insights/insights-operator-8f89dfddd-4mr6p" podUID="ee586416-6f56-4ea4-ad62-95de1e6df23b" Mar 08 04:01:43.303011 master-0 kubenswrapper[18592]: I0308 04:01:43.302899 18592 patch_prober.go:28] interesting pod/console-7748864899-8p6h5 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body= Mar 08 04:01:43.303337 master-0 kubenswrapper[18592]: I0308 04:01:43.302998 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-7748864899-8p6h5" podUID="5bc0469d-1ae9-4606-ba99-7a333b66af37" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" Mar 08 04:01:47.185701 master-0 kubenswrapper[18592]: I0308 04:01:47.185620 18592 generic.go:334] "Generic (PLEG): container finished" 
podID="2262647b-c315-477a-93bd-f168c1810475" containerID="ff49e653275c39e7af57a8716c460ac2db01b59edb1ee2f75e1bdf35e77e98b5" exitCode=0 Mar 08 04:01:47.185701 master-0 kubenswrapper[18592]: I0308 04:01:47.185677 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-zq9rp" event={"ID":"2262647b-c315-477a-93bd-f168c1810475","Type":"ContainerDied","Data":"ff49e653275c39e7af57a8716c460ac2db01b59edb1ee2f75e1bdf35e77e98b5"} Mar 08 04:01:47.186805 master-0 kubenswrapper[18592]: I0308 04:01:47.186237 18592 scope.go:117] "RemoveContainer" containerID="ff49e653275c39e7af57a8716c460ac2db01b59edb1ee2f75e1bdf35e77e98b5" Mar 08 04:01:48.199922 master-0 kubenswrapper[18592]: I0308 04:01:48.199716 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-zq9rp" event={"ID":"2262647b-c315-477a-93bd-f168c1810475","Type":"ContainerStarted","Data":"42fb66a21e6b47395d028e51a501f9aa1a16a54a3b77342ccdd434688e7e997b"} Mar 08 04:01:48.206335 master-0 kubenswrapper[18592]: I0308 04:01:48.206286 18592 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 08 04:01:48.206462 master-0 kubenswrapper[18592]: I0308 04:01:48.206362 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="0580c83f64e952a7a614903b6fdf6965" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 08 04:01:49.142803 master-0 kubenswrapper[18592]: I0308 
04:01:49.142715 18592 scope.go:117] "RemoveContainer" containerID="d35e35bce6407453877c8123c5425d614cf068904ff2cbaabddff838a6a4b536" Mar 08 04:01:49.143437 master-0 kubenswrapper[18592]: E0308 04:01:49.143372 18592 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Mar 08 04:01:50.222683 master-0 kubenswrapper[18592]: I0308 04:01:50.222583 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-jghp5_d831cb23-7411-4072-8273-c167d9afca28/cluster-baremetal-operator/2.log" Mar 08 04:01:50.223929 master-0 kubenswrapper[18592]: I0308 04:01:50.223561 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-jghp5_d831cb23-7411-4072-8273-c167d9afca28/cluster-baremetal-operator/1.log" Mar 08 04:01:50.224355 master-0 kubenswrapper[18592]: I0308 04:01:50.224238 18592 generic.go:334] "Generic (PLEG): container finished" podID="d831cb23-7411-4072-8273-c167d9afca28" containerID="6db7bba8b22fb8aa1570ff3e56f6b75b965e844f4559f957eeceeaa7d80a64c0" exitCode=1 Mar 08 04:01:50.224355 master-0 kubenswrapper[18592]: I0308 04:01:50.224297 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-jghp5" event={"ID":"d831cb23-7411-4072-8273-c167d9afca28","Type":"ContainerDied","Data":"6db7bba8b22fb8aa1570ff3e56f6b75b965e844f4559f957eeceeaa7d80a64c0"} Mar 08 04:01:50.224579 master-0 kubenswrapper[18592]: I0308 04:01:50.224404 18592 scope.go:117] "RemoveContainer" containerID="d31110b071a8e480e8de5e3d670b44efd0f87c9bdf6fe27693931341b4131afe" Mar 08 04:01:50.225380 master-0 kubenswrapper[18592]: I0308 04:01:50.225311 18592 scope.go:117] "RemoveContainer" 
containerID="6db7bba8b22fb8aa1570ff3e56f6b75b965e844f4559f957eeceeaa7d80a64c0" Mar 08 04:01:50.225991 master-0 kubenswrapper[18592]: E0308 04:01:50.225912 18592 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-baremetal-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cluster-baremetal-operator pod=cluster-baremetal-operator-5cdb4c5598-jghp5_openshift-machine-api(d831cb23-7411-4072-8273-c167d9afca28)\"" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-jghp5" podUID="d831cb23-7411-4072-8273-c167d9afca28" Mar 08 04:01:50.228279 master-0 kubenswrapper[18592]: I0308 04:01:50.227938 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-h4qlp_9ec89e27-4360-48f2-a7ca-5d823bda4510/snapshot-controller/3.log" Mar 08 04:01:50.228279 master-0 kubenswrapper[18592]: I0308 04:01:50.228003 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-h4qlp" event={"ID":"9ec89e27-4360-48f2-a7ca-5d823bda4510","Type":"ContainerStarted","Data":"e2e26de1430830efa8f6b74e607c09d42160aca3122a8949f5ba15e8c2b266c1"} Mar 08 04:01:50.675952 master-0 kubenswrapper[18592]: I0308 04:01:50.675758 18592 patch_prober.go:28] interesting pod/console-744db48f96-lgsd4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" start-of-body= Mar 08 04:01:50.675952 master-0 kubenswrapper[18592]: I0308 04:01:50.675867 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-744db48f96-lgsd4" podUID="935ab7fb-b097-41c3-8926-8343eb29e7fc" containerName="console" probeResult="failure" output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" Mar 08 04:01:50.800539 master-0 
kubenswrapper[18592]: E0308 04:01:50.800440 18592 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 04:01:51.240140 master-0 kubenswrapper[18592]: I0308 04:01:51.240058 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-jghp5_d831cb23-7411-4072-8273-c167d9afca28/cluster-baremetal-operator/2.log" Mar 08 04:01:53.302880 master-0 kubenswrapper[18592]: I0308 04:01:53.302756 18592 patch_prober.go:28] interesting pod/console-7748864899-8p6h5 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body= Mar 08 04:01:53.303965 master-0 kubenswrapper[18592]: I0308 04:01:53.302882 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-7748864899-8p6h5" podUID="5bc0469d-1ae9-4606-ba99-7a333b66af37" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" Mar 08 04:01:55.143625 master-0 kubenswrapper[18592]: I0308 04:01:55.143328 18592 scope.go:117] "RemoveContainer" containerID="79c3878ca5ec702bd04b79fb49bba2be6016b0025cef22be477ef3ddf3eccc0f" Mar 08 04:01:55.143625 master-0 kubenswrapper[18592]: E0308 04:01:55.143584 18592 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"insights-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=insights-operator pod=insights-operator-8f89dfddd-4mr6p_openshift-insights(ee586416-6f56-4ea4-ad62-95de1e6df23b)\"" pod="openshift-insights/insights-operator-8f89dfddd-4mr6p" podUID="ee586416-6f56-4ea4-ad62-95de1e6df23b" Mar 08 
04:01:58.206936 master-0 kubenswrapper[18592]: I0308 04:01:58.206794 18592 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 08 04:01:58.208217 master-0 kubenswrapper[18592]: I0308 04:01:58.206947 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="0580c83f64e952a7a614903b6fdf6965" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 08 04:01:58.208217 master-0 kubenswrapper[18592]: I0308 04:01:58.207059 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 04:01:58.208726 master-0 kubenswrapper[18592]: I0308 04:01:58.208651 18592 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"8a427b90a89c6374fb3a2672156c087d21ba1d11bd55939844dd05cb8406e569"} pod="openshift-kube-controller-manager/kube-controller-manager-master-0" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 08 04:01:58.209162 master-0 kubenswrapper[18592]: I0308 04:01:58.209044 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="0580c83f64e952a7a614903b6fdf6965" containerName="cluster-policy-controller" containerID="cri-o://8a427b90a89c6374fb3a2672156c087d21ba1d11bd55939844dd05cb8406e569" gracePeriod=30 Mar 08 
04:01:58.331549 master-0 kubenswrapper[18592]: E0308 04:01:58.331466 18592 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-policy-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=cluster-policy-controller pod=kube-controller-manager-master-0_openshift-kube-controller-manager(0580c83f64e952a7a614903b6fdf6965)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="0580c83f64e952a7a614903b6fdf6965" Mar 08 04:01:59.311547 master-0 kubenswrapper[18592]: I0308 04:01:59.311481 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_0580c83f64e952a7a614903b6fdf6965/cluster-policy-controller/3.log" Mar 08 04:01:59.312610 master-0 kubenswrapper[18592]: I0308 04:01:59.312566 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_0580c83f64e952a7a614903b6fdf6965/cluster-policy-controller/2.log" Mar 08 04:01:59.315351 master-0 kubenswrapper[18592]: I0308 04:01:59.315298 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_0580c83f64e952a7a614903b6fdf6965/kube-controller-manager/0.log" Mar 08 04:01:59.315444 master-0 kubenswrapper[18592]: I0308 04:01:59.315386 18592 generic.go:334] "Generic (PLEG): container finished" podID="0580c83f64e952a7a614903b6fdf6965" containerID="8a427b90a89c6374fb3a2672156c087d21ba1d11bd55939844dd05cb8406e569" exitCode=255 Mar 08 04:01:59.315444 master-0 kubenswrapper[18592]: I0308 04:01:59.315430 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"0580c83f64e952a7a614903b6fdf6965","Type":"ContainerDied","Data":"8a427b90a89c6374fb3a2672156c087d21ba1d11bd55939844dd05cb8406e569"} Mar 08 04:01:59.315531 master-0 
kubenswrapper[18592]: I0308 04:01:59.315477 18592 scope.go:117] "RemoveContainer" containerID="856e6d6f220b377eb08036d501a1ca7d92cb2b4e922c6f058e4ce28c5b94addf" Mar 08 04:01:59.316535 master-0 kubenswrapper[18592]: I0308 04:01:59.316483 18592 scope.go:117] "RemoveContainer" containerID="8a427b90a89c6374fb3a2672156c087d21ba1d11bd55939844dd05cb8406e569" Mar 08 04:01:59.317052 master-0 kubenswrapper[18592]: E0308 04:01:59.316993 18592 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-policy-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=cluster-policy-controller pod=kube-controller-manager-master-0_openshift-kube-controller-manager(0580c83f64e952a7a614903b6fdf6965)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="0580c83f64e952a7a614903b6fdf6965" Mar 08 04:02:00.328194 master-0 kubenswrapper[18592]: I0308 04:02:00.328121 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_0580c83f64e952a7a614903b6fdf6965/cluster-policy-controller/3.log" Mar 08 04:02:00.330735 master-0 kubenswrapper[18592]: I0308 04:02:00.330679 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_0580c83f64e952a7a614903b6fdf6965/kube-controller-manager/0.log" Mar 08 04:02:00.676657 master-0 kubenswrapper[18592]: I0308 04:02:00.676476 18592 patch_prober.go:28] interesting pod/console-744db48f96-lgsd4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" start-of-body= Mar 08 04:02:00.676657 master-0 kubenswrapper[18592]: I0308 04:02:00.676596 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-744db48f96-lgsd4" podUID="935ab7fb-b097-41c3-8926-8343eb29e7fc" 
containerName="console" probeResult="failure" output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" Mar 08 04:02:00.801018 master-0 kubenswrapper[18592]: E0308 04:02:00.800945 18592 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 04:02:00.801018 master-0 kubenswrapper[18592]: E0308 04:02:00.801007 18592 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 08 04:02:03.303200 master-0 kubenswrapper[18592]: I0308 04:02:03.303114 18592 patch_prober.go:28] interesting pod/console-7748864899-8p6h5 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body= Mar 08 04:02:03.304182 master-0 kubenswrapper[18592]: I0308 04:02:03.303203 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-7748864899-8p6h5" podUID="5bc0469d-1ae9-4606-ba99-7a333b66af37" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" Mar 08 04:02:04.144099 master-0 kubenswrapper[18592]: I0308 04:02:04.144006 18592 scope.go:117] "RemoveContainer" containerID="6db7bba8b22fb8aa1570ff3e56f6b75b965e844f4559f957eeceeaa7d80a64c0" Mar 08 04:02:04.374913 master-0 kubenswrapper[18592]: I0308 04:02:04.374804 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-jghp5_d831cb23-7411-4072-8273-c167d9afca28/cluster-baremetal-operator/2.log" Mar 08 04:02:04.375728 master-0 kubenswrapper[18592]: I0308 04:02:04.375550 18592 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-jghp5" event={"ID":"d831cb23-7411-4072-8273-c167d9afca28","Type":"ContainerStarted","Data":"a28542fb5ed39c858f57b755c0ef81087026c2d66d462a7cf19a70d4e93dcb25"} Mar 08 04:02:05.207012 master-0 kubenswrapper[18592]: I0308 04:02:05.206923 18592 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 04:02:05.208252 master-0 kubenswrapper[18592]: I0308 04:02:05.208197 18592 scope.go:117] "RemoveContainer" containerID="8a427b90a89c6374fb3a2672156c087d21ba1d11bd55939844dd05cb8406e569" Mar 08 04:02:05.208820 master-0 kubenswrapper[18592]: E0308 04:02:05.208678 18592 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-policy-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=cluster-policy-controller pod=kube-controller-manager-master-0_openshift-kube-controller-manager(0580c83f64e952a7a614903b6fdf6965)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="0580c83f64e952a7a614903b6fdf6965" Mar 08 04:02:05.708039 master-0 kubenswrapper[18592]: E0308 04:02:05.707871 18592 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event=< Mar 08 04:02:05.708039 master-0 kubenswrapper[18592]: &Event{ObjectMeta:{console-7748864899-8p6h5.189ac19aa7a63f81 openshift-console 14608 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-console,Name:console-7748864899-8p6h5,UID:5bc0469d-1ae9-4606-ba99-7a333b66af37,APIVersion:v1,ResourceVersion:14381,FieldPath:spec.containers{console},},Reason:ProbeError,Message:Startup probe error: Get "https://10.128.0.97:8443/health": dial tcp 10.128.0.97:8443: connect: connection refused Mar 08 04:02:05.708039 master-0 
kubenswrapper[18592]: body: Mar 08 04:02:05.708039 master-0 kubenswrapper[18592]: ,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:57:53 +0000 UTC,LastTimestamp:2026-03-08 03:58:13.302033161 +0000 UTC m=+305.400787511,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,} Mar 08 04:02:05.708039 master-0 kubenswrapper[18592]: > Mar 08 04:02:06.144052 master-0 kubenswrapper[18592]: E0308 04:02:06.143884 18592 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Mar 08 04:02:07.145210 master-0 kubenswrapper[18592]: I0308 04:02:07.144908 18592 scope.go:117] "RemoveContainer" containerID="79c3878ca5ec702bd04b79fb49bba2be6016b0025cef22be477ef3ddf3eccc0f" Mar 08 04:02:07.409095 master-0 kubenswrapper[18592]: I0308 04:02:07.408908 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-8f89dfddd-4mr6p" event={"ID":"ee586416-6f56-4ea4-ad62-95de1e6df23b","Type":"ContainerStarted","Data":"14b155a9f345270e44e3822a408613eda79a80d5a1541f0ccf653b3376855415"} Mar 08 04:02:08.156893 master-0 kubenswrapper[18592]: I0308 04:02:08.156774 18592 status_manager.go:851] "Failed to get status for pod" podUID="0580c83f64e952a7a614903b6fdf6965" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods kube-controller-manager-master-0)" Mar 08 04:02:10.675954 master-0 kubenswrapper[18592]: I0308 04:02:10.675815 18592 patch_prober.go:28] interesting pod/console-744db48f96-lgsd4 container/console namespace/openshift-console: Startup probe status=failure 
output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" start-of-body= Mar 08 04:02:10.676968 master-0 kubenswrapper[18592]: I0308 04:02:10.675955 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-744db48f96-lgsd4" podUID="935ab7fb-b097-41c3-8926-8343eb29e7fc" containerName="console" probeResult="failure" output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" Mar 08 04:02:13.303093 master-0 kubenswrapper[18592]: I0308 04:02:13.302971 18592 patch_prober.go:28] interesting pod/console-7748864899-8p6h5 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body= Mar 08 04:02:13.303093 master-0 kubenswrapper[18592]: I0308 04:02:13.303055 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-7748864899-8p6h5" podUID="5bc0469d-1ae9-4606-ba99-7a333b66af37" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" Mar 08 04:02:14.123387 master-0 kubenswrapper[18592]: E0308 04:02:14.123293 18592 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0" Mar 08 04:02:14.484639 master-0 kubenswrapper[18592]: I0308 04:02:14.484563 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"29c709c82970b529e7b9b895aa92ef05","Type":"ContainerStarted","Data":"eb9188eb38d479fde0a4a051bbd9cadf792692a27595b74a68dfd596dd3bd4cb"} Mar 08 04:02:15.507051 master-0 kubenswrapper[18592]: I0308 04:02:15.506989 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" 
event={"ID":"29c709c82970b529e7b9b895aa92ef05","Type":"ContainerStarted","Data":"40ee3f7404d755ad51482e89e369ccdc84a6541dc5dfe44eb786d98f94657742"} Mar 08 04:02:15.507051 master-0 kubenswrapper[18592]: I0308 04:02:15.507054 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"29c709c82970b529e7b9b895aa92ef05","Type":"ContainerStarted","Data":"ef5905277758c7d5195b669005ed451e16ffd24619ebf8ba21e65224876bdbbf"} Mar 08 04:02:16.144567 master-0 kubenswrapper[18592]: I0308 04:02:16.144412 18592 scope.go:117] "RemoveContainer" containerID="8a427b90a89c6374fb3a2672156c087d21ba1d11bd55939844dd05cb8406e569" Mar 08 04:02:16.144983 master-0 kubenswrapper[18592]: E0308 04:02:16.144876 18592 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-policy-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=cluster-policy-controller pod=kube-controller-manager-master-0_openshift-kube-controller-manager(0580c83f64e952a7a614903b6fdf6965)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="0580c83f64e952a7a614903b6fdf6965" Mar 08 04:02:16.521480 master-0 kubenswrapper[18592]: I0308 04:02:16.521159 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"29c709c82970b529e7b9b895aa92ef05","Type":"ContainerStarted","Data":"525f4f34f7b9f03270cf8e17145799a861ddae30f9c93e0feabf75526e9d7c8b"} Mar 08 04:02:16.521480 master-0 kubenswrapper[18592]: I0308 04:02:16.521223 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"29c709c82970b529e7b9b895aa92ef05","Type":"ContainerStarted","Data":"6abbd77cd85c43cf39d19eb06601b8ddabff3d172f4afb400fdf65a96dc33d31"} Mar 08 04:02:16.523314 master-0 kubenswrapper[18592]: I0308 04:02:16.521793 18592 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" 
podUID="3ecf3d00-32ee-49db-bcf4-1bdb6f372c65" Mar 08 04:02:16.523314 master-0 kubenswrapper[18592]: I0308 04:02:16.521860 18592 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="3ecf3d00-32ee-49db-bcf4-1bdb6f372c65" Mar 08 04:02:19.555005 master-0 kubenswrapper[18592]: I0308 04:02:19.554920 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-h4qlp_9ec89e27-4360-48f2-a7ca-5d823bda4510/snapshot-controller/4.log" Mar 08 04:02:19.556456 master-0 kubenswrapper[18592]: I0308 04:02:19.556400 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-h4qlp_9ec89e27-4360-48f2-a7ca-5d823bda4510/snapshot-controller/3.log" Mar 08 04:02:19.556610 master-0 kubenswrapper[18592]: I0308 04:02:19.556499 18592 generic.go:334] "Generic (PLEG): container finished" podID="9ec89e27-4360-48f2-a7ca-5d823bda4510" containerID="e2e26de1430830efa8f6b74e607c09d42160aca3122a8949f5ba15e8c2b266c1" exitCode=1 Mar 08 04:02:19.556610 master-0 kubenswrapper[18592]: I0308 04:02:19.556564 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-h4qlp" event={"ID":"9ec89e27-4360-48f2-a7ca-5d823bda4510","Type":"ContainerDied","Data":"e2e26de1430830efa8f6b74e607c09d42160aca3122a8949f5ba15e8c2b266c1"} Mar 08 04:02:19.556610 master-0 kubenswrapper[18592]: I0308 04:02:19.556612 18592 scope.go:117] "RemoveContainer" containerID="d35e35bce6407453877c8123c5425d614cf068904ff2cbaabddff838a6a4b536" Mar 08 04:02:19.557523 master-0 kubenswrapper[18592]: I0308 04:02:19.557465 18592 scope.go:117] "RemoveContainer" containerID="e2e26de1430830efa8f6b74e607c09d42160aca3122a8949f5ba15e8c2b266c1" Mar 08 04:02:19.558056 master-0 kubenswrapper[18592]: E0308 04:02:19.558004 18592 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=snapshot-controller pod=csi-snapshot-controller-7577d6f48-h4qlp_openshift-cluster-storage-operator(9ec89e27-4360-48f2-a7ca-5d823bda4510)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-h4qlp" podUID="9ec89e27-4360-48f2-a7ca-5d823bda4510" Mar 08 04:02:20.570341 master-0 kubenswrapper[18592]: I0308 04:02:20.570245 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-h4qlp_9ec89e27-4360-48f2-a7ca-5d823bda4510/snapshot-controller/4.log" Mar 08 04:02:20.679893 master-0 kubenswrapper[18592]: I0308 04:02:20.675910 18592 patch_prober.go:28] interesting pod/console-744db48f96-lgsd4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" start-of-body= Mar 08 04:02:20.679893 master-0 kubenswrapper[18592]: I0308 04:02:20.675992 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-744db48f96-lgsd4" podUID="935ab7fb-b097-41c3-8926-8343eb29e7fc" containerName="console" probeResult="failure" output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" Mar 08 04:02:21.013987 master-0 kubenswrapper[18592]: E0308 04:02:21.013603 18592 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T04:02:11Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T04:02:11Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T04:02:11Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T04:02:11Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3e7365fa46219476560dd59d3a82f041546a33f0935c57eb4f3274ab3118ef0b\\\"],\\\"sizeBytes\\\":2895821940},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:ae042a5d32eb2f18d537f2068849e665b55df7d8360daedaaeea98bd2a79e769\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:d077bbabe6cb885ed229119008480493e8364e4bfddaa00b099f68c52b016e6b\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1733328350},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:82f121f9d021a9843b9458f9f222c40f292f2c21dcfcf00f05daacaca8a949c0\\\"],\\\"sizeBytes\\\":1637445817},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:063b8972231e65eb43f6545ba37804f68138dc54d97b91a652a1c5bc7dc76aa5\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:cf682d23b2857e455609879a0867d171a221c18e2cec995dd79570b77c5a4705\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1272201949},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:381e96959e3c3b08a3e2715e6024697ae14af31bd0378b49f583e984b3b9a192\\\"],\\\"sizeBytes\\\":1238047254},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:e0c034ae18daa01af8d073f8cc24ae4af87883c664304910eab1167fdfd60c0b\\\",\\
\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:ef0c6b9e405f7a452211e063ce07ded04ccbe38b53860bfd71b5a7cd5072830a\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1229556414},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:79984dfbdf9aeae3985c7fd7515e12328775c0e7fc4782929d0998f4dd2a87c6\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:7be89499615ec913d0fe40ca89682080a3f1181a066dbc501c877cc7ccbcc9ae\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1220167376},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c9330c756dd6ab107e9a4b671bc52742c90d5be11a8380d8b710e2bd4e0ed43c\\\"],\\\"sizeBytes\\\":992610645},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\\\"],\\\"sizeBytes\\\":943837171},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ff40e33e63d6c1f4e4393d5506e38def25ba20582d980fec8b81f81c867ceeec\\\"],\\\"sizeBytes\\\":918278686},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:042e6a37747405da54cf91543d44408c9531327a2cce653c41ca851aa7c896d8\\\"],\\\"sizeBytes\\\":880378279},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e207c762b7802ee0e54507d21ed1f25b19eddc511a4b824934c16c163193be6a\\\"],\\\"sizeBytes\\\":876146500},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:41dbd66e9a886c1fd7a99752f358c6125a209e83c0dd37b35730baae58d82ee8\\\"],\\\"sizeBytes\\\":862633255},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2508a5f66e509e813cb09825b5456be91b4cdd4d02f470f22a33de42c753f2b7\\\"],\\\"sizeBytes\\\":862197440},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9bfcd8017eede3fb66fa3f5b47c27508b787d38455689154461f0e6a5dc303ff\\\"],\\\"sizeBytes\\\":772939850},{\\\"names\\\":[\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9c946fdc5a4cd16ff998c17844780e7efc38f7f38b97a8a40d75cd77b318ddef\\\"],\\\"sizeBytes\\\":687947017},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0c03cb25dc6f6a865529ebc979e8d7d08492b28fd3fb93beddf30e1cb06f1245\\\"],\\\"sizeBytes\\\":683169303},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3f34dc492c80a3dee4643cc2291044750ac51e6e919b973de8723fa8b70bde70\\\"],\\\"sizeBytes\\\":677929075},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:db06a0e0308b2e541c7bb2d11517431abb31133b2ce6cb6c34ecf5ef4188a4e8\\\"],\\\"sizeBytes\\\":633876767},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a149ed17b20a7577fceacfc5198f8b7b3edf314ee22f77bd6ab87f06a3aa17f3\\\"],\\\"sizeBytes\\\":621647686},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3cdb019b6769514c0e92ef92da73e914fbcf6254cc919677ee077c93ce324de0\\\"],\\\"sizeBytes\\\":605698200},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1ec9d3dbcc6f9817c0f6d09f64c0d98c91b03afbb1fcb3c1e1718aca900754b\\\"],\\\"sizeBytes\\\":589379637},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1575be013a898f153cbf012aeaf28ce720022f934dc05bdffbe479e30999d460\\\"],\\\"sizeBytes\\\":582153879},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:eb82e437a701ce83b70e56be8477d987da67578714dda3d9fa6628804b1b56f5\\\"],\\\"sizeBytes\\\":558210153},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d470dba32064cc62b2ab29303d6e00612304548262eaa2f4e5b40a00a26f71ce\\\"],\\\"sizeBytes\\\":557426734},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:28f33d62fd0b94c5ea0ebcd7a4216848c8dd671a38d901ce98f4c399b700e1c7\\\"],\\\"sizeBytes\\\":548751793},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cc20748723f55f960cfb6328d1591880bbd1b345
2155633996d4f41fc7c5f46b\\\"],\\\"sizeBytes\\\":529324693},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ac6f0695d3386e6d601f4ae507940981352fa3ad884b0fed6fb25698c5e6f916\\\"],\\\"sizeBytes\\\":528946249},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6088910bdc1583b275fab261e3234c0b63b4cc16d01bcea697b6a7f6db13bdf3\\\"],\\\"sizeBytes\\\":518384455},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-release@sha256:14bd3c04daa885009785d48f4973e2890751a7ec116cc14d17627245cda54d7b\\\"],\\\"sizeBytes\\\":517997625},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5500329ab50804678fb8a90b96bf2a469bca16b620fb6dd2f5f5a17106e94898\\\"],\\\"sizeBytes\\\":514980169},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9bd0b71d620cf0acbfcd1b58797dc30050bd167cb6b7a7f62c8333dd370c76d5\\\"],\\\"sizeBytes\\\":513581866},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9bd818e37e1f9dbe5393c557b89e81010d68171408e0e4157a3d92ae0ca1c953\\\"],\\\"sizeBytes\\\":513220825},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d601c8437b4d8bbe2da0f3b08f1bd8693f5a4ef6d835377ec029c79d9dca5dab\\\"],\\\"sizeBytes\\\":512273539},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ca868abfecbf9a9c414a4c79e57c4c55e62c8a6796f899ba59dde86c4cf4bb\\\"],\\\"sizeBytes\\\":512235767},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b47d2b146e833bc1612a652136f43afcf1ba30f32cbd0a2f06ca9fc80d969f0\\\"],\\\"sizeBytes\\\":511226810},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:834063dd26fb3d2489e193489198a0d5fbe9c775a0e30173e5fcef6994fbf0f6\\\"],\\\"sizeBytes\\\":511164376},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee46e13e26156c904e5784e2d64511021ed0974a169ccd6476b05bff1c44ec56\\\"],\\\"sizeBytes\\\":508888174},{\\\"names\\\":[\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:7220d16ea511c0f0410cf45db45aaafcc64847c9cb5732ad1eff39ceb482cdba\\\"],\\\"sizeBytes\\\":508544235},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:526c5c02a8fa86a2fa83a7087d4a5c4b1c4072c0f3906163494cc3b3c1295e9b\\\"],\\\"sizeBytes\\\":507967997},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4010a8f9d932615336227e2fd43325d4fa9025dca4bebe032106efea733fcfc3\\\"],\\\"sizeBytes\\\":506479655},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:76b719f5bd541eb1a8bae124d650896b533e7bc3107be536e598b3ab4e135282\\\"],\\\"sizeBytes\\\":506394574},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5de69354d08184ecd6144facc1461777674674e8304971216d4cf1a5025472b9\\\"],\\\"sizeBytes\\\":505344964},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a324f47cf789c0480fa4bcb0812152abc3cd844318bab193108fe4349eed609\\\"],\\\"sizeBytes\\\":505242594},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b8cb5e0caeca0fb02f3e8c72b7ddf1c49e3c602e42e119ba30c60525f1db1821\\\"],\\\"sizeBytes\\\":504658657},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d11f13e867f4df046ca6789bb7273da5d0c08895b3dea00949c8a5458f9e22f9\\\"],\\\"sizeBytes\\\":504623546},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2c8f904c1084450856b501d40bbc9246265fe34a2b70efec23541e3285da7f88\\\"],\\\"sizeBytes\\\":502712961},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:76bdc35338c4d0f5e5b9448fb73e3578656f908a962286692e12a0372ec721d5\\\"],\\\"sizeBytes\\\":495994161},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ff2db11ce277288befab25ddb86177e832842d2edb5607a2da8f252a030e1cfc\\\"],\\\"sizeBytes\\\":495064829},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9b2e765b795c30c910c331c85226e5db0d56463b6c81d79ded739cba76e2b032\\
\"],\\\"sizeBytes\\\":487151732}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"runc\\\"}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 04:02:21.179614 master-0 kubenswrapper[18592]: I0308 04:02:21.179540 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-master-0" Mar 08 04:02:21.179614 master-0 kubenswrapper[18592]: I0308 04:02:21.179612 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-master-0" Mar 08 04:02:21.832975 master-0 kubenswrapper[18592]: E0308 04:02:21.832803 18592 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 08 04:02:21.832975 master-0 kubenswrapper[18592]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-6fd77597b5-w649n_openshift-authentication_836c1be1-26de-4840-8ba6-9d34a751aebc_0(8b8bbf68329c784aaff3658aa79577562b53cab78876ffb9c4f5a8aa2517b020): error adding pod openshift-authentication_oauth-openshift-6fd77597b5-w649n to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"8b8bbf68329c784aaff3658aa79577562b53cab78876ffb9c4f5a8aa2517b020" Netns:"/var/run/netns/bf0e7291-ec98-40ee-9ae8-bb4370590483" IfName:"eth0" 
Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-6fd77597b5-w649n;K8S_POD_INFRA_CONTAINER_ID=8b8bbf68329c784aaff3658aa79577562b53cab78876ffb9c4f5a8aa2517b020;K8S_POD_UID=836c1be1-26de-4840-8ba6-9d34a751aebc" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-6fd77597b5-w649n] networking: Multus: [openshift-authentication/oauth-openshift-6fd77597b5-w649n/836c1be1-26de-4840-8ba6-9d34a751aebc]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod oauth-openshift-6fd77597b5-w649n in out of cluster comm: SetNetworkStatus: failed to update the pod oauth-openshift-6fd77597b5-w649n in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-6fd77597b5-w649n?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 08 04:02:21.832975 master-0 kubenswrapper[18592]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 08 04:02:21.832975 master-0 kubenswrapper[18592]: > Mar 08 04:02:21.834588 master-0 kubenswrapper[18592]: E0308 04:02:21.834506 18592 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 08 04:02:21.834588 master-0 kubenswrapper[18592]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-6fd77597b5-w649n_openshift-authentication_836c1be1-26de-4840-8ba6-9d34a751aebc_0(8b8bbf68329c784aaff3658aa79577562b53cab78876ffb9c4f5a8aa2517b020): error adding pod openshift-authentication_oauth-openshift-6fd77597b5-w649n to CNI network 
"multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"8b8bbf68329c784aaff3658aa79577562b53cab78876ffb9c4f5a8aa2517b020" Netns:"/var/run/netns/bf0e7291-ec98-40ee-9ae8-bb4370590483" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-6fd77597b5-w649n;K8S_POD_INFRA_CONTAINER_ID=8b8bbf68329c784aaff3658aa79577562b53cab78876ffb9c4f5a8aa2517b020;K8S_POD_UID=836c1be1-26de-4840-8ba6-9d34a751aebc" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-6fd77597b5-w649n] networking: Multus: [openshift-authentication/oauth-openshift-6fd77597b5-w649n/836c1be1-26de-4840-8ba6-9d34a751aebc]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod oauth-openshift-6fd77597b5-w649n in out of cluster comm: SetNetworkStatus: failed to update the pod oauth-openshift-6fd77597b5-w649n in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-6fd77597b5-w649n?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 08 04:02:21.834588 master-0 kubenswrapper[18592]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 08 04:02:21.834588 master-0 kubenswrapper[18592]: > pod="openshift-authentication/oauth-openshift-6fd77597b5-w649n" Mar 08 04:02:21.835276 master-0 kubenswrapper[18592]: E0308 04:02:21.835100 18592 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 08 04:02:21.835276 master-0 
kubenswrapper[18592]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-6fd77597b5-w649n_openshift-authentication_836c1be1-26de-4840-8ba6-9d34a751aebc_0(8b8bbf68329c784aaff3658aa79577562b53cab78876ffb9c4f5a8aa2517b020): error adding pod openshift-authentication_oauth-openshift-6fd77597b5-w649n to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"8b8bbf68329c784aaff3658aa79577562b53cab78876ffb9c4f5a8aa2517b020" Netns:"/var/run/netns/bf0e7291-ec98-40ee-9ae8-bb4370590483" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-6fd77597b5-w649n;K8S_POD_INFRA_CONTAINER_ID=8b8bbf68329c784aaff3658aa79577562b53cab78876ffb9c4f5a8aa2517b020;K8S_POD_UID=836c1be1-26de-4840-8ba6-9d34a751aebc" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-6fd77597b5-w649n] networking: Multus: [openshift-authentication/oauth-openshift-6fd77597b5-w649n/836c1be1-26de-4840-8ba6-9d34a751aebc]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod oauth-openshift-6fd77597b5-w649n in out of cluster comm: SetNetworkStatus: failed to update the pod oauth-openshift-6fd77597b5-w649n in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-6fd77597b5-w649n?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 08 04:02:21.835276 master-0 kubenswrapper[18592]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 08 04:02:21.835276 master-0 kubenswrapper[18592]: > pod="openshift-authentication/oauth-openshift-6fd77597b5-w649n" Mar 08 04:02:21.836050 master-0 kubenswrapper[18592]: E0308 04:02:21.835867 18592 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"oauth-openshift-6fd77597b5-w649n_openshift-authentication(836c1be1-26de-4840-8ba6-9d34a751aebc)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"oauth-openshift-6fd77597b5-w649n_openshift-authentication(836c1be1-26de-4840-8ba6-9d34a751aebc)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-6fd77597b5-w649n_openshift-authentication_836c1be1-26de-4840-8ba6-9d34a751aebc_0(8b8bbf68329c784aaff3658aa79577562b53cab78876ffb9c4f5a8aa2517b020): error adding pod openshift-authentication_oauth-openshift-6fd77597b5-w649n to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"8b8bbf68329c784aaff3658aa79577562b53cab78876ffb9c4f5a8aa2517b020\\\" Netns:\\\"/var/run/netns/bf0e7291-ec98-40ee-9ae8-bb4370590483\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-6fd77597b5-w649n;K8S_POD_INFRA_CONTAINER_ID=8b8bbf68329c784aaff3658aa79577562b53cab78876ffb9c4f5a8aa2517b020;K8S_POD_UID=836c1be1-26de-4840-8ba6-9d34a751aebc\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-6fd77597b5-w649n] networking: Multus: 
[openshift-authentication/oauth-openshift-6fd77597b5-w649n/836c1be1-26de-4840-8ba6-9d34a751aebc]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod oauth-openshift-6fd77597b5-w649n in out of cluster comm: SetNetworkStatus: failed to update the pod oauth-openshift-6fd77597b5-w649n in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-6fd77597b5-w649n?timeout=1m0s\\\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-authentication/oauth-openshift-6fd77597b5-w649n" podUID="836c1be1-26de-4840-8ba6-9d34a751aebc"
Mar 08 04:02:22.586911 master-0 kubenswrapper[18592]: I0308 04:02:22.586848 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6fd77597b5-w649n"
Mar 08 04:02:22.587435 master-0 kubenswrapper[18592]: I0308 04:02:22.587400 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6fd77597b5-w649n"
Mar 08 04:02:23.146186 master-0 kubenswrapper[18592]: E0308 04:02:23.146086 18592 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s"
Mar 08 04:02:23.303198 master-0 kubenswrapper[18592]: I0308 04:02:23.303114 18592 patch_prober.go:28] interesting pod/console-7748864899-8p6h5 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body=
Mar 08 04:02:23.303446 master-0 kubenswrapper[18592]: I0308 04:02:23.303199 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-7748864899-8p6h5" podUID="5bc0469d-1ae9-4606-ba99-7a333b66af37" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused"
Mar 08 04:02:27.142974 master-0 kubenswrapper[18592]: I0308 04:02:27.142913 18592 scope.go:117] "RemoveContainer" containerID="8a427b90a89c6374fb3a2672156c087d21ba1d11bd55939844dd05cb8406e569"
Mar 08 04:02:27.143746 master-0 kubenswrapper[18592]: E0308 04:02:27.143370 18592 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-policy-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=cluster-policy-controller pod=kube-controller-manager-master-0_openshift-kube-controller-manager(0580c83f64e952a7a614903b6fdf6965)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="0580c83f64e952a7a614903b6fdf6965"
Mar 08 04:02:30.676965 master-0 kubenswrapper[18592]: I0308 04:02:30.676759 18592 patch_prober.go:28] interesting pod/console-744db48f96-lgsd4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" start-of-body=
Mar 08 04:02:30.678100 master-0 kubenswrapper[18592]: I0308 04:02:30.678019 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-744db48f96-lgsd4" podUID="935ab7fb-b097-41c3-8926-8343eb29e7fc" containerName="console" probeResult="failure" output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused"
Mar 08 04:02:31.017904 master-0 kubenswrapper[18592]: E0308 04:02:31.015558 18592 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 08 04:02:31.144174 master-0 kubenswrapper[18592]: I0308 04:02:31.144101 18592 scope.go:117] "RemoveContainer" containerID="e2e26de1430830efa8f6b74e607c09d42160aca3122a8949f5ba15e8c2b266c1"
Mar 08 04:02:31.144491 master-0 kubenswrapper[18592]: E0308 04:02:31.144419 18592 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=snapshot-controller pod=csi-snapshot-controller-7577d6f48-h4qlp_openshift-cluster-storage-operator(9ec89e27-4360-48f2-a7ca-5d823bda4510)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-h4qlp" podUID="9ec89e27-4360-48f2-a7ca-5d823bda4510"
Mar 08 04:02:31.216865 master-0 kubenswrapper[18592]: I0308 04:02:31.216715 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-master-0"
Mar 08 04:02:33.303068 master-0 kubenswrapper[18592]: I0308 04:02:33.302954 18592 patch_prober.go:28] interesting pod/console-7748864899-8p6h5 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body=
Mar 08 04:02:33.303068 master-0 kubenswrapper[18592]: I0308 04:02:33.303040 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-7748864899-8p6h5" podUID="5bc0469d-1ae9-4606-ba99-7a333b66af37" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused"
Mar 08 04:02:34.143638 master-0 kubenswrapper[18592]: I0308 04:02:34.143559 18592 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-etcd/etcd-master-0"
Mar 08 04:02:34.184523 master-0 kubenswrapper[18592]: I0308 04:02:34.184480 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-etcd/etcd-master-0"]
Mar 08 04:02:34.217940 master-0 kubenswrapper[18592]: I0308 04:02:34.217747 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-etcd/etcd-master-0"]
Mar 08 04:02:34.240433 master-0 kubenswrapper[18592]: I0308 04:02:34.239973 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6fd77597b5-w649n"]
Mar 08 04:02:34.265031 master-0 kubenswrapper[18592]: I0308 04:02:34.264909 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-84f57b9877-n666k" podStartSLOduration=256.799616384 podStartE2EDuration="4m57.264889021s" podCreationTimestamp="2026-03-08 03:57:37 +0000 UTC" firstStartedPulling="2026-03-08 03:57:38.131427863 +0000 UTC m=+270.230182253" lastFinishedPulling="2026-03-08 03:58:18.59670053 +0000 UTC m=+310.695454890" observedRunningTime="2026-03-08 04:02:34.20273027 +0000 UTC m=+566.301484660" watchObservedRunningTime="2026-03-08 04:02:34.264889021 +0000 UTC m=+566.363643371"
Mar 08 04:02:34.292959 master-0 kubenswrapper[18592]: I0308 04:02:34.292800 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-4-master-0"]
Mar 08 04:02:34.297459 master-0 kubenswrapper[18592]: I0308 04:02:34.297397 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/installer-4-master-0"]
Mar 08 04:02:34.694844 master-0 kubenswrapper[18592]: I0308 04:02:34.694713 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6fd77597b5-w649n" event={"ID":"836c1be1-26de-4840-8ba6-9d34a751aebc","Type":"ContainerStarted","Data":"efb4059af95784e7ace669b41edd71b763973aa5762494188a9e24fdfa971004"}
Mar 08 04:02:34.695677 master-0 kubenswrapper[18592]: I0308 04:02:34.694875 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6fd77597b5-w649n" event={"ID":"836c1be1-26de-4840-8ba6-9d34a751aebc","Type":"ContainerStarted","Data":"39f87f9c5e113240e56d3ee73c28d6fd790e8d77e8e4b200eb7a903a92b065d3"}
Mar 08 04:02:34.695677 master-0 kubenswrapper[18592]: I0308 04:02:34.694912 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-6fd77597b5-w649n"
Mar 08 04:02:34.735753 master-0 kubenswrapper[18592]: I0308 04:02:34.735638 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-6fd77597b5-w649n" podStartSLOduration=274.7356087 podStartE2EDuration="4m34.7356087s" podCreationTimestamp="2026-03-08 03:58:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 04:02:34.731739536 +0000 UTC m=+566.830493916" watchObservedRunningTime="2026-03-08 04:02:34.7356087 +0000 UTC m=+566.834363080"
Mar 08 04:02:35.699005 master-0 kubenswrapper[18592]: I0308 04:02:35.695246 18592 patch_prober.go:28] interesting pod/oauth-openshift-6fd77597b5-w649n container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.128.0.99:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 08 04:02:35.699005 master-0 kubenswrapper[18592]: I0308 04:02:35.695355 18592 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-6fd77597b5-w649n" podUID="836c1be1-26de-4840-8ba6-9d34a751aebc" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.128.0.99:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 08 04:02:36.153344 master-0 kubenswrapper[18592]: I0308 04:02:36.153261 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cde83292-7f24-4a84-aa85-e917c0f33a02" path="/var/lib/kubelet/pods/cde83292-7f24-4a84-aa85-e917c0f33a02/volumes"
Mar 08 04:02:36.221355 master-0 kubenswrapper[18592]: I0308 04:02:36.221209 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-master-0"
Mar 08 04:02:36.705070 master-0 kubenswrapper[18592]: I0308 04:02:36.704964 18592 patch_prober.go:28] interesting pod/oauth-openshift-6fd77597b5-w649n container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.128.0.99:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 08 04:02:36.705070 master-0 kubenswrapper[18592]: I0308 04:02:36.705068 18592 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-6fd77597b5-w649n" podUID="836c1be1-26de-4840-8ba6-9d34a751aebc" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.128.0.99:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 08 04:02:36.716706 master-0 kubenswrapper[18592]: I0308 04:02:36.716629 18592 generic.go:334] "Generic (PLEG): container finished" podID="49ec083d-dc74-457e-b10f-3bde04e9e75e" containerID="7c5a574b00cddc6ae858097a251b08c51507a0a632d0ee57830702564612db3b" exitCode=0
Mar 08 04:02:36.717559 master-0 kubenswrapper[18592]: I0308 04:02:36.716729 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-84bfdbbb7f-gj69x" event={"ID":"49ec083d-dc74-457e-b10f-3bde04e9e75e","Type":"ContainerDied","Data":"7c5a574b00cddc6ae858097a251b08c51507a0a632d0ee57830702564612db3b"}
Mar 08 04:02:36.717964 master-0 kubenswrapper[18592]: I0308 04:02:36.717687 18592 scope.go:117] "RemoveContainer" containerID="7c5a574b00cddc6ae858097a251b08c51507a0a632d0ee57830702564612db3b"
Mar 08 04:02:36.720434 master-0 kubenswrapper[18592]: I0308 04:02:36.720373 18592 generic.go:334] "Generic (PLEG): container finished" podID="e78b283b-981e-48d7-a5f2-53f8401766ea" containerID="a9cb63d374e634553343804bbe1694fd59777432c1442ac27146224b4af9f933" exitCode=0
Mar 08 04:02:36.720882 master-0 kubenswrapper[18592]: I0308 04:02:36.720446 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-2vjh2" event={"ID":"e78b283b-981e-48d7-a5f2-53f8401766ea","Type":"ContainerDied","Data":"a9cb63d374e634553343804bbe1694fd59777432c1442ac27146224b4af9f933"}
Mar 08 04:02:36.721314 master-0 kubenswrapper[18592]: I0308 04:02:36.720933 18592 scope.go:117] "RemoveContainer" containerID="a9cb63d374e634553343804bbe1694fd59777432c1442ac27146224b4af9f933"
Mar 08 04:02:36.723340 master-0 kubenswrapper[18592]: I0308 04:02:36.723230 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_0580c83f64e952a7a614903b6fdf6965/cluster-policy-controller/3.log"
Mar 08 04:02:36.725495 master-0 kubenswrapper[18592]: I0308 04:02:36.725415 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_0580c83f64e952a7a614903b6fdf6965/kube-controller-manager/0.log"
Mar 08 04:02:36.725584 master-0 kubenswrapper[18592]: I0308 04:02:36.725546 18592 generic.go:334] "Generic (PLEG): container finished" podID="0580c83f64e952a7a614903b6fdf6965" containerID="7f62bc4b6acd53da006a6081ef3e761680d62fc9f72141ffe7169f424211ea5e" exitCode=0
Mar 08 04:02:36.725654 master-0 kubenswrapper[18592]: I0308 04:02:36.725605 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"0580c83f64e952a7a614903b6fdf6965","Type":"ContainerDied","Data":"7f62bc4b6acd53da006a6081ef3e761680d62fc9f72141ffe7169f424211ea5e"}
Mar 08 04:02:36.726338 master-0 kubenswrapper[18592]: I0308 04:02:36.726302 18592 scope.go:117] "RemoveContainer" containerID="8a427b90a89c6374fb3a2672156c087d21ba1d11bd55939844dd05cb8406e569"
Mar 08 04:02:36.726507 master-0 kubenswrapper[18592]: I0308 04:02:36.726344 18592 scope.go:117] "RemoveContainer" containerID="7f62bc4b6acd53da006a6081ef3e761680d62fc9f72141ffe7169f424211ea5e"
Mar 08 04:02:36.728068 master-0 kubenswrapper[18592]: I0308 04:02:36.728028 18592 generic.go:334] "Generic (PLEG): container finished" podID="ae4cf373-137f-4ed4-a276-2109a68f3616" containerID="62b7489256e68bf5d0a04c5a110369ae1f66f911ae5ef4890aafd69da4f57420" exitCode=0
Mar 08 04:02:36.728146 master-0 kubenswrapper[18592]: I0308 04:02:36.728073 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-595b48c567-w5hbb" event={"ID":"ae4cf373-137f-4ed4-a276-2109a68f3616","Type":"ContainerDied","Data":"62b7489256e68bf5d0a04c5a110369ae1f66f911ae5ef4890aafd69da4f57420"}
Mar 08 04:02:36.728946 master-0 kubenswrapper[18592]: I0308 04:02:36.728896 18592 scope.go:117] "RemoveContainer" containerID="62b7489256e68bf5d0a04c5a110369ae1f66f911ae5ef4890aafd69da4f57420"
Mar 08 04:02:36.730490 master-0 kubenswrapper[18592]: I0308 04:02:36.730426 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-69576476f7-bv67b_1b700d17-83d2-46c8-afbc-e5774822eabe/cluster-autoscaler-operator/0.log"
Mar 08 04:02:36.731041 master-0 kubenswrapper[18592]: I0308 04:02:36.730980 18592 generic.go:334] "Generic (PLEG): container finished" podID="1b700d17-83d2-46c8-afbc-e5774822eabe" containerID="b9be8ece5a79f90d1b08d38693357868aba4825cbc7bfe504b163d5a110c50ad" exitCode=255
Mar 08 04:02:36.731041 master-0 kubenswrapper[18592]: I0308 04:02:36.731039 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-bv67b" event={"ID":"1b700d17-83d2-46c8-afbc-e5774822eabe","Type":"ContainerDied","Data":"b9be8ece5a79f90d1b08d38693357868aba4825cbc7bfe504b163d5a110c50ad"}
Mar 08 04:02:36.731712 master-0 kubenswrapper[18592]: I0308 04:02:36.731321 18592 scope.go:117] "RemoveContainer" containerID="b9be8ece5a79f90d1b08d38693357868aba4825cbc7bfe504b163d5a110c50ad"
Mar 08 04:02:36.733807 master-0 kubenswrapper[18592]: I0308 04:02:36.733778 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-84bf6db4f9-tdrf8_b70adfe9-94f1-44bc-85ce-498e5f0a1ca7/machine-api-operator/0.log"
Mar 08 04:02:36.734345 master-0 kubenswrapper[18592]: I0308 04:02:36.734303 18592 generic.go:334] "Generic (PLEG): container finished" podID="b70adfe9-94f1-44bc-85ce-498e5f0a1ca7" containerID="029a27fb736bdc11d71d13f7cee2fad9d45777a98ca3db8fbbd922eea19ce40e" exitCode=255
Mar 08 04:02:36.734888 master-0 kubenswrapper[18592]: I0308 04:02:36.734353 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-84bf6db4f9-tdrf8" event={"ID":"b70adfe9-94f1-44bc-85ce-498e5f0a1ca7","Type":"ContainerDied","Data":"029a27fb736bdc11d71d13f7cee2fad9d45777a98ca3db8fbbd922eea19ce40e"}
Mar 08 04:02:36.734888 master-0 kubenswrapper[18592]: I0308 04:02:36.734640 18592 scope.go:117] "RemoveContainer" containerID="029a27fb736bdc11d71d13f7cee2fad9d45777a98ca3db8fbbd922eea19ce40e"
Mar 08 04:02:36.737532 master-0 kubenswrapper[18592]: I0308 04:02:36.737488 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-5884b9cd56-vzms7_5a7752f9-7b9a-451f-997a-e9f696d38b34/etcd-operator/1.log"
Mar 08 04:02:36.737616 master-0 kubenswrapper[18592]: I0308 04:02:36.737540 18592 generic.go:334] "Generic (PLEG): container finished" podID="5a7752f9-7b9a-451f-997a-e9f696d38b34" containerID="361da48b967f54f48155f816e56d6a7090a322660478d9fab86bad7cccb8b43b" exitCode=0
Mar 08 04:02:36.737616 master-0 kubenswrapper[18592]: I0308 04:02:36.737587 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-vzms7" event={"ID":"5a7752f9-7b9a-451f-997a-e9f696d38b34","Type":"ContainerDied","Data":"361da48b967f54f48155f816e56d6a7090a322660478d9fab86bad7cccb8b43b"}
Mar 08 04:02:36.737713 master-0 kubenswrapper[18592]: I0308 04:02:36.737660 18592 scope.go:117] "RemoveContainer" containerID="9cb40f8e472021b6bf28adddecb51a371c5cff426f2d0e4b345adbb4c28df1e5"
Mar 08 04:02:36.739475 master-0 kubenswrapper[18592]: I0308 04:02:36.738415 18592 scope.go:117] "RemoveContainer" containerID="361da48b967f54f48155f816e56d6a7090a322660478d9fab86bad7cccb8b43b"
Mar 08 04:02:36.741694 master-0 kubenswrapper[18592]: I0308 04:02:36.741075 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-8565d84698-kt66j_0ebf1330-e044-4ff5-8b48-2d667e0c5625/openshift-controller-manager-operator/1.log"
Mar 08 04:02:36.741694 master-0 kubenswrapper[18592]: I0308 04:02:36.741132 18592 generic.go:334] "Generic (PLEG): container finished" podID="0ebf1330-e044-4ff5-8b48-2d667e0c5625" containerID="1be9d220d27c69fde01642289a0e8eb2f1052094efecbeaca301570ba5131c28" exitCode=0
Mar 08 04:02:36.741694 master-0 kubenswrapper[18592]: I0308 04:02:36.741176 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-kt66j" event={"ID":"0ebf1330-e044-4ff5-8b48-2d667e0c5625","Type":"ContainerDied","Data":"1be9d220d27c69fde01642289a0e8eb2f1052094efecbeaca301570ba5131c28"}
Mar 08 04:02:36.742630 master-0 kubenswrapper[18592]: I0308 04:02:36.742578 18592 scope.go:117] "RemoveContainer" containerID="1be9d220d27c69fde01642289a0e8eb2f1052094efecbeaca301570ba5131c28"
Mar 08 04:02:36.745584 master-0 kubenswrapper[18592]: I0308 04:02:36.745550 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-olm-operator_cluster-olm-operator-77899cf6d-x9h9q_7ff63c73-62a3-44b4-acd3-1b3df175794f/cluster-olm-operator/1.log"
Mar 08 04:02:36.753525 master-0 kubenswrapper[18592]: I0308 04:02:36.753421 18592 generic.go:334] "Generic (PLEG): container finished" podID="7ff63c73-62a3-44b4-acd3-1b3df175794f" containerID="29db3090b08a751660235fe4ca24e963b75ea9864f7e52c12b7c2e57d617cbd1" exitCode=0
Mar 08 04:02:36.753672 master-0 kubenswrapper[18592]: I0308 04:02:36.753636 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-x9h9q" event={"ID":"7ff63c73-62a3-44b4-acd3-1b3df175794f","Type":"ContainerDied","Data":"29db3090b08a751660235fe4ca24e963b75ea9864f7e52c12b7c2e57d617cbd1"}
Mar 08 04:02:36.754454 master-0 kubenswrapper[18592]: I0308 04:02:36.754415 18592 scope.go:117] "RemoveContainer" containerID="29db3090b08a751660235fe4ca24e963b75ea9864f7e52c12b7c2e57d617cbd1"
Mar 08 04:02:36.763274 master-0 kubenswrapper[18592]: I0308 04:02:36.763232 18592 generic.go:334] "Generic (PLEG): container finished" podID="1cbcb403-a424-4496-8c5c-5eb5e42dfb93" containerID="ee6a6bf31e7af3dec358121d4a82a967c996552057556595b3fa11c8208024b4" exitCode=0
Mar 08 04:02:36.763349 master-0 kubenswrapper[18592]: I0308 04:02:36.763305 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-8gfmf" event={"ID":"1cbcb403-a424-4496-8c5c-5eb5e42dfb93","Type":"ContainerDied","Data":"ee6a6bf31e7af3dec358121d4a82a967c996552057556595b3fa11c8208024b4"}
Mar 08 04:02:36.763746 master-0 kubenswrapper[18592]: I0308 04:02:36.763718 18592 scope.go:117] "RemoveContainer" containerID="ee6a6bf31e7af3dec358121d4a82a967c996552057556595b3fa11c8208024b4"
Mar 08 04:02:36.766968 master-0 kubenswrapper[18592]: I0308 04:02:36.766947 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_cluster-storage-operator-6fbfc8dc8f-nm8fj_b3eea925-73b3-4693-8f0e-6dd26107f60a/cluster-storage-operator/1.log"
Mar 08 04:02:36.767048 master-0 kubenswrapper[18592]: I0308 04:02:36.766986 18592 generic.go:334] "Generic (PLEG): container finished" podID="b3eea925-73b3-4693-8f0e-6dd26107f60a" containerID="e1fec486ad164fa39bf18c97bf6ae2ed95b35ced36abe0b9ec4d03fe751e8f33" exitCode=0
Mar 08 04:02:36.767143 master-0 kubenswrapper[18592]: I0308 04:02:36.767056 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-nm8fj" event={"ID":"b3eea925-73b3-4693-8f0e-6dd26107f60a","Type":"ContainerDied","Data":"e1fec486ad164fa39bf18c97bf6ae2ed95b35ced36abe0b9ec4d03fe751e8f33"}
Mar 08 04:02:36.768725 master-0 kubenswrapper[18592]: I0308 04:02:36.768669 18592 scope.go:117] "RemoveContainer" containerID="e1fec486ad164fa39bf18c97bf6ae2ed95b35ced36abe0b9ec4d03fe751e8f33"
Mar 08 04:02:36.770893 master-0 kubenswrapper[18592]: I0308 04:02:36.770853 18592 generic.go:334] "Generic (PLEG): container finished" podID="6552620e-b23b-4102-a6ed-a0fcaff0f144" containerID="c162dc56ff027f45de8ba28290228ab4d7bf1321c54c8a9b17c086435474a021" exitCode=0
Mar 08 04:02:36.770893 master-0 kubenswrapper[18592]: I0308 04:02:36.770880 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-6wkth" event={"ID":"6552620e-b23b-4102-a6ed-a0fcaff0f144","Type":"ContainerDied","Data":"c162dc56ff027f45de8ba28290228ab4d7bf1321c54c8a9b17c086435474a021"}
Mar 08 04:02:36.771224 master-0 kubenswrapper[18592]: I0308 04:02:36.771206 18592 scope.go:117] "RemoveContainer" containerID="c162dc56ff027f45de8ba28290228ab4d7bf1321c54c8a9b17c086435474a021"
Mar 08 04:02:36.871504 master-0 kubenswrapper[18592]: I0308 04:02:36.871411 18592 scope.go:117] "RemoveContainer" containerID="cae216678d94c10a368ff595527d708d87bd43ed6865eacedbf892861c47fe3a"
Mar 08 04:02:37.087161 master-0 kubenswrapper[18592]: I0308 04:02:37.085285 18592 scope.go:117] "RemoveContainer" containerID="7b9e0618571c76237a54adfbc9471783f3afade6ddbedbe9d5d1037a9f845813"
Mar 08 04:02:37.127814 master-0 kubenswrapper[18592]: I0308 04:02:37.127772 18592 scope.go:117] "RemoveContainer" containerID="3b576ae60c0b63ec0db45afc74d3ab2b7a31ef872c28479883b2bca1465128e0"
Mar 08 04:02:37.220396 master-0 kubenswrapper[18592]: I0308 04:02:37.220319 18592 scope.go:117] "RemoveContainer" containerID="11981809b9cc27f184966b17ad1925dff97bd3f4b8d6d288eb4740ef6e4ff5eb"
Mar 08 04:02:37.351587 master-0 kubenswrapper[18592]: E0308 04:02:37.351542 18592 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-policy-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=cluster-policy-controller pod=kube-controller-manager-master-0_openshift-kube-controller-manager(0580c83f64e952a7a614903b6fdf6965)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="0580c83f64e952a7a614903b6fdf6965"
Mar 08 04:02:37.705321 master-0 kubenswrapper[18592]: I0308 04:02:37.705259 18592 patch_prober.go:28] interesting pod/oauth-openshift-6fd77597b5-w649n container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.128.0.99:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 08 04:02:37.705321 master-0 kubenswrapper[18592]: I0308 04:02:37.705312 18592 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-6fd77597b5-w649n" podUID="836c1be1-26de-4840-8ba6-9d34a751aebc" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.128.0.99:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 08 04:02:37.782056 master-0 kubenswrapper[18592]: I0308 04:02:37.781976 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-595b48c567-w5hbb" event={"ID":"ae4cf373-137f-4ed4-a276-2109a68f3616","Type":"ContainerStarted","Data":"416bf1ca0e52546b5e006ea5793f7e4224d196cc0391bca7c2c088f964de087d"}
Mar 08 04:02:37.782440 master-0 kubenswrapper[18592]: I0308 04:02:37.782399 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-595b48c567-w5hbb"
Mar 08 04:02:37.785136 master-0 kubenswrapper[18592]: I0308 04:02:37.785049 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-8gfmf" event={"ID":"1cbcb403-a424-4496-8c5c-5eb5e42dfb93","Type":"ContainerStarted","Data":"fc1532426e3536956bde92d79a87b7d90dcf284fbfb2cce000db7ad91774e591"}
Mar 08 04:02:37.789792 master-0 kubenswrapper[18592]: I0308 04:02:37.789426 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-6wkth" event={"ID":"6552620e-b23b-4102-a6ed-a0fcaff0f144","Type":"ContainerStarted","Data":"68850e2ce573d4aec65265e541d41218fe082a99185f1879923a2b501666f9f2"}
Mar 08 04:02:37.790241 master-0 kubenswrapper[18592]: I0308 04:02:37.789965 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-595b48c567-w5hbb"
Mar 08 04:02:37.792031 master-0 kubenswrapper[18592]: I0308 04:02:37.791983 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-nm8fj" event={"ID":"b3eea925-73b3-4693-8f0e-6dd26107f60a","Type":"ContainerStarted","Data":"664c2709d0b6ad2bec7d73317d85180500140b68e59cded4037c079d1aa3605b"}
Mar 08 04:02:37.795263 master-0 kubenswrapper[18592]: I0308 04:02:37.795218 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-2vjh2" event={"ID":"e78b283b-981e-48d7-a5f2-53f8401766ea","Type":"ContainerStarted","Data":"23eb343553478d536367261ec4880260bb7f420a57cff20e8707c4529af5a9c2"}
Mar 08 04:02:37.797852 master-0 kubenswrapper[18592]: I0308 04:02:37.797770 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_0580c83f64e952a7a614903b6fdf6965/cluster-policy-controller/3.log"
Mar 08 04:02:37.800139 master-0 kubenswrapper[18592]: I0308 04:02:37.800085 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_0580c83f64e952a7a614903b6fdf6965/kube-controller-manager/0.log"
Mar 08 04:02:37.800263 master-0 kubenswrapper[18592]: I0308 04:02:37.800241 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"0580c83f64e952a7a614903b6fdf6965","Type":"ContainerStarted","Data":"7e6e0a1dd4628ac65a5a1b06d4e9688e783a295582fe830c670261a611a4156c"}
Mar 08 04:02:37.800923 master-0 kubenswrapper[18592]: I0308 04:02:37.800882 18592 scope.go:117] "RemoveContainer" containerID="8a427b90a89c6374fb3a2672156c087d21ba1d11bd55939844dd05cb8406e569"
Mar 08 04:02:37.801634 master-0 kubenswrapper[18592]: E0308 04:02:37.801202 18592 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-policy-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=cluster-policy-controller pod=kube-controller-manager-master-0_openshift-kube-controller-manager(0580c83f64e952a7a614903b6fdf6965)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="0580c83f64e952a7a614903b6fdf6965"
Mar 08 04:02:37.803577 master-0 kubenswrapper[18592]: I0308 04:02:37.803528 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-84bfdbbb7f-gj69x" event={"ID":"49ec083d-dc74-457e-b10f-3bde04e9e75e","Type":"ContainerStarted","Data":"5c942c573568b6adc98299931b3e3d66f16368a86530f632387e6c602f3650fe"}
Mar 08 04:02:37.807238 master-0 kubenswrapper[18592]: I0308 04:02:37.807156 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-vzms7" event={"ID":"5a7752f9-7b9a-451f-997a-e9f696d38b34","Type":"ContainerStarted","Data":"4acc5ff225a5665f898604fd3eb265db2228f64df58c9370afeda95401ee9bde"}
Mar 08 04:02:37.811789 master-0 kubenswrapper[18592]: I0308 04:02:37.809540 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-kt66j" event={"ID":"0ebf1330-e044-4ff5-8b48-2d667e0c5625","Type":"ContainerStarted","Data":"45f596dd42c1194298d193699818b8389c790f4ab400662a1b377c595f4ddf1b"}
Mar 08 04:02:37.819451 master-0 kubenswrapper[18592]: I0308 04:02:37.819397 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-x9h9q" event={"ID":"7ff63c73-62a3-44b4-acd3-1b3df175794f","Type":"ContainerStarted","Data":"555ea4f508af7748498cf4cde8b0089c15736c51c14f3b6cca97b14429ea81b5"}
Mar 08 04:02:37.823021 master-0 kubenswrapper[18592]: I0308 04:02:37.822964 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-69576476f7-bv67b_1b700d17-83d2-46c8-afbc-e5774822eabe/cluster-autoscaler-operator/0.log"
Mar 08 04:02:37.823767 master-0 kubenswrapper[18592]: I0308 04:02:37.823707 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-bv67b" event={"ID":"1b700d17-83d2-46c8-afbc-e5774822eabe","Type":"ContainerStarted","Data":"54407c72d25c594480b28a0a9822505f62b0201e1af63fea5a456a4691a036f4"}
Mar 08 04:02:37.827655 master-0 kubenswrapper[18592]: I0308 04:02:37.827583 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-84bf6db4f9-tdrf8_b70adfe9-94f1-44bc-85ce-498e5f0a1ca7/machine-api-operator/0.log"
Mar 08 04:02:37.828173 master-0 kubenswrapper[18592]: I0308 04:02:37.828134 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-84bf6db4f9-tdrf8" event={"ID":"b70adfe9-94f1-44bc-85ce-498e5f0a1ca7","Type":"ContainerStarted","Data":"9e1c3d0b5450693a52b5a172fd71c4e0d3c5acdbdf7ab537817d0143a056b245"}
Mar 08 04:02:40.675561 master-0 kubenswrapper[18592]: I0308 04:02:40.675493 18592 patch_prober.go:28] interesting pod/console-744db48f96-lgsd4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" start-of-body=
Mar 08 04:02:40.676396 master-0 kubenswrapper[18592]: I0308 04:02:40.675568 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-744db48f96-lgsd4" podUID="935ab7fb-b097-41c3-8926-8343eb29e7fc" containerName="console" probeResult="failure" output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused"
Mar 08 04:02:40.676396 master-0 kubenswrapper[18592]: I0308 04:02:40.675628 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-744db48f96-lgsd4"
Mar 08 04:02:40.676396 master-0 kubenswrapper[18592]: I0308 04:02:40.676358 18592 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="console" containerStatusID={"Type":"cri-o","ID":"576ba1799553100c4affee23b8bf267786a47451fbe5e13058a6627861ec622f"} pod="openshift-console/console-744db48f96-lgsd4" containerMessage="Container console failed startup probe, will be restarted"
Mar 08 04:02:41.017897 master-0 kubenswrapper[18592]: E0308 04:02:41.016125 18592 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 08 04:02:42.143218 master-0 kubenswrapper[18592]: I0308 04:02:42.143130 18592 scope.go:117] "RemoveContainer" containerID="e2e26de1430830efa8f6b74e607c09d42160aca3122a8949f5ba15e8c2b266c1"
Mar 08 04:02:42.143939 master-0 kubenswrapper[18592]: E0308 04:02:42.143535 18592 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=snapshot-controller pod=csi-snapshot-controller-7577d6f48-h4qlp_openshift-cluster-storage-operator(9ec89e27-4360-48f2-a7ca-5d823bda4510)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-h4qlp" podUID="9ec89e27-4360-48f2-a7ca-5d823bda4510"
Mar 08 04:02:43.302669 master-0 kubenswrapper[18592]: I0308 04:02:43.302567 18592 patch_prober.go:28] interesting pod/console-7748864899-8p6h5 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body=
Mar 08 04:02:43.303602 master-0 kubenswrapper[18592]: I0308 04:02:43.302680 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-7748864899-8p6h5" podUID="5bc0469d-1ae9-4606-ba99-7a333b66af37" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused"
Mar 08 04:02:43.303602 master-0 kubenswrapper[18592]: I0308 04:02:43.302757 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7748864899-8p6h5"
Mar 08 04:02:43.303954 master-0 kubenswrapper[18592]: I0308 04:02:43.303899 18592 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="console" containerStatusID={"Type":"cri-o","ID":"e72150628b7e9311091e98f72ec77d7eee04d74ad6ce896529075ce288be841e"} pod="openshift-console/console-7748864899-8p6h5" containerMessage="Container console failed startup probe, will be restarted"
Mar 08 04:02:45.082082 master-0 kubenswrapper[18592]: E0308 04:02:45.081999 18592 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err="command 'sleep 25' exited with 137: " execCommand=["sleep","25"] containerName="console" pod="openshift-console/console-744db48f96-lgsd4" message=""
Mar 08 04:02:45.082747 master-0 kubenswrapper[18592]: E0308 04:02:45.082074 18592 kuberuntime_container.go:691] "PreStop hook failed" err="command 'sleep 25' exited with 137: " pod="openshift-console/console-744db48f96-lgsd4" podUID="935ab7fb-b097-41c3-8926-8343eb29e7fc" containerName="console" containerID="cri-o://576ba1799553100c4affee23b8bf267786a47451fbe5e13058a6627861ec622f"
Mar 08 04:02:45.082747 master-0 kubenswrapper[18592]: I0308 04:02:45.082181 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-744db48f96-lgsd4" podUID="935ab7fb-b097-41c3-8926-8343eb29e7fc" containerName="console" containerID="cri-o://576ba1799553100c4affee23b8bf267786a47451fbe5e13058a6627861ec622f" gracePeriod=36
Mar 08 04:02:45.542511 master-0 kubenswrapper[18592]: E0308 04:02:45.542412 18592 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err="command 'sleep 25' exited with 137: " execCommand=["sleep","25"] containerName="console" pod="openshift-console/console-7748864899-8p6h5" message=""
Mar 08 04:02:45.542511 master-0 kubenswrapper[18592]: E0308 04:02:45.542499 18592 kuberuntime_container.go:691] "PreStop hook failed" err="command 'sleep 25' exited with 137: " pod="openshift-console/console-7748864899-8p6h5" podUID="5bc0469d-1ae9-4606-ba99-7a333b66af37" containerName="console" containerID="cri-o://e72150628b7e9311091e98f72ec77d7eee04d74ad6ce896529075ce288be841e"
Mar 08 04:02:45.542728 master-0 kubenswrapper[18592]: I0308 04:02:45.542558 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-7748864899-8p6h5" podUID="5bc0469d-1ae9-4606-ba99-7a333b66af37" containerName="console" containerID="cri-o://e72150628b7e9311091e98f72ec77d7eee04d74ad6ce896529075ce288be841e" gracePeriod=38
Mar 08 04:02:45.929633 master-0 kubenswrapper[18592]: I0308 04:02:45.929580 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-744db48f96-lgsd4_935ab7fb-b097-41c3-8926-8343eb29e7fc/console/0.log"
Mar 08 04:02:45.930032 master-0 kubenswrapper[18592]: I0308 04:02:45.929994 18592 generic.go:334] "Generic (PLEG): container finished" podID="935ab7fb-b097-41c3-8926-8343eb29e7fc" containerID="576ba1799553100c4affee23b8bf267786a47451fbe5e13058a6627861ec622f" exitCode=255
Mar 08 04:02:45.930671 master-0 kubenswrapper[18592]: I0308 04:02:45.930085 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-744db48f96-lgsd4" event={"ID":"935ab7fb-b097-41c3-8926-8343eb29e7fc","Type":"ContainerDied","Data":"576ba1799553100c4affee23b8bf267786a47451fbe5e13058a6627861ec622f"}
Mar 08 04:02:45.930982 master-0 kubenswrapper[18592]: I0308 04:02:45.930939 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-744db48f96-lgsd4" event={"ID":"935ab7fb-b097-41c3-8926-8343eb29e7fc","Type":"ContainerStarted","Data":"32a66192def097d021b93aa8df6620f28795ead81c5ef79a07d5eac921422344"}
Mar 08 04:02:45.935098 master-0 kubenswrapper[18592]: I0308 04:02:45.935041 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7748864899-8p6h5_5bc0469d-1ae9-4606-ba99-7a333b66af37/console/0.log"
Mar 08 04:02:45.935250 master-0 kubenswrapper[18592]: I0308 04:02:45.935122 18592 generic.go:334] "Generic (PLEG): container finished" podID="5bc0469d-1ae9-4606-ba99-7a333b66af37" containerID="e72150628b7e9311091e98f72ec77d7eee04d74ad6ce896529075ce288be841e" exitCode=255
Mar 08 04:02:45.935250 master-0 kubenswrapper[18592]: I0308 04:02:45.935172 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7748864899-8p6h5" event={"ID":"5bc0469d-1ae9-4606-ba99-7a333b66af37","Type":"ContainerDied","Data":"e72150628b7e9311091e98f72ec77d7eee04d74ad6ce896529075ce288be841e"}
Mar 08 04:02:45.935250 master-0 kubenswrapper[18592]: I0308 04:02:45.935235 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7748864899-8p6h5" event={"ID":"5bc0469d-1ae9-4606-ba99-7a333b66af37","Type":"ContainerStarted","Data":"211659fe18e5c4cb36e283be5616f6cec9746eb8310d8642428ea129f0b41e05"}
Mar 08 04:02:46.889093 master-0 kubenswrapper[18592]: I0308 04:02:46.888993 18592 patch_prober.go:28] interesting pod/oauth-openshift-6fd77597b5-w649n container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get
\"https://10.128.0.99:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 08 04:02:46.889936 master-0 kubenswrapper[18592]: I0308 04:02:46.889092 18592 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-6fd77597b5-w649n" podUID="836c1be1-26de-4840-8ba6-9d34a751aebc" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.128.0.99:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 08 04:02:47.154106 master-0 kubenswrapper[18592]: E0308 04:02:47.153919 18592 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-etcd/etcd-master-0" Mar 08 04:02:47.541760 master-0 kubenswrapper[18592]: I0308 04:02:47.541652 18592 patch_prober.go:28] interesting pod/etcd-operator-5884b9cd56-vzms7 container/etcd-operator namespace/openshift-etcd-operator: Liveness probe status=failure output="Get \"https://10.128.0.5:8443/healthz\": dial tcp 10.128.0.5:8443: connect: connection refused" start-of-body= Mar 08 04:02:47.542114 master-0 kubenswrapper[18592]: I0308 04:02:47.541762 18592 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-vzms7" podUID="5a7752f9-7b9a-451f-997a-e9f696d38b34" containerName="etcd-operator" probeResult="failure" output="Get \"https://10.128.0.5:8443/healthz\": dial tcp 10.128.0.5:8443: connect: connection refused" Mar 08 04:02:50.675967 master-0 kubenswrapper[18592]: I0308 04:02:50.675921 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-744db48f96-lgsd4" Mar 08 04:02:50.675967 master-0 kubenswrapper[18592]: I0308 04:02:50.675966 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-console/console-744db48f96-lgsd4" Mar 08 04:02:50.676432 master-0 kubenswrapper[18592]: I0308 04:02:50.676270 18592 patch_prober.go:28] interesting pod/console-744db48f96-lgsd4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" start-of-body= Mar 08 04:02:50.676432 master-0 kubenswrapper[18592]: I0308 04:02:50.676311 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-744db48f96-lgsd4" podUID="935ab7fb-b097-41c3-8926-8343eb29e7fc" containerName="console" probeResult="failure" output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" Mar 08 04:02:51.143530 master-0 kubenswrapper[18592]: I0308 04:02:51.143488 18592 scope.go:117] "RemoveContainer" containerID="8a427b90a89c6374fb3a2672156c087d21ba1d11bd55939844dd05cb8406e569" Mar 08 04:02:51.995694 master-0 kubenswrapper[18592]: I0308 04:02:51.995604 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_0580c83f64e952a7a614903b6fdf6965/cluster-policy-controller/3.log" Mar 08 04:02:52.003185 master-0 kubenswrapper[18592]: I0308 04:02:52.003103 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_0580c83f64e952a7a614903b6fdf6965/kube-controller-manager/0.log" Mar 08 04:02:52.003394 master-0 kubenswrapper[18592]: I0308 04:02:52.003248 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"0580c83f64e952a7a614903b6fdf6965","Type":"ContainerStarted","Data":"8c148cbb0915a5eeb34b064563cd84c2bd79992dadfa51f04ff27c52ca0a1379"} Mar 08 04:02:53.302265 master-0 kubenswrapper[18592]: I0308 04:02:53.302206 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-console/console-7748864899-8p6h5" Mar 08 04:02:53.302265 master-0 kubenswrapper[18592]: I0308 04:02:53.302270 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7748864899-8p6h5" Mar 08 04:02:53.303568 master-0 kubenswrapper[18592]: I0308 04:02:53.303516 18592 patch_prober.go:28] interesting pod/console-7748864899-8p6h5 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body= Mar 08 04:02:53.303640 master-0 kubenswrapper[18592]: I0308 04:02:53.303571 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-7748864899-8p6h5" podUID="5bc0469d-1ae9-4606-ba99-7a333b66af37" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" Mar 08 04:02:55.206319 master-0 kubenswrapper[18592]: I0308 04:02:55.206238 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 04:02:55.206319 master-0 kubenswrapper[18592]: I0308 04:02:55.206325 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 04:02:55.212736 master-0 kubenswrapper[18592]: I0308 04:02:55.212665 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 04:02:56.690647 master-0 kubenswrapper[18592]: I0308 04:02:56.690585 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-6fd77597b5-w649n" Mar 08 04:02:57.143853 master-0 kubenswrapper[18592]: I0308 04:02:57.143715 18592 scope.go:117] "RemoveContainer" 
containerID="e2e26de1430830efa8f6b74e607c09d42160aca3122a8949f5ba15e8c2b266c1" Mar 08 04:02:57.144074 master-0 kubenswrapper[18592]: E0308 04:02:57.144043 18592 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=snapshot-controller pod=csi-snapshot-controller-7577d6f48-h4qlp_openshift-cluster-storage-operator(9ec89e27-4360-48f2-a7ca-5d823bda4510)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-h4qlp" podUID="9ec89e27-4360-48f2-a7ca-5d823bda4510" Mar 08 04:03:00.675948 master-0 kubenswrapper[18592]: I0308 04:03:00.675873 18592 patch_prober.go:28] interesting pod/console-744db48f96-lgsd4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" start-of-body= Mar 08 04:03:00.675948 master-0 kubenswrapper[18592]: I0308 04:03:00.675938 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-744db48f96-lgsd4" podUID="935ab7fb-b097-41c3-8926-8343eb29e7fc" containerName="console" probeResult="failure" output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" Mar 08 04:03:00.962463 master-0 kubenswrapper[18592]: E0308 04:03:00.962382 18592 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-etcd/etcd-master-0" Mar 08 04:03:03.302415 master-0 kubenswrapper[18592]: I0308 04:03:03.302341 18592 patch_prober.go:28] interesting pod/console-7748864899-8p6h5 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body= Mar 08 04:03:03.303114 master-0 kubenswrapper[18592]: I0308 
04:03:03.302446 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-7748864899-8p6h5" podUID="5bc0469d-1ae9-4606-ba99-7a333b66af37" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" Mar 08 04:03:05.211384 master-0 kubenswrapper[18592]: I0308 04:03:05.211336 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 04:03:08.160237 master-0 kubenswrapper[18592]: I0308 04:03:08.160171 18592 scope.go:117] "RemoveContainer" containerID="e2e26de1430830efa8f6b74e607c09d42160aca3122a8949f5ba15e8c2b266c1" Mar 08 04:03:09.152730 master-0 kubenswrapper[18592]: I0308 04:03:09.152661 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-h4qlp_9ec89e27-4360-48f2-a7ca-5d823bda4510/snapshot-controller/4.log" Mar 08 04:03:09.153061 master-0 kubenswrapper[18592]: I0308 04:03:09.152733 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-h4qlp" event={"ID":"9ec89e27-4360-48f2-a7ca-5d823bda4510","Type":"ContainerStarted","Data":"b6dfab8797daa394f79563e3e5aa64c9afafbcf7902012413302fb8a52351559"} Mar 08 04:03:09.536158 master-0 kubenswrapper[18592]: I0308 04:03:09.536100 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-5-master-0"] Mar 08 04:03:09.536652 master-0 kubenswrapper[18592]: E0308 04:03:09.536394 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cde83292-7f24-4a84-aa85-e917c0f33a02" containerName="installer" Mar 08 04:03:09.536652 master-0 kubenswrapper[18592]: I0308 04:03:09.536408 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="cde83292-7f24-4a84-aa85-e917c0f33a02" containerName="installer" Mar 08 
04:03:09.536652 master-0 kubenswrapper[18592]: E0308 04:03:09.536425 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57550d28-9d40-436b-b20f-da41f30a0e6b" containerName="installer" Mar 08 04:03:09.536652 master-0 kubenswrapper[18592]: I0308 04:03:09.536432 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="57550d28-9d40-436b-b20f-da41f30a0e6b" containerName="installer" Mar 08 04:03:09.536652 master-0 kubenswrapper[18592]: E0308 04:03:09.536457 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d84e0373-988e-47db-be73-5690d18beba3" containerName="installer" Mar 08 04:03:09.536652 master-0 kubenswrapper[18592]: I0308 04:03:09.536464 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="d84e0373-988e-47db-be73-5690d18beba3" containerName="installer" Mar 08 04:03:09.536652 master-0 kubenswrapper[18592]: I0308 04:03:09.536572 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="cde83292-7f24-4a84-aa85-e917c0f33a02" containerName="installer" Mar 08 04:03:09.536652 master-0 kubenswrapper[18592]: I0308 04:03:09.536589 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="57550d28-9d40-436b-b20f-da41f30a0e6b" containerName="installer" Mar 08 04:03:09.536652 master-0 kubenswrapper[18592]: I0308 04:03:09.536605 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="d84e0373-988e-47db-be73-5690d18beba3" containerName="installer" Mar 08 04:03:09.537082 master-0 kubenswrapper[18592]: I0308 04:03:09.537043 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-5-master-0" Mar 08 04:03:09.538443 master-0 kubenswrapper[18592]: I0308 04:03:09.538419 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 08 04:03:09.539071 master-0 kubenswrapper[18592]: I0308 04:03:09.539040 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-7rcx9" Mar 08 04:03:09.552252 master-0 kubenswrapper[18592]: I0308 04:03:09.552200 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-5-master-0"] Mar 08 04:03:09.627210 master-0 kubenswrapper[18592]: I0308 04:03:09.627153 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/514f8205-32f5-4a29-9779-fa9e339e452c-kube-api-access\") pod \"installer-5-master-0\" (UID: \"514f8205-32f5-4a29-9779-fa9e339e452c\") " pod="openshift-kube-controller-manager/installer-5-master-0" Mar 08 04:03:09.627210 master-0 kubenswrapper[18592]: I0308 04:03:09.627212 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/514f8205-32f5-4a29-9779-fa9e339e452c-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"514f8205-32f5-4a29-9779-fa9e339e452c\") " pod="openshift-kube-controller-manager/installer-5-master-0" Mar 08 04:03:09.627440 master-0 kubenswrapper[18592]: I0308 04:03:09.627232 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/514f8205-32f5-4a29-9779-fa9e339e452c-var-lock\") pod \"installer-5-master-0\" (UID: \"514f8205-32f5-4a29-9779-fa9e339e452c\") " pod="openshift-kube-controller-manager/installer-5-master-0" Mar 08 04:03:09.728934 master-0 
kubenswrapper[18592]: I0308 04:03:09.728845 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/514f8205-32f5-4a29-9779-fa9e339e452c-kube-api-access\") pod \"installer-5-master-0\" (UID: \"514f8205-32f5-4a29-9779-fa9e339e452c\") " pod="openshift-kube-controller-manager/installer-5-master-0" Mar 08 04:03:09.729242 master-0 kubenswrapper[18592]: I0308 04:03:09.729162 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/514f8205-32f5-4a29-9779-fa9e339e452c-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"514f8205-32f5-4a29-9779-fa9e339e452c\") " pod="openshift-kube-controller-manager/installer-5-master-0" Mar 08 04:03:09.729364 master-0 kubenswrapper[18592]: I0308 04:03:09.729263 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/514f8205-32f5-4a29-9779-fa9e339e452c-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"514f8205-32f5-4a29-9779-fa9e339e452c\") " pod="openshift-kube-controller-manager/installer-5-master-0" Mar 08 04:03:09.729364 master-0 kubenswrapper[18592]: I0308 04:03:09.729273 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/514f8205-32f5-4a29-9779-fa9e339e452c-var-lock\") pod \"installer-5-master-0\" (UID: \"514f8205-32f5-4a29-9779-fa9e339e452c\") " pod="openshift-kube-controller-manager/installer-5-master-0" Mar 08 04:03:09.729364 master-0 kubenswrapper[18592]: I0308 04:03:09.729336 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/514f8205-32f5-4a29-9779-fa9e339e452c-var-lock\") pod \"installer-5-master-0\" (UID: \"514f8205-32f5-4a29-9779-fa9e339e452c\") " pod="openshift-kube-controller-manager/installer-5-master-0" Mar 08 04:03:10.139233 
master-0 kubenswrapper[18592]: I0308 04:03:10.139160 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/514f8205-32f5-4a29-9779-fa9e339e452c-kube-api-access\") pod \"installer-5-master-0\" (UID: \"514f8205-32f5-4a29-9779-fa9e339e452c\") " pod="openshift-kube-controller-manager/installer-5-master-0" Mar 08 04:03:10.197281 master-0 kubenswrapper[18592]: I0308 04:03:10.197199 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-5-master-0" Mar 08 04:03:10.632585 master-0 kubenswrapper[18592]: W0308 04:03:10.632501 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod514f8205_32f5_4a29_9779_fa9e339e452c.slice/crio-e3923e9598929e481d263721937694f2dec1eff395749cbd223cb65432526b8a WatchSource:0}: Error finding container e3923e9598929e481d263721937694f2dec1eff395749cbd223cb65432526b8a: Status 404 returned error can't find the container with id e3923e9598929e481d263721937694f2dec1eff395749cbd223cb65432526b8a Mar 08 04:03:10.633281 master-0 kubenswrapper[18592]: I0308 04:03:10.632817 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-5-master-0"] Mar 08 04:03:10.675810 master-0 kubenswrapper[18592]: I0308 04:03:10.675741 18592 patch_prober.go:28] interesting pod/console-744db48f96-lgsd4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" start-of-body= Mar 08 04:03:10.676011 master-0 kubenswrapper[18592]: I0308 04:03:10.675859 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-744db48f96-lgsd4" podUID="935ab7fb-b097-41c3-8926-8343eb29e7fc" containerName="console" probeResult="failure" output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: 
connect: connection refused" Mar 08 04:03:11.187536 master-0 kubenswrapper[18592]: I0308 04:03:11.187456 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-5-master-0" event={"ID":"514f8205-32f5-4a29-9779-fa9e339e452c","Type":"ContainerStarted","Data":"54c15bdbb93a0a3bdf8aa6cc57dfa2614b46751c191b6037f987467b62fd3f92"} Mar 08 04:03:11.187536 master-0 kubenswrapper[18592]: I0308 04:03:11.187509 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-5-master-0" event={"ID":"514f8205-32f5-4a29-9779-fa9e339e452c","Type":"ContainerStarted","Data":"e3923e9598929e481d263721937694f2dec1eff395749cbd223cb65432526b8a"} Mar 08 04:03:11.218465 master-0 kubenswrapper[18592]: I0308 04:03:11.218352 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-5-master-0" podStartSLOduration=2.218324717 podStartE2EDuration="2.218324717s" podCreationTimestamp="2026-03-08 04:03:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 04:03:11.210805146 +0000 UTC m=+603.309559536" watchObservedRunningTime="2026-03-08 04:03:11.218324717 +0000 UTC m=+603.317079097" Mar 08 04:03:13.302992 master-0 kubenswrapper[18592]: I0308 04:03:13.302891 18592 patch_prober.go:28] interesting pod/console-7748864899-8p6h5 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body= Mar 08 04:03:13.303751 master-0 kubenswrapper[18592]: I0308 04:03:13.302988 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-7748864899-8p6h5" podUID="5bc0469d-1ae9-4606-ba99-7a333b66af37" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 
10.128.0.97:8443: connect: connection refused" Mar 08 04:03:15.297252 master-0 kubenswrapper[18592]: I0308 04:03:15.297169 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 08 04:03:15.298299 master-0 kubenswrapper[18592]: I0308 04:03:15.298181 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="cacb9582-2132-4543-8a31-7b100ba4dd2f" containerName="alertmanager" containerID="cri-o://8ffa2ebcaa54a41e98873119e7f53fb354c1c0439475fce800c91d93f862c5b3" gracePeriod=120 Mar 08 04:03:15.298299 master-0 kubenswrapper[18592]: I0308 04:03:15.298234 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="cacb9582-2132-4543-8a31-7b100ba4dd2f" containerName="kube-rbac-proxy-web" containerID="cri-o://5b6b052f8bf0820cfe48407dc8704560b66216dfe0ff3c4511019bfcdbea1ef2" gracePeriod=120 Mar 08 04:03:15.298541 master-0 kubenswrapper[18592]: I0308 04:03:15.298301 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="cacb9582-2132-4543-8a31-7b100ba4dd2f" containerName="kube-rbac-proxy-metric" containerID="cri-o://762bc926182fe44b4af81804c22991bc1dff6e23202c8882e8000e3e2aaad012" gracePeriod=120 Mar 08 04:03:15.298541 master-0 kubenswrapper[18592]: I0308 04:03:15.298336 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="cacb9582-2132-4543-8a31-7b100ba4dd2f" containerName="prom-label-proxy" containerID="cri-o://94dbcdb89aa0f263874521912d133035369312397d1ad983f2732352b74fb994" gracePeriod=120 Mar 08 04:03:15.298541 master-0 kubenswrapper[18592]: I0308 04:03:15.298277 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" 
podUID="cacb9582-2132-4543-8a31-7b100ba4dd2f" containerName="config-reloader" containerID="cri-o://34335e826503ea10aadf683d38cad805f434f5b33127bda531c7acdcb51fa3a5" gracePeriod=120 Mar 08 04:03:15.298541 master-0 kubenswrapper[18592]: I0308 04:03:15.298231 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="cacb9582-2132-4543-8a31-7b100ba4dd2f" containerName="kube-rbac-proxy" containerID="cri-o://d23a898658aeb28f11810d92075f16eb074d90b8b3a0a2f14b7f1f5e6e3c3074" gracePeriod=120 Mar 08 04:03:16.236336 master-0 kubenswrapper[18592]: I0308 04:03:16.236188 18592 generic.go:334] "Generic (PLEG): container finished" podID="cacb9582-2132-4543-8a31-7b100ba4dd2f" containerID="94dbcdb89aa0f263874521912d133035369312397d1ad983f2732352b74fb994" exitCode=0 Mar 08 04:03:16.236336 master-0 kubenswrapper[18592]: I0308 04:03:16.236242 18592 generic.go:334] "Generic (PLEG): container finished" podID="cacb9582-2132-4543-8a31-7b100ba4dd2f" containerID="d23a898658aeb28f11810d92075f16eb074d90b8b3a0a2f14b7f1f5e6e3c3074" exitCode=0 Mar 08 04:03:16.236336 master-0 kubenswrapper[18592]: I0308 04:03:16.236258 18592 generic.go:334] "Generic (PLEG): container finished" podID="cacb9582-2132-4543-8a31-7b100ba4dd2f" containerID="34335e826503ea10aadf683d38cad805f434f5b33127bda531c7acdcb51fa3a5" exitCode=0 Mar 08 04:03:16.236336 master-0 kubenswrapper[18592]: I0308 04:03:16.236274 18592 generic.go:334] "Generic (PLEG): container finished" podID="cacb9582-2132-4543-8a31-7b100ba4dd2f" containerID="8ffa2ebcaa54a41e98873119e7f53fb354c1c0439475fce800c91d93f862c5b3" exitCode=0 Mar 08 04:03:16.236336 master-0 kubenswrapper[18592]: I0308 04:03:16.236301 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cacb9582-2132-4543-8a31-7b100ba4dd2f","Type":"ContainerDied","Data":"94dbcdb89aa0f263874521912d133035369312397d1ad983f2732352b74fb994"} Mar 08 04:03:16.236336 
master-0 kubenswrapper[18592]: I0308 04:03:16.236335 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cacb9582-2132-4543-8a31-7b100ba4dd2f","Type":"ContainerDied","Data":"d23a898658aeb28f11810d92075f16eb074d90b8b3a0a2f14b7f1f5e6e3c3074"} Mar 08 04:03:16.237230 master-0 kubenswrapper[18592]: I0308 04:03:16.236358 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cacb9582-2132-4543-8a31-7b100ba4dd2f","Type":"ContainerDied","Data":"34335e826503ea10aadf683d38cad805f434f5b33127bda531c7acdcb51fa3a5"} Mar 08 04:03:16.237230 master-0 kubenswrapper[18592]: I0308 04:03:16.236377 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cacb9582-2132-4543-8a31-7b100ba4dd2f","Type":"ContainerDied","Data":"8ffa2ebcaa54a41e98873119e7f53fb354c1c0439475fce800c91d93f862c5b3"} Mar 08 04:03:16.921992 master-0 kubenswrapper[18592]: I0308 04:03:16.921934 18592 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 08 04:03:16.962256 master-0 kubenswrapper[18592]: I0308 04:03:16.961333 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cacb9582-2132-4543-8a31-7b100ba4dd2f-metrics-client-ca\") pod \"cacb9582-2132-4543-8a31-7b100ba4dd2f\" (UID: \"cacb9582-2132-4543-8a31-7b100ba4dd2f\") " Mar 08 04:03:16.962256 master-0 kubenswrapper[18592]: I0308 04:03:16.961435 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cacb9582-2132-4543-8a31-7b100ba4dd2f-alertmanager-trusted-ca-bundle\") pod \"cacb9582-2132-4543-8a31-7b100ba4dd2f\" (UID: \"cacb9582-2132-4543-8a31-7b100ba4dd2f\") " Mar 08 04:03:16.962256 master-0 kubenswrapper[18592]: I0308 04:03:16.961514 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cacb9582-2132-4543-8a31-7b100ba4dd2f-secret-alertmanager-kube-rbac-proxy-web\") pod \"cacb9582-2132-4543-8a31-7b100ba4dd2f\" (UID: \"cacb9582-2132-4543-8a31-7b100ba4dd2f\") " Mar 08 04:03:16.962256 master-0 kubenswrapper[18592]: I0308 04:03:16.961555 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/cacb9582-2132-4543-8a31-7b100ba4dd2f-secret-alertmanager-main-tls\") pod \"cacb9582-2132-4543-8a31-7b100ba4dd2f\" (UID: \"cacb9582-2132-4543-8a31-7b100ba4dd2f\") " Mar 08 04:03:16.962256 master-0 kubenswrapper[18592]: I0308 04:03:16.961605 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/cacb9582-2132-4543-8a31-7b100ba4dd2f-alertmanager-main-db\") pod \"cacb9582-2132-4543-8a31-7b100ba4dd2f\" (UID: 
\"cacb9582-2132-4543-8a31-7b100ba4dd2f\") " Mar 08 04:03:16.962256 master-0 kubenswrapper[18592]: I0308 04:03:16.961628 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzp7p\" (UniqueName: \"kubernetes.io/projected/cacb9582-2132-4543-8a31-7b100ba4dd2f-kube-api-access-lzp7p\") pod \"cacb9582-2132-4543-8a31-7b100ba4dd2f\" (UID: \"cacb9582-2132-4543-8a31-7b100ba4dd2f\") " Mar 08 04:03:16.962256 master-0 kubenswrapper[18592]: I0308 04:03:16.961650 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cacb9582-2132-4543-8a31-7b100ba4dd2f-tls-assets\") pod \"cacb9582-2132-4543-8a31-7b100ba4dd2f\" (UID: \"cacb9582-2132-4543-8a31-7b100ba4dd2f\") " Mar 08 04:03:16.962256 master-0 kubenswrapper[18592]: I0308 04:03:16.961687 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cacb9582-2132-4543-8a31-7b100ba4dd2f-config-out\") pod \"cacb9582-2132-4543-8a31-7b100ba4dd2f\" (UID: \"cacb9582-2132-4543-8a31-7b100ba4dd2f\") " Mar 08 04:03:16.962256 master-0 kubenswrapper[18592]: I0308 04:03:16.961708 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cacb9582-2132-4543-8a31-7b100ba4dd2f-web-config\") pod \"cacb9582-2132-4543-8a31-7b100ba4dd2f\" (UID: \"cacb9582-2132-4543-8a31-7b100ba4dd2f\") " Mar 08 04:03:16.962256 master-0 kubenswrapper[18592]: I0308 04:03:16.961771 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cacb9582-2132-4543-8a31-7b100ba4dd2f-secret-alertmanager-kube-rbac-proxy\") pod \"cacb9582-2132-4543-8a31-7b100ba4dd2f\" (UID: \"cacb9582-2132-4543-8a31-7b100ba4dd2f\") " Mar 08 04:03:16.962256 master-0 kubenswrapper[18592]: I0308 04:03:16.961814 18592 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/cacb9582-2132-4543-8a31-7b100ba4dd2f-config-volume\") pod \"cacb9582-2132-4543-8a31-7b100ba4dd2f\" (UID: \"cacb9582-2132-4543-8a31-7b100ba4dd2f\") " Mar 08 04:03:16.962256 master-0 kubenswrapper[18592]: I0308 04:03:16.961883 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/cacb9582-2132-4543-8a31-7b100ba4dd2f-secret-alertmanager-kube-rbac-proxy-metric\") pod \"cacb9582-2132-4543-8a31-7b100ba4dd2f\" (UID: \"cacb9582-2132-4543-8a31-7b100ba4dd2f\") " Mar 08 04:03:16.966538 master-0 kubenswrapper[18592]: I0308 04:03:16.965550 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cacb9582-2132-4543-8a31-7b100ba4dd2f-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "cacb9582-2132-4543-8a31-7b100ba4dd2f" (UID: "cacb9582-2132-4543-8a31-7b100ba4dd2f"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:03:16.966859 master-0 kubenswrapper[18592]: I0308 04:03:16.966749 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cacb9582-2132-4543-8a31-7b100ba4dd2f-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "cacb9582-2132-4543-8a31-7b100ba4dd2f" (UID: "cacb9582-2132-4543-8a31-7b100ba4dd2f"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:03:16.967891 master-0 kubenswrapper[18592]: I0308 04:03:16.967854 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cacb9582-2132-4543-8a31-7b100ba4dd2f-config-volume" (OuterVolumeSpecName: "config-volume") pod "cacb9582-2132-4543-8a31-7b100ba4dd2f" (UID: "cacb9582-2132-4543-8a31-7b100ba4dd2f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:03:16.969845 master-0 kubenswrapper[18592]: I0308 04:03:16.969741 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cacb9582-2132-4543-8a31-7b100ba4dd2f-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "cacb9582-2132-4543-8a31-7b100ba4dd2f" (UID: "cacb9582-2132-4543-8a31-7b100ba4dd2f"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:03:16.970517 master-0 kubenswrapper[18592]: I0308 04:03:16.970467 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cacb9582-2132-4543-8a31-7b100ba4dd2f-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "cacb9582-2132-4543-8a31-7b100ba4dd2f" (UID: "cacb9582-2132-4543-8a31-7b100ba4dd2f"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:03:16.972474 master-0 kubenswrapper[18592]: I0308 04:03:16.972186 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cacb9582-2132-4543-8a31-7b100ba4dd2f-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "cacb9582-2132-4543-8a31-7b100ba4dd2f" (UID: "cacb9582-2132-4543-8a31-7b100ba4dd2f"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:03:16.973207 master-0 kubenswrapper[18592]: I0308 04:03:16.973173 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cacb9582-2132-4543-8a31-7b100ba4dd2f-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "cacb9582-2132-4543-8a31-7b100ba4dd2f" (UID: "cacb9582-2132-4543-8a31-7b100ba4dd2f"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 04:03:16.973858 master-0 kubenswrapper[18592]: I0308 04:03:16.973770 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cacb9582-2132-4543-8a31-7b100ba4dd2f-kube-api-access-lzp7p" (OuterVolumeSpecName: "kube-api-access-lzp7p") pod "cacb9582-2132-4543-8a31-7b100ba4dd2f" (UID: "cacb9582-2132-4543-8a31-7b100ba4dd2f"). InnerVolumeSpecName "kube-api-access-lzp7p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 04:03:16.988376 master-0 kubenswrapper[18592]: I0308 04:03:16.987796 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cacb9582-2132-4543-8a31-7b100ba4dd2f-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "cacb9582-2132-4543-8a31-7b100ba4dd2f" (UID: "cacb9582-2132-4543-8a31-7b100ba4dd2f"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 04:03:16.988376 master-0 kubenswrapper[18592]: I0308 04:03:16.987803 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cacb9582-2132-4543-8a31-7b100ba4dd2f-config-out" (OuterVolumeSpecName: "config-out") pod "cacb9582-2132-4543-8a31-7b100ba4dd2f" (UID: "cacb9582-2132-4543-8a31-7b100ba4dd2f"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 04:03:16.988376 master-0 kubenswrapper[18592]: I0308 04:03:16.987963 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cacb9582-2132-4543-8a31-7b100ba4dd2f-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "cacb9582-2132-4543-8a31-7b100ba4dd2f" (UID: "cacb9582-2132-4543-8a31-7b100ba4dd2f"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:03:17.023118 master-0 kubenswrapper[18592]: I0308 04:03:17.023050 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cacb9582-2132-4543-8a31-7b100ba4dd2f-web-config" (OuterVolumeSpecName: "web-config") pod "cacb9582-2132-4543-8a31-7b100ba4dd2f" (UID: "cacb9582-2132-4543-8a31-7b100ba4dd2f"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:03:17.063689 master-0 kubenswrapper[18592]: I0308 04:03:17.063621 18592 reconciler_common.go:293] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/cacb9582-2132-4543-8a31-7b100ba4dd2f-secret-alertmanager-main-tls\") on node \"master-0\" DevicePath \"\"" Mar 08 04:03:17.063689 master-0 kubenswrapper[18592]: I0308 04:03:17.063675 18592 reconciler_common.go:293] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/cacb9582-2132-4543-8a31-7b100ba4dd2f-alertmanager-main-db\") on node \"master-0\" DevicePath \"\"" Mar 08 04:03:17.063689 master-0 kubenswrapper[18592]: I0308 04:03:17.063690 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzp7p\" (UniqueName: \"kubernetes.io/projected/cacb9582-2132-4543-8a31-7b100ba4dd2f-kube-api-access-lzp7p\") on node \"master-0\" DevicePath \"\"" Mar 08 04:03:17.064086 master-0 kubenswrapper[18592]: I0308 04:03:17.063705 18592 
reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cacb9582-2132-4543-8a31-7b100ba4dd2f-tls-assets\") on node \"master-0\" DevicePath \"\"" Mar 08 04:03:17.064086 master-0 kubenswrapper[18592]: I0308 04:03:17.063719 18592 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cacb9582-2132-4543-8a31-7b100ba4dd2f-config-out\") on node \"master-0\" DevicePath \"\"" Mar 08 04:03:17.064086 master-0 kubenswrapper[18592]: I0308 04:03:17.063733 18592 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cacb9582-2132-4543-8a31-7b100ba4dd2f-web-config\") on node \"master-0\" DevicePath \"\"" Mar 08 04:03:17.064086 master-0 kubenswrapper[18592]: I0308 04:03:17.063746 18592 reconciler_common.go:293] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cacb9582-2132-4543-8a31-7b100ba4dd2f-secret-alertmanager-kube-rbac-proxy\") on node \"master-0\" DevicePath \"\"" Mar 08 04:03:17.064086 master-0 kubenswrapper[18592]: I0308 04:03:17.063760 18592 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/cacb9582-2132-4543-8a31-7b100ba4dd2f-config-volume\") on node \"master-0\" DevicePath \"\"" Mar 08 04:03:17.064086 master-0 kubenswrapper[18592]: I0308 04:03:17.063776 18592 reconciler_common.go:293] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/cacb9582-2132-4543-8a31-7b100ba4dd2f-secret-alertmanager-kube-rbac-proxy-metric\") on node \"master-0\" DevicePath \"\"" Mar 08 04:03:17.064086 master-0 kubenswrapper[18592]: I0308 04:03:17.063789 18592 reconciler_common.go:293] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cacb9582-2132-4543-8a31-7b100ba4dd2f-metrics-client-ca\") on node \"master-0\" DevicePath \"\"" Mar 08 
04:03:17.064086 master-0 kubenswrapper[18592]: I0308 04:03:17.063804 18592 reconciler_common.go:293] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cacb9582-2132-4543-8a31-7b100ba4dd2f-alertmanager-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 04:03:17.064086 master-0 kubenswrapper[18592]: I0308 04:03:17.063817 18592 reconciler_common.go:293] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cacb9582-2132-4543-8a31-7b100ba4dd2f-secret-alertmanager-kube-rbac-proxy-web\") on node \"master-0\" DevicePath \"\"" Mar 08 04:03:17.298176 master-0 kubenswrapper[18592]: I0308 04:03:17.298118 18592 generic.go:334] "Generic (PLEG): container finished" podID="cacb9582-2132-4543-8a31-7b100ba4dd2f" containerID="762bc926182fe44b4af81804c22991bc1dff6e23202c8882e8000e3e2aaad012" exitCode=0 Mar 08 04:03:17.298401 master-0 kubenswrapper[18592]: I0308 04:03:17.298386 18592 generic.go:334] "Generic (PLEG): container finished" podID="cacb9582-2132-4543-8a31-7b100ba4dd2f" containerID="5b6b052f8bf0820cfe48407dc8704560b66216dfe0ff3c4511019bfcdbea1ef2" exitCode=0 Mar 08 04:03:17.298499 master-0 kubenswrapper[18592]: I0308 04:03:17.298482 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cacb9582-2132-4543-8a31-7b100ba4dd2f","Type":"ContainerDied","Data":"762bc926182fe44b4af81804c22991bc1dff6e23202c8882e8000e3e2aaad012"} Mar 08 04:03:17.298592 master-0 kubenswrapper[18592]: I0308 04:03:17.298580 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cacb9582-2132-4543-8a31-7b100ba4dd2f","Type":"ContainerDied","Data":"5b6b052f8bf0820cfe48407dc8704560b66216dfe0ff3c4511019bfcdbea1ef2"} Mar 08 04:03:17.298682 master-0 kubenswrapper[18592]: I0308 04:03:17.298670 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cacb9582-2132-4543-8a31-7b100ba4dd2f","Type":"ContainerDied","Data":"3d93dc780d604df774e75a0674240fd776184772216c04f04a16c061ca8f1739"} Mar 08 04:03:17.298782 master-0 kubenswrapper[18592]: I0308 04:03:17.298745 18592 scope.go:117] "RemoveContainer" containerID="94dbcdb89aa0f263874521912d133035369312397d1ad983f2732352b74fb994" Mar 08 04:03:17.298912 master-0 kubenswrapper[18592]: I0308 04:03:17.298861 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 08 04:03:17.317186 master-0 kubenswrapper[18592]: I0308 04:03:17.317154 18592 scope.go:117] "RemoveContainer" containerID="762bc926182fe44b4af81804c22991bc1dff6e23202c8882e8000e3e2aaad012" Mar 08 04:03:17.344987 master-0 kubenswrapper[18592]: I0308 04:03:17.344879 18592 scope.go:117] "RemoveContainer" containerID="d23a898658aeb28f11810d92075f16eb074d90b8b3a0a2f14b7f1f5e6e3c3074" Mar 08 04:03:17.346483 master-0 kubenswrapper[18592]: I0308 04:03:17.346421 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 08 04:03:17.353942 master-0 kubenswrapper[18592]: I0308 04:03:17.353220 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 08 04:03:17.376234 master-0 kubenswrapper[18592]: I0308 04:03:17.372085 18592 scope.go:117] "RemoveContainer" containerID="5b6b052f8bf0820cfe48407dc8704560b66216dfe0ff3c4511019bfcdbea1ef2" Mar 08 04:03:17.394667 master-0 kubenswrapper[18592]: I0308 04:03:17.394597 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 08 04:03:17.394955 master-0 kubenswrapper[18592]: E0308 04:03:17.394883 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cacb9582-2132-4543-8a31-7b100ba4dd2f" containerName="prom-label-proxy" Mar 08 04:03:17.394955 master-0 kubenswrapper[18592]: I0308 
04:03:17.394900 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="cacb9582-2132-4543-8a31-7b100ba4dd2f" containerName="prom-label-proxy" Mar 08 04:03:17.394955 master-0 kubenswrapper[18592]: E0308 04:03:17.394926 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cacb9582-2132-4543-8a31-7b100ba4dd2f" containerName="kube-rbac-proxy-metric" Mar 08 04:03:17.394955 master-0 kubenswrapper[18592]: I0308 04:03:17.394932 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="cacb9582-2132-4543-8a31-7b100ba4dd2f" containerName="kube-rbac-proxy-metric" Mar 08 04:03:17.394955 master-0 kubenswrapper[18592]: E0308 04:03:17.394943 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cacb9582-2132-4543-8a31-7b100ba4dd2f" containerName="init-config-reloader" Mar 08 04:03:17.394955 master-0 kubenswrapper[18592]: I0308 04:03:17.394949 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="cacb9582-2132-4543-8a31-7b100ba4dd2f" containerName="init-config-reloader" Mar 08 04:03:17.394955 master-0 kubenswrapper[18592]: E0308 04:03:17.394959 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cacb9582-2132-4543-8a31-7b100ba4dd2f" containerName="config-reloader" Mar 08 04:03:17.394955 master-0 kubenswrapper[18592]: I0308 04:03:17.394966 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="cacb9582-2132-4543-8a31-7b100ba4dd2f" containerName="config-reloader" Mar 08 04:03:17.395260 master-0 kubenswrapper[18592]: E0308 04:03:17.394977 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cacb9582-2132-4543-8a31-7b100ba4dd2f" containerName="kube-rbac-proxy" Mar 08 04:03:17.395260 master-0 kubenswrapper[18592]: I0308 04:03:17.394983 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="cacb9582-2132-4543-8a31-7b100ba4dd2f" containerName="kube-rbac-proxy" Mar 08 04:03:17.395260 master-0 kubenswrapper[18592]: E0308 04:03:17.394996 18592 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="cacb9582-2132-4543-8a31-7b100ba4dd2f" containerName="kube-rbac-proxy-web" Mar 08 04:03:17.395260 master-0 kubenswrapper[18592]: I0308 04:03:17.395003 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="cacb9582-2132-4543-8a31-7b100ba4dd2f" containerName="kube-rbac-proxy-web" Mar 08 04:03:17.395260 master-0 kubenswrapper[18592]: E0308 04:03:17.395014 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cacb9582-2132-4543-8a31-7b100ba4dd2f" containerName="alertmanager" Mar 08 04:03:17.395260 master-0 kubenswrapper[18592]: I0308 04:03:17.395019 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="cacb9582-2132-4543-8a31-7b100ba4dd2f" containerName="alertmanager" Mar 08 04:03:17.395260 master-0 kubenswrapper[18592]: I0308 04:03:17.395139 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="cacb9582-2132-4543-8a31-7b100ba4dd2f" containerName="config-reloader" Mar 08 04:03:17.395260 master-0 kubenswrapper[18592]: I0308 04:03:17.395157 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="cacb9582-2132-4543-8a31-7b100ba4dd2f" containerName="kube-rbac-proxy-web" Mar 08 04:03:17.395260 master-0 kubenswrapper[18592]: I0308 04:03:17.395178 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="cacb9582-2132-4543-8a31-7b100ba4dd2f" containerName="alertmanager" Mar 08 04:03:17.395260 master-0 kubenswrapper[18592]: I0308 04:03:17.395188 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="cacb9582-2132-4543-8a31-7b100ba4dd2f" containerName="kube-rbac-proxy" Mar 08 04:03:17.395260 master-0 kubenswrapper[18592]: I0308 04:03:17.395202 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="cacb9582-2132-4543-8a31-7b100ba4dd2f" containerName="kube-rbac-proxy-metric" Mar 08 04:03:17.395260 master-0 kubenswrapper[18592]: I0308 04:03:17.395213 18592 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="cacb9582-2132-4543-8a31-7b100ba4dd2f" containerName="prom-label-proxy" Mar 08 04:03:17.397094 master-0 kubenswrapper[18592]: I0308 04:03:17.397025 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 08 04:03:17.399853 master-0 kubenswrapper[18592]: I0308 04:03:17.399799 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-web" Mar 08 04:03:17.400037 master-0 kubenswrapper[18592]: I0308 04:03:17.400011 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-web-config" Mar 08 04:03:17.400158 master-0 kubenswrapper[18592]: I0308 04:03:17.400144 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy" Mar 08 04:03:17.400455 master-0 kubenswrapper[18592]: I0308 04:03:17.400441 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-dockercfg-65bbw" Mar 08 04:03:17.400528 master-0 kubenswrapper[18592]: I0308 04:03:17.400450 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-generated" Mar 08 04:03:17.400596 master-0 kubenswrapper[18592]: I0308 04:03:17.400579 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls" Mar 08 04:03:17.400751 master-0 kubenswrapper[18592]: I0308 04:03:17.400739 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-metric" Mar 08 04:03:17.401016 master-0 kubenswrapper[18592]: I0308 04:03:17.401003 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls-assets-0" Mar 08 04:03:17.408916 master-0 kubenswrapper[18592]: I0308 04:03:17.408805 18592 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-monitoring"/"alertmanager-trusted-ca-bundle" Mar 08 04:03:17.410268 master-0 kubenswrapper[18592]: I0308 04:03:17.410111 18592 scope.go:117] "RemoveContainer" containerID="34335e826503ea10aadf683d38cad805f434f5b33127bda531c7acdcb51fa3a5" Mar 08 04:03:17.416700 master-0 kubenswrapper[18592]: I0308 04:03:17.416640 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 08 04:03:17.457711 master-0 kubenswrapper[18592]: I0308 04:03:17.457547 18592 scope.go:117] "RemoveContainer" containerID="8ffa2ebcaa54a41e98873119e7f53fb354c1c0439475fce800c91d93f862c5b3" Mar 08 04:03:17.472889 master-0 kubenswrapper[18592]: I0308 04:03:17.472817 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c68746bf-7562-4da0-85e5-dce1ad9786b3-web-config\") pod \"alertmanager-main-0\" (UID: \"c68746bf-7562-4da0-85e5-dce1ad9786b3\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 04:03:17.473078 master-0 kubenswrapper[18592]: I0308 04:03:17.472936 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/c68746bf-7562-4da0-85e5-dce1ad9786b3-config-volume\") pod \"alertmanager-main-0\" (UID: \"c68746bf-7562-4da0-85e5-dce1ad9786b3\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 04:03:17.473078 master-0 kubenswrapper[18592]: I0308 04:03:17.472977 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c68746bf-7562-4da0-85e5-dce1ad9786b3-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"c68746bf-7562-4da0-85e5-dce1ad9786b3\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 04:03:17.473078 master-0 kubenswrapper[18592]: I0308 04:03:17.473008 18592 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c68746bf-7562-4da0-85e5-dce1ad9786b3-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"c68746bf-7562-4da0-85e5-dce1ad9786b3\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 04:03:17.473078 master-0 kubenswrapper[18592]: I0308 04:03:17.473052 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqhnk\" (UniqueName: \"kubernetes.io/projected/c68746bf-7562-4da0-85e5-dce1ad9786b3-kube-api-access-rqhnk\") pod \"alertmanager-main-0\" (UID: \"c68746bf-7562-4da0-85e5-dce1ad9786b3\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 04:03:17.473208 master-0 kubenswrapper[18592]: I0308 04:03:17.473092 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c68746bf-7562-4da0-85e5-dce1ad9786b3-tls-assets\") pod \"alertmanager-main-0\" (UID: \"c68746bf-7562-4da0-85e5-dce1ad9786b3\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 04:03:17.473208 master-0 kubenswrapper[18592]: I0308 04:03:17.473119 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/c68746bf-7562-4da0-85e5-dce1ad9786b3-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"c68746bf-7562-4da0-85e5-dce1ad9786b3\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 04:03:17.473208 master-0 kubenswrapper[18592]: I0308 04:03:17.473145 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/c68746bf-7562-4da0-85e5-dce1ad9786b3-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: 
\"c68746bf-7562-4da0-85e5-dce1ad9786b3\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 04:03:17.473208 master-0 kubenswrapper[18592]: I0308 04:03:17.473174 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c68746bf-7562-4da0-85e5-dce1ad9786b3-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"c68746bf-7562-4da0-85e5-dce1ad9786b3\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 04:03:17.473208 master-0 kubenswrapper[18592]: I0308 04:03:17.473199 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c68746bf-7562-4da0-85e5-dce1ad9786b3-config-out\") pod \"alertmanager-main-0\" (UID: \"c68746bf-7562-4da0-85e5-dce1ad9786b3\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 04:03:17.473504 master-0 kubenswrapper[18592]: I0308 04:03:17.473240 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c68746bf-7562-4da0-85e5-dce1ad9786b3-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"c68746bf-7562-4da0-85e5-dce1ad9786b3\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 04:03:17.473504 master-0 kubenswrapper[18592]: I0308 04:03:17.473267 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/c68746bf-7562-4da0-85e5-dce1ad9786b3-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"c68746bf-7562-4da0-85e5-dce1ad9786b3\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 04:03:17.480031 master-0 kubenswrapper[18592]: I0308 04:03:17.479989 18592 scope.go:117] "RemoveContainer" 
containerID="296f822f770e0810353c5af5937b6f7a98bd88f53e04415bc2c63dbf385c5929" Mar 08 04:03:17.495309 master-0 kubenswrapper[18592]: I0308 04:03:17.495012 18592 scope.go:117] "RemoveContainer" containerID="94dbcdb89aa0f263874521912d133035369312397d1ad983f2732352b74fb994" Mar 08 04:03:17.495471 master-0 kubenswrapper[18592]: E0308 04:03:17.495439 18592 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94dbcdb89aa0f263874521912d133035369312397d1ad983f2732352b74fb994\": container with ID starting with 94dbcdb89aa0f263874521912d133035369312397d1ad983f2732352b74fb994 not found: ID does not exist" containerID="94dbcdb89aa0f263874521912d133035369312397d1ad983f2732352b74fb994" Mar 08 04:03:17.495524 master-0 kubenswrapper[18592]: I0308 04:03:17.495491 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94dbcdb89aa0f263874521912d133035369312397d1ad983f2732352b74fb994"} err="failed to get container status \"94dbcdb89aa0f263874521912d133035369312397d1ad983f2732352b74fb994\": rpc error: code = NotFound desc = could not find container \"94dbcdb89aa0f263874521912d133035369312397d1ad983f2732352b74fb994\": container with ID starting with 94dbcdb89aa0f263874521912d133035369312397d1ad983f2732352b74fb994 not found: ID does not exist" Mar 08 04:03:17.495524 master-0 kubenswrapper[18592]: I0308 04:03:17.495512 18592 scope.go:117] "RemoveContainer" containerID="762bc926182fe44b4af81804c22991bc1dff6e23202c8882e8000e3e2aaad012" Mar 08 04:03:17.496133 master-0 kubenswrapper[18592]: E0308 04:03:17.496105 18592 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"762bc926182fe44b4af81804c22991bc1dff6e23202c8882e8000e3e2aaad012\": container with ID starting with 762bc926182fe44b4af81804c22991bc1dff6e23202c8882e8000e3e2aaad012 not found: ID does not exist" 
containerID="762bc926182fe44b4af81804c22991bc1dff6e23202c8882e8000e3e2aaad012" Mar 08 04:03:17.496202 master-0 kubenswrapper[18592]: I0308 04:03:17.496144 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"762bc926182fe44b4af81804c22991bc1dff6e23202c8882e8000e3e2aaad012"} err="failed to get container status \"762bc926182fe44b4af81804c22991bc1dff6e23202c8882e8000e3e2aaad012\": rpc error: code = NotFound desc = could not find container \"762bc926182fe44b4af81804c22991bc1dff6e23202c8882e8000e3e2aaad012\": container with ID starting with 762bc926182fe44b4af81804c22991bc1dff6e23202c8882e8000e3e2aaad012 not found: ID does not exist" Mar 08 04:03:17.496202 master-0 kubenswrapper[18592]: I0308 04:03:17.496157 18592 scope.go:117] "RemoveContainer" containerID="d23a898658aeb28f11810d92075f16eb074d90b8b3a0a2f14b7f1f5e6e3c3074" Mar 08 04:03:17.496883 master-0 kubenswrapper[18592]: E0308 04:03:17.496837 18592 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d23a898658aeb28f11810d92075f16eb074d90b8b3a0a2f14b7f1f5e6e3c3074\": container with ID starting with d23a898658aeb28f11810d92075f16eb074d90b8b3a0a2f14b7f1f5e6e3c3074 not found: ID does not exist" containerID="d23a898658aeb28f11810d92075f16eb074d90b8b3a0a2f14b7f1f5e6e3c3074" Mar 08 04:03:17.496955 master-0 kubenswrapper[18592]: I0308 04:03:17.496898 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d23a898658aeb28f11810d92075f16eb074d90b8b3a0a2f14b7f1f5e6e3c3074"} err="failed to get container status \"d23a898658aeb28f11810d92075f16eb074d90b8b3a0a2f14b7f1f5e6e3c3074\": rpc error: code = NotFound desc = could not find container \"d23a898658aeb28f11810d92075f16eb074d90b8b3a0a2f14b7f1f5e6e3c3074\": container with ID starting with d23a898658aeb28f11810d92075f16eb074d90b8b3a0a2f14b7f1f5e6e3c3074 not found: ID does not exist" Mar 08 04:03:17.496955 master-0 
kubenswrapper[18592]: I0308 04:03:17.496934 18592 scope.go:117] "RemoveContainer" containerID="5b6b052f8bf0820cfe48407dc8704560b66216dfe0ff3c4511019bfcdbea1ef2"
Mar 08 04:03:17.497564 master-0 kubenswrapper[18592]: E0308 04:03:17.497484 18592 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b6b052f8bf0820cfe48407dc8704560b66216dfe0ff3c4511019bfcdbea1ef2\": container with ID starting with 5b6b052f8bf0820cfe48407dc8704560b66216dfe0ff3c4511019bfcdbea1ef2 not found: ID does not exist" containerID="5b6b052f8bf0820cfe48407dc8704560b66216dfe0ff3c4511019bfcdbea1ef2"
Mar 08 04:03:17.497610 master-0 kubenswrapper[18592]: I0308 04:03:17.497560 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b6b052f8bf0820cfe48407dc8704560b66216dfe0ff3c4511019bfcdbea1ef2"} err="failed to get container status \"5b6b052f8bf0820cfe48407dc8704560b66216dfe0ff3c4511019bfcdbea1ef2\": rpc error: code = NotFound desc = could not find container \"5b6b052f8bf0820cfe48407dc8704560b66216dfe0ff3c4511019bfcdbea1ef2\": container with ID starting with 5b6b052f8bf0820cfe48407dc8704560b66216dfe0ff3c4511019bfcdbea1ef2 not found: ID does not exist"
Mar 08 04:03:17.497610 master-0 kubenswrapper[18592]: I0308 04:03:17.497581 18592 scope.go:117] "RemoveContainer" containerID="34335e826503ea10aadf683d38cad805f434f5b33127bda531c7acdcb51fa3a5"
Mar 08 04:03:17.497835 master-0 kubenswrapper[18592]: E0308 04:03:17.497786 18592 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34335e826503ea10aadf683d38cad805f434f5b33127bda531c7acdcb51fa3a5\": container with ID starting with 34335e826503ea10aadf683d38cad805f434f5b33127bda531c7acdcb51fa3a5 not found: ID does not exist" containerID="34335e826503ea10aadf683d38cad805f434f5b33127bda531c7acdcb51fa3a5"
Mar 08 04:03:17.497890 master-0 kubenswrapper[18592]: I0308 04:03:17.497816 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34335e826503ea10aadf683d38cad805f434f5b33127bda531c7acdcb51fa3a5"} err="failed to get container status \"34335e826503ea10aadf683d38cad805f434f5b33127bda531c7acdcb51fa3a5\": rpc error: code = NotFound desc = could not find container \"34335e826503ea10aadf683d38cad805f434f5b33127bda531c7acdcb51fa3a5\": container with ID starting with 34335e826503ea10aadf683d38cad805f434f5b33127bda531c7acdcb51fa3a5 not found: ID does not exist"
Mar 08 04:03:17.497890 master-0 kubenswrapper[18592]: I0308 04:03:17.497845 18592 scope.go:117] "RemoveContainer" containerID="8ffa2ebcaa54a41e98873119e7f53fb354c1c0439475fce800c91d93f862c5b3"
Mar 08 04:03:17.498042 master-0 kubenswrapper[18592]: E0308 04:03:17.498009 18592 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ffa2ebcaa54a41e98873119e7f53fb354c1c0439475fce800c91d93f862c5b3\": container with ID starting with 8ffa2ebcaa54a41e98873119e7f53fb354c1c0439475fce800c91d93f862c5b3 not found: ID does not exist" containerID="8ffa2ebcaa54a41e98873119e7f53fb354c1c0439475fce800c91d93f862c5b3"
Mar 08 04:03:17.498080 master-0 kubenswrapper[18592]: I0308 04:03:17.498036 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ffa2ebcaa54a41e98873119e7f53fb354c1c0439475fce800c91d93f862c5b3"} err="failed to get container status \"8ffa2ebcaa54a41e98873119e7f53fb354c1c0439475fce800c91d93f862c5b3\": rpc error: code = NotFound desc = could not find container \"8ffa2ebcaa54a41e98873119e7f53fb354c1c0439475fce800c91d93f862c5b3\": container with ID starting with 8ffa2ebcaa54a41e98873119e7f53fb354c1c0439475fce800c91d93f862c5b3 not found: ID does not exist"
Mar 08 04:03:17.498080 master-0 kubenswrapper[18592]: I0308 04:03:17.498053 18592 scope.go:117] "RemoveContainer" containerID="296f822f770e0810353c5af5937b6f7a98bd88f53e04415bc2c63dbf385c5929"
Mar 08 04:03:17.498257 master-0 kubenswrapper[18592]: E0308 04:03:17.498225 18592 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"296f822f770e0810353c5af5937b6f7a98bd88f53e04415bc2c63dbf385c5929\": container with ID starting with 296f822f770e0810353c5af5937b6f7a98bd88f53e04415bc2c63dbf385c5929 not found: ID does not exist" containerID="296f822f770e0810353c5af5937b6f7a98bd88f53e04415bc2c63dbf385c5929"
Mar 08 04:03:17.498295 master-0 kubenswrapper[18592]: I0308 04:03:17.498254 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"296f822f770e0810353c5af5937b6f7a98bd88f53e04415bc2c63dbf385c5929"} err="failed to get container status \"296f822f770e0810353c5af5937b6f7a98bd88f53e04415bc2c63dbf385c5929\": rpc error: code = NotFound desc = could not find container \"296f822f770e0810353c5af5937b6f7a98bd88f53e04415bc2c63dbf385c5929\": container with ID starting with 296f822f770e0810353c5af5937b6f7a98bd88f53e04415bc2c63dbf385c5929 not found: ID does not exist"
Mar 08 04:03:17.498295 master-0 kubenswrapper[18592]: I0308 04:03:17.498270 18592 scope.go:117] "RemoveContainer" containerID="94dbcdb89aa0f263874521912d133035369312397d1ad983f2732352b74fb994"
Mar 08 04:03:17.498453 master-0 kubenswrapper[18592]: I0308 04:03:17.498416 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94dbcdb89aa0f263874521912d133035369312397d1ad983f2732352b74fb994"} err="failed to get container status \"94dbcdb89aa0f263874521912d133035369312397d1ad983f2732352b74fb994\": rpc error: code = NotFound desc = could not find container \"94dbcdb89aa0f263874521912d133035369312397d1ad983f2732352b74fb994\": container with ID starting with 94dbcdb89aa0f263874521912d133035369312397d1ad983f2732352b74fb994 not found: ID does not exist"
Mar 08 04:03:17.498453 master-0 kubenswrapper[18592]: I0308 04:03:17.498438 18592 scope.go:117] "RemoveContainer" containerID="762bc926182fe44b4af81804c22991bc1dff6e23202c8882e8000e3e2aaad012"
Mar 08 04:03:17.498619 master-0 kubenswrapper[18592]: I0308 04:03:17.498580 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"762bc926182fe44b4af81804c22991bc1dff6e23202c8882e8000e3e2aaad012"} err="failed to get container status \"762bc926182fe44b4af81804c22991bc1dff6e23202c8882e8000e3e2aaad012\": rpc error: code = NotFound desc = could not find container \"762bc926182fe44b4af81804c22991bc1dff6e23202c8882e8000e3e2aaad012\": container with ID starting with 762bc926182fe44b4af81804c22991bc1dff6e23202c8882e8000e3e2aaad012 not found: ID does not exist"
Mar 08 04:03:17.498619 master-0 kubenswrapper[18592]: I0308 04:03:17.498605 18592 scope.go:117] "RemoveContainer" containerID="d23a898658aeb28f11810d92075f16eb074d90b8b3a0a2f14b7f1f5e6e3c3074"
Mar 08 04:03:17.498989 master-0 kubenswrapper[18592]: I0308 04:03:17.498776 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d23a898658aeb28f11810d92075f16eb074d90b8b3a0a2f14b7f1f5e6e3c3074"} err="failed to get container status \"d23a898658aeb28f11810d92075f16eb074d90b8b3a0a2f14b7f1f5e6e3c3074\": rpc error: code = NotFound desc = could not find container \"d23a898658aeb28f11810d92075f16eb074d90b8b3a0a2f14b7f1f5e6e3c3074\": container with ID starting with d23a898658aeb28f11810d92075f16eb074d90b8b3a0a2f14b7f1f5e6e3c3074 not found: ID does not exist"
Mar 08 04:03:17.498989 master-0 kubenswrapper[18592]: I0308 04:03:17.498803 18592 scope.go:117] "RemoveContainer" containerID="5b6b052f8bf0820cfe48407dc8704560b66216dfe0ff3c4511019bfcdbea1ef2"
Mar 08 04:03:17.499193 master-0 kubenswrapper[18592]: I0308 04:03:17.499119 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b6b052f8bf0820cfe48407dc8704560b66216dfe0ff3c4511019bfcdbea1ef2"} err="failed to get container status \"5b6b052f8bf0820cfe48407dc8704560b66216dfe0ff3c4511019bfcdbea1ef2\": rpc error: code = NotFound desc = could not find container \"5b6b052f8bf0820cfe48407dc8704560b66216dfe0ff3c4511019bfcdbea1ef2\": container with ID starting with 5b6b052f8bf0820cfe48407dc8704560b66216dfe0ff3c4511019bfcdbea1ef2 not found: ID does not exist"
Mar 08 04:03:17.499193 master-0 kubenswrapper[18592]: I0308 04:03:17.499143 18592 scope.go:117] "RemoveContainer" containerID="34335e826503ea10aadf683d38cad805f434f5b33127bda531c7acdcb51fa3a5"
Mar 08 04:03:17.499323 master-0 kubenswrapper[18592]: I0308 04:03:17.499305 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34335e826503ea10aadf683d38cad805f434f5b33127bda531c7acdcb51fa3a5"} err="failed to get container status \"34335e826503ea10aadf683d38cad805f434f5b33127bda531c7acdcb51fa3a5\": rpc error: code = NotFound desc = could not find container \"34335e826503ea10aadf683d38cad805f434f5b33127bda531c7acdcb51fa3a5\": container with ID starting with 34335e826503ea10aadf683d38cad805f434f5b33127bda531c7acdcb51fa3a5 not found: ID does not exist"
Mar 08 04:03:17.499358 master-0 kubenswrapper[18592]: I0308 04:03:17.499323 18592 scope.go:117] "RemoveContainer" containerID="8ffa2ebcaa54a41e98873119e7f53fb354c1c0439475fce800c91d93f862c5b3"
Mar 08 04:03:17.499480 master-0 kubenswrapper[18592]: I0308 04:03:17.499463 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ffa2ebcaa54a41e98873119e7f53fb354c1c0439475fce800c91d93f862c5b3"} err="failed to get container status \"8ffa2ebcaa54a41e98873119e7f53fb354c1c0439475fce800c91d93f862c5b3\": rpc error: code = NotFound desc = could not find container \"8ffa2ebcaa54a41e98873119e7f53fb354c1c0439475fce800c91d93f862c5b3\": container with ID starting with 8ffa2ebcaa54a41e98873119e7f53fb354c1c0439475fce800c91d93f862c5b3 not found: ID does not exist"
Mar 08 04:03:17.499514 master-0 kubenswrapper[18592]: I0308 04:03:17.499481 18592 scope.go:117] "RemoveContainer" containerID="296f822f770e0810353c5af5937b6f7a98bd88f53e04415bc2c63dbf385c5929"
Mar 08 04:03:17.499629 master-0 kubenswrapper[18592]: I0308 04:03:17.499613 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"296f822f770e0810353c5af5937b6f7a98bd88f53e04415bc2c63dbf385c5929"} err="failed to get container status \"296f822f770e0810353c5af5937b6f7a98bd88f53e04415bc2c63dbf385c5929\": rpc error: code = NotFound desc = could not find container \"296f822f770e0810353c5af5937b6f7a98bd88f53e04415bc2c63dbf385c5929\": container with ID starting with 296f822f770e0810353c5af5937b6f7a98bd88f53e04415bc2c63dbf385c5929 not found: ID does not exist"
Mar 08 04:03:17.575278 master-0 kubenswrapper[18592]: I0308 04:03:17.575154 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c68746bf-7562-4da0-85e5-dce1ad9786b3-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"c68746bf-7562-4da0-85e5-dce1ad9786b3\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 04:03:17.575278 master-0 kubenswrapper[18592]: I0308 04:03:17.575228 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c68746bf-7562-4da0-85e5-dce1ad9786b3-config-out\") pod \"alertmanager-main-0\" (UID: \"c68746bf-7562-4da0-85e5-dce1ad9786b3\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 04:03:17.575278 master-0 kubenswrapper[18592]: I0308 04:03:17.575263 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c68746bf-7562-4da0-85e5-dce1ad9786b3-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"c68746bf-7562-4da0-85e5-dce1ad9786b3\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 04:03:17.575506 master-0 kubenswrapper[18592]: I0308 04:03:17.575290 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/c68746bf-7562-4da0-85e5-dce1ad9786b3-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"c68746bf-7562-4da0-85e5-dce1ad9786b3\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 04:03:17.575506 master-0 kubenswrapper[18592]: I0308 04:03:17.575341 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c68746bf-7562-4da0-85e5-dce1ad9786b3-web-config\") pod \"alertmanager-main-0\" (UID: \"c68746bf-7562-4da0-85e5-dce1ad9786b3\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 04:03:17.575506 master-0 kubenswrapper[18592]: I0308 04:03:17.575395 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/c68746bf-7562-4da0-85e5-dce1ad9786b3-config-volume\") pod \"alertmanager-main-0\" (UID: \"c68746bf-7562-4da0-85e5-dce1ad9786b3\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 04:03:17.575506 master-0 kubenswrapper[18592]: I0308 04:03:17.575424 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c68746bf-7562-4da0-85e5-dce1ad9786b3-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"c68746bf-7562-4da0-85e5-dce1ad9786b3\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 04:03:17.575506 master-0 kubenswrapper[18592]: I0308 04:03:17.575451 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c68746bf-7562-4da0-85e5-dce1ad9786b3-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"c68746bf-7562-4da0-85e5-dce1ad9786b3\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 04:03:17.575506 master-0 kubenswrapper[18592]: I0308 04:03:17.575491 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqhnk\" (UniqueName: \"kubernetes.io/projected/c68746bf-7562-4da0-85e5-dce1ad9786b3-kube-api-access-rqhnk\") pod \"alertmanager-main-0\" (UID: \"c68746bf-7562-4da0-85e5-dce1ad9786b3\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 04:03:17.575669 master-0 kubenswrapper[18592]: I0308 04:03:17.575537 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c68746bf-7562-4da0-85e5-dce1ad9786b3-tls-assets\") pod \"alertmanager-main-0\" (UID: \"c68746bf-7562-4da0-85e5-dce1ad9786b3\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 04:03:17.575669 master-0 kubenswrapper[18592]: I0308 04:03:17.575561 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/c68746bf-7562-4da0-85e5-dce1ad9786b3-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"c68746bf-7562-4da0-85e5-dce1ad9786b3\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 04:03:17.575669 master-0 kubenswrapper[18592]: I0308 04:03:17.575585 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/c68746bf-7562-4da0-85e5-dce1ad9786b3-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"c68746bf-7562-4da0-85e5-dce1ad9786b3\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 04:03:17.577073 master-0 kubenswrapper[18592]: I0308 04:03:17.577041 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/c68746bf-7562-4da0-85e5-dce1ad9786b3-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"c68746bf-7562-4da0-85e5-dce1ad9786b3\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 04:03:17.578509 master-0 kubenswrapper[18592]: I0308 04:03:17.578487 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c68746bf-7562-4da0-85e5-dce1ad9786b3-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"c68746bf-7562-4da0-85e5-dce1ad9786b3\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 04:03:17.578848 master-0 kubenswrapper[18592]: I0308 04:03:17.578796 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c68746bf-7562-4da0-85e5-dce1ad9786b3-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"c68746bf-7562-4da0-85e5-dce1ad9786b3\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 04:03:17.581206 master-0 kubenswrapper[18592]: I0308 04:03:17.581162 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c68746bf-7562-4da0-85e5-dce1ad9786b3-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"c68746bf-7562-4da0-85e5-dce1ad9786b3\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 04:03:17.581373 master-0 kubenswrapper[18592]: I0308 04:03:17.581328 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/c68746bf-7562-4da0-85e5-dce1ad9786b3-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"c68746bf-7562-4da0-85e5-dce1ad9786b3\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 04:03:17.581659 master-0 kubenswrapper[18592]: I0308 04:03:17.581624 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c68746bf-7562-4da0-85e5-dce1ad9786b3-web-config\") pod \"alertmanager-main-0\" (UID: \"c68746bf-7562-4da0-85e5-dce1ad9786b3\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 04:03:17.581790 master-0 kubenswrapper[18592]: I0308 04:03:17.581755 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c68746bf-7562-4da0-85e5-dce1ad9786b3-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"c68746bf-7562-4da0-85e5-dce1ad9786b3\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 04:03:17.582161 master-0 kubenswrapper[18592]: I0308 04:03:17.582099 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/c68746bf-7562-4da0-85e5-dce1ad9786b3-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"c68746bf-7562-4da0-85e5-dce1ad9786b3\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 04:03:17.582244 master-0 kubenswrapper[18592]: I0308 04:03:17.582212 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c68746bf-7562-4da0-85e5-dce1ad9786b3-config-out\") pod \"alertmanager-main-0\" (UID: \"c68746bf-7562-4da0-85e5-dce1ad9786b3\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 04:03:17.582703 master-0 kubenswrapper[18592]: I0308 04:03:17.582671 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/c68746bf-7562-4da0-85e5-dce1ad9786b3-config-volume\") pod \"alertmanager-main-0\" (UID: \"c68746bf-7562-4da0-85e5-dce1ad9786b3\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 04:03:17.586077 master-0 kubenswrapper[18592]: I0308 04:03:17.585931 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c68746bf-7562-4da0-85e5-dce1ad9786b3-tls-assets\") pod \"alertmanager-main-0\" (UID: \"c68746bf-7562-4da0-85e5-dce1ad9786b3\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 04:03:17.593521 master-0 kubenswrapper[18592]: I0308 04:03:17.593475 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqhnk\" (UniqueName: \"kubernetes.io/projected/c68746bf-7562-4da0-85e5-dce1ad9786b3-kube-api-access-rqhnk\") pod \"alertmanager-main-0\" (UID: \"c68746bf-7562-4da0-85e5-dce1ad9786b3\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 04:03:17.772093 master-0 kubenswrapper[18592]: I0308 04:03:17.772002 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Mar 08 04:03:17.855368 master-0 kubenswrapper[18592]: I0308 04:03:17.855147 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-6-master-0"]
Mar 08 04:03:17.856941 master-0 kubenswrapper[18592]: I0308 04:03:17.856141 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-6-master-0"
Mar 08 04:03:17.862045 master-0 kubenswrapper[18592]: I0308 04:03:17.860217 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler"/"installer-sa-dockercfg-stjc6"
Mar 08 04:03:17.862045 master-0 kubenswrapper[18592]: I0308 04:03:17.862011 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler"/"kube-root-ca.crt"
Mar 08 04:03:17.878974 master-0 kubenswrapper[18592]: I0308 04:03:17.878902 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-6-master-0"]
Mar 08 04:03:17.879432 master-0 kubenswrapper[18592]: I0308 04:03:17.879388 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/52aa7ab6-5de2-4af0-a4a5-b5d0f524dadb-kubelet-dir\") pod \"installer-6-master-0\" (UID: \"52aa7ab6-5de2-4af0-a4a5-b5d0f524dadb\") " pod="openshift-kube-scheduler/installer-6-master-0"
Mar 08 04:03:17.879552 master-0 kubenswrapper[18592]: I0308 04:03:17.879447 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/52aa7ab6-5de2-4af0-a4a5-b5d0f524dadb-var-lock\") pod \"installer-6-master-0\" (UID: \"52aa7ab6-5de2-4af0-a4a5-b5d0f524dadb\") " pod="openshift-kube-scheduler/installer-6-master-0"
Mar 08 04:03:17.879552 master-0 kubenswrapper[18592]: I0308 04:03:17.879474 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/52aa7ab6-5de2-4af0-a4a5-b5d0f524dadb-kube-api-access\") pod \"installer-6-master-0\" (UID: \"52aa7ab6-5de2-4af0-a4a5-b5d0f524dadb\") " pod="openshift-kube-scheduler/installer-6-master-0"
Mar 08 04:03:17.982020 master-0 kubenswrapper[18592]: I0308 04:03:17.981269 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/52aa7ab6-5de2-4af0-a4a5-b5d0f524dadb-var-lock\") pod \"installer-6-master-0\" (UID: \"52aa7ab6-5de2-4af0-a4a5-b5d0f524dadb\") " pod="openshift-kube-scheduler/installer-6-master-0"
Mar 08 04:03:17.982747 master-0 kubenswrapper[18592]: I0308 04:03:17.982114 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/52aa7ab6-5de2-4af0-a4a5-b5d0f524dadb-kube-api-access\") pod \"installer-6-master-0\" (UID: \"52aa7ab6-5de2-4af0-a4a5-b5d0f524dadb\") " pod="openshift-kube-scheduler/installer-6-master-0"
Mar 08 04:03:17.983012 master-0 kubenswrapper[18592]: I0308 04:03:17.982957 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/52aa7ab6-5de2-4af0-a4a5-b5d0f524dadb-var-lock\") pod \"installer-6-master-0\" (UID: \"52aa7ab6-5de2-4af0-a4a5-b5d0f524dadb\") " pod="openshift-kube-scheduler/installer-6-master-0"
Mar 08 04:03:17.983257 master-0 kubenswrapper[18592]: I0308 04:03:17.983199 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/52aa7ab6-5de2-4af0-a4a5-b5d0f524dadb-kubelet-dir\") pod \"installer-6-master-0\" (UID: \"52aa7ab6-5de2-4af0-a4a5-b5d0f524dadb\") " pod="openshift-kube-scheduler/installer-6-master-0"
Mar 08 04:03:17.983469 master-0 kubenswrapper[18592]: I0308 04:03:17.983423 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/52aa7ab6-5de2-4af0-a4a5-b5d0f524dadb-kubelet-dir\") pod \"installer-6-master-0\" (UID: \"52aa7ab6-5de2-4af0-a4a5-b5d0f524dadb\") " pod="openshift-kube-scheduler/installer-6-master-0"
Mar 08 04:03:17.998118 master-0 kubenswrapper[18592]: I0308 04:03:17.998065 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/52aa7ab6-5de2-4af0-a4a5-b5d0f524dadb-kube-api-access\") pod \"installer-6-master-0\" (UID: \"52aa7ab6-5de2-4af0-a4a5-b5d0f524dadb\") " pod="openshift-kube-scheduler/installer-6-master-0"
Mar 08 04:03:18.162656 master-0 kubenswrapper[18592]: I0308 04:03:18.162481 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cacb9582-2132-4543-8a31-7b100ba4dd2f" path="/var/lib/kubelet/pods/cacb9582-2132-4543-8a31-7b100ba4dd2f/volumes"
Mar 08 04:03:18.205566 master-0 kubenswrapper[18592]: I0308 04:03:18.205443 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-6-master-0"
Mar 08 04:03:18.275148 master-0 kubenswrapper[18592]: I0308 04:03:18.274417 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Mar 08 04:03:18.276701 master-0 kubenswrapper[18592]: W0308 04:03:18.276610 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc68746bf_7562_4da0_85e5_dce1ad9786b3.slice/crio-62785c9bf6658eb175dd0740e22d3840982e4c03ff69a61fdf630f9509dfadb7 WatchSource:0}: Error finding container 62785c9bf6658eb175dd0740e22d3840982e4c03ff69a61fdf630f9509dfadb7: Status 404 returned error can't find the container with id 62785c9bf6658eb175dd0740e22d3840982e4c03ff69a61fdf630f9509dfadb7
Mar 08 04:03:18.335249 master-0 kubenswrapper[18592]: I0308 04:03:18.335200 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c68746bf-7562-4da0-85e5-dce1ad9786b3","Type":"ContainerStarted","Data":"62785c9bf6658eb175dd0740e22d3840982e4c03ff69a61fdf630f9509dfadb7"}
Mar 08 04:03:18.688089 master-0 kubenswrapper[18592]: I0308 04:03:18.687991 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-6-master-0"]
Mar 08 04:03:19.350518 master-0 kubenswrapper[18592]: I0308 04:03:19.350059 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-6-master-0" event={"ID":"52aa7ab6-5de2-4af0-a4a5-b5d0f524dadb","Type":"ContainerStarted","Data":"fb353c6e6f508946d6e48864215c0bcbb1578b78f6c37fc4f5747b87ef99a44e"}
Mar 08 04:03:19.350518 master-0 kubenswrapper[18592]: I0308 04:03:19.350108 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-6-master-0" event={"ID":"52aa7ab6-5de2-4af0-a4a5-b5d0f524dadb","Type":"ContainerStarted","Data":"75626aea3d66d0ae47b2146334586f11fc2d98cf49a021c7d5078b6701b55f22"}
Mar 08 04:03:19.353381 master-0 kubenswrapper[18592]: I0308 04:03:19.353303 18592 generic.go:334] "Generic (PLEG): container finished" podID="c68746bf-7562-4da0-85e5-dce1ad9786b3" containerID="1f9ae83194d6ad753814fd9c420b12f270cb2665f33858523a970437c9b00429" exitCode=0
Mar 08 04:03:19.353381 master-0 kubenswrapper[18592]: I0308 04:03:19.353337 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c68746bf-7562-4da0-85e5-dce1ad9786b3","Type":"ContainerDied","Data":"1f9ae83194d6ad753814fd9c420b12f270cb2665f33858523a970437c9b00429"}
Mar 08 04:03:19.373450 master-0 kubenswrapper[18592]: I0308 04:03:19.373351 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-6-master-0" podStartSLOduration=2.373313763 podStartE2EDuration="2.373313763s" podCreationTimestamp="2026-03-08 04:03:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 04:03:19.368569116 +0000 UTC m=+611.467323466" watchObservedRunningTime="2026-03-08 04:03:19.373313763 +0000 UTC m=+611.472068193"
Mar 08 04:03:20.365004 master-0 kubenswrapper[18592]: I0308 04:03:20.364928 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c68746bf-7562-4da0-85e5-dce1ad9786b3","Type":"ContainerStarted","Data":"b5a3c378bc3e67955ad31f545ca393ba92fc578f0359e5d94fc13e5f236c624c"}
Mar 08 04:03:20.365004 master-0 kubenswrapper[18592]: I0308 04:03:20.365008 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c68746bf-7562-4da0-85e5-dce1ad9786b3","Type":"ContainerStarted","Data":"030fbfacfef6f10b7b1765fa9722c3393e49f8f80948b8916aa93038e8de168f"}
Mar 08 04:03:20.365988 master-0 kubenswrapper[18592]: I0308 04:03:20.365027 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c68746bf-7562-4da0-85e5-dce1ad9786b3","Type":"ContainerStarted","Data":"82aac401dcb7273e06ab0161aeecc4267b820f71d1378d06e339887594352706"}
Mar 08 04:03:20.365988 master-0 kubenswrapper[18592]: I0308 04:03:20.365043 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c68746bf-7562-4da0-85e5-dce1ad9786b3","Type":"ContainerStarted","Data":"46a4099730c5663e51758907d45625643992beb2865715c3898b1ff518d192a2"}
Mar 08 04:03:20.365988 master-0 kubenswrapper[18592]: I0308 04:03:20.365058 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c68746bf-7562-4da0-85e5-dce1ad9786b3","Type":"ContainerStarted","Data":"96753ff97bc9bddabac91897231447cfd8c078aab74166aaa5a69ef06f0b879d"}
Mar 08 04:03:20.365988 master-0 kubenswrapper[18592]: I0308 04:03:20.365076 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c68746bf-7562-4da0-85e5-dce1ad9786b3","Type":"ContainerStarted","Data":"3d364d524e9041da67cd68c02bad254df8fa7fa00bb66149e961fc5d1fb0b272"}
Mar 08 04:03:20.417279 master-0 kubenswrapper[18592]: I0308 04:03:20.417178 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=3.417147555 podStartE2EDuration="3.417147555s" podCreationTimestamp="2026-03-08 04:03:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 04:03:20.4131924 +0000 UTC m=+612.511946780" watchObservedRunningTime="2026-03-08 04:03:20.417147555 +0000 UTC m=+612.515901915"
Mar 08 04:03:20.675694 master-0 kubenswrapper[18592]: I0308 04:03:20.675603 18592 patch_prober.go:28] interesting pod/console-744db48f96-lgsd4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" start-of-body=
Mar 08 04:03:20.676070 master-0 kubenswrapper[18592]: I0308 04:03:20.676043 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-744db48f96-lgsd4" podUID="935ab7fb-b097-41c3-8926-8343eb29e7fc" containerName="console" probeResult="failure" output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused"
Mar 08 04:03:20.886383 master-0 kubenswrapper[18592]: I0308 04:03:20.886321 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Mar 08 04:03:20.886789 master-0 kubenswrapper[18592]: I0308 04:03:20.886698 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="a26c661f-f843-45c5-85f0-2c2f72cbf580" containerName="prometheus" containerID="cri-o://9af3f273ab2e4954fd8c62788ec7531287e1944e6d561bcb5c4debba6106c3d2" gracePeriod=600
Mar 08 04:03:20.886959 master-0 kubenswrapper[18592]: I0308 04:03:20.886850 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="a26c661f-f843-45c5-85f0-2c2f72cbf580" containerName="config-reloader" containerID="cri-o://d994098b937be725aa395b7493095859b38e1ecb7ea832e2ee45597ef7cb9baf" gracePeriod=600
Mar 08 04:03:20.886959 master-0 kubenswrapper[18592]: I0308 04:03:20.886845 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="a26c661f-f843-45c5-85f0-2c2f72cbf580" containerName="thanos-sidecar" containerID="cri-o://0b1556d958a969007bd0a7dc73cb738262531fc4d85517dab18227f342d41488" gracePeriod=600
Mar 08 04:03:20.887108 master-0 kubenswrapper[18592]: I0308 04:03:20.886899 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="a26c661f-f843-45c5-85f0-2c2f72cbf580" containerName="kube-rbac-proxy-web" containerID="cri-o://45ff740367d4cd8aed88382d0c17b62ac56a016c1d16c630e57980524661978f" gracePeriod=600
Mar 08 04:03:20.887108 master-0 kubenswrapper[18592]: I0308 04:03:20.887015 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="a26c661f-f843-45c5-85f0-2c2f72cbf580" containerName="kube-rbac-proxy-thanos" containerID="cri-o://72eeeb9e8301aec92bcbf0c03c9b41cbd449ca5585e0010e8907c03068bd39b1" gracePeriod=600
Mar 08 04:03:20.888682 master-0 kubenswrapper[18592]: I0308 04:03:20.888502 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="a26c661f-f843-45c5-85f0-2c2f72cbf580" containerName="kube-rbac-proxy" containerID="cri-o://4e242806264dd4548803e4e95071aa00ff9cec6da04585217fd9296886017b56" gracePeriod=600
Mar 08 04:03:20.976349 master-0 kubenswrapper[18592]: E0308 04:03:20.976297 18592 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9af3f273ab2e4954fd8c62788ec7531287e1944e6d561bcb5c4debba6106c3d2 is running failed: container process not found" containerID="9af3f273ab2e4954fd8c62788ec7531287e1944e6d561bcb5c4debba6106c3d2" cmd=["sh","-c","if [ -x \"$(command -v curl)\" ]; then exec curl --fail http://localhost:9090/-/ready; elif [ -x \"$(command -v wget)\" ]; then exec wget -q -O /dev/null http://localhost:9090/-/ready; else exit 1; fi"]
Mar 08 04:03:20.976931 master-0 kubenswrapper[18592]: E0308 04:03:20.976781 18592 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9af3f273ab2e4954fd8c62788ec7531287e1944e6d561bcb5c4debba6106c3d2 is running failed: container process not found" containerID="9af3f273ab2e4954fd8c62788ec7531287e1944e6d561bcb5c4debba6106c3d2" cmd=["sh","-c","if [ -x \"$(command -v curl)\" ]; then exec curl --fail http://localhost:9090/-/ready; elif [ -x \"$(command -v wget)\" ]; then exec wget -q -O /dev/null http://localhost:9090/-/ready; else exit 1; fi"]
Mar 08 04:03:20.977845 master-0 kubenswrapper[18592]: E0308 04:03:20.977417 18592 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9af3f273ab2e4954fd8c62788ec7531287e1944e6d561bcb5c4debba6106c3d2 is running failed: container process not found" containerID="9af3f273ab2e4954fd8c62788ec7531287e1944e6d561bcb5c4debba6106c3d2" cmd=["sh","-c","if [ -x \"$(command -v curl)\" ]; then exec curl --fail http://localhost:9090/-/ready; elif [ -x \"$(command -v wget)\" ]; then exec wget -q -O /dev/null http://localhost:9090/-/ready; else exit 1; fi"]
Mar 08 04:03:20.977845 master-0 kubenswrapper[18592]: E0308 04:03:20.977450 18592 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9af3f273ab2e4954fd8c62788ec7531287e1944e6d561bcb5c4debba6106c3d2 is running failed: container process not found" probeType="Readiness" pod="openshift-monitoring/prometheus-k8s-0" podUID="a26c661f-f843-45c5-85f0-2c2f72cbf580" containerName="prometheus"
Mar 08 04:03:21.349848 master-0 kubenswrapper[18592]: I0308 04:03:21.346967 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Mar 08 04:03:21.379973 master-0 kubenswrapper[18592]: I0308 04:03:21.376030 18592 generic.go:334] "Generic (PLEG): container finished" podID="a26c661f-f843-45c5-85f0-2c2f72cbf580" containerID="72eeeb9e8301aec92bcbf0c03c9b41cbd449ca5585e0010e8907c03068bd39b1" exitCode=0
Mar 08 04:03:21.379973 master-0 kubenswrapper[18592]: I0308 04:03:21.376088 18592 generic.go:334] "Generic (PLEG): container finished" podID="a26c661f-f843-45c5-85f0-2c2f72cbf580" containerID="4e242806264dd4548803e4e95071aa00ff9cec6da04585217fd9296886017b56" exitCode=0
Mar 08 04:03:21.379973 master-0 kubenswrapper[18592]: I0308 04:03:21.376097 18592 generic.go:334] "Generic (PLEG): container finished" podID="a26c661f-f843-45c5-85f0-2c2f72cbf580" containerID="45ff740367d4cd8aed88382d0c17b62ac56a016c1d16c630e57980524661978f" exitCode=0
Mar 08 04:03:21.379973 master-0 kubenswrapper[18592]: I0308 04:03:21.376104 18592 generic.go:334] "Generic (PLEG): container finished" podID="a26c661f-f843-45c5-85f0-2c2f72cbf580" containerID="0b1556d958a969007bd0a7dc73cb738262531fc4d85517dab18227f342d41488" exitCode=0
Mar 08 04:03:21.379973 master-0 kubenswrapper[18592]: I0308 04:03:21.376110 18592 generic.go:334] "Generic (PLEG): container finished" podID="a26c661f-f843-45c5-85f0-2c2f72cbf580" containerID="d994098b937be725aa395b7493095859b38e1ecb7ea832e2ee45597ef7cb9baf" exitCode=0
Mar 08 04:03:21.379973 master-0 kubenswrapper[18592]: I0308 04:03:21.376117 18592 generic.go:334] "Generic (PLEG): container finished" podID="a26c661f-f843-45c5-85f0-2c2f72cbf580" containerID="9af3f273ab2e4954fd8c62788ec7531287e1944e6d561bcb5c4debba6106c3d2" exitCode=0
Mar 08 04:03:21.379973 master-0 kubenswrapper[18592]: I0308
04:03:21.377376 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 08 04:03:21.379973 master-0 kubenswrapper[18592]: I0308 04:03:21.377642 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a26c661f-f843-45c5-85f0-2c2f72cbf580","Type":"ContainerDied","Data":"72eeeb9e8301aec92bcbf0c03c9b41cbd449ca5585e0010e8907c03068bd39b1"} Mar 08 04:03:21.379973 master-0 kubenswrapper[18592]: I0308 04:03:21.377668 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a26c661f-f843-45c5-85f0-2c2f72cbf580","Type":"ContainerDied","Data":"4e242806264dd4548803e4e95071aa00ff9cec6da04585217fd9296886017b56"} Mar 08 04:03:21.379973 master-0 kubenswrapper[18592]: I0308 04:03:21.377679 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a26c661f-f843-45c5-85f0-2c2f72cbf580","Type":"ContainerDied","Data":"45ff740367d4cd8aed88382d0c17b62ac56a016c1d16c630e57980524661978f"} Mar 08 04:03:21.379973 master-0 kubenswrapper[18592]: I0308 04:03:21.377690 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a26c661f-f843-45c5-85f0-2c2f72cbf580","Type":"ContainerDied","Data":"0b1556d958a969007bd0a7dc73cb738262531fc4d85517dab18227f342d41488"} Mar 08 04:03:21.379973 master-0 kubenswrapper[18592]: I0308 04:03:21.377719 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a26c661f-f843-45c5-85f0-2c2f72cbf580","Type":"ContainerDied","Data":"d994098b937be725aa395b7493095859b38e1ecb7ea832e2ee45597ef7cb9baf"} Mar 08 04:03:21.379973 master-0 kubenswrapper[18592]: I0308 04:03:21.377730 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"a26c661f-f843-45c5-85f0-2c2f72cbf580","Type":"ContainerDied","Data":"9af3f273ab2e4954fd8c62788ec7531287e1944e6d561bcb5c4debba6106c3d2"} Mar 08 04:03:21.379973 master-0 kubenswrapper[18592]: I0308 04:03:21.377738 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a26c661f-f843-45c5-85f0-2c2f72cbf580","Type":"ContainerDied","Data":"e71332738e2ae39fa0b5c7bf8bf8ed3118528a8aa66a7bb7d76359721eba9710"} Mar 08 04:03:21.379973 master-0 kubenswrapper[18592]: I0308 04:03:21.377752 18592 scope.go:117] "RemoveContainer" containerID="72eeeb9e8301aec92bcbf0c03c9b41cbd449ca5585e0010e8907c03068bd39b1" Mar 08 04:03:21.394745 master-0 kubenswrapper[18592]: I0308 04:03:21.393532 18592 scope.go:117] "RemoveContainer" containerID="4e242806264dd4548803e4e95071aa00ff9cec6da04585217fd9296886017b56" Mar 08 04:03:21.432717 master-0 kubenswrapper[18592]: I0308 04:03:21.432664 18592 scope.go:117] "RemoveContainer" containerID="45ff740367d4cd8aed88382d0c17b62ac56a016c1d16c630e57980524661978f" Mar 08 04:03:21.457987 master-0 kubenswrapper[18592]: I0308 04:03:21.457873 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a26c661f-f843-45c5-85f0-2c2f72cbf580-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"a26c661f-f843-45c5-85f0-2c2f72cbf580\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " Mar 08 04:03:21.458187 master-0 kubenswrapper[18592]: I0308 04:03:21.458004 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a26c661f-f843-45c5-85f0-2c2f72cbf580-web-config\") pod \"a26c661f-f843-45c5-85f0-2c2f72cbf580\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " Mar 08 04:03:21.458187 master-0 kubenswrapper[18592]: I0308 04:03:21.458081 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a26c661f-f843-45c5-85f0-2c2f72cbf580-secret-grpc-tls\") pod \"a26c661f-f843-45c5-85f0-2c2f72cbf580\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " Mar 08 04:03:21.458284 master-0 kubenswrapper[18592]: I0308 04:03:21.458185 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a26c661f-f843-45c5-85f0-2c2f72cbf580-tls-assets\") pod \"a26c661f-f843-45c5-85f0-2c2f72cbf580\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " Mar 08 04:03:21.458284 master-0 kubenswrapper[18592]: I0308 04:03:21.458214 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a26c661f-f843-45c5-85f0-2c2f72cbf580-configmap-kubelet-serving-ca-bundle\") pod \"a26c661f-f843-45c5-85f0-2c2f72cbf580\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " Mar 08 04:03:21.458284 master-0 kubenswrapper[18592]: I0308 04:03:21.458243 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a26c661f-f843-45c5-85f0-2c2f72cbf580-secret-kube-rbac-proxy\") pod \"a26c661f-f843-45c5-85f0-2c2f72cbf580\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " Mar 08 04:03:21.458405 master-0 kubenswrapper[18592]: I0308 04:03:21.458334 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/a26c661f-f843-45c5-85f0-2c2f72cbf580-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"a26c661f-f843-45c5-85f0-2c2f72cbf580\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " Mar 08 04:03:21.458405 master-0 kubenswrapper[18592]: I0308 04:03:21.458384 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/a26c661f-f843-45c5-85f0-2c2f72cbf580-configmap-serving-certs-ca-bundle\") pod \"a26c661f-f843-45c5-85f0-2c2f72cbf580\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " Mar 08 04:03:21.458489 master-0 kubenswrapper[18592]: I0308 04:03:21.458407 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a26c661f-f843-45c5-85f0-2c2f72cbf580-configmap-metrics-client-ca\") pod \"a26c661f-f843-45c5-85f0-2c2f72cbf580\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " Mar 08 04:03:21.458489 master-0 kubenswrapper[18592]: I0308 04:03:21.458433 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sk8dj\" (UniqueName: \"kubernetes.io/projected/a26c661f-f843-45c5-85f0-2c2f72cbf580-kube-api-access-sk8dj\") pod \"a26c661f-f843-45c5-85f0-2c2f72cbf580\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " Mar 08 04:03:21.458489 master-0 kubenswrapper[18592]: I0308 04:03:21.458461 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a26c661f-f843-45c5-85f0-2c2f72cbf580-thanos-prometheus-http-client-file\") pod \"a26c661f-f843-45c5-85f0-2c2f72cbf580\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " Mar 08 04:03:21.458701 master-0 kubenswrapper[18592]: I0308 04:03:21.458511 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a26c661f-f843-45c5-85f0-2c2f72cbf580-config\") pod \"a26c661f-f843-45c5-85f0-2c2f72cbf580\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " Mar 08 04:03:21.458701 master-0 kubenswrapper[18592]: I0308 04:03:21.458564 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: 
\"kubernetes.io/secret/a26c661f-f843-45c5-85f0-2c2f72cbf580-secret-prometheus-k8s-tls\") pod \"a26c661f-f843-45c5-85f0-2c2f72cbf580\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " Mar 08 04:03:21.458701 master-0 kubenswrapper[18592]: I0308 04:03:21.458590 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a26c661f-f843-45c5-85f0-2c2f72cbf580-secret-metrics-client-certs\") pod \"a26c661f-f843-45c5-85f0-2c2f72cbf580\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " Mar 08 04:03:21.458701 master-0 kubenswrapper[18592]: I0308 04:03:21.458617 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/a26c661f-f843-45c5-85f0-2c2f72cbf580-prometheus-k8s-db\") pod \"a26c661f-f843-45c5-85f0-2c2f72cbf580\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " Mar 08 04:03:21.458701 master-0 kubenswrapper[18592]: I0308 04:03:21.458668 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a26c661f-f843-45c5-85f0-2c2f72cbf580-prometheus-trusted-ca-bundle\") pod \"a26c661f-f843-45c5-85f0-2c2f72cbf580\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " Mar 08 04:03:21.458932 master-0 kubenswrapper[18592]: I0308 04:03:21.458710 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a26c661f-f843-45c5-85f0-2c2f72cbf580-config-out\") pod \"a26c661f-f843-45c5-85f0-2c2f72cbf580\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " Mar 08 04:03:21.458932 master-0 kubenswrapper[18592]: I0308 04:03:21.458740 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/a26c661f-f843-45c5-85f0-2c2f72cbf580-prometheus-k8s-rulefiles-0\") pod \"a26c661f-f843-45c5-85f0-2c2f72cbf580\" (UID: \"a26c661f-f843-45c5-85f0-2c2f72cbf580\") " Mar 08 04:03:21.459533 master-0 kubenswrapper[18592]: I0308 04:03:21.459442 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a26c661f-f843-45c5-85f0-2c2f72cbf580-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "a26c661f-f843-45c5-85f0-2c2f72cbf580" (UID: "a26c661f-f843-45c5-85f0-2c2f72cbf580"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:03:21.459533 master-0 kubenswrapper[18592]: I0308 04:03:21.459459 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a26c661f-f843-45c5-85f0-2c2f72cbf580-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "a26c661f-f843-45c5-85f0-2c2f72cbf580" (UID: "a26c661f-f843-45c5-85f0-2c2f72cbf580"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:03:21.460345 master-0 kubenswrapper[18592]: I0308 04:03:21.460313 18592 scope.go:117] "RemoveContainer" containerID="0b1556d958a969007bd0a7dc73cb738262531fc4d85517dab18227f342d41488" Mar 08 04:03:21.460760 master-0 kubenswrapper[18592]: I0308 04:03:21.460737 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a26c661f-f843-45c5-85f0-2c2f72cbf580-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "a26c661f-f843-45c5-85f0-2c2f72cbf580" (UID: "a26c661f-f843-45c5-85f0-2c2f72cbf580"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:03:21.461837 master-0 kubenswrapper[18592]: I0308 04:03:21.461589 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a26c661f-f843-45c5-85f0-2c2f72cbf580-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "a26c661f-f843-45c5-85f0-2c2f72cbf580" (UID: "a26c661f-f843-45c5-85f0-2c2f72cbf580"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:03:21.461972 master-0 kubenswrapper[18592]: I0308 04:03:21.461935 18592 reconciler_common.go:293] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a26c661f-f843-45c5-85f0-2c2f72cbf580-configmap-kubelet-serving-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 04:03:21.462037 master-0 kubenswrapper[18592]: I0308 04:03:21.461971 18592 reconciler_common.go:293] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a26c661f-f843-45c5-85f0-2c2f72cbf580-configmap-serving-certs-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 04:03:21.462037 master-0 kubenswrapper[18592]: I0308 04:03:21.461987 18592 reconciler_common.go:293] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a26c661f-f843-45c5-85f0-2c2f72cbf580-configmap-metrics-client-ca\") on node \"master-0\" DevicePath \"\"" Mar 08 04:03:21.462037 master-0 kubenswrapper[18592]: I0308 04:03:21.462004 18592 reconciler_common.go:293] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a26c661f-f843-45c5-85f0-2c2f72cbf580-prometheus-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 04:03:21.462782 master-0 kubenswrapper[18592]: I0308 04:03:21.462744 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/a26c661f-f843-45c5-85f0-2c2f72cbf580-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "a26c661f-f843-45c5-85f0-2c2f72cbf580" (UID: "a26c661f-f843-45c5-85f0-2c2f72cbf580"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:03:21.462971 master-0 kubenswrapper[18592]: I0308 04:03:21.462934 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a26c661f-f843-45c5-85f0-2c2f72cbf580-kube-api-access-sk8dj" (OuterVolumeSpecName: "kube-api-access-sk8dj") pod "a26c661f-f843-45c5-85f0-2c2f72cbf580" (UID: "a26c661f-f843-45c5-85f0-2c2f72cbf580"). InnerVolumeSpecName "kube-api-access-sk8dj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 04:03:21.463589 master-0 kubenswrapper[18592]: I0308 04:03:21.463506 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a26c661f-f843-45c5-85f0-2c2f72cbf580-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "a26c661f-f843-45c5-85f0-2c2f72cbf580" (UID: "a26c661f-f843-45c5-85f0-2c2f72cbf580"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 04:03:21.463689 master-0 kubenswrapper[18592]: I0308 04:03:21.463648 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a26c661f-f843-45c5-85f0-2c2f72cbf580-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "a26c661f-f843-45c5-85f0-2c2f72cbf580" (UID: "a26c661f-f843-45c5-85f0-2c2f72cbf580"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:03:21.464249 master-0 kubenswrapper[18592]: I0308 04:03:21.464180 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a26c661f-f843-45c5-85f0-2c2f72cbf580-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "a26c661f-f843-45c5-85f0-2c2f72cbf580" (UID: "a26c661f-f843-45c5-85f0-2c2f72cbf580"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 04:03:21.464249 master-0 kubenswrapper[18592]: I0308 04:03:21.464202 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a26c661f-f843-45c5-85f0-2c2f72cbf580-config" (OuterVolumeSpecName: "config") pod "a26c661f-f843-45c5-85f0-2c2f72cbf580" (UID: "a26c661f-f843-45c5-85f0-2c2f72cbf580"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:03:21.464412 master-0 kubenswrapper[18592]: I0308 04:03:21.464388 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a26c661f-f843-45c5-85f0-2c2f72cbf580-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "a26c661f-f843-45c5-85f0-2c2f72cbf580" (UID: "a26c661f-f843-45c5-85f0-2c2f72cbf580"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:03:21.465561 master-0 kubenswrapper[18592]: I0308 04:03:21.465490 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a26c661f-f843-45c5-85f0-2c2f72cbf580-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "a26c661f-f843-45c5-85f0-2c2f72cbf580" (UID: "a26c661f-f843-45c5-85f0-2c2f72cbf580"). InnerVolumeSpecName "secret-metrics-client-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:03:21.465936 master-0 kubenswrapper[18592]: I0308 04:03:21.465886 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a26c661f-f843-45c5-85f0-2c2f72cbf580-config-out" (OuterVolumeSpecName: "config-out") pod "a26c661f-f843-45c5-85f0-2c2f72cbf580" (UID: "a26c661f-f843-45c5-85f0-2c2f72cbf580"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 04:03:21.467114 master-0 kubenswrapper[18592]: I0308 04:03:21.467062 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a26c661f-f843-45c5-85f0-2c2f72cbf580-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "a26c661f-f843-45c5-85f0-2c2f72cbf580" (UID: "a26c661f-f843-45c5-85f0-2c2f72cbf580"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:03:21.468634 master-0 kubenswrapper[18592]: I0308 04:03:21.468263 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a26c661f-f843-45c5-85f0-2c2f72cbf580-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "a26c661f-f843-45c5-85f0-2c2f72cbf580" (UID: "a26c661f-f843-45c5-85f0-2c2f72cbf580"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:03:21.470868 master-0 kubenswrapper[18592]: I0308 04:03:21.470815 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a26c661f-f843-45c5-85f0-2c2f72cbf580-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "a26c661f-f843-45c5-85f0-2c2f72cbf580" (UID: "a26c661f-f843-45c5-85f0-2c2f72cbf580"). InnerVolumeSpecName "thanos-prometheus-http-client-file". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:03:21.480233 master-0 kubenswrapper[18592]: I0308 04:03:21.480177 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a26c661f-f843-45c5-85f0-2c2f72cbf580-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "a26c661f-f843-45c5-85f0-2c2f72cbf580" (UID: "a26c661f-f843-45c5-85f0-2c2f72cbf580"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:03:21.511806 master-0 kubenswrapper[18592]: I0308 04:03:21.511749 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a26c661f-f843-45c5-85f0-2c2f72cbf580-web-config" (OuterVolumeSpecName: "web-config") pod "a26c661f-f843-45c5-85f0-2c2f72cbf580" (UID: "a26c661f-f843-45c5-85f0-2c2f72cbf580"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:03:21.552963 master-0 kubenswrapper[18592]: I0308 04:03:21.552915 18592 scope.go:117] "RemoveContainer" containerID="d994098b937be725aa395b7493095859b38e1ecb7ea832e2ee45597ef7cb9baf" Mar 08 04:03:21.563630 master-0 kubenswrapper[18592]: I0308 04:03:21.563574 18592 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a26c661f-f843-45c5-85f0-2c2f72cbf580-config-out\") on node \"master-0\" DevicePath \"\"" Mar 08 04:03:21.563630 master-0 kubenswrapper[18592]: I0308 04:03:21.563612 18592 reconciler_common.go:293] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a26c661f-f843-45c5-85f0-2c2f72cbf580-prometheus-k8s-rulefiles-0\") on node \"master-0\" DevicePath \"\"" Mar 08 04:03:21.563630 master-0 kubenswrapper[18592]: I0308 04:03:21.563624 18592 reconciler_common.go:293] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/a26c661f-f843-45c5-85f0-2c2f72cbf580-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"master-0\" DevicePath \"\"" Mar 08 04:03:21.563836 master-0 kubenswrapper[18592]: I0308 04:03:21.563654 18592 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a26c661f-f843-45c5-85f0-2c2f72cbf580-web-config\") on node \"master-0\" DevicePath \"\"" Mar 08 04:03:21.563836 master-0 kubenswrapper[18592]: I0308 04:03:21.563664 18592 reconciler_common.go:293] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a26c661f-f843-45c5-85f0-2c2f72cbf580-secret-grpc-tls\") on node \"master-0\" DevicePath \"\"" Mar 08 04:03:21.563836 master-0 kubenswrapper[18592]: I0308 04:03:21.563673 18592 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a26c661f-f843-45c5-85f0-2c2f72cbf580-tls-assets\") on node \"master-0\" DevicePath \"\"" Mar 08 04:03:21.563836 master-0 kubenswrapper[18592]: I0308 04:03:21.563682 18592 reconciler_common.go:293] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/a26c661f-f843-45c5-85f0-2c2f72cbf580-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"master-0\" DevicePath \"\"" Mar 08 04:03:21.563836 master-0 kubenswrapper[18592]: I0308 04:03:21.563690 18592 reconciler_common.go:293] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a26c661f-f843-45c5-85f0-2c2f72cbf580-secret-kube-rbac-proxy\") on node \"master-0\" DevicePath \"\"" Mar 08 04:03:21.563836 master-0 kubenswrapper[18592]: I0308 04:03:21.563699 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sk8dj\" (UniqueName: \"kubernetes.io/projected/a26c661f-f843-45c5-85f0-2c2f72cbf580-kube-api-access-sk8dj\") on node \"master-0\" DevicePath \"\"" Mar 08 04:03:21.563836 master-0 kubenswrapper[18592]: I0308 04:03:21.563708 
18592 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a26c661f-f843-45c5-85f0-2c2f72cbf580-thanos-prometheus-http-client-file\") on node \"master-0\" DevicePath \"\"" Mar 08 04:03:21.563836 master-0 kubenswrapper[18592]: I0308 04:03:21.563738 18592 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/a26c661f-f843-45c5-85f0-2c2f72cbf580-config\") on node \"master-0\" DevicePath \"\"" Mar 08 04:03:21.563836 master-0 kubenswrapper[18592]: I0308 04:03:21.563747 18592 reconciler_common.go:293] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/a26c661f-f843-45c5-85f0-2c2f72cbf580-secret-prometheus-k8s-tls\") on node \"master-0\" DevicePath \"\"" Mar 08 04:03:21.563836 master-0 kubenswrapper[18592]: I0308 04:03:21.563756 18592 reconciler_common.go:293] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a26c661f-f843-45c5-85f0-2c2f72cbf580-secret-metrics-client-certs\") on node \"master-0\" DevicePath \"\"" Mar 08 04:03:21.563836 master-0 kubenswrapper[18592]: I0308 04:03:21.563764 18592 reconciler_common.go:293] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/a26c661f-f843-45c5-85f0-2c2f72cbf580-prometheus-k8s-db\") on node \"master-0\" DevicePath \"\"" Mar 08 04:03:21.571471 master-0 kubenswrapper[18592]: I0308 04:03:21.571437 18592 scope.go:117] "RemoveContainer" containerID="9af3f273ab2e4954fd8c62788ec7531287e1944e6d561bcb5c4debba6106c3d2" Mar 08 04:03:21.587918 master-0 kubenswrapper[18592]: I0308 04:03:21.587872 18592 scope.go:117] "RemoveContainer" containerID="a3114d26662f332759817838953335b2c8cca6aba781706cddff99047422be36" Mar 08 04:03:21.607370 master-0 kubenswrapper[18592]: I0308 04:03:21.607335 18592 scope.go:117] "RemoveContainer" 
containerID="72eeeb9e8301aec92bcbf0c03c9b41cbd449ca5585e0010e8907c03068bd39b1" Mar 08 04:03:21.608230 master-0 kubenswrapper[18592]: E0308 04:03:21.608174 18592 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72eeeb9e8301aec92bcbf0c03c9b41cbd449ca5585e0010e8907c03068bd39b1\": container with ID starting with 72eeeb9e8301aec92bcbf0c03c9b41cbd449ca5585e0010e8907c03068bd39b1 not found: ID does not exist" containerID="72eeeb9e8301aec92bcbf0c03c9b41cbd449ca5585e0010e8907c03068bd39b1" Mar 08 04:03:21.608317 master-0 kubenswrapper[18592]: I0308 04:03:21.608238 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72eeeb9e8301aec92bcbf0c03c9b41cbd449ca5585e0010e8907c03068bd39b1"} err="failed to get container status \"72eeeb9e8301aec92bcbf0c03c9b41cbd449ca5585e0010e8907c03068bd39b1\": rpc error: code = NotFound desc = could not find container \"72eeeb9e8301aec92bcbf0c03c9b41cbd449ca5585e0010e8907c03068bd39b1\": container with ID starting with 72eeeb9e8301aec92bcbf0c03c9b41cbd449ca5585e0010e8907c03068bd39b1 not found: ID does not exist" Mar 08 04:03:21.608317 master-0 kubenswrapper[18592]: I0308 04:03:21.608294 18592 scope.go:117] "RemoveContainer" containerID="4e242806264dd4548803e4e95071aa00ff9cec6da04585217fd9296886017b56" Mar 08 04:03:21.608751 master-0 kubenswrapper[18592]: E0308 04:03:21.608712 18592 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e242806264dd4548803e4e95071aa00ff9cec6da04585217fd9296886017b56\": container with ID starting with 4e242806264dd4548803e4e95071aa00ff9cec6da04585217fd9296886017b56 not found: ID does not exist" containerID="4e242806264dd4548803e4e95071aa00ff9cec6da04585217fd9296886017b56" Mar 08 04:03:21.608888 master-0 kubenswrapper[18592]: I0308 04:03:21.608763 18592 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4e242806264dd4548803e4e95071aa00ff9cec6da04585217fd9296886017b56"} err="failed to get container status \"4e242806264dd4548803e4e95071aa00ff9cec6da04585217fd9296886017b56\": rpc error: code = NotFound desc = could not find container \"4e242806264dd4548803e4e95071aa00ff9cec6da04585217fd9296886017b56\": container with ID starting with 4e242806264dd4548803e4e95071aa00ff9cec6da04585217fd9296886017b56 not found: ID does not exist" Mar 08 04:03:21.608888 master-0 kubenswrapper[18592]: I0308 04:03:21.608805 18592 scope.go:117] "RemoveContainer" containerID="45ff740367d4cd8aed88382d0c17b62ac56a016c1d16c630e57980524661978f" Mar 08 04:03:21.609254 master-0 kubenswrapper[18592]: E0308 04:03:21.609183 18592 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45ff740367d4cd8aed88382d0c17b62ac56a016c1d16c630e57980524661978f\": container with ID starting with 45ff740367d4cd8aed88382d0c17b62ac56a016c1d16c630e57980524661978f not found: ID does not exist" containerID="45ff740367d4cd8aed88382d0c17b62ac56a016c1d16c630e57980524661978f" Mar 08 04:03:21.609254 master-0 kubenswrapper[18592]: I0308 04:03:21.609229 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45ff740367d4cd8aed88382d0c17b62ac56a016c1d16c630e57980524661978f"} err="failed to get container status \"45ff740367d4cd8aed88382d0c17b62ac56a016c1d16c630e57980524661978f\": rpc error: code = NotFound desc = could not find container \"45ff740367d4cd8aed88382d0c17b62ac56a016c1d16c630e57980524661978f\": container with ID starting with 45ff740367d4cd8aed88382d0c17b62ac56a016c1d16c630e57980524661978f not found: ID does not exist" Mar 08 04:03:21.609254 master-0 kubenswrapper[18592]: I0308 04:03:21.609256 18592 scope.go:117] "RemoveContainer" containerID="0b1556d958a969007bd0a7dc73cb738262531fc4d85517dab18227f342d41488" Mar 08 04:03:21.609582 master-0 kubenswrapper[18592]: E0308 
04:03:21.609547 18592 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b1556d958a969007bd0a7dc73cb738262531fc4d85517dab18227f342d41488\": container with ID starting with 0b1556d958a969007bd0a7dc73cb738262531fc4d85517dab18227f342d41488 not found: ID does not exist" containerID="0b1556d958a969007bd0a7dc73cb738262531fc4d85517dab18227f342d41488" Mar 08 04:03:21.609663 master-0 kubenswrapper[18592]: I0308 04:03:21.609590 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b1556d958a969007bd0a7dc73cb738262531fc4d85517dab18227f342d41488"} err="failed to get container status \"0b1556d958a969007bd0a7dc73cb738262531fc4d85517dab18227f342d41488\": rpc error: code = NotFound desc = could not find container \"0b1556d958a969007bd0a7dc73cb738262531fc4d85517dab18227f342d41488\": container with ID starting with 0b1556d958a969007bd0a7dc73cb738262531fc4d85517dab18227f342d41488 not found: ID does not exist" Mar 08 04:03:21.609663 master-0 kubenswrapper[18592]: I0308 04:03:21.609614 18592 scope.go:117] "RemoveContainer" containerID="d994098b937be725aa395b7493095859b38e1ecb7ea832e2ee45597ef7cb9baf" Mar 08 04:03:21.609981 master-0 kubenswrapper[18592]: E0308 04:03:21.609958 18592 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d994098b937be725aa395b7493095859b38e1ecb7ea832e2ee45597ef7cb9baf\": container with ID starting with d994098b937be725aa395b7493095859b38e1ecb7ea832e2ee45597ef7cb9baf not found: ID does not exist" containerID="d994098b937be725aa395b7493095859b38e1ecb7ea832e2ee45597ef7cb9baf" Mar 08 04:03:21.610085 master-0 kubenswrapper[18592]: I0308 04:03:21.609980 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d994098b937be725aa395b7493095859b38e1ecb7ea832e2ee45597ef7cb9baf"} err="failed to get container status 
\"d994098b937be725aa395b7493095859b38e1ecb7ea832e2ee45597ef7cb9baf\": rpc error: code = NotFound desc = could not find container \"d994098b937be725aa395b7493095859b38e1ecb7ea832e2ee45597ef7cb9baf\": container with ID starting with d994098b937be725aa395b7493095859b38e1ecb7ea832e2ee45597ef7cb9baf not found: ID does not exist" Mar 08 04:03:21.610085 master-0 kubenswrapper[18592]: I0308 04:03:21.609996 18592 scope.go:117] "RemoveContainer" containerID="9af3f273ab2e4954fd8c62788ec7531287e1944e6d561bcb5c4debba6106c3d2" Mar 08 04:03:21.610311 master-0 kubenswrapper[18592]: E0308 04:03:21.610238 18592 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9af3f273ab2e4954fd8c62788ec7531287e1944e6d561bcb5c4debba6106c3d2\": container with ID starting with 9af3f273ab2e4954fd8c62788ec7531287e1944e6d561bcb5c4debba6106c3d2 not found: ID does not exist" containerID="9af3f273ab2e4954fd8c62788ec7531287e1944e6d561bcb5c4debba6106c3d2" Mar 08 04:03:21.610311 master-0 kubenswrapper[18592]: I0308 04:03:21.610267 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9af3f273ab2e4954fd8c62788ec7531287e1944e6d561bcb5c4debba6106c3d2"} err="failed to get container status \"9af3f273ab2e4954fd8c62788ec7531287e1944e6d561bcb5c4debba6106c3d2\": rpc error: code = NotFound desc = could not find container \"9af3f273ab2e4954fd8c62788ec7531287e1944e6d561bcb5c4debba6106c3d2\": container with ID starting with 9af3f273ab2e4954fd8c62788ec7531287e1944e6d561bcb5c4debba6106c3d2 not found: ID does not exist" Mar 08 04:03:21.610311 master-0 kubenswrapper[18592]: I0308 04:03:21.610287 18592 scope.go:117] "RemoveContainer" containerID="a3114d26662f332759817838953335b2c8cca6aba781706cddff99047422be36" Mar 08 04:03:21.610586 master-0 kubenswrapper[18592]: E0308 04:03:21.610505 18592 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"a3114d26662f332759817838953335b2c8cca6aba781706cddff99047422be36\": container with ID starting with a3114d26662f332759817838953335b2c8cca6aba781706cddff99047422be36 not found: ID does not exist" containerID="a3114d26662f332759817838953335b2c8cca6aba781706cddff99047422be36" Mar 08 04:03:21.610586 master-0 kubenswrapper[18592]: I0308 04:03:21.610525 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3114d26662f332759817838953335b2c8cca6aba781706cddff99047422be36"} err="failed to get container status \"a3114d26662f332759817838953335b2c8cca6aba781706cddff99047422be36\": rpc error: code = NotFound desc = could not find container \"a3114d26662f332759817838953335b2c8cca6aba781706cddff99047422be36\": container with ID starting with a3114d26662f332759817838953335b2c8cca6aba781706cddff99047422be36 not found: ID does not exist" Mar 08 04:03:21.610586 master-0 kubenswrapper[18592]: I0308 04:03:21.610537 18592 scope.go:117] "RemoveContainer" containerID="72eeeb9e8301aec92bcbf0c03c9b41cbd449ca5585e0010e8907c03068bd39b1" Mar 08 04:03:21.611516 master-0 kubenswrapper[18592]: I0308 04:03:21.610765 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72eeeb9e8301aec92bcbf0c03c9b41cbd449ca5585e0010e8907c03068bd39b1"} err="failed to get container status \"72eeeb9e8301aec92bcbf0c03c9b41cbd449ca5585e0010e8907c03068bd39b1\": rpc error: code = NotFound desc = could not find container \"72eeeb9e8301aec92bcbf0c03c9b41cbd449ca5585e0010e8907c03068bd39b1\": container with ID starting with 72eeeb9e8301aec92bcbf0c03c9b41cbd449ca5585e0010e8907c03068bd39b1 not found: ID does not exist" Mar 08 04:03:21.611516 master-0 kubenswrapper[18592]: I0308 04:03:21.610790 18592 scope.go:117] "RemoveContainer" containerID="4e242806264dd4548803e4e95071aa00ff9cec6da04585217fd9296886017b56" Mar 08 04:03:21.611516 master-0 kubenswrapper[18592]: I0308 04:03:21.611463 18592 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e242806264dd4548803e4e95071aa00ff9cec6da04585217fd9296886017b56"} err="failed to get container status \"4e242806264dd4548803e4e95071aa00ff9cec6da04585217fd9296886017b56\": rpc error: code = NotFound desc = could not find container \"4e242806264dd4548803e4e95071aa00ff9cec6da04585217fd9296886017b56\": container with ID starting with 4e242806264dd4548803e4e95071aa00ff9cec6da04585217fd9296886017b56 not found: ID does not exist" Mar 08 04:03:21.611516 master-0 kubenswrapper[18592]: I0308 04:03:21.611489 18592 scope.go:117] "RemoveContainer" containerID="45ff740367d4cd8aed88382d0c17b62ac56a016c1d16c630e57980524661978f" Mar 08 04:03:21.611964 master-0 kubenswrapper[18592]: I0308 04:03:21.611810 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45ff740367d4cd8aed88382d0c17b62ac56a016c1d16c630e57980524661978f"} err="failed to get container status \"45ff740367d4cd8aed88382d0c17b62ac56a016c1d16c630e57980524661978f\": rpc error: code = NotFound desc = could not find container \"45ff740367d4cd8aed88382d0c17b62ac56a016c1d16c630e57980524661978f\": container with ID starting with 45ff740367d4cd8aed88382d0c17b62ac56a016c1d16c630e57980524661978f not found: ID does not exist" Mar 08 04:03:21.611964 master-0 kubenswrapper[18592]: I0308 04:03:21.611871 18592 scope.go:117] "RemoveContainer" containerID="0b1556d958a969007bd0a7dc73cb738262531fc4d85517dab18227f342d41488" Mar 08 04:03:21.612414 master-0 kubenswrapper[18592]: I0308 04:03:21.612390 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b1556d958a969007bd0a7dc73cb738262531fc4d85517dab18227f342d41488"} err="failed to get container status \"0b1556d958a969007bd0a7dc73cb738262531fc4d85517dab18227f342d41488\": rpc error: code = NotFound desc = could not find container \"0b1556d958a969007bd0a7dc73cb738262531fc4d85517dab18227f342d41488\": container with ID starting 
with 0b1556d958a969007bd0a7dc73cb738262531fc4d85517dab18227f342d41488 not found: ID does not exist" Mar 08 04:03:21.612414 master-0 kubenswrapper[18592]: I0308 04:03:21.612410 18592 scope.go:117] "RemoveContainer" containerID="d994098b937be725aa395b7493095859b38e1ecb7ea832e2ee45597ef7cb9baf" Mar 08 04:03:21.612674 master-0 kubenswrapper[18592]: I0308 04:03:21.612625 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d994098b937be725aa395b7493095859b38e1ecb7ea832e2ee45597ef7cb9baf"} err="failed to get container status \"d994098b937be725aa395b7493095859b38e1ecb7ea832e2ee45597ef7cb9baf\": rpc error: code = NotFound desc = could not find container \"d994098b937be725aa395b7493095859b38e1ecb7ea832e2ee45597ef7cb9baf\": container with ID starting with d994098b937be725aa395b7493095859b38e1ecb7ea832e2ee45597ef7cb9baf not found: ID does not exist" Mar 08 04:03:21.612674 master-0 kubenswrapper[18592]: I0308 04:03:21.612651 18592 scope.go:117] "RemoveContainer" containerID="9af3f273ab2e4954fd8c62788ec7531287e1944e6d561bcb5c4debba6106c3d2" Mar 08 04:03:21.613096 master-0 kubenswrapper[18592]: I0308 04:03:21.613070 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9af3f273ab2e4954fd8c62788ec7531287e1944e6d561bcb5c4debba6106c3d2"} err="failed to get container status \"9af3f273ab2e4954fd8c62788ec7531287e1944e6d561bcb5c4debba6106c3d2\": rpc error: code = NotFound desc = could not find container \"9af3f273ab2e4954fd8c62788ec7531287e1944e6d561bcb5c4debba6106c3d2\": container with ID starting with 9af3f273ab2e4954fd8c62788ec7531287e1944e6d561bcb5c4debba6106c3d2 not found: ID does not exist" Mar 08 04:03:21.613096 master-0 kubenswrapper[18592]: I0308 04:03:21.613092 18592 scope.go:117] "RemoveContainer" containerID="a3114d26662f332759817838953335b2c8cca6aba781706cddff99047422be36" Mar 08 04:03:21.613396 master-0 kubenswrapper[18592]: I0308 04:03:21.613357 18592 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3114d26662f332759817838953335b2c8cca6aba781706cddff99047422be36"} err="failed to get container status \"a3114d26662f332759817838953335b2c8cca6aba781706cddff99047422be36\": rpc error: code = NotFound desc = could not find container \"a3114d26662f332759817838953335b2c8cca6aba781706cddff99047422be36\": container with ID starting with a3114d26662f332759817838953335b2c8cca6aba781706cddff99047422be36 not found: ID does not exist" Mar 08 04:03:21.613396 master-0 kubenswrapper[18592]: I0308 04:03:21.613379 18592 scope.go:117] "RemoveContainer" containerID="72eeeb9e8301aec92bcbf0c03c9b41cbd449ca5585e0010e8907c03068bd39b1" Mar 08 04:03:21.613630 master-0 kubenswrapper[18592]: I0308 04:03:21.613604 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72eeeb9e8301aec92bcbf0c03c9b41cbd449ca5585e0010e8907c03068bd39b1"} err="failed to get container status \"72eeeb9e8301aec92bcbf0c03c9b41cbd449ca5585e0010e8907c03068bd39b1\": rpc error: code = NotFound desc = could not find container \"72eeeb9e8301aec92bcbf0c03c9b41cbd449ca5585e0010e8907c03068bd39b1\": container with ID starting with 72eeeb9e8301aec92bcbf0c03c9b41cbd449ca5585e0010e8907c03068bd39b1 not found: ID does not exist" Mar 08 04:03:21.613630 master-0 kubenswrapper[18592]: I0308 04:03:21.613626 18592 scope.go:117] "RemoveContainer" containerID="4e242806264dd4548803e4e95071aa00ff9cec6da04585217fd9296886017b56" Mar 08 04:03:21.613926 master-0 kubenswrapper[18592]: I0308 04:03:21.613866 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e242806264dd4548803e4e95071aa00ff9cec6da04585217fd9296886017b56"} err="failed to get container status \"4e242806264dd4548803e4e95071aa00ff9cec6da04585217fd9296886017b56\": rpc error: code = NotFound desc = could not find container \"4e242806264dd4548803e4e95071aa00ff9cec6da04585217fd9296886017b56\": 
container with ID starting with 4e242806264dd4548803e4e95071aa00ff9cec6da04585217fd9296886017b56 not found: ID does not exist" Mar 08 04:03:21.613926 master-0 kubenswrapper[18592]: I0308 04:03:21.613886 18592 scope.go:117] "RemoveContainer" containerID="45ff740367d4cd8aed88382d0c17b62ac56a016c1d16c630e57980524661978f" Mar 08 04:03:21.614234 master-0 kubenswrapper[18592]: I0308 04:03:21.614161 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45ff740367d4cd8aed88382d0c17b62ac56a016c1d16c630e57980524661978f"} err="failed to get container status \"45ff740367d4cd8aed88382d0c17b62ac56a016c1d16c630e57980524661978f\": rpc error: code = NotFound desc = could not find container \"45ff740367d4cd8aed88382d0c17b62ac56a016c1d16c630e57980524661978f\": container with ID starting with 45ff740367d4cd8aed88382d0c17b62ac56a016c1d16c630e57980524661978f not found: ID does not exist" Mar 08 04:03:21.614234 master-0 kubenswrapper[18592]: I0308 04:03:21.614184 18592 scope.go:117] "RemoveContainer" containerID="0b1556d958a969007bd0a7dc73cb738262531fc4d85517dab18227f342d41488" Mar 08 04:03:21.614518 master-0 kubenswrapper[18592]: I0308 04:03:21.614470 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b1556d958a969007bd0a7dc73cb738262531fc4d85517dab18227f342d41488"} err="failed to get container status \"0b1556d958a969007bd0a7dc73cb738262531fc4d85517dab18227f342d41488\": rpc error: code = NotFound desc = could not find container \"0b1556d958a969007bd0a7dc73cb738262531fc4d85517dab18227f342d41488\": container with ID starting with 0b1556d958a969007bd0a7dc73cb738262531fc4d85517dab18227f342d41488 not found: ID does not exist" Mar 08 04:03:21.614518 master-0 kubenswrapper[18592]: I0308 04:03:21.614514 18592 scope.go:117] "RemoveContainer" containerID="d994098b937be725aa395b7493095859b38e1ecb7ea832e2ee45597ef7cb9baf" Mar 08 04:03:21.614986 master-0 kubenswrapper[18592]: I0308 04:03:21.614947 
18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d994098b937be725aa395b7493095859b38e1ecb7ea832e2ee45597ef7cb9baf"} err="failed to get container status \"d994098b937be725aa395b7493095859b38e1ecb7ea832e2ee45597ef7cb9baf\": rpc error: code = NotFound desc = could not find container \"d994098b937be725aa395b7493095859b38e1ecb7ea832e2ee45597ef7cb9baf\": container with ID starting with d994098b937be725aa395b7493095859b38e1ecb7ea832e2ee45597ef7cb9baf not found: ID does not exist" Mar 08 04:03:21.614986 master-0 kubenswrapper[18592]: I0308 04:03:21.614974 18592 scope.go:117] "RemoveContainer" containerID="9af3f273ab2e4954fd8c62788ec7531287e1944e6d561bcb5c4debba6106c3d2" Mar 08 04:03:21.615236 master-0 kubenswrapper[18592]: I0308 04:03:21.615196 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9af3f273ab2e4954fd8c62788ec7531287e1944e6d561bcb5c4debba6106c3d2"} err="failed to get container status \"9af3f273ab2e4954fd8c62788ec7531287e1944e6d561bcb5c4debba6106c3d2\": rpc error: code = NotFound desc = could not find container \"9af3f273ab2e4954fd8c62788ec7531287e1944e6d561bcb5c4debba6106c3d2\": container with ID starting with 9af3f273ab2e4954fd8c62788ec7531287e1944e6d561bcb5c4debba6106c3d2 not found: ID does not exist" Mar 08 04:03:21.615236 master-0 kubenswrapper[18592]: I0308 04:03:21.615225 18592 scope.go:117] "RemoveContainer" containerID="a3114d26662f332759817838953335b2c8cca6aba781706cddff99047422be36" Mar 08 04:03:21.615667 master-0 kubenswrapper[18592]: I0308 04:03:21.615634 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3114d26662f332759817838953335b2c8cca6aba781706cddff99047422be36"} err="failed to get container status \"a3114d26662f332759817838953335b2c8cca6aba781706cddff99047422be36\": rpc error: code = NotFound desc = could not find container 
\"a3114d26662f332759817838953335b2c8cca6aba781706cddff99047422be36\": container with ID starting with a3114d26662f332759817838953335b2c8cca6aba781706cddff99047422be36 not found: ID does not exist" Mar 08 04:03:21.615667 master-0 kubenswrapper[18592]: I0308 04:03:21.615655 18592 scope.go:117] "RemoveContainer" containerID="72eeeb9e8301aec92bcbf0c03c9b41cbd449ca5585e0010e8907c03068bd39b1" Mar 08 04:03:21.615962 master-0 kubenswrapper[18592]: I0308 04:03:21.615923 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72eeeb9e8301aec92bcbf0c03c9b41cbd449ca5585e0010e8907c03068bd39b1"} err="failed to get container status \"72eeeb9e8301aec92bcbf0c03c9b41cbd449ca5585e0010e8907c03068bd39b1\": rpc error: code = NotFound desc = could not find container \"72eeeb9e8301aec92bcbf0c03c9b41cbd449ca5585e0010e8907c03068bd39b1\": container with ID starting with 72eeeb9e8301aec92bcbf0c03c9b41cbd449ca5585e0010e8907c03068bd39b1 not found: ID does not exist" Mar 08 04:03:21.615962 master-0 kubenswrapper[18592]: I0308 04:03:21.615950 18592 scope.go:117] "RemoveContainer" containerID="4e242806264dd4548803e4e95071aa00ff9cec6da04585217fd9296886017b56" Mar 08 04:03:21.616334 master-0 kubenswrapper[18592]: I0308 04:03:21.616295 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e242806264dd4548803e4e95071aa00ff9cec6da04585217fd9296886017b56"} err="failed to get container status \"4e242806264dd4548803e4e95071aa00ff9cec6da04585217fd9296886017b56\": rpc error: code = NotFound desc = could not find container \"4e242806264dd4548803e4e95071aa00ff9cec6da04585217fd9296886017b56\": container with ID starting with 4e242806264dd4548803e4e95071aa00ff9cec6da04585217fd9296886017b56 not found: ID does not exist" Mar 08 04:03:21.616334 master-0 kubenswrapper[18592]: I0308 04:03:21.616321 18592 scope.go:117] "RemoveContainer" containerID="45ff740367d4cd8aed88382d0c17b62ac56a016c1d16c630e57980524661978f" Mar 08 
04:03:21.616634 master-0 kubenswrapper[18592]: I0308 04:03:21.616596 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45ff740367d4cd8aed88382d0c17b62ac56a016c1d16c630e57980524661978f"} err="failed to get container status \"45ff740367d4cd8aed88382d0c17b62ac56a016c1d16c630e57980524661978f\": rpc error: code = NotFound desc = could not find container \"45ff740367d4cd8aed88382d0c17b62ac56a016c1d16c630e57980524661978f\": container with ID starting with 45ff740367d4cd8aed88382d0c17b62ac56a016c1d16c630e57980524661978f not found: ID does not exist" Mar 08 04:03:21.616634 master-0 kubenswrapper[18592]: I0308 04:03:21.616624 18592 scope.go:117] "RemoveContainer" containerID="0b1556d958a969007bd0a7dc73cb738262531fc4d85517dab18227f342d41488" Mar 08 04:03:21.616922 master-0 kubenswrapper[18592]: I0308 04:03:21.616885 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b1556d958a969007bd0a7dc73cb738262531fc4d85517dab18227f342d41488"} err="failed to get container status \"0b1556d958a969007bd0a7dc73cb738262531fc4d85517dab18227f342d41488\": rpc error: code = NotFound desc = could not find container \"0b1556d958a969007bd0a7dc73cb738262531fc4d85517dab18227f342d41488\": container with ID starting with 0b1556d958a969007bd0a7dc73cb738262531fc4d85517dab18227f342d41488 not found: ID does not exist" Mar 08 04:03:21.616922 master-0 kubenswrapper[18592]: I0308 04:03:21.616907 18592 scope.go:117] "RemoveContainer" containerID="d994098b937be725aa395b7493095859b38e1ecb7ea832e2ee45597ef7cb9baf" Mar 08 04:03:21.617245 master-0 kubenswrapper[18592]: I0308 04:03:21.617212 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d994098b937be725aa395b7493095859b38e1ecb7ea832e2ee45597ef7cb9baf"} err="failed to get container status \"d994098b937be725aa395b7493095859b38e1ecb7ea832e2ee45597ef7cb9baf\": rpc error: code = NotFound desc = could not find 
container \"d994098b937be725aa395b7493095859b38e1ecb7ea832e2ee45597ef7cb9baf\": container with ID starting with d994098b937be725aa395b7493095859b38e1ecb7ea832e2ee45597ef7cb9baf not found: ID does not exist" Mar 08 04:03:21.617245 master-0 kubenswrapper[18592]: I0308 04:03:21.617232 18592 scope.go:117] "RemoveContainer" containerID="9af3f273ab2e4954fd8c62788ec7531287e1944e6d561bcb5c4debba6106c3d2" Mar 08 04:03:21.617447 master-0 kubenswrapper[18592]: I0308 04:03:21.617413 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9af3f273ab2e4954fd8c62788ec7531287e1944e6d561bcb5c4debba6106c3d2"} err="failed to get container status \"9af3f273ab2e4954fd8c62788ec7531287e1944e6d561bcb5c4debba6106c3d2\": rpc error: code = NotFound desc = could not find container \"9af3f273ab2e4954fd8c62788ec7531287e1944e6d561bcb5c4debba6106c3d2\": container with ID starting with 9af3f273ab2e4954fd8c62788ec7531287e1944e6d561bcb5c4debba6106c3d2 not found: ID does not exist" Mar 08 04:03:21.617447 master-0 kubenswrapper[18592]: I0308 04:03:21.617435 18592 scope.go:117] "RemoveContainer" containerID="a3114d26662f332759817838953335b2c8cca6aba781706cddff99047422be36" Mar 08 04:03:21.617674 master-0 kubenswrapper[18592]: I0308 04:03:21.617637 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3114d26662f332759817838953335b2c8cca6aba781706cddff99047422be36"} err="failed to get container status \"a3114d26662f332759817838953335b2c8cca6aba781706cddff99047422be36\": rpc error: code = NotFound desc = could not find container \"a3114d26662f332759817838953335b2c8cca6aba781706cddff99047422be36\": container with ID starting with a3114d26662f332759817838953335b2c8cca6aba781706cddff99047422be36 not found: ID does not exist" Mar 08 04:03:21.617674 master-0 kubenswrapper[18592]: I0308 04:03:21.617665 18592 scope.go:117] "RemoveContainer" containerID="72eeeb9e8301aec92bcbf0c03c9b41cbd449ca5585e0010e8907c03068bd39b1" 
Mar 08 04:03:21.617917 master-0 kubenswrapper[18592]: I0308 04:03:21.617886 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72eeeb9e8301aec92bcbf0c03c9b41cbd449ca5585e0010e8907c03068bd39b1"} err="failed to get container status \"72eeeb9e8301aec92bcbf0c03c9b41cbd449ca5585e0010e8907c03068bd39b1\": rpc error: code = NotFound desc = could not find container \"72eeeb9e8301aec92bcbf0c03c9b41cbd449ca5585e0010e8907c03068bd39b1\": container with ID starting with 72eeeb9e8301aec92bcbf0c03c9b41cbd449ca5585e0010e8907c03068bd39b1 not found: ID does not exist" Mar 08 04:03:21.617917 master-0 kubenswrapper[18592]: I0308 04:03:21.617907 18592 scope.go:117] "RemoveContainer" containerID="4e242806264dd4548803e4e95071aa00ff9cec6da04585217fd9296886017b56" Mar 08 04:03:21.618160 master-0 kubenswrapper[18592]: I0308 04:03:21.618126 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e242806264dd4548803e4e95071aa00ff9cec6da04585217fd9296886017b56"} err="failed to get container status \"4e242806264dd4548803e4e95071aa00ff9cec6da04585217fd9296886017b56\": rpc error: code = NotFound desc = could not find container \"4e242806264dd4548803e4e95071aa00ff9cec6da04585217fd9296886017b56\": container with ID starting with 4e242806264dd4548803e4e95071aa00ff9cec6da04585217fd9296886017b56 not found: ID does not exist" Mar 08 04:03:21.618160 master-0 kubenswrapper[18592]: I0308 04:03:21.618152 18592 scope.go:117] "RemoveContainer" containerID="45ff740367d4cd8aed88382d0c17b62ac56a016c1d16c630e57980524661978f" Mar 08 04:03:21.618405 master-0 kubenswrapper[18592]: I0308 04:03:21.618375 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45ff740367d4cd8aed88382d0c17b62ac56a016c1d16c630e57980524661978f"} err="failed to get container status \"45ff740367d4cd8aed88382d0c17b62ac56a016c1d16c630e57980524661978f\": rpc error: code = NotFound desc = could not find 
container \"45ff740367d4cd8aed88382d0c17b62ac56a016c1d16c630e57980524661978f\": container with ID starting with 45ff740367d4cd8aed88382d0c17b62ac56a016c1d16c630e57980524661978f not found: ID does not exist" Mar 08 04:03:21.618405 master-0 kubenswrapper[18592]: I0308 04:03:21.618399 18592 scope.go:117] "RemoveContainer" containerID="0b1556d958a969007bd0a7dc73cb738262531fc4d85517dab18227f342d41488" Mar 08 04:03:21.618760 master-0 kubenswrapper[18592]: I0308 04:03:21.618728 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b1556d958a969007bd0a7dc73cb738262531fc4d85517dab18227f342d41488"} err="failed to get container status \"0b1556d958a969007bd0a7dc73cb738262531fc4d85517dab18227f342d41488\": rpc error: code = NotFound desc = could not find container \"0b1556d958a969007bd0a7dc73cb738262531fc4d85517dab18227f342d41488\": container with ID starting with 0b1556d958a969007bd0a7dc73cb738262531fc4d85517dab18227f342d41488 not found: ID does not exist" Mar 08 04:03:21.618865 master-0 kubenswrapper[18592]: I0308 04:03:21.618776 18592 scope.go:117] "RemoveContainer" containerID="d994098b937be725aa395b7493095859b38e1ecb7ea832e2ee45597ef7cb9baf" Mar 08 04:03:21.619004 master-0 kubenswrapper[18592]: I0308 04:03:21.618973 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d994098b937be725aa395b7493095859b38e1ecb7ea832e2ee45597ef7cb9baf"} err="failed to get container status \"d994098b937be725aa395b7493095859b38e1ecb7ea832e2ee45597ef7cb9baf\": rpc error: code = NotFound desc = could not find container \"d994098b937be725aa395b7493095859b38e1ecb7ea832e2ee45597ef7cb9baf\": container with ID starting with d994098b937be725aa395b7493095859b38e1ecb7ea832e2ee45597ef7cb9baf not found: ID does not exist" Mar 08 04:03:21.619004 master-0 kubenswrapper[18592]: I0308 04:03:21.618998 18592 scope.go:117] "RemoveContainer" containerID="9af3f273ab2e4954fd8c62788ec7531287e1944e6d561bcb5c4debba6106c3d2" 
Mar 08 04:03:21.619196 master-0 kubenswrapper[18592]: I0308 04:03:21.619173 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9af3f273ab2e4954fd8c62788ec7531287e1944e6d561bcb5c4debba6106c3d2"} err="failed to get container status \"9af3f273ab2e4954fd8c62788ec7531287e1944e6d561bcb5c4debba6106c3d2\": rpc error: code = NotFound desc = could not find container \"9af3f273ab2e4954fd8c62788ec7531287e1944e6d561bcb5c4debba6106c3d2\": container with ID starting with 9af3f273ab2e4954fd8c62788ec7531287e1944e6d561bcb5c4debba6106c3d2 not found: ID does not exist" Mar 08 04:03:21.619196 master-0 kubenswrapper[18592]: I0308 04:03:21.619189 18592 scope.go:117] "RemoveContainer" containerID="a3114d26662f332759817838953335b2c8cca6aba781706cddff99047422be36" Mar 08 04:03:21.619378 master-0 kubenswrapper[18592]: I0308 04:03:21.619359 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3114d26662f332759817838953335b2c8cca6aba781706cddff99047422be36"} err="failed to get container status \"a3114d26662f332759817838953335b2c8cca6aba781706cddff99047422be36\": rpc error: code = NotFound desc = could not find container \"a3114d26662f332759817838953335b2c8cca6aba781706cddff99047422be36\": container with ID starting with a3114d26662f332759817838953335b2c8cca6aba781706cddff99047422be36 not found: ID does not exist" Mar 08 04:03:21.619378 master-0 kubenswrapper[18592]: I0308 04:03:21.619375 18592 scope.go:117] "RemoveContainer" containerID="72eeeb9e8301aec92bcbf0c03c9b41cbd449ca5585e0010e8907c03068bd39b1" Mar 08 04:03:21.619555 master-0 kubenswrapper[18592]: I0308 04:03:21.619513 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72eeeb9e8301aec92bcbf0c03c9b41cbd449ca5585e0010e8907c03068bd39b1"} err="failed to get container status \"72eeeb9e8301aec92bcbf0c03c9b41cbd449ca5585e0010e8907c03068bd39b1\": rpc error: code = NotFound desc = could not find 
container \"72eeeb9e8301aec92bcbf0c03c9b41cbd449ca5585e0010e8907c03068bd39b1\": container with ID starting with 72eeeb9e8301aec92bcbf0c03c9b41cbd449ca5585e0010e8907c03068bd39b1 not found: ID does not exist" Mar 08 04:03:21.619555 master-0 kubenswrapper[18592]: I0308 04:03:21.619527 18592 scope.go:117] "RemoveContainer" containerID="4e242806264dd4548803e4e95071aa00ff9cec6da04585217fd9296886017b56" Mar 08 04:03:21.620589 master-0 kubenswrapper[18592]: I0308 04:03:21.620557 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e242806264dd4548803e4e95071aa00ff9cec6da04585217fd9296886017b56"} err="failed to get container status \"4e242806264dd4548803e4e95071aa00ff9cec6da04585217fd9296886017b56\": rpc error: code = NotFound desc = could not find container \"4e242806264dd4548803e4e95071aa00ff9cec6da04585217fd9296886017b56\": container with ID starting with 4e242806264dd4548803e4e95071aa00ff9cec6da04585217fd9296886017b56 not found: ID does not exist" Mar 08 04:03:21.620589 master-0 kubenswrapper[18592]: I0308 04:03:21.620582 18592 scope.go:117] "RemoveContainer" containerID="45ff740367d4cd8aed88382d0c17b62ac56a016c1d16c630e57980524661978f" Mar 08 04:03:21.620846 master-0 kubenswrapper[18592]: I0308 04:03:21.620789 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45ff740367d4cd8aed88382d0c17b62ac56a016c1d16c630e57980524661978f"} err="failed to get container status \"45ff740367d4cd8aed88382d0c17b62ac56a016c1d16c630e57980524661978f\": rpc error: code = NotFound desc = could not find container \"45ff740367d4cd8aed88382d0c17b62ac56a016c1d16c630e57980524661978f\": container with ID starting with 45ff740367d4cd8aed88382d0c17b62ac56a016c1d16c630e57980524661978f not found: ID does not exist" Mar 08 04:03:21.620846 master-0 kubenswrapper[18592]: I0308 04:03:21.620814 18592 scope.go:117] "RemoveContainer" containerID="0b1556d958a969007bd0a7dc73cb738262531fc4d85517dab18227f342d41488" 
Mar 08 04:03:21.621113 master-0 kubenswrapper[18592]: I0308 04:03:21.621081 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b1556d958a969007bd0a7dc73cb738262531fc4d85517dab18227f342d41488"} err="failed to get container status \"0b1556d958a969007bd0a7dc73cb738262531fc4d85517dab18227f342d41488\": rpc error: code = NotFound desc = could not find container \"0b1556d958a969007bd0a7dc73cb738262531fc4d85517dab18227f342d41488\": container with ID starting with 0b1556d958a969007bd0a7dc73cb738262531fc4d85517dab18227f342d41488 not found: ID does not exist" Mar 08 04:03:21.621113 master-0 kubenswrapper[18592]: I0308 04:03:21.621102 18592 scope.go:117] "RemoveContainer" containerID="d994098b937be725aa395b7493095859b38e1ecb7ea832e2ee45597ef7cb9baf" Mar 08 04:03:21.621439 master-0 kubenswrapper[18592]: I0308 04:03:21.621404 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d994098b937be725aa395b7493095859b38e1ecb7ea832e2ee45597ef7cb9baf"} err="failed to get container status \"d994098b937be725aa395b7493095859b38e1ecb7ea832e2ee45597ef7cb9baf\": rpc error: code = NotFound desc = could not find container \"d994098b937be725aa395b7493095859b38e1ecb7ea832e2ee45597ef7cb9baf\": container with ID starting with d994098b937be725aa395b7493095859b38e1ecb7ea832e2ee45597ef7cb9baf not found: ID does not exist" Mar 08 04:03:21.621439 master-0 kubenswrapper[18592]: I0308 04:03:21.621426 18592 scope.go:117] "RemoveContainer" containerID="9af3f273ab2e4954fd8c62788ec7531287e1944e6d561bcb5c4debba6106c3d2" Mar 08 04:03:21.621679 master-0 kubenswrapper[18592]: I0308 04:03:21.621640 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9af3f273ab2e4954fd8c62788ec7531287e1944e6d561bcb5c4debba6106c3d2"} err="failed to get container status \"9af3f273ab2e4954fd8c62788ec7531287e1944e6d561bcb5c4debba6106c3d2\": rpc error: code = NotFound desc = could not find 
container \"9af3f273ab2e4954fd8c62788ec7531287e1944e6d561bcb5c4debba6106c3d2\": container with ID starting with 9af3f273ab2e4954fd8c62788ec7531287e1944e6d561bcb5c4debba6106c3d2 not found: ID does not exist" Mar 08 04:03:21.621679 master-0 kubenswrapper[18592]: I0308 04:03:21.621670 18592 scope.go:117] "RemoveContainer" containerID="a3114d26662f332759817838953335b2c8cca6aba781706cddff99047422be36" Mar 08 04:03:21.621966 master-0 kubenswrapper[18592]: I0308 04:03:21.621931 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3114d26662f332759817838953335b2c8cca6aba781706cddff99047422be36"} err="failed to get container status \"a3114d26662f332759817838953335b2c8cca6aba781706cddff99047422be36\": rpc error: code = NotFound desc = could not find container \"a3114d26662f332759817838953335b2c8cca6aba781706cddff99047422be36\": container with ID starting with a3114d26662f332759817838953335b2c8cca6aba781706cddff99047422be36 not found: ID does not exist" Mar 08 04:03:21.731252 master-0 kubenswrapper[18592]: I0308 04:03:21.731199 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 08 04:03:21.736686 master-0 kubenswrapper[18592]: I0308 04:03:21.736623 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 08 04:03:21.781935 master-0 kubenswrapper[18592]: I0308 04:03:21.777567 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 08 04:03:21.781935 master-0 kubenswrapper[18592]: E0308 04:03:21.777910 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a26c661f-f843-45c5-85f0-2c2f72cbf580" containerName="kube-rbac-proxy-thanos" Mar 08 04:03:21.781935 master-0 kubenswrapper[18592]: I0308 04:03:21.777925 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="a26c661f-f843-45c5-85f0-2c2f72cbf580" containerName="kube-rbac-proxy-thanos" Mar 08 04:03:21.781935 
master-0 kubenswrapper[18592]: E0308 04:03:21.777941 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a26c661f-f843-45c5-85f0-2c2f72cbf580" containerName="thanos-sidecar" Mar 08 04:03:21.781935 master-0 kubenswrapper[18592]: I0308 04:03:21.777949 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="a26c661f-f843-45c5-85f0-2c2f72cbf580" containerName="thanos-sidecar" Mar 08 04:03:21.781935 master-0 kubenswrapper[18592]: E0308 04:03:21.777981 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a26c661f-f843-45c5-85f0-2c2f72cbf580" containerName="kube-rbac-proxy-web" Mar 08 04:03:21.781935 master-0 kubenswrapper[18592]: I0308 04:03:21.777990 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="a26c661f-f843-45c5-85f0-2c2f72cbf580" containerName="kube-rbac-proxy-web" Mar 08 04:03:21.781935 master-0 kubenswrapper[18592]: E0308 04:03:21.778008 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a26c661f-f843-45c5-85f0-2c2f72cbf580" containerName="prometheus" Mar 08 04:03:21.781935 master-0 kubenswrapper[18592]: I0308 04:03:21.778016 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="a26c661f-f843-45c5-85f0-2c2f72cbf580" containerName="prometheus" Mar 08 04:03:21.781935 master-0 kubenswrapper[18592]: E0308 04:03:21.778032 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a26c661f-f843-45c5-85f0-2c2f72cbf580" containerName="kube-rbac-proxy" Mar 08 04:03:21.781935 master-0 kubenswrapper[18592]: I0308 04:03:21.778040 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="a26c661f-f843-45c5-85f0-2c2f72cbf580" containerName="kube-rbac-proxy" Mar 08 04:03:21.781935 master-0 kubenswrapper[18592]: E0308 04:03:21.778053 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a26c661f-f843-45c5-85f0-2c2f72cbf580" containerName="config-reloader" Mar 08 04:03:21.781935 master-0 kubenswrapper[18592]: I0308 04:03:21.778061 18592 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="a26c661f-f843-45c5-85f0-2c2f72cbf580" containerName="config-reloader" Mar 08 04:03:21.781935 master-0 kubenswrapper[18592]: E0308 04:03:21.778072 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a26c661f-f843-45c5-85f0-2c2f72cbf580" containerName="init-config-reloader" Mar 08 04:03:21.781935 master-0 kubenswrapper[18592]: I0308 04:03:21.778081 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="a26c661f-f843-45c5-85f0-2c2f72cbf580" containerName="init-config-reloader" Mar 08 04:03:21.781935 master-0 kubenswrapper[18592]: I0308 04:03:21.778247 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="a26c661f-f843-45c5-85f0-2c2f72cbf580" containerName="prometheus" Mar 08 04:03:21.781935 master-0 kubenswrapper[18592]: I0308 04:03:21.778274 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="a26c661f-f843-45c5-85f0-2c2f72cbf580" containerName="config-reloader" Mar 08 04:03:21.781935 master-0 kubenswrapper[18592]: I0308 04:03:21.778293 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="a26c661f-f843-45c5-85f0-2c2f72cbf580" containerName="kube-rbac-proxy-web" Mar 08 04:03:21.781935 master-0 kubenswrapper[18592]: I0308 04:03:21.778305 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="a26c661f-f843-45c5-85f0-2c2f72cbf580" containerName="kube-rbac-proxy" Mar 08 04:03:21.781935 master-0 kubenswrapper[18592]: I0308 04:03:21.778317 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="a26c661f-f843-45c5-85f0-2c2f72cbf580" containerName="thanos-sidecar" Mar 08 04:03:21.781935 master-0 kubenswrapper[18592]: I0308 04:03:21.778339 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="a26c661f-f843-45c5-85f0-2c2f72cbf580" containerName="kube-rbac-proxy-thanos" Mar 08 04:03:21.781935 master-0 kubenswrapper[18592]: I0308 04:03:21.781315 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 08 04:03:21.785469 master-0 kubenswrapper[18592]: I0308 04:03:21.784757 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s" Mar 08 04:03:21.785469 master-0 kubenswrapper[18592]: I0308 04:03:21.785035 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-kube-rbac-proxy-web" Mar 08 04:03:21.785469 master-0 kubenswrapper[18592]: I0308 04:03:21.785226 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-prometheus-http-client-file" Mar 08 04:03:21.785469 master-0 kubenswrapper[18592]: I0308 04:03:21.785331 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-dockercfg-snc9w" Mar 08 04:03:21.786971 master-0 kubenswrapper[18592]: I0308 04:03:21.786549 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"serving-certs-ca-bundle" Mar 08 04:03:21.789920 master-0 kubenswrapper[18592]: I0308 04:03:21.788274 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls" Mar 08 04:03:21.789920 master-0 kubenswrapper[18592]: I0308 04:03:21.788572 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls-assets-0" Mar 08 04:03:21.789920 master-0 kubenswrapper[18592]: I0308 04:03:21.788896 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-rbac-proxy" Mar 08 04:03:21.789920 master-0 kubenswrapper[18592]: I0308 04:03:21.789051 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-grpc-tls-44sfif47rohlm" Mar 08 04:03:21.789920 master-0 kubenswrapper[18592]: I0308 04:03:21.789225 18592 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-monitoring"/"prometheus-k8s-thanos-sidecar-tls" Mar 08 04:03:21.789920 master-0 kubenswrapper[18592]: I0308 04:03:21.789378 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-web-config" Mar 08 04:03:21.795481 master-0 kubenswrapper[18592]: I0308 04:03:21.795263 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-k8s-rulefiles-0" Mar 08 04:03:21.800378 master-0 kubenswrapper[18592]: I0308 04:03:21.800020 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-trusted-ca-bundle" Mar 08 04:03:21.803394 master-0 kubenswrapper[18592]: I0308 04:03:21.803341 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 08 04:03:21.868076 master-0 kubenswrapper[18592]: I0308 04:03:21.867953 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0eece71b-beb6-49f4-96cd-6b7476337ded-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0eece71b-beb6-49f4-96cd-6b7476337ded\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 04:03:21.868076 master-0 kubenswrapper[18592]: I0308 04:03:21.868045 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0eece71b-beb6-49f4-96cd-6b7476337ded-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"0eece71b-beb6-49f4-96cd-6b7476337ded\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 04:03:21.868288 master-0 kubenswrapper[18592]: I0308 04:03:21.868122 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: 
\"kubernetes.io/secret/0eece71b-beb6-49f4-96cd-6b7476337ded-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"0eece71b-beb6-49f4-96cd-6b7476337ded\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 04:03:21.868288 master-0 kubenswrapper[18592]: I0308 04:03:21.868156 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0eece71b-beb6-49f4-96cd-6b7476337ded-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"0eece71b-beb6-49f4-96cd-6b7476337ded\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 04:03:21.868288 master-0 kubenswrapper[18592]: I0308 04:03:21.868189 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qltqw\" (UniqueName: \"kubernetes.io/projected/0eece71b-beb6-49f4-96cd-6b7476337ded-kube-api-access-qltqw\") pod \"prometheus-k8s-0\" (UID: \"0eece71b-beb6-49f4-96cd-6b7476337ded\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 04:03:21.868288 master-0 kubenswrapper[18592]: I0308 04:03:21.868228 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0eece71b-beb6-49f4-96cd-6b7476337ded-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"0eece71b-beb6-49f4-96cd-6b7476337ded\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 04:03:21.868506 master-0 kubenswrapper[18592]: I0308 04:03:21.868305 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0eece71b-beb6-49f4-96cd-6b7476337ded-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"0eece71b-beb6-49f4-96cd-6b7476337ded\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 04:03:21.868590 master-0 kubenswrapper[18592]: I0308 04:03:21.868565 18592 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0eece71b-beb6-49f4-96cd-6b7476337ded-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0eece71b-beb6-49f4-96cd-6b7476337ded\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 04:03:21.868644 master-0 kubenswrapper[18592]: I0308 04:03:21.868617 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0eece71b-beb6-49f4-96cd-6b7476337ded-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"0eece71b-beb6-49f4-96cd-6b7476337ded\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 04:03:21.868680 master-0 kubenswrapper[18592]: I0308 04:03:21.868664 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0eece71b-beb6-49f4-96cd-6b7476337ded-config\") pod \"prometheus-k8s-0\" (UID: \"0eece71b-beb6-49f4-96cd-6b7476337ded\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 04:03:21.868724 master-0 kubenswrapper[18592]: I0308 04:03:21.868692 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/0eece71b-beb6-49f4-96cd-6b7476337ded-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"0eece71b-beb6-49f4-96cd-6b7476337ded\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 04:03:21.868781 master-0 kubenswrapper[18592]: I0308 04:03:21.868756 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0eece71b-beb6-49f4-96cd-6b7476337ded-prometheus-k8s-rulefiles-0\") pod 
\"prometheus-k8s-0\" (UID: \"0eece71b-beb6-49f4-96cd-6b7476337ded\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 04:03:21.868857 master-0 kubenswrapper[18592]: I0308 04:03:21.868802 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/0eece71b-beb6-49f4-96cd-6b7476337ded-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"0eece71b-beb6-49f4-96cd-6b7476337ded\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 04:03:21.869182 master-0 kubenswrapper[18592]: I0308 04:03:21.868901 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0eece71b-beb6-49f4-96cd-6b7476337ded-config-out\") pod \"prometheus-k8s-0\" (UID: \"0eece71b-beb6-49f4-96cd-6b7476337ded\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 04:03:21.869182 master-0 kubenswrapper[18592]: I0308 04:03:21.868936 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/0eece71b-beb6-49f4-96cd-6b7476337ded-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"0eece71b-beb6-49f4-96cd-6b7476337ded\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 04:03:21.869182 master-0 kubenswrapper[18592]: I0308 04:03:21.868973 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0eece71b-beb6-49f4-96cd-6b7476337ded-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0eece71b-beb6-49f4-96cd-6b7476337ded\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 04:03:21.869182 master-0 kubenswrapper[18592]: I0308 04:03:21.869135 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/0eece71b-beb6-49f4-96cd-6b7476337ded-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"0eece71b-beb6-49f4-96cd-6b7476337ded\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 04:03:21.869318 master-0 kubenswrapper[18592]: I0308 04:03:21.869202 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0eece71b-beb6-49f4-96cd-6b7476337ded-web-config\") pod \"prometheus-k8s-0\" (UID: \"0eece71b-beb6-49f4-96cd-6b7476337ded\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 04:03:21.971350 master-0 kubenswrapper[18592]: I0308 04:03:21.971220 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/0eece71b-beb6-49f4-96cd-6b7476337ded-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"0eece71b-beb6-49f4-96cd-6b7476337ded\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 04:03:21.971350 master-0 kubenswrapper[18592]: I0308 04:03:21.971320 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0eece71b-beb6-49f4-96cd-6b7476337ded-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0eece71b-beb6-49f4-96cd-6b7476337ded\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 04:03:21.971884 master-0 kubenswrapper[18592]: I0308 04:03:21.971391 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/0eece71b-beb6-49f4-96cd-6b7476337ded-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"0eece71b-beb6-49f4-96cd-6b7476337ded\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 04:03:21.971884 master-0 kubenswrapper[18592]: I0308 04:03:21.971428 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"web-config\" (UniqueName: \"kubernetes.io/secret/0eece71b-beb6-49f4-96cd-6b7476337ded-web-config\") pod \"prometheus-k8s-0\" (UID: \"0eece71b-beb6-49f4-96cd-6b7476337ded\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 04:03:21.971884 master-0 kubenswrapper[18592]: I0308 04:03:21.971470 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0eece71b-beb6-49f4-96cd-6b7476337ded-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0eece71b-beb6-49f4-96cd-6b7476337ded\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 04:03:21.971884 master-0 kubenswrapper[18592]: I0308 04:03:21.971513 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0eece71b-beb6-49f4-96cd-6b7476337ded-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"0eece71b-beb6-49f4-96cd-6b7476337ded\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 04:03:21.971884 master-0 kubenswrapper[18592]: I0308 04:03:21.971579 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/0eece71b-beb6-49f4-96cd-6b7476337ded-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"0eece71b-beb6-49f4-96cd-6b7476337ded\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 04:03:21.971884 master-0 kubenswrapper[18592]: I0308 04:03:21.971610 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0eece71b-beb6-49f4-96cd-6b7476337ded-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"0eece71b-beb6-49f4-96cd-6b7476337ded\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 04:03:21.971884 master-0 kubenswrapper[18592]: I0308 04:03:21.971641 18592 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qltqw\" (UniqueName: \"kubernetes.io/projected/0eece71b-beb6-49f4-96cd-6b7476337ded-kube-api-access-qltqw\") pod \"prometheus-k8s-0\" (UID: \"0eece71b-beb6-49f4-96cd-6b7476337ded\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 04:03:21.971884 master-0 kubenswrapper[18592]: I0308 04:03:21.971676 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0eece71b-beb6-49f4-96cd-6b7476337ded-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"0eece71b-beb6-49f4-96cd-6b7476337ded\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 04:03:21.971884 master-0 kubenswrapper[18592]: I0308 04:03:21.971708 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0eece71b-beb6-49f4-96cd-6b7476337ded-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"0eece71b-beb6-49f4-96cd-6b7476337ded\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 04:03:21.971884 master-0 kubenswrapper[18592]: I0308 04:03:21.971751 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0eece71b-beb6-49f4-96cd-6b7476337ded-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0eece71b-beb6-49f4-96cd-6b7476337ded\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 04:03:21.971884 master-0 kubenswrapper[18592]: I0308 04:03:21.971790 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0eece71b-beb6-49f4-96cd-6b7476337ded-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"0eece71b-beb6-49f4-96cd-6b7476337ded\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 
04:03:21.971884 master-0 kubenswrapper[18592]: I0308 04:03:21.971847 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0eece71b-beb6-49f4-96cd-6b7476337ded-config\") pod \"prometheus-k8s-0\" (UID: \"0eece71b-beb6-49f4-96cd-6b7476337ded\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 04:03:21.971884 master-0 kubenswrapper[18592]: I0308 04:03:21.971888 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/0eece71b-beb6-49f4-96cd-6b7476337ded-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"0eece71b-beb6-49f4-96cd-6b7476337ded\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 04:03:21.973163 master-0 kubenswrapper[18592]: I0308 04:03:21.971931 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0eece71b-beb6-49f4-96cd-6b7476337ded-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"0eece71b-beb6-49f4-96cd-6b7476337ded\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 04:03:21.973163 master-0 kubenswrapper[18592]: I0308 04:03:21.972020 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/0eece71b-beb6-49f4-96cd-6b7476337ded-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"0eece71b-beb6-49f4-96cd-6b7476337ded\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 04:03:21.973163 master-0 kubenswrapper[18592]: I0308 04:03:21.972077 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0eece71b-beb6-49f4-96cd-6b7476337ded-config-out\") pod \"prometheus-k8s-0\" (UID: \"0eece71b-beb6-49f4-96cd-6b7476337ded\") " 
pod="openshift-monitoring/prometheus-k8s-0" Mar 08 04:03:21.980571 master-0 kubenswrapper[18592]: I0308 04:03:21.974197 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0eece71b-beb6-49f4-96cd-6b7476337ded-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"0eece71b-beb6-49f4-96cd-6b7476337ded\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 04:03:21.980571 master-0 kubenswrapper[18592]: I0308 04:03:21.974475 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/0eece71b-beb6-49f4-96cd-6b7476337ded-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"0eece71b-beb6-49f4-96cd-6b7476337ded\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 04:03:21.980571 master-0 kubenswrapper[18592]: I0308 04:03:21.975310 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0eece71b-beb6-49f4-96cd-6b7476337ded-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0eece71b-beb6-49f4-96cd-6b7476337ded\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 04:03:21.980571 master-0 kubenswrapper[18592]: I0308 04:03:21.975761 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0eece71b-beb6-49f4-96cd-6b7476337ded-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0eece71b-beb6-49f4-96cd-6b7476337ded\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 04:03:21.980571 master-0 kubenswrapper[18592]: I0308 04:03:21.977987 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0eece71b-beb6-49f4-96cd-6b7476337ded-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" 
(UID: \"0eece71b-beb6-49f4-96cd-6b7476337ded\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 04:03:21.980571 master-0 kubenswrapper[18592]: I0308 04:03:21.979199 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/0eece71b-beb6-49f4-96cd-6b7476337ded-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"0eece71b-beb6-49f4-96cd-6b7476337ded\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 04:03:21.981875 master-0 kubenswrapper[18592]: I0308 04:03:21.981231 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0eece71b-beb6-49f4-96cd-6b7476337ded-web-config\") pod \"prometheus-k8s-0\" (UID: \"0eece71b-beb6-49f4-96cd-6b7476337ded\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 04:03:21.981875 master-0 kubenswrapper[18592]: I0308 04:03:21.981341 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0eece71b-beb6-49f4-96cd-6b7476337ded-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"0eece71b-beb6-49f4-96cd-6b7476337ded\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 04:03:21.981875 master-0 kubenswrapper[18592]: I0308 04:03:21.981680 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0eece71b-beb6-49f4-96cd-6b7476337ded-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"0eece71b-beb6-49f4-96cd-6b7476337ded\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 04:03:21.981875 master-0 kubenswrapper[18592]: I0308 04:03:21.981869 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/0eece71b-beb6-49f4-96cd-6b7476337ded-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: 
\"0eece71b-beb6-49f4-96cd-6b7476337ded\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 04:03:21.983865 master-0 kubenswrapper[18592]: I0308 04:03:21.983725 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/0eece71b-beb6-49f4-96cd-6b7476337ded-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"0eece71b-beb6-49f4-96cd-6b7476337ded\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 04:03:21.984793 master-0 kubenswrapper[18592]: I0308 04:03:21.984747 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/0eece71b-beb6-49f4-96cd-6b7476337ded-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"0eece71b-beb6-49f4-96cd-6b7476337ded\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 04:03:21.984983 master-0 kubenswrapper[18592]: I0308 04:03:21.984808 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0eece71b-beb6-49f4-96cd-6b7476337ded-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"0eece71b-beb6-49f4-96cd-6b7476337ded\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 04:03:21.984983 master-0 kubenswrapper[18592]: I0308 04:03:21.984908 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0eece71b-beb6-49f4-96cd-6b7476337ded-config-out\") pod \"prometheus-k8s-0\" (UID: \"0eece71b-beb6-49f4-96cd-6b7476337ded\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 04:03:21.985151 master-0 kubenswrapper[18592]: I0308 04:03:21.985115 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0eece71b-beb6-49f4-96cd-6b7476337ded-prometheus-k8s-rulefiles-0\") pod 
\"prometheus-k8s-0\" (UID: \"0eece71b-beb6-49f4-96cd-6b7476337ded\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 04:03:21.985464 master-0 kubenswrapper[18592]: I0308 04:03:21.985372 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0eece71b-beb6-49f4-96cd-6b7476337ded-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"0eece71b-beb6-49f4-96cd-6b7476337ded\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 04:03:21.999683 master-0 kubenswrapper[18592]: I0308 04:03:21.999612 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/0eece71b-beb6-49f4-96cd-6b7476337ded-config\") pod \"prometheus-k8s-0\" (UID: \"0eece71b-beb6-49f4-96cd-6b7476337ded\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 04:03:22.007605 master-0 kubenswrapper[18592]: I0308 04:03:22.007549 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qltqw\" (UniqueName: \"kubernetes.io/projected/0eece71b-beb6-49f4-96cd-6b7476337ded-kube-api-access-qltqw\") pod \"prometheus-k8s-0\" (UID: \"0eece71b-beb6-49f4-96cd-6b7476337ded\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 04:03:22.104807 master-0 kubenswrapper[18592]: I0308 04:03:22.104738 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 08 04:03:22.156421 master-0 kubenswrapper[18592]: I0308 04:03:22.156359 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a26c661f-f843-45c5-85f0-2c2f72cbf580" path="/var/lib/kubelet/pods/a26c661f-f843-45c5-85f0-2c2f72cbf580/volumes" Mar 08 04:03:22.652750 master-0 kubenswrapper[18592]: I0308 04:03:22.652671 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 08 04:03:23.303314 master-0 kubenswrapper[18592]: I0308 04:03:23.303141 18592 patch_prober.go:28] interesting pod/console-7748864899-8p6h5 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body= Mar 08 04:03:23.303314 master-0 kubenswrapper[18592]: I0308 04:03:23.303237 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-7748864899-8p6h5" podUID="5bc0469d-1ae9-4606-ba99-7a333b66af37" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" Mar 08 04:03:23.401351 master-0 kubenswrapper[18592]: I0308 04:03:23.401206 18592 generic.go:334] "Generic (PLEG): container finished" podID="0eece71b-beb6-49f4-96cd-6b7476337ded" containerID="688a711cce652964eeebad36d954d21898333ad24efd66e2f8f73ce5c4ace359" exitCode=0 Mar 08 04:03:23.401613 master-0 kubenswrapper[18592]: I0308 04:03:23.401365 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0eece71b-beb6-49f4-96cd-6b7476337ded","Type":"ContainerDied","Data":"688a711cce652964eeebad36d954d21898333ad24efd66e2f8f73ce5c4ace359"} Mar 08 04:03:23.401613 master-0 kubenswrapper[18592]: I0308 04:03:23.401516 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"0eece71b-beb6-49f4-96cd-6b7476337ded","Type":"ContainerStarted","Data":"204c8381da92d8e3ef3dd2d7e0e17534220863c20bfdf8dff0ac5462727a1f22"} Mar 08 04:03:24.413855 master-0 kubenswrapper[18592]: I0308 04:03:24.413753 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0eece71b-beb6-49f4-96cd-6b7476337ded","Type":"ContainerStarted","Data":"61826b5f596d9f62e7fb67a673f840639c73714122712d98b16d6f426e6ffeee"} Mar 08 04:03:24.414603 master-0 kubenswrapper[18592]: I0308 04:03:24.413871 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0eece71b-beb6-49f4-96cd-6b7476337ded","Type":"ContainerStarted","Data":"dc1d8be47722fca7f83856cad3810a86701aa7d83d26dc31d57ad757838e2a7c"} Mar 08 04:03:24.414603 master-0 kubenswrapper[18592]: I0308 04:03:24.413904 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0eece71b-beb6-49f4-96cd-6b7476337ded","Type":"ContainerStarted","Data":"bb2318b80fdd776b4982b9b4f9dda5ea5d6e8cd32d0559c39ad870d79d900d30"} Mar 08 04:03:24.414603 master-0 kubenswrapper[18592]: I0308 04:03:24.413931 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0eece71b-beb6-49f4-96cd-6b7476337ded","Type":"ContainerStarted","Data":"32a99a9f836520d8adb9c9f3d774ef4774155f6b04510ceb8d6f7ca6f8d20be2"} Mar 08 04:03:24.414603 master-0 kubenswrapper[18592]: I0308 04:03:24.413956 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0eece71b-beb6-49f4-96cd-6b7476337ded","Type":"ContainerStarted","Data":"48330da01f1a31c41e375193a065e19ebfbec4a40aa343dea757ad2de1148490"} Mar 08 04:03:24.414603 master-0 kubenswrapper[18592]: I0308 04:03:24.413981 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"0eece71b-beb6-49f4-96cd-6b7476337ded","Type":"ContainerStarted","Data":"9cf4b56c2735e7b0d50fe0f8970ede25bcb81f26a08157f0110561706ba40f7e"} Mar 08 04:03:24.458331 master-0 kubenswrapper[18592]: I0308 04:03:24.458243 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=3.458218201 podStartE2EDuration="3.458218201s" podCreationTimestamp="2026-03-08 04:03:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 04:03:24.453100784 +0000 UTC m=+616.551855174" watchObservedRunningTime="2026-03-08 04:03:24.458218201 +0000 UTC m=+616.556972591" Mar 08 04:03:27.105894 master-0 kubenswrapper[18592]: I0308 04:03:27.105748 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-k8s-0" Mar 08 04:03:29.032686 master-0 kubenswrapper[18592]: I0308 04:03:29.031557 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-5-retry-1-master-0"] Mar 08 04:03:29.034358 master-0 kubenswrapper[18592]: I0308 04:03:29.034319 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-5-retry-1-master-0" Mar 08 04:03:29.037807 master-0 kubenswrapper[18592]: I0308 04:03:29.037196 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-5-retry-1-master-0"] Mar 08 04:03:29.038770 master-0 kubenswrapper[18592]: I0308 04:03:29.038695 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 08 04:03:29.039251 master-0 kubenswrapper[18592]: I0308 04:03:29.039191 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-ccz75" Mar 08 04:03:29.093690 master-0 kubenswrapper[18592]: I0308 04:03:29.093611 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b73e6391-2d8f-46e1-b275-a9106a730d60-kubelet-dir\") pod \"installer-5-retry-1-master-0\" (UID: \"b73e6391-2d8f-46e1-b275-a9106a730d60\") " pod="openshift-kube-apiserver/installer-5-retry-1-master-0" Mar 08 04:03:29.093690 master-0 kubenswrapper[18592]: I0308 04:03:29.093697 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b73e6391-2d8f-46e1-b275-a9106a730d60-var-lock\") pod \"installer-5-retry-1-master-0\" (UID: \"b73e6391-2d8f-46e1-b275-a9106a730d60\") " pod="openshift-kube-apiserver/installer-5-retry-1-master-0" Mar 08 04:03:29.094041 master-0 kubenswrapper[18592]: I0308 04:03:29.093745 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b73e6391-2d8f-46e1-b275-a9106a730d60-kube-api-access\") pod \"installer-5-retry-1-master-0\" (UID: \"b73e6391-2d8f-46e1-b275-a9106a730d60\") " pod="openshift-kube-apiserver/installer-5-retry-1-master-0" Mar 08 04:03:29.195531 master-0 
kubenswrapper[18592]: I0308 04:03:29.195449 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b73e6391-2d8f-46e1-b275-a9106a730d60-kubelet-dir\") pod \"installer-5-retry-1-master-0\" (UID: \"b73e6391-2d8f-46e1-b275-a9106a730d60\") " pod="openshift-kube-apiserver/installer-5-retry-1-master-0" Mar 08 04:03:29.195531 master-0 kubenswrapper[18592]: I0308 04:03:29.195527 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b73e6391-2d8f-46e1-b275-a9106a730d60-var-lock\") pod \"installer-5-retry-1-master-0\" (UID: \"b73e6391-2d8f-46e1-b275-a9106a730d60\") " pod="openshift-kube-apiserver/installer-5-retry-1-master-0" Mar 08 04:03:29.195893 master-0 kubenswrapper[18592]: I0308 04:03:29.195618 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b73e6391-2d8f-46e1-b275-a9106a730d60-kubelet-dir\") pod \"installer-5-retry-1-master-0\" (UID: \"b73e6391-2d8f-46e1-b275-a9106a730d60\") " pod="openshift-kube-apiserver/installer-5-retry-1-master-0" Mar 08 04:03:29.195893 master-0 kubenswrapper[18592]: I0308 04:03:29.195578 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b73e6391-2d8f-46e1-b275-a9106a730d60-kube-api-access\") pod \"installer-5-retry-1-master-0\" (UID: \"b73e6391-2d8f-46e1-b275-a9106a730d60\") " pod="openshift-kube-apiserver/installer-5-retry-1-master-0" Mar 08 04:03:29.196028 master-0 kubenswrapper[18592]: I0308 04:03:29.195968 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b73e6391-2d8f-46e1-b275-a9106a730d60-var-lock\") pod \"installer-5-retry-1-master-0\" (UID: \"b73e6391-2d8f-46e1-b275-a9106a730d60\") " 
pod="openshift-kube-apiserver/installer-5-retry-1-master-0" Mar 08 04:03:29.225497 master-0 kubenswrapper[18592]: I0308 04:03:29.225428 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b73e6391-2d8f-46e1-b275-a9106a730d60-kube-api-access\") pod \"installer-5-retry-1-master-0\" (UID: \"b73e6391-2d8f-46e1-b275-a9106a730d60\") " pod="openshift-kube-apiserver/installer-5-retry-1-master-0" Mar 08 04:03:29.370396 master-0 kubenswrapper[18592]: I0308 04:03:29.370257 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-5-retry-1-master-0" Mar 08 04:03:29.902046 master-0 kubenswrapper[18592]: I0308 04:03:29.901976 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-5-retry-1-master-0"] Mar 08 04:03:29.902209 master-0 kubenswrapper[18592]: W0308 04:03:29.902164 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podb73e6391_2d8f_46e1_b275_a9106a730d60.slice/crio-908f5aef9c7d532c7deaec8b20a17221ae2870023e1e2740acecef674b075b52 WatchSource:0}: Error finding container 908f5aef9c7d532c7deaec8b20a17221ae2870023e1e2740acecef674b075b52: Status 404 returned error can't find the container with id 908f5aef9c7d532c7deaec8b20a17221ae2870023e1e2740acecef674b075b52 Mar 08 04:03:30.472624 master-0 kubenswrapper[18592]: I0308 04:03:30.472551 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-5-retry-1-master-0" event={"ID":"b73e6391-2d8f-46e1-b275-a9106a730d60","Type":"ContainerStarted","Data":"908f5aef9c7d532c7deaec8b20a17221ae2870023e1e2740acecef674b075b52"} Mar 08 04:03:30.676221 master-0 kubenswrapper[18592]: I0308 04:03:30.676136 18592 patch_prober.go:28] interesting pod/console-744db48f96-lgsd4 container/console namespace/openshift-console: Startup probe status=failure output="Get 
\"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" start-of-body= Mar 08 04:03:30.676536 master-0 kubenswrapper[18592]: I0308 04:03:30.676222 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-744db48f96-lgsd4" podUID="935ab7fb-b097-41c3-8926-8343eb29e7fc" containerName="console" probeResult="failure" output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" Mar 08 04:03:31.484652 master-0 kubenswrapper[18592]: I0308 04:03:31.484534 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-5-retry-1-master-0" event={"ID":"b73e6391-2d8f-46e1-b275-a9106a730d60","Type":"ContainerStarted","Data":"d9bb3f958d2b5e248ca9acef3858fad3705cf981affbc68bf5ea66794147a61c"} Mar 08 04:03:31.523809 master-0 kubenswrapper[18592]: I0308 04:03:31.523680 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-5-retry-1-master-0" podStartSLOduration=2.523642245 podStartE2EDuration="2.523642245s" podCreationTimestamp="2026-03-08 04:03:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 04:03:31.50986985 +0000 UTC m=+623.608624240" watchObservedRunningTime="2026-03-08 04:03:31.523642245 +0000 UTC m=+623.622396675" Mar 08 04:03:33.302812 master-0 kubenswrapper[18592]: I0308 04:03:33.302688 18592 patch_prober.go:28] interesting pod/console-7748864899-8p6h5 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body= Mar 08 04:03:33.303877 master-0 kubenswrapper[18592]: I0308 04:03:33.302923 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-7748864899-8p6h5" podUID="5bc0469d-1ae9-4606-ba99-7a333b66af37" 
containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" Mar 08 04:03:40.676167 master-0 kubenswrapper[18592]: I0308 04:03:40.676059 18592 patch_prober.go:28] interesting pod/console-744db48f96-lgsd4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" start-of-body= Mar 08 04:03:40.676167 master-0 kubenswrapper[18592]: I0308 04:03:40.676135 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-744db48f96-lgsd4" podUID="935ab7fb-b097-41c3-8926-8343eb29e7fc" containerName="console" probeResult="failure" output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" Mar 08 04:03:43.302153 master-0 kubenswrapper[18592]: I0308 04:03:43.302082 18592 patch_prober.go:28] interesting pod/console-7748864899-8p6h5 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body= Mar 08 04:03:43.302649 master-0 kubenswrapper[18592]: I0308 04:03:43.302179 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-7748864899-8p6h5" podUID="5bc0469d-1ae9-4606-ba99-7a333b66af37" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" Mar 08 04:03:44.036197 master-0 kubenswrapper[18592]: I0308 04:03:44.036099 18592 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 08 04:03:44.036516 master-0 kubenswrapper[18592]: E0308 04:03:44.036320 18592 file.go:109] "Unable to process watch event" err="can't process config file 
\"/etc/kubernetes/manifests/kube-controller-manager-pod.yaml\": /etc/kubernetes/manifests/kube-controller-manager-pod.yaml: couldn't parse as pod(Object 'Kind' is missing in 'null'), please check config file" Mar 08 04:03:44.036621 master-0 kubenswrapper[18592]: I0308 04:03:44.036554 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="0580c83f64e952a7a614903b6fdf6965" containerName="kube-controller-manager-cert-syncer" containerID="cri-o://a202a6ef0655d9c023cb7f6ed3d9196a81b1bf82cb7639b6b968c5efc0a0ade5" gracePeriod=30 Mar 08 04:03:44.036754 master-0 kubenswrapper[18592]: I0308 04:03:44.036658 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="0580c83f64e952a7a614903b6fdf6965" containerName="kube-controller-manager-recovery-controller" containerID="cri-o://7e6e0a1dd4628ac65a5a1b06d4e9688e783a295582fe830c670261a611a4156c" gracePeriod=30 Mar 08 04:03:44.036882 master-0 kubenswrapper[18592]: I0308 04:03:44.036707 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="0580c83f64e952a7a614903b6fdf6965" containerName="cluster-policy-controller" containerID="cri-o://8c148cbb0915a5eeb34b064563cd84c2bd79992dadfa51f04ff27c52ca0a1379" gracePeriod=30 Mar 08 04:03:44.036971 master-0 kubenswrapper[18592]: I0308 04:03:44.036721 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="0580c83f64e952a7a614903b6fdf6965" containerName="kube-controller-manager" containerID="cri-o://b86660b89db34f60351c03ded37334fd4a438f92a5c1c1cdc379852f41cf84f8" gracePeriod=30 Mar 08 04:03:44.041073 master-0 kubenswrapper[18592]: I0308 04:03:44.041025 18592 kubelet.go:2421] "SyncLoop ADD" 
source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 08 04:03:44.041651 master-0 kubenswrapper[18592]: E0308 04:03:44.041621 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0580c83f64e952a7a614903b6fdf6965" containerName="kube-controller-manager-recovery-controller" Mar 08 04:03:44.041805 master-0 kubenswrapper[18592]: I0308 04:03:44.041784 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="0580c83f64e952a7a614903b6fdf6965" containerName="kube-controller-manager-recovery-controller" Mar 08 04:03:44.041984 master-0 kubenswrapper[18592]: E0308 04:03:44.041961 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0580c83f64e952a7a614903b6fdf6965" containerName="kube-controller-manager-cert-syncer" Mar 08 04:03:44.042108 master-0 kubenswrapper[18592]: I0308 04:03:44.042088 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="0580c83f64e952a7a614903b6fdf6965" containerName="kube-controller-manager-cert-syncer" Mar 08 04:03:44.042230 master-0 kubenswrapper[18592]: E0308 04:03:44.042210 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0580c83f64e952a7a614903b6fdf6965" containerName="kube-controller-manager" Mar 08 04:03:44.042357 master-0 kubenswrapper[18592]: I0308 04:03:44.042335 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="0580c83f64e952a7a614903b6fdf6965" containerName="kube-controller-manager" Mar 08 04:03:44.042481 master-0 kubenswrapper[18592]: E0308 04:03:44.042461 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0580c83f64e952a7a614903b6fdf6965" containerName="cluster-policy-controller" Mar 08 04:03:44.043376 master-0 kubenswrapper[18592]: I0308 04:03:44.042578 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="0580c83f64e952a7a614903b6fdf6965" containerName="cluster-policy-controller" Mar 08 04:03:44.044025 master-0 kubenswrapper[18592]: E0308 04:03:44.043956 18592 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="0580c83f64e952a7a614903b6fdf6965" containerName="cluster-policy-controller" Mar 08 04:03:44.044327 master-0 kubenswrapper[18592]: I0308 04:03:44.044304 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="0580c83f64e952a7a614903b6fdf6965" containerName="cluster-policy-controller" Mar 08 04:03:44.044472 master-0 kubenswrapper[18592]: E0308 04:03:44.044450 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0580c83f64e952a7a614903b6fdf6965" containerName="cluster-policy-controller" Mar 08 04:03:44.044601 master-0 kubenswrapper[18592]: I0308 04:03:44.044581 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="0580c83f64e952a7a614903b6fdf6965" containerName="cluster-policy-controller" Mar 08 04:03:44.044788 master-0 kubenswrapper[18592]: E0308 04:03:44.044712 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0580c83f64e952a7a614903b6fdf6965" containerName="kube-controller-manager" Mar 08 04:03:44.045030 master-0 kubenswrapper[18592]: I0308 04:03:44.045001 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="0580c83f64e952a7a614903b6fdf6965" containerName="kube-controller-manager" Mar 08 04:03:44.045520 master-0 kubenswrapper[18592]: I0308 04:03:44.045493 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="0580c83f64e952a7a614903b6fdf6965" containerName="cluster-policy-controller" Mar 08 04:03:44.045677 master-0 kubenswrapper[18592]: I0308 04:03:44.045651 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="0580c83f64e952a7a614903b6fdf6965" containerName="cluster-policy-controller" Mar 08 04:03:44.046016 master-0 kubenswrapper[18592]: I0308 04:03:44.045946 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="0580c83f64e952a7a614903b6fdf6965" containerName="cluster-policy-controller" Mar 08 04:03:44.046371 master-0 kubenswrapper[18592]: I0308 04:03:44.046342 18592 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="0580c83f64e952a7a614903b6fdf6965" containerName="kube-controller-manager-recovery-controller" Mar 08 04:03:44.046643 master-0 kubenswrapper[18592]: I0308 04:03:44.046621 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="0580c83f64e952a7a614903b6fdf6965" containerName="cluster-policy-controller" Mar 08 04:03:44.046895 master-0 kubenswrapper[18592]: I0308 04:03:44.046873 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="0580c83f64e952a7a614903b6fdf6965" containerName="kube-controller-manager" Mar 08 04:03:44.047157 master-0 kubenswrapper[18592]: I0308 04:03:44.047099 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="0580c83f64e952a7a614903b6fdf6965" containerName="kube-controller-manager-recovery-controller" Mar 08 04:03:44.047389 master-0 kubenswrapper[18592]: I0308 04:03:44.047367 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="0580c83f64e952a7a614903b6fdf6965" containerName="kube-controller-manager" Mar 08 04:03:44.047658 master-0 kubenswrapper[18592]: I0308 04:03:44.047635 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="0580c83f64e952a7a614903b6fdf6965" containerName="kube-controller-manager-cert-syncer" Mar 08 04:03:44.048450 master-0 kubenswrapper[18592]: E0308 04:03:44.048384 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0580c83f64e952a7a614903b6fdf6965" containerName="kube-controller-manager-recovery-controller" Mar 08 04:03:44.048740 master-0 kubenswrapper[18592]: I0308 04:03:44.048670 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="0580c83f64e952a7a614903b6fdf6965" containerName="kube-controller-manager-recovery-controller" Mar 08 04:03:44.049218 master-0 kubenswrapper[18592]: E0308 04:03:44.049135 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0580c83f64e952a7a614903b6fdf6965" containerName="cluster-policy-controller" Mar 08 04:03:44.049592 master-0 
kubenswrapper[18592]: I0308 04:03:44.049526 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="0580c83f64e952a7a614903b6fdf6965" containerName="cluster-policy-controller" Mar 08 04:03:44.050015 master-0 kubenswrapper[18592]: E0308 04:03:44.049990 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0580c83f64e952a7a614903b6fdf6965" containerName="cluster-policy-controller" Mar 08 04:03:44.050274 master-0 kubenswrapper[18592]: I0308 04:03:44.050250 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="0580c83f64e952a7a614903b6fdf6965" containerName="cluster-policy-controller" Mar 08 04:03:44.051094 master-0 kubenswrapper[18592]: I0308 04:03:44.051066 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="0580c83f64e952a7a614903b6fdf6965" containerName="cluster-policy-controller" Mar 08 04:03:44.262180 master-0 kubenswrapper[18592]: I0308 04:03:44.261757 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/1a9d6aff6989e430a5479397d5c55a56-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"1a9d6aff6989e430a5479397d5c55a56\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 04:03:44.262180 master-0 kubenswrapper[18592]: I0308 04:03:44.261933 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/1a9d6aff6989e430a5479397d5c55a56-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"1a9d6aff6989e430a5479397d5c55a56\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 04:03:44.294192 master-0 kubenswrapper[18592]: I0308 04:03:44.294074 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_0580c83f64e952a7a614903b6fdf6965/cluster-policy-controller/3.log" Mar 08 
04:03:44.295602 master-0 kubenswrapper[18592]: I0308 04:03:44.295576 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_0580c83f64e952a7a614903b6fdf6965/kube-controller-manager-cert-syncer/0.log" Mar 08 04:03:44.296215 master-0 kubenswrapper[18592]: I0308 04:03:44.296188 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_0580c83f64e952a7a614903b6fdf6965/kube-controller-manager/0.log" Mar 08 04:03:44.296323 master-0 kubenswrapper[18592]: I0308 04:03:44.296305 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 04:03:44.300333 master-0 kubenswrapper[18592]: I0308 04:03:44.300294 18592 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="0580c83f64e952a7a614903b6fdf6965" podUID="1a9d6aff6989e430a5479397d5c55a56" Mar 08 04:03:44.363427 master-0 kubenswrapper[18592]: I0308 04:03:44.363323 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/0580c83f64e952a7a614903b6fdf6965-resource-dir\") pod \"0580c83f64e952a7a614903b6fdf6965\" (UID: \"0580c83f64e952a7a614903b6fdf6965\") " Mar 08 04:03:44.364411 master-0 kubenswrapper[18592]: I0308 04:03:44.363463 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0580c83f64e952a7a614903b6fdf6965-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "0580c83f64e952a7a614903b6fdf6965" (UID: "0580c83f64e952a7a614903b6fdf6965"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 04:03:44.364411 master-0 kubenswrapper[18592]: I0308 04:03:44.363491 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/0580c83f64e952a7a614903b6fdf6965-cert-dir\") pod \"0580c83f64e952a7a614903b6fdf6965\" (UID: \"0580c83f64e952a7a614903b6fdf6965\") " Mar 08 04:03:44.364411 master-0 kubenswrapper[18592]: I0308 04:03:44.363572 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0580c83f64e952a7a614903b6fdf6965-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "0580c83f64e952a7a614903b6fdf6965" (UID: "0580c83f64e952a7a614903b6fdf6965"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 04:03:44.364411 master-0 kubenswrapper[18592]: I0308 04:03:44.364108 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/1a9d6aff6989e430a5479397d5c55a56-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"1a9d6aff6989e430a5479397d5c55a56\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 04:03:44.364411 master-0 kubenswrapper[18592]: I0308 04:03:44.364188 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/1a9d6aff6989e430a5479397d5c55a56-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"1a9d6aff6989e430a5479397d5c55a56\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 04:03:44.364411 master-0 kubenswrapper[18592]: I0308 04:03:44.364318 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/1a9d6aff6989e430a5479397d5c55a56-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"1a9d6aff6989e430a5479397d5c55a56\") " 
pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 04:03:44.364782 master-0 kubenswrapper[18592]: I0308 04:03:44.364410 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/1a9d6aff6989e430a5479397d5c55a56-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"1a9d6aff6989e430a5479397d5c55a56\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 04:03:44.364782 master-0 kubenswrapper[18592]: I0308 04:03:44.364680 18592 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/0580c83f64e952a7a614903b6fdf6965-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 04:03:44.364782 master-0 kubenswrapper[18592]: I0308 04:03:44.364698 18592 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/0580c83f64e952a7a614903b6fdf6965-cert-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 04:03:44.636793 master-0 kubenswrapper[18592]: I0308 04:03:44.636608 18592 generic.go:334] "Generic (PLEG): container finished" podID="514f8205-32f5-4a29-9779-fa9e339e452c" containerID="54c15bdbb93a0a3bdf8aa6cc57dfa2614b46751c191b6037f987467b62fd3f92" exitCode=0 Mar 08 04:03:44.636793 master-0 kubenswrapper[18592]: I0308 04:03:44.636725 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-5-master-0" event={"ID":"514f8205-32f5-4a29-9779-fa9e339e452c","Type":"ContainerDied","Data":"54c15bdbb93a0a3bdf8aa6cc57dfa2614b46751c191b6037f987467b62fd3f92"} Mar 08 04:03:44.641348 master-0 kubenswrapper[18592]: I0308 04:03:44.641272 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_0580c83f64e952a7a614903b6fdf6965/cluster-policy-controller/3.log" Mar 08 04:03:44.644481 master-0 kubenswrapper[18592]: I0308 04:03:44.644444 
18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_0580c83f64e952a7a614903b6fdf6965/kube-controller-manager-cert-syncer/0.log" Mar 08 04:03:44.645487 master-0 kubenswrapper[18592]: I0308 04:03:44.645437 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_0580c83f64e952a7a614903b6fdf6965/kube-controller-manager/0.log" Mar 08 04:03:44.645609 master-0 kubenswrapper[18592]: I0308 04:03:44.645515 18592 generic.go:334] "Generic (PLEG): container finished" podID="0580c83f64e952a7a614903b6fdf6965" containerID="8c148cbb0915a5eeb34b064563cd84c2bd79992dadfa51f04ff27c52ca0a1379" exitCode=0 Mar 08 04:03:44.645609 master-0 kubenswrapper[18592]: I0308 04:03:44.645538 18592 generic.go:334] "Generic (PLEG): container finished" podID="0580c83f64e952a7a614903b6fdf6965" containerID="7e6e0a1dd4628ac65a5a1b06d4e9688e783a295582fe830c670261a611a4156c" exitCode=0 Mar 08 04:03:44.645609 master-0 kubenswrapper[18592]: I0308 04:03:44.645550 18592 generic.go:334] "Generic (PLEG): container finished" podID="0580c83f64e952a7a614903b6fdf6965" containerID="b86660b89db34f60351c03ded37334fd4a438f92a5c1c1cdc379852f41cf84f8" exitCode=0 Mar 08 04:03:44.645609 master-0 kubenswrapper[18592]: I0308 04:03:44.645561 18592 generic.go:334] "Generic (PLEG): container finished" podID="0580c83f64e952a7a614903b6fdf6965" containerID="a202a6ef0655d9c023cb7f6ed3d9196a81b1bf82cb7639b6b968c5efc0a0ade5" exitCode=2 Mar 08 04:03:44.645896 master-0 kubenswrapper[18592]: I0308 04:03:44.645639 18592 scope.go:117] "RemoveContainer" containerID="8c148cbb0915a5eeb34b064563cd84c2bd79992dadfa51f04ff27c52ca0a1379" Mar 08 04:03:44.645896 master-0 kubenswrapper[18592]: I0308 04:03:44.645880 18592 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 04:03:44.672926 master-0 kubenswrapper[18592]: I0308 04:03:44.670528 18592 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="0580c83f64e952a7a614903b6fdf6965" podUID="1a9d6aff6989e430a5479397d5c55a56" Mar 08 04:03:44.678506 master-0 kubenswrapper[18592]: I0308 04:03:44.678433 18592 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="0580c83f64e952a7a614903b6fdf6965" podUID="1a9d6aff6989e430a5479397d5c55a56" Mar 08 04:03:44.680814 master-0 kubenswrapper[18592]: I0308 04:03:44.680627 18592 scope.go:117] "RemoveContainer" containerID="7e6e0a1dd4628ac65a5a1b06d4e9688e783a295582fe830c670261a611a4156c" Mar 08 04:03:44.705986 master-0 kubenswrapper[18592]: I0308 04:03:44.705916 18592 scope.go:117] "RemoveContainer" containerID="8a427b90a89c6374fb3a2672156c087d21ba1d11bd55939844dd05cb8406e569" Mar 08 04:03:44.732874 master-0 kubenswrapper[18592]: I0308 04:03:44.732804 18592 scope.go:117] "RemoveContainer" containerID="b86660b89db34f60351c03ded37334fd4a438f92a5c1c1cdc379852f41cf84f8" Mar 08 04:03:44.758218 master-0 kubenswrapper[18592]: I0308 04:03:44.758169 18592 scope.go:117] "RemoveContainer" containerID="7f62bc4b6acd53da006a6081ef3e761680d62fc9f72141ffe7169f424211ea5e" Mar 08 04:03:44.787634 master-0 kubenswrapper[18592]: I0308 04:03:44.787570 18592 scope.go:117] "RemoveContainer" containerID="a202a6ef0655d9c023cb7f6ed3d9196a81b1bf82cb7639b6b968c5efc0a0ade5" Mar 08 04:03:44.817135 master-0 kubenswrapper[18592]: I0308 04:03:44.817056 18592 scope.go:117] "RemoveContainer" containerID="c527e941996e05aa6d3486c73a6d0bb7b7ce8bc9a0ef0c0e159892bbc6674ce3" Mar 08 04:03:44.843476 master-0 kubenswrapper[18592]: I0308 04:03:44.843426 18592 
scope.go:117] "RemoveContainer" containerID="8c148cbb0915a5eeb34b064563cd84c2bd79992dadfa51f04ff27c52ca0a1379" Mar 08 04:03:44.844337 master-0 kubenswrapper[18592]: E0308 04:03:44.844104 18592 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c148cbb0915a5eeb34b064563cd84c2bd79992dadfa51f04ff27c52ca0a1379\": container with ID starting with 8c148cbb0915a5eeb34b064563cd84c2bd79992dadfa51f04ff27c52ca0a1379 not found: ID does not exist" containerID="8c148cbb0915a5eeb34b064563cd84c2bd79992dadfa51f04ff27c52ca0a1379" Mar 08 04:03:44.844337 master-0 kubenswrapper[18592]: I0308 04:03:44.844190 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c148cbb0915a5eeb34b064563cd84c2bd79992dadfa51f04ff27c52ca0a1379"} err="failed to get container status \"8c148cbb0915a5eeb34b064563cd84c2bd79992dadfa51f04ff27c52ca0a1379\": rpc error: code = NotFound desc = could not find container \"8c148cbb0915a5eeb34b064563cd84c2bd79992dadfa51f04ff27c52ca0a1379\": container with ID starting with 8c148cbb0915a5eeb34b064563cd84c2bd79992dadfa51f04ff27c52ca0a1379 not found: ID does not exist" Mar 08 04:03:44.844337 master-0 kubenswrapper[18592]: I0308 04:03:44.844228 18592 scope.go:117] "RemoveContainer" containerID="7e6e0a1dd4628ac65a5a1b06d4e9688e783a295582fe830c670261a611a4156c" Mar 08 04:03:44.844893 master-0 kubenswrapper[18592]: E0308 04:03:44.844851 18592 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e6e0a1dd4628ac65a5a1b06d4e9688e783a295582fe830c670261a611a4156c\": container with ID starting with 7e6e0a1dd4628ac65a5a1b06d4e9688e783a295582fe830c670261a611a4156c not found: ID does not exist" containerID="7e6e0a1dd4628ac65a5a1b06d4e9688e783a295582fe830c670261a611a4156c" Mar 08 04:03:44.845005 master-0 kubenswrapper[18592]: I0308 04:03:44.844915 18592 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"7e6e0a1dd4628ac65a5a1b06d4e9688e783a295582fe830c670261a611a4156c"} err="failed to get container status \"7e6e0a1dd4628ac65a5a1b06d4e9688e783a295582fe830c670261a611a4156c\": rpc error: code = NotFound desc = could not find container \"7e6e0a1dd4628ac65a5a1b06d4e9688e783a295582fe830c670261a611a4156c\": container with ID starting with 7e6e0a1dd4628ac65a5a1b06d4e9688e783a295582fe830c670261a611a4156c not found: ID does not exist" Mar 08 04:03:44.845005 master-0 kubenswrapper[18592]: I0308 04:03:44.844950 18592 scope.go:117] "RemoveContainer" containerID="8a427b90a89c6374fb3a2672156c087d21ba1d11bd55939844dd05cb8406e569" Mar 08 04:03:44.845679 master-0 kubenswrapper[18592]: E0308 04:03:44.845509 18592 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a427b90a89c6374fb3a2672156c087d21ba1d11bd55939844dd05cb8406e569\": container with ID starting with 8a427b90a89c6374fb3a2672156c087d21ba1d11bd55939844dd05cb8406e569 not found: ID does not exist" containerID="8a427b90a89c6374fb3a2672156c087d21ba1d11bd55939844dd05cb8406e569" Mar 08 04:03:44.845679 master-0 kubenswrapper[18592]: I0308 04:03:44.845552 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a427b90a89c6374fb3a2672156c087d21ba1d11bd55939844dd05cb8406e569"} err="failed to get container status \"8a427b90a89c6374fb3a2672156c087d21ba1d11bd55939844dd05cb8406e569\": rpc error: code = NotFound desc = could not find container \"8a427b90a89c6374fb3a2672156c087d21ba1d11bd55939844dd05cb8406e569\": container with ID starting with 8a427b90a89c6374fb3a2672156c087d21ba1d11bd55939844dd05cb8406e569 not found: ID does not exist" Mar 08 04:03:44.845679 master-0 kubenswrapper[18592]: I0308 04:03:44.845579 18592 scope.go:117] "RemoveContainer" containerID="b86660b89db34f60351c03ded37334fd4a438f92a5c1c1cdc379852f41cf84f8" Mar 08 04:03:44.846139 master-0 
kubenswrapper[18592]: E0308 04:03:44.846097 18592 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b86660b89db34f60351c03ded37334fd4a438f92a5c1c1cdc379852f41cf84f8\": container with ID starting with b86660b89db34f60351c03ded37334fd4a438f92a5c1c1cdc379852f41cf84f8 not found: ID does not exist" containerID="b86660b89db34f60351c03ded37334fd4a438f92a5c1c1cdc379852f41cf84f8" Mar 08 04:03:44.846227 master-0 kubenswrapper[18592]: I0308 04:03:44.846143 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b86660b89db34f60351c03ded37334fd4a438f92a5c1c1cdc379852f41cf84f8"} err="failed to get container status \"b86660b89db34f60351c03ded37334fd4a438f92a5c1c1cdc379852f41cf84f8\": rpc error: code = NotFound desc = could not find container \"b86660b89db34f60351c03ded37334fd4a438f92a5c1c1cdc379852f41cf84f8\": container with ID starting with b86660b89db34f60351c03ded37334fd4a438f92a5c1c1cdc379852f41cf84f8 not found: ID does not exist" Mar 08 04:03:44.846227 master-0 kubenswrapper[18592]: I0308 04:03:44.846175 18592 scope.go:117] "RemoveContainer" containerID="7f62bc4b6acd53da006a6081ef3e761680d62fc9f72141ffe7169f424211ea5e" Mar 08 04:03:44.847338 master-0 kubenswrapper[18592]: E0308 04:03:44.846644 18592 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f62bc4b6acd53da006a6081ef3e761680d62fc9f72141ffe7169f424211ea5e\": container with ID starting with 7f62bc4b6acd53da006a6081ef3e761680d62fc9f72141ffe7169f424211ea5e not found: ID does not exist" containerID="7f62bc4b6acd53da006a6081ef3e761680d62fc9f72141ffe7169f424211ea5e" Mar 08 04:03:44.847723 master-0 kubenswrapper[18592]: I0308 04:03:44.846686 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f62bc4b6acd53da006a6081ef3e761680d62fc9f72141ffe7169f424211ea5e"} err="failed to get container 
status \"7f62bc4b6acd53da006a6081ef3e761680d62fc9f72141ffe7169f424211ea5e\": rpc error: code = NotFound desc = could not find container \"7f62bc4b6acd53da006a6081ef3e761680d62fc9f72141ffe7169f424211ea5e\": container with ID starting with 7f62bc4b6acd53da006a6081ef3e761680d62fc9f72141ffe7169f424211ea5e not found: ID does not exist" Mar 08 04:03:44.847723 master-0 kubenswrapper[18592]: I0308 04:03:44.847587 18592 scope.go:117] "RemoveContainer" containerID="a202a6ef0655d9c023cb7f6ed3d9196a81b1bf82cb7639b6b968c5efc0a0ade5" Mar 08 04:03:44.848295 master-0 kubenswrapper[18592]: E0308 04:03:44.848113 18592 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a202a6ef0655d9c023cb7f6ed3d9196a81b1bf82cb7639b6b968c5efc0a0ade5\": container with ID starting with a202a6ef0655d9c023cb7f6ed3d9196a81b1bf82cb7639b6b968c5efc0a0ade5 not found: ID does not exist" containerID="a202a6ef0655d9c023cb7f6ed3d9196a81b1bf82cb7639b6b968c5efc0a0ade5" Mar 08 04:03:44.848295 master-0 kubenswrapper[18592]: I0308 04:03:44.848153 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a202a6ef0655d9c023cb7f6ed3d9196a81b1bf82cb7639b6b968c5efc0a0ade5"} err="failed to get container status \"a202a6ef0655d9c023cb7f6ed3d9196a81b1bf82cb7639b6b968c5efc0a0ade5\": rpc error: code = NotFound desc = could not find container \"a202a6ef0655d9c023cb7f6ed3d9196a81b1bf82cb7639b6b968c5efc0a0ade5\": container with ID starting with a202a6ef0655d9c023cb7f6ed3d9196a81b1bf82cb7639b6b968c5efc0a0ade5 not found: ID does not exist" Mar 08 04:03:44.848295 master-0 kubenswrapper[18592]: I0308 04:03:44.848182 18592 scope.go:117] "RemoveContainer" containerID="c527e941996e05aa6d3486c73a6d0bb7b7ce8bc9a0ef0c0e159892bbc6674ce3" Mar 08 04:03:44.848896 master-0 kubenswrapper[18592]: E0308 04:03:44.848848 18592 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"c527e941996e05aa6d3486c73a6d0bb7b7ce8bc9a0ef0c0e159892bbc6674ce3\": container with ID starting with c527e941996e05aa6d3486c73a6d0bb7b7ce8bc9a0ef0c0e159892bbc6674ce3 not found: ID does not exist" containerID="c527e941996e05aa6d3486c73a6d0bb7b7ce8bc9a0ef0c0e159892bbc6674ce3" Mar 08 04:03:44.849017 master-0 kubenswrapper[18592]: I0308 04:03:44.848909 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c527e941996e05aa6d3486c73a6d0bb7b7ce8bc9a0ef0c0e159892bbc6674ce3"} err="failed to get container status \"c527e941996e05aa6d3486c73a6d0bb7b7ce8bc9a0ef0c0e159892bbc6674ce3\": rpc error: code = NotFound desc = could not find container \"c527e941996e05aa6d3486c73a6d0bb7b7ce8bc9a0ef0c0e159892bbc6674ce3\": container with ID starting with c527e941996e05aa6d3486c73a6d0bb7b7ce8bc9a0ef0c0e159892bbc6674ce3 not found: ID does not exist" Mar 08 04:03:44.849017 master-0 kubenswrapper[18592]: I0308 04:03:44.848960 18592 scope.go:117] "RemoveContainer" containerID="8c148cbb0915a5eeb34b064563cd84c2bd79992dadfa51f04ff27c52ca0a1379" Mar 08 04:03:44.849623 master-0 kubenswrapper[18592]: I0308 04:03:44.849483 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c148cbb0915a5eeb34b064563cd84c2bd79992dadfa51f04ff27c52ca0a1379"} err="failed to get container status \"8c148cbb0915a5eeb34b064563cd84c2bd79992dadfa51f04ff27c52ca0a1379\": rpc error: code = NotFound desc = could not find container \"8c148cbb0915a5eeb34b064563cd84c2bd79992dadfa51f04ff27c52ca0a1379\": container with ID starting with 8c148cbb0915a5eeb34b064563cd84c2bd79992dadfa51f04ff27c52ca0a1379 not found: ID does not exist" Mar 08 04:03:44.849623 master-0 kubenswrapper[18592]: I0308 04:03:44.849519 18592 scope.go:117] "RemoveContainer" containerID="7e6e0a1dd4628ac65a5a1b06d4e9688e783a295582fe830c670261a611a4156c" Mar 08 04:03:44.850300 master-0 kubenswrapper[18592]: I0308 04:03:44.850257 18592 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e6e0a1dd4628ac65a5a1b06d4e9688e783a295582fe830c670261a611a4156c"} err="failed to get container status \"7e6e0a1dd4628ac65a5a1b06d4e9688e783a295582fe830c670261a611a4156c\": rpc error: code = NotFound desc = could not find container \"7e6e0a1dd4628ac65a5a1b06d4e9688e783a295582fe830c670261a611a4156c\": container with ID starting with 7e6e0a1dd4628ac65a5a1b06d4e9688e783a295582fe830c670261a611a4156c not found: ID does not exist" Mar 08 04:03:44.850300 master-0 kubenswrapper[18592]: I0308 04:03:44.850300 18592 scope.go:117] "RemoveContainer" containerID="8a427b90a89c6374fb3a2672156c087d21ba1d11bd55939844dd05cb8406e569" Mar 08 04:03:44.851573 master-0 kubenswrapper[18592]: I0308 04:03:44.851436 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a427b90a89c6374fb3a2672156c087d21ba1d11bd55939844dd05cb8406e569"} err="failed to get container status \"8a427b90a89c6374fb3a2672156c087d21ba1d11bd55939844dd05cb8406e569\": rpc error: code = NotFound desc = could not find container \"8a427b90a89c6374fb3a2672156c087d21ba1d11bd55939844dd05cb8406e569\": container with ID starting with 8a427b90a89c6374fb3a2672156c087d21ba1d11bd55939844dd05cb8406e569 not found: ID does not exist" Mar 08 04:03:44.851923 master-0 kubenswrapper[18592]: I0308 04:03:44.851866 18592 scope.go:117] "RemoveContainer" containerID="b86660b89db34f60351c03ded37334fd4a438f92a5c1c1cdc379852f41cf84f8" Mar 08 04:03:44.852463 master-0 kubenswrapper[18592]: I0308 04:03:44.852422 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b86660b89db34f60351c03ded37334fd4a438f92a5c1c1cdc379852f41cf84f8"} err="failed to get container status \"b86660b89db34f60351c03ded37334fd4a438f92a5c1c1cdc379852f41cf84f8\": rpc error: code = NotFound desc = could not find container \"b86660b89db34f60351c03ded37334fd4a438f92a5c1c1cdc379852f41cf84f8\": container with ID starting 
with b86660b89db34f60351c03ded37334fd4a438f92a5c1c1cdc379852f41cf84f8 not found: ID does not exist" Mar 08 04:03:44.852556 master-0 kubenswrapper[18592]: I0308 04:03:44.852462 18592 scope.go:117] "RemoveContainer" containerID="7f62bc4b6acd53da006a6081ef3e761680d62fc9f72141ffe7169f424211ea5e" Mar 08 04:03:44.853169 master-0 kubenswrapper[18592]: I0308 04:03:44.853122 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f62bc4b6acd53da006a6081ef3e761680d62fc9f72141ffe7169f424211ea5e"} err="failed to get container status \"7f62bc4b6acd53da006a6081ef3e761680d62fc9f72141ffe7169f424211ea5e\": rpc error: code = NotFound desc = could not find container \"7f62bc4b6acd53da006a6081ef3e761680d62fc9f72141ffe7169f424211ea5e\": container with ID starting with 7f62bc4b6acd53da006a6081ef3e761680d62fc9f72141ffe7169f424211ea5e not found: ID does not exist" Mar 08 04:03:44.853291 master-0 kubenswrapper[18592]: I0308 04:03:44.853167 18592 scope.go:117] "RemoveContainer" containerID="a202a6ef0655d9c023cb7f6ed3d9196a81b1bf82cb7639b6b968c5efc0a0ade5" Mar 08 04:03:44.853764 master-0 kubenswrapper[18592]: I0308 04:03:44.853704 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a202a6ef0655d9c023cb7f6ed3d9196a81b1bf82cb7639b6b968c5efc0a0ade5"} err="failed to get container status \"a202a6ef0655d9c023cb7f6ed3d9196a81b1bf82cb7639b6b968c5efc0a0ade5\": rpc error: code = NotFound desc = could not find container \"a202a6ef0655d9c023cb7f6ed3d9196a81b1bf82cb7639b6b968c5efc0a0ade5\": container with ID starting with a202a6ef0655d9c023cb7f6ed3d9196a81b1bf82cb7639b6b968c5efc0a0ade5 not found: ID does not exist" Mar 08 04:03:44.853885 master-0 kubenswrapper[18592]: I0308 04:03:44.853762 18592 scope.go:117] "RemoveContainer" containerID="c527e941996e05aa6d3486c73a6d0bb7b7ce8bc9a0ef0c0e159892bbc6674ce3" Mar 08 04:03:44.854235 master-0 kubenswrapper[18592]: I0308 04:03:44.854175 18592 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c527e941996e05aa6d3486c73a6d0bb7b7ce8bc9a0ef0c0e159892bbc6674ce3"} err="failed to get container status \"c527e941996e05aa6d3486c73a6d0bb7b7ce8bc9a0ef0c0e159892bbc6674ce3\": rpc error: code = NotFound desc = could not find container \"c527e941996e05aa6d3486c73a6d0bb7b7ce8bc9a0ef0c0e159892bbc6674ce3\": container with ID starting with c527e941996e05aa6d3486c73a6d0bb7b7ce8bc9a0ef0c0e159892bbc6674ce3 not found: ID does not exist" Mar 08 04:03:44.854341 master-0 kubenswrapper[18592]: I0308 04:03:44.854215 18592 scope.go:117] "RemoveContainer" containerID="8c148cbb0915a5eeb34b064563cd84c2bd79992dadfa51f04ff27c52ca0a1379" Mar 08 04:03:44.854879 master-0 kubenswrapper[18592]: I0308 04:03:44.854804 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c148cbb0915a5eeb34b064563cd84c2bd79992dadfa51f04ff27c52ca0a1379"} err="failed to get container status \"8c148cbb0915a5eeb34b064563cd84c2bd79992dadfa51f04ff27c52ca0a1379\": rpc error: code = NotFound desc = could not find container \"8c148cbb0915a5eeb34b064563cd84c2bd79992dadfa51f04ff27c52ca0a1379\": container with ID starting with 8c148cbb0915a5eeb34b064563cd84c2bd79992dadfa51f04ff27c52ca0a1379 not found: ID does not exist" Mar 08 04:03:44.854980 master-0 kubenswrapper[18592]: I0308 04:03:44.854879 18592 scope.go:117] "RemoveContainer" containerID="7e6e0a1dd4628ac65a5a1b06d4e9688e783a295582fe830c670261a611a4156c" Mar 08 04:03:44.855456 master-0 kubenswrapper[18592]: I0308 04:03:44.855411 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e6e0a1dd4628ac65a5a1b06d4e9688e783a295582fe830c670261a611a4156c"} err="failed to get container status \"7e6e0a1dd4628ac65a5a1b06d4e9688e783a295582fe830c670261a611a4156c\": rpc error: code = NotFound desc = could not find container \"7e6e0a1dd4628ac65a5a1b06d4e9688e783a295582fe830c670261a611a4156c\": 
container with ID starting with 7e6e0a1dd4628ac65a5a1b06d4e9688e783a295582fe830c670261a611a4156c not found: ID does not exist" Mar 08 04:03:44.855456 master-0 kubenswrapper[18592]: I0308 04:03:44.855452 18592 scope.go:117] "RemoveContainer" containerID="8a427b90a89c6374fb3a2672156c087d21ba1d11bd55939844dd05cb8406e569" Mar 08 04:03:44.856031 master-0 kubenswrapper[18592]: I0308 04:03:44.855963 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a427b90a89c6374fb3a2672156c087d21ba1d11bd55939844dd05cb8406e569"} err="failed to get container status \"8a427b90a89c6374fb3a2672156c087d21ba1d11bd55939844dd05cb8406e569\": rpc error: code = NotFound desc = could not find container \"8a427b90a89c6374fb3a2672156c087d21ba1d11bd55939844dd05cb8406e569\": container with ID starting with 8a427b90a89c6374fb3a2672156c087d21ba1d11bd55939844dd05cb8406e569 not found: ID does not exist" Mar 08 04:03:44.856135 master-0 kubenswrapper[18592]: I0308 04:03:44.856050 18592 scope.go:117] "RemoveContainer" containerID="b86660b89db34f60351c03ded37334fd4a438f92a5c1c1cdc379852f41cf84f8" Mar 08 04:03:44.856517 master-0 kubenswrapper[18592]: I0308 04:03:44.856470 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b86660b89db34f60351c03ded37334fd4a438f92a5c1c1cdc379852f41cf84f8"} err="failed to get container status \"b86660b89db34f60351c03ded37334fd4a438f92a5c1c1cdc379852f41cf84f8\": rpc error: code = NotFound desc = could not find container \"b86660b89db34f60351c03ded37334fd4a438f92a5c1c1cdc379852f41cf84f8\": container with ID starting with b86660b89db34f60351c03ded37334fd4a438f92a5c1c1cdc379852f41cf84f8 not found: ID does not exist" Mar 08 04:03:44.856615 master-0 kubenswrapper[18592]: I0308 04:03:44.856523 18592 scope.go:117] "RemoveContainer" containerID="7f62bc4b6acd53da006a6081ef3e761680d62fc9f72141ffe7169f424211ea5e" Mar 08 04:03:44.857043 master-0 kubenswrapper[18592]: I0308 04:03:44.857003 
18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f62bc4b6acd53da006a6081ef3e761680d62fc9f72141ffe7169f424211ea5e"} err="failed to get container status \"7f62bc4b6acd53da006a6081ef3e761680d62fc9f72141ffe7169f424211ea5e\": rpc error: code = NotFound desc = could not find container \"7f62bc4b6acd53da006a6081ef3e761680d62fc9f72141ffe7169f424211ea5e\": container with ID starting with 7f62bc4b6acd53da006a6081ef3e761680d62fc9f72141ffe7169f424211ea5e not found: ID does not exist" Mar 08 04:03:44.857043 master-0 kubenswrapper[18592]: I0308 04:03:44.857039 18592 scope.go:117] "RemoveContainer" containerID="a202a6ef0655d9c023cb7f6ed3d9196a81b1bf82cb7639b6b968c5efc0a0ade5" Mar 08 04:03:44.857574 master-0 kubenswrapper[18592]: I0308 04:03:44.857531 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a202a6ef0655d9c023cb7f6ed3d9196a81b1bf82cb7639b6b968c5efc0a0ade5"} err="failed to get container status \"a202a6ef0655d9c023cb7f6ed3d9196a81b1bf82cb7639b6b968c5efc0a0ade5\": rpc error: code = NotFound desc = could not find container \"a202a6ef0655d9c023cb7f6ed3d9196a81b1bf82cb7639b6b968c5efc0a0ade5\": container with ID starting with a202a6ef0655d9c023cb7f6ed3d9196a81b1bf82cb7639b6b968c5efc0a0ade5 not found: ID does not exist" Mar 08 04:03:44.857574 master-0 kubenswrapper[18592]: I0308 04:03:44.857572 18592 scope.go:117] "RemoveContainer" containerID="c527e941996e05aa6d3486c73a6d0bb7b7ce8bc9a0ef0c0e159892bbc6674ce3" Mar 08 04:03:44.858147 master-0 kubenswrapper[18592]: I0308 04:03:44.858107 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c527e941996e05aa6d3486c73a6d0bb7b7ce8bc9a0ef0c0e159892bbc6674ce3"} err="failed to get container status \"c527e941996e05aa6d3486c73a6d0bb7b7ce8bc9a0ef0c0e159892bbc6674ce3\": rpc error: code = NotFound desc = could not find container 
\"c527e941996e05aa6d3486c73a6d0bb7b7ce8bc9a0ef0c0e159892bbc6674ce3\": container with ID starting with c527e941996e05aa6d3486c73a6d0bb7b7ce8bc9a0ef0c0e159892bbc6674ce3 not found: ID does not exist" Mar 08 04:03:44.858147 master-0 kubenswrapper[18592]: I0308 04:03:44.858144 18592 scope.go:117] "RemoveContainer" containerID="8c148cbb0915a5eeb34b064563cd84c2bd79992dadfa51f04ff27c52ca0a1379" Mar 08 04:03:44.858610 master-0 kubenswrapper[18592]: I0308 04:03:44.858553 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c148cbb0915a5eeb34b064563cd84c2bd79992dadfa51f04ff27c52ca0a1379"} err="failed to get container status \"8c148cbb0915a5eeb34b064563cd84c2bd79992dadfa51f04ff27c52ca0a1379\": rpc error: code = NotFound desc = could not find container \"8c148cbb0915a5eeb34b064563cd84c2bd79992dadfa51f04ff27c52ca0a1379\": container with ID starting with 8c148cbb0915a5eeb34b064563cd84c2bd79992dadfa51f04ff27c52ca0a1379 not found: ID does not exist" Mar 08 04:03:44.858723 master-0 kubenswrapper[18592]: I0308 04:03:44.858610 18592 scope.go:117] "RemoveContainer" containerID="7e6e0a1dd4628ac65a5a1b06d4e9688e783a295582fe830c670261a611a4156c" Mar 08 04:03:44.859213 master-0 kubenswrapper[18592]: I0308 04:03:44.859172 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e6e0a1dd4628ac65a5a1b06d4e9688e783a295582fe830c670261a611a4156c"} err="failed to get container status \"7e6e0a1dd4628ac65a5a1b06d4e9688e783a295582fe830c670261a611a4156c\": rpc error: code = NotFound desc = could not find container \"7e6e0a1dd4628ac65a5a1b06d4e9688e783a295582fe830c670261a611a4156c\": container with ID starting with 7e6e0a1dd4628ac65a5a1b06d4e9688e783a295582fe830c670261a611a4156c not found: ID does not exist" Mar 08 04:03:44.859213 master-0 kubenswrapper[18592]: I0308 04:03:44.859209 18592 scope.go:117] "RemoveContainer" containerID="8a427b90a89c6374fb3a2672156c087d21ba1d11bd55939844dd05cb8406e569" Mar 08 
04:03:44.859638 master-0 kubenswrapper[18592]: I0308 04:03:44.859599 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a427b90a89c6374fb3a2672156c087d21ba1d11bd55939844dd05cb8406e569"} err="failed to get container status \"8a427b90a89c6374fb3a2672156c087d21ba1d11bd55939844dd05cb8406e569\": rpc error: code = NotFound desc = could not find container \"8a427b90a89c6374fb3a2672156c087d21ba1d11bd55939844dd05cb8406e569\": container with ID starting with 8a427b90a89c6374fb3a2672156c087d21ba1d11bd55939844dd05cb8406e569 not found: ID does not exist" Mar 08 04:03:44.859638 master-0 kubenswrapper[18592]: I0308 04:03:44.859634 18592 scope.go:117] "RemoveContainer" containerID="b86660b89db34f60351c03ded37334fd4a438f92a5c1c1cdc379852f41cf84f8" Mar 08 04:03:44.860198 master-0 kubenswrapper[18592]: I0308 04:03:44.860153 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b86660b89db34f60351c03ded37334fd4a438f92a5c1c1cdc379852f41cf84f8"} err="failed to get container status \"b86660b89db34f60351c03ded37334fd4a438f92a5c1c1cdc379852f41cf84f8\": rpc error: code = NotFound desc = could not find container \"b86660b89db34f60351c03ded37334fd4a438f92a5c1c1cdc379852f41cf84f8\": container with ID starting with b86660b89db34f60351c03ded37334fd4a438f92a5c1c1cdc379852f41cf84f8 not found: ID does not exist" Mar 08 04:03:44.860301 master-0 kubenswrapper[18592]: I0308 04:03:44.860197 18592 scope.go:117] "RemoveContainer" containerID="7f62bc4b6acd53da006a6081ef3e761680d62fc9f72141ffe7169f424211ea5e" Mar 08 04:03:44.860618 master-0 kubenswrapper[18592]: I0308 04:03:44.860578 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f62bc4b6acd53da006a6081ef3e761680d62fc9f72141ffe7169f424211ea5e"} err="failed to get container status \"7f62bc4b6acd53da006a6081ef3e761680d62fc9f72141ffe7169f424211ea5e\": rpc error: code = NotFound desc = could not find 
container \"7f62bc4b6acd53da006a6081ef3e761680d62fc9f72141ffe7169f424211ea5e\": container with ID starting with 7f62bc4b6acd53da006a6081ef3e761680d62fc9f72141ffe7169f424211ea5e not found: ID does not exist" Mar 08 04:03:44.860618 master-0 kubenswrapper[18592]: I0308 04:03:44.860616 18592 scope.go:117] "RemoveContainer" containerID="a202a6ef0655d9c023cb7f6ed3d9196a81b1bf82cb7639b6b968c5efc0a0ade5" Mar 08 04:03:44.861155 master-0 kubenswrapper[18592]: I0308 04:03:44.861118 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a202a6ef0655d9c023cb7f6ed3d9196a81b1bf82cb7639b6b968c5efc0a0ade5"} err="failed to get container status \"a202a6ef0655d9c023cb7f6ed3d9196a81b1bf82cb7639b6b968c5efc0a0ade5\": rpc error: code = NotFound desc = could not find container \"a202a6ef0655d9c023cb7f6ed3d9196a81b1bf82cb7639b6b968c5efc0a0ade5\": container with ID starting with a202a6ef0655d9c023cb7f6ed3d9196a81b1bf82cb7639b6b968c5efc0a0ade5 not found: ID does not exist" Mar 08 04:03:44.861155 master-0 kubenswrapper[18592]: I0308 04:03:44.861152 18592 scope.go:117] "RemoveContainer" containerID="c527e941996e05aa6d3486c73a6d0bb7b7ce8bc9a0ef0c0e159892bbc6674ce3" Mar 08 04:03:44.861642 master-0 kubenswrapper[18592]: I0308 04:03:44.861597 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c527e941996e05aa6d3486c73a6d0bb7b7ce8bc9a0ef0c0e159892bbc6674ce3"} err="failed to get container status \"c527e941996e05aa6d3486c73a6d0bb7b7ce8bc9a0ef0c0e159892bbc6674ce3\": rpc error: code = NotFound desc = could not find container \"c527e941996e05aa6d3486c73a6d0bb7b7ce8bc9a0ef0c0e159892bbc6674ce3\": container with ID starting with c527e941996e05aa6d3486c73a6d0bb7b7ce8bc9a0ef0c0e159892bbc6674ce3 not found: ID does not exist" Mar 08 04:03:46.159291 master-0 kubenswrapper[18592]: I0308 04:03:46.159208 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0580c83f64e952a7a614903b6fdf6965" 
path="/var/lib/kubelet/pods/0580c83f64e952a7a614903b6fdf6965/volumes" Mar 08 04:03:46.181729 master-0 kubenswrapper[18592]: I0308 04:03:46.181623 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-5-master-0" Mar 08 04:03:46.316980 master-0 kubenswrapper[18592]: I0308 04:03:46.316889 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/514f8205-32f5-4a29-9779-fa9e339e452c-var-lock\") pod \"514f8205-32f5-4a29-9779-fa9e339e452c\" (UID: \"514f8205-32f5-4a29-9779-fa9e339e452c\") " Mar 08 04:03:46.317256 master-0 kubenswrapper[18592]: I0308 04:03:46.317003 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/514f8205-32f5-4a29-9779-fa9e339e452c-kubelet-dir\") pod \"514f8205-32f5-4a29-9779-fa9e339e452c\" (UID: \"514f8205-32f5-4a29-9779-fa9e339e452c\") " Mar 08 04:03:46.317256 master-0 kubenswrapper[18592]: I0308 04:03:46.317078 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/514f8205-32f5-4a29-9779-fa9e339e452c-var-lock" (OuterVolumeSpecName: "var-lock") pod "514f8205-32f5-4a29-9779-fa9e339e452c" (UID: "514f8205-32f5-4a29-9779-fa9e339e452c"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 04:03:46.317256 master-0 kubenswrapper[18592]: I0308 04:03:46.317092 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/514f8205-32f5-4a29-9779-fa9e339e452c-kube-api-access\") pod \"514f8205-32f5-4a29-9779-fa9e339e452c\" (UID: \"514f8205-32f5-4a29-9779-fa9e339e452c\") " Mar 08 04:03:46.317256 master-0 kubenswrapper[18592]: I0308 04:03:46.317188 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/514f8205-32f5-4a29-9779-fa9e339e452c-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "514f8205-32f5-4a29-9779-fa9e339e452c" (UID: "514f8205-32f5-4a29-9779-fa9e339e452c"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 04:03:46.318178 master-0 kubenswrapper[18592]: I0308 04:03:46.317902 18592 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/514f8205-32f5-4a29-9779-fa9e339e452c-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 04:03:46.318178 master-0 kubenswrapper[18592]: I0308 04:03:46.317933 18592 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/514f8205-32f5-4a29-9779-fa9e339e452c-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 08 04:03:46.320651 master-0 kubenswrapper[18592]: I0308 04:03:46.320579 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/514f8205-32f5-4a29-9779-fa9e339e452c-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "514f8205-32f5-4a29-9779-fa9e339e452c" (UID: "514f8205-32f5-4a29-9779-fa9e339e452c"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 04:03:46.419215 master-0 kubenswrapper[18592]: I0308 04:03:46.419131 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/514f8205-32f5-4a29-9779-fa9e339e452c-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 08 04:03:46.667986 master-0 kubenswrapper[18592]: I0308 04:03:46.667741 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-5-master-0" event={"ID":"514f8205-32f5-4a29-9779-fa9e339e452c","Type":"ContainerDied","Data":"e3923e9598929e481d263721937694f2dec1eff395749cbd223cb65432526b8a"} Mar 08 04:03:46.667986 master-0 kubenswrapper[18592]: I0308 04:03:46.667898 18592 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3923e9598929e481d263721937694f2dec1eff395749cbd223cb65432526b8a" Mar 08 04:03:46.667986 master-0 kubenswrapper[18592]: I0308 04:03:46.667793 18592 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-5-master-0" Mar 08 04:03:50.060892 master-0 kubenswrapper[18592]: I0308 04:03:50.060730 18592 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["kube-system/bootstrap-kube-scheduler-master-0"] Mar 08 04:03:50.064082 master-0 kubenswrapper[18592]: I0308 04:03:50.061032 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="kube-system/bootstrap-kube-scheduler-master-0" podUID="a1a56802af72ce1aac6b5077f1695ac0" containerName="kube-scheduler" containerID="cri-o://e04f902e64611088fc5b2a33ae22c063c66ae87560c79f3c3e32ced196e50876" gracePeriod=30 Mar 08 04:03:50.064082 master-0 kubenswrapper[18592]: I0308 04:03:50.062089 18592 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"] Mar 08 04:03:50.064082 master-0 kubenswrapper[18592]: E0308 04:03:50.062533 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1a56802af72ce1aac6b5077f1695ac0" containerName="kube-scheduler" Mar 08 04:03:50.064082 master-0 kubenswrapper[18592]: I0308 04:03:50.062557 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1a56802af72ce1aac6b5077f1695ac0" containerName="kube-scheduler" Mar 08 04:03:50.064082 master-0 kubenswrapper[18592]: E0308 04:03:50.062581 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1a56802af72ce1aac6b5077f1695ac0" containerName="kube-scheduler" Mar 08 04:03:50.064082 master-0 kubenswrapper[18592]: I0308 04:03:50.062592 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1a56802af72ce1aac6b5077f1695ac0" containerName="kube-scheduler" Mar 08 04:03:50.064082 master-0 kubenswrapper[18592]: E0308 04:03:50.062610 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="514f8205-32f5-4a29-9779-fa9e339e452c" containerName="installer" Mar 08 04:03:50.064082 master-0 kubenswrapper[18592]: I0308 04:03:50.062622 18592 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="514f8205-32f5-4a29-9779-fa9e339e452c" containerName="installer" Mar 08 04:03:50.064082 master-0 kubenswrapper[18592]: E0308 04:03:50.062642 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1a56802af72ce1aac6b5077f1695ac0" containerName="kube-scheduler" Mar 08 04:03:50.064082 master-0 kubenswrapper[18592]: I0308 04:03:50.062654 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1a56802af72ce1aac6b5077f1695ac0" containerName="kube-scheduler" Mar 08 04:03:50.064082 master-0 kubenswrapper[18592]: I0308 04:03:50.062905 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1a56802af72ce1aac6b5077f1695ac0" containerName="kube-scheduler" Mar 08 04:03:50.064082 master-0 kubenswrapper[18592]: I0308 04:03:50.062954 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1a56802af72ce1aac6b5077f1695ac0" containerName="kube-scheduler" Mar 08 04:03:50.064082 master-0 kubenswrapper[18592]: I0308 04:03:50.062977 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="514f8205-32f5-4a29-9779-fa9e339e452c" containerName="installer" Mar 08 04:03:50.064082 master-0 kubenswrapper[18592]: E0308 04:03:50.063172 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1a56802af72ce1aac6b5077f1695ac0" containerName="kube-scheduler" Mar 08 04:03:50.064082 master-0 kubenswrapper[18592]: I0308 04:03:50.063186 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1a56802af72ce1aac6b5077f1695ac0" containerName="kube-scheduler" Mar 08 04:03:50.064082 master-0 kubenswrapper[18592]: I0308 04:03:50.063398 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1a56802af72ce1aac6b5077f1695ac0" containerName="kube-scheduler" Mar 08 04:03:50.064082 master-0 kubenswrapper[18592]: I0308 04:03:50.063436 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1a56802af72ce1aac6b5077f1695ac0" containerName="kube-scheduler" Mar 08 
04:03:50.074036 master-0 kubenswrapper[18592]: I0308 04:03:50.065019 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 08 04:03:50.184378 master-0 kubenswrapper[18592]: I0308 04:03:50.183938 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/aa6a75ab47c06be4e74d05f552da4470-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"aa6a75ab47c06be4e74d05f552da4470\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 08 04:03:50.184378 master-0 kubenswrapper[18592]: I0308 04:03:50.184015 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/aa6a75ab47c06be4e74d05f552da4470-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"aa6a75ab47c06be4e74d05f552da4470\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 08 04:03:50.287629 master-0 kubenswrapper[18592]: I0308 04:03:50.287432 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/aa6a75ab47c06be4e74d05f552da4470-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"aa6a75ab47c06be4e74d05f552da4470\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 08 04:03:50.287629 master-0 kubenswrapper[18592]: I0308 04:03:50.287559 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/aa6a75ab47c06be4e74d05f552da4470-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"aa6a75ab47c06be4e74d05f552da4470\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 08 04:03:50.288152 master-0 kubenswrapper[18592]: I0308 04:03:50.287900 18592 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/aa6a75ab47c06be4e74d05f552da4470-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"aa6a75ab47c06be4e74d05f552da4470\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 08 04:03:50.288152 master-0 kubenswrapper[18592]: I0308 04:03:50.287970 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/aa6a75ab47c06be4e74d05f552da4470-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"aa6a75ab47c06be4e74d05f552da4470\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 08 04:03:50.313625 master-0 kubenswrapper[18592]: I0308 04:03:50.313467 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"] Mar 08 04:03:50.353770 master-0 kubenswrapper[18592]: I0308 04:03:50.353714 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 08 04:03:50.388365 master-0 kubenswrapper[18592]: I0308 04:03:50.388289 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs\") pod \"a1a56802af72ce1aac6b5077f1695ac0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " Mar 08 04:03:50.388573 master-0 kubenswrapper[18592]: I0308 04:03:50.388388 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs" (OuterVolumeSpecName: "logs") pod "a1a56802af72ce1aac6b5077f1695ac0" (UID: "a1a56802af72ce1aac6b5077f1695ac0"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 04:03:50.388749 master-0 kubenswrapper[18592]: I0308 04:03:50.388712 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets\") pod \"a1a56802af72ce1aac6b5077f1695ac0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " Mar 08 04:03:50.388788 master-0 kubenswrapper[18592]: I0308 04:03:50.388754 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets" (OuterVolumeSpecName: "secrets") pod "a1a56802af72ce1aac6b5077f1695ac0" (UID: "a1a56802af72ce1aac6b5077f1695ac0"). InnerVolumeSpecName "secrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 04:03:50.389175 master-0 kubenswrapper[18592]: I0308 04:03:50.389148 18592 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs\") on node \"master-0\" DevicePath \"\"" Mar 08 04:03:50.389230 master-0 kubenswrapper[18592]: I0308 04:03:50.389184 18592 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets\") on node \"master-0\" DevicePath \"\"" Mar 08 04:03:50.587747 master-0 kubenswrapper[18592]: I0308 04:03:50.587523 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 08 04:03:50.614303 master-0 kubenswrapper[18592]: W0308 04:03:50.613981 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa6a75ab47c06be4e74d05f552da4470.slice/crio-f1a6eeaff59f094b79b80e8b1a11e512d2e4be3f46d6f2a989c6d86db2640d52 WatchSource:0}: Error finding container f1a6eeaff59f094b79b80e8b1a11e512d2e4be3f46d6f2a989c6d86db2640d52: Status 404 returned error can't find the container with id f1a6eeaff59f094b79b80e8b1a11e512d2e4be3f46d6f2a989c6d86db2640d52 Mar 08 04:03:50.675829 master-0 kubenswrapper[18592]: I0308 04:03:50.675741 18592 patch_prober.go:28] interesting pod/console-744db48f96-lgsd4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" start-of-body= Mar 08 04:03:50.676056 master-0 kubenswrapper[18592]: I0308 04:03:50.675825 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-744db48f96-lgsd4" podUID="935ab7fb-b097-41c3-8926-8343eb29e7fc" containerName="console" probeResult="failure" output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" Mar 08 04:03:50.722385 master-0 kubenswrapper[18592]: I0308 04:03:50.720246 18592 scope.go:117] "RemoveContainer" containerID="e04f902e64611088fc5b2a33ae22c063c66ae87560c79f3c3e32ced196e50876" Mar 08 04:03:50.722385 master-0 kubenswrapper[18592]: I0308 04:03:50.720316 18592 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 08 04:03:50.722385 master-0 kubenswrapper[18592]: I0308 04:03:50.720182 18592 generic.go:334] "Generic (PLEG): container finished" podID="a1a56802af72ce1aac6b5077f1695ac0" containerID="e04f902e64611088fc5b2a33ae22c063c66ae87560c79f3c3e32ced196e50876" exitCode=0 Mar 08 04:03:50.726417 master-0 kubenswrapper[18592]: I0308 04:03:50.726296 18592 generic.go:334] "Generic (PLEG): container finished" podID="52aa7ab6-5de2-4af0-a4a5-b5d0f524dadb" containerID="fb353c6e6f508946d6e48864215c0bcbb1578b78f6c37fc4f5747b87ef99a44e" exitCode=0 Mar 08 04:03:50.726629 master-0 kubenswrapper[18592]: I0308 04:03:50.726489 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-6-master-0" event={"ID":"52aa7ab6-5de2-4af0-a4a5-b5d0f524dadb","Type":"ContainerDied","Data":"fb353c6e6f508946d6e48864215c0bcbb1578b78f6c37fc4f5747b87ef99a44e"} Mar 08 04:03:50.731114 master-0 kubenswrapper[18592]: I0308 04:03:50.730741 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"aa6a75ab47c06be4e74d05f552da4470","Type":"ContainerStarted","Data":"f1a6eeaff59f094b79b80e8b1a11e512d2e4be3f46d6f2a989c6d86db2640d52"} Mar 08 04:03:50.762032 master-0 kubenswrapper[18592]: I0308 04:03:50.761419 18592 scope.go:117] "RemoveContainer" containerID="f7171701edb795064e29edd4a52aeb0af591e01a8efb0166607b6c1961305d36" Mar 08 04:03:50.795460 master-0 kubenswrapper[18592]: I0308 04:03:50.795420 18592 scope.go:117] "RemoveContainer" containerID="e04f902e64611088fc5b2a33ae22c063c66ae87560c79f3c3e32ced196e50876" Mar 08 04:03:50.795976 master-0 kubenswrapper[18592]: E0308 04:03:50.795938 18592 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e04f902e64611088fc5b2a33ae22c063c66ae87560c79f3c3e32ced196e50876\": container with ID starting with 
e04f902e64611088fc5b2a33ae22c063c66ae87560c79f3c3e32ced196e50876 not found: ID does not exist" containerID="e04f902e64611088fc5b2a33ae22c063c66ae87560c79f3c3e32ced196e50876" Mar 08 04:03:50.796078 master-0 kubenswrapper[18592]: I0308 04:03:50.795970 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e04f902e64611088fc5b2a33ae22c063c66ae87560c79f3c3e32ced196e50876"} err="failed to get container status \"e04f902e64611088fc5b2a33ae22c063c66ae87560c79f3c3e32ced196e50876\": rpc error: code = NotFound desc = could not find container \"e04f902e64611088fc5b2a33ae22c063c66ae87560c79f3c3e32ced196e50876\": container with ID starting with e04f902e64611088fc5b2a33ae22c063c66ae87560c79f3c3e32ced196e50876 not found: ID does not exist" Mar 08 04:03:50.796078 master-0 kubenswrapper[18592]: I0308 04:03:50.795992 18592 scope.go:117] "RemoveContainer" containerID="f7171701edb795064e29edd4a52aeb0af591e01a8efb0166607b6c1961305d36" Mar 08 04:03:50.796271 master-0 kubenswrapper[18592]: E0308 04:03:50.796241 18592 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7171701edb795064e29edd4a52aeb0af591e01a8efb0166607b6c1961305d36\": container with ID starting with f7171701edb795064e29edd4a52aeb0af591e01a8efb0166607b6c1961305d36 not found: ID does not exist" containerID="f7171701edb795064e29edd4a52aeb0af591e01a8efb0166607b6c1961305d36" Mar 08 04:03:50.796271 master-0 kubenswrapper[18592]: I0308 04:03:50.796262 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7171701edb795064e29edd4a52aeb0af591e01a8efb0166607b6c1961305d36"} err="failed to get container status \"f7171701edb795064e29edd4a52aeb0af591e01a8efb0166607b6c1961305d36\": rpc error: code = NotFound desc = could not find container \"f7171701edb795064e29edd4a52aeb0af591e01a8efb0166607b6c1961305d36\": container with ID starting with 
f7171701edb795064e29edd4a52aeb0af591e01a8efb0166607b6c1961305d36 not found: ID does not exist" Mar 08 04:03:51.746605 master-0 kubenswrapper[18592]: I0308 04:03:51.746515 18592 generic.go:334] "Generic (PLEG): container finished" podID="aa6a75ab47c06be4e74d05f552da4470" containerID="ea4d28a8e0a35554e6eab7f1c0d0a017c1e04bb0df57e48d19de47f12eba5a87" exitCode=0 Mar 08 04:03:51.746605 master-0 kubenswrapper[18592]: I0308 04:03:51.746583 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"aa6a75ab47c06be4e74d05f552da4470","Type":"ContainerDied","Data":"ea4d28a8e0a35554e6eab7f1c0d0a017c1e04bb0df57e48d19de47f12eba5a87"} Mar 08 04:03:52.152915 master-0 kubenswrapper[18592]: I0308 04:03:52.152578 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1a56802af72ce1aac6b5077f1695ac0" path="/var/lib/kubelet/pods/a1a56802af72ce1aac6b5077f1695ac0/volumes" Mar 08 04:03:52.153558 master-0 kubenswrapper[18592]: I0308 04:03:52.153514 18592 mirror_client.go:130] "Deleting a mirror pod" pod="kube-system/bootstrap-kube-scheduler-master-0" podUID="" Mar 08 04:03:52.170940 master-0 kubenswrapper[18592]: I0308 04:03:52.170891 18592 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-6-master-0" Mar 08 04:03:52.178744 master-0 kubenswrapper[18592]: I0308 04:03:52.178691 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["kube-system/bootstrap-kube-scheduler-master-0"] Mar 08 04:03:52.178744 master-0 kubenswrapper[18592]: I0308 04:03:52.178736 18592 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-scheduler-master-0" mirrorPodUID="465d4321-1fd3-4d76-a8b4-f9aee09bd574" Mar 08 04:03:52.191343 master-0 kubenswrapper[18592]: I0308 04:03:52.191287 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["kube-system/bootstrap-kube-scheduler-master-0"] Mar 08 04:03:52.191343 master-0 kubenswrapper[18592]: I0308 04:03:52.191337 18592 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-scheduler-master-0" mirrorPodUID="465d4321-1fd3-4d76-a8b4-f9aee09bd574" Mar 08 04:03:52.318289 master-0 kubenswrapper[18592]: I0308 04:03:52.318233 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/52aa7ab6-5de2-4af0-a4a5-b5d0f524dadb-kubelet-dir\") pod \"52aa7ab6-5de2-4af0-a4a5-b5d0f524dadb\" (UID: \"52aa7ab6-5de2-4af0-a4a5-b5d0f524dadb\") " Mar 08 04:03:52.318464 master-0 kubenswrapper[18592]: I0308 04:03:52.318425 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/52aa7ab6-5de2-4af0-a4a5-b5d0f524dadb-kube-api-access\") pod \"52aa7ab6-5de2-4af0-a4a5-b5d0f524dadb\" (UID: \"52aa7ab6-5de2-4af0-a4a5-b5d0f524dadb\") " Mar 08 04:03:52.318562 master-0 kubenswrapper[18592]: I0308 04:03:52.318476 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/52aa7ab6-5de2-4af0-a4a5-b5d0f524dadb-var-lock\") pod \"52aa7ab6-5de2-4af0-a4a5-b5d0f524dadb\" 
(UID: \"52aa7ab6-5de2-4af0-a4a5-b5d0f524dadb\") " Mar 08 04:03:52.318562 master-0 kubenswrapper[18592]: I0308 04:03:52.318517 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/52aa7ab6-5de2-4af0-a4a5-b5d0f524dadb-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "52aa7ab6-5de2-4af0-a4a5-b5d0f524dadb" (UID: "52aa7ab6-5de2-4af0-a4a5-b5d0f524dadb"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 04:03:52.318849 master-0 kubenswrapper[18592]: I0308 04:03:52.318813 18592 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/52aa7ab6-5de2-4af0-a4a5-b5d0f524dadb-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 04:03:52.319129 master-0 kubenswrapper[18592]: I0308 04:03:52.319097 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/52aa7ab6-5de2-4af0-a4a5-b5d0f524dadb-var-lock" (OuterVolumeSpecName: "var-lock") pod "52aa7ab6-5de2-4af0-a4a5-b5d0f524dadb" (UID: "52aa7ab6-5de2-4af0-a4a5-b5d0f524dadb"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 04:03:52.322714 master-0 kubenswrapper[18592]: I0308 04:03:52.322679 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52aa7ab6-5de2-4af0-a4a5-b5d0f524dadb-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "52aa7ab6-5de2-4af0-a4a5-b5d0f524dadb" (UID: "52aa7ab6-5de2-4af0-a4a5-b5d0f524dadb"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 04:03:52.419621 master-0 kubenswrapper[18592]: I0308 04:03:52.419562 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/52aa7ab6-5de2-4af0-a4a5-b5d0f524dadb-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 08 04:03:52.419621 master-0 kubenswrapper[18592]: I0308 04:03:52.419605 18592 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/52aa7ab6-5de2-4af0-a4a5-b5d0f524dadb-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 08 04:03:52.759921 master-0 kubenswrapper[18592]: I0308 04:03:52.759856 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"aa6a75ab47c06be4e74d05f552da4470","Type":"ContainerStarted","Data":"fff5395e5b129a13c3f80e42b14089b343efff54fa93d4702e67b183b5a0443b"} Mar 08 04:03:52.759921 master-0 kubenswrapper[18592]: I0308 04:03:52.759944 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"aa6a75ab47c06be4e74d05f552da4470","Type":"ContainerStarted","Data":"29cf237c8634de03c2e3add4c987e5196114f519b7a82771c9e389046aafd29b"} Mar 08 04:03:52.760683 master-0 kubenswrapper[18592]: I0308 04:03:52.759966 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"aa6a75ab47c06be4e74d05f552da4470","Type":"ContainerStarted","Data":"30f61262c619808a683d77b66741f73867107a5ba4895e6064a9536e490c6640"} Mar 08 04:03:52.760683 master-0 kubenswrapper[18592]: I0308 04:03:52.759992 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 08 04:03:52.762913 master-0 kubenswrapper[18592]: I0308 04:03:52.762165 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler/installer-6-master-0" event={"ID":"52aa7ab6-5de2-4af0-a4a5-b5d0f524dadb","Type":"ContainerDied","Data":"75626aea3d66d0ae47b2146334586f11fc2d98cf49a021c7d5078b6701b55f22"} Mar 08 04:03:52.762913 master-0 kubenswrapper[18592]: I0308 04:03:52.762209 18592 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75626aea3d66d0ae47b2146334586f11fc2d98cf49a021c7d5078b6701b55f22" Mar 08 04:03:52.762913 master-0 kubenswrapper[18592]: I0308 04:03:52.762217 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-6-master-0" Mar 08 04:03:52.803362 master-0 kubenswrapper[18592]: I0308 04:03:52.803261 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podStartSLOduration=2.80323326 podStartE2EDuration="2.80323326s" podCreationTimestamp="2026-03-08 04:03:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 04:03:52.79906749 +0000 UTC m=+644.897821880" watchObservedRunningTime="2026-03-08 04:03:52.80323326 +0000 UTC m=+644.901987650" Mar 08 04:03:53.302817 master-0 kubenswrapper[18592]: I0308 04:03:53.302719 18592 patch_prober.go:28] interesting pod/console-7748864899-8p6h5 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body= Mar 08 04:03:53.303141 master-0 kubenswrapper[18592]: I0308 04:03:53.302805 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-7748864899-8p6h5" podUID="5bc0469d-1ae9-4606-ba99-7a333b66af37" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" Mar 08 04:03:56.142589 master-0 
kubenswrapper[18592]: I0308 04:03:56.142512 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 04:03:56.177391 master-0 kubenswrapper[18592]: I0308 04:03:56.177327 18592 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="0b206b98-8908-4245-ab4a-86555d2574e2" Mar 08 04:03:56.177391 master-0 kubenswrapper[18592]: I0308 04:03:56.177365 18592 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="0b206b98-8908-4245-ab4a-86555d2574e2" Mar 08 04:03:56.195354 master-0 kubenswrapper[18592]: I0308 04:03:56.195226 18592 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 04:03:56.215142 master-0 kubenswrapper[18592]: I0308 04:03:56.215036 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 08 04:03:56.215470 master-0 kubenswrapper[18592]: I0308 04:03:56.215164 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 04:03:56.222618 master-0 kubenswrapper[18592]: I0308 04:03:56.222562 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 08 04:03:56.230686 master-0 kubenswrapper[18592]: I0308 04:03:56.230591 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 08 04:03:56.260360 master-0 kubenswrapper[18592]: W0308 04:03:56.260244 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a9d6aff6989e430a5479397d5c55a56.slice/crio-dad72972e0d5f12b0c3daaed7f8489be43b21fe41b8aebc4ace20c3774138f25 WatchSource:0}: Error finding container dad72972e0d5f12b0c3daaed7f8489be43b21fe41b8aebc4ace20c3774138f25: Status 404 returned error can't find the container with id dad72972e0d5f12b0c3daaed7f8489be43b21fe41b8aebc4ace20c3774138f25 Mar 08 04:03:56.806200 master-0 kubenswrapper[18592]: I0308 04:03:56.806104 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"1a9d6aff6989e430a5479397d5c55a56","Type":"ContainerStarted","Data":"0d25b4d4c9e6c2e7fd5af92e4bc85c9553735f475010116b289e961fedcd94d9"} Mar 08 04:03:56.806310 master-0 kubenswrapper[18592]: I0308 04:03:56.806198 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"1a9d6aff6989e430a5479397d5c55a56","Type":"ContainerStarted","Data":"dad72972e0d5f12b0c3daaed7f8489be43b21fe41b8aebc4ace20c3774138f25"} Mar 08 04:03:57.822013 master-0 kubenswrapper[18592]: I0308 04:03:57.821917 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" 
event={"ID":"1a9d6aff6989e430a5479397d5c55a56","Type":"ContainerStarted","Data":"b5cfaf4d10274b09585973d545f52b40ae8030c571d6c24324c256cf51f0c811"} Mar 08 04:03:57.822013 master-0 kubenswrapper[18592]: I0308 04:03:57.822004 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"1a9d6aff6989e430a5479397d5c55a56","Type":"ContainerStarted","Data":"cafb2b1c7fc5f25ffa8e5de7e2608271cbb979fb6f206fc1b90dc0d97569f7d9"} Mar 08 04:03:57.822668 master-0 kubenswrapper[18592]: I0308 04:03:57.822031 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"1a9d6aff6989e430a5479397d5c55a56","Type":"ContainerStarted","Data":"4ff9d8ebc4c4fbf7f72271d97c70774abcc3ba4f168511dc1ce373ac1c1f528c"} Mar 08 04:03:57.853988 master-0 kubenswrapper[18592]: I0308 04:03:57.853862 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podStartSLOduration=1.853798044 podStartE2EDuration="1.853798044s" podCreationTimestamp="2026-03-08 04:03:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 04:03:57.843989605 +0000 UTC m=+649.942743985" watchObservedRunningTime="2026-03-08 04:03:57.853798044 +0000 UTC m=+649.952552434" Mar 08 04:04:00.675635 master-0 kubenswrapper[18592]: I0308 04:04:00.675375 18592 patch_prober.go:28] interesting pod/console-744db48f96-lgsd4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" start-of-body= Mar 08 04:04:00.675635 master-0 kubenswrapper[18592]: I0308 04:04:00.675470 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-744db48f96-lgsd4" 
podUID="935ab7fb-b097-41c3-8926-8343eb29e7fc" containerName="console" probeResult="failure" output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" Mar 08 04:04:03.302410 master-0 kubenswrapper[18592]: I0308 04:04:03.302295 18592 patch_prober.go:28] interesting pod/console-7748864899-8p6h5 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body= Mar 08 04:04:03.302410 master-0 kubenswrapper[18592]: I0308 04:04:03.302381 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-7748864899-8p6h5" podUID="5bc0469d-1ae9-4606-ba99-7a333b66af37" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" Mar 08 04:04:06.215402 master-0 kubenswrapper[18592]: I0308 04:04:06.215320 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 04:04:06.216255 master-0 kubenswrapper[18592]: I0308 04:04:06.215441 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 04:04:06.216255 master-0 kubenswrapper[18592]: I0308 04:04:06.215681 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 04:04:06.216255 master-0 kubenswrapper[18592]: I0308 04:04:06.215738 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 04:04:06.225202 master-0 kubenswrapper[18592]: I0308 04:04:06.225129 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 04:04:06.225470 master-0 kubenswrapper[18592]: I0308 04:04:06.225425 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 04:04:06.916882 master-0 kubenswrapper[18592]: I0308 04:04:06.916773 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 04:04:06.919682 master-0 kubenswrapper[18592]: I0308 04:04:06.918937 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 04:04:07.179998 master-0 kubenswrapper[18592]: I0308 04:04:07.179814 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-master-0"] Mar 08 04:04:08.200525 master-0 kubenswrapper[18592]: I0308 04:04:08.200407 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-master-0" podStartSLOduration=1.200383255 podStartE2EDuration="1.200383255s" podCreationTimestamp="2026-03-08 04:04:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 04:04:08.19753409 +0000 UTC m=+660.296288480" watchObservedRunningTime="2026-03-08 04:04:08.200383255 +0000 UTC m=+660.299137635" Mar 08 04:04:08.539112 master-0 kubenswrapper[18592]: I0308 04:04:08.538957 18592 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Mar 08 04:04:08.539389 master-0 kubenswrapper[18592]: E0308 04:04:08.539352 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52aa7ab6-5de2-4af0-a4a5-b5d0f524dadb" containerName="installer" Mar 08 04:04:08.539389 master-0 kubenswrapper[18592]: I0308 04:04:08.539384 18592 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="52aa7ab6-5de2-4af0-a4a5-b5d0f524dadb" containerName="installer" Mar 08 04:04:08.539724 master-0 kubenswrapper[18592]: I0308 04:04:08.539681 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="52aa7ab6-5de2-4af0-a4a5-b5d0f524dadb" containerName="installer" Mar 08 04:04:08.540549 master-0 kubenswrapper[18592]: I0308 04:04:08.540505 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 04:04:08.540907 master-0 kubenswrapper[18592]: I0308 04:04:08.540864 18592 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Mar 08 04:04:08.541571 master-0 kubenswrapper[18592]: I0308 04:04:08.541512 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver" containerID="cri-o://101e91519fa54cd23690f6a36dea77798470079f4382e04ac20e58f00a7180da" gracePeriod=15 Mar 08 04:04:08.541942 master-0 kubenswrapper[18592]: I0308 04:04:08.541553 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://da6fb8324087a24afc43afc026ff79a419163b34f188875acd9bb6ca7456a181" gracePeriod=15 Mar 08 04:04:08.542100 master-0 kubenswrapper[18592]: I0308 04:04:08.541557 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver-check-endpoints" containerID="cri-o://4d0d18c78b2f0279d72c1060fc00718b3f77f330ad1bd6f44039320acf418c8d" gracePeriod=15 Mar 08 04:04:08.542227 master-0 kubenswrapper[18592]: I0308 04:04:08.541592 18592 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver-cert-syncer" containerID="cri-o://b4a05e47377b25ea4d9a9af26e5631afe65dee09a108205893226dbc248e594c" gracePeriod=15 Mar 08 04:04:08.542227 master-0 kubenswrapper[18592]: I0308 04:04:08.541594 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://3d937ddeff4a562c97267b5e3e7157a51ad8f928d1db69fb1e607bd363c3cf25" gracePeriod=15 Mar 08 04:04:08.543677 master-0 kubenswrapper[18592]: I0308 04:04:08.542913 18592 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Mar 08 04:04:08.543677 master-0 kubenswrapper[18592]: E0308 04:04:08.543231 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver-cert-syncer" Mar 08 04:04:08.543677 master-0 kubenswrapper[18592]: I0308 04:04:08.543256 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver-cert-syncer" Mar 08 04:04:08.543677 master-0 kubenswrapper[18592]: E0308 04:04:08.543287 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver-insecure-readyz" Mar 08 04:04:08.543677 master-0 kubenswrapper[18592]: I0308 04:04:08.543303 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver-insecure-readyz" Mar 08 04:04:08.543677 master-0 kubenswrapper[18592]: E0308 04:04:08.543334 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="077dd10388b9e3e48a07382126e86621" 
containerName="kube-apiserver-cert-regeneration-controller" Mar 08 04:04:08.543677 master-0 kubenswrapper[18592]: I0308 04:04:08.543351 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver-cert-regeneration-controller" Mar 08 04:04:08.543677 master-0 kubenswrapper[18592]: E0308 04:04:08.543386 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver-check-endpoints" Mar 08 04:04:08.543677 master-0 kubenswrapper[18592]: I0308 04:04:08.543405 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver-check-endpoints" Mar 08 04:04:08.543677 master-0 kubenswrapper[18592]: E0308 04:04:08.543431 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver" Mar 08 04:04:08.543677 master-0 kubenswrapper[18592]: I0308 04:04:08.543446 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver" Mar 08 04:04:08.543677 master-0 kubenswrapper[18592]: E0308 04:04:08.543496 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="077dd10388b9e3e48a07382126e86621" containerName="setup" Mar 08 04:04:08.543677 master-0 kubenswrapper[18592]: I0308 04:04:08.543513 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="077dd10388b9e3e48a07382126e86621" containerName="setup" Mar 08 04:04:08.545371 master-0 kubenswrapper[18592]: I0308 04:04:08.543868 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver-check-endpoints" Mar 08 04:04:08.545371 master-0 kubenswrapper[18592]: I0308 04:04:08.543908 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="077dd10388b9e3e48a07382126e86621" 
containerName="kube-apiserver-insecure-readyz" Mar 08 04:04:08.545371 master-0 kubenswrapper[18592]: I0308 04:04:08.543971 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver-cert-regeneration-controller" Mar 08 04:04:08.545371 master-0 kubenswrapper[18592]: I0308 04:04:08.544002 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver" Mar 08 04:04:08.545371 master-0 kubenswrapper[18592]: I0308 04:04:08.544045 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver-cert-syncer" Mar 08 04:04:08.656067 master-0 kubenswrapper[18592]: E0308 04:04:08.655928 18592 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 04:04:08.709488 master-0 kubenswrapper[18592]: I0308 04:04:08.709432 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a814bd60de133d95cf99630a978c017e\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 04:04:08.709488 master-0 kubenswrapper[18592]: I0308 04:04:08.709501 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a814bd60de133d95cf99630a978c017e\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 04:04:08.709863 master-0 
kubenswrapper[18592]: I0308 04:04:08.709540 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a814bd60de133d95cf99630a978c017e\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 04:04:08.709863 master-0 kubenswrapper[18592]: I0308 04:04:08.709565 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a814bd60de133d95cf99630a978c017e\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 04:04:08.709863 master-0 kubenswrapper[18592]: I0308 04:04:08.709620 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a814bd60de133d95cf99630a978c017e\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 04:04:08.709863 master-0 kubenswrapper[18592]: I0308 04:04:08.709729 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/36d4251d3504cdc0ec85144c1379056c-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"36d4251d3504cdc0ec85144c1379056c\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 04:04:08.710068 master-0 kubenswrapper[18592]: I0308 04:04:08.709941 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/36d4251d3504cdc0ec85144c1379056c-cert-dir\") 
pod \"kube-apiserver-master-0\" (UID: \"36d4251d3504cdc0ec85144c1379056c\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 04:04:08.710068 master-0 kubenswrapper[18592]: I0308 04:04:08.710057 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/36d4251d3504cdc0ec85144c1379056c-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"36d4251d3504cdc0ec85144c1379056c\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 04:04:08.812358 master-0 kubenswrapper[18592]: I0308 04:04:08.812173 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a814bd60de133d95cf99630a978c017e\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 04:04:08.812358 master-0 kubenswrapper[18592]: I0308 04:04:08.812351 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a814bd60de133d95cf99630a978c017e\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 04:04:08.812642 master-0 kubenswrapper[18592]: I0308 04:04:08.812425 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a814bd60de133d95cf99630a978c017e\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 04:04:08.812642 master-0 kubenswrapper[18592]: I0308 04:04:08.812516 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" 
(UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a814bd60de133d95cf99630a978c017e\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 04:04:08.812787 master-0 kubenswrapper[18592]: I0308 04:04:08.812631 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/36d4251d3504cdc0ec85144c1379056c-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"36d4251d3504cdc0ec85144c1379056c\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 04:04:08.812974 master-0 kubenswrapper[18592]: I0308 04:04:08.812784 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/36d4251d3504cdc0ec85144c1379056c-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"36d4251d3504cdc0ec85144c1379056c\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 04:04:08.812974 master-0 kubenswrapper[18592]: I0308 04:04:08.812861 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/36d4251d3504cdc0ec85144c1379056c-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"36d4251d3504cdc0ec85144c1379056c\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 04:04:08.812974 master-0 kubenswrapper[18592]: I0308 04:04:08.812912 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a814bd60de133d95cf99630a978c017e\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 04:04:08.813230 master-0 kubenswrapper[18592]: I0308 04:04:08.813080 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"resource-dir\" (UniqueName: \"kubernetes.io/host-path/36d4251d3504cdc0ec85144c1379056c-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"36d4251d3504cdc0ec85144c1379056c\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 04:04:08.813230 master-0 kubenswrapper[18592]: I0308 04:04:08.813093 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a814bd60de133d95cf99630a978c017e\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 04:04:08.813230 master-0 kubenswrapper[18592]: I0308 04:04:08.813223 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a814bd60de133d95cf99630a978c017e\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 04:04:08.813431 master-0 kubenswrapper[18592]: I0308 04:04:08.813228 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a814bd60de133d95cf99630a978c017e\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 04:04:08.813431 master-0 kubenswrapper[18592]: I0308 04:04:08.813345 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a814bd60de133d95cf99630a978c017e\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 04:04:08.813431 master-0 kubenswrapper[18592]: I0308 04:04:08.813389 18592 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/36d4251d3504cdc0ec85144c1379056c-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"36d4251d3504cdc0ec85144c1379056c\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 04:04:08.813431 master-0 kubenswrapper[18592]: I0308 04:04:08.813417 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/36d4251d3504cdc0ec85144c1379056c-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"36d4251d3504cdc0ec85144c1379056c\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 04:04:08.813692 master-0 kubenswrapper[18592]: I0308 04:04:08.813452 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a814bd60de133d95cf99630a978c017e\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 04:04:08.933764 master-0 kubenswrapper[18592]: I0308 04:04:08.933695 18592 generic.go:334] "Generic (PLEG): container finished" podID="b73e6391-2d8f-46e1-b275-a9106a730d60" containerID="d9bb3f958d2b5e248ca9acef3858fad3705cf981affbc68bf5ea66794147a61c" exitCode=0 Mar 08 04:04:08.933990 master-0 kubenswrapper[18592]: I0308 04:04:08.933755 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-5-retry-1-master-0" event={"ID":"b73e6391-2d8f-46e1-b275-a9106a730d60","Type":"ContainerDied","Data":"d9bb3f958d2b5e248ca9acef3858fad3705cf981affbc68bf5ea66794147a61c"} Mar 08 04:04:08.935535 master-0 kubenswrapper[18592]: I0308 04:04:08.935467 18592 status_manager.go:851] "Failed to get status for pod" podUID="b73e6391-2d8f-46e1-b275-a9106a730d60" pod="openshift-kube-apiserver/installer-5-retry-1-master-0" err="Get 
\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-retry-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 04:04:08.936444 master-0 kubenswrapper[18592]: I0308 04:04:08.936385 18592 status_manager.go:851] "Failed to get status for pod" podUID="077dd10388b9e3e48a07382126e86621" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 04:04:08.937541 master-0 kubenswrapper[18592]: I0308 04:04:08.937515 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_077dd10388b9e3e48a07382126e86621/kube-apiserver-cert-syncer/0.log" Mar 08 04:04:08.938309 master-0 kubenswrapper[18592]: I0308 04:04:08.938260 18592 generic.go:334] "Generic (PLEG): container finished" podID="077dd10388b9e3e48a07382126e86621" containerID="4d0d18c78b2f0279d72c1060fc00718b3f77f330ad1bd6f44039320acf418c8d" exitCode=0 Mar 08 04:04:08.938309 master-0 kubenswrapper[18592]: I0308 04:04:08.938302 18592 generic.go:334] "Generic (PLEG): container finished" podID="077dd10388b9e3e48a07382126e86621" containerID="da6fb8324087a24afc43afc026ff79a419163b34f188875acd9bb6ca7456a181" exitCode=0 Mar 08 04:04:08.938499 master-0 kubenswrapper[18592]: I0308 04:04:08.938320 18592 generic.go:334] "Generic (PLEG): container finished" podID="077dd10388b9e3e48a07382126e86621" containerID="3d937ddeff4a562c97267b5e3e7157a51ad8f928d1db69fb1e607bd363c3cf25" exitCode=0 Mar 08 04:04:08.938499 master-0 kubenswrapper[18592]: I0308 04:04:08.938340 18592 generic.go:334] "Generic (PLEG): container finished" podID="077dd10388b9e3e48a07382126e86621" containerID="b4a05e47377b25ea4d9a9af26e5631afe65dee09a108205893226dbc248e594c" exitCode=2 Mar 08 04:04:08.957531 master-0 kubenswrapper[18592]: I0308 04:04:08.957469 18592 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 04:04:08.987820 master-0 kubenswrapper[18592]: W0308 04:04:08.987744 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda814bd60de133d95cf99630a978c017e.slice/crio-e95d8951a83a826a6f4baeadc9834aa60f8ed19b47a7edc55e932ea6ea3cb287 WatchSource:0}: Error finding container e95d8951a83a826a6f4baeadc9834aa60f8ed19b47a7edc55e932ea6ea3cb287: Status 404 returned error can't find the container with id e95d8951a83a826a6f4baeadc9834aa60f8ed19b47a7edc55e932ea6ea3cb287 Mar 08 04:04:08.991643 master-0 kubenswrapper[18592]: E0308 04:04:08.991447 18592 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-master-0.189ac1f2203726c1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-master-0,UID:a814bd60de133d95cf99630a978c017e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5500329ab50804678fb8a90b96bf2a469bca16b620fb6dd2f5f5a17106e94898\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 04:04:08.990443201 +0000 UTC m=+661.089197591,LastTimestamp:2026-03-08 04:04:08.990443201 +0000 UTC m=+661.089197591,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 04:04:09.168782 master-0 kubenswrapper[18592]: I0308 
04:04:09.168649 18592 patch_prober.go:28] interesting pod/kube-apiserver-master-0 container/kube-apiserver namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.32.10:6443/readyz\": dial tcp 192.168.32.10:6443: connect: connection refused" start-of-body= Mar 08 04:04:09.168782 master-0 kubenswrapper[18592]: I0308 04:04:09.168719 18592 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.32.10:6443/readyz\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 04:04:09.168782 master-0 kubenswrapper[18592]: I0308 04:04:09.168726 18592 patch_prober.go:28] interesting pod/kube-apiserver-master-0 container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.32.10:17697/healthz\": dial tcp 192.168.32.10:17697: connect: connection refused" start-of-body= Mar 08 04:04:09.169168 master-0 kubenswrapper[18592]: I0308 04:04:09.168803 18592 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.32.10:17697/healthz\": dial tcp 192.168.32.10:17697: connect: connection refused" Mar 08 04:04:09.325375 master-0 kubenswrapper[18592]: E0308 04:04:09.324694 18592 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-podb73e6391_2d8f_46e1_b275_a9106a730d60.slice/crio-d9bb3f958d2b5e248ca9acef3858fad3705cf981affbc68bf5ea66794147a61c.scope\": RecentStats: unable to find data in memory cache]" Mar 08 04:04:09.954109 master-0 kubenswrapper[18592]: I0308 04:04:09.953995 18592 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"a814bd60de133d95cf99630a978c017e","Type":"ContainerStarted","Data":"38d58eb6cc1f0fc948fc0e241422f9cbe055b447427adb52e3e099a976cca861"} Mar 08 04:04:09.954408 master-0 kubenswrapper[18592]: I0308 04:04:09.954119 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"a814bd60de133d95cf99630a978c017e","Type":"ContainerStarted","Data":"e95d8951a83a826a6f4baeadc9834aa60f8ed19b47a7edc55e932ea6ea3cb287"} Mar 08 04:04:09.956307 master-0 kubenswrapper[18592]: E0308 04:04:09.956241 18592 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 04:04:09.956468 master-0 kubenswrapper[18592]: I0308 04:04:09.956359 18592 status_manager.go:851] "Failed to get status for pod" podUID="b73e6391-2d8f-46e1-b275-a9106a730d60" pod="openshift-kube-apiserver/installer-5-retry-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-retry-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 04:04:09.957753 master-0 kubenswrapper[18592]: I0308 04:04:09.957678 18592 status_manager.go:851] "Failed to get status for pod" podUID="077dd10388b9e3e48a07382126e86621" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 04:04:10.411258 master-0 kubenswrapper[18592]: I0308 04:04:10.411193 18592 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-5-retry-1-master-0" Mar 08 04:04:10.412279 master-0 kubenswrapper[18592]: I0308 04:04:10.412218 18592 status_manager.go:851] "Failed to get status for pod" podUID="b73e6391-2d8f-46e1-b275-a9106a730d60" pod="openshift-kube-apiserver/installer-5-retry-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-retry-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 04:04:10.545565 master-0 kubenswrapper[18592]: I0308 04:04:10.545491 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b73e6391-2d8f-46e1-b275-a9106a730d60-var-lock\") pod \"b73e6391-2d8f-46e1-b275-a9106a730d60\" (UID: \"b73e6391-2d8f-46e1-b275-a9106a730d60\") " Mar 08 04:04:10.545815 master-0 kubenswrapper[18592]: I0308 04:04:10.545610 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b73e6391-2d8f-46e1-b275-a9106a730d60-var-lock" (OuterVolumeSpecName: "var-lock") pod "b73e6391-2d8f-46e1-b275-a9106a730d60" (UID: "b73e6391-2d8f-46e1-b275-a9106a730d60"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 04:04:10.545815 master-0 kubenswrapper[18592]: I0308 04:04:10.545662 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b73e6391-2d8f-46e1-b275-a9106a730d60-kubelet-dir\") pod \"b73e6391-2d8f-46e1-b275-a9106a730d60\" (UID: \"b73e6391-2d8f-46e1-b275-a9106a730d60\") " Mar 08 04:04:10.545815 master-0 kubenswrapper[18592]: I0308 04:04:10.545724 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b73e6391-2d8f-46e1-b275-a9106a730d60-kube-api-access\") pod \"b73e6391-2d8f-46e1-b275-a9106a730d60\" (UID: \"b73e6391-2d8f-46e1-b275-a9106a730d60\") " Mar 08 04:04:10.545815 master-0 kubenswrapper[18592]: I0308 04:04:10.545799 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b73e6391-2d8f-46e1-b275-a9106a730d60-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "b73e6391-2d8f-46e1-b275-a9106a730d60" (UID: "b73e6391-2d8f-46e1-b275-a9106a730d60"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 04:04:10.546406 master-0 kubenswrapper[18592]: I0308 04:04:10.546356 18592 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b73e6391-2d8f-46e1-b275-a9106a730d60-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 08 04:04:10.546406 master-0 kubenswrapper[18592]: I0308 04:04:10.546391 18592 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b73e6391-2d8f-46e1-b275-a9106a730d60-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 04:04:10.550366 master-0 kubenswrapper[18592]: I0308 04:04:10.550294 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b73e6391-2d8f-46e1-b275-a9106a730d60-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "b73e6391-2d8f-46e1-b275-a9106a730d60" (UID: "b73e6391-2d8f-46e1-b275-a9106a730d60"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 04:04:10.666100 master-0 kubenswrapper[18592]: I0308 04:04:10.659568 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b73e6391-2d8f-46e1-b275-a9106a730d60-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 08 04:04:10.716059 master-0 kubenswrapper[18592]: I0308 04:04:10.713459 18592 patch_prober.go:28] interesting pod/console-744db48f96-lgsd4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" start-of-body= Mar 08 04:04:10.716059 master-0 kubenswrapper[18592]: I0308 04:04:10.713546 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-744db48f96-lgsd4" podUID="935ab7fb-b097-41c3-8926-8343eb29e7fc" containerName="console" probeResult="failure" output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" Mar 08 04:04:10.966198 master-0 kubenswrapper[18592]: I0308 04:04:10.966128 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-5-retry-1-master-0" event={"ID":"b73e6391-2d8f-46e1-b275-a9106a730d60","Type":"ContainerDied","Data":"908f5aef9c7d532c7deaec8b20a17221ae2870023e1e2740acecef674b075b52"} Mar 08 04:04:10.966198 master-0 kubenswrapper[18592]: I0308 04:04:10.966180 18592 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-5-retry-1-master-0" Mar 08 04:04:10.966795 master-0 kubenswrapper[18592]: I0308 04:04:10.966197 18592 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="908f5aef9c7d532c7deaec8b20a17221ae2870023e1e2740acecef674b075b52" Mar 08 04:04:10.992399 master-0 kubenswrapper[18592]: I0308 04:04:10.992237 18592 status_manager.go:851] "Failed to get status for pod" podUID="b73e6391-2d8f-46e1-b275-a9106a730d60" pod="openshift-kube-apiserver/installer-5-retry-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-retry-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 04:04:11.074730 master-0 kubenswrapper[18592]: I0308 04:04:11.074675 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_077dd10388b9e3e48a07382126e86621/kube-apiserver-cert-syncer/0.log" Mar 08 04:04:11.075803 master-0 kubenswrapper[18592]: I0308 04:04:11.075778 18592 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 04:04:11.077488 master-0 kubenswrapper[18592]: I0308 04:04:11.077405 18592 status_manager.go:851] "Failed to get status for pod" podUID="077dd10388b9e3e48a07382126e86621" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 04:04:11.078568 master-0 kubenswrapper[18592]: I0308 04:04:11.078501 18592 status_manager.go:851] "Failed to get status for pod" podUID="b73e6391-2d8f-46e1-b275-a9106a730d60" pod="openshift-kube-apiserver/installer-5-retry-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-retry-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 04:04:11.165577 master-0 kubenswrapper[18592]: I0308 04:04:11.165501 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-resource-dir\") pod \"077dd10388b9e3e48a07382126e86621\" (UID: \"077dd10388b9e3e48a07382126e86621\") " Mar 08 04:04:11.165864 master-0 kubenswrapper[18592]: I0308 04:04:11.165665 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "077dd10388b9e3e48a07382126e86621" (UID: "077dd10388b9e3e48a07382126e86621"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 04:04:11.165957 master-0 kubenswrapper[18592]: I0308 04:04:11.165937 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-cert-dir\") pod \"077dd10388b9e3e48a07382126e86621\" (UID: \"077dd10388b9e3e48a07382126e86621\") " Mar 08 04:04:11.166165 master-0 kubenswrapper[18592]: I0308 04:04:11.166018 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-audit-dir\") pod \"077dd10388b9e3e48a07382126e86621\" (UID: \"077dd10388b9e3e48a07382126e86621\") " Mar 08 04:04:11.166318 master-0 kubenswrapper[18592]: I0308 04:04:11.166113 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "077dd10388b9e3e48a07382126e86621" (UID: "077dd10388b9e3e48a07382126e86621"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 04:04:11.166445 master-0 kubenswrapper[18592]: I0308 04:04:11.166151 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "077dd10388b9e3e48a07382126e86621" (UID: "077dd10388b9e3e48a07382126e86621"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 04:04:11.167149 master-0 kubenswrapper[18592]: I0308 04:04:11.167111 18592 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 04:04:11.167149 master-0 kubenswrapper[18592]: I0308 04:04:11.167148 18592 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-cert-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 04:04:11.167357 master-0 kubenswrapper[18592]: I0308 04:04:11.167167 18592 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-audit-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 04:04:11.984794 master-0 kubenswrapper[18592]: I0308 04:04:11.984611 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_077dd10388b9e3e48a07382126e86621/kube-apiserver-cert-syncer/0.log" Mar 08 04:04:11.985731 master-0 kubenswrapper[18592]: I0308 04:04:11.985681 18592 generic.go:334] "Generic (PLEG): container finished" podID="077dd10388b9e3e48a07382126e86621" containerID="101e91519fa54cd23690f6a36dea77798470079f4382e04ac20e58f00a7180da" exitCode=0 Mar 08 04:04:11.985851 master-0 kubenswrapper[18592]: I0308 04:04:11.985743 18592 scope.go:117] "RemoveContainer" containerID="4d0d18c78b2f0279d72c1060fc00718b3f77f330ad1bd6f44039320acf418c8d" Mar 08 04:04:11.986089 master-0 kubenswrapper[18592]: I0308 04:04:11.985994 18592 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 04:04:12.010709 master-0 kubenswrapper[18592]: I0308 04:04:12.009647 18592 scope.go:117] "RemoveContainer" containerID="da6fb8324087a24afc43afc026ff79a419163b34f188875acd9bb6ca7456a181" Mar 08 04:04:12.014619 master-0 kubenswrapper[18592]: I0308 04:04:12.014501 18592 status_manager.go:851] "Failed to get status for pod" podUID="077dd10388b9e3e48a07382126e86621" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 04:04:12.015549 master-0 kubenswrapper[18592]: I0308 04:04:12.015486 18592 status_manager.go:851] "Failed to get status for pod" podUID="b73e6391-2d8f-46e1-b275-a9106a730d60" pod="openshift-kube-apiserver/installer-5-retry-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-retry-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 04:04:12.029201 master-0 kubenswrapper[18592]: I0308 04:04:12.029153 18592 scope.go:117] "RemoveContainer" containerID="3d937ddeff4a562c97267b5e3e7157a51ad8f928d1db69fb1e607bd363c3cf25" Mar 08 04:04:12.058729 master-0 kubenswrapper[18592]: I0308 04:04:12.058666 18592 scope.go:117] "RemoveContainer" containerID="b4a05e47377b25ea4d9a9af26e5631afe65dee09a108205893226dbc248e594c" Mar 08 04:04:12.086269 master-0 kubenswrapper[18592]: I0308 04:04:12.085271 18592 scope.go:117] "RemoveContainer" containerID="101e91519fa54cd23690f6a36dea77798470079f4382e04ac20e58f00a7180da" Mar 08 04:04:12.104950 master-0 kubenswrapper[18592]: I0308 04:04:12.104862 18592 scope.go:117] "RemoveContainer" containerID="a35597610df1fae247e29fc70e7550b17b768468fb6d442a71528493ce8f3635" Mar 08 04:04:12.121670 master-0 kubenswrapper[18592]: I0308 04:04:12.121622 18592 scope.go:117] 
"RemoveContainer" containerID="4d0d18c78b2f0279d72c1060fc00718b3f77f330ad1bd6f44039320acf418c8d" Mar 08 04:04:12.122139 master-0 kubenswrapper[18592]: E0308 04:04:12.122068 18592 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d0d18c78b2f0279d72c1060fc00718b3f77f330ad1bd6f44039320acf418c8d\": container with ID starting with 4d0d18c78b2f0279d72c1060fc00718b3f77f330ad1bd6f44039320acf418c8d not found: ID does not exist" containerID="4d0d18c78b2f0279d72c1060fc00718b3f77f330ad1bd6f44039320acf418c8d" Mar 08 04:04:12.122333 master-0 kubenswrapper[18592]: I0308 04:04:12.122128 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d0d18c78b2f0279d72c1060fc00718b3f77f330ad1bd6f44039320acf418c8d"} err="failed to get container status \"4d0d18c78b2f0279d72c1060fc00718b3f77f330ad1bd6f44039320acf418c8d\": rpc error: code = NotFound desc = could not find container \"4d0d18c78b2f0279d72c1060fc00718b3f77f330ad1bd6f44039320acf418c8d\": container with ID starting with 4d0d18c78b2f0279d72c1060fc00718b3f77f330ad1bd6f44039320acf418c8d not found: ID does not exist" Mar 08 04:04:12.122333 master-0 kubenswrapper[18592]: I0308 04:04:12.122170 18592 scope.go:117] "RemoveContainer" containerID="da6fb8324087a24afc43afc026ff79a419163b34f188875acd9bb6ca7456a181" Mar 08 04:04:12.122600 master-0 kubenswrapper[18592]: E0308 04:04:12.122568 18592 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da6fb8324087a24afc43afc026ff79a419163b34f188875acd9bb6ca7456a181\": container with ID starting with da6fb8324087a24afc43afc026ff79a419163b34f188875acd9bb6ca7456a181 not found: ID does not exist" containerID="da6fb8324087a24afc43afc026ff79a419163b34f188875acd9bb6ca7456a181" Mar 08 04:04:12.122720 master-0 kubenswrapper[18592]: I0308 04:04:12.122602 18592 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"da6fb8324087a24afc43afc026ff79a419163b34f188875acd9bb6ca7456a181"} err="failed to get container status \"da6fb8324087a24afc43afc026ff79a419163b34f188875acd9bb6ca7456a181\": rpc error: code = NotFound desc = could not find container \"da6fb8324087a24afc43afc026ff79a419163b34f188875acd9bb6ca7456a181\": container with ID starting with da6fb8324087a24afc43afc026ff79a419163b34f188875acd9bb6ca7456a181 not found: ID does not exist" Mar 08 04:04:12.122720 master-0 kubenswrapper[18592]: I0308 04:04:12.122626 18592 scope.go:117] "RemoveContainer" containerID="3d937ddeff4a562c97267b5e3e7157a51ad8f928d1db69fb1e607bd363c3cf25" Mar 08 04:04:12.123142 master-0 kubenswrapper[18592]: E0308 04:04:12.123097 18592 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d937ddeff4a562c97267b5e3e7157a51ad8f928d1db69fb1e607bd363c3cf25\": container with ID starting with 3d937ddeff4a562c97267b5e3e7157a51ad8f928d1db69fb1e607bd363c3cf25 not found: ID does not exist" containerID="3d937ddeff4a562c97267b5e3e7157a51ad8f928d1db69fb1e607bd363c3cf25" Mar 08 04:04:12.123232 master-0 kubenswrapper[18592]: I0308 04:04:12.123139 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d937ddeff4a562c97267b5e3e7157a51ad8f928d1db69fb1e607bd363c3cf25"} err="failed to get container status \"3d937ddeff4a562c97267b5e3e7157a51ad8f928d1db69fb1e607bd363c3cf25\": rpc error: code = NotFound desc = could not find container \"3d937ddeff4a562c97267b5e3e7157a51ad8f928d1db69fb1e607bd363c3cf25\": container with ID starting with 3d937ddeff4a562c97267b5e3e7157a51ad8f928d1db69fb1e607bd363c3cf25 not found: ID does not exist" Mar 08 04:04:12.123232 master-0 kubenswrapper[18592]: I0308 04:04:12.123165 18592 scope.go:117] "RemoveContainer" containerID="b4a05e47377b25ea4d9a9af26e5631afe65dee09a108205893226dbc248e594c" Mar 08 04:04:12.123646 master-0 kubenswrapper[18592]: E0308 
04:04:12.123623 18592 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4a05e47377b25ea4d9a9af26e5631afe65dee09a108205893226dbc248e594c\": container with ID starting with b4a05e47377b25ea4d9a9af26e5631afe65dee09a108205893226dbc248e594c not found: ID does not exist" containerID="b4a05e47377b25ea4d9a9af26e5631afe65dee09a108205893226dbc248e594c" Mar 08 04:04:12.123739 master-0 kubenswrapper[18592]: I0308 04:04:12.123643 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4a05e47377b25ea4d9a9af26e5631afe65dee09a108205893226dbc248e594c"} err="failed to get container status \"b4a05e47377b25ea4d9a9af26e5631afe65dee09a108205893226dbc248e594c\": rpc error: code = NotFound desc = could not find container \"b4a05e47377b25ea4d9a9af26e5631afe65dee09a108205893226dbc248e594c\": container with ID starting with b4a05e47377b25ea4d9a9af26e5631afe65dee09a108205893226dbc248e594c not found: ID does not exist" Mar 08 04:04:12.123739 master-0 kubenswrapper[18592]: I0308 04:04:12.123658 18592 scope.go:117] "RemoveContainer" containerID="101e91519fa54cd23690f6a36dea77798470079f4382e04ac20e58f00a7180da" Mar 08 04:04:12.123981 master-0 kubenswrapper[18592]: E0308 04:04:12.123891 18592 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"101e91519fa54cd23690f6a36dea77798470079f4382e04ac20e58f00a7180da\": container with ID starting with 101e91519fa54cd23690f6a36dea77798470079f4382e04ac20e58f00a7180da not found: ID does not exist" containerID="101e91519fa54cd23690f6a36dea77798470079f4382e04ac20e58f00a7180da" Mar 08 04:04:12.123981 master-0 kubenswrapper[18592]: I0308 04:04:12.123912 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"101e91519fa54cd23690f6a36dea77798470079f4382e04ac20e58f00a7180da"} err="failed to get container status 
\"101e91519fa54cd23690f6a36dea77798470079f4382e04ac20e58f00a7180da\": rpc error: code = NotFound desc = could not find container \"101e91519fa54cd23690f6a36dea77798470079f4382e04ac20e58f00a7180da\": container with ID starting with 101e91519fa54cd23690f6a36dea77798470079f4382e04ac20e58f00a7180da not found: ID does not exist" Mar 08 04:04:12.123981 master-0 kubenswrapper[18592]: I0308 04:04:12.123925 18592 scope.go:117] "RemoveContainer" containerID="a35597610df1fae247e29fc70e7550b17b768468fb6d442a71528493ce8f3635" Mar 08 04:04:12.124333 master-0 kubenswrapper[18592]: E0308 04:04:12.124147 18592 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a35597610df1fae247e29fc70e7550b17b768468fb6d442a71528493ce8f3635\": container with ID starting with a35597610df1fae247e29fc70e7550b17b768468fb6d442a71528493ce8f3635 not found: ID does not exist" containerID="a35597610df1fae247e29fc70e7550b17b768468fb6d442a71528493ce8f3635" Mar 08 04:04:12.124333 master-0 kubenswrapper[18592]: I0308 04:04:12.124164 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a35597610df1fae247e29fc70e7550b17b768468fb6d442a71528493ce8f3635"} err="failed to get container status \"a35597610df1fae247e29fc70e7550b17b768468fb6d442a71528493ce8f3635\": rpc error: code = NotFound desc = could not find container \"a35597610df1fae247e29fc70e7550b17b768468fb6d442a71528493ce8f3635\": container with ID starting with a35597610df1fae247e29fc70e7550b17b768468fb6d442a71528493ce8f3635 not found: ID does not exist" Mar 08 04:04:12.151481 master-0 kubenswrapper[18592]: I0308 04:04:12.151413 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="077dd10388b9e3e48a07382126e86621" path="/var/lib/kubelet/pods/077dd10388b9e3e48a07382126e86621/volumes" Mar 08 04:04:13.303361 master-0 kubenswrapper[18592]: I0308 04:04:13.303276 18592 patch_prober.go:28] interesting 
pod/console-7748864899-8p6h5 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body= Mar 08 04:04:13.304301 master-0 kubenswrapper[18592]: I0308 04:04:13.303375 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-7748864899-8p6h5" podUID="5bc0469d-1ae9-4606-ba99-7a333b66af37" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" Mar 08 04:04:13.746201 master-0 kubenswrapper[18592]: E0308 04:04:13.746098 18592 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-podb73e6391_2d8f_46e1_b275_a9106a730d60.slice/crio-d9bb3f958d2b5e248ca9acef3858fad3705cf981affbc68bf5ea66794147a61c.scope\": RecentStats: unable to find data in memory cache]" Mar 08 04:04:17.129116 master-0 kubenswrapper[18592]: E0308 04:04:17.128928 18592 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-master-0.189ac1f2203726c1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-master-0,UID:a814bd60de133d95cf99630a978c017e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5500329ab50804678fb8a90b96bf2a469bca16b620fb6dd2f5f5a17106e94898\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 04:04:08.990443201 +0000 UTC 
m=+661.089197591,LastTimestamp:2026-03-08 04:04:08.990443201 +0000 UTC m=+661.089197591,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 04:04:18.149099 master-0 kubenswrapper[18592]: I0308 04:04:18.148948 18592 status_manager.go:851] "Failed to get status for pod" podUID="b73e6391-2d8f-46e1-b275-a9106a730d60" pod="openshift-kube-apiserver/installer-5-retry-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-retry-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 04:04:18.357807 master-0 kubenswrapper[18592]: E0308 04:04:18.357699 18592 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 04:04:18.358764 master-0 kubenswrapper[18592]: E0308 04:04:18.358667 18592 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 04:04:18.360401 master-0 kubenswrapper[18592]: E0308 04:04:18.360331 18592 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 04:04:18.361463 master-0 kubenswrapper[18592]: E0308 04:04:18.361388 18592 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 04:04:18.362457 master-0 
kubenswrapper[18592]: E0308 04:04:18.362377 18592 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 04:04:18.362457 master-0 kubenswrapper[18592]: I0308 04:04:18.362431 18592 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 08 04:04:18.363273 master-0 kubenswrapper[18592]: E0308 04:04:18.363203 18592 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="200ms" Mar 08 04:04:18.564804 master-0 kubenswrapper[18592]: E0308 04:04:18.564698 18592 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="400ms" Mar 08 04:04:18.966747 master-0 kubenswrapper[18592]: E0308 04:04:18.966659 18592 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="800ms" Mar 08 04:04:19.520707 master-0 kubenswrapper[18592]: E0308 04:04:19.520633 18592 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-podb73e6391_2d8f_46e1_b275_a9106a730d60.slice/crio-d9bb3f958d2b5e248ca9acef3858fad3705cf981affbc68bf5ea66794147a61c.scope\": RecentStats: unable to find data in memory cache]" Mar 08 04:04:19.770193 master-0 
kubenswrapper[18592]: E0308 04:04:19.770099 18592 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="1.6s" Mar 08 04:04:20.677817 master-0 kubenswrapper[18592]: I0308 04:04:20.677738 18592 patch_prober.go:28] interesting pod/console-744db48f96-lgsd4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" start-of-body= Mar 08 04:04:20.678696 master-0 kubenswrapper[18592]: I0308 04:04:20.677844 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-744db48f96-lgsd4" podUID="935ab7fb-b097-41c3-8926-8343eb29e7fc" containerName="console" probeResult="failure" output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" Mar 08 04:04:21.372115 master-0 kubenswrapper[18592]: E0308 04:04:21.372013 18592 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="3.2s" Mar 08 04:04:22.105889 master-0 kubenswrapper[18592]: I0308 04:04:22.105761 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Mar 08 04:04:22.143248 master-0 kubenswrapper[18592]: I0308 04:04:22.143179 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 04:04:22.145573 master-0 kubenswrapper[18592]: I0308 04:04:22.145474 18592 status_manager.go:851] "Failed to get status for pod" podUID="b73e6391-2d8f-46e1-b275-a9106a730d60" pod="openshift-kube-apiserver/installer-5-retry-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-retry-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 04:04:22.160744 master-0 kubenswrapper[18592]: I0308 04:04:22.160683 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Mar 08 04:04:22.162455 master-0 kubenswrapper[18592]: I0308 04:04:22.162388 18592 status_manager.go:851] "Failed to get status for pod" podUID="b73e6391-2d8f-46e1-b275-a9106a730d60" pod="openshift-kube-apiserver/installer-5-retry-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-retry-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 04:04:22.163625 master-0 kubenswrapper[18592]: I0308 04:04:22.163561 18592 status_manager.go:851] "Failed to get status for pod" podUID="0eece71b-beb6-49f4-96cd-6b7476337ded" pod="openshift-monitoring/prometheus-k8s-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-monitoring/pods/prometheus-k8s-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 04:04:22.176971 master-0 kubenswrapper[18592]: I0308 04:04:22.176919 18592 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="8266feeb-c6af-4652-9055-79274b93e9bb" Mar 08 04:04:22.177250 master-0 kubenswrapper[18592]: I0308 04:04:22.177225 18592 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="8266feeb-c6af-4652-9055-79274b93e9bb" Mar 08 
04:04:22.178517 master-0 kubenswrapper[18592]: E0308 04:04:22.178464 18592 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 04:04:22.179499 master-0 kubenswrapper[18592]: I0308 04:04:22.179465 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 04:04:22.225401 master-0 kubenswrapper[18592]: W0308 04:04:22.225334 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36d4251d3504cdc0ec85144c1379056c.slice/crio-3349bf5d116616a16a9fd52b348224bd6bd7a7505bd4a143c53f90c41fcf208f WatchSource:0}: Error finding container 3349bf5d116616a16a9fd52b348224bd6bd7a7505bd4a143c53f90c41fcf208f: Status 404 returned error can't find the container with id 3349bf5d116616a16a9fd52b348224bd6bd7a7505bd4a143c53f90c41fcf208f Mar 08 04:04:23.101565 master-0 kubenswrapper[18592]: I0308 04:04:23.101486 18592 generic.go:334] "Generic (PLEG): container finished" podID="36d4251d3504cdc0ec85144c1379056c" containerID="4b9350bd09ba462ca556537b05e5b835598cec9ba027dbfdef5a242e44464ee6" exitCode=0 Mar 08 04:04:23.101990 master-0 kubenswrapper[18592]: I0308 04:04:23.101629 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"36d4251d3504cdc0ec85144c1379056c","Type":"ContainerDied","Data":"4b9350bd09ba462ca556537b05e5b835598cec9ba027dbfdef5a242e44464ee6"} Mar 08 04:04:23.101990 master-0 kubenswrapper[18592]: I0308 04:04:23.101694 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" 
event={"ID":"36d4251d3504cdc0ec85144c1379056c","Type":"ContainerStarted","Data":"3349bf5d116616a16a9fd52b348224bd6bd7a7505bd4a143c53f90c41fcf208f"} Mar 08 04:04:23.102635 master-0 kubenswrapper[18592]: I0308 04:04:23.102542 18592 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="8266feeb-c6af-4652-9055-79274b93e9bb" Mar 08 04:04:23.102635 master-0 kubenswrapper[18592]: I0308 04:04:23.102605 18592 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="8266feeb-c6af-4652-9055-79274b93e9bb" Mar 08 04:04:23.103630 master-0 kubenswrapper[18592]: I0308 04:04:23.103566 18592 status_manager.go:851] "Failed to get status for pod" podUID="0eece71b-beb6-49f4-96cd-6b7476337ded" pod="openshift-monitoring/prometheus-k8s-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-monitoring/pods/prometheus-k8s-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 04:04:23.103801 master-0 kubenswrapper[18592]: E0308 04:04:23.103604 18592 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 04:04:23.104549 master-0 kubenswrapper[18592]: I0308 04:04:23.104503 18592 status_manager.go:851] "Failed to get status for pod" podUID="b73e6391-2d8f-46e1-b275-a9106a730d60" pod="openshift-kube-apiserver/installer-5-retry-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-retry-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 04:04:23.149977 master-0 kubenswrapper[18592]: I0308 04:04:23.149882 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" 
Mar 08 04:04:23.151334 master-0 kubenswrapper[18592]: I0308 04:04:23.151260 18592 status_manager.go:851] "Failed to get status for pod" podUID="0eece71b-beb6-49f4-96cd-6b7476337ded" pod="openshift-monitoring/prometheus-k8s-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-monitoring/pods/prometheus-k8s-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 04:04:23.152348 master-0 kubenswrapper[18592]: I0308 04:04:23.152266 18592 status_manager.go:851] "Failed to get status for pod" podUID="b73e6391-2d8f-46e1-b275-a9106a730d60" pod="openshift-kube-apiserver/installer-5-retry-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-retry-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 04:04:23.303185 master-0 kubenswrapper[18592]: I0308 04:04:23.303065 18592 patch_prober.go:28] interesting pod/console-7748864899-8p6h5 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body= Mar 08 04:04:23.303497 master-0 kubenswrapper[18592]: I0308 04:04:23.303192 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-7748864899-8p6h5" podUID="5bc0469d-1ae9-4606-ba99-7a333b66af37" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" Mar 08 04:04:24.122062 master-0 kubenswrapper[18592]: I0308 04:04:24.121989 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"36d4251d3504cdc0ec85144c1379056c","Type":"ContainerStarted","Data":"3713de753ef3ab2d50930bfb051dd8a580f1bd856ddf5d3dc98cb23cdb86b001"} Mar 08 04:04:24.122062 master-0 kubenswrapper[18592]: I0308 04:04:24.122058 18592 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"36d4251d3504cdc0ec85144c1379056c","Type":"ContainerStarted","Data":"07d09dcd8294b63a56b10b56cf3053a6900c16dfd6ebd0de847aa01eec409ddb"} Mar 08 04:04:25.130545 master-0 kubenswrapper[18592]: I0308 04:04:25.130501 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"36d4251d3504cdc0ec85144c1379056c","Type":"ContainerStarted","Data":"dddd68873c70f8c2f095ee28526110fd67d4ffc1c54008b66fb905afd566a4cd"} Mar 08 04:04:25.130545 master-0 kubenswrapper[18592]: I0308 04:04:25.130546 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"36d4251d3504cdc0ec85144c1379056c","Type":"ContainerStarted","Data":"47c10d53ad455568e72e01d669a2a22c661d6400a717b2ceacee48e33aaf95a8"} Mar 08 04:04:25.130545 master-0 kubenswrapper[18592]: I0308 04:04:25.130556 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"36d4251d3504cdc0ec85144c1379056c","Type":"ContainerStarted","Data":"a00f865b19714a93946adf17c679a505afbedcb972b9aef45b924955d6f55ddb"} Mar 08 04:04:25.131165 master-0 kubenswrapper[18592]: I0308 04:04:25.130682 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 04:04:25.131165 master-0 kubenswrapper[18592]: I0308 04:04:25.130796 18592 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="8266feeb-c6af-4652-9055-79274b93e9bb" Mar 08 04:04:25.131165 master-0 kubenswrapper[18592]: I0308 04:04:25.130889 18592 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="8266feeb-c6af-4652-9055-79274b93e9bb" Mar 08 04:04:27.180756 master-0 kubenswrapper[18592]: I0308 04:04:27.180556 18592 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 04:04:27.180756 master-0 kubenswrapper[18592]: I0308 04:04:27.180751 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 04:04:27.185981 master-0 kubenswrapper[18592]: I0308 04:04:27.185851 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 04:04:28.909363 master-0 kubenswrapper[18592]: E0308 04:04:28.909295 18592 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-podb73e6391_2d8f_46e1_b275_a9106a730d60.slice/crio-d9bb3f958d2b5e248ca9acef3858fad3705cf981affbc68bf5ea66794147a61c.scope\": RecentStats: unable to find data in memory cache]" Mar 08 04:04:29.571607 master-0 kubenswrapper[18592]: E0308 04:04:29.570743 18592 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-podb73e6391_2d8f_46e1_b275_a9106a730d60.slice/crio-d9bb3f958d2b5e248ca9acef3858fad3705cf981affbc68bf5ea66794147a61c.scope\": RecentStats: unable to find data in memory cache]" Mar 08 04:04:30.152675 master-0 kubenswrapper[18592]: I0308 04:04:30.152583 18592 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 04:04:30.183861 master-0 kubenswrapper[18592]: I0308 04:04:30.183312 18592 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="8266feeb-c6af-4652-9055-79274b93e9bb" Mar 08 04:04:30.183861 master-0 kubenswrapper[18592]: I0308 04:04:30.183365 18592 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="8266feeb-c6af-4652-9055-79274b93e9bb" Mar 08 04:04:30.188921 master-0 kubenswrapper[18592]: I0308 
04:04:30.188068 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 04:04:30.214140 master-0 kubenswrapper[18592]: I0308 04:04:30.214047 18592 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-0" oldPodUID="36d4251d3504cdc0ec85144c1379056c" podUID="a9b3d921-fbe9-4fae-8f8d-9009087bd5ab" Mar 08 04:04:30.676493 master-0 kubenswrapper[18592]: I0308 04:04:30.676443 18592 patch_prober.go:28] interesting pod/console-744db48f96-lgsd4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" start-of-body= Mar 08 04:04:30.676947 master-0 kubenswrapper[18592]: I0308 04:04:30.676885 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-744db48f96-lgsd4" podUID="935ab7fb-b097-41c3-8926-8343eb29e7fc" containerName="console" probeResult="failure" output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" Mar 08 04:04:31.197412 master-0 kubenswrapper[18592]: I0308 04:04:31.197326 18592 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="8266feeb-c6af-4652-9055-79274b93e9bb" Mar 08 04:04:31.197412 master-0 kubenswrapper[18592]: I0308 04:04:31.197379 18592 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="8266feeb-c6af-4652-9055-79274b93e9bb" Mar 08 04:04:33.302776 master-0 kubenswrapper[18592]: I0308 04:04:33.302680 18592 patch_prober.go:28] interesting pod/console-7748864899-8p6h5 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body= Mar 08 04:04:33.302776 master-0 
kubenswrapper[18592]: I0308 04:04:33.302768 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-7748864899-8p6h5" podUID="5bc0469d-1ae9-4606-ba99-7a333b66af37" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" Mar 08 04:04:37.700599 master-0 kubenswrapper[18592]: E0308 04:04:37.700506 18592 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-podb73e6391_2d8f_46e1_b275_a9106a730d60.slice/crio-d9bb3f958d2b5e248ca9acef3858fad3705cf981affbc68bf5ea66794147a61c.scope\": RecentStats: unable to find data in memory cache]" Mar 08 04:04:37.701633 master-0 kubenswrapper[18592]: E0308 04:04:37.700640 18592 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-podb73e6391_2d8f_46e1_b275_a9106a730d60.slice/crio-d9bb3f958d2b5e248ca9acef3858fad3705cf981affbc68bf5ea66794147a61c.scope\": RecentStats: unable to find data in memory cache]" Mar 08 04:04:38.167899 master-0 kubenswrapper[18592]: I0308 04:04:38.167737 18592 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-0" oldPodUID="36d4251d3504cdc0ec85144c1379056c" podUID="a9b3d921-fbe9-4fae-8f8d-9009087bd5ab" Mar 08 04:04:39.633621 master-0 kubenswrapper[18592]: E0308 04:04:39.633519 18592 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-podb73e6391_2d8f_46e1_b275_a9106a730d60.slice/crio-d9bb3f958d2b5e248ca9acef3858fad3705cf981affbc68bf5ea66794147a61c.scope\": RecentStats: unable to find data in memory cache]" Mar 08 04:04:39.664640 master-0 kubenswrapper[18592]: I0308 04:04:39.664560 18592 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 08 04:04:39.888748 master-0 kubenswrapper[18592]: I0308 04:04:39.888581 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 08 04:04:39.941868 master-0 kubenswrapper[18592]: I0308 04:04:39.941763 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-cert" Mar 08 04:04:40.216487 master-0 kubenswrapper[18592]: I0308 04:04:40.216390 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls" Mar 08 04:04:40.223571 master-0 kubenswrapper[18592]: I0308 04:04:40.223517 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 08 04:04:40.394021 master-0 kubenswrapper[18592]: I0308 04:04:40.393974 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 08 04:04:40.520784 master-0 kubenswrapper[18592]: I0308 04:04:40.520620 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 08 04:04:40.565680 master-0 kubenswrapper[18592]: I0308 04:04:40.565585 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 08 04:04:40.598417 master-0 kubenswrapper[18592]: I0308 04:04:40.598337 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 08 04:04:40.680404 master-0 kubenswrapper[18592]: I0308 04:04:40.676908 18592 patch_prober.go:28] interesting pod/console-744db48f96-lgsd4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" 
start-of-body= Mar 08 04:04:40.680404 master-0 kubenswrapper[18592]: I0308 04:04:40.677012 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-744db48f96-lgsd4" podUID="935ab7fb-b097-41c3-8926-8343eb29e7fc" containerName="console" probeResult="failure" output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" Mar 08 04:04:40.733676 master-0 kubenswrapper[18592]: I0308 04:04:40.733582 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-55li3r6nupslu" Mar 08 04:04:40.944860 master-0 kubenswrapper[18592]: I0308 04:04:40.944756 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 08 04:04:41.155550 master-0 kubenswrapper[18592]: I0308 04:04:41.155472 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 08 04:04:41.361292 master-0 kubenswrapper[18592]: I0308 04:04:41.361133 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-web" Mar 08 04:04:41.363637 master-0 kubenswrapper[18592]: I0308 04:04:41.362094 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles" Mar 08 04:04:41.406405 master-0 kubenswrapper[18592]: I0308 04:04:41.406268 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 08 04:04:41.572187 master-0 kubenswrapper[18592]: I0308 04:04:41.572028 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 08 04:04:41.682381 master-0 kubenswrapper[18592]: I0308 04:04:41.682205 18592 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 08 04:04:41.952768 master-0 kubenswrapper[18592]: I0308 04:04:41.952686 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt" Mar 08 04:04:41.962513 master-0 kubenswrapper[18592]: I0308 04:04:41.962006 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy" Mar 08 04:04:42.012450 master-0 kubenswrapper[18592]: I0308 04:04:42.012387 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-dockercfg-rnqsm" Mar 08 04:04:42.018552 master-0 kubenswrapper[18592]: I0308 04:04:42.018493 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 08 04:04:42.185300 master-0 kubenswrapper[18592]: I0308 04:04:42.185222 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap" Mar 08 04:04:42.269928 master-0 kubenswrapper[18592]: I0308 04:04:42.269750 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls" Mar 08 04:04:42.351487 master-0 kubenswrapper[18592]: I0308 04:04:42.351413 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 08 04:04:42.446849 master-0 kubenswrapper[18592]: I0308 04:04:42.444351 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 08 04:04:42.446849 master-0 kubenswrapper[18592]: I0308 04:04:42.446214 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 08 04:04:42.475846 master-0 kubenswrapper[18592]: I0308 04:04:42.475572 18592 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 08 04:04:42.478042 master-0 kubenswrapper[18592]: I0308 04:04:42.476233 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 08 04:04:42.485320 master-0 kubenswrapper[18592]: I0308 04:04:42.485276 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 08 04:04:42.574022 master-0 kubenswrapper[18592]: I0308 04:04:42.573886 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 08 04:04:42.585849 master-0 kubenswrapper[18592]: I0308 04:04:42.585802 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 08 04:04:42.591877 master-0 kubenswrapper[18592]: I0308 04:04:42.591810 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt" Mar 08 04:04:42.663402 master-0 kubenswrapper[18592]: I0308 04:04:42.663342 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 08 04:04:42.685574 master-0 kubenswrapper[18592]: I0308 04:04:42.685503 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy-cluster-autoscaler-operator" Mar 08 04:04:42.903470 master-0 kubenswrapper[18592]: I0308 04:04:42.903343 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-grpc-tls-3ipfg9gfac918" Mar 08 04:04:42.976649 master-0 kubenswrapper[18592]: I0308 04:04:42.976568 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-dockercfg-65bbw" Mar 08 04:04:43.061182 master-0 kubenswrapper[18592]: I0308 04:04:43.061062 18592 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 08 04:04:43.061182 master-0 kubenswrapper[18592]: I0308 04:04:43.061135 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-n6zd8" Mar 08 04:04:43.207866 master-0 kubenswrapper[18592]: I0308 04:04:43.207740 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 08 04:04:43.300149 master-0 kubenswrapper[18592]: I0308 04:04:43.300092 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"openshift-insights-serving-cert" Mar 08 04:04:43.301143 master-0 kubenswrapper[18592]: I0308 04:04:43.301071 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 08 04:04:43.302671 master-0 kubenswrapper[18592]: I0308 04:04:43.302621 18592 patch_prober.go:28] interesting pod/console-7748864899-8p6h5 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body= Mar 08 04:04:43.302812 master-0 kubenswrapper[18592]: I0308 04:04:43.302700 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-7748864899-8p6h5" podUID="5bc0469d-1ae9-4606-ba99-7a333b66af37" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" Mar 08 04:04:43.376585 master-0 kubenswrapper[18592]: I0308 04:04:43.376484 18592 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 08 04:04:43.400947 master-0 kubenswrapper[18592]: I0308 04:04:43.400806 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 08 04:04:43.534030 master-0 
kubenswrapper[18592]: I0308 04:04:43.533880 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-dockercfg-4rdlv" Mar 08 04:04:43.682740 master-0 kubenswrapper[18592]: I0308 04:04:43.682694 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 08 04:04:43.743473 master-0 kubenswrapper[18592]: E0308 04:04:43.743383 18592 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-podb73e6391_2d8f_46e1_b275_a9106a730d60.slice/crio-d9bb3f958d2b5e248ca9acef3858fad3705cf981affbc68bf5ea66794147a61c.scope\": RecentStats: unable to find data in memory cache]" Mar 08 04:04:43.772090 master-0 kubenswrapper[18592]: I0308 04:04:43.771761 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 08 04:04:43.870065 master-0 kubenswrapper[18592]: I0308 04:04:43.869768 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 08 04:04:43.907877 master-0 kubenswrapper[18592]: I0308 04:04:43.907785 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 08 04:04:43.910274 master-0 kubenswrapper[18592]: I0308 04:04:43.910194 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt" Mar 08 04:04:44.000920 master-0 kubenswrapper[18592]: I0308 04:04:44.000864 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s" Mar 08 04:04:44.056719 master-0 kubenswrapper[18592]: I0308 04:04:44.056285 18592 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 08 04:04:44.077763 master-0 kubenswrapper[18592]: I0308 04:04:44.077699 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 08 04:04:44.187003 master-0 kubenswrapper[18592]: I0308 04:04:44.186933 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-hvbwb" Mar 08 04:04:44.355389 master-0 kubenswrapper[18592]: I0308 04:04:44.355314 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-rules" Mar 08 04:04:44.505218 master-0 kubenswrapper[18592]: I0308 04:04:44.505069 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 08 04:04:44.585840 master-0 kubenswrapper[18592]: I0308 04:04:44.585785 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 08 04:04:44.587996 master-0 kubenswrapper[18592]: I0308 04:04:44.587919 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-dockercfg-snc9w" Mar 08 04:04:44.732568 master-0 kubenswrapper[18592]: I0308 04:04:44.732497 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"openshift-service-ca.crt" Mar 08 04:04:44.755656 master-0 kubenswrapper[18592]: I0308 04:04:44.755503 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 08 04:04:44.818781 master-0 kubenswrapper[18592]: I0308 04:04:44.818688 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 08 04:04:44.852019 master-0 kubenswrapper[18592]: I0308 
04:04:44.851960 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-4nfrl" Mar 08 04:04:44.919031 master-0 kubenswrapper[18592]: I0308 04:04:44.918961 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 08 04:04:45.144967 master-0 kubenswrapper[18592]: I0308 04:04:45.144779 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 08 04:04:45.145747 master-0 kubenswrapper[18592]: I0308 04:04:45.145692 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-hbsn4" Mar 08 04:04:45.151354 master-0 kubenswrapper[18592]: I0308 04:04:45.151285 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert" Mar 08 04:04:45.153786 master-0 kubenswrapper[18592]: I0308 04:04:45.153703 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 08 04:04:45.159677 master-0 kubenswrapper[18592]: I0308 04:04:45.159616 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-web-config" Mar 08 04:04:45.228040 master-0 kubenswrapper[18592]: I0308 04:04:45.227959 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 08 04:04:45.231864 master-0 kubenswrapper[18592]: I0308 04:04:45.231789 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 08 04:04:45.250905 master-0 kubenswrapper[18592]: I0308 04:04:45.250816 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"serving-certs-ca-bundle" Mar 08 04:04:45.339720 master-0 
kubenswrapper[18592]: I0308 04:04:45.339625 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 08 04:04:45.373193 master-0 kubenswrapper[18592]: I0308 04:04:45.373139 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-92fqc" Mar 08 04:04:45.494717 master-0 kubenswrapper[18592]: I0308 04:04:45.494648 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 08 04:04:45.510959 master-0 kubenswrapper[18592]: I0308 04:04:45.510885 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 08 04:04:45.511165 master-0 kubenswrapper[18592]: I0308 04:04:45.511065 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 08 04:04:45.698818 master-0 kubenswrapper[18592]: I0308 04:04:45.698760 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs" Mar 08 04:04:45.700452 master-0 kubenswrapper[18592]: I0308 04:04:45.700413 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt" Mar 08 04:04:45.754727 master-0 kubenswrapper[18592]: I0308 04:04:45.754581 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 08 04:04:45.826649 master-0 kubenswrapper[18592]: I0308 04:04:45.826583 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 08 04:04:45.857725 master-0 kubenswrapper[18592]: I0308 04:04:45.857641 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 08 
04:04:45.864186 master-0 kubenswrapper[18592]: I0308 04:04:45.864149 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-serving-cert" Mar 08 04:04:45.923694 master-0 kubenswrapper[18592]: I0308 04:04:45.923619 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 08 04:04:45.972233 master-0 kubenswrapper[18592]: I0308 04:04:45.972121 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 08 04:04:46.027984 master-0 kubenswrapper[18592]: I0308 04:04:46.027253 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 08 04:04:46.060553 master-0 kubenswrapper[18592]: I0308 04:04:46.060466 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-dockercfg-cg9nh" Mar 08 04:04:46.069188 master-0 kubenswrapper[18592]: I0308 04:04:46.069024 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-27jjw" Mar 08 04:04:46.069382 master-0 kubenswrapper[18592]: I0308 04:04:46.069197 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 08 04:04:46.082446 master-0 kubenswrapper[18592]: I0308 04:04:46.082343 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 08 04:04:46.088425 master-0 kubenswrapper[18592]: I0308 04:04:46.088361 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 08 04:04:46.089503 master-0 kubenswrapper[18592]: I0308 04:04:46.089450 18592 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 08 04:04:46.175943 master-0 kubenswrapper[18592]: I0308 04:04:46.175187 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-web" Mar 08 04:04:46.263714 master-0 kubenswrapper[18592]: I0308 04:04:46.263624 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 08 04:04:46.265582 master-0 kubenswrapper[18592]: I0308 04:04:46.265520 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 08 04:04:46.270939 master-0 kubenswrapper[18592]: I0308 04:04:46.270882 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-web-config" Mar 08 04:04:46.281987 master-0 kubenswrapper[18592]: I0308 04:04:46.281852 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 08 04:04:46.286760 master-0 kubenswrapper[18592]: I0308 04:04:46.286280 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"openshift-service-ca.crt" Mar 08 04:04:46.327340 master-0 kubenswrapper[18592]: I0308 04:04:46.327232 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 08 04:04:46.432744 master-0 kubenswrapper[18592]: I0308 04:04:46.432656 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 08 04:04:46.448503 master-0 kubenswrapper[18592]: I0308 04:04:46.448432 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 08 04:04:46.464048 master-0 kubenswrapper[18592]: I0308 04:04:46.463983 18592 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console"/"service-ca" Mar 08 04:04:46.617801 master-0 kubenswrapper[18592]: I0308 04:04:46.617661 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 08 04:04:46.645342 master-0 kubenswrapper[18592]: I0308 04:04:46.645253 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 08 04:04:46.674972 master-0 kubenswrapper[18592]: I0308 04:04:46.674915 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 08 04:04:46.694913 master-0 kubenswrapper[18592]: I0308 04:04:46.694873 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 08 04:04:46.753688 master-0 kubenswrapper[18592]: I0308 04:04:46.753631 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 08 04:04:46.772609 master-0 kubenswrapper[18592]: I0308 04:04:46.772520 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 08 04:04:46.784244 master-0 kubenswrapper[18592]: I0308 04:04:46.784195 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"kube-root-ca.crt" Mar 08 04:04:46.817582 master-0 kubenswrapper[18592]: I0308 04:04:46.817498 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 08 04:04:46.880637 master-0 kubenswrapper[18592]: I0308 04:04:46.880472 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 08 04:04:46.926140 master-0 kubenswrapper[18592]: I0308 04:04:46.924655 18592 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 08 04:04:46.977443 master-0 kubenswrapper[18592]: I0308 04:04:46.977374 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-tls" Mar 08 04:04:47.088535 master-0 kubenswrapper[18592]: I0308 04:04:47.088452 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 08 04:04:47.112942 master-0 kubenswrapper[18592]: I0308 04:04:47.112873 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls" Mar 08 04:04:47.127759 master-0 kubenswrapper[18592]: I0308 04:04:47.127668 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-7cjnr" Mar 08 04:04:47.144669 master-0 kubenswrapper[18592]: I0308 04:04:47.144434 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 08 04:04:47.167883 master-0 kubenswrapper[18592]: I0308 04:04:47.167784 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy" Mar 08 04:04:47.208446 master-0 kubenswrapper[18592]: I0308 04:04:47.208359 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 08 04:04:47.222024 master-0 kubenswrapper[18592]: I0308 04:04:47.221872 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 08 04:04:47.229142 master-0 kubenswrapper[18592]: I0308 04:04:47.229088 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"operator-controller-trusted-ca-bundle" Mar 08 04:04:47.236228 master-0 kubenswrapper[18592]: I0308 04:04:47.236185 18592 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config" Mar 08 04:04:47.416672 master-0 kubenswrapper[18592]: I0308 04:04:47.416495 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 08 04:04:47.466763 master-0 kubenswrapper[18592]: I0308 04:04:47.466686 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"kube-root-ca.crt" Mar 08 04:04:47.516529 master-0 kubenswrapper[18592]: I0308 04:04:47.516458 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 08 04:04:47.616534 master-0 kubenswrapper[18592]: I0308 04:04:47.616449 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 08 04:04:47.707770 master-0 kubenswrapper[18592]: I0308 04:04:47.707694 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 08 04:04:47.800795 master-0 kubenswrapper[18592]: I0308 04:04:47.800682 18592 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 08 04:04:47.807198 master-0 kubenswrapper[18592]: I0308 04:04:47.807106 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Mar 08 04:04:47.807377 master-0 kubenswrapper[18592]: I0308 04:04:47.807250 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Mar 08 04:04:47.812129 master-0 kubenswrapper[18592]: I0308 04:04:47.812012 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 04:04:47.821420 master-0 kubenswrapper[18592]: I0308 04:04:47.821362 18592 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-operator-controller"/"kube-root-ca.crt" Mar 08 04:04:47.841087 master-0 kubenswrapper[18592]: I0308 04:04:47.841000 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-master-0" podStartSLOduration=17.840972103 podStartE2EDuration="17.840972103s" podCreationTimestamp="2026-03-08 04:04:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 04:04:47.837919233 +0000 UTC m=+699.936673623" watchObservedRunningTime="2026-03-08 04:04:47.840972103 +0000 UTC m=+699.939726493" Mar 08 04:04:47.858965 master-0 kubenswrapper[18592]: I0308 04:04:47.858855 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 08 04:04:47.890573 master-0 kubenswrapper[18592]: I0308 04:04:47.890474 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 08 04:04:47.929439 master-0 kubenswrapper[18592]: I0308 04:04:47.929334 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 08 04:04:48.018711 master-0 kubenswrapper[18592]: I0308 04:04:48.018536 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 08 04:04:48.074430 master-0 kubenswrapper[18592]: I0308 04:04:48.074361 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4cbgb" Mar 08 04:04:48.095349 master-0 kubenswrapper[18592]: I0308 04:04:48.095281 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 08 04:04:48.124872 master-0 kubenswrapper[18592]: I0308 04:04:48.119285 18592 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 08 04:04:48.124872 master-0 kubenswrapper[18592]: I0308 04:04:48.119721 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt" Mar 08 04:04:48.124872 master-0 kubenswrapper[18592]: I0308 04:04:48.119967 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 08 04:04:48.136794 master-0 kubenswrapper[18592]: I0308 04:04:48.136616 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-tls" Mar 08 04:04:48.152198 master-0 kubenswrapper[18592]: I0308 04:04:48.152139 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 08 04:04:48.364011 master-0 kubenswrapper[18592]: I0308 04:04:48.363865 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config" Mar 08 04:04:48.403144 master-0 kubenswrapper[18592]: I0308 04:04:48.402686 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls" Mar 08 04:04:48.439531 master-0 kubenswrapper[18592]: I0308 04:04:48.439405 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cluster-cloud-controller-manager-dockercfg-lnbcj" Mar 08 04:04:48.503161 master-0 kubenswrapper[18592]: I0308 04:04:48.503092 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 08 04:04:48.593868 master-0 kubenswrapper[18592]: I0308 04:04:48.593777 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 08 04:04:48.612613 master-0 kubenswrapper[18592]: I0308 04:04:48.612529 18592 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 08 04:04:48.635270 master-0 kubenswrapper[18592]: I0308 04:04:48.635120 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Mar 08 04:04:48.635270 master-0 kubenswrapper[18592]: I0308 04:04:48.635247 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Mar 08 04:04:48.644670 master-0 kubenswrapper[18592]: I0308 04:04:48.644632 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Mar 08 04:04:48.687812 master-0 kubenswrapper[18592]: I0308 04:04:48.687719 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Mar 08 04:04:48.704378 master-0 kubenswrapper[18592]: I0308 04:04:48.704307 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 08 04:04:48.769280 master-0 kubenswrapper[18592]: I0308 04:04:48.769177 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 08 04:04:48.866737 master-0 kubenswrapper[18592]: I0308 04:04:48.866653 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 08 04:04:48.890358 master-0 kubenswrapper[18592]: I0308 04:04:48.890187 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"alertmanager-trusted-ca-bundle"
Mar 08 04:04:48.945319 master-0 kubenswrapper[18592]: I0308 04:04:48.945222 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 08 04:04:48.989810 master-0 kubenswrapper[18592]: I0308 04:04:48.989719 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 08 04:04:49.033362 master-0 kubenswrapper[18592]: I0308 04:04:49.033287 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Mar 08 04:04:49.044918 master-0 kubenswrapper[18592]: I0308 04:04:49.044801 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-qfnls"
Mar 08 04:04:49.120260 master-0 kubenswrapper[18592]: I0308 04:04:49.120130 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 08 04:04:49.129386 master-0 kubenswrapper[18592]: I0308 04:04:49.129338 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Mar 08 04:04:49.209983 master-0 kubenswrapper[18592]: I0308 04:04:49.209582 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 08 04:04:49.227072 master-0 kubenswrapper[18592]: I0308 04:04:49.226991 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Mar 08 04:04:49.231629 master-0 kubenswrapper[18592]: I0308 04:04:49.231570 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 08 04:04:49.281101 master-0 kubenswrapper[18592]: I0308 04:04:49.280086 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 08 04:04:49.289820 master-0 kubenswrapper[18592]: I0308 04:04:49.289757 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config"
Mar 08 04:04:49.318778 master-0 kubenswrapper[18592]: I0308 04:04:49.318708 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 08 04:04:49.336679 master-0 kubenswrapper[18592]: I0308 04:04:49.336632 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 08 04:04:49.423858 master-0 kubenswrapper[18592]: I0308 04:04:49.423751 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca"
Mar 08 04:04:49.508435 master-0 kubenswrapper[18592]: I0308 04:04:49.508275 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-prometheus-http-client-file"
Mar 08 04:04:49.548167 master-0 kubenswrapper[18592]: I0308 04:04:49.548104 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"default-dockercfg-c6d9k"
Mar 08 04:04:49.592963 master-0 kubenswrapper[18592]: I0308 04:04:49.592869 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 08 04:04:49.633146 master-0 kubenswrapper[18592]: I0308 04:04:49.633074 18592 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Mar 08 04:04:49.657270 master-0 kubenswrapper[18592]: I0308 04:04:49.657195 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Mar 08 04:04:49.670556 master-0 kubenswrapper[18592]: I0308 04:04:49.670450 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config"
Mar 08 04:04:49.685527 master-0 kubenswrapper[18592]: I0308 04:04:49.685471 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Mar 08 04:04:49.699358 master-0 kubenswrapper[18592]: I0308 04:04:49.699287 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 08 04:04:49.702957 master-0 kubenswrapper[18592]: I0308 04:04:49.702817 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Mar 08 04:04:49.722530 master-0 kubenswrapper[18592]: I0308 04:04:49.721464 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 08 04:04:49.768777 master-0 kubenswrapper[18592]: I0308 04:04:49.768643 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Mar 08 04:04:49.773012 master-0 kubenswrapper[18592]: I0308 04:04:49.772980 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 08 04:04:49.826090 master-0 kubenswrapper[18592]: I0308 04:04:49.826037 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 08 04:04:49.831658 master-0 kubenswrapper[18592]: I0308 04:04:49.831591 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 08 04:04:49.877683 master-0 kubenswrapper[18592]: E0308 04:04:49.877599 18592 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-podb73e6391_2d8f_46e1_b275_a9106a730d60.slice/crio-d9bb3f958d2b5e248ca9acef3858fad3705cf981affbc68bf5ea66794147a61c.scope\": RecentStats: unable to find data in memory cache]"
Mar 08 04:04:49.888633 master-0 kubenswrapper[18592]: I0308 04:04:49.888574 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 08 04:04:49.905860 master-0 kubenswrapper[18592]: I0308 04:04:49.905804 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 08 04:04:50.009171 master-0 kubenswrapper[18592]: I0308 04:04:50.009094 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls-assets-0"
Mar 08 04:04:50.048775 master-0 kubenswrapper[18592]: I0308 04:04:50.048639 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 08 04:04:50.223504 master-0 kubenswrapper[18592]: I0308 04:04:50.223389 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt"
Mar 08 04:04:50.223867 master-0 kubenswrapper[18592]: I0308 04:04:50.223689 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Mar 08 04:04:50.250460 master-0 kubenswrapper[18592]: I0308 04:04:50.250372 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"monitoring-plugin-cert"
Mar 08 04:04:50.307370 master-0 kubenswrapper[18592]: I0308 04:04:50.307208 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Mar 08 04:04:50.386300 master-0 kubenswrapper[18592]: I0308 04:04:50.386189 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 08 04:04:50.425724 master-0 kubenswrapper[18592]: I0308 04:04:50.425616 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Mar 08 04:04:50.428158 master-0 kubenswrapper[18592]: I0308 04:04:50.428104 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 08 04:04:50.518726 master-0 kubenswrapper[18592]: I0308 04:04:50.518635 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 08 04:04:50.537392 master-0 kubenswrapper[18592]: I0308 04:04:50.537307 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-slxzk"
Mar 08 04:04:50.551206 master-0 kubenswrapper[18592]: I0308 04:04:50.551104 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert"
Mar 08 04:04:50.552885 master-0 kubenswrapper[18592]: I0308 04:04:50.552800 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 08 04:04:50.584762 master-0 kubenswrapper[18592]: I0308 04:04:50.584605 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls"
Mar 08 04:04:50.675759 master-0 kubenswrapper[18592]: I0308 04:04:50.675673 18592 patch_prober.go:28] interesting pod/console-744db48f96-lgsd4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused" start-of-body=
Mar 08 04:04:50.676041 master-0 kubenswrapper[18592]: I0308 04:04:50.675758 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-744db48f96-lgsd4" podUID="935ab7fb-b097-41c3-8926-8343eb29e7fc" containerName="console" probeResult="failure" output="Get \"https://10.128.0.96:8443/health\": dial tcp 10.128.0.96:8443: connect: connection refused"
Mar 08 04:04:50.739719 master-0 kubenswrapper[18592]: I0308 04:04:50.739633 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 08 04:04:50.746307 master-0 kubenswrapper[18592]: I0308 04:04:50.746235 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-hdmbr"
Mar 08 04:04:50.762740 master-0 kubenswrapper[18592]: I0308 04:04:50.762665 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Mar 08 04:04:50.811056 master-0 kubenswrapper[18592]: I0308 04:04:50.810951 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 08 04:04:50.825214 master-0 kubenswrapper[18592]: I0308 04:04:50.825124 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-metric"
Mar 08 04:04:50.827872 master-0 kubenswrapper[18592]: I0308 04:04:50.827765 18592 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Mar 08 04:04:50.836990 master-0 kubenswrapper[18592]: I0308 04:04:50.836763 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 08 04:04:51.056745 master-0 kubenswrapper[18592]: I0308 04:04:51.056652 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 08 04:04:51.069013 master-0 kubenswrapper[18592]: I0308 04:04:51.068767 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy"
Mar 08 04:04:51.145122 master-0 kubenswrapper[18592]: I0308 04:04:51.143115 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Mar 08 04:04:51.221450 master-0 kubenswrapper[18592]: I0308 04:04:51.221370 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt"
Mar 08 04:04:51.310581 master-0 kubenswrapper[18592]: I0308 04:04:51.310316 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt"
Mar 08 04:04:51.390049 master-0 kubenswrapper[18592]: I0308 04:04:51.389995 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 08 04:04:51.433300 master-0 kubenswrapper[18592]: I0308 04:04:51.433130 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls"
Mar 08 04:04:51.457367 master-0 kubenswrapper[18592]: I0308 04:04:51.457287 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 08 04:04:51.457749 master-0 kubenswrapper[18592]: I0308 04:04:51.457376 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 08 04:04:51.547811 master-0 kubenswrapper[18592]: I0308 04:04:51.547723 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images"
Mar 08 04:04:51.558273 master-0 kubenswrapper[18592]: I0308 04:04:51.558226 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt"
Mar 08 04:04:51.561031 master-0 kubenswrapper[18592]: I0308 04:04:51.560982 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-5s872"
Mar 08 04:04:51.707547 master-0 kubenswrapper[18592]: I0308 04:04:51.707468 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 08 04:04:51.718411 master-0 kubenswrapper[18592]: I0308 04:04:51.718319 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Mar 08 04:04:51.729088 master-0 kubenswrapper[18592]: I0308 04:04:51.729027 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt"
Mar 08 04:04:51.733555 master-0 kubenswrapper[18592]: I0308 04:04:51.733515 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-xbmc7"
Mar 08 04:04:51.733780 master-0 kubenswrapper[18592]: I0308 04:04:51.733676 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 08 04:04:51.820896 master-0 kubenswrapper[18592]: I0308 04:04:51.820759 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 08 04:04:51.975114 master-0 kubenswrapper[18592]: I0308 04:04:51.974952 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle"
Mar 08 04:04:51.977154 master-0 kubenswrapper[18592]: I0308 04:04:51.977075 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 08 04:04:51.983044 master-0 kubenswrapper[18592]: I0308 04:04:51.982999 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"openshift-service-ca.crt"
Mar 08 04:04:52.005137 master-0 kubenswrapper[18592]: I0308 04:04:52.005068 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Mar 08 04:04:52.006156 master-0 kubenswrapper[18592]: I0308 04:04:52.005288 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 08 04:04:52.006801 master-0 kubenswrapper[18592]: I0308 04:04:52.006760 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Mar 08 04:04:52.058294 master-0 kubenswrapper[18592]: I0308 04:04:52.058239 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 08 04:04:52.199403 master-0 kubenswrapper[18592]: I0308 04:04:52.199332 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls-assets-0"
Mar 08 04:04:52.228249 master-0 kubenswrapper[18592]: I0308 04:04:52.227754 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-metrics"
Mar 08 04:04:52.294556 master-0 kubenswrapper[18592]: I0308 04:04:52.294493 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 08 04:04:52.344593 master-0 kubenswrapper[18592]: I0308 04:04:52.344516 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 08 04:04:52.381008 master-0 kubenswrapper[18592]: I0308 04:04:52.380892 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Mar 08 04:04:52.404341 master-0 kubenswrapper[18592]: I0308 04:04:52.404287 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-7dkp2"
Mar 08 04:04:52.416256 master-0 kubenswrapper[18592]: I0308 04:04:52.416184 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Mar 08 04:04:52.474428 master-0 kubenswrapper[18592]: I0308 04:04:52.474360 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Mar 08 04:04:52.550037 master-0 kubenswrapper[18592]: I0308 04:04:52.549874 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 08 04:04:52.648771 master-0 kubenswrapper[18592]: I0308 04:04:52.648703 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 08 04:04:52.656179 master-0 kubenswrapper[18592]: I0308 04:04:52.656130 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Mar 08 04:04:52.660109 master-0 kubenswrapper[18592]: I0308 04:04:52.660043 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 08 04:04:52.722563 master-0 kubenswrapper[18592]: I0308 04:04:52.722421 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"baremetal-kube-rbac-proxy"
Mar 08 04:04:52.743200 master-0 kubenswrapper[18592]: I0308 04:04:52.743107 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Mar 08 04:04:52.743461 master-0 kubenswrapper[18592]: I0308 04:04:52.743428 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca"
Mar 08 04:04:52.745160 master-0 kubenswrapper[18592]: I0308 04:04:52.745076 18592 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"]
Mar 08 04:04:52.745489 master-0 kubenswrapper[18592]: I0308 04:04:52.745427 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podUID="a814bd60de133d95cf99630a978c017e" containerName="startup-monitor" containerID="cri-o://38d58eb6cc1f0fc948fc0e241422f9cbe055b447427adb52e3e099a976cca861" gracePeriod=5
Mar 08 04:04:52.800214 master-0 kubenswrapper[18592]: I0308 04:04:52.800065 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls"
Mar 08 04:04:52.853120 master-0 kubenswrapper[18592]: I0308 04:04:52.852999 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-grpc-tls-44sfif47rohlm"
Mar 08 04:04:52.872777 master-0 kubenswrapper[18592]: I0308 04:04:52.872700 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-k8s-rulefiles-0"
Mar 08 04:04:52.952613 master-0 kubenswrapper[18592]: I0308 04:04:52.952393 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 08 04:04:52.979165 master-0 kubenswrapper[18592]: I0308 04:04:52.979072 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 08 04:04:52.996443 master-0 kubenswrapper[18592]: I0308 04:04:52.996382 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 08 04:04:53.105747 master-0 kubenswrapper[18592]: I0308 04:04:53.105584 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 08 04:04:53.182172 master-0 kubenswrapper[18592]: I0308 04:04:53.182093 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 08 04:04:53.269752 master-0 kubenswrapper[18592]: I0308 04:04:53.269680 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-n8nt7"
Mar 08 04:04:53.273382 master-0 kubenswrapper[18592]: I0308 04:04:53.273337 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls"
Mar 08 04:04:53.302858 master-0 kubenswrapper[18592]: I0308 04:04:53.302710 18592 patch_prober.go:28] interesting pod/console-7748864899-8p6h5 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body=
Mar 08 04:04:53.302858 master-0 kubenswrapper[18592]: I0308 04:04:53.302793 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-7748864899-8p6h5" podUID="5bc0469d-1ae9-4606-ba99-7a333b66af37" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused"
Mar 08 04:04:53.413336 master-0 kubenswrapper[18592]: I0308 04:04:53.413179 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-webhook-server-cert"
Mar 08 04:04:53.419761 master-0 kubenswrapper[18592]: I0308 04:04:53.419703 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"cluster-baremetal-operator-images"
Mar 08 04:04:53.457674 master-0 kubenswrapper[18592]: I0308 04:04:53.457603 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 08 04:04:53.588236 master-0 kubenswrapper[18592]: I0308 04:04:53.588131 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"trusted-ca-bundle"
Mar 08 04:04:53.628881 master-0 kubenswrapper[18592]: I0308 04:04:53.628757 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Mar 08 04:04:53.658762 master-0 kubenswrapper[18592]: I0308 04:04:53.658703 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-trusted-ca-bundle"
Mar 08 04:04:53.662437 master-0 kubenswrapper[18592]: I0308 04:04:53.662398 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Mar 08 04:04:53.665998 master-0 kubenswrapper[18592]: I0308 04:04:53.665898 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 08 04:04:53.692694 master-0 kubenswrapper[18592]: I0308 04:04:53.692483 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-kube-rbac-proxy-web"
Mar 08 04:04:53.703782 master-0 kubenswrapper[18592]: I0308 04:04:53.703736 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 08 04:04:53.704478 master-0 kubenswrapper[18592]: I0308 04:04:53.704448 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-rbac-proxy"
Mar 08 04:04:53.723255 master-0 kubenswrapper[18592]: I0308 04:04:53.723029 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Mar 08 04:04:53.730011 master-0 kubenswrapper[18592]: I0308 04:04:53.729947 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Mar 08 04:04:53.797253 master-0 kubenswrapper[18592]: I0308 04:04:53.797187 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 08 04:04:53.857701 master-0 kubenswrapper[18592]: I0308 04:04:53.857640 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 08 04:04:53.941448 master-0 kubenswrapper[18592]: I0308 04:04:53.941412 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Mar 08 04:04:54.196886 master-0 kubenswrapper[18592]: I0308 04:04:54.196672 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config"
Mar 08 04:04:54.291270 master-0 kubenswrapper[18592]: I0308 04:04:54.291202 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls"
Mar 08 04:04:54.300879 master-0 kubenswrapper[18592]: I0308 04:04:54.300792 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Mar 08 04:04:54.327407 master-0 kubenswrapper[18592]: I0308 04:04:54.327372 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls"
Mar 08 04:04:54.422545 master-0 kubenswrapper[18592]: I0308 04:04:54.422491 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-6m6zv"
Mar 08 04:04:54.513786 master-0 kubenswrapper[18592]: I0308 04:04:54.513616 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 08 04:04:54.535229 master-0 kubenswrapper[18592]: I0308 04:04:54.535159 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"service-ca-bundle"
Mar 08 04:04:54.569764 master-0 kubenswrapper[18592]: I0308 04:04:54.569717 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 08 04:04:54.590059 master-0 kubenswrapper[18592]: I0308 04:04:54.589953 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-config"
Mar 08 04:04:54.685639 master-0 kubenswrapper[18592]: I0308 04:04:54.685556 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-generated"
Mar 08 04:04:54.693053 master-0 kubenswrapper[18592]: I0308 04:04:54.693003 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Mar 08 04:04:54.693547 master-0 kubenswrapper[18592]: I0308 04:04:54.693497 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-dockercfg-gnr8f"
Mar 08 04:04:54.708131 master-0 kubenswrapper[18592]: I0308 04:04:54.708041 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 08 04:04:54.816442 master-0 kubenswrapper[18592]: I0308 04:04:54.816258 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 08 04:04:54.877789 master-0 kubenswrapper[18592]: I0308 04:04:54.877749 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-sidecar-tls"
Mar 08 04:04:54.884529 master-0 kubenswrapper[18592]: I0308 04:04:54.884473 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 08 04:04:55.026078 master-0 kubenswrapper[18592]: I0308 04:04:55.026020 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 08 04:04:55.319224 master-0 kubenswrapper[18592]: I0308 04:04:55.319170 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 08 04:04:55.423794 master-0 kubenswrapper[18592]: I0308 04:04:55.423718 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Mar 08 04:04:55.454774 master-0 kubenswrapper[18592]: I0308 04:04:55.454710 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 08 04:04:55.522783 master-0 kubenswrapper[18592]: I0308 04:04:55.522701 18592 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 08 04:04:55.665934 master-0 kubenswrapper[18592]: I0308 04:04:55.665767 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"kube-root-ca.crt"
Mar 08 04:04:55.691743 master-0 kubenswrapper[18592]: I0308 04:04:55.691684 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 08 04:04:55.695611 master-0 kubenswrapper[18592]: I0308 04:04:55.695556 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-serving-cert"
Mar 08 04:04:55.908410 master-0 kubenswrapper[18592]: I0308 04:04:55.908335 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 08 04:04:55.917682 master-0 kubenswrapper[18592]: I0308 04:04:55.917539 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls"
Mar 08 04:04:55.934854 master-0 kubenswrapper[18592]: I0308 04:04:55.932703 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"cco-trusted-ca"
Mar 08 04:04:55.987261 master-0 kubenswrapper[18592]: I0308 04:04:55.987165 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Mar 08 04:04:56.136819 master-0 kubenswrapper[18592]: I0308 04:04:56.136730 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"catalogd-trusted-ca-bundle"
Mar 08 04:04:56.470381 master-0 kubenswrapper[18592]: I0308 04:04:56.470284 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"openshift-service-ca.crt"
Mar 08 04:04:56.617465 master-0 kubenswrapper[18592]: I0308 04:04:56.617393 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 08 04:04:56.620724 master-0 kubenswrapper[18592]: I0308 04:04:56.620676 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-catalogd"/"catalogserver-cert"
Mar 08 04:04:56.757798 master-0 kubenswrapper[18592]: I0308 04:04:56.757661 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Mar 08 04:04:56.935729 master-0 kubenswrapper[18592]: I0308 04:04:56.935562 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Mar 08 04:04:57.201759 master-0 kubenswrapper[18592]: I0308 04:04:57.201637 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 08 04:04:57.324214 master-0 kubenswrapper[18592]: I0308 04:04:57.324114 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Mar 08 04:04:57.353242 master-0 kubenswrapper[18592]: I0308 04:04:57.353171 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Mar 08 04:04:58.265724 master-0 kubenswrapper[18592]: I0308 04:04:58.265633 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-g8pwp"
Mar 08 04:04:58.351346 master-0 kubenswrapper[18592]: I0308 04:04:58.351283 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_a814bd60de133d95cf99630a978c017e/startup-monitor/0.log"
Mar 08 04:04:58.351570 master-0 kubenswrapper[18592]: I0308 04:04:58.351386 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 08 04:04:58.427329 master-0 kubenswrapper[18592]: I0308 04:04:58.427208 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-var-lock\") pod \"a814bd60de133d95cf99630a978c017e\" (UID: \"a814bd60de133d95cf99630a978c017e\") "
Mar 08 04:04:58.427329 master-0 kubenswrapper[18592]: I0308 04:04:58.427338 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-manifests\") pod \"a814bd60de133d95cf99630a978c017e\" (UID: \"a814bd60de133d95cf99630a978c017e\") "
Mar 08 04:04:58.427853 master-0 kubenswrapper[18592]: I0308 04:04:58.427348 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-var-lock" (OuterVolumeSpecName: "var-lock") pod "a814bd60de133d95cf99630a978c017e" (UID: "a814bd60de133d95cf99630a978c017e"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 04:04:58.427853 master-0 kubenswrapper[18592]: I0308 04:04:58.427386 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-resource-dir\") pod \"a814bd60de133d95cf99630a978c017e\" (UID: \"a814bd60de133d95cf99630a978c017e\") "
Mar 08 04:04:58.427853 master-0 kubenswrapper[18592]: I0308 04:04:58.427416 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "a814bd60de133d95cf99630a978c017e" (UID: "a814bd60de133d95cf99630a978c017e"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 04:04:58.427853 master-0 kubenswrapper[18592]: I0308 04:04:58.427412 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-manifests" (OuterVolumeSpecName: "manifests") pod "a814bd60de133d95cf99630a978c017e" (UID: "a814bd60de133d95cf99630a978c017e"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 04:04:58.427853 master-0 kubenswrapper[18592]: I0308 04:04:58.427449 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-var-log\") pod \"a814bd60de133d95cf99630a978c017e\" (UID: \"a814bd60de133d95cf99630a978c017e\") "
Mar 08 04:04:58.427853 master-0 kubenswrapper[18592]: I0308 04:04:58.427480 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-var-log" (OuterVolumeSpecName: "var-log") pod "a814bd60de133d95cf99630a978c017e" (UID: "a814bd60de133d95cf99630a978c017e"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 04:04:58.427853 master-0 kubenswrapper[18592]: I0308 04:04:58.427605 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-pod-resource-dir\") pod \"a814bd60de133d95cf99630a978c017e\" (UID: \"a814bd60de133d95cf99630a978c017e\") "
Mar 08 04:04:58.428542 master-0 kubenswrapper[18592]: I0308 04:04:58.428283 18592 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 08 04:04:58.428542 master-0 kubenswrapper[18592]: I0308 04:04:58.428314 18592 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-manifests\") on node \"master-0\" DevicePath \"\""
Mar 08 04:04:58.428542 master-0 kubenswrapper[18592]: I0308 04:04:58.428338 18592 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-resource-dir\") on node \"master-0\" DevicePath \"\""
Mar 08 04:04:58.428542 master-0 kubenswrapper[18592]: I0308 04:04:58.428368 18592 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-var-log\") on node \"master-0\" DevicePath \"\""
Mar 08 04:04:58.433656 master-0 kubenswrapper[18592]: I0308 04:04:58.433561 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "a814bd60de133d95cf99630a978c017e" (UID: "a814bd60de133d95cf99630a978c017e"). InnerVolumeSpecName "pod-resource-dir".
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 04:04:58.489184 master-0 kubenswrapper[18592]: I0308 04:04:58.489132 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_a814bd60de133d95cf99630a978c017e/startup-monitor/0.log" Mar 08 04:04:58.489413 master-0 kubenswrapper[18592]: I0308 04:04:58.489189 18592 generic.go:334] "Generic (PLEG): container finished" podID="a814bd60de133d95cf99630a978c017e" containerID="38d58eb6cc1f0fc948fc0e241422f9cbe055b447427adb52e3e099a976cca861" exitCode=137 Mar 08 04:04:58.489413 master-0 kubenswrapper[18592]: I0308 04:04:58.489246 18592 scope.go:117] "RemoveContainer" containerID="38d58eb6cc1f0fc948fc0e241422f9cbe055b447427adb52e3e099a976cca861" Mar 08 04:04:58.489413 master-0 kubenswrapper[18592]: I0308 04:04:58.489351 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 04:04:58.515248 master-0 kubenswrapper[18592]: I0308 04:04:58.515201 18592 scope.go:117] "RemoveContainer" containerID="38d58eb6cc1f0fc948fc0e241422f9cbe055b447427adb52e3e099a976cca861" Mar 08 04:04:58.515869 master-0 kubenswrapper[18592]: E0308 04:04:58.515694 18592 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38d58eb6cc1f0fc948fc0e241422f9cbe055b447427adb52e3e099a976cca861\": container with ID starting with 38d58eb6cc1f0fc948fc0e241422f9cbe055b447427adb52e3e099a976cca861 not found: ID does not exist" containerID="38d58eb6cc1f0fc948fc0e241422f9cbe055b447427adb52e3e099a976cca861" Mar 08 04:04:58.515869 master-0 kubenswrapper[18592]: I0308 04:04:58.515727 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38d58eb6cc1f0fc948fc0e241422f9cbe055b447427adb52e3e099a976cca861"} err="failed to get container status 
\"38d58eb6cc1f0fc948fc0e241422f9cbe055b447427adb52e3e099a976cca861\": rpc error: code = NotFound desc = could not find container \"38d58eb6cc1f0fc948fc0e241422f9cbe055b447427adb52e3e099a976cca861\": container with ID starting with 38d58eb6cc1f0fc948fc0e241422f9cbe055b447427adb52e3e099a976cca861 not found: ID does not exist" Mar 08 04:04:58.529817 master-0 kubenswrapper[18592]: I0308 04:04:58.529748 18592 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-pod-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 04:04:58.901013 master-0 kubenswrapper[18592]: E0308 04:04:58.900345 18592 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-podb73e6391_2d8f_46e1_b275_a9106a730d60.slice/crio-d9bb3f958d2b5e248ca9acef3858fad3705cf981affbc68bf5ea66794147a61c.scope\": RecentStats: unable to find data in memory cache]" Mar 08 04:04:59.918307 master-0 kubenswrapper[18592]: E0308 04:04:59.918221 18592 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-podb73e6391_2d8f_46e1_b275_a9106a730d60.slice/crio-d9bb3f958d2b5e248ca9acef3858fad3705cf981affbc68bf5ea66794147a61c.scope\": RecentStats: unable to find data in memory cache]" Mar 08 04:05:00.186260 master-0 kubenswrapper[18592]: I0308 04:05:00.158415 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a814bd60de133d95cf99630a978c017e" path="/var/lib/kubelet/pods/a814bd60de133d95cf99630a978c017e/volumes" Mar 08 04:05:00.682964 master-0 kubenswrapper[18592]: I0308 04:05:00.682892 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-744db48f96-lgsd4" Mar 08 04:05:00.693105 master-0 kubenswrapper[18592]: I0308 04:05:00.693056 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-console/console-744db48f96-lgsd4" Mar 08 04:05:03.312035 master-0 kubenswrapper[18592]: I0308 04:05:03.311937 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7748864899-8p6h5" Mar 08 04:05:03.323372 master-0 kubenswrapper[18592]: I0308 04:05:03.323297 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7748864899-8p6h5" Mar 08 04:05:03.462762 master-0 kubenswrapper[18592]: I0308 04:05:03.462678 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-744db48f96-lgsd4"] Mar 08 04:05:28.503964 master-0 kubenswrapper[18592]: I0308 04:05:28.503424 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-744db48f96-lgsd4" podUID="935ab7fb-b097-41c3-8926-8343eb29e7fc" containerName="console" containerID="cri-o://32a66192def097d021b93aa8df6620f28795ead81c5ef79a07d5eac921422344" gracePeriod=15 Mar 08 04:05:28.786332 master-0 kubenswrapper[18592]: I0308 04:05:28.783634 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-744db48f96-lgsd4_935ab7fb-b097-41c3-8926-8343eb29e7fc/console/1.log" Mar 08 04:05:28.786332 master-0 kubenswrapper[18592]: I0308 04:05:28.784464 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-744db48f96-lgsd4_935ab7fb-b097-41c3-8926-8343eb29e7fc/console/0.log" Mar 08 04:05:28.786332 master-0 kubenswrapper[18592]: I0308 04:05:28.784514 18592 generic.go:334] "Generic (PLEG): container finished" podID="935ab7fb-b097-41c3-8926-8343eb29e7fc" containerID="32a66192def097d021b93aa8df6620f28795ead81c5ef79a07d5eac921422344" exitCode=2 Mar 08 04:05:28.786332 master-0 kubenswrapper[18592]: I0308 04:05:28.784559 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-744db48f96-lgsd4" 
event={"ID":"935ab7fb-b097-41c3-8926-8343eb29e7fc","Type":"ContainerDied","Data":"32a66192def097d021b93aa8df6620f28795ead81c5ef79a07d5eac921422344"} Mar 08 04:05:28.786332 master-0 kubenswrapper[18592]: I0308 04:05:28.784602 18592 scope.go:117] "RemoveContainer" containerID="576ba1799553100c4affee23b8bf267786a47451fbe5e13058a6627861ec622f" Mar 08 04:05:28.935328 master-0 kubenswrapper[18592]: I0308 04:05:28.935084 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-744db48f96-lgsd4_935ab7fb-b097-41c3-8926-8343eb29e7fc/console/1.log" Mar 08 04:05:28.935680 master-0 kubenswrapper[18592]: I0308 04:05:28.935353 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-744db48f96-lgsd4" Mar 08 04:05:29.066052 master-0 kubenswrapper[18592]: I0308 04:05:29.065311 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/935ab7fb-b097-41c3-8926-8343eb29e7fc-console-config\") pod \"935ab7fb-b097-41c3-8926-8343eb29e7fc\" (UID: \"935ab7fb-b097-41c3-8926-8343eb29e7fc\") " Mar 08 04:05:29.066052 master-0 kubenswrapper[18592]: I0308 04:05:29.065981 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/935ab7fb-b097-41c3-8926-8343eb29e7fc-console-serving-cert\") pod \"935ab7fb-b097-41c3-8926-8343eb29e7fc\" (UID: \"935ab7fb-b097-41c3-8926-8343eb29e7fc\") " Mar 08 04:05:29.066052 master-0 kubenswrapper[18592]: I0308 04:05:29.065919 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/935ab7fb-b097-41c3-8926-8343eb29e7fc-console-config" (OuterVolumeSpecName: "console-config") pod "935ab7fb-b097-41c3-8926-8343eb29e7fc" (UID: "935ab7fb-b097-41c3-8926-8343eb29e7fc"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:05:29.066657 master-0 kubenswrapper[18592]: I0308 04:05:29.066629 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/935ab7fb-b097-41c3-8926-8343eb29e7fc-console-oauth-config\") pod \"935ab7fb-b097-41c3-8926-8343eb29e7fc\" (UID: \"935ab7fb-b097-41c3-8926-8343eb29e7fc\") " Mar 08 04:05:29.066731 master-0 kubenswrapper[18592]: I0308 04:05:29.066718 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mc4rd\" (UniqueName: \"kubernetes.io/projected/935ab7fb-b097-41c3-8926-8343eb29e7fc-kube-api-access-mc4rd\") pod \"935ab7fb-b097-41c3-8926-8343eb29e7fc\" (UID: \"935ab7fb-b097-41c3-8926-8343eb29e7fc\") " Mar 08 04:05:29.067247 master-0 kubenswrapper[18592]: I0308 04:05:29.067221 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/935ab7fb-b097-41c3-8926-8343eb29e7fc-service-ca\") pod \"935ab7fb-b097-41c3-8926-8343eb29e7fc\" (UID: \"935ab7fb-b097-41c3-8926-8343eb29e7fc\") " Mar 08 04:05:29.067371 master-0 kubenswrapper[18592]: I0308 04:05:29.067344 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/935ab7fb-b097-41c3-8926-8343eb29e7fc-oauth-serving-cert\") pod \"935ab7fb-b097-41c3-8926-8343eb29e7fc\" (UID: \"935ab7fb-b097-41c3-8926-8343eb29e7fc\") " Mar 08 04:05:29.067775 master-0 kubenswrapper[18592]: I0308 04:05:29.067719 18592 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/935ab7fb-b097-41c3-8926-8343eb29e7fc-console-config\") on node \"master-0\" DevicePath \"\"" Mar 08 04:05:29.068309 master-0 kubenswrapper[18592]: I0308 04:05:29.068215 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/935ab7fb-b097-41c3-8926-8343eb29e7fc-service-ca" (OuterVolumeSpecName: "service-ca") pod "935ab7fb-b097-41c3-8926-8343eb29e7fc" (UID: "935ab7fb-b097-41c3-8926-8343eb29e7fc"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:05:29.068524 master-0 kubenswrapper[18592]: I0308 04:05:29.068428 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/935ab7fb-b097-41c3-8926-8343eb29e7fc-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "935ab7fb-b097-41c3-8926-8343eb29e7fc" (UID: "935ab7fb-b097-41c3-8926-8343eb29e7fc"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:05:29.070373 master-0 kubenswrapper[18592]: I0308 04:05:29.070323 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/935ab7fb-b097-41c3-8926-8343eb29e7fc-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "935ab7fb-b097-41c3-8926-8343eb29e7fc" (UID: "935ab7fb-b097-41c3-8926-8343eb29e7fc"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:05:29.072524 master-0 kubenswrapper[18592]: I0308 04:05:29.072479 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/935ab7fb-b097-41c3-8926-8343eb29e7fc-kube-api-access-mc4rd" (OuterVolumeSpecName: "kube-api-access-mc4rd") pod "935ab7fb-b097-41c3-8926-8343eb29e7fc" (UID: "935ab7fb-b097-41c3-8926-8343eb29e7fc"). InnerVolumeSpecName "kube-api-access-mc4rd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 04:05:29.074391 master-0 kubenswrapper[18592]: I0308 04:05:29.074020 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/935ab7fb-b097-41c3-8926-8343eb29e7fc-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "935ab7fb-b097-41c3-8926-8343eb29e7fc" (UID: "935ab7fb-b097-41c3-8926-8343eb29e7fc"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:05:29.169342 master-0 kubenswrapper[18592]: I0308 04:05:29.169267 18592 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/935ab7fb-b097-41c3-8926-8343eb29e7fc-console-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 08 04:05:29.169342 master-0 kubenswrapper[18592]: I0308 04:05:29.169321 18592 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/935ab7fb-b097-41c3-8926-8343eb29e7fc-console-oauth-config\") on node \"master-0\" DevicePath \"\"" Mar 08 04:05:29.169342 master-0 kubenswrapper[18592]: I0308 04:05:29.169334 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mc4rd\" (UniqueName: \"kubernetes.io/projected/935ab7fb-b097-41c3-8926-8343eb29e7fc-kube-api-access-mc4rd\") on node \"master-0\" DevicePath \"\"" Mar 08 04:05:29.169342 master-0 kubenswrapper[18592]: I0308 04:05:29.169345 18592 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/935ab7fb-b097-41c3-8926-8343eb29e7fc-service-ca\") on node \"master-0\" DevicePath \"\"" Mar 08 04:05:29.169342 master-0 kubenswrapper[18592]: I0308 04:05:29.169360 18592 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/935ab7fb-b097-41c3-8926-8343eb29e7fc-oauth-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 08 
04:05:29.218304 master-0 kubenswrapper[18592]: I0308 04:05:29.218237 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-648fbfb658-kprxt"] Mar 08 04:05:29.218608 master-0 kubenswrapper[18592]: E0308 04:05:29.218578 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a814bd60de133d95cf99630a978c017e" containerName="startup-monitor" Mar 08 04:05:29.218608 master-0 kubenswrapper[18592]: I0308 04:05:29.218599 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="a814bd60de133d95cf99630a978c017e" containerName="startup-monitor" Mar 08 04:05:29.218765 master-0 kubenswrapper[18592]: E0308 04:05:29.218623 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b73e6391-2d8f-46e1-b275-a9106a730d60" containerName="installer" Mar 08 04:05:29.218765 master-0 kubenswrapper[18592]: I0308 04:05:29.218634 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="b73e6391-2d8f-46e1-b275-a9106a730d60" containerName="installer" Mar 08 04:05:29.218765 master-0 kubenswrapper[18592]: E0308 04:05:29.218656 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="935ab7fb-b097-41c3-8926-8343eb29e7fc" containerName="console" Mar 08 04:05:29.218765 master-0 kubenswrapper[18592]: I0308 04:05:29.218664 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="935ab7fb-b097-41c3-8926-8343eb29e7fc" containerName="console" Mar 08 04:05:29.218765 master-0 kubenswrapper[18592]: E0308 04:05:29.218676 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="935ab7fb-b097-41c3-8926-8343eb29e7fc" containerName="console" Mar 08 04:05:29.218765 master-0 kubenswrapper[18592]: I0308 04:05:29.218684 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="935ab7fb-b097-41c3-8926-8343eb29e7fc" containerName="console" Mar 08 04:05:29.219185 master-0 kubenswrapper[18592]: I0308 04:05:29.218964 18592 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="935ab7fb-b097-41c3-8926-8343eb29e7fc" containerName="console" Mar 08 04:05:29.219185 master-0 kubenswrapper[18592]: I0308 04:05:29.218979 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="b73e6391-2d8f-46e1-b275-a9106a730d60" containerName="installer" Mar 08 04:05:29.219185 master-0 kubenswrapper[18592]: I0308 04:05:29.219001 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="a814bd60de133d95cf99630a978c017e" containerName="startup-monitor" Mar 08 04:05:29.219384 master-0 kubenswrapper[18592]: I0308 04:05:29.219285 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="935ab7fb-b097-41c3-8926-8343eb29e7fc" containerName="console" Mar 08 04:05:29.220241 master-0 kubenswrapper[18592]: I0308 04:05:29.220199 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-648fbfb658-kprxt" Mar 08 04:05:29.225428 master-0 kubenswrapper[18592]: I0308 04:05:29.225388 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-kube-rbac-proxy-config" Mar 08 04:05:29.226073 master-0 kubenswrapper[18592]: I0308 04:05:29.225469 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client" Mar 08 04:05:29.226219 master-0 kubenswrapper[18592]: I0308 04:05:29.226167 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"federate-client-certs" Mar 08 04:05:29.227996 master-0 kubenswrapper[18592]: I0308 04:05:29.227225 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemeter-client-serving-certs-ca-bundle" Mar 08 04:05:29.227996 master-0 kubenswrapper[18592]: I0308 04:05:29.227249 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-tls" Mar 08 04:05:29.234361 master-0 kubenswrapper[18592]: I0308 04:05:29.234312 18592 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemeter-trusted-ca-bundle-8i12ta5c71j38" Mar 08 04:05:29.239538 master-0 kubenswrapper[18592]: I0308 04:05:29.239474 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-648fbfb658-kprxt"] Mar 08 04:05:29.372280 master-0 kubenswrapper[18592]: I0308 04:05:29.372130 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b1ad8862-127f-4a2c-9846-57fef0b5cdb6-metrics-client-ca\") pod \"telemeter-client-648fbfb658-kprxt\" (UID: \"b1ad8862-127f-4a2c-9846-57fef0b5cdb6\") " pod="openshift-monitoring/telemeter-client-648fbfb658-kprxt" Mar 08 04:05:29.372280 master-0 kubenswrapper[18592]: I0308 04:05:29.372208 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/b1ad8862-127f-4a2c-9846-57fef0b5cdb6-federate-client-tls\") pod \"telemeter-client-648fbfb658-kprxt\" (UID: \"b1ad8862-127f-4a2c-9846-57fef0b5cdb6\") " pod="openshift-monitoring/telemeter-client-648fbfb658-kprxt" Mar 08 04:05:29.372280 master-0 kubenswrapper[18592]: I0308 04:05:29.372256 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46vx9\" (UniqueName: \"kubernetes.io/projected/b1ad8862-127f-4a2c-9846-57fef0b5cdb6-kube-api-access-46vx9\") pod \"telemeter-client-648fbfb658-kprxt\" (UID: \"b1ad8862-127f-4a2c-9846-57fef0b5cdb6\") " pod="openshift-monitoring/telemeter-client-648fbfb658-kprxt" Mar 08 04:05:29.372593 master-0 kubenswrapper[18592]: I0308 04:05:29.372306 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1ad8862-127f-4a2c-9846-57fef0b5cdb6-telemeter-trusted-ca-bundle\") 
pod \"telemeter-client-648fbfb658-kprxt\" (UID: \"b1ad8862-127f-4a2c-9846-57fef0b5cdb6\") " pod="openshift-monitoring/telemeter-client-648fbfb658-kprxt" Mar 08 04:05:29.372593 master-0 kubenswrapper[18592]: I0308 04:05:29.372375 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1ad8862-127f-4a2c-9846-57fef0b5cdb6-serving-certs-ca-bundle\") pod \"telemeter-client-648fbfb658-kprxt\" (UID: \"b1ad8862-127f-4a2c-9846-57fef0b5cdb6\") " pod="openshift-monitoring/telemeter-client-648fbfb658-kprxt" Mar 08 04:05:29.372593 master-0 kubenswrapper[18592]: I0308 04:05:29.372400 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/b1ad8862-127f-4a2c-9846-57fef0b5cdb6-secret-telemeter-client\") pod \"telemeter-client-648fbfb658-kprxt\" (UID: \"b1ad8862-127f-4a2c-9846-57fef0b5cdb6\") " pod="openshift-monitoring/telemeter-client-648fbfb658-kprxt" Mar 08 04:05:29.372593 master-0 kubenswrapper[18592]: I0308 04:05:29.372422 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/b1ad8862-127f-4a2c-9846-57fef0b5cdb6-telemeter-client-tls\") pod \"telemeter-client-648fbfb658-kprxt\" (UID: \"b1ad8862-127f-4a2c-9846-57fef0b5cdb6\") " pod="openshift-monitoring/telemeter-client-648fbfb658-kprxt" Mar 08 04:05:29.372593 master-0 kubenswrapper[18592]: I0308 04:05:29.372456 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b1ad8862-127f-4a2c-9846-57fef0b5cdb6-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-648fbfb658-kprxt\" (UID: \"b1ad8862-127f-4a2c-9846-57fef0b5cdb6\") " 
pod="openshift-monitoring/telemeter-client-648fbfb658-kprxt" Mar 08 04:05:29.474420 master-0 kubenswrapper[18592]: I0308 04:05:29.474275 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1ad8862-127f-4a2c-9846-57fef0b5cdb6-telemeter-trusted-ca-bundle\") pod \"telemeter-client-648fbfb658-kprxt\" (UID: \"b1ad8862-127f-4a2c-9846-57fef0b5cdb6\") " pod="openshift-monitoring/telemeter-client-648fbfb658-kprxt" Mar 08 04:05:29.474420 master-0 kubenswrapper[18592]: I0308 04:05:29.474386 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1ad8862-127f-4a2c-9846-57fef0b5cdb6-serving-certs-ca-bundle\") pod \"telemeter-client-648fbfb658-kprxt\" (UID: \"b1ad8862-127f-4a2c-9846-57fef0b5cdb6\") " pod="openshift-monitoring/telemeter-client-648fbfb658-kprxt" Mar 08 04:05:29.474420 master-0 kubenswrapper[18592]: I0308 04:05:29.474408 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/b1ad8862-127f-4a2c-9846-57fef0b5cdb6-secret-telemeter-client\") pod \"telemeter-client-648fbfb658-kprxt\" (UID: \"b1ad8862-127f-4a2c-9846-57fef0b5cdb6\") " pod="openshift-monitoring/telemeter-client-648fbfb658-kprxt" Mar 08 04:05:29.474420 master-0 kubenswrapper[18592]: I0308 04:05:29.474425 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/b1ad8862-127f-4a2c-9846-57fef0b5cdb6-telemeter-client-tls\") pod \"telemeter-client-648fbfb658-kprxt\" (UID: \"b1ad8862-127f-4a2c-9846-57fef0b5cdb6\") " pod="openshift-monitoring/telemeter-client-648fbfb658-kprxt" Mar 08 04:05:29.474924 master-0 kubenswrapper[18592]: I0308 04:05:29.474564 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b1ad8862-127f-4a2c-9846-57fef0b5cdb6-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-648fbfb658-kprxt\" (UID: \"b1ad8862-127f-4a2c-9846-57fef0b5cdb6\") " pod="openshift-monitoring/telemeter-client-648fbfb658-kprxt" Mar 08 04:05:29.474924 master-0 kubenswrapper[18592]: E0308 04:05:29.474733 18592 secret.go:189] Couldn't get secret openshift-monitoring/telemeter-client-tls: secret "telemeter-client-tls" not found Mar 08 04:05:29.474924 master-0 kubenswrapper[18592]: I0308 04:05:29.474800 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b1ad8862-127f-4a2c-9846-57fef0b5cdb6-metrics-client-ca\") pod \"telemeter-client-648fbfb658-kprxt\" (UID: \"b1ad8862-127f-4a2c-9846-57fef0b5cdb6\") " pod="openshift-monitoring/telemeter-client-648fbfb658-kprxt" Mar 08 04:05:29.474924 master-0 kubenswrapper[18592]: E0308 04:05:29.474866 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1ad8862-127f-4a2c-9846-57fef0b5cdb6-telemeter-client-tls podName:b1ad8862-127f-4a2c-9846-57fef0b5cdb6 nodeName:}" failed. No retries permitted until 2026-03-08 04:05:29.974812672 +0000 UTC m=+742.073567052 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "telemeter-client-tls" (UniqueName: "kubernetes.io/secret/b1ad8862-127f-4a2c-9846-57fef0b5cdb6-telemeter-client-tls") pod "telemeter-client-648fbfb658-kprxt" (UID: "b1ad8862-127f-4a2c-9846-57fef0b5cdb6") : secret "telemeter-client-tls" not found Mar 08 04:05:29.474924 master-0 kubenswrapper[18592]: I0308 04:05:29.474906 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/b1ad8862-127f-4a2c-9846-57fef0b5cdb6-federate-client-tls\") pod \"telemeter-client-648fbfb658-kprxt\" (UID: \"b1ad8862-127f-4a2c-9846-57fef0b5cdb6\") " pod="openshift-monitoring/telemeter-client-648fbfb658-kprxt" Mar 08 04:05:29.475267 master-0 kubenswrapper[18592]: I0308 04:05:29.475010 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46vx9\" (UniqueName: \"kubernetes.io/projected/b1ad8862-127f-4a2c-9846-57fef0b5cdb6-kube-api-access-46vx9\") pod \"telemeter-client-648fbfb658-kprxt\" (UID: \"b1ad8862-127f-4a2c-9846-57fef0b5cdb6\") " pod="openshift-monitoring/telemeter-client-648fbfb658-kprxt" Mar 08 04:05:29.476454 master-0 kubenswrapper[18592]: I0308 04:05:29.475981 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b1ad8862-127f-4a2c-9846-57fef0b5cdb6-metrics-client-ca\") pod \"telemeter-client-648fbfb658-kprxt\" (UID: \"b1ad8862-127f-4a2c-9846-57fef0b5cdb6\") " pod="openshift-monitoring/telemeter-client-648fbfb658-kprxt" Mar 08 04:05:29.476591 master-0 kubenswrapper[18592]: I0308 04:05:29.476461 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1ad8862-127f-4a2c-9846-57fef0b5cdb6-serving-certs-ca-bundle\") pod \"telemeter-client-648fbfb658-kprxt\" (UID: \"b1ad8862-127f-4a2c-9846-57fef0b5cdb6\") " 
pod="openshift-monitoring/telemeter-client-648fbfb658-kprxt" Mar 08 04:05:29.476666 master-0 kubenswrapper[18592]: I0308 04:05:29.476592 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1ad8862-127f-4a2c-9846-57fef0b5cdb6-telemeter-trusted-ca-bundle\") pod \"telemeter-client-648fbfb658-kprxt\" (UID: \"b1ad8862-127f-4a2c-9846-57fef0b5cdb6\") " pod="openshift-monitoring/telemeter-client-648fbfb658-kprxt" Mar 08 04:05:29.480366 master-0 kubenswrapper[18592]: I0308 04:05:29.480005 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/b1ad8862-127f-4a2c-9846-57fef0b5cdb6-federate-client-tls\") pod \"telemeter-client-648fbfb658-kprxt\" (UID: \"b1ad8862-127f-4a2c-9846-57fef0b5cdb6\") " pod="openshift-monitoring/telemeter-client-648fbfb658-kprxt" Mar 08 04:05:29.480366 master-0 kubenswrapper[18592]: I0308 04:05:29.480107 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/b1ad8862-127f-4a2c-9846-57fef0b5cdb6-secret-telemeter-client\") pod \"telemeter-client-648fbfb658-kprxt\" (UID: \"b1ad8862-127f-4a2c-9846-57fef0b5cdb6\") " pod="openshift-monitoring/telemeter-client-648fbfb658-kprxt" Mar 08 04:05:29.485772 master-0 kubenswrapper[18592]: I0308 04:05:29.482102 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b1ad8862-127f-4a2c-9846-57fef0b5cdb6-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-648fbfb658-kprxt\" (UID: \"b1ad8862-127f-4a2c-9846-57fef0b5cdb6\") " pod="openshift-monitoring/telemeter-client-648fbfb658-kprxt" Mar 08 04:05:29.511493 master-0 kubenswrapper[18592]: I0308 04:05:29.511429 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-46vx9\" (UniqueName: \"kubernetes.io/projected/b1ad8862-127f-4a2c-9846-57fef0b5cdb6-kube-api-access-46vx9\") pod \"telemeter-client-648fbfb658-kprxt\" (UID: \"b1ad8862-127f-4a2c-9846-57fef0b5cdb6\") " pod="openshift-monitoring/telemeter-client-648fbfb658-kprxt" Mar 08 04:05:29.798284 master-0 kubenswrapper[18592]: I0308 04:05:29.798215 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-744db48f96-lgsd4_935ab7fb-b097-41c3-8926-8343eb29e7fc/console/1.log" Mar 08 04:05:29.798651 master-0 kubenswrapper[18592]: I0308 04:05:29.798324 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-744db48f96-lgsd4" event={"ID":"935ab7fb-b097-41c3-8926-8343eb29e7fc","Type":"ContainerDied","Data":"26efdafcb9fa82a0a97f09fa91567fd66e01ffff3d441f8948c444e1410e6016"} Mar 08 04:05:29.798651 master-0 kubenswrapper[18592]: I0308 04:05:29.798373 18592 scope.go:117] "RemoveContainer" containerID="32a66192def097d021b93aa8df6620f28795ead81c5ef79a07d5eac921422344" Mar 08 04:05:29.798651 master-0 kubenswrapper[18592]: I0308 04:05:29.798464 18592 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-744db48f96-lgsd4" Mar 08 04:05:29.872621 master-0 kubenswrapper[18592]: I0308 04:05:29.872520 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-744db48f96-lgsd4"] Mar 08 04:05:29.888180 master-0 kubenswrapper[18592]: I0308 04:05:29.888065 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-744db48f96-lgsd4"] Mar 08 04:05:29.982983 master-0 kubenswrapper[18592]: I0308 04:05:29.982878 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/b1ad8862-127f-4a2c-9846-57fef0b5cdb6-telemeter-client-tls\") pod \"telemeter-client-648fbfb658-kprxt\" (UID: \"b1ad8862-127f-4a2c-9846-57fef0b5cdb6\") " pod="openshift-monitoring/telemeter-client-648fbfb658-kprxt" Mar 08 04:05:29.983348 master-0 kubenswrapper[18592]: E0308 04:05:29.983159 18592 secret.go:189] Couldn't get secret openshift-monitoring/telemeter-client-tls: secret "telemeter-client-tls" not found Mar 08 04:05:29.983348 master-0 kubenswrapper[18592]: E0308 04:05:29.983334 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1ad8862-127f-4a2c-9846-57fef0b5cdb6-telemeter-client-tls podName:b1ad8862-127f-4a2c-9846-57fef0b5cdb6 nodeName:}" failed. No retries permitted until 2026-03-08 04:05:30.983298156 +0000 UTC m=+743.082052546 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "telemeter-client-tls" (UniqueName: "kubernetes.io/secret/b1ad8862-127f-4a2c-9846-57fef0b5cdb6-telemeter-client-tls") pod "telemeter-client-648fbfb658-kprxt" (UID: "b1ad8862-127f-4a2c-9846-57fef0b5cdb6") : secret "telemeter-client-tls" not found Mar 08 04:05:30.156691 master-0 kubenswrapper[18592]: I0308 04:05:30.156505 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="935ab7fb-b097-41c3-8926-8343eb29e7fc" path="/var/lib/kubelet/pods/935ab7fb-b097-41c3-8926-8343eb29e7fc/volumes" Mar 08 04:05:31.001817 master-0 kubenswrapper[18592]: I0308 04:05:31.001669 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/b1ad8862-127f-4a2c-9846-57fef0b5cdb6-telemeter-client-tls\") pod \"telemeter-client-648fbfb658-kprxt\" (UID: \"b1ad8862-127f-4a2c-9846-57fef0b5cdb6\") " pod="openshift-monitoring/telemeter-client-648fbfb658-kprxt" Mar 08 04:05:31.002700 master-0 kubenswrapper[18592]: E0308 04:05:31.001883 18592 secret.go:189] Couldn't get secret openshift-monitoring/telemeter-client-tls: secret "telemeter-client-tls" not found Mar 08 04:05:31.002700 master-0 kubenswrapper[18592]: E0308 04:05:31.001988 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1ad8862-127f-4a2c-9846-57fef0b5cdb6-telemeter-client-tls podName:b1ad8862-127f-4a2c-9846-57fef0b5cdb6 nodeName:}" failed. No retries permitted until 2026-03-08 04:05:33.001961762 +0000 UTC m=+745.100716152 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "telemeter-client-tls" (UniqueName: "kubernetes.io/secret/b1ad8862-127f-4a2c-9846-57fef0b5cdb6-telemeter-client-tls") pod "telemeter-client-648fbfb658-kprxt" (UID: "b1ad8862-127f-4a2c-9846-57fef0b5cdb6") : secret "telemeter-client-tls" not found Mar 08 04:05:33.044624 master-0 kubenswrapper[18592]: I0308 04:05:33.044480 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/b1ad8862-127f-4a2c-9846-57fef0b5cdb6-telemeter-client-tls\") pod \"telemeter-client-648fbfb658-kprxt\" (UID: \"b1ad8862-127f-4a2c-9846-57fef0b5cdb6\") " pod="openshift-monitoring/telemeter-client-648fbfb658-kprxt" Mar 08 04:05:33.045532 master-0 kubenswrapper[18592]: E0308 04:05:33.044753 18592 secret.go:189] Couldn't get secret openshift-monitoring/telemeter-client-tls: secret "telemeter-client-tls" not found Mar 08 04:05:33.045532 master-0 kubenswrapper[18592]: E0308 04:05:33.044925 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1ad8862-127f-4a2c-9846-57fef0b5cdb6-telemeter-client-tls podName:b1ad8862-127f-4a2c-9846-57fef0b5cdb6 nodeName:}" failed. No retries permitted until 2026-03-08 04:05:37.04489648 +0000 UTC m=+749.143650870 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "telemeter-client-tls" (UniqueName: "kubernetes.io/secret/b1ad8862-127f-4a2c-9846-57fef0b5cdb6-telemeter-client-tls") pod "telemeter-client-648fbfb658-kprxt" (UID: "b1ad8862-127f-4a2c-9846-57fef0b5cdb6") : secret "telemeter-client-tls" not found Mar 08 04:05:37.118607 master-0 kubenswrapper[18592]: I0308 04:05:37.118507 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/b1ad8862-127f-4a2c-9846-57fef0b5cdb6-telemeter-client-tls\") pod \"telemeter-client-648fbfb658-kprxt\" (UID: \"b1ad8862-127f-4a2c-9846-57fef0b5cdb6\") " pod="openshift-monitoring/telemeter-client-648fbfb658-kprxt" Mar 08 04:05:37.119907 master-0 kubenswrapper[18592]: E0308 04:05:37.118867 18592 secret.go:189] Couldn't get secret openshift-monitoring/telemeter-client-tls: secret "telemeter-client-tls" not found Mar 08 04:05:37.119907 master-0 kubenswrapper[18592]: E0308 04:05:37.118950 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1ad8862-127f-4a2c-9846-57fef0b5cdb6-telemeter-client-tls podName:b1ad8862-127f-4a2c-9846-57fef0b5cdb6 nodeName:}" failed. No retries permitted until 2026-03-08 04:05:45.11892543 +0000 UTC m=+757.217679820 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "telemeter-client-tls" (UniqueName: "kubernetes.io/secret/b1ad8862-127f-4a2c-9846-57fef0b5cdb6-telemeter-client-tls") pod "telemeter-client-648fbfb658-kprxt" (UID: "b1ad8862-127f-4a2c-9846-57fef0b5cdb6") : secret "telemeter-client-tls" not found Mar 08 04:05:45.167032 master-0 kubenswrapper[18592]: I0308 04:05:45.166943 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/b1ad8862-127f-4a2c-9846-57fef0b5cdb6-telemeter-client-tls\") pod \"telemeter-client-648fbfb658-kprxt\" (UID: \"b1ad8862-127f-4a2c-9846-57fef0b5cdb6\") " pod="openshift-monitoring/telemeter-client-648fbfb658-kprxt" Mar 08 04:05:45.168105 master-0 kubenswrapper[18592]: E0308 04:05:45.167198 18592 secret.go:189] Couldn't get secret openshift-monitoring/telemeter-client-tls: secret "telemeter-client-tls" not found Mar 08 04:05:45.168105 master-0 kubenswrapper[18592]: E0308 04:05:45.167379 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1ad8862-127f-4a2c-9846-57fef0b5cdb6-telemeter-client-tls podName:b1ad8862-127f-4a2c-9846-57fef0b5cdb6 nodeName:}" failed. No retries permitted until 2026-03-08 04:06:01.167345913 +0000 UTC m=+773.266100293 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "telemeter-client-tls" (UniqueName: "kubernetes.io/secret/b1ad8862-127f-4a2c-9846-57fef0b5cdb6-telemeter-client-tls") pod "telemeter-client-648fbfb658-kprxt" (UID: "b1ad8862-127f-4a2c-9846-57fef0b5cdb6") : secret "telemeter-client-tls" not found Mar 08 04:06:01.256747 master-0 kubenswrapper[18592]: I0308 04:06:01.256648 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/b1ad8862-127f-4a2c-9846-57fef0b5cdb6-telemeter-client-tls\") pod \"telemeter-client-648fbfb658-kprxt\" (UID: \"b1ad8862-127f-4a2c-9846-57fef0b5cdb6\") " pod="openshift-monitoring/telemeter-client-648fbfb658-kprxt" Mar 08 04:06:01.257858 master-0 kubenswrapper[18592]: E0308 04:06:01.256934 18592 secret.go:189] Couldn't get secret openshift-monitoring/telemeter-client-tls: secret "telemeter-client-tls" not found Mar 08 04:06:01.257858 master-0 kubenswrapper[18592]: E0308 04:06:01.257053 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1ad8862-127f-4a2c-9846-57fef0b5cdb6-telemeter-client-tls podName:b1ad8862-127f-4a2c-9846-57fef0b5cdb6 nodeName:}" failed. No retries permitted until 2026-03-08 04:06:33.257024938 +0000 UTC m=+805.355779328 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "telemeter-client-tls" (UniqueName: "kubernetes.io/secret/b1ad8862-127f-4a2c-9846-57fef0b5cdb6-telemeter-client-tls") pod "telemeter-client-648fbfb658-kprxt" (UID: "b1ad8862-127f-4a2c-9846-57fef0b5cdb6") : secret "telemeter-client-tls" not found Mar 08 04:06:11.124262 master-0 kubenswrapper[18592]: I0308 04:06:11.124209 18592 patch_prober.go:28] interesting pod/monitoring-plugin-64589489d-z2spc container/monitoring-plugin namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.128.0.85:9443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 08 04:06:11.125326 master-0 kubenswrapper[18592]: I0308 04:06:11.125281 18592 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/monitoring-plugin-64589489d-z2spc" podUID="8b79990f-516d-4eb7-bc3f-bf63ff11f105" containerName="monitoring-plugin" probeResult="failure" output="Get \"https://10.128.0.85:9443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 08 04:06:27.251016 master-0 kubenswrapper[18592]: I0308 04:06:27.250947 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-68d695c84c-t9xsb"] Mar 08 04:06:27.252034 master-0 kubenswrapper[18592]: I0308 04:06:27.251989 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-68d695c84c-t9xsb" Mar 08 04:06:27.278216 master-0 kubenswrapper[18592]: I0308 04:06:27.277172 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-68d695c84c-t9xsb"] Mar 08 04:06:27.404802 master-0 kubenswrapper[18592]: I0308 04:06:27.404723 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bzv7\" (UniqueName: \"kubernetes.io/projected/d1eeeb8d-1358-45f7-aa42-1c71e290ea30-kube-api-access-4bzv7\") pod \"console-68d695c84c-t9xsb\" (UID: \"d1eeeb8d-1358-45f7-aa42-1c71e290ea30\") " pod="openshift-console/console-68d695c84c-t9xsb" Mar 08 04:06:27.404802 master-0 kubenswrapper[18592]: I0308 04:06:27.404785 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d1eeeb8d-1358-45f7-aa42-1c71e290ea30-oauth-serving-cert\") pod \"console-68d695c84c-t9xsb\" (UID: \"d1eeeb8d-1358-45f7-aa42-1c71e290ea30\") " pod="openshift-console/console-68d695c84c-t9xsb" Mar 08 04:06:27.405156 master-0 kubenswrapper[18592]: I0308 04:06:27.404964 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d1eeeb8d-1358-45f7-aa42-1c71e290ea30-service-ca\") pod \"console-68d695c84c-t9xsb\" (UID: \"d1eeeb8d-1358-45f7-aa42-1c71e290ea30\") " pod="openshift-console/console-68d695c84c-t9xsb" Mar 08 04:06:27.405156 master-0 kubenswrapper[18592]: I0308 04:06:27.405086 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d1eeeb8d-1358-45f7-aa42-1c71e290ea30-trusted-ca-bundle\") pod \"console-68d695c84c-t9xsb\" (UID: \"d1eeeb8d-1358-45f7-aa42-1c71e290ea30\") " pod="openshift-console/console-68d695c84c-t9xsb" Mar 08 04:06:27.405279 master-0 
kubenswrapper[18592]: I0308 04:06:27.405249 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d1eeeb8d-1358-45f7-aa42-1c71e290ea30-console-config\") pod \"console-68d695c84c-t9xsb\" (UID: \"d1eeeb8d-1358-45f7-aa42-1c71e290ea30\") " pod="openshift-console/console-68d695c84c-t9xsb" Mar 08 04:06:27.405335 master-0 kubenswrapper[18592]: I0308 04:06:27.405300 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d1eeeb8d-1358-45f7-aa42-1c71e290ea30-console-oauth-config\") pod \"console-68d695c84c-t9xsb\" (UID: \"d1eeeb8d-1358-45f7-aa42-1c71e290ea30\") " pod="openshift-console/console-68d695c84c-t9xsb" Mar 08 04:06:27.405386 master-0 kubenswrapper[18592]: I0308 04:06:27.405351 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d1eeeb8d-1358-45f7-aa42-1c71e290ea30-console-serving-cert\") pod \"console-68d695c84c-t9xsb\" (UID: \"d1eeeb8d-1358-45f7-aa42-1c71e290ea30\") " pod="openshift-console/console-68d695c84c-t9xsb" Mar 08 04:06:27.506491 master-0 kubenswrapper[18592]: I0308 04:06:27.506346 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bzv7\" (UniqueName: \"kubernetes.io/projected/d1eeeb8d-1358-45f7-aa42-1c71e290ea30-kube-api-access-4bzv7\") pod \"console-68d695c84c-t9xsb\" (UID: \"d1eeeb8d-1358-45f7-aa42-1c71e290ea30\") " pod="openshift-console/console-68d695c84c-t9xsb" Mar 08 04:06:27.506491 master-0 kubenswrapper[18592]: I0308 04:06:27.506397 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d1eeeb8d-1358-45f7-aa42-1c71e290ea30-oauth-serving-cert\") pod \"console-68d695c84c-t9xsb\" (UID: 
\"d1eeeb8d-1358-45f7-aa42-1c71e290ea30\") " pod="openshift-console/console-68d695c84c-t9xsb" Mar 08 04:06:27.506767 master-0 kubenswrapper[18592]: I0308 04:06:27.506710 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d1eeeb8d-1358-45f7-aa42-1c71e290ea30-service-ca\") pod \"console-68d695c84c-t9xsb\" (UID: \"d1eeeb8d-1358-45f7-aa42-1c71e290ea30\") " pod="openshift-console/console-68d695c84c-t9xsb" Mar 08 04:06:27.506864 master-0 kubenswrapper[18592]: I0308 04:06:27.506818 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d1eeeb8d-1358-45f7-aa42-1c71e290ea30-trusted-ca-bundle\") pod \"console-68d695c84c-t9xsb\" (UID: \"d1eeeb8d-1358-45f7-aa42-1c71e290ea30\") " pod="openshift-console/console-68d695c84c-t9xsb" Mar 08 04:06:27.506993 master-0 kubenswrapper[18592]: I0308 04:06:27.506958 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d1eeeb8d-1358-45f7-aa42-1c71e290ea30-console-config\") pod \"console-68d695c84c-t9xsb\" (UID: \"d1eeeb8d-1358-45f7-aa42-1c71e290ea30\") " pod="openshift-console/console-68d695c84c-t9xsb" Mar 08 04:06:27.507073 master-0 kubenswrapper[18592]: I0308 04:06:27.507005 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d1eeeb8d-1358-45f7-aa42-1c71e290ea30-console-oauth-config\") pod \"console-68d695c84c-t9xsb\" (UID: \"d1eeeb8d-1358-45f7-aa42-1c71e290ea30\") " pod="openshift-console/console-68d695c84c-t9xsb" Mar 08 04:06:27.507073 master-0 kubenswrapper[18592]: I0308 04:06:27.507045 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d1eeeb8d-1358-45f7-aa42-1c71e290ea30-console-serving-cert\") pod 
\"console-68d695c84c-t9xsb\" (UID: \"d1eeeb8d-1358-45f7-aa42-1c71e290ea30\") " pod="openshift-console/console-68d695c84c-t9xsb" Mar 08 04:06:27.507320 master-0 kubenswrapper[18592]: I0308 04:06:27.507287 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d1eeeb8d-1358-45f7-aa42-1c71e290ea30-oauth-serving-cert\") pod \"console-68d695c84c-t9xsb\" (UID: \"d1eeeb8d-1358-45f7-aa42-1c71e290ea30\") " pod="openshift-console/console-68d695c84c-t9xsb" Mar 08 04:06:27.507412 master-0 kubenswrapper[18592]: I0308 04:06:27.507383 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d1eeeb8d-1358-45f7-aa42-1c71e290ea30-service-ca\") pod \"console-68d695c84c-t9xsb\" (UID: \"d1eeeb8d-1358-45f7-aa42-1c71e290ea30\") " pod="openshift-console/console-68d695c84c-t9xsb" Mar 08 04:06:27.507890 master-0 kubenswrapper[18592]: I0308 04:06:27.507853 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d1eeeb8d-1358-45f7-aa42-1c71e290ea30-console-config\") pod \"console-68d695c84c-t9xsb\" (UID: \"d1eeeb8d-1358-45f7-aa42-1c71e290ea30\") " pod="openshift-console/console-68d695c84c-t9xsb" Mar 08 04:06:27.508546 master-0 kubenswrapper[18592]: I0308 04:06:27.508496 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d1eeeb8d-1358-45f7-aa42-1c71e290ea30-trusted-ca-bundle\") pod \"console-68d695c84c-t9xsb\" (UID: \"d1eeeb8d-1358-45f7-aa42-1c71e290ea30\") " pod="openshift-console/console-68d695c84c-t9xsb" Mar 08 04:06:27.509924 master-0 kubenswrapper[18592]: I0308 04:06:27.509894 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d1eeeb8d-1358-45f7-aa42-1c71e290ea30-console-oauth-config\") pod 
\"console-68d695c84c-t9xsb\" (UID: \"d1eeeb8d-1358-45f7-aa42-1c71e290ea30\") " pod="openshift-console/console-68d695c84c-t9xsb" Mar 08 04:06:27.511868 master-0 kubenswrapper[18592]: I0308 04:06:27.511812 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d1eeeb8d-1358-45f7-aa42-1c71e290ea30-console-serving-cert\") pod \"console-68d695c84c-t9xsb\" (UID: \"d1eeeb8d-1358-45f7-aa42-1c71e290ea30\") " pod="openshift-console/console-68d695c84c-t9xsb" Mar 08 04:06:27.526524 master-0 kubenswrapper[18592]: I0308 04:06:27.526461 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bzv7\" (UniqueName: \"kubernetes.io/projected/d1eeeb8d-1358-45f7-aa42-1c71e290ea30-kube-api-access-4bzv7\") pod \"console-68d695c84c-t9xsb\" (UID: \"d1eeeb8d-1358-45f7-aa42-1c71e290ea30\") " pod="openshift-console/console-68d695c84c-t9xsb" Mar 08 04:06:27.575144 master-0 kubenswrapper[18592]: I0308 04:06:27.575066 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-68d695c84c-t9xsb" Mar 08 04:06:28.091076 master-0 kubenswrapper[18592]: I0308 04:06:28.091017 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-68d695c84c-t9xsb"] Mar 08 04:06:28.345970 master-0 kubenswrapper[18592]: I0308 04:06:28.345809 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68d695c84c-t9xsb" event={"ID":"d1eeeb8d-1358-45f7-aa42-1c71e290ea30","Type":"ContainerStarted","Data":"8f7c4f157c47cc7d92f96cbce7b4a38de130a47685e91981345bb1f064ada373"} Mar 08 04:06:28.345970 master-0 kubenswrapper[18592]: I0308 04:06:28.345903 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68d695c84c-t9xsb" event={"ID":"d1eeeb8d-1358-45f7-aa42-1c71e290ea30","Type":"ContainerStarted","Data":"a9fff1f228c48371879736616861565f35b55c5102018462054f4dd4df0e3c84"} Mar 08 04:06:28.368537 master-0 kubenswrapper[18592]: I0308 04:06:28.368467 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-68d695c84c-t9xsb" podStartSLOduration=1.368447636 podStartE2EDuration="1.368447636s" podCreationTimestamp="2026-03-08 04:06:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 04:06:28.363842773 +0000 UTC m=+800.462597123" watchObservedRunningTime="2026-03-08 04:06:28.368447636 +0000 UTC m=+800.467201986" Mar 08 04:06:33.318930 master-0 kubenswrapper[18592]: I0308 04:06:33.318847 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/b1ad8862-127f-4a2c-9846-57fef0b5cdb6-telemeter-client-tls\") pod \"telemeter-client-648fbfb658-kprxt\" (UID: \"b1ad8862-127f-4a2c-9846-57fef0b5cdb6\") " pod="openshift-monitoring/telemeter-client-648fbfb658-kprxt" Mar 08 04:06:33.328743 master-0 kubenswrapper[18592]: 
I0308 04:06:33.328679 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/b1ad8862-127f-4a2c-9846-57fef0b5cdb6-telemeter-client-tls\") pod \"telemeter-client-648fbfb658-kprxt\" (UID: \"b1ad8862-127f-4a2c-9846-57fef0b5cdb6\") " pod="openshift-monitoring/telemeter-client-648fbfb658-kprxt" Mar 08 04:06:33.451394 master-0 kubenswrapper[18592]: I0308 04:06:33.451306 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-648fbfb658-kprxt" Mar 08 04:06:33.922612 master-0 kubenswrapper[18592]: I0308 04:06:33.922526 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-648fbfb658-kprxt"] Mar 08 04:06:33.924159 master-0 kubenswrapper[18592]: W0308 04:06:33.924084 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1ad8862_127f_4a2c_9846_57fef0b5cdb6.slice/crio-eec713f57c4b557c49c10c48a80be441363da055331deeae878b093b2206732c WatchSource:0}: Error finding container eec713f57c4b557c49c10c48a80be441363da055331deeae878b093b2206732c: Status 404 returned error can't find the container with id eec713f57c4b557c49c10c48a80be441363da055331deeae878b093b2206732c Mar 08 04:06:33.929541 master-0 kubenswrapper[18592]: I0308 04:06:33.929500 18592 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 04:06:34.411085 master-0 kubenswrapper[18592]: I0308 04:06:34.411010 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-648fbfb658-kprxt" event={"ID":"b1ad8862-127f-4a2c-9846-57fef0b5cdb6","Type":"ContainerStarted","Data":"eec713f57c4b557c49c10c48a80be441363da055331deeae878b093b2206732c"} Mar 08 04:06:37.439274 master-0 kubenswrapper[18592]: I0308 04:06:37.439225 18592 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_telemeter-client-648fbfb658-kprxt_b1ad8862-127f-4a2c-9846-57fef0b5cdb6/telemeter-client/0.log" Mar 08 04:06:37.439944 master-0 kubenswrapper[18592]: I0308 04:06:37.439345 18592 generic.go:334] "Generic (PLEG): container finished" podID="b1ad8862-127f-4a2c-9846-57fef0b5cdb6" containerID="f654a17244fa66689a6186d0dc2f40d32d28c0ac5f2f6d9f8a05a42b17feff54" exitCode=1 Mar 08 04:06:37.439944 master-0 kubenswrapper[18592]: I0308 04:06:37.439390 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-648fbfb658-kprxt" event={"ID":"b1ad8862-127f-4a2c-9846-57fef0b5cdb6","Type":"ContainerStarted","Data":"001c46882fddc40928c379f2501ab83e77a2f20d48176851c8c0ee67970af29d"} Mar 08 04:06:37.439944 master-0 kubenswrapper[18592]: I0308 04:06:37.439423 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-648fbfb658-kprxt" event={"ID":"b1ad8862-127f-4a2c-9846-57fef0b5cdb6","Type":"ContainerStarted","Data":"e09329589fb525bd47b2a5c3099f1c7841c8c9bb469766df4ce8f847a16f1fdb"} Mar 08 04:06:37.439944 master-0 kubenswrapper[18592]: I0308 04:06:37.439523 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-648fbfb658-kprxt" event={"ID":"b1ad8862-127f-4a2c-9846-57fef0b5cdb6","Type":"ContainerDied","Data":"f654a17244fa66689a6186d0dc2f40d32d28c0ac5f2f6d9f8a05a42b17feff54"} Mar 08 04:06:37.439944 master-0 kubenswrapper[18592]: I0308 04:06:37.439916 18592 scope.go:117] "RemoveContainer" containerID="f654a17244fa66689a6186d0dc2f40d32d28c0ac5f2f6d9f8a05a42b17feff54" Mar 08 04:06:37.575365 master-0 kubenswrapper[18592]: I0308 04:06:37.575303 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-68d695c84c-t9xsb" Mar 08 04:06:37.575649 master-0 kubenswrapper[18592]: I0308 04:06:37.575409 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-console/console-68d695c84c-t9xsb" Mar 08 04:06:37.581407 master-0 kubenswrapper[18592]: I0308 04:06:37.581319 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-68d695c84c-t9xsb" Mar 08 04:06:38.456462 master-0 kubenswrapper[18592]: I0308 04:06:38.456411 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-648fbfb658-kprxt_b1ad8862-127f-4a2c-9846-57fef0b5cdb6/telemeter-client/0.log" Mar 08 04:06:38.459014 master-0 kubenswrapper[18592]: I0308 04:06:38.458931 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-648fbfb658-kprxt" event={"ID":"b1ad8862-127f-4a2c-9846-57fef0b5cdb6","Type":"ContainerStarted","Data":"5cb024cc0abd8c967e8d62a2ae83170c98ae1977e7c898598505cb7e4b62f626"} Mar 08 04:06:38.465936 master-0 kubenswrapper[18592]: I0308 04:06:38.464518 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-68d695c84c-t9xsb" Mar 08 04:06:38.502344 master-0 kubenswrapper[18592]: I0308 04:06:38.502238 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-648fbfb658-kprxt" podStartSLOduration=66.763926251 podStartE2EDuration="1m9.502210073s" podCreationTimestamp="2026-03-08 04:05:29 +0000 UTC" firstStartedPulling="2026-03-08 04:06:33.929275731 +0000 UTC m=+806.028030121" lastFinishedPulling="2026-03-08 04:06:36.667559573 +0000 UTC m=+808.766313943" observedRunningTime="2026-03-08 04:06:38.494521329 +0000 UTC m=+810.593275689" watchObservedRunningTime="2026-03-08 04:06:38.502210073 +0000 UTC m=+810.600964433" Mar 08 04:06:38.633992 master-0 kubenswrapper[18592]: I0308 04:06:38.633919 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7748864899-8p6h5"] Mar 08 04:06:39.099074 master-0 kubenswrapper[18592]: I0308 04:06:39.098991 18592 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openshift-console/console-c6cff696d-qdnlj"] Mar 08 04:06:39.100408 master-0 kubenswrapper[18592]: I0308 04:06:39.100355 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-c6cff696d-qdnlj" Mar 08 04:06:39.142602 master-0 kubenswrapper[18592]: I0308 04:06:39.142470 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-c6cff696d-qdnlj"] Mar 08 04:06:39.213412 master-0 kubenswrapper[18592]: I0308 04:06:39.213363 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7b8c1288-ec32-4821-94d1-4df01560bcb9-oauth-serving-cert\") pod \"console-c6cff696d-qdnlj\" (UID: \"7b8c1288-ec32-4821-94d1-4df01560bcb9\") " pod="openshift-console/console-c6cff696d-qdnlj" Mar 08 04:06:39.213673 master-0 kubenswrapper[18592]: I0308 04:06:39.213436 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b8c1288-ec32-4821-94d1-4df01560bcb9-trusted-ca-bundle\") pod \"console-c6cff696d-qdnlj\" (UID: \"7b8c1288-ec32-4821-94d1-4df01560bcb9\") " pod="openshift-console/console-c6cff696d-qdnlj" Mar 08 04:06:39.213673 master-0 kubenswrapper[18592]: I0308 04:06:39.213475 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqwdd\" (UniqueName: \"kubernetes.io/projected/7b8c1288-ec32-4821-94d1-4df01560bcb9-kube-api-access-dqwdd\") pod \"console-c6cff696d-qdnlj\" (UID: \"7b8c1288-ec32-4821-94d1-4df01560bcb9\") " pod="openshift-console/console-c6cff696d-qdnlj" Mar 08 04:06:39.213673 master-0 kubenswrapper[18592]: I0308 04:06:39.213542 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/7b8c1288-ec32-4821-94d1-4df01560bcb9-service-ca\") pod \"console-c6cff696d-qdnlj\" (UID: \"7b8c1288-ec32-4821-94d1-4df01560bcb9\") " pod="openshift-console/console-c6cff696d-qdnlj" Mar 08 04:06:39.213673 master-0 kubenswrapper[18592]: I0308 04:06:39.213616 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7b8c1288-ec32-4821-94d1-4df01560bcb9-console-serving-cert\") pod \"console-c6cff696d-qdnlj\" (UID: \"7b8c1288-ec32-4821-94d1-4df01560bcb9\") " pod="openshift-console/console-c6cff696d-qdnlj" Mar 08 04:06:39.213870 master-0 kubenswrapper[18592]: I0308 04:06:39.213649 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7b8c1288-ec32-4821-94d1-4df01560bcb9-console-oauth-config\") pod \"console-c6cff696d-qdnlj\" (UID: \"7b8c1288-ec32-4821-94d1-4df01560bcb9\") " pod="openshift-console/console-c6cff696d-qdnlj" Mar 08 04:06:39.213870 master-0 kubenswrapper[18592]: I0308 04:06:39.213713 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7b8c1288-ec32-4821-94d1-4df01560bcb9-console-config\") pod \"console-c6cff696d-qdnlj\" (UID: \"7b8c1288-ec32-4821-94d1-4df01560bcb9\") " pod="openshift-console/console-c6cff696d-qdnlj" Mar 08 04:06:39.316200 master-0 kubenswrapper[18592]: I0308 04:06:39.316122 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7b8c1288-ec32-4821-94d1-4df01560bcb9-console-serving-cert\") pod \"console-c6cff696d-qdnlj\" (UID: \"7b8c1288-ec32-4821-94d1-4df01560bcb9\") " pod="openshift-console/console-c6cff696d-qdnlj" Mar 08 04:06:39.316465 master-0 kubenswrapper[18592]: I0308 04:06:39.316337 18592 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7b8c1288-ec32-4821-94d1-4df01560bcb9-console-oauth-config\") pod \"console-c6cff696d-qdnlj\" (UID: \"7b8c1288-ec32-4821-94d1-4df01560bcb9\") " pod="openshift-console/console-c6cff696d-qdnlj" Mar 08 04:06:39.316576 master-0 kubenswrapper[18592]: I0308 04:06:39.316465 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7b8c1288-ec32-4821-94d1-4df01560bcb9-console-config\") pod \"console-c6cff696d-qdnlj\" (UID: \"7b8c1288-ec32-4821-94d1-4df01560bcb9\") " pod="openshift-console/console-c6cff696d-qdnlj" Mar 08 04:06:39.317070 master-0 kubenswrapper[18592]: I0308 04:06:39.317011 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7b8c1288-ec32-4821-94d1-4df01560bcb9-oauth-serving-cert\") pod \"console-c6cff696d-qdnlj\" (UID: \"7b8c1288-ec32-4821-94d1-4df01560bcb9\") " pod="openshift-console/console-c6cff696d-qdnlj" Mar 08 04:06:39.317168 master-0 kubenswrapper[18592]: I0308 04:06:39.317146 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b8c1288-ec32-4821-94d1-4df01560bcb9-trusted-ca-bundle\") pod \"console-c6cff696d-qdnlj\" (UID: \"7b8c1288-ec32-4821-94d1-4df01560bcb9\") " pod="openshift-console/console-c6cff696d-qdnlj" Mar 08 04:06:39.317339 master-0 kubenswrapper[18592]: I0308 04:06:39.317295 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqwdd\" (UniqueName: \"kubernetes.io/projected/7b8c1288-ec32-4821-94d1-4df01560bcb9-kube-api-access-dqwdd\") pod \"console-c6cff696d-qdnlj\" (UID: \"7b8c1288-ec32-4821-94d1-4df01560bcb9\") " pod="openshift-console/console-c6cff696d-qdnlj" Mar 08 04:06:39.317415 master-0 
kubenswrapper[18592]: I0308 04:06:39.317356 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7b8c1288-ec32-4821-94d1-4df01560bcb9-service-ca\") pod \"console-c6cff696d-qdnlj\" (UID: \"7b8c1288-ec32-4821-94d1-4df01560bcb9\") " pod="openshift-console/console-c6cff696d-qdnlj" Mar 08 04:06:39.317785 master-0 kubenswrapper[18592]: I0308 04:06:39.317741 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7b8c1288-ec32-4821-94d1-4df01560bcb9-console-config\") pod \"console-c6cff696d-qdnlj\" (UID: \"7b8c1288-ec32-4821-94d1-4df01560bcb9\") " pod="openshift-console/console-c6cff696d-qdnlj" Mar 08 04:06:39.317785 master-0 kubenswrapper[18592]: I0308 04:06:39.317761 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7b8c1288-ec32-4821-94d1-4df01560bcb9-oauth-serving-cert\") pod \"console-c6cff696d-qdnlj\" (UID: \"7b8c1288-ec32-4821-94d1-4df01560bcb9\") " pod="openshift-console/console-c6cff696d-qdnlj" Mar 08 04:06:39.318204 master-0 kubenswrapper[18592]: I0308 04:06:39.318163 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7b8c1288-ec32-4821-94d1-4df01560bcb9-service-ca\") pod \"console-c6cff696d-qdnlj\" (UID: \"7b8c1288-ec32-4821-94d1-4df01560bcb9\") " pod="openshift-console/console-c6cff696d-qdnlj" Mar 08 04:06:39.318382 master-0 kubenswrapper[18592]: I0308 04:06:39.318339 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b8c1288-ec32-4821-94d1-4df01560bcb9-trusted-ca-bundle\") pod \"console-c6cff696d-qdnlj\" (UID: \"7b8c1288-ec32-4821-94d1-4df01560bcb9\") " pod="openshift-console/console-c6cff696d-qdnlj" Mar 08 04:06:39.319480 master-0 kubenswrapper[18592]: I0308 
04:06:39.319429 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7b8c1288-ec32-4821-94d1-4df01560bcb9-console-serving-cert\") pod \"console-c6cff696d-qdnlj\" (UID: \"7b8c1288-ec32-4821-94d1-4df01560bcb9\") " pod="openshift-console/console-c6cff696d-qdnlj" Mar 08 04:06:39.320701 master-0 kubenswrapper[18592]: I0308 04:06:39.320664 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7b8c1288-ec32-4821-94d1-4df01560bcb9-console-oauth-config\") pod \"console-c6cff696d-qdnlj\" (UID: \"7b8c1288-ec32-4821-94d1-4df01560bcb9\") " pod="openshift-console/console-c6cff696d-qdnlj" Mar 08 04:06:39.334431 master-0 kubenswrapper[18592]: I0308 04:06:39.334361 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqwdd\" (UniqueName: \"kubernetes.io/projected/7b8c1288-ec32-4821-94d1-4df01560bcb9-kube-api-access-dqwdd\") pod \"console-c6cff696d-qdnlj\" (UID: \"7b8c1288-ec32-4821-94d1-4df01560bcb9\") " pod="openshift-console/console-c6cff696d-qdnlj" Mar 08 04:06:39.434042 master-0 kubenswrapper[18592]: I0308 04:06:39.433877 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-c6cff696d-qdnlj" Mar 08 04:06:39.871658 master-0 kubenswrapper[18592]: I0308 04:06:39.871551 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-c6cff696d-qdnlj"] Mar 08 04:06:40.483980 master-0 kubenswrapper[18592]: I0308 04:06:40.483808 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c6cff696d-qdnlj" event={"ID":"7b8c1288-ec32-4821-94d1-4df01560bcb9","Type":"ContainerStarted","Data":"03d0863723d52041a0ea90d675112f24eb39f61de5449673b32e1643c162e991"} Mar 08 04:06:40.484270 master-0 kubenswrapper[18592]: I0308 04:06:40.483993 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c6cff696d-qdnlj" event={"ID":"7b8c1288-ec32-4821-94d1-4df01560bcb9","Type":"ContainerStarted","Data":"cd30c8aa9cfbcb87fe2d690a7a0545f9cd61174218e23643e64eec6c80604666"} Mar 08 04:06:40.511899 master-0 kubenswrapper[18592]: I0308 04:06:40.511647 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-c6cff696d-qdnlj" podStartSLOduration=1.5116125390000001 podStartE2EDuration="1.511612539s" podCreationTimestamp="2026-03-08 04:06:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 04:06:40.508161208 +0000 UTC m=+812.606915588" watchObservedRunningTime="2026-03-08 04:06:40.511612539 +0000 UTC m=+812.610366959" Mar 08 04:06:41.113163 master-0 kubenswrapper[18592]: I0308 04:06:41.113092 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-c6cff696d-qdnlj"] Mar 08 04:06:41.167256 master-0 kubenswrapper[18592]: I0308 04:06:41.164819 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-7b94585997-5czft"] Mar 08 04:06:41.167256 master-0 kubenswrapper[18592]: I0308 04:06:41.165996 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7b94585997-5czft" Mar 08 04:06:41.186545 master-0 kubenswrapper[18592]: I0308 04:06:41.186473 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7b94585997-5czft"] Mar 08 04:06:41.258557 master-0 kubenswrapper[18592]: I0308 04:06:41.258482 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/22aef614-003c-4e25-97fc-7386104c00d4-console-oauth-config\") pod \"console-7b94585997-5czft\" (UID: \"22aef614-003c-4e25-97fc-7386104c00d4\") " pod="openshift-console/console-7b94585997-5czft" Mar 08 04:06:41.258764 master-0 kubenswrapper[18592]: I0308 04:06:41.258577 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/22aef614-003c-4e25-97fc-7386104c00d4-console-serving-cert\") pod \"console-7b94585997-5czft\" (UID: \"22aef614-003c-4e25-97fc-7386104c00d4\") " pod="openshift-console/console-7b94585997-5czft" Mar 08 04:06:41.258764 master-0 kubenswrapper[18592]: I0308 04:06:41.258621 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/22aef614-003c-4e25-97fc-7386104c00d4-console-config\") pod \"console-7b94585997-5czft\" (UID: \"22aef614-003c-4e25-97fc-7386104c00d4\") " pod="openshift-console/console-7b94585997-5czft" Mar 08 04:06:41.258764 master-0 kubenswrapper[18592]: I0308 04:06:41.258671 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/22aef614-003c-4e25-97fc-7386104c00d4-oauth-serving-cert\") pod \"console-7b94585997-5czft\" (UID: \"22aef614-003c-4e25-97fc-7386104c00d4\") " pod="openshift-console/console-7b94585997-5czft" Mar 08 04:06:41.258764 
master-0 kubenswrapper[18592]: I0308 04:06:41.258750 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/22aef614-003c-4e25-97fc-7386104c00d4-service-ca\") pod \"console-7b94585997-5czft\" (UID: \"22aef614-003c-4e25-97fc-7386104c00d4\") " pod="openshift-console/console-7b94585997-5czft" Mar 08 04:06:41.258961 master-0 kubenswrapper[18592]: I0308 04:06:41.258888 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22aef614-003c-4e25-97fc-7386104c00d4-trusted-ca-bundle\") pod \"console-7b94585997-5czft\" (UID: \"22aef614-003c-4e25-97fc-7386104c00d4\") " pod="openshift-console/console-7b94585997-5czft" Mar 08 04:06:41.258961 master-0 kubenswrapper[18592]: I0308 04:06:41.258930 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jb9l2\" (UniqueName: \"kubernetes.io/projected/22aef614-003c-4e25-97fc-7386104c00d4-kube-api-access-jb9l2\") pod \"console-7b94585997-5czft\" (UID: \"22aef614-003c-4e25-97fc-7386104c00d4\") " pod="openshift-console/console-7b94585997-5czft" Mar 08 04:06:41.360685 master-0 kubenswrapper[18592]: I0308 04:06:41.360622 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/22aef614-003c-4e25-97fc-7386104c00d4-service-ca\") pod \"console-7b94585997-5czft\" (UID: \"22aef614-003c-4e25-97fc-7386104c00d4\") " pod="openshift-console/console-7b94585997-5czft" Mar 08 04:06:41.360906 master-0 kubenswrapper[18592]: I0308 04:06:41.360712 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22aef614-003c-4e25-97fc-7386104c00d4-trusted-ca-bundle\") pod \"console-7b94585997-5czft\" (UID: 
\"22aef614-003c-4e25-97fc-7386104c00d4\") " pod="openshift-console/console-7b94585997-5czft" Mar 08 04:06:41.360906 master-0 kubenswrapper[18592]: I0308 04:06:41.360736 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jb9l2\" (UniqueName: \"kubernetes.io/projected/22aef614-003c-4e25-97fc-7386104c00d4-kube-api-access-jb9l2\") pod \"console-7b94585997-5czft\" (UID: \"22aef614-003c-4e25-97fc-7386104c00d4\") " pod="openshift-console/console-7b94585997-5czft" Mar 08 04:06:41.361200 master-0 kubenswrapper[18592]: I0308 04:06:41.361132 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/22aef614-003c-4e25-97fc-7386104c00d4-console-oauth-config\") pod \"console-7b94585997-5czft\" (UID: \"22aef614-003c-4e25-97fc-7386104c00d4\") " pod="openshift-console/console-7b94585997-5czft" Mar 08 04:06:41.361302 master-0 kubenswrapper[18592]: I0308 04:06:41.361266 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/22aef614-003c-4e25-97fc-7386104c00d4-console-serving-cert\") pod \"console-7b94585997-5czft\" (UID: \"22aef614-003c-4e25-97fc-7386104c00d4\") " pod="openshift-console/console-7b94585997-5czft" Mar 08 04:06:41.361351 master-0 kubenswrapper[18592]: I0308 04:06:41.361325 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/22aef614-003c-4e25-97fc-7386104c00d4-console-config\") pod \"console-7b94585997-5czft\" (UID: \"22aef614-003c-4e25-97fc-7386104c00d4\") " pod="openshift-console/console-7b94585997-5czft" Mar 08 04:06:41.361434 master-0 kubenswrapper[18592]: I0308 04:06:41.361402 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/22aef614-003c-4e25-97fc-7386104c00d4-oauth-serving-cert\") pod \"console-7b94585997-5czft\" (UID: \"22aef614-003c-4e25-97fc-7386104c00d4\") " pod="openshift-console/console-7b94585997-5czft" Mar 08 04:06:41.362394 master-0 kubenswrapper[18592]: I0308 04:06:41.362336 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/22aef614-003c-4e25-97fc-7386104c00d4-service-ca\") pod \"console-7b94585997-5czft\" (UID: \"22aef614-003c-4e25-97fc-7386104c00d4\") " pod="openshift-console/console-7b94585997-5czft" Mar 08 04:06:41.363630 master-0 kubenswrapper[18592]: I0308 04:06:41.363406 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/22aef614-003c-4e25-97fc-7386104c00d4-console-config\") pod \"console-7b94585997-5czft\" (UID: \"22aef614-003c-4e25-97fc-7386104c00d4\") " pod="openshift-console/console-7b94585997-5czft" Mar 08 04:06:41.363630 master-0 kubenswrapper[18592]: I0308 04:06:41.363560 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22aef614-003c-4e25-97fc-7386104c00d4-trusted-ca-bundle\") pod \"console-7b94585997-5czft\" (UID: \"22aef614-003c-4e25-97fc-7386104c00d4\") " pod="openshift-console/console-7b94585997-5czft" Mar 08 04:06:41.364436 master-0 kubenswrapper[18592]: I0308 04:06:41.364398 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/22aef614-003c-4e25-97fc-7386104c00d4-oauth-serving-cert\") pod \"console-7b94585997-5czft\" (UID: \"22aef614-003c-4e25-97fc-7386104c00d4\") " pod="openshift-console/console-7b94585997-5czft" Mar 08 04:06:41.367933 master-0 kubenswrapper[18592]: I0308 04:06:41.366817 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/22aef614-003c-4e25-97fc-7386104c00d4-console-oauth-config\") pod \"console-7b94585997-5czft\" (UID: \"22aef614-003c-4e25-97fc-7386104c00d4\") " pod="openshift-console/console-7b94585997-5czft" Mar 08 04:06:41.368407 master-0 kubenswrapper[18592]: I0308 04:06:41.368227 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/22aef614-003c-4e25-97fc-7386104c00d4-console-serving-cert\") pod \"console-7b94585997-5czft\" (UID: \"22aef614-003c-4e25-97fc-7386104c00d4\") " pod="openshift-console/console-7b94585997-5czft" Mar 08 04:06:41.385730 master-0 kubenswrapper[18592]: I0308 04:06:41.385671 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jb9l2\" (UniqueName: \"kubernetes.io/projected/22aef614-003c-4e25-97fc-7386104c00d4-kube-api-access-jb9l2\") pod \"console-7b94585997-5czft\" (UID: \"22aef614-003c-4e25-97fc-7386104c00d4\") " pod="openshift-console/console-7b94585997-5czft" Mar 08 04:06:41.526050 master-0 kubenswrapper[18592]: I0308 04:06:41.525957 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7b94585997-5czft" Mar 08 04:06:42.023799 master-0 kubenswrapper[18592]: I0308 04:06:42.023734 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7b94585997-5czft"] Mar 08 04:06:42.031414 master-0 kubenswrapper[18592]: W0308 04:06:42.031374 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22aef614_003c_4e25_97fc_7386104c00d4.slice/crio-1ab84d6571aac061317d69a9d313d89b7992fa815baece92ecd2c6bf8ca6e059 WatchSource:0}: Error finding container 1ab84d6571aac061317d69a9d313d89b7992fa815baece92ecd2c6bf8ca6e059: Status 404 returned error can't find the container with id 1ab84d6571aac061317d69a9d313d89b7992fa815baece92ecd2c6bf8ca6e059 Mar 08 04:06:42.498355 master-0 kubenswrapper[18592]: I0308 04:06:42.498251 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b94585997-5czft" event={"ID":"22aef614-003c-4e25-97fc-7386104c00d4","Type":"ContainerStarted","Data":"deef64b51c530954aa19af5dc030e4d742635c0cf831046cf3fec50b1f6b4f3a"} Mar 08 04:06:42.498877 master-0 kubenswrapper[18592]: I0308 04:06:42.498366 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b94585997-5czft" event={"ID":"22aef614-003c-4e25-97fc-7386104c00d4","Type":"ContainerStarted","Data":"1ab84d6571aac061317d69a9d313d89b7992fa815baece92ecd2c6bf8ca6e059"} Mar 08 04:06:42.527894 master-0 kubenswrapper[18592]: I0308 04:06:42.527781 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7b94585997-5czft" podStartSLOduration=1.527762345 podStartE2EDuration="1.527762345s" podCreationTimestamp="2026-03-08 04:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 04:06:42.518611472 +0000 UTC m=+814.617365822" 
watchObservedRunningTime="2026-03-08 04:06:42.527762345 +0000 UTC m=+814.626516715" Mar 08 04:06:49.434562 master-0 kubenswrapper[18592]: I0308 04:06:49.434357 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-c6cff696d-qdnlj" Mar 08 04:06:51.526792 master-0 kubenswrapper[18592]: I0308 04:06:51.526725 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-7b94585997-5czft" Mar 08 04:06:51.527304 master-0 kubenswrapper[18592]: I0308 04:06:51.527290 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7b94585997-5czft" Mar 08 04:06:51.534297 master-0 kubenswrapper[18592]: I0308 04:06:51.534259 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7b94585997-5czft" Mar 08 04:06:51.593722 master-0 kubenswrapper[18592]: I0308 04:06:51.593636 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7b94585997-5czft" Mar 08 04:06:51.738093 master-0 kubenswrapper[18592]: I0308 04:06:51.736084 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-68d695c84c-t9xsb"] Mar 08 04:07:03.669078 master-0 kubenswrapper[18592]: I0308 04:07:03.668711 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-7748864899-8p6h5" podUID="5bc0469d-1ae9-4606-ba99-7a333b66af37" containerName="console" containerID="cri-o://211659fe18e5c4cb36e283be5616f6cec9746eb8310d8642428ea129f0b41e05" gracePeriod=15 Mar 08 04:07:04.226534 master-0 kubenswrapper[18592]: I0308 04:07:04.226469 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7748864899-8p6h5_5bc0469d-1ae9-4606-ba99-7a333b66af37/console/1.log" Mar 08 04:07:04.228288 master-0 kubenswrapper[18592]: I0308 04:07:04.228040 18592 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-7748864899-8p6h5_5bc0469d-1ae9-4606-ba99-7a333b66af37/console/0.log" Mar 08 04:07:04.228288 master-0 kubenswrapper[18592]: I0308 04:07:04.228131 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7748864899-8p6h5" Mar 08 04:07:04.262524 master-0 kubenswrapper[18592]: I0308 04:07:04.261529 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5bc0469d-1ae9-4606-ba99-7a333b66af37-console-config\") pod \"5bc0469d-1ae9-4606-ba99-7a333b66af37\" (UID: \"5bc0469d-1ae9-4606-ba99-7a333b66af37\") " Mar 08 04:07:04.262524 master-0 kubenswrapper[18592]: I0308 04:07:04.261668 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zh2mj\" (UniqueName: \"kubernetes.io/projected/5bc0469d-1ae9-4606-ba99-7a333b66af37-kube-api-access-zh2mj\") pod \"5bc0469d-1ae9-4606-ba99-7a333b66af37\" (UID: \"5bc0469d-1ae9-4606-ba99-7a333b66af37\") " Mar 08 04:07:04.262524 master-0 kubenswrapper[18592]: I0308 04:07:04.261721 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5bc0469d-1ae9-4606-ba99-7a333b66af37-oauth-serving-cert\") pod \"5bc0469d-1ae9-4606-ba99-7a333b66af37\" (UID: \"5bc0469d-1ae9-4606-ba99-7a333b66af37\") " Mar 08 04:07:04.262524 master-0 kubenswrapper[18592]: I0308 04:07:04.261769 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5bc0469d-1ae9-4606-ba99-7a333b66af37-service-ca\") pod \"5bc0469d-1ae9-4606-ba99-7a333b66af37\" (UID: \"5bc0469d-1ae9-4606-ba99-7a333b66af37\") " Mar 08 04:07:04.262524 master-0 kubenswrapper[18592]: I0308 04:07:04.261789 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/5bc0469d-1ae9-4606-ba99-7a333b66af37-trusted-ca-bundle\") pod \"5bc0469d-1ae9-4606-ba99-7a333b66af37\" (UID: \"5bc0469d-1ae9-4606-ba99-7a333b66af37\") " Mar 08 04:07:04.262524 master-0 kubenswrapper[18592]: I0308 04:07:04.261878 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5bc0469d-1ae9-4606-ba99-7a333b66af37-console-serving-cert\") pod \"5bc0469d-1ae9-4606-ba99-7a333b66af37\" (UID: \"5bc0469d-1ae9-4606-ba99-7a333b66af37\") " Mar 08 04:07:04.262524 master-0 kubenswrapper[18592]: I0308 04:07:04.261908 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5bc0469d-1ae9-4606-ba99-7a333b66af37-console-oauth-config\") pod \"5bc0469d-1ae9-4606-ba99-7a333b66af37\" (UID: \"5bc0469d-1ae9-4606-ba99-7a333b66af37\") " Mar 08 04:07:04.271334 master-0 kubenswrapper[18592]: I0308 04:07:04.269541 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bc0469d-1ae9-4606-ba99-7a333b66af37-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "5bc0469d-1ae9-4606-ba99-7a333b66af37" (UID: "5bc0469d-1ae9-4606-ba99-7a333b66af37"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:07:04.271334 master-0 kubenswrapper[18592]: I0308 04:07:04.269629 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bc0469d-1ae9-4606-ba99-7a333b66af37-console-config" (OuterVolumeSpecName: "console-config") pod "5bc0469d-1ae9-4606-ba99-7a333b66af37" (UID: "5bc0469d-1ae9-4606-ba99-7a333b66af37"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:07:04.271334 master-0 kubenswrapper[18592]: I0308 04:07:04.270314 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bc0469d-1ae9-4606-ba99-7a333b66af37-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "5bc0469d-1ae9-4606-ba99-7a333b66af37" (UID: "5bc0469d-1ae9-4606-ba99-7a333b66af37"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:07:04.271334 master-0 kubenswrapper[18592]: I0308 04:07:04.271061 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bc0469d-1ae9-4606-ba99-7a333b66af37-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "5bc0469d-1ae9-4606-ba99-7a333b66af37" (UID: "5bc0469d-1ae9-4606-ba99-7a333b66af37"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:07:04.271670 master-0 kubenswrapper[18592]: I0308 04:07:04.271435 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bc0469d-1ae9-4606-ba99-7a333b66af37-service-ca" (OuterVolumeSpecName: "service-ca") pod "5bc0469d-1ae9-4606-ba99-7a333b66af37" (UID: "5bc0469d-1ae9-4606-ba99-7a333b66af37"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:07:04.274969 master-0 kubenswrapper[18592]: I0308 04:07:04.274089 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bc0469d-1ae9-4606-ba99-7a333b66af37-kube-api-access-zh2mj" (OuterVolumeSpecName: "kube-api-access-zh2mj") pod "5bc0469d-1ae9-4606-ba99-7a333b66af37" (UID: "5bc0469d-1ae9-4606-ba99-7a333b66af37"). InnerVolumeSpecName "kube-api-access-zh2mj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 04:07:04.274969 master-0 kubenswrapper[18592]: I0308 04:07:04.274854 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bc0469d-1ae9-4606-ba99-7a333b66af37-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "5bc0469d-1ae9-4606-ba99-7a333b66af37" (UID: "5bc0469d-1ae9-4606-ba99-7a333b66af37"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:07:04.363770 master-0 kubenswrapper[18592]: I0308 04:07:04.363691 18592 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5bc0469d-1ae9-4606-ba99-7a333b66af37-service-ca\") on node \"master-0\" DevicePath \"\"" Mar 08 04:07:04.363770 master-0 kubenswrapper[18592]: I0308 04:07:04.363729 18592 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5bc0469d-1ae9-4606-ba99-7a333b66af37-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 04:07:04.363770 master-0 kubenswrapper[18592]: I0308 04:07:04.363745 18592 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5bc0469d-1ae9-4606-ba99-7a333b66af37-console-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 08 04:07:04.363770 master-0 kubenswrapper[18592]: I0308 04:07:04.363757 18592 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5bc0469d-1ae9-4606-ba99-7a333b66af37-console-oauth-config\") on node \"master-0\" DevicePath \"\"" Mar 08 04:07:04.363770 master-0 kubenswrapper[18592]: I0308 04:07:04.363768 18592 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5bc0469d-1ae9-4606-ba99-7a333b66af37-console-config\") on node \"master-0\" DevicePath \"\"" Mar 08 04:07:04.363770 
master-0 kubenswrapper[18592]: I0308 04:07:04.363781 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zh2mj\" (UniqueName: \"kubernetes.io/projected/5bc0469d-1ae9-4606-ba99-7a333b66af37-kube-api-access-zh2mj\") on node \"master-0\" DevicePath \"\"" Mar 08 04:07:04.363770 master-0 kubenswrapper[18592]: I0308 04:07:04.363792 18592 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5bc0469d-1ae9-4606-ba99-7a333b66af37-oauth-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 08 04:07:04.722128 master-0 kubenswrapper[18592]: I0308 04:07:04.722050 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7748864899-8p6h5_5bc0469d-1ae9-4606-ba99-7a333b66af37/console/1.log" Mar 08 04:07:04.723240 master-0 kubenswrapper[18592]: I0308 04:07:04.723135 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7748864899-8p6h5_5bc0469d-1ae9-4606-ba99-7a333b66af37/console/0.log" Mar 08 04:07:04.723364 master-0 kubenswrapper[18592]: I0308 04:07:04.723317 18592 generic.go:334] "Generic (PLEG): container finished" podID="5bc0469d-1ae9-4606-ba99-7a333b66af37" containerID="211659fe18e5c4cb36e283be5616f6cec9746eb8310d8642428ea129f0b41e05" exitCode=2 Mar 08 04:07:04.723440 master-0 kubenswrapper[18592]: I0308 04:07:04.723381 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7748864899-8p6h5" event={"ID":"5bc0469d-1ae9-4606-ba99-7a333b66af37","Type":"ContainerDied","Data":"211659fe18e5c4cb36e283be5616f6cec9746eb8310d8642428ea129f0b41e05"} Mar 08 04:07:04.723510 master-0 kubenswrapper[18592]: I0308 04:07:04.723408 18592 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7748864899-8p6h5" Mar 08 04:07:04.723510 master-0 kubenswrapper[18592]: I0308 04:07:04.723478 18592 scope.go:117] "RemoveContainer" containerID="211659fe18e5c4cb36e283be5616f6cec9746eb8310d8642428ea129f0b41e05" Mar 08 04:07:04.723651 master-0 kubenswrapper[18592]: I0308 04:07:04.723459 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7748864899-8p6h5" event={"ID":"5bc0469d-1ae9-4606-ba99-7a333b66af37","Type":"ContainerDied","Data":"4fa7e88b1d76afea6d135ea7ac69bfcebd04384bc46d8431bd99191c0769b3f0"} Mar 08 04:07:04.751947 master-0 kubenswrapper[18592]: I0308 04:07:04.751830 18592 scope.go:117] "RemoveContainer" containerID="e72150628b7e9311091e98f72ec77d7eee04d74ad6ce896529075ce288be841e" Mar 08 04:07:04.792249 master-0 kubenswrapper[18592]: I0308 04:07:04.789649 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7748864899-8p6h5"] Mar 08 04:07:04.797925 master-0 kubenswrapper[18592]: I0308 04:07:04.797876 18592 scope.go:117] "RemoveContainer" containerID="211659fe18e5c4cb36e283be5616f6cec9746eb8310d8642428ea129f0b41e05" Mar 08 04:07:04.798445 master-0 kubenswrapper[18592]: E0308 04:07:04.798403 18592 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"211659fe18e5c4cb36e283be5616f6cec9746eb8310d8642428ea129f0b41e05\": container with ID starting with 211659fe18e5c4cb36e283be5616f6cec9746eb8310d8642428ea129f0b41e05 not found: ID does not exist" containerID="211659fe18e5c4cb36e283be5616f6cec9746eb8310d8642428ea129f0b41e05" Mar 08 04:07:04.798511 master-0 kubenswrapper[18592]: I0308 04:07:04.798462 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"211659fe18e5c4cb36e283be5616f6cec9746eb8310d8642428ea129f0b41e05"} err="failed to get container status \"211659fe18e5c4cb36e283be5616f6cec9746eb8310d8642428ea129f0b41e05\": rpc 
error: code = NotFound desc = could not find container \"211659fe18e5c4cb36e283be5616f6cec9746eb8310d8642428ea129f0b41e05\": container with ID starting with 211659fe18e5c4cb36e283be5616f6cec9746eb8310d8642428ea129f0b41e05 not found: ID does not exist" Mar 08 04:07:04.798511 master-0 kubenswrapper[18592]: I0308 04:07:04.798498 18592 scope.go:117] "RemoveContainer" containerID="e72150628b7e9311091e98f72ec77d7eee04d74ad6ce896529075ce288be841e" Mar 08 04:07:04.799220 master-0 kubenswrapper[18592]: E0308 04:07:04.799159 18592 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e72150628b7e9311091e98f72ec77d7eee04d74ad6ce896529075ce288be841e\": container with ID starting with e72150628b7e9311091e98f72ec77d7eee04d74ad6ce896529075ce288be841e not found: ID does not exist" containerID="e72150628b7e9311091e98f72ec77d7eee04d74ad6ce896529075ce288be841e" Mar 08 04:07:04.799278 master-0 kubenswrapper[18592]: I0308 04:07:04.799239 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e72150628b7e9311091e98f72ec77d7eee04d74ad6ce896529075ce288be841e"} err="failed to get container status \"e72150628b7e9311091e98f72ec77d7eee04d74ad6ce896529075ce288be841e\": rpc error: code = NotFound desc = could not find container \"e72150628b7e9311091e98f72ec77d7eee04d74ad6ce896529075ce288be841e\": container with ID starting with e72150628b7e9311091e98f72ec77d7eee04d74ad6ce896529075ce288be841e not found: ID does not exist" Mar 08 04:07:04.850637 master-0 kubenswrapper[18592]: I0308 04:07:04.850532 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7748864899-8p6h5"] Mar 08 04:07:06.153510 master-0 kubenswrapper[18592]: I0308 04:07:06.153438 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bc0469d-1ae9-4606-ba99-7a333b66af37" path="/var/lib/kubelet/pods/5bc0469d-1ae9-4606-ba99-7a333b66af37/volumes" Mar 08 04:07:07.531772 
master-0 kubenswrapper[18592]: I0308 04:07:07.531610 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-c6cff696d-qdnlj" podUID="7b8c1288-ec32-4821-94d1-4df01560bcb9" containerName="console" containerID="cri-o://03d0863723d52041a0ea90d675112f24eb39f61de5449673b32e1643c162e991" gracePeriod=15 Mar 08 04:07:07.762499 master-0 kubenswrapper[18592]: I0308 04:07:07.762440 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-c6cff696d-qdnlj_7b8c1288-ec32-4821-94d1-4df01560bcb9/console/0.log" Mar 08 04:07:07.762739 master-0 kubenswrapper[18592]: I0308 04:07:07.762533 18592 generic.go:334] "Generic (PLEG): container finished" podID="7b8c1288-ec32-4821-94d1-4df01560bcb9" containerID="03d0863723d52041a0ea90d675112f24eb39f61de5449673b32e1643c162e991" exitCode=2 Mar 08 04:07:07.762739 master-0 kubenswrapper[18592]: I0308 04:07:07.762609 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c6cff696d-qdnlj" event={"ID":"7b8c1288-ec32-4821-94d1-4df01560bcb9","Type":"ContainerDied","Data":"03d0863723d52041a0ea90d675112f24eb39f61de5449673b32e1643c162e991"} Mar 08 04:07:08.070677 master-0 kubenswrapper[18592]: I0308 04:07:08.069983 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-c6cff696d-qdnlj_7b8c1288-ec32-4821-94d1-4df01560bcb9/console/0.log" Mar 08 04:07:08.070677 master-0 kubenswrapper[18592]: I0308 04:07:08.070131 18592 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-c6cff696d-qdnlj" Mar 08 04:07:08.138527 master-0 kubenswrapper[18592]: I0308 04:07:08.138479 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7b8c1288-ec32-4821-94d1-4df01560bcb9-console-config\") pod \"7b8c1288-ec32-4821-94d1-4df01560bcb9\" (UID: \"7b8c1288-ec32-4821-94d1-4df01560bcb9\") " Mar 08 04:07:08.138711 master-0 kubenswrapper[18592]: I0308 04:07:08.138558 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7b8c1288-ec32-4821-94d1-4df01560bcb9-oauth-serving-cert\") pod \"7b8c1288-ec32-4821-94d1-4df01560bcb9\" (UID: \"7b8c1288-ec32-4821-94d1-4df01560bcb9\") " Mar 08 04:07:08.138711 master-0 kubenswrapper[18592]: I0308 04:07:08.138611 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7b8c1288-ec32-4821-94d1-4df01560bcb9-console-oauth-config\") pod \"7b8c1288-ec32-4821-94d1-4df01560bcb9\" (UID: \"7b8c1288-ec32-4821-94d1-4df01560bcb9\") " Mar 08 04:07:08.138711 master-0 kubenswrapper[18592]: I0308 04:07:08.138631 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqwdd\" (UniqueName: \"kubernetes.io/projected/7b8c1288-ec32-4821-94d1-4df01560bcb9-kube-api-access-dqwdd\") pod \"7b8c1288-ec32-4821-94d1-4df01560bcb9\" (UID: \"7b8c1288-ec32-4821-94d1-4df01560bcb9\") " Mar 08 04:07:08.138711 master-0 kubenswrapper[18592]: I0308 04:07:08.138649 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7b8c1288-ec32-4821-94d1-4df01560bcb9-console-serving-cert\") pod \"7b8c1288-ec32-4821-94d1-4df01560bcb9\" (UID: \"7b8c1288-ec32-4821-94d1-4df01560bcb9\") " Mar 08 04:07:08.138711 master-0 
kubenswrapper[18592]: I0308 04:07:08.138680 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7b8c1288-ec32-4821-94d1-4df01560bcb9-service-ca\") pod \"7b8c1288-ec32-4821-94d1-4df01560bcb9\" (UID: \"7b8c1288-ec32-4821-94d1-4df01560bcb9\") " Mar 08 04:07:08.138711 master-0 kubenswrapper[18592]: I0308 04:07:08.138700 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b8c1288-ec32-4821-94d1-4df01560bcb9-trusted-ca-bundle\") pod \"7b8c1288-ec32-4821-94d1-4df01560bcb9\" (UID: \"7b8c1288-ec32-4821-94d1-4df01560bcb9\") " Mar 08 04:07:08.139051 master-0 kubenswrapper[18592]: I0308 04:07:08.139012 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b8c1288-ec32-4821-94d1-4df01560bcb9-console-config" (OuterVolumeSpecName: "console-config") pod "7b8c1288-ec32-4821-94d1-4df01560bcb9" (UID: "7b8c1288-ec32-4821-94d1-4df01560bcb9"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:07:08.139484 master-0 kubenswrapper[18592]: I0308 04:07:08.139431 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b8c1288-ec32-4821-94d1-4df01560bcb9-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "7b8c1288-ec32-4821-94d1-4df01560bcb9" (UID: "7b8c1288-ec32-4821-94d1-4df01560bcb9"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:07:08.139596 master-0 kubenswrapper[18592]: I0308 04:07:08.139518 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b8c1288-ec32-4821-94d1-4df01560bcb9-service-ca" (OuterVolumeSpecName: "service-ca") pod "7b8c1288-ec32-4821-94d1-4df01560bcb9" (UID: "7b8c1288-ec32-4821-94d1-4df01560bcb9"). 
InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:07:08.140131 master-0 kubenswrapper[18592]: I0308 04:07:08.140043 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b8c1288-ec32-4821-94d1-4df01560bcb9-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "7b8c1288-ec32-4821-94d1-4df01560bcb9" (UID: "7b8c1288-ec32-4821-94d1-4df01560bcb9"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:07:08.141798 master-0 kubenswrapper[18592]: I0308 04:07:08.141749 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b8c1288-ec32-4821-94d1-4df01560bcb9-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "7b8c1288-ec32-4821-94d1-4df01560bcb9" (UID: "7b8c1288-ec32-4821-94d1-4df01560bcb9"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:07:08.142298 master-0 kubenswrapper[18592]: I0308 04:07:08.142239 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b8c1288-ec32-4821-94d1-4df01560bcb9-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "7b8c1288-ec32-4821-94d1-4df01560bcb9" (UID: "7b8c1288-ec32-4821-94d1-4df01560bcb9"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:07:08.143111 master-0 kubenswrapper[18592]: I0308 04:07:08.143072 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b8c1288-ec32-4821-94d1-4df01560bcb9-kube-api-access-dqwdd" (OuterVolumeSpecName: "kube-api-access-dqwdd") pod "7b8c1288-ec32-4821-94d1-4df01560bcb9" (UID: "7b8c1288-ec32-4821-94d1-4df01560bcb9"). InnerVolumeSpecName "kube-api-access-dqwdd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 04:07:08.240639 master-0 kubenswrapper[18592]: I0308 04:07:08.240524 18592 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7b8c1288-ec32-4821-94d1-4df01560bcb9-console-oauth-config\") on node \"master-0\" DevicePath \"\"" Mar 08 04:07:08.240639 master-0 kubenswrapper[18592]: I0308 04:07:08.240573 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqwdd\" (UniqueName: \"kubernetes.io/projected/7b8c1288-ec32-4821-94d1-4df01560bcb9-kube-api-access-dqwdd\") on node \"master-0\" DevicePath \"\"" Mar 08 04:07:08.240639 master-0 kubenswrapper[18592]: I0308 04:07:08.240587 18592 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7b8c1288-ec32-4821-94d1-4df01560bcb9-console-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 08 04:07:08.240639 master-0 kubenswrapper[18592]: I0308 04:07:08.240599 18592 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7b8c1288-ec32-4821-94d1-4df01560bcb9-service-ca\") on node \"master-0\" DevicePath \"\"" Mar 08 04:07:08.240639 master-0 kubenswrapper[18592]: I0308 04:07:08.240613 18592 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b8c1288-ec32-4821-94d1-4df01560bcb9-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 04:07:08.240639 master-0 kubenswrapper[18592]: I0308 04:07:08.240624 18592 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7b8c1288-ec32-4821-94d1-4df01560bcb9-console-config\") on node \"master-0\" DevicePath \"\"" Mar 08 04:07:08.240639 master-0 kubenswrapper[18592]: I0308 04:07:08.240635 18592 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/7b8c1288-ec32-4821-94d1-4df01560bcb9-oauth-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 08 04:07:08.778508 master-0 kubenswrapper[18592]: I0308 04:07:08.778420 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-c6cff696d-qdnlj_7b8c1288-ec32-4821-94d1-4df01560bcb9/console/0.log" Mar 08 04:07:08.779465 master-0 kubenswrapper[18592]: I0308 04:07:08.778591 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c6cff696d-qdnlj" event={"ID":"7b8c1288-ec32-4821-94d1-4df01560bcb9","Type":"ContainerDied","Data":"cd30c8aa9cfbcb87fe2d690a7a0545f9cd61174218e23643e64eec6c80604666"} Mar 08 04:07:08.779465 master-0 kubenswrapper[18592]: I0308 04:07:08.778716 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-c6cff696d-qdnlj" Mar 08 04:07:08.779465 master-0 kubenswrapper[18592]: I0308 04:07:08.778742 18592 scope.go:117] "RemoveContainer" containerID="03d0863723d52041a0ea90d675112f24eb39f61de5449673b32e1643c162e991" Mar 08 04:07:08.820747 master-0 kubenswrapper[18592]: I0308 04:07:08.820683 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-c6cff696d-qdnlj"] Mar 08 04:07:08.831862 master-0 kubenswrapper[18592]: I0308 04:07:08.831787 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-c6cff696d-qdnlj"] Mar 08 04:07:10.158634 master-0 kubenswrapper[18592]: I0308 04:07:10.158516 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b8c1288-ec32-4821-94d1-4df01560bcb9" path="/var/lib/kubelet/pods/7b8c1288-ec32-4821-94d1-4df01560bcb9/volumes" Mar 08 04:07:16.779460 master-0 kubenswrapper[18592]: I0308 04:07:16.779316 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-68d695c84c-t9xsb" podUID="d1eeeb8d-1358-45f7-aa42-1c71e290ea30" containerName="console" 
containerID="cri-o://8f7c4f157c47cc7d92f96cbce7b4a38de130a47685e91981345bb1f064ada373" gracePeriod=15 Mar 08 04:07:17.321594 master-0 kubenswrapper[18592]: I0308 04:07:17.321421 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-68d695c84c-t9xsb_d1eeeb8d-1358-45f7-aa42-1c71e290ea30/console/0.log" Mar 08 04:07:17.321594 master-0 kubenswrapper[18592]: I0308 04:07:17.321581 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-68d695c84c-t9xsb" Mar 08 04:07:17.400075 master-0 kubenswrapper[18592]: I0308 04:07:17.400011 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4bzv7\" (UniqueName: \"kubernetes.io/projected/d1eeeb8d-1358-45f7-aa42-1c71e290ea30-kube-api-access-4bzv7\") pod \"d1eeeb8d-1358-45f7-aa42-1c71e290ea30\" (UID: \"d1eeeb8d-1358-45f7-aa42-1c71e290ea30\") " Mar 08 04:07:17.400267 master-0 kubenswrapper[18592]: I0308 04:07:17.400089 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d1eeeb8d-1358-45f7-aa42-1c71e290ea30-console-serving-cert\") pod \"d1eeeb8d-1358-45f7-aa42-1c71e290ea30\" (UID: \"d1eeeb8d-1358-45f7-aa42-1c71e290ea30\") " Mar 08 04:07:17.400267 master-0 kubenswrapper[18592]: I0308 04:07:17.400136 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d1eeeb8d-1358-45f7-aa42-1c71e290ea30-console-oauth-config\") pod \"d1eeeb8d-1358-45f7-aa42-1c71e290ea30\" (UID: \"d1eeeb8d-1358-45f7-aa42-1c71e290ea30\") " Mar 08 04:07:17.400267 master-0 kubenswrapper[18592]: I0308 04:07:17.400190 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d1eeeb8d-1358-45f7-aa42-1c71e290ea30-oauth-serving-cert\") pod 
\"d1eeeb8d-1358-45f7-aa42-1c71e290ea30\" (UID: \"d1eeeb8d-1358-45f7-aa42-1c71e290ea30\") " Mar 08 04:07:17.401146 master-0 kubenswrapper[18592]: I0308 04:07:17.400994 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1eeeb8d-1358-45f7-aa42-1c71e290ea30-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "d1eeeb8d-1358-45f7-aa42-1c71e290ea30" (UID: "d1eeeb8d-1358-45f7-aa42-1c71e290ea30"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:07:17.401146 master-0 kubenswrapper[18592]: I0308 04:07:17.401133 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d1eeeb8d-1358-45f7-aa42-1c71e290ea30-service-ca\") pod \"d1eeeb8d-1358-45f7-aa42-1c71e290ea30\" (UID: \"d1eeeb8d-1358-45f7-aa42-1c71e290ea30\") " Mar 08 04:07:17.401405 master-0 kubenswrapper[18592]: I0308 04:07:17.401263 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d1eeeb8d-1358-45f7-aa42-1c71e290ea30-console-config\") pod \"d1eeeb8d-1358-45f7-aa42-1c71e290ea30\" (UID: \"d1eeeb8d-1358-45f7-aa42-1c71e290ea30\") " Mar 08 04:07:17.401405 master-0 kubenswrapper[18592]: I0308 04:07:17.401343 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d1eeeb8d-1358-45f7-aa42-1c71e290ea30-trusted-ca-bundle\") pod \"d1eeeb8d-1358-45f7-aa42-1c71e290ea30\" (UID: \"d1eeeb8d-1358-45f7-aa42-1c71e290ea30\") " Mar 08 04:07:17.401647 master-0 kubenswrapper[18592]: I0308 04:07:17.401610 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1eeeb8d-1358-45f7-aa42-1c71e290ea30-console-config" (OuterVolumeSpecName: "console-config") pod "d1eeeb8d-1358-45f7-aa42-1c71e290ea30" (UID: 
"d1eeeb8d-1358-45f7-aa42-1c71e290ea30"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:07:17.401736 master-0 kubenswrapper[18592]: I0308 04:07:17.401702 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1eeeb8d-1358-45f7-aa42-1c71e290ea30-service-ca" (OuterVolumeSpecName: "service-ca") pod "d1eeeb8d-1358-45f7-aa42-1c71e290ea30" (UID: "d1eeeb8d-1358-45f7-aa42-1c71e290ea30"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:07:17.402089 master-0 kubenswrapper[18592]: I0308 04:07:17.402033 18592 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d1eeeb8d-1358-45f7-aa42-1c71e290ea30-oauth-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 08 04:07:17.402089 master-0 kubenswrapper[18592]: I0308 04:07:17.402064 18592 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d1eeeb8d-1358-45f7-aa42-1c71e290ea30-service-ca\") on node \"master-0\" DevicePath \"\"" Mar 08 04:07:17.402089 master-0 kubenswrapper[18592]: I0308 04:07:17.402078 18592 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d1eeeb8d-1358-45f7-aa42-1c71e290ea30-console-config\") on node \"master-0\" DevicePath \"\"" Mar 08 04:07:17.402394 master-0 kubenswrapper[18592]: I0308 04:07:17.402086 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1eeeb8d-1358-45f7-aa42-1c71e290ea30-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "d1eeeb8d-1358-45f7-aa42-1c71e290ea30" (UID: "d1eeeb8d-1358-45f7-aa42-1c71e290ea30"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:07:17.403697 master-0 kubenswrapper[18592]: I0308 04:07:17.403624 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1eeeb8d-1358-45f7-aa42-1c71e290ea30-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "d1eeeb8d-1358-45f7-aa42-1c71e290ea30" (UID: "d1eeeb8d-1358-45f7-aa42-1c71e290ea30"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:07:17.406075 master-0 kubenswrapper[18592]: I0308 04:07:17.405970 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1eeeb8d-1358-45f7-aa42-1c71e290ea30-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "d1eeeb8d-1358-45f7-aa42-1c71e290ea30" (UID: "d1eeeb8d-1358-45f7-aa42-1c71e290ea30"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:07:17.407007 master-0 kubenswrapper[18592]: I0308 04:07:17.406889 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1eeeb8d-1358-45f7-aa42-1c71e290ea30-kube-api-access-4bzv7" (OuterVolumeSpecName: "kube-api-access-4bzv7") pod "d1eeeb8d-1358-45f7-aa42-1c71e290ea30" (UID: "d1eeeb8d-1358-45f7-aa42-1c71e290ea30"). InnerVolumeSpecName "kube-api-access-4bzv7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 04:07:17.503731 master-0 kubenswrapper[18592]: I0308 04:07:17.503647 18592 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d1eeeb8d-1358-45f7-aa42-1c71e290ea30-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 04:07:17.503731 master-0 kubenswrapper[18592]: I0308 04:07:17.503711 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4bzv7\" (UniqueName: \"kubernetes.io/projected/d1eeeb8d-1358-45f7-aa42-1c71e290ea30-kube-api-access-4bzv7\") on node \"master-0\" DevicePath \"\"" Mar 08 04:07:17.503731 master-0 kubenswrapper[18592]: I0308 04:07:17.503733 18592 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d1eeeb8d-1358-45f7-aa42-1c71e290ea30-console-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 08 04:07:17.504525 master-0 kubenswrapper[18592]: I0308 04:07:17.503753 18592 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d1eeeb8d-1358-45f7-aa42-1c71e290ea30-console-oauth-config\") on node \"master-0\" DevicePath \"\"" Mar 08 04:07:17.864687 master-0 kubenswrapper[18592]: I0308 04:07:17.864581 18592 generic.go:334] "Generic (PLEG): container finished" podID="d1eeeb8d-1358-45f7-aa42-1c71e290ea30" containerID="8f7c4f157c47cc7d92f96cbce7b4a38de130a47685e91981345bb1f064ada373" exitCode=2 Mar 08 04:07:17.864687 master-0 kubenswrapper[18592]: I0308 04:07:17.864657 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68d695c84c-t9xsb" event={"ID":"d1eeeb8d-1358-45f7-aa42-1c71e290ea30","Type":"ContainerDied","Data":"8f7c4f157c47cc7d92f96cbce7b4a38de130a47685e91981345bb1f064ada373"} Mar 08 04:07:17.865740 master-0 kubenswrapper[18592]: I0308 04:07:17.864717 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-68d695c84c-t9xsb" event={"ID":"d1eeeb8d-1358-45f7-aa42-1c71e290ea30","Type":"ContainerDied","Data":"a9fff1f228c48371879736616861565f35b55c5102018462054f4dd4df0e3c84"} Mar 08 04:07:17.865740 master-0 kubenswrapper[18592]: I0308 04:07:17.864771 18592 scope.go:117] "RemoveContainer" containerID="8f7c4f157c47cc7d92f96cbce7b4a38de130a47685e91981345bb1f064ada373" Mar 08 04:07:17.865740 master-0 kubenswrapper[18592]: I0308 04:07:17.864884 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-68d695c84c-t9xsb" Mar 08 04:07:17.894198 master-0 kubenswrapper[18592]: I0308 04:07:17.894138 18592 scope.go:117] "RemoveContainer" containerID="8f7c4f157c47cc7d92f96cbce7b4a38de130a47685e91981345bb1f064ada373" Mar 08 04:07:17.896216 master-0 kubenswrapper[18592]: E0308 04:07:17.894802 18592 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f7c4f157c47cc7d92f96cbce7b4a38de130a47685e91981345bb1f064ada373\": container with ID starting with 8f7c4f157c47cc7d92f96cbce7b4a38de130a47685e91981345bb1f064ada373 not found: ID does not exist" containerID="8f7c4f157c47cc7d92f96cbce7b4a38de130a47685e91981345bb1f064ada373" Mar 08 04:07:17.896216 master-0 kubenswrapper[18592]: I0308 04:07:17.894855 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f7c4f157c47cc7d92f96cbce7b4a38de130a47685e91981345bb1f064ada373"} err="failed to get container status \"8f7c4f157c47cc7d92f96cbce7b4a38de130a47685e91981345bb1f064ada373\": rpc error: code = NotFound desc = could not find container \"8f7c4f157c47cc7d92f96cbce7b4a38de130a47685e91981345bb1f064ada373\": container with ID starting with 8f7c4f157c47cc7d92f96cbce7b4a38de130a47685e91981345bb1f064ada373 not found: ID does not exist" Mar 08 04:07:17.937320 master-0 kubenswrapper[18592]: I0308 04:07:17.937232 18592 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-console/console-68d695c84c-t9xsb"] Mar 08 04:07:17.947394 master-0 kubenswrapper[18592]: I0308 04:07:17.947303 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-68d695c84c-t9xsb"] Mar 08 04:07:18.157511 master-0 kubenswrapper[18592]: I0308 04:07:18.157418 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1eeeb8d-1358-45f7-aa42-1c71e290ea30" path="/var/lib/kubelet/pods/d1eeeb8d-1358-45f7-aa42-1c71e290ea30/volumes" Mar 08 04:07:20.158721 master-0 kubenswrapper[18592]: I0308 04:07:20.158650 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["sushy-emulator/sushy-emulator-78f6d7d749-hbj9h"] Mar 08 04:07:20.159648 master-0 kubenswrapper[18592]: E0308 04:07:20.159074 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b8c1288-ec32-4821-94d1-4df01560bcb9" containerName="console" Mar 08 04:07:20.159648 master-0 kubenswrapper[18592]: I0308 04:07:20.159097 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b8c1288-ec32-4821-94d1-4df01560bcb9" containerName="console" Mar 08 04:07:20.159648 master-0 kubenswrapper[18592]: E0308 04:07:20.159131 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bc0469d-1ae9-4606-ba99-7a333b66af37" containerName="console" Mar 08 04:07:20.159648 master-0 kubenswrapper[18592]: I0308 04:07:20.159145 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bc0469d-1ae9-4606-ba99-7a333b66af37" containerName="console" Mar 08 04:07:20.159648 master-0 kubenswrapper[18592]: E0308 04:07:20.159182 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bc0469d-1ae9-4606-ba99-7a333b66af37" containerName="console" Mar 08 04:07:20.159648 master-0 kubenswrapper[18592]: I0308 04:07:20.159196 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bc0469d-1ae9-4606-ba99-7a333b66af37" containerName="console" Mar 08 04:07:20.159648 master-0 kubenswrapper[18592]: E0308 04:07:20.159213 
18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1eeeb8d-1358-45f7-aa42-1c71e290ea30" containerName="console" Mar 08 04:07:20.159648 master-0 kubenswrapper[18592]: I0308 04:07:20.159227 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1eeeb8d-1358-45f7-aa42-1c71e290ea30" containerName="console" Mar 08 04:07:20.159648 master-0 kubenswrapper[18592]: I0308 04:07:20.159487 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bc0469d-1ae9-4606-ba99-7a333b66af37" containerName="console" Mar 08 04:07:20.159648 master-0 kubenswrapper[18592]: I0308 04:07:20.159512 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1eeeb8d-1358-45f7-aa42-1c71e290ea30" containerName="console" Mar 08 04:07:20.159648 master-0 kubenswrapper[18592]: I0308 04:07:20.159554 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b8c1288-ec32-4821-94d1-4df01560bcb9" containerName="console" Mar 08 04:07:20.159648 master-0 kubenswrapper[18592]: I0308 04:07:20.159579 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bc0469d-1ae9-4606-ba99-7a333b66af37" containerName="console" Mar 08 04:07:20.160507 master-0 kubenswrapper[18592]: I0308 04:07:20.160324 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="sushy-emulator/sushy-emulator-78f6d7d749-hbj9h" Mar 08 04:07:20.162482 master-0 kubenswrapper[18592]: I0308 04:07:20.162437 18592 reflector.go:368] Caches populated for *v1.Secret from object-"sushy-emulator"/"os-client-config" Mar 08 04:07:20.164441 master-0 kubenswrapper[18592]: I0308 04:07:20.164392 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/sushy-emulator-78f6d7d749-hbj9h"] Mar 08 04:07:20.166454 master-0 kubenswrapper[18592]: I0308 04:07:20.166384 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"sushy-emulator"/"openshift-service-ca.crt" Mar 08 04:07:20.166623 master-0 kubenswrapper[18592]: I0308 04:07:20.166556 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"sushy-emulator"/"kube-root-ca.crt" Mar 08 04:07:20.195607 master-0 kubenswrapper[18592]: I0308 04:07:20.195542 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"sushy-emulator"/"sushy-emulator-config" Mar 08 04:07:20.246669 master-0 kubenswrapper[18592]: I0308 04:07:20.246571 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/5a03f3b9-7c27-4c2d-8a8e-c342aeb10529-sushy-emulator-config\") pod \"sushy-emulator-78f6d7d749-hbj9h\" (UID: \"5a03f3b9-7c27-4c2d-8a8e-c342aeb10529\") " pod="sushy-emulator/sushy-emulator-78f6d7d749-hbj9h" Mar 08 04:07:20.247021 master-0 kubenswrapper[18592]: I0308 04:07:20.246774 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/5a03f3b9-7c27-4c2d-8a8e-c342aeb10529-os-client-config\") pod \"sushy-emulator-78f6d7d749-hbj9h\" (UID: \"5a03f3b9-7c27-4c2d-8a8e-c342aeb10529\") " pod="sushy-emulator/sushy-emulator-78f6d7d749-hbj9h" Mar 08 04:07:20.247182 master-0 kubenswrapper[18592]: I0308 04:07:20.247073 18592 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-656fq\" (UniqueName: \"kubernetes.io/projected/5a03f3b9-7c27-4c2d-8a8e-c342aeb10529-kube-api-access-656fq\") pod \"sushy-emulator-78f6d7d749-hbj9h\" (UID: \"5a03f3b9-7c27-4c2d-8a8e-c342aeb10529\") " pod="sushy-emulator/sushy-emulator-78f6d7d749-hbj9h" Mar 08 04:07:20.349138 master-0 kubenswrapper[18592]: I0308 04:07:20.349031 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/5a03f3b9-7c27-4c2d-8a8e-c342aeb10529-os-client-config\") pod \"sushy-emulator-78f6d7d749-hbj9h\" (UID: \"5a03f3b9-7c27-4c2d-8a8e-c342aeb10529\") " pod="sushy-emulator/sushy-emulator-78f6d7d749-hbj9h" Mar 08 04:07:20.349138 master-0 kubenswrapper[18592]: I0308 04:07:20.349124 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-656fq\" (UniqueName: \"kubernetes.io/projected/5a03f3b9-7c27-4c2d-8a8e-c342aeb10529-kube-api-access-656fq\") pod \"sushy-emulator-78f6d7d749-hbj9h\" (UID: \"5a03f3b9-7c27-4c2d-8a8e-c342aeb10529\") " pod="sushy-emulator/sushy-emulator-78f6d7d749-hbj9h" Mar 08 04:07:20.349484 master-0 kubenswrapper[18592]: I0308 04:07:20.349395 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/5a03f3b9-7c27-4c2d-8a8e-c342aeb10529-sushy-emulator-config\") pod \"sushy-emulator-78f6d7d749-hbj9h\" (UID: \"5a03f3b9-7c27-4c2d-8a8e-c342aeb10529\") " pod="sushy-emulator/sushy-emulator-78f6d7d749-hbj9h" Mar 08 04:07:20.350348 master-0 kubenswrapper[18592]: I0308 04:07:20.350308 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/5a03f3b9-7c27-4c2d-8a8e-c342aeb10529-sushy-emulator-config\") pod \"sushy-emulator-78f6d7d749-hbj9h\" (UID: \"5a03f3b9-7c27-4c2d-8a8e-c342aeb10529\") " 
pod="sushy-emulator/sushy-emulator-78f6d7d749-hbj9h" Mar 08 04:07:20.353884 master-0 kubenswrapper[18592]: I0308 04:07:20.353806 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/5a03f3b9-7c27-4c2d-8a8e-c342aeb10529-os-client-config\") pod \"sushy-emulator-78f6d7d749-hbj9h\" (UID: \"5a03f3b9-7c27-4c2d-8a8e-c342aeb10529\") " pod="sushy-emulator/sushy-emulator-78f6d7d749-hbj9h" Mar 08 04:07:20.378729 master-0 kubenswrapper[18592]: I0308 04:07:20.378677 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-656fq\" (UniqueName: \"kubernetes.io/projected/5a03f3b9-7c27-4c2d-8a8e-c342aeb10529-kube-api-access-656fq\") pod \"sushy-emulator-78f6d7d749-hbj9h\" (UID: \"5a03f3b9-7c27-4c2d-8a8e-c342aeb10529\") " pod="sushy-emulator/sushy-emulator-78f6d7d749-hbj9h" Mar 08 04:07:20.528688 master-0 kubenswrapper[18592]: I0308 04:07:20.528593 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="sushy-emulator/sushy-emulator-78f6d7d749-hbj9h"
Mar 08 04:07:21.053064 master-0 kubenswrapper[18592]: W0308 04:07:21.053006 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a03f3b9_7c27_4c2d_8a8e_c342aeb10529.slice/crio-169ba855e99c242c62ce625cd99ee1ae8b7ddb2407177178462abe5503ad814c WatchSource:0}: Error finding container 169ba855e99c242c62ce625cd99ee1ae8b7ddb2407177178462abe5503ad814c: Status 404 returned error can't find the container with id 169ba855e99c242c62ce625cd99ee1ae8b7ddb2407177178462abe5503ad814c
Mar 08 04:07:21.062542 master-0 kubenswrapper[18592]: I0308 04:07:21.061891 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/sushy-emulator-78f6d7d749-hbj9h"]
Mar 08 04:07:21.920875 master-0 kubenswrapper[18592]: I0308 04:07:21.920807 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/sushy-emulator-78f6d7d749-hbj9h" event={"ID":"5a03f3b9-7c27-4c2d-8a8e-c342aeb10529","Type":"ContainerStarted","Data":"169ba855e99c242c62ce625cd99ee1ae8b7ddb2407177178462abe5503ad814c"}
Mar 08 04:07:27.972186 master-0 kubenswrapper[18592]: I0308 04:07:27.972035 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/sushy-emulator-78f6d7d749-hbj9h" event={"ID":"5a03f3b9-7c27-4c2d-8a8e-c342aeb10529","Type":"ContainerStarted","Data":"39a7d69d7fa5fbc824daf31104d7fa6c79f37c222ffbf496e54783c529e2ef3f"}
Mar 08 04:07:27.996242 master-0 kubenswrapper[18592]: I0308 04:07:27.996125 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="sushy-emulator/sushy-emulator-78f6d7d749-hbj9h" podStartSLOduration=1.857179506 podStartE2EDuration="7.996107349s" podCreationTimestamp="2026-03-08 04:07:20 +0000 UTC" firstStartedPulling="2026-03-08 04:07:21.055379572 +0000 UTC m=+853.154133942" lastFinishedPulling="2026-03-08 04:07:27.194307415 +0000 UTC m=+859.293061785" observedRunningTime="2026-03-08 04:07:27.991931178 +0000 UTC m=+860.090685578" watchObservedRunningTime="2026-03-08 04:07:27.996107349 +0000 UTC m=+860.094861699"
Mar 08 04:07:30.529505 master-0 kubenswrapper[18592]: I0308 04:07:30.529414 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="sushy-emulator/sushy-emulator-78f6d7d749-hbj9h"
Mar 08 04:07:30.529505 master-0 kubenswrapper[18592]: I0308 04:07:30.529505 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="sushy-emulator/sushy-emulator-78f6d7d749-hbj9h"
Mar 08 04:07:30.545047 master-0 kubenswrapper[18592]: I0308 04:07:30.544983 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="sushy-emulator/sushy-emulator-78f6d7d749-hbj9h"
Mar 08 04:07:31.002781 master-0 kubenswrapper[18592]: I0308 04:07:31.002721 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="sushy-emulator/sushy-emulator-78f6d7d749-hbj9h"
Mar 08 04:07:50.342394 master-0 kubenswrapper[18592]: I0308 04:07:50.342311 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["sushy-emulator/nova-console-poller-78b87db7b5-2xl28"]
Mar 08 04:07:50.343689 master-0 kubenswrapper[18592]: I0308 04:07:50.343305 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="sushy-emulator/nova-console-poller-78b87db7b5-2xl28"
Mar 08 04:07:50.366916 master-0 kubenswrapper[18592]: I0308 04:07:50.366777 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/nova-console-poller-78b87db7b5-2xl28"]
Mar 08 04:07:50.384350 master-0 kubenswrapper[18592]: I0308 04:07:50.384267 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x66f9\" (UniqueName: \"kubernetes.io/projected/df9dcf19-8fda-4dae-9efb-26c71871772c-kube-api-access-x66f9\") pod \"nova-console-poller-78b87db7b5-2xl28\" (UID: \"df9dcf19-8fda-4dae-9efb-26c71871772c\") " pod="sushy-emulator/nova-console-poller-78b87db7b5-2xl28"
Mar 08 04:07:50.386057 master-0 kubenswrapper[18592]: I0308 04:07:50.386015 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/df9dcf19-8fda-4dae-9efb-26c71871772c-os-client-config\") pod \"nova-console-poller-78b87db7b5-2xl28\" (UID: \"df9dcf19-8fda-4dae-9efb-26c71871772c\") " pod="sushy-emulator/nova-console-poller-78b87db7b5-2xl28"
Mar 08 04:07:50.488186 master-0 kubenswrapper[18592]: I0308 04:07:50.488096 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/df9dcf19-8fda-4dae-9efb-26c71871772c-os-client-config\") pod \"nova-console-poller-78b87db7b5-2xl28\" (UID: \"df9dcf19-8fda-4dae-9efb-26c71871772c\") " pod="sushy-emulator/nova-console-poller-78b87db7b5-2xl28"
Mar 08 04:07:50.488186 master-0 kubenswrapper[18592]: I0308 04:07:50.488190 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x66f9\" (UniqueName: \"kubernetes.io/projected/df9dcf19-8fda-4dae-9efb-26c71871772c-kube-api-access-x66f9\") pod \"nova-console-poller-78b87db7b5-2xl28\" (UID: \"df9dcf19-8fda-4dae-9efb-26c71871772c\") " pod="sushy-emulator/nova-console-poller-78b87db7b5-2xl28"
Mar 08 04:07:50.492670 master-0 kubenswrapper[18592]: I0308 04:07:50.492604 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/df9dcf19-8fda-4dae-9efb-26c71871772c-os-client-config\") pod \"nova-console-poller-78b87db7b5-2xl28\" (UID: \"df9dcf19-8fda-4dae-9efb-26c71871772c\") " pod="sushy-emulator/nova-console-poller-78b87db7b5-2xl28"
Mar 08 04:07:50.517086 master-0 kubenswrapper[18592]: I0308 04:07:50.517012 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x66f9\" (UniqueName: \"kubernetes.io/projected/df9dcf19-8fda-4dae-9efb-26c71871772c-kube-api-access-x66f9\") pod \"nova-console-poller-78b87db7b5-2xl28\" (UID: \"df9dcf19-8fda-4dae-9efb-26c71871772c\") " pod="sushy-emulator/nova-console-poller-78b87db7b5-2xl28"
Mar 08 04:07:50.673858 master-0 kubenswrapper[18592]: I0308 04:07:50.673623 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="sushy-emulator/nova-console-poller-78b87db7b5-2xl28"
Mar 08 04:07:51.157783 master-0 kubenswrapper[18592]: I0308 04:07:51.157682 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/nova-console-poller-78b87db7b5-2xl28"]
Mar 08 04:07:51.162115 master-0 kubenswrapper[18592]: W0308 04:07:51.162057 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf9dcf19_8fda_4dae_9efb_26c71871772c.slice/crio-e50335c8387f9625daedcb1588581a0d4f2c0a178de72252be9251c393a8ae34 WatchSource:0}: Error finding container e50335c8387f9625daedcb1588581a0d4f2c0a178de72252be9251c393a8ae34: Status 404 returned error can't find the container with id e50335c8387f9625daedcb1588581a0d4f2c0a178de72252be9251c393a8ae34
Mar 08 04:07:51.180517 master-0 kubenswrapper[18592]: I0308 04:07:51.180453 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/nova-console-poller-78b87db7b5-2xl28" event={"ID":"df9dcf19-8fda-4dae-9efb-26c71871772c","Type":"ContainerStarted","Data":"e50335c8387f9625daedcb1588581a0d4f2c0a178de72252be9251c393a8ae34"}
Mar 08 04:07:57.233795 master-0 kubenswrapper[18592]: I0308 04:07:57.233685 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/nova-console-poller-78b87db7b5-2xl28" event={"ID":"df9dcf19-8fda-4dae-9efb-26c71871772c","Type":"ContainerStarted","Data":"1ea19a07fe974083c17747a4af06f02ac71dca99730d9fc1fa7b851c64ea3fff"}
Mar 08 04:07:57.271453 master-0 kubenswrapper[18592]: I0308 04:07:57.271334 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="sushy-emulator/nova-console-poller-78b87db7b5-2xl28" podStartSLOduration=1.6220608749999998 podStartE2EDuration="7.271305565s" podCreationTimestamp="2026-03-08 04:07:50 +0000 UTC" firstStartedPulling="2026-03-08 04:07:51.164911679 +0000 UTC m=+883.263666029" lastFinishedPulling="2026-03-08 04:07:56.814156349 +0000 UTC m=+888.912910719" observedRunningTime="2026-03-08 04:07:57.256587283 +0000 UTC m=+889.355341663" watchObservedRunningTime="2026-03-08 04:07:57.271305565 +0000 UTC m=+889.370059955"
Mar 08 04:08:22.246637 master-0 kubenswrapper[18592]: I0308 04:08:22.246546 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["sushy-emulator/nova-console-recorder-7b686c4b57-4r4fk"]
Mar 08 04:08:22.249571 master-0 kubenswrapper[18592]: I0308 04:08:22.249528 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="sushy-emulator/nova-console-recorder-7b686c4b57-4r4fk"
Mar 08 04:08:22.278498 master-0 kubenswrapper[18592]: I0308 04:08:22.278437 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/d8214aaa-d97f-4497-bd73-062a1d550d99-os-client-config\") pod \"nova-console-recorder-7b686c4b57-4r4fk\" (UID: \"d8214aaa-d97f-4497-bd73-062a1d550d99\") " pod="sushy-emulator/nova-console-recorder-7b686c4b57-4r4fk"
Mar 08 04:08:22.278802 master-0 kubenswrapper[18592]: I0308 04:08:22.278554 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-console-recordings-pv\" (UniqueName: \"kubernetes.io/nfs/d8214aaa-d97f-4497-bd73-062a1d550d99-nova-console-recordings-pv\") pod \"nova-console-recorder-7b686c4b57-4r4fk\" (UID: \"d8214aaa-d97f-4497-bd73-062a1d550d99\") " pod="sushy-emulator/nova-console-recorder-7b686c4b57-4r4fk"
Mar 08 04:08:22.278802 master-0 kubenswrapper[18592]: I0308 04:08:22.278587 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fr8wj\" (UniqueName: \"kubernetes.io/projected/d8214aaa-d97f-4497-bd73-062a1d550d99-kube-api-access-fr8wj\") pod \"nova-console-recorder-7b686c4b57-4r4fk\" (UID: \"d8214aaa-d97f-4497-bd73-062a1d550d99\") " pod="sushy-emulator/nova-console-recorder-7b686c4b57-4r4fk"
Mar 08 04:08:22.293418 master-0 kubenswrapper[18592]: I0308 04:08:22.293345 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/nova-console-recorder-7b686c4b57-4r4fk"]
Mar 08 04:08:22.379348 master-0 kubenswrapper[18592]: I0308 04:08:22.379272 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/d8214aaa-d97f-4497-bd73-062a1d550d99-os-client-config\") pod \"nova-console-recorder-7b686c4b57-4r4fk\" (UID: \"d8214aaa-d97f-4497-bd73-062a1d550d99\") " pod="sushy-emulator/nova-console-recorder-7b686c4b57-4r4fk"
Mar 08 04:08:22.379687 master-0 kubenswrapper[18592]: I0308 04:08:22.379389 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-console-recordings-pv\" (UniqueName: \"kubernetes.io/nfs/d8214aaa-d97f-4497-bd73-062a1d550d99-nova-console-recordings-pv\") pod \"nova-console-recorder-7b686c4b57-4r4fk\" (UID: \"d8214aaa-d97f-4497-bd73-062a1d550d99\") " pod="sushy-emulator/nova-console-recorder-7b686c4b57-4r4fk"
Mar 08 04:08:22.379687 master-0 kubenswrapper[18592]: I0308 04:08:22.379420 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fr8wj\" (UniqueName: \"kubernetes.io/projected/d8214aaa-d97f-4497-bd73-062a1d550d99-kube-api-access-fr8wj\") pod \"nova-console-recorder-7b686c4b57-4r4fk\" (UID: \"d8214aaa-d97f-4497-bd73-062a1d550d99\") " pod="sushy-emulator/nova-console-recorder-7b686c4b57-4r4fk"
Mar 08 04:08:22.385762 master-0 kubenswrapper[18592]: I0308 04:08:22.385681 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/d8214aaa-d97f-4497-bd73-062a1d550d99-os-client-config\") pod \"nova-console-recorder-7b686c4b57-4r4fk\" (UID: \"d8214aaa-d97f-4497-bd73-062a1d550d99\") " pod="sushy-emulator/nova-console-recorder-7b686c4b57-4r4fk"
Mar 08 04:08:22.395861 master-0 kubenswrapper[18592]: I0308 04:08:22.395730 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fr8wj\" (UniqueName: \"kubernetes.io/projected/d8214aaa-d97f-4497-bd73-062a1d550d99-kube-api-access-fr8wj\") pod \"nova-console-recorder-7b686c4b57-4r4fk\" (UID: \"d8214aaa-d97f-4497-bd73-062a1d550d99\") " pod="sushy-emulator/nova-console-recorder-7b686c4b57-4r4fk"
Mar 08 04:08:23.030694 master-0 kubenswrapper[18592]: I0308 04:08:23.030619 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-console-recordings-pv\" (UniqueName: \"kubernetes.io/nfs/d8214aaa-d97f-4497-bd73-062a1d550d99-nova-console-recordings-pv\") pod \"nova-console-recorder-7b686c4b57-4r4fk\" (UID: \"d8214aaa-d97f-4497-bd73-062a1d550d99\") " pod="sushy-emulator/nova-console-recorder-7b686c4b57-4r4fk"
Mar 08 04:08:23.172298 master-0 kubenswrapper[18592]: I0308 04:08:23.172212 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="sushy-emulator/nova-console-recorder-7b686c4b57-4r4fk"
Mar 08 04:08:23.690729 master-0 kubenswrapper[18592]: I0308 04:08:23.690668 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/nova-console-recorder-7b686c4b57-4r4fk"]
Mar 08 04:08:23.691621 master-0 kubenswrapper[18592]: W0308 04:08:23.691511 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8214aaa_d97f_4497_bd73_062a1d550d99.slice/crio-3eb5d2c41761ff7a5d93fc50ac9aa38b99a8fcd801fb5fbc5b076b4ed2bdbca6 WatchSource:0}: Error finding container 3eb5d2c41761ff7a5d93fc50ac9aa38b99a8fcd801fb5fbc5b076b4ed2bdbca6: Status 404 returned error can't find the container with id 3eb5d2c41761ff7a5d93fc50ac9aa38b99a8fcd801fb5fbc5b076b4ed2bdbca6
Mar 08 04:08:24.487488 master-0 kubenswrapper[18592]: I0308 04:08:24.487399 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/nova-console-recorder-7b686c4b57-4r4fk" event={"ID":"d8214aaa-d97f-4497-bd73-062a1d550d99","Type":"ContainerStarted","Data":"3eb5d2c41761ff7a5d93fc50ac9aa38b99a8fcd801fb5fbc5b076b4ed2bdbca6"}
Mar 08 04:08:32.557283 master-0 kubenswrapper[18592]: I0308 04:08:32.557237 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/nova-console-recorder-7b686c4b57-4r4fk" event={"ID":"d8214aaa-d97f-4497-bd73-062a1d550d99","Type":"ContainerStarted","Data":"a0195d639363955ae60e3905520c6de97217a9e7450d48b12b10988a6ec6d7a4"}
Mar 08 04:08:32.580881 master-0 kubenswrapper[18592]: I0308 04:08:32.580745 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="sushy-emulator/nova-console-recorder-7b686c4b57-4r4fk" podStartSLOduration=2.231320137 podStartE2EDuration="10.580722578s" podCreationTimestamp="2026-03-08 04:08:22 +0000 UTC" firstStartedPulling="2026-03-08 04:08:23.694263433 +0000 UTC m=+915.793017813" lastFinishedPulling="2026-03-08 04:08:32.043665904 +0000 UTC m=+924.142420254" observedRunningTime="2026-03-08 04:08:32.579965448 +0000 UTC m=+924.678719798" watchObservedRunningTime="2026-03-08 04:08:32.580722578 +0000 UTC m=+924.679476938"
Mar 08 04:10:16.110150 master-0 kubenswrapper[18592]: I0308 04:10:16.110070 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4cll47"]
Mar 08 04:10:16.112744 master-0 kubenswrapper[18592]: I0308 04:10:16.112515 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4cll47"
Mar 08 04:10:16.131918 master-0 kubenswrapper[18592]: I0308 04:10:16.131852 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4cll47"]
Mar 08 04:10:16.204326 master-0 kubenswrapper[18592]: I0308 04:10:16.204198 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmqnd\" (UniqueName: \"kubernetes.io/projected/34dc6165-f86b-443d-a226-1f0c774ea92c-kube-api-access-dmqnd\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4cll47\" (UID: \"34dc6165-f86b-443d-a226-1f0c774ea92c\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4cll47"
Mar 08 04:10:16.204626 master-0 kubenswrapper[18592]: I0308 04:10:16.204464 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/34dc6165-f86b-443d-a226-1f0c774ea92c-util\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4cll47\" (UID: \"34dc6165-f86b-443d-a226-1f0c774ea92c\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4cll47"
Mar 08 04:10:16.204798 master-0 kubenswrapper[18592]: I0308 04:10:16.204755 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/34dc6165-f86b-443d-a226-1f0c774ea92c-bundle\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4cll47\" (UID: \"34dc6165-f86b-443d-a226-1f0c774ea92c\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4cll47"
Mar 08 04:10:16.307444 master-0 kubenswrapper[18592]: I0308 04:10:16.307370 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmqnd\" (UniqueName: \"kubernetes.io/projected/34dc6165-f86b-443d-a226-1f0c774ea92c-kube-api-access-dmqnd\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4cll47\" (UID: \"34dc6165-f86b-443d-a226-1f0c774ea92c\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4cll47"
Mar 08 04:10:16.307681 master-0 kubenswrapper[18592]: I0308 04:10:16.307543 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/34dc6165-f86b-443d-a226-1f0c774ea92c-util\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4cll47\" (UID: \"34dc6165-f86b-443d-a226-1f0c774ea92c\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4cll47"
Mar 08 04:10:16.307681 master-0 kubenswrapper[18592]: I0308 04:10:16.307641 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/34dc6165-f86b-443d-a226-1f0c774ea92c-bundle\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4cll47\" (UID: \"34dc6165-f86b-443d-a226-1f0c774ea92c\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4cll47"
Mar 08 04:10:16.308630 master-0 kubenswrapper[18592]: I0308 04:10:16.308596 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/34dc6165-f86b-443d-a226-1f0c774ea92c-util\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4cll47\" (UID: \"34dc6165-f86b-443d-a226-1f0c774ea92c\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4cll47"
Mar 08 04:10:16.309204 master-0 kubenswrapper[18592]: I0308 04:10:16.309177 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/34dc6165-f86b-443d-a226-1f0c774ea92c-bundle\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4cll47\" (UID: \"34dc6165-f86b-443d-a226-1f0c774ea92c\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4cll47"
Mar 08 04:10:16.337116 master-0 kubenswrapper[18592]: I0308 04:10:16.337066 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmqnd\" (UniqueName: \"kubernetes.io/projected/34dc6165-f86b-443d-a226-1f0c774ea92c-kube-api-access-dmqnd\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4cll47\" (UID: \"34dc6165-f86b-443d-a226-1f0c774ea92c\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4cll47"
Mar 08 04:10:16.453287 master-0 kubenswrapper[18592]: I0308 04:10:16.453206 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4cll47"
Mar 08 04:10:16.931889 master-0 kubenswrapper[18592]: W0308 04:10:16.930818 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34dc6165_f86b_443d_a226_1f0c774ea92c.slice/crio-b03922039a19848e50c735b066c8ceb7fff6e74f3362a9a9b03be83a4a76973f WatchSource:0}: Error finding container b03922039a19848e50c735b066c8ceb7fff6e74f3362a9a9b03be83a4a76973f: Status 404 returned error can't find the container with id b03922039a19848e50c735b066c8ceb7fff6e74f3362a9a9b03be83a4a76973f
Mar 08 04:10:16.933177 master-0 kubenswrapper[18592]: I0308 04:10:16.933140 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4cll47"]
Mar 08 04:10:17.736504 master-0 kubenswrapper[18592]: I0308 04:10:17.736387 18592 generic.go:334] "Generic (PLEG): container finished" podID="34dc6165-f86b-443d-a226-1f0c774ea92c" containerID="647a7639be53180968e380e113c39d6f4549f03d426121710034c5b68e76f336" exitCode=0
Mar 08 04:10:17.737516 master-0 kubenswrapper[18592]: I0308 04:10:17.736903 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4cll47" event={"ID":"34dc6165-f86b-443d-a226-1f0c774ea92c","Type":"ContainerDied","Data":"647a7639be53180968e380e113c39d6f4549f03d426121710034c5b68e76f336"}
Mar 08 04:10:17.737516 master-0 kubenswrapper[18592]: I0308 04:10:17.737500 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4cll47" event={"ID":"34dc6165-f86b-443d-a226-1f0c774ea92c","Type":"ContainerStarted","Data":"b03922039a19848e50c735b066c8ceb7fff6e74f3362a9a9b03be83a4a76973f"}
Mar 08 04:10:19.761540 master-0 kubenswrapper[18592]: I0308 04:10:19.761444 18592 generic.go:334] "Generic (PLEG): container finished" podID="34dc6165-f86b-443d-a226-1f0c774ea92c" containerID="391c618b6c0550615ab2e3a5f05ad5854bded72bf7e23b5844fe7c47b0fbf937" exitCode=0
Mar 08 04:10:19.762439 master-0 kubenswrapper[18592]: I0308 04:10:19.761574 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4cll47" event={"ID":"34dc6165-f86b-443d-a226-1f0c774ea92c","Type":"ContainerDied","Data":"391c618b6c0550615ab2e3a5f05ad5854bded72bf7e23b5844fe7c47b0fbf937"}
Mar 08 04:10:20.772564 master-0 kubenswrapper[18592]: I0308 04:10:20.772443 18592 generic.go:334] "Generic (PLEG): container finished" podID="34dc6165-f86b-443d-a226-1f0c774ea92c" containerID="7566b0efe6b43e539a6557288730ae2a9d6946f3b220038ef0d279834f0164c6" exitCode=0
Mar 08 04:10:20.772564 master-0 kubenswrapper[18592]: I0308 04:10:20.772541 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4cll47" event={"ID":"34dc6165-f86b-443d-a226-1f0c774ea92c","Type":"ContainerDied","Data":"7566b0efe6b43e539a6557288730ae2a9d6946f3b220038ef0d279834f0164c6"}
Mar 08 04:10:22.181953 master-0 kubenswrapper[18592]: I0308 04:10:22.179760 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4cll47"
Mar 08 04:10:22.237465 master-0 kubenswrapper[18592]: I0308 04:10:22.237389 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dmqnd\" (UniqueName: \"kubernetes.io/projected/34dc6165-f86b-443d-a226-1f0c774ea92c-kube-api-access-dmqnd\") pod \"34dc6165-f86b-443d-a226-1f0c774ea92c\" (UID: \"34dc6165-f86b-443d-a226-1f0c774ea92c\") "
Mar 08 04:10:22.237465 master-0 kubenswrapper[18592]: I0308 04:10:22.237462 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/34dc6165-f86b-443d-a226-1f0c774ea92c-util\") pod \"34dc6165-f86b-443d-a226-1f0c774ea92c\" (UID: \"34dc6165-f86b-443d-a226-1f0c774ea92c\") "
Mar 08 04:10:22.238081 master-0 kubenswrapper[18592]: I0308 04:10:22.237585 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/34dc6165-f86b-443d-a226-1f0c774ea92c-bundle\") pod \"34dc6165-f86b-443d-a226-1f0c774ea92c\" (UID: \"34dc6165-f86b-443d-a226-1f0c774ea92c\") "
Mar 08 04:10:22.239383 master-0 kubenswrapper[18592]: I0308 04:10:22.239304 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34dc6165-f86b-443d-a226-1f0c774ea92c-bundle" (OuterVolumeSpecName: "bundle") pod "34dc6165-f86b-443d-a226-1f0c774ea92c" (UID: "34dc6165-f86b-443d-a226-1f0c774ea92c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 04:10:22.245176 master-0 kubenswrapper[18592]: I0308 04:10:22.245125 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34dc6165-f86b-443d-a226-1f0c774ea92c-kube-api-access-dmqnd" (OuterVolumeSpecName: "kube-api-access-dmqnd") pod "34dc6165-f86b-443d-a226-1f0c774ea92c" (UID: "34dc6165-f86b-443d-a226-1f0c774ea92c"). InnerVolumeSpecName "kube-api-access-dmqnd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 04:10:22.265392 master-0 kubenswrapper[18592]: I0308 04:10:22.265316 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34dc6165-f86b-443d-a226-1f0c774ea92c-util" (OuterVolumeSpecName: "util") pod "34dc6165-f86b-443d-a226-1f0c774ea92c" (UID: "34dc6165-f86b-443d-a226-1f0c774ea92c"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 04:10:22.341940 master-0 kubenswrapper[18592]: I0308 04:10:22.339461 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dmqnd\" (UniqueName: \"kubernetes.io/projected/34dc6165-f86b-443d-a226-1f0c774ea92c-kube-api-access-dmqnd\") on node \"master-0\" DevicePath \"\""
Mar 08 04:10:22.341940 master-0 kubenswrapper[18592]: I0308 04:10:22.339528 18592 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/34dc6165-f86b-443d-a226-1f0c774ea92c-util\") on node \"master-0\" DevicePath \"\""
Mar 08 04:10:22.341940 master-0 kubenswrapper[18592]: I0308 04:10:22.339552 18592 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/34dc6165-f86b-443d-a226-1f0c774ea92c-bundle\") on node \"master-0\" DevicePath \"\""
Mar 08 04:10:22.799914 master-0 kubenswrapper[18592]: I0308 04:10:22.797058 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4cll47" event={"ID":"34dc6165-f86b-443d-a226-1f0c774ea92c","Type":"ContainerDied","Data":"b03922039a19848e50c735b066c8ceb7fff6e74f3362a9a9b03be83a4a76973f"}
Mar 08 04:10:22.799914 master-0 kubenswrapper[18592]: I0308 04:10:22.797118 18592 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b03922039a19848e50c735b066c8ceb7fff6e74f3362a9a9b03be83a4a76973f"
Mar 08 04:10:22.799914 master-0 kubenswrapper[18592]: I0308 04:10:22.797209 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4cll47"
Mar 08 04:10:29.197116 master-0 kubenswrapper[18592]: I0308 04:10:29.197051 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-storage/lvms-operator-69d6f48865-mj97v"]
Mar 08 04:10:29.197737 master-0 kubenswrapper[18592]: E0308 04:10:29.197708 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34dc6165-f86b-443d-a226-1f0c774ea92c" containerName="extract"
Mar 08 04:10:29.197737 master-0 kubenswrapper[18592]: I0308 04:10:29.197729 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="34dc6165-f86b-443d-a226-1f0c774ea92c" containerName="extract"
Mar 08 04:10:29.197806 master-0 kubenswrapper[18592]: E0308 04:10:29.197760 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34dc6165-f86b-443d-a226-1f0c774ea92c" containerName="pull"
Mar 08 04:10:29.197806 master-0 kubenswrapper[18592]: I0308 04:10:29.197772 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="34dc6165-f86b-443d-a226-1f0c774ea92c" containerName="pull"
Mar 08 04:10:29.197806 master-0 kubenswrapper[18592]: E0308 04:10:29.197799 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34dc6165-f86b-443d-a226-1f0c774ea92c" containerName="util"
Mar 08 04:10:29.197946 master-0 kubenswrapper[18592]: I0308 04:10:29.197808 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="34dc6165-f86b-443d-a226-1f0c774ea92c" containerName="util"
Mar 08 04:10:29.198091 master-0 kubenswrapper[18592]: I0308 04:10:29.198067 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="34dc6165-f86b-443d-a226-1f0c774ea92c" containerName="extract"
Mar 08 04:10:29.198699 master-0 kubenswrapper[18592]: I0308 04:10:29.198666 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-storage/lvms-operator-69d6f48865-mj97v"
Mar 08 04:10:29.202990 master-0 kubenswrapper[18592]: I0308 04:10:29.201300 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-storage"/"lvms-operator-webhook-server-cert"
Mar 08 04:10:29.202990 master-0 kubenswrapper[18592]: I0308 04:10:29.201314 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-storage"/"openshift-service-ca.crt"
Mar 08 04:10:29.213868 master-0 kubenswrapper[18592]: I0308 04:10:29.212493 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-storage"/"kube-root-ca.crt"
Mar 08 04:10:29.213868 master-0 kubenswrapper[18592]: I0308 04:10:29.212812 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-storage"/"lvms-operator-metrics-cert"
Mar 08 04:10:29.213868 master-0 kubenswrapper[18592]: I0308 04:10:29.212990 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-storage"/"lvms-operator-service-cert"
Mar 08 04:10:29.237915 master-0 kubenswrapper[18592]: I0308 04:10:29.237392 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-storage/lvms-operator-69d6f48865-mj97v"]
Mar 08 04:10:29.266221 master-0 kubenswrapper[18592]: I0308 04:10:29.266135 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/ec85adee-abf5-4bfa-a9b9-768bf7348d40-socket-dir\") pod \"lvms-operator-69d6f48865-mj97v\" (UID: \"ec85adee-abf5-4bfa-a9b9-768bf7348d40\") " pod="openshift-storage/lvms-operator-69d6f48865-mj97v"
Mar 08 04:10:29.266221 master-0 kubenswrapper[18592]: I0308 04:10:29.266204 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ec85adee-abf5-4bfa-a9b9-768bf7348d40-apiservice-cert\") pod \"lvms-operator-69d6f48865-mj97v\" (UID: \"ec85adee-abf5-4bfa-a9b9-768bf7348d40\") " pod="openshift-storage/lvms-operator-69d6f48865-mj97v"
Mar 08 04:10:29.266584 master-0 kubenswrapper[18592]: I0308 04:10:29.266308 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9mfb\" (UniqueName: \"kubernetes.io/projected/ec85adee-abf5-4bfa-a9b9-768bf7348d40-kube-api-access-t9mfb\") pod \"lvms-operator-69d6f48865-mj97v\" (UID: \"ec85adee-abf5-4bfa-a9b9-768bf7348d40\") " pod="openshift-storage/lvms-operator-69d6f48865-mj97v"
Mar 08 04:10:29.266584 master-0 kubenswrapper[18592]: I0308 04:10:29.266477 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ec85adee-abf5-4bfa-a9b9-768bf7348d40-webhook-cert\") pod \"lvms-operator-69d6f48865-mj97v\" (UID: \"ec85adee-abf5-4bfa-a9b9-768bf7348d40\") " pod="openshift-storage/lvms-operator-69d6f48865-mj97v"
Mar 08 04:10:29.266584 master-0 kubenswrapper[18592]: I0308 04:10:29.266570 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/ec85adee-abf5-4bfa-a9b9-768bf7348d40-metrics-cert\") pod \"lvms-operator-69d6f48865-mj97v\" (UID: \"ec85adee-abf5-4bfa-a9b9-768bf7348d40\") " pod="openshift-storage/lvms-operator-69d6f48865-mj97v"
Mar 08 04:10:29.368912 master-0 kubenswrapper[18592]: I0308 04:10:29.368808 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9mfb\" (UniqueName: \"kubernetes.io/projected/ec85adee-abf5-4bfa-a9b9-768bf7348d40-kube-api-access-t9mfb\") pod \"lvms-operator-69d6f48865-mj97v\" (UID: \"ec85adee-abf5-4bfa-a9b9-768bf7348d40\") " pod="openshift-storage/lvms-operator-69d6f48865-mj97v"
Mar 08 04:10:29.369134 master-0 kubenswrapper[18592]: I0308 04:10:29.368946 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ec85adee-abf5-4bfa-a9b9-768bf7348d40-webhook-cert\") pod \"lvms-operator-69d6f48865-mj97v\" (UID: \"ec85adee-abf5-4bfa-a9b9-768bf7348d40\") " pod="openshift-storage/lvms-operator-69d6f48865-mj97v"
Mar 08 04:10:29.369134 master-0 kubenswrapper[18592]: I0308 04:10:29.368992 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/ec85adee-abf5-4bfa-a9b9-768bf7348d40-metrics-cert\") pod \"lvms-operator-69d6f48865-mj97v\" (UID: \"ec85adee-abf5-4bfa-a9b9-768bf7348d40\") " pod="openshift-storage/lvms-operator-69d6f48865-mj97v"
Mar 08 04:10:29.369134 master-0 kubenswrapper[18592]: I0308 04:10:29.369065 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/ec85adee-abf5-4bfa-a9b9-768bf7348d40-socket-dir\") pod \"lvms-operator-69d6f48865-mj97v\" (UID: \"ec85adee-abf5-4bfa-a9b9-768bf7348d40\") " pod="openshift-storage/lvms-operator-69d6f48865-mj97v"
Mar 08 04:10:29.369134 master-0 kubenswrapper[18592]: I0308 04:10:29.369109 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ec85adee-abf5-4bfa-a9b9-768bf7348d40-apiservice-cert\") pod \"lvms-operator-69d6f48865-mj97v\" (UID: \"ec85adee-abf5-4bfa-a9b9-768bf7348d40\") " pod="openshift-storage/lvms-operator-69d6f48865-mj97v"
Mar 08 04:10:29.370280 master-0 kubenswrapper[18592]: I0308 04:10:29.370217 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/ec85adee-abf5-4bfa-a9b9-768bf7348d40-socket-dir\") pod \"lvms-operator-69d6f48865-mj97v\" (UID: \"ec85adee-abf5-4bfa-a9b9-768bf7348d40\") " pod="openshift-storage/lvms-operator-69d6f48865-mj97v"
Mar 08 04:10:29.374073 master-0 kubenswrapper[18592]: I0308 04:10:29.373716 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/ec85adee-abf5-4bfa-a9b9-768bf7348d40-metrics-cert\") pod \"lvms-operator-69d6f48865-mj97v\" (UID: \"ec85adee-abf5-4bfa-a9b9-768bf7348d40\") " pod="openshift-storage/lvms-operator-69d6f48865-mj97v"
Mar 08 04:10:29.374073 master-0 kubenswrapper[18592]: I0308 04:10:29.373735 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ec85adee-abf5-4bfa-a9b9-768bf7348d40-apiservice-cert\") pod \"lvms-operator-69d6f48865-mj97v\" (UID: \"ec85adee-abf5-4bfa-a9b9-768bf7348d40\") " pod="openshift-storage/lvms-operator-69d6f48865-mj97v"
Mar 08 04:10:29.375942 master-0 kubenswrapper[18592]: I0308 04:10:29.375179 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ec85adee-abf5-4bfa-a9b9-768bf7348d40-webhook-cert\") pod \"lvms-operator-69d6f48865-mj97v\" (UID: \"ec85adee-abf5-4bfa-a9b9-768bf7348d40\") " pod="openshift-storage/lvms-operator-69d6f48865-mj97v"
Mar 08 04:10:29.397565 master-0 kubenswrapper[18592]: I0308 04:10:29.397489 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9mfb\" (UniqueName: \"kubernetes.io/projected/ec85adee-abf5-4bfa-a9b9-768bf7348d40-kube-api-access-t9mfb\") pod \"lvms-operator-69d6f48865-mj97v\" (UID: \"ec85adee-abf5-4bfa-a9b9-768bf7348d40\") " pod="openshift-storage/lvms-operator-69d6f48865-mj97v"
Mar 08 04:10:29.517529 master-0 kubenswrapper[18592]: I0308 04:10:29.517362 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-storage/lvms-operator-69d6f48865-mj97v"
Mar 08 04:10:30.017646 master-0 kubenswrapper[18592]: I0308 04:10:30.017282 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-storage/lvms-operator-69d6f48865-mj97v"]
Mar 08 04:10:30.026481 master-0 kubenswrapper[18592]: W0308 04:10:30.026418 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec85adee_abf5_4bfa_a9b9_768bf7348d40.slice/crio-a5796f0a130c3a404bc7f6d0d3d2dca6ea3228bf73199ad31d4c87c355c3afd9 WatchSource:0}: Error finding container a5796f0a130c3a404bc7f6d0d3d2dca6ea3228bf73199ad31d4c87c355c3afd9: Status 404 returned error can't find the container with id a5796f0a130c3a404bc7f6d0d3d2dca6ea3228bf73199ad31d4c87c355c3afd9
Mar 08 04:10:30.859702 master-0 kubenswrapper[18592]: I0308 04:10:30.859646 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/lvms-operator-69d6f48865-mj97v" event={"ID":"ec85adee-abf5-4bfa-a9b9-768bf7348d40","Type":"ContainerStarted","Data":"a5796f0a130c3a404bc7f6d0d3d2dca6ea3228bf73199ad31d4c87c355c3afd9"}
Mar 08 04:10:35.904420 master-0 kubenswrapper[18592]: I0308 04:10:35.904254 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/lvms-operator-69d6f48865-mj97v" event={"ID":"ec85adee-abf5-4bfa-a9b9-768bf7348d40","Type":"ContainerStarted","Data":"0bbdc5e17f79673fa1a1fa2b30cfdb55db1b3162f9186a7b0ff68daf0a06b992"}
Mar 08 04:10:35.904420 master-0 kubenswrapper[18592]: I0308 04:10:35.904379 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-storage/lvms-operator-69d6f48865-mj97v"
Mar 08 04:10:35.939000 master-0 kubenswrapper[18592]: I0308 04:10:35.938852 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration"
pod="openshift-storage/lvms-operator-69d6f48865-mj97v" podStartSLOduration=1.507249104 podStartE2EDuration="6.938794287s" podCreationTimestamp="2026-03-08 04:10:29 +0000 UTC" firstStartedPulling="2026-03-08 04:10:30.03319058 +0000 UTC m=+1042.131944930" lastFinishedPulling="2026-03-08 04:10:35.464735743 +0000 UTC m=+1047.563490113" observedRunningTime="2026-03-08 04:10:35.930977839 +0000 UTC m=+1048.029732279" watchObservedRunningTime="2026-03-08 04:10:35.938794287 +0000 UTC m=+1048.037548677" Mar 08 04:10:36.919588 master-0 kubenswrapper[18592]: I0308 04:10:36.919532 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-storage/lvms-operator-69d6f48865-mj97v" Mar 08 04:10:41.111946 master-0 kubenswrapper[18592]: I0308 04:10:41.111812 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e556w8v"] Mar 08 04:10:41.115287 master-0 kubenswrapper[18592]: I0308 04:10:41.114151 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e556w8v" Mar 08 04:10:41.132986 master-0 kubenswrapper[18592]: I0308 04:10:41.132917 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e556w8v"] Mar 08 04:10:41.171551 master-0 kubenswrapper[18592]: I0308 04:10:41.171487 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0320b24e-1979-4094-83ee-1b0bed497c48-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e556w8v\" (UID: \"0320b24e-1979-4094-83ee-1b0bed497c48\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e556w8v" Mar 08 04:10:41.171753 master-0 kubenswrapper[18592]: I0308 04:10:41.171665 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0320b24e-1979-4094-83ee-1b0bed497c48-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e556w8v\" (UID: \"0320b24e-1979-4094-83ee-1b0bed497c48\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e556w8v" Mar 08 04:10:41.171843 master-0 kubenswrapper[18592]: I0308 04:10:41.171788 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8zc4\" (UniqueName: \"kubernetes.io/projected/0320b24e-1979-4094-83ee-1b0bed497c48-kube-api-access-j8zc4\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e556w8v\" (UID: \"0320b24e-1979-4094-83ee-1b0bed497c48\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e556w8v" Mar 08 04:10:41.273682 master-0 kubenswrapper[18592]: I0308 04:10:41.273576 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" 
(UniqueName: \"kubernetes.io/empty-dir/0320b24e-1979-4094-83ee-1b0bed497c48-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e556w8v\" (UID: \"0320b24e-1979-4094-83ee-1b0bed497c48\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e556w8v" Mar 08 04:10:41.273971 master-0 kubenswrapper[18592]: I0308 04:10:41.273721 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8zc4\" (UniqueName: \"kubernetes.io/projected/0320b24e-1979-4094-83ee-1b0bed497c48-kube-api-access-j8zc4\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e556w8v\" (UID: \"0320b24e-1979-4094-83ee-1b0bed497c48\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e556w8v" Mar 08 04:10:41.273971 master-0 kubenswrapper[18592]: I0308 04:10:41.273875 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0320b24e-1979-4094-83ee-1b0bed497c48-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e556w8v\" (UID: \"0320b24e-1979-4094-83ee-1b0bed497c48\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e556w8v" Mar 08 04:10:41.274814 master-0 kubenswrapper[18592]: I0308 04:10:41.274751 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0320b24e-1979-4094-83ee-1b0bed497c48-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e556w8v\" (UID: \"0320b24e-1979-4094-83ee-1b0bed497c48\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e556w8v" Mar 08 04:10:41.275328 master-0 kubenswrapper[18592]: I0308 04:10:41.275286 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0320b24e-1979-4094-83ee-1b0bed497c48-util\") pod 
\"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e556w8v\" (UID: \"0320b24e-1979-4094-83ee-1b0bed497c48\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e556w8v" Mar 08 04:10:41.296726 master-0 kubenswrapper[18592]: I0308 04:10:41.296618 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8zc4\" (UniqueName: \"kubernetes.io/projected/0320b24e-1979-4094-83ee-1b0bed497c48-kube-api-access-j8zc4\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e556w8v\" (UID: \"0320b24e-1979-4094-83ee-1b0bed497c48\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e556w8v" Mar 08 04:10:41.442746 master-0 kubenswrapper[18592]: I0308 04:10:41.442690 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e556w8v" Mar 08 04:10:41.885262 master-0 kubenswrapper[18592]: I0308 04:10:41.881852 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e556w8v"] Mar 08 04:10:41.891051 master-0 kubenswrapper[18592]: W0308 04:10:41.890990 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0320b24e_1979_4094_83ee_1b0bed497c48.slice/crio-924edb72a8e175c2ea299af0b6a234b1c4bb545fae4fb4b320e522e1f9444f57 WatchSource:0}: Error finding container 924edb72a8e175c2ea299af0b6a234b1c4bb545fae4fb4b320e522e1f9444f57: Status 404 returned error can't find the container with id 924edb72a8e175c2ea299af0b6a234b1c4bb545fae4fb4b320e522e1f9444f57 Mar 08 04:10:41.967858 master-0 kubenswrapper[18592]: I0308 04:10:41.963859 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e556w8v" 
event={"ID":"0320b24e-1979-4094-83ee-1b0bed497c48","Type":"ContainerStarted","Data":"924edb72a8e175c2ea299af0b6a234b1c4bb545fae4fb4b320e522e1f9444f57"} Mar 08 04:10:42.973932 master-0 kubenswrapper[18592]: I0308 04:10:42.973842 18592 generic.go:334] "Generic (PLEG): container finished" podID="0320b24e-1979-4094-83ee-1b0bed497c48" containerID="418858ed7df9f1768e3dc873303be355f67dac9c8cdbce976d3180eb11bc2e07" exitCode=0 Mar 08 04:10:42.973932 master-0 kubenswrapper[18592]: I0308 04:10:42.973895 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e556w8v" event={"ID":"0320b24e-1979-4094-83ee-1b0bed497c48","Type":"ContainerDied","Data":"418858ed7df9f1768e3dc873303be355f67dac9c8cdbce976d3180eb11bc2e07"} Mar 08 04:10:43.078387 master-0 kubenswrapper[18592]: I0308 04:10:43.076120 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47mjcs"] Mar 08 04:10:43.078387 master-0 kubenswrapper[18592]: I0308 04:10:43.077657 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47mjcs" Mar 08 04:10:43.104969 master-0 kubenswrapper[18592]: I0308 04:10:43.104897 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47mjcs"] Mar 08 04:10:43.214394 master-0 kubenswrapper[18592]: I0308 04:10:43.214302 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/10696d0e-9804-4b45-bfb3-b3a126b9c731-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47mjcs\" (UID: \"10696d0e-9804-4b45-bfb3-b3a126b9c731\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47mjcs" Mar 08 04:10:43.214630 master-0 kubenswrapper[18592]: I0308 04:10:43.214585 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/10696d0e-9804-4b45-bfb3-b3a126b9c731-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47mjcs\" (UID: \"10696d0e-9804-4b45-bfb3-b3a126b9c731\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47mjcs" Mar 08 04:10:43.214797 master-0 kubenswrapper[18592]: I0308 04:10:43.214755 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2wd2\" (UniqueName: \"kubernetes.io/projected/10696d0e-9804-4b45-bfb3-b3a126b9c731-kube-api-access-p2wd2\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47mjcs\" (UID: \"10696d0e-9804-4b45-bfb3-b3a126b9c731\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47mjcs" Mar 08 04:10:43.316653 master-0 kubenswrapper[18592]: I0308 04:10:43.316512 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" 
(UniqueName: \"kubernetes.io/empty-dir/10696d0e-9804-4b45-bfb3-b3a126b9c731-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47mjcs\" (UID: \"10696d0e-9804-4b45-bfb3-b3a126b9c731\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47mjcs" Mar 08 04:10:43.316653 master-0 kubenswrapper[18592]: I0308 04:10:43.316634 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2wd2\" (UniqueName: \"kubernetes.io/projected/10696d0e-9804-4b45-bfb3-b3a126b9c731-kube-api-access-p2wd2\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47mjcs\" (UID: \"10696d0e-9804-4b45-bfb3-b3a126b9c731\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47mjcs" Mar 08 04:10:43.316921 master-0 kubenswrapper[18592]: I0308 04:10:43.316703 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/10696d0e-9804-4b45-bfb3-b3a126b9c731-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47mjcs\" (UID: \"10696d0e-9804-4b45-bfb3-b3a126b9c731\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47mjcs" Mar 08 04:10:43.317786 master-0 kubenswrapper[18592]: I0308 04:10:43.317733 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/10696d0e-9804-4b45-bfb3-b3a126b9c731-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47mjcs\" (UID: \"10696d0e-9804-4b45-bfb3-b3a126b9c731\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47mjcs" Mar 08 04:10:43.319260 master-0 kubenswrapper[18592]: I0308 04:10:43.319225 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/10696d0e-9804-4b45-bfb3-b3a126b9c731-util\") pod 
\"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47mjcs\" (UID: \"10696d0e-9804-4b45-bfb3-b3a126b9c731\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47mjcs" Mar 08 04:10:43.345132 master-0 kubenswrapper[18592]: I0308 04:10:43.345071 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2wd2\" (UniqueName: \"kubernetes.io/projected/10696d0e-9804-4b45-bfb3-b3a126b9c731-kube-api-access-p2wd2\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47mjcs\" (UID: \"10696d0e-9804-4b45-bfb3-b3a126b9c731\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47mjcs" Mar 08 04:10:43.391881 master-0 kubenswrapper[18592]: I0308 04:10:43.391814 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47mjcs" Mar 08 04:10:43.666127 master-0 kubenswrapper[18592]: I0308 04:10:43.666078 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82286h4"] Mar 08 04:10:43.667461 master-0 kubenswrapper[18592]: I0308 04:10:43.667429 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82286h4" Mar 08 04:10:43.683402 master-0 kubenswrapper[18592]: I0308 04:10:43.683000 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82286h4"] Mar 08 04:10:43.726252 master-0 kubenswrapper[18592]: I0308 04:10:43.726183 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/364e3e24-88b1-4936-a970-251d27c101f9-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82286h4\" (UID: \"364e3e24-88b1-4936-a970-251d27c101f9\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82286h4" Mar 08 04:10:43.726462 master-0 kubenswrapper[18592]: I0308 04:10:43.726309 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/364e3e24-88b1-4936-a970-251d27c101f9-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82286h4\" (UID: \"364e3e24-88b1-4936-a970-251d27c101f9\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82286h4" Mar 08 04:10:43.726462 master-0 kubenswrapper[18592]: I0308 04:10:43.726412 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pt45\" (UniqueName: \"kubernetes.io/projected/364e3e24-88b1-4936-a970-251d27c101f9-kube-api-access-2pt45\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82286h4\" (UID: \"364e3e24-88b1-4936-a970-251d27c101f9\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82286h4" Mar 08 04:10:43.827712 master-0 kubenswrapper[18592]: I0308 04:10:43.827641 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"bundle\" (UniqueName: \"kubernetes.io/empty-dir/364e3e24-88b1-4936-a970-251d27c101f9-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82286h4\" (UID: \"364e3e24-88b1-4936-a970-251d27c101f9\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82286h4" Mar 08 04:10:43.827950 master-0 kubenswrapper[18592]: I0308 04:10:43.827738 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pt45\" (UniqueName: \"kubernetes.io/projected/364e3e24-88b1-4936-a970-251d27c101f9-kube-api-access-2pt45\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82286h4\" (UID: \"364e3e24-88b1-4936-a970-251d27c101f9\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82286h4" Mar 08 04:10:43.827950 master-0 kubenswrapper[18592]: I0308 04:10:43.827797 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/364e3e24-88b1-4936-a970-251d27c101f9-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82286h4\" (UID: \"364e3e24-88b1-4936-a970-251d27c101f9\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82286h4" Mar 08 04:10:43.829156 master-0 kubenswrapper[18592]: I0308 04:10:43.828207 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/364e3e24-88b1-4936-a970-251d27c101f9-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82286h4\" (UID: \"364e3e24-88b1-4936-a970-251d27c101f9\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82286h4" Mar 08 04:10:43.829156 master-0 kubenswrapper[18592]: I0308 04:10:43.828287 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/364e3e24-88b1-4936-a970-251d27c101f9-util\") pod 
\"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82286h4\" (UID: \"364e3e24-88b1-4936-a970-251d27c101f9\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82286h4" Mar 08 04:10:43.844844 master-0 kubenswrapper[18592]: I0308 04:10:43.844781 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pt45\" (UniqueName: \"kubernetes.io/projected/364e3e24-88b1-4936-a970-251d27c101f9-kube-api-access-2pt45\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82286h4\" (UID: \"364e3e24-88b1-4936-a970-251d27c101f9\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82286h4" Mar 08 04:10:43.903186 master-0 kubenswrapper[18592]: I0308 04:10:43.903131 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47mjcs"] Mar 08 04:10:43.912019 master-0 kubenswrapper[18592]: W0308 04:10:43.911966 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10696d0e_9804_4b45_bfb3_b3a126b9c731.slice/crio-1145dc252b6d8d5cc06357b0dc7e2a3a95643d0989575ed56e111fab416830f9 WatchSource:0}: Error finding container 1145dc252b6d8d5cc06357b0dc7e2a3a95643d0989575ed56e111fab416830f9: Status 404 returned error can't find the container with id 1145dc252b6d8d5cc06357b0dc7e2a3a95643d0989575ed56e111fab416830f9 Mar 08 04:10:43.984545 master-0 kubenswrapper[18592]: I0308 04:10:43.984490 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47mjcs" event={"ID":"10696d0e-9804-4b45-bfb3-b3a126b9c731","Type":"ContainerStarted","Data":"1145dc252b6d8d5cc06357b0dc7e2a3a95643d0989575ed56e111fab416830f9"} Mar 08 04:10:44.000221 master-0 kubenswrapper[18592]: I0308 04:10:43.992862 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82286h4" Mar 08 04:10:44.488195 master-0 kubenswrapper[18592]: I0308 04:10:44.488089 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82286h4"] Mar 08 04:10:44.493784 master-0 kubenswrapper[18592]: W0308 04:10:44.493719 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod364e3e24_88b1_4936_a970_251d27c101f9.slice/crio-01aa94e33dd33fae64119d55ab651f97c65897c1a3af823e62e5245900e05554 WatchSource:0}: Error finding container 01aa94e33dd33fae64119d55ab651f97c65897c1a3af823e62e5245900e05554: Status 404 returned error can't find the container with id 01aa94e33dd33fae64119d55ab651f97c65897c1a3af823e62e5245900e05554 Mar 08 04:10:44.999997 master-0 kubenswrapper[18592]: I0308 04:10:44.999868 18592 generic.go:334] "Generic (PLEG): container finished" podID="364e3e24-88b1-4936-a970-251d27c101f9" containerID="6152ae47d94836b1cb821912ebbf8f26f92a763b85c3595bd567bf62904cf1e6" exitCode=0 Mar 08 04:10:44.999997 master-0 kubenswrapper[18592]: I0308 04:10:44.999962 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82286h4" event={"ID":"364e3e24-88b1-4936-a970-251d27c101f9","Type":"ContainerDied","Data":"6152ae47d94836b1cb821912ebbf8f26f92a763b85c3595bd567bf62904cf1e6"} Mar 08 04:10:45.001092 master-0 kubenswrapper[18592]: I0308 04:10:45.000115 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82286h4" event={"ID":"364e3e24-88b1-4936-a970-251d27c101f9","Type":"ContainerStarted","Data":"01aa94e33dd33fae64119d55ab651f97c65897c1a3af823e62e5245900e05554"} Mar 08 04:10:45.005273 master-0 kubenswrapper[18592]: I0308 04:10:45.005225 18592 
generic.go:334] "Generic (PLEG): container finished" podID="10696d0e-9804-4b45-bfb3-b3a126b9c731" containerID="cf397cc9f192d98c86448abd3522c54ee0ea45c24b3c2f18ee20490f962a0ebf" exitCode=0 Mar 08 04:10:45.005513 master-0 kubenswrapper[18592]: I0308 04:10:45.005314 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47mjcs" event={"ID":"10696d0e-9804-4b45-bfb3-b3a126b9c731","Type":"ContainerDied","Data":"cf397cc9f192d98c86448abd3522c54ee0ea45c24b3c2f18ee20490f962a0ebf"} Mar 08 04:10:46.899885 master-0 kubenswrapper[18592]: I0308 04:10:46.899703 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08x6x29"] Mar 08 04:10:46.902883 master-0 kubenswrapper[18592]: I0308 04:10:46.902772 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08x6x29" Mar 08 04:10:46.923118 master-0 kubenswrapper[18592]: I0308 04:10:46.923057 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08x6x29"] Mar 08 04:10:46.981483 master-0 kubenswrapper[18592]: I0308 04:10:46.981373 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3350939a-447b-497e-9ed7-2712f1b45784-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08x6x29\" (UID: \"3350939a-447b-497e-9ed7-2712f1b45784\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08x6x29" Mar 08 04:10:46.981705 master-0 kubenswrapper[18592]: I0308 04:10:46.981636 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9bbl\" (UniqueName: 
\"kubernetes.io/projected/3350939a-447b-497e-9ed7-2712f1b45784-kube-api-access-j9bbl\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08x6x29\" (UID: \"3350939a-447b-497e-9ed7-2712f1b45784\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08x6x29" Mar 08 04:10:46.981770 master-0 kubenswrapper[18592]: I0308 04:10:46.981746 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3350939a-447b-497e-9ed7-2712f1b45784-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08x6x29\" (UID: \"3350939a-447b-497e-9ed7-2712f1b45784\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08x6x29" Mar 08 04:10:47.021575 master-0 kubenswrapper[18592]: I0308 04:10:47.021512 18592 generic.go:334] "Generic (PLEG): container finished" podID="0320b24e-1979-4094-83ee-1b0bed497c48" containerID="646022cff220fd8e16a4efbb13dd40952cf47ca58b37e9ade5d381583661fd08" exitCode=0 Mar 08 04:10:47.021754 master-0 kubenswrapper[18592]: I0308 04:10:47.021571 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e556w8v" event={"ID":"0320b24e-1979-4094-83ee-1b0bed497c48","Type":"ContainerDied","Data":"646022cff220fd8e16a4efbb13dd40952cf47ca58b37e9ade5d381583661fd08"} Mar 08 04:10:47.082923 master-0 kubenswrapper[18592]: I0308 04:10:47.082847 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3350939a-447b-497e-9ed7-2712f1b45784-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08x6x29\" (UID: \"3350939a-447b-497e-9ed7-2712f1b45784\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08x6x29" Mar 08 04:10:47.083147 master-0 kubenswrapper[18592]: I0308 04:10:47.083027 18592 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3350939a-447b-497e-9ed7-2712f1b45784-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08x6x29\" (UID: \"3350939a-447b-497e-9ed7-2712f1b45784\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08x6x29" Mar 08 04:10:47.083147 master-0 kubenswrapper[18592]: I0308 04:10:47.083091 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9bbl\" (UniqueName: \"kubernetes.io/projected/3350939a-447b-497e-9ed7-2712f1b45784-kube-api-access-j9bbl\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08x6x29\" (UID: \"3350939a-447b-497e-9ed7-2712f1b45784\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08x6x29" Mar 08 04:10:47.083653 master-0 kubenswrapper[18592]: I0308 04:10:47.083366 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3350939a-447b-497e-9ed7-2712f1b45784-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08x6x29\" (UID: \"3350939a-447b-497e-9ed7-2712f1b45784\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08x6x29" Mar 08 04:10:47.083653 master-0 kubenswrapper[18592]: I0308 04:10:47.083445 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3350939a-447b-497e-9ed7-2712f1b45784-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08x6x29\" (UID: \"3350939a-447b-497e-9ed7-2712f1b45784\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08x6x29" Mar 08 04:10:47.104530 master-0 kubenswrapper[18592]: I0308 04:10:47.104492 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9bbl\" (UniqueName: 
\"kubernetes.io/projected/3350939a-447b-497e-9ed7-2712f1b45784-kube-api-access-j9bbl\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08x6x29\" (UID: \"3350939a-447b-497e-9ed7-2712f1b45784\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08x6x29"
Mar 08 04:10:47.229568 master-0 kubenswrapper[18592]: I0308 04:10:47.229502 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08x6x29"
Mar 08 04:10:47.654507 master-0 kubenswrapper[18592]: I0308 04:10:47.654449 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08x6x29"]
Mar 08 04:10:47.668767 master-0 kubenswrapper[18592]: W0308 04:10:47.668727 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3350939a_447b_497e_9ed7_2712f1b45784.slice/crio-848827a10c25732daf0bada9f3e87687252923309373f23ea23eaeeb673a089b WatchSource:0}: Error finding container 848827a10c25732daf0bada9f3e87687252923309373f23ea23eaeeb673a089b: Status 404 returned error can't find the container with id 848827a10c25732daf0bada9f3e87687252923309373f23ea23eaeeb673a089b
Mar 08 04:10:48.040582 master-0 kubenswrapper[18592]: I0308 04:10:48.040468 18592 generic.go:334] "Generic (PLEG): container finished" podID="10696d0e-9804-4b45-bfb3-b3a126b9c731" containerID="3d7b73828213e81c8d19eed330ea9eed5f300d59e2556db872db11553b542665" exitCode=0
Mar 08 04:10:48.040582 master-0 kubenswrapper[18592]: I0308 04:10:48.040578 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47mjcs" event={"ID":"10696d0e-9804-4b45-bfb3-b3a126b9c731","Type":"ContainerDied","Data":"3d7b73828213e81c8d19eed330ea9eed5f300d59e2556db872db11553b542665"}
Mar 08 04:10:48.046669 master-0 kubenswrapper[18592]: I0308 04:10:48.046601 18592 generic.go:334] "Generic (PLEG): container finished" podID="0320b24e-1979-4094-83ee-1b0bed497c48" containerID="b6e3bc8110a888deca5be7ad730c1d9695fca73ee2eb143e759eef507b156d34" exitCode=0
Mar 08 04:10:48.046900 master-0 kubenswrapper[18592]: I0308 04:10:48.046736 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e556w8v" event={"ID":"0320b24e-1979-4094-83ee-1b0bed497c48","Type":"ContainerDied","Data":"b6e3bc8110a888deca5be7ad730c1d9695fca73ee2eb143e759eef507b156d34"}
Mar 08 04:10:48.052793 master-0 kubenswrapper[18592]: I0308 04:10:48.052707 18592 generic.go:334] "Generic (PLEG): container finished" podID="364e3e24-88b1-4936-a970-251d27c101f9" containerID="f21596af2da3d221c127318288b86a1ed845851a449052f2e3db54d1063a432d" exitCode=0
Mar 08 04:10:48.053023 master-0 kubenswrapper[18592]: I0308 04:10:48.052790 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82286h4" event={"ID":"364e3e24-88b1-4936-a970-251d27c101f9","Type":"ContainerDied","Data":"f21596af2da3d221c127318288b86a1ed845851a449052f2e3db54d1063a432d"}
Mar 08 04:10:48.057157 master-0 kubenswrapper[18592]: I0308 04:10:48.056820 18592 generic.go:334] "Generic (PLEG): container finished" podID="3350939a-447b-497e-9ed7-2712f1b45784" containerID="1508715503f3898c72110084e610b46fedc01a6c8b78b666f61c05cbcffd3810" exitCode=0
Mar 08 04:10:48.057157 master-0 kubenswrapper[18592]: I0308 04:10:48.056894 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08x6x29" event={"ID":"3350939a-447b-497e-9ed7-2712f1b45784","Type":"ContainerDied","Data":"1508715503f3898c72110084e610b46fedc01a6c8b78b666f61c05cbcffd3810"}
Mar 08 04:10:48.057157 master-0 kubenswrapper[18592]: I0308 04:10:48.056929 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08x6x29" event={"ID":"3350939a-447b-497e-9ed7-2712f1b45784","Type":"ContainerStarted","Data":"848827a10c25732daf0bada9f3e87687252923309373f23ea23eaeeb673a089b"}
Mar 08 04:10:49.071654 master-0 kubenswrapper[18592]: I0308 04:10:49.071544 18592 generic.go:334] "Generic (PLEG): container finished" podID="364e3e24-88b1-4936-a970-251d27c101f9" containerID="6d55328288fa40e42882ad5f78c672780bb5ce9c49c8e3e5b41a428c884165c1" exitCode=0
Mar 08 04:10:49.072595 master-0 kubenswrapper[18592]: I0308 04:10:49.071669 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82286h4" event={"ID":"364e3e24-88b1-4936-a970-251d27c101f9","Type":"ContainerDied","Data":"6d55328288fa40e42882ad5f78c672780bb5ce9c49c8e3e5b41a428c884165c1"}
Mar 08 04:10:49.078386 master-0 kubenswrapper[18592]: I0308 04:10:49.078304 18592 generic.go:334] "Generic (PLEG): container finished" podID="10696d0e-9804-4b45-bfb3-b3a126b9c731" containerID="9e352a4ba70fe96cef1d8cb0373db82a3eec85e275582c36a684688d5cbff587" exitCode=0
Mar 08 04:10:49.078537 master-0 kubenswrapper[18592]: I0308 04:10:49.078364 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47mjcs" event={"ID":"10696d0e-9804-4b45-bfb3-b3a126b9c731","Type":"ContainerDied","Data":"9e352a4ba70fe96cef1d8cb0373db82a3eec85e275582c36a684688d5cbff587"}
Mar 08 04:10:49.771722 master-0 kubenswrapper[18592]: I0308 04:10:49.771629 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e556w8v"
Mar 08 04:10:49.827521 master-0 kubenswrapper[18592]: I0308 04:10:49.827442 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0320b24e-1979-4094-83ee-1b0bed497c48-bundle\") pod \"0320b24e-1979-4094-83ee-1b0bed497c48\" (UID: \"0320b24e-1979-4094-83ee-1b0bed497c48\") "
Mar 08 04:10:49.827748 master-0 kubenswrapper[18592]: I0308 04:10:49.827535 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8zc4\" (UniqueName: \"kubernetes.io/projected/0320b24e-1979-4094-83ee-1b0bed497c48-kube-api-access-j8zc4\") pod \"0320b24e-1979-4094-83ee-1b0bed497c48\" (UID: \"0320b24e-1979-4094-83ee-1b0bed497c48\") "
Mar 08 04:10:49.827748 master-0 kubenswrapper[18592]: I0308 04:10:49.827584 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0320b24e-1979-4094-83ee-1b0bed497c48-util\") pod \"0320b24e-1979-4094-83ee-1b0bed497c48\" (UID: \"0320b24e-1979-4094-83ee-1b0bed497c48\") "
Mar 08 04:10:49.829087 master-0 kubenswrapper[18592]: I0308 04:10:49.828967 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0320b24e-1979-4094-83ee-1b0bed497c48-bundle" (OuterVolumeSpecName: "bundle") pod "0320b24e-1979-4094-83ee-1b0bed497c48" (UID: "0320b24e-1979-4094-83ee-1b0bed497c48"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 04:10:49.831343 master-0 kubenswrapper[18592]: I0308 04:10:49.831271 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0320b24e-1979-4094-83ee-1b0bed497c48-kube-api-access-j8zc4" (OuterVolumeSpecName: "kube-api-access-j8zc4") pod "0320b24e-1979-4094-83ee-1b0bed497c48" (UID: "0320b24e-1979-4094-83ee-1b0bed497c48"). InnerVolumeSpecName "kube-api-access-j8zc4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 04:10:49.857194 master-0 kubenswrapper[18592]: I0308 04:10:49.857111 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0320b24e-1979-4094-83ee-1b0bed497c48-util" (OuterVolumeSpecName: "util") pod "0320b24e-1979-4094-83ee-1b0bed497c48" (UID: "0320b24e-1979-4094-83ee-1b0bed497c48"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 04:10:49.930067 master-0 kubenswrapper[18592]: I0308 04:10:49.930010 18592 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0320b24e-1979-4094-83ee-1b0bed497c48-bundle\") on node \"master-0\" DevicePath \"\""
Mar 08 04:10:49.930067 master-0 kubenswrapper[18592]: I0308 04:10:49.930045 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j8zc4\" (UniqueName: \"kubernetes.io/projected/0320b24e-1979-4094-83ee-1b0bed497c48-kube-api-access-j8zc4\") on node \"master-0\" DevicePath \"\""
Mar 08 04:10:49.930067 master-0 kubenswrapper[18592]: I0308 04:10:49.930057 18592 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0320b24e-1979-4094-83ee-1b0bed497c48-util\") on node \"master-0\" DevicePath \"\""
Mar 08 04:10:50.090429 master-0 kubenswrapper[18592]: I0308 04:10:50.090371 18592 generic.go:334] "Generic (PLEG): container finished" podID="3350939a-447b-497e-9ed7-2712f1b45784" containerID="46be177aaca06158acdf9a0c3264c9e29a5e9e1f77ce78e369f3ef75a158a10a" exitCode=0
Mar 08 04:10:50.090991 master-0 kubenswrapper[18592]: I0308 04:10:50.090504 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08x6x29" event={"ID":"3350939a-447b-497e-9ed7-2712f1b45784","Type":"ContainerDied","Data":"46be177aaca06158acdf9a0c3264c9e29a5e9e1f77ce78e369f3ef75a158a10a"}
Mar 08 04:10:50.094220 master-0 kubenswrapper[18592]: I0308 04:10:50.094034 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e556w8v"
Mar 08 04:10:50.095976 master-0 kubenswrapper[18592]: I0308 04:10:50.095948 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e556w8v" event={"ID":"0320b24e-1979-4094-83ee-1b0bed497c48","Type":"ContainerDied","Data":"924edb72a8e175c2ea299af0b6a234b1c4bb545fae4fb4b320e522e1f9444f57"}
Mar 08 04:10:50.096099 master-0 kubenswrapper[18592]: I0308 04:10:50.096080 18592 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="924edb72a8e175c2ea299af0b6a234b1c4bb545fae4fb4b320e522e1f9444f57"
Mar 08 04:10:50.513012 master-0 kubenswrapper[18592]: I0308 04:10:50.512913 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47mjcs"
Mar 08 04:10:50.560600 master-0 kubenswrapper[18592]: I0308 04:10:50.560552 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/10696d0e-9804-4b45-bfb3-b3a126b9c731-util\") pod \"10696d0e-9804-4b45-bfb3-b3a126b9c731\" (UID: \"10696d0e-9804-4b45-bfb3-b3a126b9c731\") "
Mar 08 04:10:50.560694 master-0 kubenswrapper[18592]: I0308 04:10:50.560667 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2wd2\" (UniqueName: \"kubernetes.io/projected/10696d0e-9804-4b45-bfb3-b3a126b9c731-kube-api-access-p2wd2\") pod \"10696d0e-9804-4b45-bfb3-b3a126b9c731\" (UID: \"10696d0e-9804-4b45-bfb3-b3a126b9c731\") "
Mar 08 04:10:50.561121 master-0 kubenswrapper[18592]: I0308 04:10:50.561092 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/10696d0e-9804-4b45-bfb3-b3a126b9c731-bundle\") pod \"10696d0e-9804-4b45-bfb3-b3a126b9c731\" (UID: \"10696d0e-9804-4b45-bfb3-b3a126b9c731\") "
Mar 08 04:10:50.562012 master-0 kubenswrapper[18592]: I0308 04:10:50.561964 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10696d0e-9804-4b45-bfb3-b3a126b9c731-bundle" (OuterVolumeSpecName: "bundle") pod "10696d0e-9804-4b45-bfb3-b3a126b9c731" (UID: "10696d0e-9804-4b45-bfb3-b3a126b9c731"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 04:10:50.563517 master-0 kubenswrapper[18592]: I0308 04:10:50.563487 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10696d0e-9804-4b45-bfb3-b3a126b9c731-kube-api-access-p2wd2" (OuterVolumeSpecName: "kube-api-access-p2wd2") pod "10696d0e-9804-4b45-bfb3-b3a126b9c731" (UID: "10696d0e-9804-4b45-bfb3-b3a126b9c731"). InnerVolumeSpecName "kube-api-access-p2wd2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 04:10:50.570290 master-0 kubenswrapper[18592]: I0308 04:10:50.570238 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10696d0e-9804-4b45-bfb3-b3a126b9c731-util" (OuterVolumeSpecName: "util") pod "10696d0e-9804-4b45-bfb3-b3a126b9c731" (UID: "10696d0e-9804-4b45-bfb3-b3a126b9c731"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 04:10:50.623219 master-0 kubenswrapper[18592]: I0308 04:10:50.623115 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82286h4"
Mar 08 04:10:50.662945 master-0 kubenswrapper[18592]: I0308 04:10:50.662815 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/364e3e24-88b1-4936-a970-251d27c101f9-bundle\") pod \"364e3e24-88b1-4936-a970-251d27c101f9\" (UID: \"364e3e24-88b1-4936-a970-251d27c101f9\") "
Mar 08 04:10:50.663381 master-0 kubenswrapper[18592]: I0308 04:10:50.663353 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/364e3e24-88b1-4936-a970-251d27c101f9-util\") pod \"364e3e24-88b1-4936-a970-251d27c101f9\" (UID: \"364e3e24-88b1-4936-a970-251d27c101f9\") "
Mar 08 04:10:50.663641 master-0 kubenswrapper[18592]: I0308 04:10:50.663610 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pt45\" (UniqueName: \"kubernetes.io/projected/364e3e24-88b1-4936-a970-251d27c101f9-kube-api-access-2pt45\") pod \"364e3e24-88b1-4936-a970-251d27c101f9\" (UID: \"364e3e24-88b1-4936-a970-251d27c101f9\") "
Mar 08 04:10:50.664388 master-0 kubenswrapper[18592]: I0308 04:10:50.664360 18592 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/10696d0e-9804-4b45-bfb3-b3a126b9c731-util\") on node \"master-0\" DevicePath \"\""
Mar 08 04:10:50.664555 master-0 kubenswrapper[18592]: I0308 04:10:50.664530 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2wd2\" (UniqueName: \"kubernetes.io/projected/10696d0e-9804-4b45-bfb3-b3a126b9c731-kube-api-access-p2wd2\") on node \"master-0\" DevicePath \"\""
Mar 08 04:10:50.664692 master-0 kubenswrapper[18592]: I0308 04:10:50.664670 18592 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/10696d0e-9804-4b45-bfb3-b3a126b9c731-bundle\") on node \"master-0\" DevicePath \"\""
Mar 08 04:10:50.664809 master-0 kubenswrapper[18592]: I0308 04:10:50.663710 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/364e3e24-88b1-4936-a970-251d27c101f9-bundle" (OuterVolumeSpecName: "bundle") pod "364e3e24-88b1-4936-a970-251d27c101f9" (UID: "364e3e24-88b1-4936-a970-251d27c101f9"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 04:10:50.669578 master-0 kubenswrapper[18592]: I0308 04:10:50.669540 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/364e3e24-88b1-4936-a970-251d27c101f9-kube-api-access-2pt45" (OuterVolumeSpecName: "kube-api-access-2pt45") pod "364e3e24-88b1-4936-a970-251d27c101f9" (UID: "364e3e24-88b1-4936-a970-251d27c101f9"). InnerVolumeSpecName "kube-api-access-2pt45". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 04:10:50.684881 master-0 kubenswrapper[18592]: I0308 04:10:50.684603 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/364e3e24-88b1-4936-a970-251d27c101f9-util" (OuterVolumeSpecName: "util") pod "364e3e24-88b1-4936-a970-251d27c101f9" (UID: "364e3e24-88b1-4936-a970-251d27c101f9"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 04:10:50.766184 master-0 kubenswrapper[18592]: I0308 04:10:50.766119 18592 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/364e3e24-88b1-4936-a970-251d27c101f9-util\") on node \"master-0\" DevicePath \"\""
Mar 08 04:10:50.766184 master-0 kubenswrapper[18592]: I0308 04:10:50.766171 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2pt45\" (UniqueName: \"kubernetes.io/projected/364e3e24-88b1-4936-a970-251d27c101f9-kube-api-access-2pt45\") on node \"master-0\" DevicePath \"\""
Mar 08 04:10:50.766184 master-0 kubenswrapper[18592]: I0308 04:10:50.766187 18592 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/364e3e24-88b1-4936-a970-251d27c101f9-bundle\") on node \"master-0\" DevicePath \"\""
Mar 08 04:10:51.107573 master-0 kubenswrapper[18592]: I0308 04:10:51.107516 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47mjcs" event={"ID":"10696d0e-9804-4b45-bfb3-b3a126b9c731","Type":"ContainerDied","Data":"1145dc252b6d8d5cc06357b0dc7e2a3a95643d0989575ed56e111fab416830f9"}
Mar 08 04:10:51.108265 master-0 kubenswrapper[18592]: I0308 04:10:51.108243 18592 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1145dc252b6d8d5cc06357b0dc7e2a3a95643d0989575ed56e111fab416830f9"
Mar 08 04:10:51.108352 master-0 kubenswrapper[18592]: I0308 04:10:51.107553 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47mjcs"
Mar 08 04:10:51.111014 master-0 kubenswrapper[18592]: I0308 04:10:51.110970 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82286h4"
Mar 08 04:10:51.111250 master-0 kubenswrapper[18592]: I0308 04:10:51.110957 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82286h4" event={"ID":"364e3e24-88b1-4936-a970-251d27c101f9","Type":"ContainerDied","Data":"01aa94e33dd33fae64119d55ab651f97c65897c1a3af823e62e5245900e05554"}
Mar 08 04:10:51.111316 master-0 kubenswrapper[18592]: I0308 04:10:51.111259 18592 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01aa94e33dd33fae64119d55ab651f97c65897c1a3af823e62e5245900e05554"
Mar 08 04:10:51.114056 master-0 kubenswrapper[18592]: I0308 04:10:51.113996 18592 generic.go:334] "Generic (PLEG): container finished" podID="3350939a-447b-497e-9ed7-2712f1b45784" containerID="9d01ff30900f500ad1cbc70eb4769d6b22ed99d9ea778085bdd503a6c10263bf" exitCode=0
Mar 08 04:10:51.114183 master-0 kubenswrapper[18592]: I0308 04:10:51.114057 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08x6x29" event={"ID":"3350939a-447b-497e-9ed7-2712f1b45784","Type":"ContainerDied","Data":"9d01ff30900f500ad1cbc70eb4769d6b22ed99d9ea778085bdd503a6c10263bf"}
Mar 08 04:10:52.612594 master-0 kubenswrapper[18592]: I0308 04:10:52.612534 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08x6x29"
Mar 08 04:10:52.719319 master-0 kubenswrapper[18592]: I0308 04:10:52.719275 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9bbl\" (UniqueName: \"kubernetes.io/projected/3350939a-447b-497e-9ed7-2712f1b45784-kube-api-access-j9bbl\") pod \"3350939a-447b-497e-9ed7-2712f1b45784\" (UID: \"3350939a-447b-497e-9ed7-2712f1b45784\") "
Mar 08 04:10:52.719632 master-0 kubenswrapper[18592]: I0308 04:10:52.719611 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3350939a-447b-497e-9ed7-2712f1b45784-bundle\") pod \"3350939a-447b-497e-9ed7-2712f1b45784\" (UID: \"3350939a-447b-497e-9ed7-2712f1b45784\") "
Mar 08 04:10:52.719788 master-0 kubenswrapper[18592]: I0308 04:10:52.719769 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3350939a-447b-497e-9ed7-2712f1b45784-util\") pod \"3350939a-447b-497e-9ed7-2712f1b45784\" (UID: \"3350939a-447b-497e-9ed7-2712f1b45784\") "
Mar 08 04:10:52.740940 master-0 kubenswrapper[18592]: I0308 04:10:52.728309 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3350939a-447b-497e-9ed7-2712f1b45784-bundle" (OuterVolumeSpecName: "bundle") pod "3350939a-447b-497e-9ed7-2712f1b45784" (UID: "3350939a-447b-497e-9ed7-2712f1b45784"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 04:10:52.740940 master-0 kubenswrapper[18592]: I0308 04:10:52.730403 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3350939a-447b-497e-9ed7-2712f1b45784-kube-api-access-j9bbl" (OuterVolumeSpecName: "kube-api-access-j9bbl") pod "3350939a-447b-497e-9ed7-2712f1b45784" (UID: "3350939a-447b-497e-9ed7-2712f1b45784"). InnerVolumeSpecName "kube-api-access-j9bbl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 04:10:52.763834 master-0 kubenswrapper[18592]: I0308 04:10:52.763768 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3350939a-447b-497e-9ed7-2712f1b45784-util" (OuterVolumeSpecName: "util") pod "3350939a-447b-497e-9ed7-2712f1b45784" (UID: "3350939a-447b-497e-9ed7-2712f1b45784"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 04:10:52.823906 master-0 kubenswrapper[18592]: I0308 04:10:52.822875 18592 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3350939a-447b-497e-9ed7-2712f1b45784-util\") on node \"master-0\" DevicePath \"\""
Mar 08 04:10:52.823906 master-0 kubenswrapper[18592]: I0308 04:10:52.822936 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9bbl\" (UniqueName: \"kubernetes.io/projected/3350939a-447b-497e-9ed7-2712f1b45784-kube-api-access-j9bbl\") on node \"master-0\" DevicePath \"\""
Mar 08 04:10:52.823906 master-0 kubenswrapper[18592]: I0308 04:10:52.822952 18592 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3350939a-447b-497e-9ed7-2712f1b45784-bundle\") on node \"master-0\" DevicePath \"\""
Mar 08 04:10:53.143929 master-0 kubenswrapper[18592]: I0308 04:10:53.142649 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08x6x29" event={"ID":"3350939a-447b-497e-9ed7-2712f1b45784","Type":"ContainerDied","Data":"848827a10c25732daf0bada9f3e87687252923309373f23ea23eaeeb673a089b"}
Mar 08 04:10:53.143929 master-0 kubenswrapper[18592]: I0308 04:10:53.142692 18592 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="848827a10c25732daf0bada9f3e87687252923309373f23ea23eaeeb673a089b"
Mar 08 04:10:53.143929 master-0 kubenswrapper[18592]: I0308 04:10:53.142674 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08x6x29"
Mar 08 04:10:54.320342 master-0 kubenswrapper[18592]: I0308 04:10:54.320236 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-nhcp5"]
Mar 08 04:10:54.321196 master-0 kubenswrapper[18592]: E0308 04:10:54.320530 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="364e3e24-88b1-4936-a970-251d27c101f9" containerName="extract"
Mar 08 04:10:54.321196 master-0 kubenswrapper[18592]: I0308 04:10:54.320542 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="364e3e24-88b1-4936-a970-251d27c101f9" containerName="extract"
Mar 08 04:10:54.321196 master-0 kubenswrapper[18592]: E0308 04:10:54.320557 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3350939a-447b-497e-9ed7-2712f1b45784" containerName="util"
Mar 08 04:10:54.321196 master-0 kubenswrapper[18592]: I0308 04:10:54.320563 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="3350939a-447b-497e-9ed7-2712f1b45784" containerName="util"
Mar 08 04:10:54.321196 master-0 kubenswrapper[18592]: E0308 04:10:54.320574 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0320b24e-1979-4094-83ee-1b0bed497c48" containerName="pull"
Mar 08 04:10:54.321196 master-0 kubenswrapper[18592]: I0308 04:10:54.320581 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="0320b24e-1979-4094-83ee-1b0bed497c48" containerName="pull"
Mar 08 04:10:54.321196 master-0 kubenswrapper[18592]: E0308 04:10:54.320594 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10696d0e-9804-4b45-bfb3-b3a126b9c731" containerName="util"
Mar 08 04:10:54.321196 master-0 kubenswrapper[18592]: I0308 04:10:54.320600 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="10696d0e-9804-4b45-bfb3-b3a126b9c731" containerName="util"
Mar 08 04:10:54.321196 master-0 kubenswrapper[18592]: E0308 04:10:54.320610 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="364e3e24-88b1-4936-a970-251d27c101f9" containerName="pull"
Mar 08 04:10:54.321196 master-0 kubenswrapper[18592]: I0308 04:10:54.320616 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="364e3e24-88b1-4936-a970-251d27c101f9" containerName="pull"
Mar 08 04:10:54.321196 master-0 kubenswrapper[18592]: E0308 04:10:54.320625 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="364e3e24-88b1-4936-a970-251d27c101f9" containerName="util"
Mar 08 04:10:54.321196 master-0 kubenswrapper[18592]: I0308 04:10:54.320630 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="364e3e24-88b1-4936-a970-251d27c101f9" containerName="util"
Mar 08 04:10:54.321196 master-0 kubenswrapper[18592]: E0308 04:10:54.320642 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3350939a-447b-497e-9ed7-2712f1b45784" containerName="extract"
Mar 08 04:10:54.321196 master-0 kubenswrapper[18592]: I0308 04:10:54.320648 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="3350939a-447b-497e-9ed7-2712f1b45784" containerName="extract"
Mar 08 04:10:54.321196 master-0 kubenswrapper[18592]: E0308 04:10:54.320661 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0320b24e-1979-4094-83ee-1b0bed497c48" containerName="extract"
Mar 08 04:10:54.321196 master-0 kubenswrapper[18592]: I0308 04:10:54.320667 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="0320b24e-1979-4094-83ee-1b0bed497c48" containerName="extract"
Mar 08 04:10:54.321196 master-0 kubenswrapper[18592]: E0308 04:10:54.320680 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10696d0e-9804-4b45-bfb3-b3a126b9c731" containerName="pull"
Mar 08 04:10:54.321196 master-0 kubenswrapper[18592]: I0308 04:10:54.320686 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="10696d0e-9804-4b45-bfb3-b3a126b9c731" containerName="pull"
Mar 08 04:10:54.321196 master-0 kubenswrapper[18592]: E0308 04:10:54.320696 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0320b24e-1979-4094-83ee-1b0bed497c48" containerName="util"
Mar 08 04:10:54.321196 master-0 kubenswrapper[18592]: I0308 04:10:54.320702 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="0320b24e-1979-4094-83ee-1b0bed497c48" containerName="util"
Mar 08 04:10:54.321196 master-0 kubenswrapper[18592]: E0308 04:10:54.320712 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10696d0e-9804-4b45-bfb3-b3a126b9c731" containerName="extract"
Mar 08 04:10:54.321196 master-0 kubenswrapper[18592]: I0308 04:10:54.320718 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="10696d0e-9804-4b45-bfb3-b3a126b9c731" containerName="extract"
Mar 08 04:10:54.321196 master-0 kubenswrapper[18592]: E0308 04:10:54.320727 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3350939a-447b-497e-9ed7-2712f1b45784" containerName="pull"
Mar 08 04:10:54.321196 master-0 kubenswrapper[18592]: I0308 04:10:54.320733 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="3350939a-447b-497e-9ed7-2712f1b45784" containerName="pull"
Mar 08 04:10:54.321196 master-0 kubenswrapper[18592]: I0308 04:10:54.320861 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="3350939a-447b-497e-9ed7-2712f1b45784" containerName="extract"
Mar 08 04:10:54.321196 master-0 kubenswrapper[18592]: I0308 04:10:54.320877 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="364e3e24-88b1-4936-a970-251d27c101f9" containerName="extract"
Mar 08 04:10:54.321196 master-0 kubenswrapper[18592]: I0308 04:10:54.320897 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="0320b24e-1979-4094-83ee-1b0bed497c48" containerName="extract"
Mar 08 04:10:54.321196 master-0 kubenswrapper[18592]: I0308 04:10:54.320917 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="10696d0e-9804-4b45-bfb3-b3a126b9c731" containerName="extract"
Mar 08 04:10:54.322866 master-0 kubenswrapper[18592]: I0308 04:10:54.321333 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-nhcp5"
Mar 08 04:10:54.323802 master-0 kubenswrapper[18592]: I0308 04:10:54.323760 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt"
Mar 08 04:10:54.325002 master-0 kubenswrapper[18592]: I0308 04:10:54.324948 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt"
Mar 08 04:10:54.344076 master-0 kubenswrapper[18592]: I0308 04:10:54.344026 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-nhcp5"]
Mar 08 04:10:54.346851 master-0 kubenswrapper[18592]: I0308 04:10:54.346740 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7z2xd\" (UniqueName: \"kubernetes.io/projected/aaf0c2bc-8c16-4ba9-ab6e-8a06ddba7de6-kube-api-access-7z2xd\") pod \"cert-manager-operator-controller-manager-66c8bdd694-nhcp5\" (UID: \"aaf0c2bc-8c16-4ba9-ab6e-8a06ddba7de6\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-nhcp5"
Mar 08 04:10:54.347068 master-0 kubenswrapper[18592]: I0308 04:10:54.347011 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/aaf0c2bc-8c16-4ba9-ab6e-8a06ddba7de6-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-nhcp5\" (UID: \"aaf0c2bc-8c16-4ba9-ab6e-8a06ddba7de6\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-nhcp5"
Mar 08 04:10:54.448780 master-0 kubenswrapper[18592]: I0308 04:10:54.448709 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/aaf0c2bc-8c16-4ba9-ab6e-8a06ddba7de6-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-nhcp5\" (UID: \"aaf0c2bc-8c16-4ba9-ab6e-8a06ddba7de6\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-nhcp5"
Mar 08 04:10:54.449042 master-0 kubenswrapper[18592]: I0308 04:10:54.448810 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7z2xd\" (UniqueName: \"kubernetes.io/projected/aaf0c2bc-8c16-4ba9-ab6e-8a06ddba7de6-kube-api-access-7z2xd\") pod \"cert-manager-operator-controller-manager-66c8bdd694-nhcp5\" (UID: \"aaf0c2bc-8c16-4ba9-ab6e-8a06ddba7de6\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-nhcp5"
Mar 08 04:10:54.449586 master-0 kubenswrapper[18592]: I0308 04:10:54.449533 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/aaf0c2bc-8c16-4ba9-ab6e-8a06ddba7de6-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-nhcp5\" (UID: \"aaf0c2bc-8c16-4ba9-ab6e-8a06ddba7de6\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-nhcp5"
Mar 08 04:10:54.467396 master-0 kubenswrapper[18592]: I0308 04:10:54.467314 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7z2xd\" (UniqueName: \"kubernetes.io/projected/aaf0c2bc-8c16-4ba9-ab6e-8a06ddba7de6-kube-api-access-7z2xd\") pod \"cert-manager-operator-controller-manager-66c8bdd694-nhcp5\" (UID: \"aaf0c2bc-8c16-4ba9-ab6e-8a06ddba7de6\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-nhcp5"
Mar 08 04:10:54.643747 master-0 kubenswrapper[18592]: I0308 04:10:54.643619 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-nhcp5"
Mar 08 04:10:55.187678 master-0 kubenswrapper[18592]: I0308 04:10:55.187493 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-nhcp5"]
Mar 08 04:10:55.192963 master-0 kubenswrapper[18592]: W0308 04:10:55.192913 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaaf0c2bc_8c16_4ba9_ab6e_8a06ddba7de6.slice/crio-b1474e08abf4f9bf76ffcee16efad7aeb106b7ca01dc6530fc3b01c435671624 WatchSource:0}: Error finding container b1474e08abf4f9bf76ffcee16efad7aeb106b7ca01dc6530fc3b01c435671624: Status 404 returned error can't find the container with id b1474e08abf4f9bf76ffcee16efad7aeb106b7ca01dc6530fc3b01c435671624
Mar 08 04:10:56.168218 master-0 kubenswrapper[18592]: I0308 04:10:56.168157 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-nhcp5" event={"ID":"aaf0c2bc-8c16-4ba9-ab6e-8a06ddba7de6","Type":"ContainerStarted","Data":"b1474e08abf4f9bf76ffcee16efad7aeb106b7ca01dc6530fc3b01c435671624"}
Mar 08 04:10:59.191990 master-0 kubenswrapper[18592]: I0308 04:10:59.191936 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-nhcp5" event={"ID":"aaf0c2bc-8c16-4ba9-ab6e-8a06ddba7de6","Type":"ContainerStarted","Data":"ea68585d82f8a789b730f57b360c56b9a66df320a92b5ad08ec058acd4ef3580"}
Mar 08 04:10:59.228124 master-0 kubenswrapper[18592]: I0308 04:10:59.228030 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-nhcp5" podStartSLOduration=1.518798828 podStartE2EDuration="5.228007545s" podCreationTimestamp="2026-03-08 04:10:54 +0000 UTC" firstStartedPulling="2026-03-08 04:10:55.196370861 +0000 UTC m=+1067.295125201" lastFinishedPulling="2026-03-08 04:10:58.905579558 +0000 UTC m=+1071.004333918" observedRunningTime="2026-03-08 04:10:59.2248128 +0000 UTC m=+1071.323567160" watchObservedRunningTime="2026-03-08 04:10:59.228007545 +0000 UTC m=+1071.326761905"
Mar 08 04:11:05.389700 master-0 kubenswrapper[18592]: I0308 04:11:05.389627 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-ljtmb"]
Mar 08 04:11:05.390629 master-0 kubenswrapper[18592]: I0308 04:11:05.390605 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-ljtmb"
Mar 08 04:11:05.395245 master-0 kubenswrapper[18592]: I0308 04:11:05.395011 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt"
Mar 08 04:11:05.395245 master-0 kubenswrapper[18592]: I0308 04:11:05.395063 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt"
Mar 08 04:11:05.403790 master-0 kubenswrapper[18592]: I0308 04:11:05.403729 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-ljtmb"]
Mar 08 04:11:05.551439 master-0 kubenswrapper[18592]: I0308 04:11:05.551403 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qht84\" (UniqueName: \"kubernetes.io/projected/4e3ba138-68bb-41cd-976b-97275ee70a53-kube-api-access-qht84\") pod \"nmstate-operator-75c5dccd6c-ljtmb\" (UID: \"4e3ba138-68bb-41cd-976b-97275ee70a53\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-ljtmb"
Mar 08 04:11:05.653269 master-0 kubenswrapper[18592]: I0308 04:11:05.653142 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qht84\" (UniqueName: \"kubernetes.io/projected/4e3ba138-68bb-41cd-976b-97275ee70a53-kube-api-access-qht84\") pod \"nmstate-operator-75c5dccd6c-ljtmb\" (UID: \"4e3ba138-68bb-41cd-976b-97275ee70a53\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-ljtmb"
Mar 08 04:11:05.670773 master-0 kubenswrapper[18592]: I0308 04:11:05.670699 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qht84\" (UniqueName: \"kubernetes.io/projected/4e3ba138-68bb-41cd-976b-97275ee70a53-kube-api-access-qht84\") pod \"nmstate-operator-75c5dccd6c-ljtmb\" (UID: \"4e3ba138-68bb-41cd-976b-97275ee70a53\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-ljtmb"
Mar 08 04:11:05.705836
master-0 kubenswrapper[18592]: I0308 04:11:05.705743 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-ljtmb" Mar 08 04:11:06.173578 master-0 kubenswrapper[18592]: I0308 04:11:06.173255 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-ljtmb"] Mar 08 04:11:06.178934 master-0 kubenswrapper[18592]: W0308 04:11:06.178811 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e3ba138_68bb_41cd_976b_97275ee70a53.slice/crio-82b4f1bd6b38228ac5dfe51982706752d6a7f7f06e73b406d128e05b6ca96631 WatchSource:0}: Error finding container 82b4f1bd6b38228ac5dfe51982706752d6a7f7f06e73b406d128e05b6ca96631: Status 404 returned error can't find the container with id 82b4f1bd6b38228ac5dfe51982706752d6a7f7f06e73b406d128e05b6ca96631 Mar 08 04:11:06.242526 master-0 kubenswrapper[18592]: I0308 04:11:06.242412 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-ljtmb" event={"ID":"4e3ba138-68bb-41cd-976b-97275ee70a53","Type":"ContainerStarted","Data":"82b4f1bd6b38228ac5dfe51982706752d6a7f7f06e73b406d128e05b6ca96631"} Mar 08 04:11:06.601539 master-0 kubenswrapper[18592]: I0308 04:11:06.601470 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-pd99m"] Mar 08 04:11:06.602647 master-0 kubenswrapper[18592]: I0308 04:11:06.602616 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-pd99m" Mar 08 04:11:06.605576 master-0 kubenswrapper[18592]: I0308 04:11:06.605524 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 08 04:11:06.607262 master-0 kubenswrapper[18592]: I0308 04:11:06.607213 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 08 04:11:06.624009 master-0 kubenswrapper[18592]: I0308 04:11:06.623944 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-pd99m"] Mar 08 04:11:06.771817 master-0 kubenswrapper[18592]: I0308 04:11:06.771757 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cnlh\" (UniqueName: \"kubernetes.io/projected/f1bf193a-6765-4efd-92b9-e54f2183837b-kube-api-access-9cnlh\") pod \"cert-manager-webhook-6888856db4-pd99m\" (UID: \"f1bf193a-6765-4efd-92b9-e54f2183837b\") " pod="cert-manager/cert-manager-webhook-6888856db4-pd99m" Mar 08 04:11:06.772067 master-0 kubenswrapper[18592]: I0308 04:11:06.771856 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f1bf193a-6765-4efd-92b9-e54f2183837b-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-pd99m\" (UID: \"f1bf193a-6765-4efd-92b9-e54f2183837b\") " pod="cert-manager/cert-manager-webhook-6888856db4-pd99m" Mar 08 04:11:06.874142 master-0 kubenswrapper[18592]: I0308 04:11:06.873979 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cnlh\" (UniqueName: \"kubernetes.io/projected/f1bf193a-6765-4efd-92b9-e54f2183837b-kube-api-access-9cnlh\") pod \"cert-manager-webhook-6888856db4-pd99m\" (UID: \"f1bf193a-6765-4efd-92b9-e54f2183837b\") " pod="cert-manager/cert-manager-webhook-6888856db4-pd99m" Mar 08 
04:11:06.874342 master-0 kubenswrapper[18592]: I0308 04:11:06.874179 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f1bf193a-6765-4efd-92b9-e54f2183837b-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-pd99m\" (UID: \"f1bf193a-6765-4efd-92b9-e54f2183837b\") " pod="cert-manager/cert-manager-webhook-6888856db4-pd99m" Mar 08 04:11:06.912758 master-0 kubenswrapper[18592]: I0308 04:11:06.912676 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-5sntj"] Mar 08 04:11:06.914868 master-0 kubenswrapper[18592]: I0308 04:11:06.914781 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-5sntj" Mar 08 04:11:06.930845 master-0 kubenswrapper[18592]: I0308 04:11:06.930542 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f1bf193a-6765-4efd-92b9-e54f2183837b-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-pd99m\" (UID: \"f1bf193a-6765-4efd-92b9-e54f2183837b\") " pod="cert-manager/cert-manager-webhook-6888856db4-pd99m" Mar 08 04:11:06.932425 master-0 kubenswrapper[18592]: I0308 04:11:06.932375 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cnlh\" (UniqueName: \"kubernetes.io/projected/f1bf193a-6765-4efd-92b9-e54f2183837b-kube-api-access-9cnlh\") pod \"cert-manager-webhook-6888856db4-pd99m\" (UID: \"f1bf193a-6765-4efd-92b9-e54f2183837b\") " pod="cert-manager/cert-manager-webhook-6888856db4-pd99m" Mar 08 04:11:06.967837 master-0 kubenswrapper[18592]: I0308 04:11:06.966018 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-5sntj"] Mar 08 04:11:07.078365 master-0 kubenswrapper[18592]: I0308 04:11:07.078271 18592 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/187ccaeb-95ac-488e-81ab-f904238389a7-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-5sntj\" (UID: \"187ccaeb-95ac-488e-81ab-f904238389a7\") " pod="cert-manager/cert-manager-cainjector-5545bd876-5sntj" Mar 08 04:11:07.078708 master-0 kubenswrapper[18592]: I0308 04:11:07.078643 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dc9jt\" (UniqueName: \"kubernetes.io/projected/187ccaeb-95ac-488e-81ab-f904238389a7-kube-api-access-dc9jt\") pod \"cert-manager-cainjector-5545bd876-5sntj\" (UID: \"187ccaeb-95ac-488e-81ab-f904238389a7\") " pod="cert-manager/cert-manager-cainjector-5545bd876-5sntj" Mar 08 04:11:07.180130 master-0 kubenswrapper[18592]: I0308 04:11:07.179975 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/187ccaeb-95ac-488e-81ab-f904238389a7-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-5sntj\" (UID: \"187ccaeb-95ac-488e-81ab-f904238389a7\") " pod="cert-manager/cert-manager-cainjector-5545bd876-5sntj" Mar 08 04:11:07.180363 master-0 kubenswrapper[18592]: I0308 04:11:07.180203 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dc9jt\" (UniqueName: \"kubernetes.io/projected/187ccaeb-95ac-488e-81ab-f904238389a7-kube-api-access-dc9jt\") pod \"cert-manager-cainjector-5545bd876-5sntj\" (UID: \"187ccaeb-95ac-488e-81ab-f904238389a7\") " pod="cert-manager/cert-manager-cainjector-5545bd876-5sntj" Mar 08 04:11:07.199529 master-0 kubenswrapper[18592]: I0308 04:11:07.199482 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/187ccaeb-95ac-488e-81ab-f904238389a7-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-5sntj\" (UID: 
\"187ccaeb-95ac-488e-81ab-f904238389a7\") " pod="cert-manager/cert-manager-cainjector-5545bd876-5sntj" Mar 08 04:11:07.214858 master-0 kubenswrapper[18592]: I0308 04:11:07.211756 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dc9jt\" (UniqueName: \"kubernetes.io/projected/187ccaeb-95ac-488e-81ab-f904238389a7-kube-api-access-dc9jt\") pod \"cert-manager-cainjector-5545bd876-5sntj\" (UID: \"187ccaeb-95ac-488e-81ab-f904238389a7\") " pod="cert-manager/cert-manager-cainjector-5545bd876-5sntj" Mar 08 04:11:07.224175 master-0 kubenswrapper[18592]: I0308 04:11:07.222466 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-pd99m" Mar 08 04:11:07.275493 master-0 kubenswrapper[18592]: I0308 04:11:07.275436 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-5sntj" Mar 08 04:11:07.723192 master-0 kubenswrapper[18592]: W0308 04:11:07.723126 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1bf193a_6765_4efd_92b9_e54f2183837b.slice/crio-3d45b9eccce2b2790a0c9b27ed5323ccfa174876e20b2e32f35319c55b31976a WatchSource:0}: Error finding container 3d45b9eccce2b2790a0c9b27ed5323ccfa174876e20b2e32f35319c55b31976a: Status 404 returned error can't find the container with id 3d45b9eccce2b2790a0c9b27ed5323ccfa174876e20b2e32f35319c55b31976a Mar 08 04:11:07.723916 master-0 kubenswrapper[18592]: I0308 04:11:07.723853 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-pd99m"] Mar 08 04:11:07.784119 master-0 kubenswrapper[18592]: I0308 04:11:07.784047 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-5sntj"] Mar 08 04:11:07.787654 master-0 kubenswrapper[18592]: W0308 04:11:07.787589 18592 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod187ccaeb_95ac_488e_81ab_f904238389a7.slice/crio-6d50bccef15b4c7bd37250d5479d4dfe096eba3233c3efdabd57e35941570f78 WatchSource:0}: Error finding container 6d50bccef15b4c7bd37250d5479d4dfe096eba3233c3efdabd57e35941570f78: Status 404 returned error can't find the container with id 6d50bccef15b4c7bd37250d5479d4dfe096eba3233c3efdabd57e35941570f78 Mar 08 04:11:08.287335 master-0 kubenswrapper[18592]: I0308 04:11:08.287243 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-pd99m" event={"ID":"f1bf193a-6765-4efd-92b9-e54f2183837b","Type":"ContainerStarted","Data":"3d45b9eccce2b2790a0c9b27ed5323ccfa174876e20b2e32f35319c55b31976a"} Mar 08 04:11:08.288968 master-0 kubenswrapper[18592]: I0308 04:11:08.288907 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-5sntj" event={"ID":"187ccaeb-95ac-488e-81ab-f904238389a7","Type":"ContainerStarted","Data":"6d50bccef15b4c7bd37250d5479d4dfe096eba3233c3efdabd57e35941570f78"} Mar 08 04:11:10.316367 master-0 kubenswrapper[18592]: I0308 04:11:10.316310 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-ljtmb" event={"ID":"4e3ba138-68bb-41cd-976b-97275ee70a53","Type":"ContainerStarted","Data":"c8a68d86650a8f990e4f76c30049cb2f9bc0cfca5611372bf96554cc7f4c353a"} Mar 08 04:11:10.345734 master-0 kubenswrapper[18592]: I0308 04:11:10.345585 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-ljtmb" podStartSLOduration=1.934602204 podStartE2EDuration="5.345561489s" podCreationTimestamp="2026-03-08 04:11:05 +0000 UTC" firstStartedPulling="2026-03-08 04:11:06.180532033 +0000 UTC m=+1078.279286383" lastFinishedPulling="2026-03-08 04:11:09.591491308 +0000 UTC m=+1081.690245668" 
observedRunningTime="2026-03-08 04:11:10.343158645 +0000 UTC m=+1082.441913035" watchObservedRunningTime="2026-03-08 04:11:10.345561489 +0000 UTC m=+1082.444315849" Mar 08 04:11:13.960841 master-0 kubenswrapper[18592]: I0308 04:11:13.959394 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-c7f497896-fbcg7"] Mar 08 04:11:13.960841 master-0 kubenswrapper[18592]: I0308 04:11:13.960485 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-c7f497896-fbcg7" Mar 08 04:11:13.962758 master-0 kubenswrapper[18592]: I0308 04:11:13.962725 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 08 04:11:13.962948 master-0 kubenswrapper[18592]: I0308 04:11:13.962877 18592 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 08 04:11:13.963007 master-0 kubenswrapper[18592]: I0308 04:11:13.962999 18592 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 08 04:11:13.963433 master-0 kubenswrapper[18592]: I0308 04:11:13.963119 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 08 04:11:14.003696 master-0 kubenswrapper[18592]: I0308 04:11:14.003650 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-c7f497896-fbcg7"] Mar 08 04:11:14.064840 master-0 kubenswrapper[18592]: I0308 04:11:14.064755 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4rg5\" (UniqueName: \"kubernetes.io/projected/d0a838e8-3b7d-4368-99bc-21c83edb251a-kube-api-access-c4rg5\") pod \"metallb-operator-controller-manager-c7f497896-fbcg7\" (UID: \"d0a838e8-3b7d-4368-99bc-21c83edb251a\") " 
pod="metallb-system/metallb-operator-controller-manager-c7f497896-fbcg7" Mar 08 04:11:14.065067 master-0 kubenswrapper[18592]: I0308 04:11:14.064882 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d0a838e8-3b7d-4368-99bc-21c83edb251a-apiservice-cert\") pod \"metallb-operator-controller-manager-c7f497896-fbcg7\" (UID: \"d0a838e8-3b7d-4368-99bc-21c83edb251a\") " pod="metallb-system/metallb-operator-controller-manager-c7f497896-fbcg7" Mar 08 04:11:14.065067 master-0 kubenswrapper[18592]: I0308 04:11:14.064951 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d0a838e8-3b7d-4368-99bc-21c83edb251a-webhook-cert\") pod \"metallb-operator-controller-manager-c7f497896-fbcg7\" (UID: \"d0a838e8-3b7d-4368-99bc-21c83edb251a\") " pod="metallb-system/metallb-operator-controller-manager-c7f497896-fbcg7" Mar 08 04:11:14.168843 master-0 kubenswrapper[18592]: I0308 04:11:14.168372 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d0a838e8-3b7d-4368-99bc-21c83edb251a-apiservice-cert\") pod \"metallb-operator-controller-manager-c7f497896-fbcg7\" (UID: \"d0a838e8-3b7d-4368-99bc-21c83edb251a\") " pod="metallb-system/metallb-operator-controller-manager-c7f497896-fbcg7" Mar 08 04:11:14.168843 master-0 kubenswrapper[18592]: I0308 04:11:14.168452 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d0a838e8-3b7d-4368-99bc-21c83edb251a-webhook-cert\") pod \"metallb-operator-controller-manager-c7f497896-fbcg7\" (UID: \"d0a838e8-3b7d-4368-99bc-21c83edb251a\") " pod="metallb-system/metallb-operator-controller-manager-c7f497896-fbcg7" Mar 08 04:11:14.168843 master-0 kubenswrapper[18592]: I0308 04:11:14.168518 18592 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4rg5\" (UniqueName: \"kubernetes.io/projected/d0a838e8-3b7d-4368-99bc-21c83edb251a-kube-api-access-c4rg5\") pod \"metallb-operator-controller-manager-c7f497896-fbcg7\" (UID: \"d0a838e8-3b7d-4368-99bc-21c83edb251a\") " pod="metallb-system/metallb-operator-controller-manager-c7f497896-fbcg7" Mar 08 04:11:14.182613 master-0 kubenswrapper[18592]: I0308 04:11:14.173197 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d0a838e8-3b7d-4368-99bc-21c83edb251a-apiservice-cert\") pod \"metallb-operator-controller-manager-c7f497896-fbcg7\" (UID: \"d0a838e8-3b7d-4368-99bc-21c83edb251a\") " pod="metallb-system/metallb-operator-controller-manager-c7f497896-fbcg7" Mar 08 04:11:14.182613 master-0 kubenswrapper[18592]: I0308 04:11:14.174010 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d0a838e8-3b7d-4368-99bc-21c83edb251a-webhook-cert\") pod \"metallb-operator-controller-manager-c7f497896-fbcg7\" (UID: \"d0a838e8-3b7d-4368-99bc-21c83edb251a\") " pod="metallb-system/metallb-operator-controller-manager-c7f497896-fbcg7" Mar 08 04:11:14.219930 master-0 kubenswrapper[18592]: I0308 04:11:14.218939 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4rg5\" (UniqueName: \"kubernetes.io/projected/d0a838e8-3b7d-4368-99bc-21c83edb251a-kube-api-access-c4rg5\") pod \"metallb-operator-controller-manager-c7f497896-fbcg7\" (UID: \"d0a838e8-3b7d-4368-99bc-21c83edb251a\") " pod="metallb-system/metallb-operator-controller-manager-c7f497896-fbcg7" Mar 08 04:11:14.289846 master-0 kubenswrapper[18592]: I0308 04:11:14.289327 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-c7f497896-fbcg7" Mar 08 04:11:15.123781 master-0 kubenswrapper[18592]: I0308 04:11:15.123687 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-7d4ccb5df5-l4btv"] Mar 08 04:11:15.124815 master-0 kubenswrapper[18592]: I0308 04:11:15.124770 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7d4ccb5df5-l4btv" Mar 08 04:11:15.130846 master-0 kubenswrapper[18592]: I0308 04:11:15.127938 18592 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 08 04:11:15.130846 master-0 kubenswrapper[18592]: I0308 04:11:15.128165 18592 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 08 04:11:15.153528 master-0 kubenswrapper[18592]: I0308 04:11:15.153455 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7d4ccb5df5-l4btv"] Mar 08 04:11:15.203846 master-0 kubenswrapper[18592]: I0308 04:11:15.203038 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ba434b94-1eeb-4d82-89a2-b3c8fa4a998d-apiservice-cert\") pod \"metallb-operator-webhook-server-7d4ccb5df5-l4btv\" (UID: \"ba434b94-1eeb-4d82-89a2-b3c8fa4a998d\") " pod="metallb-system/metallb-operator-webhook-server-7d4ccb5df5-l4btv" Mar 08 04:11:15.203846 master-0 kubenswrapper[18592]: I0308 04:11:15.203101 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjqcr\" (UniqueName: \"kubernetes.io/projected/ba434b94-1eeb-4d82-89a2-b3c8fa4a998d-kube-api-access-jjqcr\") pod \"metallb-operator-webhook-server-7d4ccb5df5-l4btv\" (UID: \"ba434b94-1eeb-4d82-89a2-b3c8fa4a998d\") " 
pod="metallb-system/metallb-operator-webhook-server-7d4ccb5df5-l4btv" Mar 08 04:11:15.203846 master-0 kubenswrapper[18592]: I0308 04:11:15.203133 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ba434b94-1eeb-4d82-89a2-b3c8fa4a998d-webhook-cert\") pod \"metallb-operator-webhook-server-7d4ccb5df5-l4btv\" (UID: \"ba434b94-1eeb-4d82-89a2-b3c8fa4a998d\") " pod="metallb-system/metallb-operator-webhook-server-7d4ccb5df5-l4btv" Mar 08 04:11:15.305684 master-0 kubenswrapper[18592]: I0308 04:11:15.305640 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ba434b94-1eeb-4d82-89a2-b3c8fa4a998d-apiservice-cert\") pod \"metallb-operator-webhook-server-7d4ccb5df5-l4btv\" (UID: \"ba434b94-1eeb-4d82-89a2-b3c8fa4a998d\") " pod="metallb-system/metallb-operator-webhook-server-7d4ccb5df5-l4btv" Mar 08 04:11:15.306026 master-0 kubenswrapper[18592]: I0308 04:11:15.305996 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjqcr\" (UniqueName: \"kubernetes.io/projected/ba434b94-1eeb-4d82-89a2-b3c8fa4a998d-kube-api-access-jjqcr\") pod \"metallb-operator-webhook-server-7d4ccb5df5-l4btv\" (UID: \"ba434b94-1eeb-4d82-89a2-b3c8fa4a998d\") " pod="metallb-system/metallb-operator-webhook-server-7d4ccb5df5-l4btv" Mar 08 04:11:15.306187 master-0 kubenswrapper[18592]: I0308 04:11:15.306172 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ba434b94-1eeb-4d82-89a2-b3c8fa4a998d-webhook-cert\") pod \"metallb-operator-webhook-server-7d4ccb5df5-l4btv\" (UID: \"ba434b94-1eeb-4d82-89a2-b3c8fa4a998d\") " pod="metallb-system/metallb-operator-webhook-server-7d4ccb5df5-l4btv" Mar 08 04:11:15.313070 master-0 kubenswrapper[18592]: I0308 04:11:15.312978 18592 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ba434b94-1eeb-4d82-89a2-b3c8fa4a998d-apiservice-cert\") pod \"metallb-operator-webhook-server-7d4ccb5df5-l4btv\" (UID: \"ba434b94-1eeb-4d82-89a2-b3c8fa4a998d\") " pod="metallb-system/metallb-operator-webhook-server-7d4ccb5df5-l4btv" Mar 08 04:11:15.314086 master-0 kubenswrapper[18592]: I0308 04:11:15.313841 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ba434b94-1eeb-4d82-89a2-b3c8fa4a998d-webhook-cert\") pod \"metallb-operator-webhook-server-7d4ccb5df5-l4btv\" (UID: \"ba434b94-1eeb-4d82-89a2-b3c8fa4a998d\") " pod="metallb-system/metallb-operator-webhook-server-7d4ccb5df5-l4btv" Mar 08 04:11:15.372894 master-0 kubenswrapper[18592]: I0308 04:11:15.369655 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjqcr\" (UniqueName: \"kubernetes.io/projected/ba434b94-1eeb-4d82-89a2-b3c8fa4a998d-kube-api-access-jjqcr\") pod \"metallb-operator-webhook-server-7d4ccb5df5-l4btv\" (UID: \"ba434b94-1eeb-4d82-89a2-b3c8fa4a998d\") " pod="metallb-system/metallb-operator-webhook-server-7d4ccb5df5-l4btv" Mar 08 04:11:15.460616 master-0 kubenswrapper[18592]: I0308 04:11:15.460553 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7d4ccb5df5-l4btv" Mar 08 04:11:15.551844 master-0 kubenswrapper[18592]: I0308 04:11:15.551338 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-545d4d4674-clqjb"] Mar 08 04:11:15.555841 master-0 kubenswrapper[18592]: I0308 04:11:15.552306 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-clqjb" Mar 08 04:11:15.623848 master-0 kubenswrapper[18592]: I0308 04:11:15.619991 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-clqjb"] Mar 08 04:11:15.714862 master-0 kubenswrapper[18592]: I0308 04:11:15.714144 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9ctg\" (UniqueName: \"kubernetes.io/projected/ad51020b-c6a5-4241-a690-d90bdece2a0a-kube-api-access-r9ctg\") pod \"cert-manager-545d4d4674-clqjb\" (UID: \"ad51020b-c6a5-4241-a690-d90bdece2a0a\") " pod="cert-manager/cert-manager-545d4d4674-clqjb" Mar 08 04:11:15.714862 master-0 kubenswrapper[18592]: I0308 04:11:15.714253 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ad51020b-c6a5-4241-a690-d90bdece2a0a-bound-sa-token\") pod \"cert-manager-545d4d4674-clqjb\" (UID: \"ad51020b-c6a5-4241-a690-d90bdece2a0a\") " pod="cert-manager/cert-manager-545d4d4674-clqjb" Mar 08 04:11:15.816371 master-0 kubenswrapper[18592]: I0308 04:11:15.816287 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9ctg\" (UniqueName: \"kubernetes.io/projected/ad51020b-c6a5-4241-a690-d90bdece2a0a-kube-api-access-r9ctg\") pod \"cert-manager-545d4d4674-clqjb\" (UID: \"ad51020b-c6a5-4241-a690-d90bdece2a0a\") " pod="cert-manager/cert-manager-545d4d4674-clqjb" Mar 08 04:11:15.816371 master-0 kubenswrapper[18592]: I0308 04:11:15.816347 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ad51020b-c6a5-4241-a690-d90bdece2a0a-bound-sa-token\") pod \"cert-manager-545d4d4674-clqjb\" (UID: \"ad51020b-c6a5-4241-a690-d90bdece2a0a\") " pod="cert-manager/cert-manager-545d4d4674-clqjb" Mar 08 04:11:15.845869 master-0 
kubenswrapper[18592]: I0308 04:11:15.844881 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9ctg\" (UniqueName: \"kubernetes.io/projected/ad51020b-c6a5-4241-a690-d90bdece2a0a-kube-api-access-r9ctg\") pod \"cert-manager-545d4d4674-clqjb\" (UID: \"ad51020b-c6a5-4241-a690-d90bdece2a0a\") " pod="cert-manager/cert-manager-545d4d4674-clqjb" Mar 08 04:11:15.851851 master-0 kubenswrapper[18592]: I0308 04:11:15.849290 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ad51020b-c6a5-4241-a690-d90bdece2a0a-bound-sa-token\") pod \"cert-manager-545d4d4674-clqjb\" (UID: \"ad51020b-c6a5-4241-a690-d90bdece2a0a\") " pod="cert-manager/cert-manager-545d4d4674-clqjb" Mar 08 04:11:15.887204 master-0 kubenswrapper[18592]: I0308 04:11:15.887116 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-clqjb" Mar 08 04:11:17.549580 master-0 kubenswrapper[18592]: I0308 04:11:17.549536 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-c7f497896-fbcg7"] Mar 08 04:11:17.773457 master-0 kubenswrapper[18592]: W0308 04:11:17.773409 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba434b94_1eeb_4d82_89a2_b3c8fa4a998d.slice/crio-7cf848bc18a13b54198089dc63e2585ef8c8684497a08bce58180da54f1f15cc WatchSource:0}: Error finding container 7cf848bc18a13b54198089dc63e2585ef8c8684497a08bce58180da54f1f15cc: Status 404 returned error can't find the container with id 7cf848bc18a13b54198089dc63e2585ef8c8684497a08bce58180da54f1f15cc Mar 08 04:11:17.774207 master-0 kubenswrapper[18592]: I0308 04:11:17.774146 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7d4ccb5df5-l4btv"] Mar 08 04:11:17.847792 master-0 
kubenswrapper[18592]: I0308 04:11:17.847735 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-clqjb"] Mar 08 04:11:17.872571 master-0 kubenswrapper[18592]: W0308 04:11:17.872509 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podad51020b_c6a5_4241_a690_d90bdece2a0a.slice/crio-704375e9add37cb408e470b0bfcd81bef326e6036b21d4acae48b73ff2dfdd0b WatchSource:0}: Error finding container 704375e9add37cb408e470b0bfcd81bef326e6036b21d4acae48b73ff2dfdd0b: Status 404 returned error can't find the container with id 704375e9add37cb408e470b0bfcd81bef326e6036b21d4acae48b73ff2dfdd0b Mar 08 04:11:18.451134 master-0 kubenswrapper[18592]: I0308 04:11:18.451036 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-pd99m" event={"ID":"f1bf193a-6765-4efd-92b9-e54f2183837b","Type":"ContainerStarted","Data":"2bb38940697d775c188fde3222d8cb7c237982fd21f429b615c8a214a02f93f4"} Mar 08 04:11:18.451618 master-0 kubenswrapper[18592]: I0308 04:11:18.451162 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-6888856db4-pd99m" Mar 08 04:11:18.453491 master-0 kubenswrapper[18592]: I0308 04:11:18.453434 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-c7f497896-fbcg7" event={"ID":"d0a838e8-3b7d-4368-99bc-21c83edb251a","Type":"ContainerStarted","Data":"f47c173723f898cfe50133e24de778c69af3fa9758e1c9a820bda7f030f314d7"} Mar 08 04:11:18.456014 master-0 kubenswrapper[18592]: I0308 04:11:18.455958 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-clqjb" event={"ID":"ad51020b-c6a5-4241-a690-d90bdece2a0a","Type":"ContainerStarted","Data":"55eed6e3b6a863697b1aef7b119a046fd6d8034a5c4e6f2335935a088802234b"} Mar 08 04:11:18.456014 master-0 kubenswrapper[18592]: I0308 
04:11:18.455999 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-clqjb" event={"ID":"ad51020b-c6a5-4241-a690-d90bdece2a0a","Type":"ContainerStarted","Data":"704375e9add37cb408e470b0bfcd81bef326e6036b21d4acae48b73ff2dfdd0b"} Mar 08 04:11:18.458338 master-0 kubenswrapper[18592]: I0308 04:11:18.458232 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7d4ccb5df5-l4btv" event={"ID":"ba434b94-1eeb-4d82-89a2-b3c8fa4a998d","Type":"ContainerStarted","Data":"7cf848bc18a13b54198089dc63e2585ef8c8684497a08bce58180da54f1f15cc"} Mar 08 04:11:18.460860 master-0 kubenswrapper[18592]: I0308 04:11:18.460798 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-5sntj" event={"ID":"187ccaeb-95ac-488e-81ab-f904238389a7","Type":"ContainerStarted","Data":"f4c364950b19e336f08f7c4137b9416a5a536fe106580a24d13cc97286651e68"} Mar 08 04:11:18.494175 master-0 kubenswrapper[18592]: I0308 04:11:18.494108 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-545d4d4674-clqjb" podStartSLOduration=3.494092057 podStartE2EDuration="3.494092057s" podCreationTimestamp="2026-03-08 04:11:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 04:11:18.493230744 +0000 UTC m=+1090.591985094" watchObservedRunningTime="2026-03-08 04:11:18.494092057 +0000 UTC m=+1090.592846407" Mar 08 04:11:18.499189 master-0 kubenswrapper[18592]: I0308 04:11:18.499135 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-6888856db4-pd99m" podStartSLOduration=3.092317605 podStartE2EDuration="12.499124831s" podCreationTimestamp="2026-03-08 04:11:06 +0000 UTC" firstStartedPulling="2026-03-08 04:11:07.725958088 +0000 UTC m=+1079.824712478" lastFinishedPulling="2026-03-08 
04:11:17.132765354 +0000 UTC m=+1089.231519704" observedRunningTime="2026-03-08 04:11:18.476135948 +0000 UTC m=+1090.574890288" watchObservedRunningTime="2026-03-08 04:11:18.499124831 +0000 UTC m=+1090.597879171" Mar 08 04:11:18.547438 master-0 kubenswrapper[18592]: I0308 04:11:18.547356 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-5545bd876-5sntj" podStartSLOduration=3.231585873 podStartE2EDuration="12.547326954s" podCreationTimestamp="2026-03-08 04:11:06 +0000 UTC" firstStartedPulling="2026-03-08 04:11:07.790294561 +0000 UTC m=+1079.889048921" lastFinishedPulling="2026-03-08 04:11:17.106035662 +0000 UTC m=+1089.204790002" observedRunningTime="2026-03-08 04:11:18.545055803 +0000 UTC m=+1090.643810163" watchObservedRunningTime="2026-03-08 04:11:18.547326954 +0000 UTC m=+1090.646081294" Mar 08 04:11:20.742430 master-0 kubenswrapper[18592]: I0308 04:11:20.742374 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-szjgg"] Mar 08 04:11:20.744195 master-0 kubenswrapper[18592]: I0308 04:11:20.744165 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-szjgg" Mar 08 04:11:20.747539 master-0 kubenswrapper[18592]: I0308 04:11:20.747504 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Mar 08 04:11:20.747686 master-0 kubenswrapper[18592]: I0308 04:11:20.747559 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Mar 08 04:11:20.766546 master-0 kubenswrapper[18592]: I0308 04:11:20.766504 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-szjgg"] Mar 08 04:11:20.808587 master-0 kubenswrapper[18592]: I0308 04:11:20.808529 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-85bf586469-sb4gb"] Mar 08 04:11:20.810100 master-0 kubenswrapper[18592]: I0308 04:11:20.809465 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-85bf586469-sb4gb" Mar 08 04:11:20.812951 master-0 kubenswrapper[18592]: I0308 04:11:20.812915 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Mar 08 04:11:20.833074 master-0 kubenswrapper[18592]: I0308 04:11:20.833004 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-85bf586469-42ffh"] Mar 08 04:11:20.837167 master-0 kubenswrapper[18592]: I0308 04:11:20.836793 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxncg\" (UniqueName: \"kubernetes.io/projected/776ddb55-954f-445c-933b-a94b65fa57af-kube-api-access-hxncg\") pod \"obo-prometheus-operator-68bc856cb9-szjgg\" (UID: \"776ddb55-954f-445c-933b-a94b65fa57af\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-szjgg" Mar 08 04:11:20.837167 master-0 kubenswrapper[18592]: I0308 04:11:20.836907 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-85bf586469-42ffh" Mar 08 04:11:20.840042 master-0 kubenswrapper[18592]: I0308 04:11:20.839511 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-85bf586469-sb4gb"] Mar 08 04:11:20.857071 master-0 kubenswrapper[18592]: I0308 04:11:20.857016 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-85bf586469-42ffh"] Mar 08 04:11:20.943594 master-0 kubenswrapper[18592]: I0308 04:11:20.943547 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxncg\" (UniqueName: \"kubernetes.io/projected/776ddb55-954f-445c-933b-a94b65fa57af-kube-api-access-hxncg\") pod \"obo-prometheus-operator-68bc856cb9-szjgg\" (UID: \"776ddb55-954f-445c-933b-a94b65fa57af\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-szjgg" Mar 08 04:11:20.943815 master-0 kubenswrapper[18592]: I0308 04:11:20.943606 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/837891ca-c979-429e-a13d-e3f1a5779648-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-85bf586469-sb4gb\" (UID: \"837891ca-c979-429e-a13d-e3f1a5779648\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-85bf586469-sb4gb" Mar 08 04:11:20.943815 master-0 kubenswrapper[18592]: I0308 04:11:20.943635 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/837891ca-c979-429e-a13d-e3f1a5779648-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-85bf586469-sb4gb\" (UID: \"837891ca-c979-429e-a13d-e3f1a5779648\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-85bf586469-sb4gb" Mar 08 04:11:20.943815 
master-0 kubenswrapper[18592]: I0308 04:11:20.943716 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f0c8d80a-0bab-4de5-ba90-d20adaf211c6-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-85bf586469-42ffh\" (UID: \"f0c8d80a-0bab-4de5-ba90-d20adaf211c6\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-85bf586469-42ffh" Mar 08 04:11:20.944072 master-0 kubenswrapper[18592]: I0308 04:11:20.944003 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f0c8d80a-0bab-4de5-ba90-d20adaf211c6-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-85bf586469-42ffh\" (UID: \"f0c8d80a-0bab-4de5-ba90-d20adaf211c6\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-85bf586469-42ffh" Mar 08 04:11:20.969285 master-0 kubenswrapper[18592]: I0308 04:11:20.967268 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxncg\" (UniqueName: \"kubernetes.io/projected/776ddb55-954f-445c-933b-a94b65fa57af-kube-api-access-hxncg\") pod \"obo-prometheus-operator-68bc856cb9-szjgg\" (UID: \"776ddb55-954f-445c-933b-a94b65fa57af\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-szjgg" Mar 08 04:11:20.981055 master-0 kubenswrapper[18592]: I0308 04:11:20.979571 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-477v2"] Mar 08 04:11:20.981987 master-0 kubenswrapper[18592]: I0308 04:11:20.981955 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-477v2" Mar 08 04:11:20.986685 master-0 kubenswrapper[18592]: I0308 04:11:20.986601 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Mar 08 04:11:21.001486 master-0 kubenswrapper[18592]: I0308 04:11:21.001451 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-477v2"] Mar 08 04:11:21.045367 master-0 kubenswrapper[18592]: I0308 04:11:21.045305 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/837891ca-c979-429e-a13d-e3f1a5779648-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-85bf586469-sb4gb\" (UID: \"837891ca-c979-429e-a13d-e3f1a5779648\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-85bf586469-sb4gb" Mar 08 04:11:21.045367 master-0 kubenswrapper[18592]: I0308 04:11:21.045355 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/837891ca-c979-429e-a13d-e3f1a5779648-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-85bf586469-sb4gb\" (UID: \"837891ca-c979-429e-a13d-e3f1a5779648\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-85bf586469-sb4gb" Mar 08 04:11:21.045885 master-0 kubenswrapper[18592]: I0308 04:11:21.045857 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f0c8d80a-0bab-4de5-ba90-d20adaf211c6-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-85bf586469-42ffh\" (UID: \"f0c8d80a-0bab-4de5-ba90-d20adaf211c6\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-85bf586469-42ffh" Mar 08 04:11:21.045951 master-0 kubenswrapper[18592]: I0308 04:11:21.045938 18592 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f0c8d80a-0bab-4de5-ba90-d20adaf211c6-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-85bf586469-42ffh\" (UID: \"f0c8d80a-0bab-4de5-ba90-d20adaf211c6\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-85bf586469-42ffh" Mar 08 04:11:21.049326 master-0 kubenswrapper[18592]: I0308 04:11:21.049265 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f0c8d80a-0bab-4de5-ba90-d20adaf211c6-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-85bf586469-42ffh\" (UID: \"f0c8d80a-0bab-4de5-ba90-d20adaf211c6\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-85bf586469-42ffh" Mar 08 04:11:21.049326 master-0 kubenswrapper[18592]: I0308 04:11:21.049312 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/837891ca-c979-429e-a13d-e3f1a5779648-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-85bf586469-sb4gb\" (UID: \"837891ca-c979-429e-a13d-e3f1a5779648\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-85bf586469-sb4gb" Mar 08 04:11:21.050344 master-0 kubenswrapper[18592]: I0308 04:11:21.050306 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/837891ca-c979-429e-a13d-e3f1a5779648-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-85bf586469-sb4gb\" (UID: \"837891ca-c979-429e-a13d-e3f1a5779648\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-85bf586469-sb4gb" Mar 08 04:11:21.051401 master-0 kubenswrapper[18592]: I0308 04:11:21.051367 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/f0c8d80a-0bab-4de5-ba90-d20adaf211c6-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-85bf586469-42ffh\" (UID: \"f0c8d80a-0bab-4de5-ba90-d20adaf211c6\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-85bf586469-42ffh" Mar 08 04:11:21.067737 master-0 kubenswrapper[18592]: I0308 04:11:21.067683 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-szjgg" Mar 08 04:11:21.128848 master-0 kubenswrapper[18592]: I0308 04:11:21.125866 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-85bf586469-sb4gb" Mar 08 04:11:21.148307 master-0 kubenswrapper[18592]: I0308 04:11:21.147162 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/100ca7f0-6139-4fa1-96d0-7cd231a2c39a-observability-operator-tls\") pod \"observability-operator-59bdc8b94-477v2\" (UID: \"100ca7f0-6139-4fa1-96d0-7cd231a2c39a\") " pod="openshift-operators/observability-operator-59bdc8b94-477v2" Mar 08 04:11:21.148307 master-0 kubenswrapper[18592]: I0308 04:11:21.147238 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snztz\" (UniqueName: \"kubernetes.io/projected/100ca7f0-6139-4fa1-96d0-7cd231a2c39a-kube-api-access-snztz\") pod \"observability-operator-59bdc8b94-477v2\" (UID: \"100ca7f0-6139-4fa1-96d0-7cd231a2c39a\") " pod="openshift-operators/observability-operator-59bdc8b94-477v2" Mar 08 04:11:21.159197 master-0 kubenswrapper[18592]: I0308 04:11:21.159032 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-85bf586469-42ffh" Mar 08 04:11:21.192434 master-0 kubenswrapper[18592]: I0308 04:11:21.192372 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-cpg92"] Mar 08 04:11:21.193296 master-0 kubenswrapper[18592]: I0308 04:11:21.193276 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-cpg92" Mar 08 04:11:21.231736 master-0 kubenswrapper[18592]: I0308 04:11:21.223238 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-cpg92"] Mar 08 04:11:21.250028 master-0 kubenswrapper[18592]: I0308 04:11:21.249968 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/100ca7f0-6139-4fa1-96d0-7cd231a2c39a-observability-operator-tls\") pod \"observability-operator-59bdc8b94-477v2\" (UID: \"100ca7f0-6139-4fa1-96d0-7cd231a2c39a\") " pod="openshift-operators/observability-operator-59bdc8b94-477v2" Mar 08 04:11:21.250249 master-0 kubenswrapper[18592]: I0308 04:11:21.250027 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snztz\" (UniqueName: \"kubernetes.io/projected/100ca7f0-6139-4fa1-96d0-7cd231a2c39a-kube-api-access-snztz\") pod \"observability-operator-59bdc8b94-477v2\" (UID: \"100ca7f0-6139-4fa1-96d0-7cd231a2c39a\") " pod="openshift-operators/observability-operator-59bdc8b94-477v2" Mar 08 04:11:21.259133 master-0 kubenswrapper[18592]: I0308 04:11:21.257907 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/100ca7f0-6139-4fa1-96d0-7cd231a2c39a-observability-operator-tls\") pod \"observability-operator-59bdc8b94-477v2\" (UID: \"100ca7f0-6139-4fa1-96d0-7cd231a2c39a\") " 
pod="openshift-operators/observability-operator-59bdc8b94-477v2" Mar 08 04:11:21.277257 master-0 kubenswrapper[18592]: I0308 04:11:21.277220 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snztz\" (UniqueName: \"kubernetes.io/projected/100ca7f0-6139-4fa1-96d0-7cd231a2c39a-kube-api-access-snztz\") pod \"observability-operator-59bdc8b94-477v2\" (UID: \"100ca7f0-6139-4fa1-96d0-7cd231a2c39a\") " pod="openshift-operators/observability-operator-59bdc8b94-477v2" Mar 08 04:11:21.320847 master-0 kubenswrapper[18592]: I0308 04:11:21.314335 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-477v2" Mar 08 04:11:21.351680 master-0 kubenswrapper[18592]: I0308 04:11:21.351602 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvrpm\" (UniqueName: \"kubernetes.io/projected/d9d93f66-e65f-4f9b-b895-f00deda47253-kube-api-access-mvrpm\") pod \"perses-operator-5bf474d74f-cpg92\" (UID: \"d9d93f66-e65f-4f9b-b895-f00deda47253\") " pod="openshift-operators/perses-operator-5bf474d74f-cpg92" Mar 08 04:11:21.351930 master-0 kubenswrapper[18592]: I0308 04:11:21.351694 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/d9d93f66-e65f-4f9b-b895-f00deda47253-openshift-service-ca\") pod \"perses-operator-5bf474d74f-cpg92\" (UID: \"d9d93f66-e65f-4f9b-b895-f00deda47253\") " pod="openshift-operators/perses-operator-5bf474d74f-cpg92" Mar 08 04:11:21.452602 master-0 kubenswrapper[18592]: I0308 04:11:21.452552 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvrpm\" (UniqueName: \"kubernetes.io/projected/d9d93f66-e65f-4f9b-b895-f00deda47253-kube-api-access-mvrpm\") pod \"perses-operator-5bf474d74f-cpg92\" (UID: 
\"d9d93f66-e65f-4f9b-b895-f00deda47253\") " pod="openshift-operators/perses-operator-5bf474d74f-cpg92" Mar 08 04:11:21.452602 master-0 kubenswrapper[18592]: I0308 04:11:21.452638 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/d9d93f66-e65f-4f9b-b895-f00deda47253-openshift-service-ca\") pod \"perses-operator-5bf474d74f-cpg92\" (UID: \"d9d93f66-e65f-4f9b-b895-f00deda47253\") " pod="openshift-operators/perses-operator-5bf474d74f-cpg92" Mar 08 04:11:21.453480 master-0 kubenswrapper[18592]: I0308 04:11:21.453445 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/d9d93f66-e65f-4f9b-b895-f00deda47253-openshift-service-ca\") pod \"perses-operator-5bf474d74f-cpg92\" (UID: \"d9d93f66-e65f-4f9b-b895-f00deda47253\") " pod="openshift-operators/perses-operator-5bf474d74f-cpg92" Mar 08 04:11:21.475989 master-0 kubenswrapper[18592]: I0308 04:11:21.475935 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvrpm\" (UniqueName: \"kubernetes.io/projected/d9d93f66-e65f-4f9b-b895-f00deda47253-kube-api-access-mvrpm\") pod \"perses-operator-5bf474d74f-cpg92\" (UID: \"d9d93f66-e65f-4f9b-b895-f00deda47253\") " pod="openshift-operators/perses-operator-5bf474d74f-cpg92" Mar 08 04:11:21.604093 master-0 kubenswrapper[18592]: I0308 04:11:21.599134 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-cpg92" Mar 08 04:11:22.228763 master-0 kubenswrapper[18592]: I0308 04:11:22.228595 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-6888856db4-pd99m" Mar 08 04:11:22.500760 master-0 kubenswrapper[18592]: I0308 04:11:22.500636 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-c7f497896-fbcg7" event={"ID":"d0a838e8-3b7d-4368-99bc-21c83edb251a","Type":"ContainerStarted","Data":"c2497d28240940e1d9d50b9877e4012257b4239bc9539db57240f6d1f82ded84"} Mar 08 04:11:22.502596 master-0 kubenswrapper[18592]: I0308 04:11:22.502554 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-c7f497896-fbcg7" Mar 08 04:11:22.530445 master-0 kubenswrapper[18592]: I0308 04:11:22.530354 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-c7f497896-fbcg7" podStartSLOduration=5.150415475 podStartE2EDuration="9.530336293s" podCreationTimestamp="2026-03-08 04:11:13 +0000 UTC" firstStartedPulling="2026-03-08 04:11:17.574681482 +0000 UTC m=+1089.673435832" lastFinishedPulling="2026-03-08 04:11:21.9546023 +0000 UTC m=+1094.053356650" observedRunningTime="2026-03-08 04:11:22.524642361 +0000 UTC m=+1094.623396721" watchObservedRunningTime="2026-03-08 04:11:22.530336293 +0000 UTC m=+1094.629090643" Mar 08 04:11:22.664563 master-0 kubenswrapper[18592]: I0308 04:11:22.664514 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-85bf586469-sb4gb"] Mar 08 04:11:22.672405 master-0 kubenswrapper[18592]: I0308 04:11:22.671967 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-szjgg"] Mar 08 04:11:22.678490 master-0 
kubenswrapper[18592]: I0308 04:11:22.678440 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-477v2"] Mar 08 04:11:22.685868 master-0 kubenswrapper[18592]: I0308 04:11:22.685805 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-cpg92"] Mar 08 04:11:22.722915 master-0 kubenswrapper[18592]: I0308 04:11:22.722563 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-85bf586469-42ffh"] Mar 08 04:11:24.192947 master-0 kubenswrapper[18592]: W0308 04:11:24.192897 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod837891ca_c979_429e_a13d_e3f1a5779648.slice/crio-7938e4c42d238aa83e7f34ee7ff9f8866f74f9c8bf5028b2790fa210a6f4e20f WatchSource:0}: Error finding container 7938e4c42d238aa83e7f34ee7ff9f8866f74f9c8bf5028b2790fa210a6f4e20f: Status 404 returned error can't find the container with id 7938e4c42d238aa83e7f34ee7ff9f8866f74f9c8bf5028b2790fa210a6f4e20f Mar 08 04:11:24.210605 master-0 kubenswrapper[18592]: W0308 04:11:24.210563 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod776ddb55_954f_445c_933b_a94b65fa57af.slice/crio-bf017b677c70b431cb96fb6af8829352fde1f6080040a7086e8c3bae6c3c8644 WatchSource:0}: Error finding container bf017b677c70b431cb96fb6af8829352fde1f6080040a7086e8c3bae6c3c8644: Status 404 returned error can't find the container with id bf017b677c70b431cb96fb6af8829352fde1f6080040a7086e8c3bae6c3c8644 Mar 08 04:11:24.211409 master-0 kubenswrapper[18592]: W0308 04:11:24.211351 18592 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9d93f66_e65f_4f9b_b895_f00deda47253.slice/crio-427e8633c54641bdec930ee2c5d0ce80bec7aa05142fb45dba5ed9a31c099002 WatchSource:0}: Error finding container 427e8633c54641bdec930ee2c5d0ce80bec7aa05142fb45dba5ed9a31c099002: Status 404 returned error can't find the container with id 427e8633c54641bdec930ee2c5d0ce80bec7aa05142fb45dba5ed9a31c099002 Mar 08 04:11:24.225046 master-0 kubenswrapper[18592]: W0308 04:11:24.225004 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod100ca7f0_6139_4fa1_96d0_7cd231a2c39a.slice/crio-9c9168ed3d5ce161440820b7803bb1d60be4fdbef8c49a51497c0a3cc4429fbd WatchSource:0}: Error finding container 9c9168ed3d5ce161440820b7803bb1d60be4fdbef8c49a51497c0a3cc4429fbd: Status 404 returned error can't find the container with id 9c9168ed3d5ce161440820b7803bb1d60be4fdbef8c49a51497c0a3cc4429fbd Mar 08 04:11:24.515476 master-0 kubenswrapper[18592]: I0308 04:11:24.515412 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-477v2" event={"ID":"100ca7f0-6139-4fa1-96d0-7cd231a2c39a","Type":"ContainerStarted","Data":"9c9168ed3d5ce161440820b7803bb1d60be4fdbef8c49a51497c0a3cc4429fbd"} Mar 08 04:11:24.516309 master-0 kubenswrapper[18592]: I0308 04:11:24.516281 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-85bf586469-42ffh" event={"ID":"f0c8d80a-0bab-4de5-ba90-d20adaf211c6","Type":"ContainerStarted","Data":"cd660f5149d9bfd2448ee3ae87019539a1c778dc1a09941e392cfc239b04779c"} Mar 08 04:11:24.531905 master-0 kubenswrapper[18592]: I0308 04:11:24.531863 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-cpg92" 
event={"ID":"d9d93f66-e65f-4f9b-b895-f00deda47253","Type":"ContainerStarted","Data":"427e8633c54641bdec930ee2c5d0ce80bec7aa05142fb45dba5ed9a31c099002"} Mar 08 04:11:24.535183 master-0 kubenswrapper[18592]: I0308 04:11:24.533047 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-85bf586469-sb4gb" event={"ID":"837891ca-c979-429e-a13d-e3f1a5779648","Type":"ContainerStarted","Data":"7938e4c42d238aa83e7f34ee7ff9f8866f74f9c8bf5028b2790fa210a6f4e20f"} Mar 08 04:11:24.540740 master-0 kubenswrapper[18592]: I0308 04:11:24.540706 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-szjgg" event={"ID":"776ddb55-954f-445c-933b-a94b65fa57af","Type":"ContainerStarted","Data":"bf017b677c70b431cb96fb6af8829352fde1f6080040a7086e8c3bae6c3c8644"} Mar 08 04:11:24.546838 master-0 kubenswrapper[18592]: I0308 04:11:24.543958 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7d4ccb5df5-l4btv" event={"ID":"ba434b94-1eeb-4d82-89a2-b3c8fa4a998d","Type":"ContainerStarted","Data":"57562b892d0060070a87a93501de25c8cdbac6241b0af71bc478212aa6835ba4"} Mar 08 04:11:24.546838 master-0 kubenswrapper[18592]: I0308 04:11:24.544200 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-7d4ccb5df5-l4btv" Mar 08 04:11:28.308406 master-0 kubenswrapper[18592]: I0308 04:11:28.302890 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-7d4ccb5df5-l4btv" podStartSLOduration=7.791039565 podStartE2EDuration="14.302872516s" podCreationTimestamp="2026-03-08 04:11:14 +0000 UTC" firstStartedPulling="2026-03-08 04:11:17.791044024 +0000 UTC m=+1089.889798374" lastFinishedPulling="2026-03-08 04:11:24.302876985 +0000 UTC m=+1096.401631325" observedRunningTime="2026-03-08 04:11:24.565162771 +0000 
UTC m=+1096.663917121" watchObservedRunningTime="2026-03-08 04:11:28.302872516 +0000 UTC m=+1100.401626866" Mar 08 04:11:35.480377 master-0 kubenswrapper[18592]: I0308 04:11:35.479075 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-7d4ccb5df5-l4btv" Mar 08 04:11:36.677854 master-0 kubenswrapper[18592]: I0308 04:11:36.677773 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-477v2" event={"ID":"100ca7f0-6139-4fa1-96d0-7cd231a2c39a","Type":"ContainerStarted","Data":"149fda9b93e2820c7b42204bf04c2e129a4e8bda68657b059b2ebca2d69ccf4d"} Mar 08 04:11:36.677854 master-0 kubenswrapper[18592]: I0308 04:11:36.677860 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-477v2" Mar 08 04:11:36.680191 master-0 kubenswrapper[18592]: I0308 04:11:36.680133 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-85bf586469-42ffh" event={"ID":"f0c8d80a-0bab-4de5-ba90-d20adaf211c6","Type":"ContainerStarted","Data":"2a86a73806c55d0ca7e38f67906940e16acfe0726b1a2807b348a41b2562910f"} Mar 08 04:11:36.682182 master-0 kubenswrapper[18592]: I0308 04:11:36.682143 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-cpg92" event={"ID":"d9d93f66-e65f-4f9b-b895-f00deda47253","Type":"ContainerStarted","Data":"94f6be1b03809d7d89886c8e6420dba653a4199e0bfd15e7d3019dc74411adcb"} Mar 08 04:11:36.682323 master-0 kubenswrapper[18592]: I0308 04:11:36.682305 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-cpg92" Mar 08 04:11:36.684286 master-0 kubenswrapper[18592]: I0308 04:11:36.684241 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-85bf586469-sb4gb" event={"ID":"837891ca-c979-429e-a13d-e3f1a5779648","Type":"ContainerStarted","Data":"03b8c7edcb9588f82d3ab2354cfd6d50924fa4d837a0ed55532652519e7cb96a"} Mar 08 04:11:36.686210 master-0 kubenswrapper[18592]: I0308 04:11:36.686098 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-szjgg" event={"ID":"776ddb55-954f-445c-933b-a94b65fa57af","Type":"ContainerStarted","Data":"e9e425366bf9911d78273e1f7d0ae61a812ee9b2b3fb17a40cd9b11aef0e6ff6"} Mar 08 04:11:36.715811 master-0 kubenswrapper[18592]: I0308 04:11:36.715737 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-477v2" podStartSLOduration=5.319644902 podStartE2EDuration="16.715722383s" podCreationTimestamp="2026-03-08 04:11:20 +0000 UTC" firstStartedPulling="2026-03-08 04:11:24.278692612 +0000 UTC m=+1096.377446962" lastFinishedPulling="2026-03-08 04:11:35.674770093 +0000 UTC m=+1107.773524443" observedRunningTime="2026-03-08 04:11:36.713998048 +0000 UTC m=+1108.812752398" watchObservedRunningTime="2026-03-08 04:11:36.715722383 +0000 UTC m=+1108.814476733" Mar 08 04:11:36.734884 master-0 kubenswrapper[18592]: I0308 04:11:36.734744 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-477v2" Mar 08 04:11:36.741274 master-0 kubenswrapper[18592]: I0308 04:11:36.741171 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-cpg92" podStartSLOduration=4.337509277 podStartE2EDuration="15.7411414s" podCreationTimestamp="2026-03-08 04:11:21 +0000 UTC" firstStartedPulling="2026-03-08 04:11:24.213244078 +0000 UTC m=+1096.311998428" lastFinishedPulling="2026-03-08 04:11:35.616876201 +0000 UTC m=+1107.715630551" observedRunningTime="2026-03-08 
04:11:36.732540101 +0000 UTC m=+1108.831294471" watchObservedRunningTime="2026-03-08 04:11:36.7411414 +0000 UTC m=+1108.839895790" Mar 08 04:11:36.776328 master-0 kubenswrapper[18592]: I0308 04:11:36.776198 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-85bf586469-42ffh" podStartSLOduration=5.407378159 podStartE2EDuration="16.776170143s" podCreationTimestamp="2026-03-08 04:11:20 +0000 UTC" firstStartedPulling="2026-03-08 04:11:24.208115333 +0000 UTC m=+1096.306869683" lastFinishedPulling="2026-03-08 04:11:35.576907307 +0000 UTC m=+1107.675661667" observedRunningTime="2026-03-08 04:11:36.766267549 +0000 UTC m=+1108.865021899" watchObservedRunningTime="2026-03-08 04:11:36.776170143 +0000 UTC m=+1108.874924533" Mar 08 04:11:36.829688 master-0 kubenswrapper[18592]: I0308 04:11:36.823375 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-85bf586469-sb4gb" podStartSLOduration=5.403600488 podStartE2EDuration="16.82335962s" podCreationTimestamp="2026-03-08 04:11:20 +0000 UTC" firstStartedPulling="2026-03-08 04:11:24.198575088 +0000 UTC m=+1096.297329438" lastFinishedPulling="2026-03-08 04:11:35.6183342 +0000 UTC m=+1107.717088570" observedRunningTime="2026-03-08 04:11:36.801062156 +0000 UTC m=+1108.899816526" watchObservedRunningTime="2026-03-08 04:11:36.82335962 +0000 UTC m=+1108.922113970" Mar 08 04:11:36.912315 master-0 kubenswrapper[18592]: I0308 04:11:36.912123 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-szjgg" podStartSLOduration=5.512315353 podStartE2EDuration="16.912100393s" podCreationTimestamp="2026-03-08 04:11:20 +0000 UTC" firstStartedPulling="2026-03-08 04:11:24.213381632 +0000 UTC m=+1096.312135982" lastFinishedPulling="2026-03-08 04:11:35.613166662 +0000 UTC m=+1107.711921022" 
observedRunningTime="2026-03-08 04:11:36.853604046 +0000 UTC m=+1108.952358396" watchObservedRunningTime="2026-03-08 04:11:36.912100393 +0000 UTC m=+1109.010854733" Mar 08 04:11:41.602643 master-0 kubenswrapper[18592]: I0308 04:11:41.602586 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-cpg92" Mar 08 04:11:54.294156 master-0 kubenswrapper[18592]: I0308 04:11:54.294062 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-c7f497896-fbcg7" Mar 08 04:12:02.207406 master-0 kubenswrapper[18592]: I0308 04:12:02.205919 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-lwx48"] Mar 08 04:12:02.207406 master-0 kubenswrapper[18592]: I0308 04:12:02.206903 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-lwx48" Mar 08 04:12:02.250854 master-0 kubenswrapper[18592]: I0308 04:12:02.245918 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-f69bb"] Mar 08 04:12:02.254920 master-0 kubenswrapper[18592]: I0308 04:12:02.254587 18592 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 08 04:12:02.288924 master-0 kubenswrapper[18592]: I0308 04:12:02.285896 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-f69bb" Mar 08 04:12:02.297958 master-0 kubenswrapper[18592]: I0308 04:12:02.290095 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 08 04:12:02.297958 master-0 kubenswrapper[18592]: I0308 04:12:02.290340 18592 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 08 04:12:02.301903 master-0 kubenswrapper[18592]: I0308 04:12:02.299886 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-lwx48"] Mar 08 04:12:02.360947 master-0 kubenswrapper[18592]: I0308 04:12:02.356970 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-msbrs"] Mar 08 04:12:02.360947 master-0 kubenswrapper[18592]: I0308 04:12:02.358237 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-msbrs" Mar 08 04:12:02.360947 master-0 kubenswrapper[18592]: I0308 04:12:02.360201 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 08 04:12:02.374925 master-0 kubenswrapper[18592]: I0308 04:12:02.372382 18592 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 08 04:12:02.374925 master-0 kubenswrapper[18592]: I0308 04:12:02.372582 18592 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 08 04:12:02.407931 master-0 kubenswrapper[18592]: I0308 04:12:02.404571 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2m5v\" (UniqueName: \"kubernetes.io/projected/423b03ec-93e1-4baa-8530-8e3bba6eccb0-kube-api-access-r2m5v\") pod \"speaker-msbrs\" (UID: \"423b03ec-93e1-4baa-8530-8e3bba6eccb0\") " pod="metallb-system/speaker-msbrs" Mar 08 04:12:02.407931 master-0 kubenswrapper[18592]: I0308 
04:12:02.404668 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/e1532f83-fda2-443b-9bdb-aa4de5c66a13-frr-sockets\") pod \"frr-k8s-f69bb\" (UID: \"e1532f83-fda2-443b-9bdb-aa4de5c66a13\") " pod="metallb-system/frr-k8s-f69bb" Mar 08 04:12:02.407931 master-0 kubenswrapper[18592]: I0308 04:12:02.404743 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/423b03ec-93e1-4baa-8530-8e3bba6eccb0-metrics-certs\") pod \"speaker-msbrs\" (UID: \"423b03ec-93e1-4baa-8530-8e3bba6eccb0\") " pod="metallb-system/speaker-msbrs" Mar 08 04:12:02.407931 master-0 kubenswrapper[18592]: I0308 04:12:02.404815 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/423b03ec-93e1-4baa-8530-8e3bba6eccb0-memberlist\") pod \"speaker-msbrs\" (UID: \"423b03ec-93e1-4baa-8530-8e3bba6eccb0\") " pod="metallb-system/speaker-msbrs" Mar 08 04:12:02.407931 master-0 kubenswrapper[18592]: I0308 04:12:02.404909 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e1532f83-fda2-443b-9bdb-aa4de5c66a13-metrics-certs\") pod \"frr-k8s-f69bb\" (UID: \"e1532f83-fda2-443b-9bdb-aa4de5c66a13\") " pod="metallb-system/frr-k8s-f69bb" Mar 08 04:12:02.407931 master-0 kubenswrapper[18592]: I0308 04:12:02.404959 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ltvj\" (UniqueName: \"kubernetes.io/projected/e1532f83-fda2-443b-9bdb-aa4de5c66a13-kube-api-access-4ltvj\") pod \"frr-k8s-f69bb\" (UID: \"e1532f83-fda2-443b-9bdb-aa4de5c66a13\") " pod="metallb-system/frr-k8s-f69bb" Mar 08 04:12:02.407931 master-0 kubenswrapper[18592]: I0308 04:12:02.405043 18592 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/423b03ec-93e1-4baa-8530-8e3bba6eccb0-metallb-excludel2\") pod \"speaker-msbrs\" (UID: \"423b03ec-93e1-4baa-8530-8e3bba6eccb0\") " pod="metallb-system/speaker-msbrs" Mar 08 04:12:02.407931 master-0 kubenswrapper[18592]: I0308 04:12:02.405086 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/e1532f83-fda2-443b-9bdb-aa4de5c66a13-frr-startup\") pod \"frr-k8s-f69bb\" (UID: \"e1532f83-fda2-443b-9bdb-aa4de5c66a13\") " pod="metallb-system/frr-k8s-f69bb" Mar 08 04:12:02.407931 master-0 kubenswrapper[18592]: I0308 04:12:02.405113 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/e1532f83-fda2-443b-9bdb-aa4de5c66a13-metrics\") pod \"frr-k8s-f69bb\" (UID: \"e1532f83-fda2-443b-9bdb-aa4de5c66a13\") " pod="metallb-system/frr-k8s-f69bb" Mar 08 04:12:02.407931 master-0 kubenswrapper[18592]: I0308 04:12:02.405137 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/e1532f83-fda2-443b-9bdb-aa4de5c66a13-reloader\") pod \"frr-k8s-f69bb\" (UID: \"e1532f83-fda2-443b-9bdb-aa4de5c66a13\") " pod="metallb-system/frr-k8s-f69bb" Mar 08 04:12:02.407931 master-0 kubenswrapper[18592]: I0308 04:12:02.405168 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/e1532f83-fda2-443b-9bdb-aa4de5c66a13-frr-conf\") pod \"frr-k8s-f69bb\" (UID: \"e1532f83-fda2-443b-9bdb-aa4de5c66a13\") " pod="metallb-system/frr-k8s-f69bb" Mar 08 04:12:02.407931 master-0 kubenswrapper[18592]: I0308 04:12:02.405191 18592 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2gnq\" (UniqueName: \"kubernetes.io/projected/9097e49a-572d-40cd-8657-4dacb3d0f33b-kube-api-access-r2gnq\") pod \"frr-k8s-webhook-server-7f989f654f-lwx48\" (UID: \"9097e49a-572d-40cd-8657-4dacb3d0f33b\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-lwx48" Mar 08 04:12:02.407931 master-0 kubenswrapper[18592]: I0308 04:12:02.405211 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9097e49a-572d-40cd-8657-4dacb3d0f33b-cert\") pod \"frr-k8s-webhook-server-7f989f654f-lwx48\" (UID: \"9097e49a-572d-40cd-8657-4dacb3d0f33b\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-lwx48" Mar 08 04:12:02.433925 master-0 kubenswrapper[18592]: I0308 04:12:02.428234 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-86ddb6bd46-wxdl4"] Mar 08 04:12:02.433925 master-0 kubenswrapper[18592]: I0308 04:12:02.430211 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-86ddb6bd46-wxdl4" Mar 08 04:12:02.442055 master-0 kubenswrapper[18592]: I0308 04:12:02.435042 18592 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 08 04:12:02.490338 master-0 kubenswrapper[18592]: I0308 04:12:02.490233 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-wxdl4"] Mar 08 04:12:02.506638 master-0 kubenswrapper[18592]: I0308 04:12:02.506586 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2m5v\" (UniqueName: \"kubernetes.io/projected/423b03ec-93e1-4baa-8530-8e3bba6eccb0-kube-api-access-r2m5v\") pod \"speaker-msbrs\" (UID: \"423b03ec-93e1-4baa-8530-8e3bba6eccb0\") " pod="metallb-system/speaker-msbrs" Mar 08 04:12:02.506638 master-0 kubenswrapper[18592]: I0308 04:12:02.506634 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/e1532f83-fda2-443b-9bdb-aa4de5c66a13-frr-sockets\") pod \"frr-k8s-f69bb\" (UID: \"e1532f83-fda2-443b-9bdb-aa4de5c66a13\") " pod="metallb-system/frr-k8s-f69bb" Mar 08 04:12:02.506869 master-0 kubenswrapper[18592]: I0308 04:12:02.506669 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/423b03ec-93e1-4baa-8530-8e3bba6eccb0-metrics-certs\") pod \"speaker-msbrs\" (UID: \"423b03ec-93e1-4baa-8530-8e3bba6eccb0\") " pod="metallb-system/speaker-msbrs" Mar 08 04:12:02.506869 master-0 kubenswrapper[18592]: I0308 04:12:02.506701 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/423b03ec-93e1-4baa-8530-8e3bba6eccb0-memberlist\") pod \"speaker-msbrs\" (UID: \"423b03ec-93e1-4baa-8530-8e3bba6eccb0\") " pod="metallb-system/speaker-msbrs" Mar 08 04:12:02.506869 master-0 
kubenswrapper[18592]: I0308 04:12:02.506721 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e1532f83-fda2-443b-9bdb-aa4de5c66a13-metrics-certs\") pod \"frr-k8s-f69bb\" (UID: \"e1532f83-fda2-443b-9bdb-aa4de5c66a13\") " pod="metallb-system/frr-k8s-f69bb" Mar 08 04:12:02.506869 master-0 kubenswrapper[18592]: I0308 04:12:02.506738 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ltvj\" (UniqueName: \"kubernetes.io/projected/e1532f83-fda2-443b-9bdb-aa4de5c66a13-kube-api-access-4ltvj\") pod \"frr-k8s-f69bb\" (UID: \"e1532f83-fda2-443b-9bdb-aa4de5c66a13\") " pod="metallb-system/frr-k8s-f69bb" Mar 08 04:12:02.506869 master-0 kubenswrapper[18592]: I0308 04:12:02.506773 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/423b03ec-93e1-4baa-8530-8e3bba6eccb0-metallb-excludel2\") pod \"speaker-msbrs\" (UID: \"423b03ec-93e1-4baa-8530-8e3bba6eccb0\") " pod="metallb-system/speaker-msbrs" Mar 08 04:12:02.506869 master-0 kubenswrapper[18592]: I0308 04:12:02.506790 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/e1532f83-fda2-443b-9bdb-aa4de5c66a13-frr-startup\") pod \"frr-k8s-f69bb\" (UID: \"e1532f83-fda2-443b-9bdb-aa4de5c66a13\") " pod="metallb-system/frr-k8s-f69bb" Mar 08 04:12:02.506869 master-0 kubenswrapper[18592]: I0308 04:12:02.506810 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/e1532f83-fda2-443b-9bdb-aa4de5c66a13-metrics\") pod \"frr-k8s-f69bb\" (UID: \"e1532f83-fda2-443b-9bdb-aa4de5c66a13\") " pod="metallb-system/frr-k8s-f69bb" Mar 08 04:12:02.506869 master-0 kubenswrapper[18592]: I0308 04:12:02.506863 18592 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/e1532f83-fda2-443b-9bdb-aa4de5c66a13-reloader\") pod \"frr-k8s-f69bb\" (UID: \"e1532f83-fda2-443b-9bdb-aa4de5c66a13\") " pod="metallb-system/frr-k8s-f69bb" Mar 08 04:12:02.507101 master-0 kubenswrapper[18592]: I0308 04:12:02.506894 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/e1532f83-fda2-443b-9bdb-aa4de5c66a13-frr-conf\") pod \"frr-k8s-f69bb\" (UID: \"e1532f83-fda2-443b-9bdb-aa4de5c66a13\") " pod="metallb-system/frr-k8s-f69bb" Mar 08 04:12:02.507101 master-0 kubenswrapper[18592]: I0308 04:12:02.506915 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2gnq\" (UniqueName: \"kubernetes.io/projected/9097e49a-572d-40cd-8657-4dacb3d0f33b-kube-api-access-r2gnq\") pod \"frr-k8s-webhook-server-7f989f654f-lwx48\" (UID: \"9097e49a-572d-40cd-8657-4dacb3d0f33b\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-lwx48" Mar 08 04:12:02.507101 master-0 kubenswrapper[18592]: I0308 04:12:02.506933 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9097e49a-572d-40cd-8657-4dacb3d0f33b-cert\") pod \"frr-k8s-webhook-server-7f989f654f-lwx48\" (UID: \"9097e49a-572d-40cd-8657-4dacb3d0f33b\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-lwx48" Mar 08 04:12:02.507101 master-0 kubenswrapper[18592]: E0308 04:12:02.507045 18592 secret.go:189] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Mar 08 04:12:02.507101 master-0 kubenswrapper[18592]: E0308 04:12:02.507090 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9097e49a-572d-40cd-8657-4dacb3d0f33b-cert podName:9097e49a-572d-40cd-8657-4dacb3d0f33b nodeName:}" failed. 
No retries permitted until 2026-03-08 04:12:03.007075513 +0000 UTC m=+1135.105829863 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9097e49a-572d-40cd-8657-4dacb3d0f33b-cert") pod "frr-k8s-webhook-server-7f989f654f-lwx48" (UID: "9097e49a-572d-40cd-8657-4dacb3d0f33b") : secret "frr-k8s-webhook-server-cert" not found Mar 08 04:12:02.508269 master-0 kubenswrapper[18592]: I0308 04:12:02.508241 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/e1532f83-fda2-443b-9bdb-aa4de5c66a13-frr-sockets\") pod \"frr-k8s-f69bb\" (UID: \"e1532f83-fda2-443b-9bdb-aa4de5c66a13\") " pod="metallb-system/frr-k8s-f69bb" Mar 08 04:12:02.508417 master-0 kubenswrapper[18592]: E0308 04:12:02.508377 18592 secret.go:189] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Mar 08 04:12:02.508457 master-0 kubenswrapper[18592]: E0308 04:12:02.508448 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1532f83-fda2-443b-9bdb-aa4de5c66a13-metrics-certs podName:e1532f83-fda2-443b-9bdb-aa4de5c66a13 nodeName:}" failed. No retries permitted until 2026-03-08 04:12:03.008429949 +0000 UTC m=+1135.107184299 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e1532f83-fda2-443b-9bdb-aa4de5c66a13-metrics-certs") pod "frr-k8s-f69bb" (UID: "e1532f83-fda2-443b-9bdb-aa4de5c66a13") : secret "frr-k8s-certs-secret" not found Mar 08 04:12:02.508506 master-0 kubenswrapper[18592]: E0308 04:12:02.508489 18592 secret.go:189] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Mar 08 04:12:02.508541 master-0 kubenswrapper[18592]: E0308 04:12:02.508518 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/423b03ec-93e1-4baa-8530-8e3bba6eccb0-metrics-certs podName:423b03ec-93e1-4baa-8530-8e3bba6eccb0 nodeName:}" failed. No retries permitted until 2026-03-08 04:12:03.008510931 +0000 UTC m=+1135.107265281 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/423b03ec-93e1-4baa-8530-8e3bba6eccb0-metrics-certs") pod "speaker-msbrs" (UID: "423b03ec-93e1-4baa-8530-8e3bba6eccb0") : secret "speaker-certs-secret" not found Mar 08 04:12:02.508576 master-0 kubenswrapper[18592]: E0308 04:12:02.508547 18592 secret.go:189] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 08 04:12:02.508576 master-0 kubenswrapper[18592]: E0308 04:12:02.508564 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/423b03ec-93e1-4baa-8530-8e3bba6eccb0-memberlist podName:423b03ec-93e1-4baa-8530-8e3bba6eccb0 nodeName:}" failed. No retries permitted until 2026-03-08 04:12:03.008559573 +0000 UTC m=+1135.107313923 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/423b03ec-93e1-4baa-8530-8e3bba6eccb0-memberlist") pod "speaker-msbrs" (UID: "423b03ec-93e1-4baa-8530-8e3bba6eccb0") : secret "metallb-memberlist" not found Mar 08 04:12:02.509070 master-0 kubenswrapper[18592]: I0308 04:12:02.509040 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/423b03ec-93e1-4baa-8530-8e3bba6eccb0-metallb-excludel2\") pod \"speaker-msbrs\" (UID: \"423b03ec-93e1-4baa-8530-8e3bba6eccb0\") " pod="metallb-system/speaker-msbrs" Mar 08 04:12:02.510964 master-0 kubenswrapper[18592]: I0308 04:12:02.510665 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/e1532f83-fda2-443b-9bdb-aa4de5c66a13-frr-startup\") pod \"frr-k8s-f69bb\" (UID: \"e1532f83-fda2-443b-9bdb-aa4de5c66a13\") " pod="metallb-system/frr-k8s-f69bb" Mar 08 04:12:02.512995 master-0 kubenswrapper[18592]: I0308 04:12:02.512966 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/e1532f83-fda2-443b-9bdb-aa4de5c66a13-metrics\") pod \"frr-k8s-f69bb\" (UID: \"e1532f83-fda2-443b-9bdb-aa4de5c66a13\") " pod="metallb-system/frr-k8s-f69bb" Mar 08 04:12:02.514518 master-0 kubenswrapper[18592]: I0308 04:12:02.514478 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/e1532f83-fda2-443b-9bdb-aa4de5c66a13-frr-conf\") pod \"frr-k8s-f69bb\" (UID: \"e1532f83-fda2-443b-9bdb-aa4de5c66a13\") " pod="metallb-system/frr-k8s-f69bb" Mar 08 04:12:02.514518 master-0 kubenswrapper[18592]: I0308 04:12:02.514493 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/e1532f83-fda2-443b-9bdb-aa4de5c66a13-reloader\") pod \"frr-k8s-f69bb\" (UID: 
\"e1532f83-fda2-443b-9bdb-aa4de5c66a13\") " pod="metallb-system/frr-k8s-f69bb" Mar 08 04:12:02.539852 master-0 kubenswrapper[18592]: I0308 04:12:02.531945 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ltvj\" (UniqueName: \"kubernetes.io/projected/e1532f83-fda2-443b-9bdb-aa4de5c66a13-kube-api-access-4ltvj\") pod \"frr-k8s-f69bb\" (UID: \"e1532f83-fda2-443b-9bdb-aa4de5c66a13\") " pod="metallb-system/frr-k8s-f69bb" Mar 08 04:12:02.546726 master-0 kubenswrapper[18592]: I0308 04:12:02.544490 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2m5v\" (UniqueName: \"kubernetes.io/projected/423b03ec-93e1-4baa-8530-8e3bba6eccb0-kube-api-access-r2m5v\") pod \"speaker-msbrs\" (UID: \"423b03ec-93e1-4baa-8530-8e3bba6eccb0\") " pod="metallb-system/speaker-msbrs" Mar 08 04:12:02.550481 master-0 kubenswrapper[18592]: I0308 04:12:02.548466 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2gnq\" (UniqueName: \"kubernetes.io/projected/9097e49a-572d-40cd-8657-4dacb3d0f33b-kube-api-access-r2gnq\") pod \"frr-k8s-webhook-server-7f989f654f-lwx48\" (UID: \"9097e49a-572d-40cd-8657-4dacb3d0f33b\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-lwx48" Mar 08 04:12:02.611211 master-0 kubenswrapper[18592]: I0308 04:12:02.611159 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54dfq\" (UniqueName: \"kubernetes.io/projected/51a5fdc3-c642-430a-9da8-448bd0cceae0-kube-api-access-54dfq\") pod \"controller-86ddb6bd46-wxdl4\" (UID: \"51a5fdc3-c642-430a-9da8-448bd0cceae0\") " pod="metallb-system/controller-86ddb6bd46-wxdl4" Mar 08 04:12:02.611406 master-0 kubenswrapper[18592]: I0308 04:12:02.611271 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/51a5fdc3-c642-430a-9da8-448bd0cceae0-cert\") pod 
\"controller-86ddb6bd46-wxdl4\" (UID: \"51a5fdc3-c642-430a-9da8-448bd0cceae0\") " pod="metallb-system/controller-86ddb6bd46-wxdl4" Mar 08 04:12:02.611406 master-0 kubenswrapper[18592]: I0308 04:12:02.611300 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/51a5fdc3-c642-430a-9da8-448bd0cceae0-metrics-certs\") pod \"controller-86ddb6bd46-wxdl4\" (UID: \"51a5fdc3-c642-430a-9da8-448bd0cceae0\") " pod="metallb-system/controller-86ddb6bd46-wxdl4" Mar 08 04:12:02.712657 master-0 kubenswrapper[18592]: I0308 04:12:02.712597 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54dfq\" (UniqueName: \"kubernetes.io/projected/51a5fdc3-c642-430a-9da8-448bd0cceae0-kube-api-access-54dfq\") pod \"controller-86ddb6bd46-wxdl4\" (UID: \"51a5fdc3-c642-430a-9da8-448bd0cceae0\") " pod="metallb-system/controller-86ddb6bd46-wxdl4" Mar 08 04:12:02.713342 master-0 kubenswrapper[18592]: I0308 04:12:02.713028 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/51a5fdc3-c642-430a-9da8-448bd0cceae0-cert\") pod \"controller-86ddb6bd46-wxdl4\" (UID: \"51a5fdc3-c642-430a-9da8-448bd0cceae0\") " pod="metallb-system/controller-86ddb6bd46-wxdl4" Mar 08 04:12:02.713342 master-0 kubenswrapper[18592]: I0308 04:12:02.713109 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/51a5fdc3-c642-430a-9da8-448bd0cceae0-metrics-certs\") pod \"controller-86ddb6bd46-wxdl4\" (UID: \"51a5fdc3-c642-430a-9da8-448bd0cceae0\") " pod="metallb-system/controller-86ddb6bd46-wxdl4" Mar 08 04:12:02.715657 master-0 kubenswrapper[18592]: I0308 04:12:02.715618 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/51a5fdc3-c642-430a-9da8-448bd0cceae0-cert\") pod 
\"controller-86ddb6bd46-wxdl4\" (UID: \"51a5fdc3-c642-430a-9da8-448bd0cceae0\") " pod="metallb-system/controller-86ddb6bd46-wxdl4" Mar 08 04:12:02.718003 master-0 kubenswrapper[18592]: I0308 04:12:02.717969 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/51a5fdc3-c642-430a-9da8-448bd0cceae0-metrics-certs\") pod \"controller-86ddb6bd46-wxdl4\" (UID: \"51a5fdc3-c642-430a-9da8-448bd0cceae0\") " pod="metallb-system/controller-86ddb6bd46-wxdl4" Mar 08 04:12:02.732863 master-0 kubenswrapper[18592]: I0308 04:12:02.732799 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54dfq\" (UniqueName: \"kubernetes.io/projected/51a5fdc3-c642-430a-9da8-448bd0cceae0-kube-api-access-54dfq\") pod \"controller-86ddb6bd46-wxdl4\" (UID: \"51a5fdc3-c642-430a-9da8-448bd0cceae0\") " pod="metallb-system/controller-86ddb6bd46-wxdl4" Mar 08 04:12:02.819382 master-0 kubenswrapper[18592]: I0308 04:12:02.819193 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-86ddb6bd46-wxdl4" Mar 08 04:12:03.020050 master-0 kubenswrapper[18592]: I0308 04:12:03.019988 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9097e49a-572d-40cd-8657-4dacb3d0f33b-cert\") pod \"frr-k8s-webhook-server-7f989f654f-lwx48\" (UID: \"9097e49a-572d-40cd-8657-4dacb3d0f33b\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-lwx48" Mar 08 04:12:03.020278 master-0 kubenswrapper[18592]: I0308 04:12:03.020075 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/423b03ec-93e1-4baa-8530-8e3bba6eccb0-metrics-certs\") pod \"speaker-msbrs\" (UID: \"423b03ec-93e1-4baa-8530-8e3bba6eccb0\") " pod="metallb-system/speaker-msbrs" Mar 08 04:12:03.020278 master-0 kubenswrapper[18592]: I0308 04:12:03.020111 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/423b03ec-93e1-4baa-8530-8e3bba6eccb0-memberlist\") pod \"speaker-msbrs\" (UID: \"423b03ec-93e1-4baa-8530-8e3bba6eccb0\") " pod="metallb-system/speaker-msbrs" Mar 08 04:12:03.020278 master-0 kubenswrapper[18592]: I0308 04:12:03.020133 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e1532f83-fda2-443b-9bdb-aa4de5c66a13-metrics-certs\") pod \"frr-k8s-f69bb\" (UID: \"e1532f83-fda2-443b-9bdb-aa4de5c66a13\") " pod="metallb-system/frr-k8s-f69bb" Mar 08 04:12:03.021360 master-0 kubenswrapper[18592]: E0308 04:12:03.021009 18592 secret.go:189] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 08 04:12:03.021360 master-0 kubenswrapper[18592]: E0308 04:12:03.021071 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/423b03ec-93e1-4baa-8530-8e3bba6eccb0-memberlist 
podName:423b03ec-93e1-4baa-8530-8e3bba6eccb0 nodeName:}" failed. No retries permitted until 2026-03-08 04:12:04.02105587 +0000 UTC m=+1136.119810210 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/423b03ec-93e1-4baa-8530-8e3bba6eccb0-memberlist") pod "speaker-msbrs" (UID: "423b03ec-93e1-4baa-8530-8e3bba6eccb0") : secret "metallb-memberlist" not found Mar 08 04:12:03.023221 master-0 kubenswrapper[18592]: I0308 04:12:03.023191 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e1532f83-fda2-443b-9bdb-aa4de5c66a13-metrics-certs\") pod \"frr-k8s-f69bb\" (UID: \"e1532f83-fda2-443b-9bdb-aa4de5c66a13\") " pod="metallb-system/frr-k8s-f69bb" Mar 08 04:12:03.024534 master-0 kubenswrapper[18592]: I0308 04:12:03.024489 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9097e49a-572d-40cd-8657-4dacb3d0f33b-cert\") pod \"frr-k8s-webhook-server-7f989f654f-lwx48\" (UID: \"9097e49a-572d-40cd-8657-4dacb3d0f33b\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-lwx48" Mar 08 04:12:03.024861 master-0 kubenswrapper[18592]: I0308 04:12:03.024813 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/423b03ec-93e1-4baa-8530-8e3bba6eccb0-metrics-certs\") pod \"speaker-msbrs\" (UID: \"423b03ec-93e1-4baa-8530-8e3bba6eccb0\") " pod="metallb-system/speaker-msbrs" Mar 08 04:12:03.074402 master-0 kubenswrapper[18592]: I0308 04:12:03.073948 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-f69bb" Mar 08 04:12:03.200775 master-0 kubenswrapper[18592]: I0308 04:12:03.200607 18592 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 04:12:03.265957 master-0 kubenswrapper[18592]: I0308 04:12:03.265880 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-wxdl4"] Mar 08 04:12:03.269164 master-0 kubenswrapper[18592]: W0308 04:12:03.269037 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51a5fdc3_c642_430a_9da8_448bd0cceae0.slice/crio-1afa127be643a0ebfc4c0e253337432234a41364f7d8a4c1f45fd5b45cccb386 WatchSource:0}: Error finding container 1afa127be643a0ebfc4c0e253337432234a41364f7d8a4c1f45fd5b45cccb386: Status 404 returned error can't find the container with id 1afa127be643a0ebfc4c0e253337432234a41364f7d8a4c1f45fd5b45cccb386 Mar 08 04:12:03.279196 master-0 kubenswrapper[18592]: I0308 04:12:03.279125 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-lwx48"
Mar 08 04:12:03.826656 master-0 kubenswrapper[18592]: I0308 04:12:03.826601 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-lwx48"]
Mar 08 04:12:03.943219 master-0 kubenswrapper[18592]: I0308 04:12:03.943141 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-lwx48" event={"ID":"9097e49a-572d-40cd-8657-4dacb3d0f33b","Type":"ContainerStarted","Data":"7b19e9b942b7ac1be2e52e4b8e55ee82f4097731b146d8a3e75ed0ef6e6914e8"}
Mar 08 04:12:03.944810 master-0 kubenswrapper[18592]: I0308 04:12:03.944776 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-f69bb" event={"ID":"e1532f83-fda2-443b-9bdb-aa4de5c66a13","Type":"ContainerStarted","Data":"e8d010b933fcb1e6e6e2b98490603f8fba8324858da82c379d18a8f533c4522c"}
Mar 08 04:12:03.946895 master-0 kubenswrapper[18592]: I0308 04:12:03.946768 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-wxdl4" event={"ID":"51a5fdc3-c642-430a-9da8-448bd0cceae0","Type":"ContainerStarted","Data":"ac129c8c20abe872869922c49c939f050eb99bad1c4508c9259e918790223cf5"}
Mar 08 04:12:03.946998 master-0 kubenswrapper[18592]: I0308 04:12:03.946962 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-wxdl4" event={"ID":"51a5fdc3-c642-430a-9da8-448bd0cceae0","Type":"ContainerStarted","Data":"1afa127be643a0ebfc4c0e253337432234a41364f7d8a4c1f45fd5b45cccb386"}
Mar 08 04:12:04.042657 master-0 kubenswrapper[18592]: I0308 04:12:04.042595 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/423b03ec-93e1-4baa-8530-8e3bba6eccb0-memberlist\") pod \"speaker-msbrs\" (UID: \"423b03ec-93e1-4baa-8530-8e3bba6eccb0\") " pod="metallb-system/speaker-msbrs"
Mar 08 04:12:04.045635 master-0 kubenswrapper[18592]: I0308 04:12:04.045594 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/423b03ec-93e1-4baa-8530-8e3bba6eccb0-memberlist\") pod \"speaker-msbrs\" (UID: \"423b03ec-93e1-4baa-8530-8e3bba6eccb0\") " pod="metallb-system/speaker-msbrs"
Mar 08 04:12:04.258263 master-0 kubenswrapper[18592]: I0308 04:12:04.258192 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-ssvrb"]
Mar 08 04:12:04.259322 master-0 kubenswrapper[18592]: I0308 04:12:04.259295 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-ssvrb"
Mar 08 04:12:04.260939 master-0 kubenswrapper[18592]: I0308 04:12:04.260874 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook"
Mar 08 04:12:04.309106 master-0 kubenswrapper[18592]: I0308 04:12:04.308987 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-msbrs"
Mar 08 04:12:04.330410 master-0 kubenswrapper[18592]: W0308 04:12:04.330272 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod423b03ec_93e1_4baa_8530_8e3bba6eccb0.slice/crio-d13283769362de4d08b580e7687032fe51ecaab0dded46e6c8316db40b96cbed WatchSource:0}: Error finding container d13283769362de4d08b580e7687032fe51ecaab0dded46e6c8316db40b96cbed: Status 404 returned error can't find the container with id d13283769362de4d08b580e7687032fe51ecaab0dded46e6c8316db40b96cbed
Mar 08 04:12:04.436660 master-0 kubenswrapper[18592]: I0308 04:12:04.435615 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-6ddmc"]
Mar 08 04:12:04.437341 master-0 kubenswrapper[18592]: I0308 04:12:04.437186 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-6ddmc"
Mar 08 04:12:04.450658 master-0 kubenswrapper[18592]: I0308 04:12:04.450483 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/8aa28996-0851-498b-ba48-9c51fc1676cd-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-ssvrb\" (UID: \"8aa28996-0851-498b-ba48-9c51fc1676cd\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-ssvrb"
Mar 08 04:12:04.450658 master-0 kubenswrapper[18592]: I0308 04:12:04.450569 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pp9f5\" (UniqueName: \"kubernetes.io/projected/8aa28996-0851-498b-ba48-9c51fc1676cd-kube-api-access-pp9f5\") pod \"nmstate-webhook-786f45cff4-ssvrb\" (UID: \"8aa28996-0851-498b-ba48-9c51fc1676cd\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-ssvrb"
Mar 08 04:12:04.495989 master-0 kubenswrapper[18592]: I0308 04:12:04.495928 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-ssvrb"]
Mar 08 04:12:04.503533 master-0 kubenswrapper[18592]: I0308 04:12:04.503437 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-2c54k"]
Mar 08 04:12:04.506022 master-0 kubenswrapper[18592]: I0308 04:12:04.505966 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-2c54k"
Mar 08 04:12:04.525264 master-0 kubenswrapper[18592]: I0308 04:12:04.520676 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-6ddmc"]
Mar 08 04:12:04.553057 master-0 kubenswrapper[18592]: I0308 04:12:04.552122 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbjp9\" (UniqueName: \"kubernetes.io/projected/9c2a57e8-799a-43c3-8aa2-3638e37db81a-kube-api-access-bbjp9\") pod \"nmstate-metrics-69594cc75-6ddmc\" (UID: \"9c2a57e8-799a-43c3-8aa2-3638e37db81a\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-6ddmc"
Mar 08 04:12:04.553057 master-0 kubenswrapper[18592]: I0308 04:12:04.552234 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/8aa28996-0851-498b-ba48-9c51fc1676cd-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-ssvrb\" (UID: \"8aa28996-0851-498b-ba48-9c51fc1676cd\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-ssvrb"
Mar 08 04:12:04.553057 master-0 kubenswrapper[18592]: I0308 04:12:04.552422 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pp9f5\" (UniqueName: \"kubernetes.io/projected/8aa28996-0851-498b-ba48-9c51fc1676cd-kube-api-access-pp9f5\") pod \"nmstate-webhook-786f45cff4-ssvrb\" (UID: \"8aa28996-0851-498b-ba48-9c51fc1676cd\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-ssvrb"
Mar 08 04:12:04.553057 master-0 kubenswrapper[18592]: E0308 04:12:04.552982 18592 secret.go:189] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found
Mar 08 04:12:04.553057 master-0 kubenswrapper[18592]: E0308 04:12:04.553058 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8aa28996-0851-498b-ba48-9c51fc1676cd-tls-key-pair podName:8aa28996-0851-498b-ba48-9c51fc1676cd nodeName:}" failed. No retries permitted until 2026-03-08 04:12:05.053038462 +0000 UTC m=+1137.151792812 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/8aa28996-0851-498b-ba48-9c51fc1676cd-tls-key-pair") pod "nmstate-webhook-786f45cff4-ssvrb" (UID: "8aa28996-0851-498b-ba48-9c51fc1676cd") : secret "openshift-nmstate-webhook" not found
Mar 08 04:12:04.609925 master-0 kubenswrapper[18592]: I0308 04:12:04.609849 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pp9f5\" (UniqueName: \"kubernetes.io/projected/8aa28996-0851-498b-ba48-9c51fc1676cd-kube-api-access-pp9f5\") pod \"nmstate-webhook-786f45cff4-ssvrb\" (UID: \"8aa28996-0851-498b-ba48-9c51fc1676cd\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-ssvrb"
Mar 08 04:12:04.638891 master-0 kubenswrapper[18592]: I0308 04:12:04.638805 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-fhftn"]
Mar 08 04:12:04.655924 master-0 kubenswrapper[18592]: I0308 04:12:04.653683 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-fhftn"
Mar 08 04:12:04.657212 master-0 kubenswrapper[18592]: I0308 04:12:04.657154 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnxwd\" (UniqueName: \"kubernetes.io/projected/ff56c314-c380-421c-93d1-4e3bb9dd6b08-kube-api-access-fnxwd\") pod \"nmstate-handler-2c54k\" (UID: \"ff56c314-c380-421c-93d1-4e3bb9dd6b08\") " pod="openshift-nmstate/nmstate-handler-2c54k"
Mar 08 04:12:04.657283 master-0 kubenswrapper[18592]: I0308 04:12:04.657218 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/ff56c314-c380-421c-93d1-4e3bb9dd6b08-nmstate-lock\") pod \"nmstate-handler-2c54k\" (UID: \"ff56c314-c380-421c-93d1-4e3bb9dd6b08\") " pod="openshift-nmstate/nmstate-handler-2c54k"
Mar 08 04:12:04.660927 master-0 kubenswrapper[18592]: I0308 04:12:04.660840 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/ff56c314-c380-421c-93d1-4e3bb9dd6b08-ovs-socket\") pod \"nmstate-handler-2c54k\" (UID: \"ff56c314-c380-421c-93d1-4e3bb9dd6b08\") " pod="openshift-nmstate/nmstate-handler-2c54k"
Mar 08 04:12:04.661291 master-0 kubenswrapper[18592]: I0308 04:12:04.661270 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/ff56c314-c380-421c-93d1-4e3bb9dd6b08-dbus-socket\") pod \"nmstate-handler-2c54k\" (UID: \"ff56c314-c380-421c-93d1-4e3bb9dd6b08\") " pod="openshift-nmstate/nmstate-handler-2c54k"
Mar 08 04:12:04.661475 master-0 kubenswrapper[18592]: I0308 04:12:04.661458 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbjp9\" (UniqueName: \"kubernetes.io/projected/9c2a57e8-799a-43c3-8aa2-3638e37db81a-kube-api-access-bbjp9\") pod \"nmstate-metrics-69594cc75-6ddmc\" (UID: \"9c2a57e8-799a-43c3-8aa2-3638e37db81a\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-6ddmc"
Mar 08 04:12:04.665874 master-0 kubenswrapper[18592]: I0308 04:12:04.665795 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert"
Mar 08 04:12:04.666174 master-0 kubenswrapper[18592]: I0308 04:12:04.666147 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf"
Mar 08 04:12:04.696204 master-0 kubenswrapper[18592]: I0308 04:12:04.696132 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-fhftn"]
Mar 08 04:12:04.699049 master-0 kubenswrapper[18592]: I0308 04:12:04.699018 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbjp9\" (UniqueName: \"kubernetes.io/projected/9c2a57e8-799a-43c3-8aa2-3638e37db81a-kube-api-access-bbjp9\") pod \"nmstate-metrics-69594cc75-6ddmc\" (UID: \"9c2a57e8-799a-43c3-8aa2-3638e37db81a\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-6ddmc"
Mar 08 04:12:04.764885 master-0 kubenswrapper[18592]: I0308 04:12:04.763450 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/6d20e92b-2799-4d90-9bfb-6175bebe39b3-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-fhftn\" (UID: \"6d20e92b-2799-4d90-9bfb-6175bebe39b3\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-fhftn"
Mar 08 04:12:04.764885 master-0 kubenswrapper[18592]: I0308 04:12:04.763643 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnxwd\" (UniqueName: \"kubernetes.io/projected/ff56c314-c380-421c-93d1-4e3bb9dd6b08-kube-api-access-fnxwd\") pod \"nmstate-handler-2c54k\" (UID: \"ff56c314-c380-421c-93d1-4e3bb9dd6b08\") " pod="openshift-nmstate/nmstate-handler-2c54k"
Mar 08 04:12:04.764885 master-0 kubenswrapper[18592]: I0308 04:12:04.763716 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/ff56c314-c380-421c-93d1-4e3bb9dd6b08-nmstate-lock\") pod \"nmstate-handler-2c54k\" (UID: \"ff56c314-c380-421c-93d1-4e3bb9dd6b08\") " pod="openshift-nmstate/nmstate-handler-2c54k"
Mar 08 04:12:04.764885 master-0 kubenswrapper[18592]: I0308 04:12:04.763798 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/ff56c314-c380-421c-93d1-4e3bb9dd6b08-ovs-socket\") pod \"nmstate-handler-2c54k\" (UID: \"ff56c314-c380-421c-93d1-4e3bb9dd6b08\") " pod="openshift-nmstate/nmstate-handler-2c54k"
Mar 08 04:12:04.764885 master-0 kubenswrapper[18592]: I0308 04:12:04.763873 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/ff56c314-c380-421c-93d1-4e3bb9dd6b08-dbus-socket\") pod \"nmstate-handler-2c54k\" (UID: \"ff56c314-c380-421c-93d1-4e3bb9dd6b08\") " pod="openshift-nmstate/nmstate-handler-2c54k"
Mar 08 04:12:04.764885 master-0 kubenswrapper[18592]: I0308 04:12:04.763900 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/6d20e92b-2799-4d90-9bfb-6175bebe39b3-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-fhftn\" (UID: \"6d20e92b-2799-4d90-9bfb-6175bebe39b3\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-fhftn"
Mar 08 04:12:04.764885 master-0 kubenswrapper[18592]: I0308 04:12:04.763929 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvkpd\" (UniqueName: \"kubernetes.io/projected/6d20e92b-2799-4d90-9bfb-6175bebe39b3-kube-api-access-dvkpd\") pod \"nmstate-console-plugin-5dcbbd79cf-fhftn\" (UID: \"6d20e92b-2799-4d90-9bfb-6175bebe39b3\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-fhftn"
Mar 08 04:12:04.764885 master-0 kubenswrapper[18592]: I0308 04:12:04.764256 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/ff56c314-c380-421c-93d1-4e3bb9dd6b08-nmstate-lock\") pod \"nmstate-handler-2c54k\" (UID: \"ff56c314-c380-421c-93d1-4e3bb9dd6b08\") " pod="openshift-nmstate/nmstate-handler-2c54k"
Mar 08 04:12:04.764885 master-0 kubenswrapper[18592]: I0308 04:12:04.764299 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/ff56c314-c380-421c-93d1-4e3bb9dd6b08-ovs-socket\") pod \"nmstate-handler-2c54k\" (UID: \"ff56c314-c380-421c-93d1-4e3bb9dd6b08\") " pod="openshift-nmstate/nmstate-handler-2c54k"
Mar 08 04:12:04.764885 master-0 kubenswrapper[18592]: I0308 04:12:04.764341 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/ff56c314-c380-421c-93d1-4e3bb9dd6b08-dbus-socket\") pod \"nmstate-handler-2c54k\" (UID: \"ff56c314-c380-421c-93d1-4e3bb9dd6b08\") " pod="openshift-nmstate/nmstate-handler-2c54k"
Mar 08 04:12:04.782912 master-0 kubenswrapper[18592]: I0308 04:12:04.776190 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-6ddmc"
Mar 08 04:12:04.800532 master-0 kubenswrapper[18592]: I0308 04:12:04.799948 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-5795766c6-lq2d8"]
Mar 08 04:12:04.801404 master-0 kubenswrapper[18592]: I0308 04:12:04.801375 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnxwd\" (UniqueName: \"kubernetes.io/projected/ff56c314-c380-421c-93d1-4e3bb9dd6b08-kube-api-access-fnxwd\") pod \"nmstate-handler-2c54k\" (UID: \"ff56c314-c380-421c-93d1-4e3bb9dd6b08\") " pod="openshift-nmstate/nmstate-handler-2c54k"
Mar 08 04:12:04.801519 master-0 kubenswrapper[18592]: I0308 04:12:04.801490 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5795766c6-lq2d8"
Mar 08 04:12:04.817931 master-0 kubenswrapper[18592]: I0308 04:12:04.817875 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5795766c6-lq2d8"]
Mar 08 04:12:04.868113 master-0 kubenswrapper[18592]: I0308 04:12:04.865209 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/6d20e92b-2799-4d90-9bfb-6175bebe39b3-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-fhftn\" (UID: \"6d20e92b-2799-4d90-9bfb-6175bebe39b3\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-fhftn"
Mar 08 04:12:04.868113 master-0 kubenswrapper[18592]: I0308 04:12:04.865268 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvkpd\" (UniqueName: \"kubernetes.io/projected/6d20e92b-2799-4d90-9bfb-6175bebe39b3-kube-api-access-dvkpd\") pod \"nmstate-console-plugin-5dcbbd79cf-fhftn\" (UID: \"6d20e92b-2799-4d90-9bfb-6175bebe39b3\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-fhftn"
Mar 08 04:12:04.868113 master-0 kubenswrapper[18592]: I0308 04:12:04.865292 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/6d20e92b-2799-4d90-9bfb-6175bebe39b3-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-fhftn\" (UID: \"6d20e92b-2799-4d90-9bfb-6175bebe39b3\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-fhftn"
Mar 08 04:12:04.868761 master-0 kubenswrapper[18592]: I0308 04:12:04.868724 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/6d20e92b-2799-4d90-9bfb-6175bebe39b3-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-fhftn\" (UID: \"6d20e92b-2799-4d90-9bfb-6175bebe39b3\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-fhftn"
Mar 08 04:12:04.872888 master-0 kubenswrapper[18592]: I0308 04:12:04.872309 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/6d20e92b-2799-4d90-9bfb-6175bebe39b3-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-fhftn\" (UID: \"6d20e92b-2799-4d90-9bfb-6175bebe39b3\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-fhftn"
Mar 08 04:12:04.884211 master-0 kubenswrapper[18592]: I0308 04:12:04.884175 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvkpd\" (UniqueName: \"kubernetes.io/projected/6d20e92b-2799-4d90-9bfb-6175bebe39b3-kube-api-access-dvkpd\") pod \"nmstate-console-plugin-5dcbbd79cf-fhftn\" (UID: \"6d20e92b-2799-4d90-9bfb-6175bebe39b3\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-fhftn"
Mar 08 04:12:04.912175 master-0 kubenswrapper[18592]: I0308 04:12:04.912105 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-2c54k"
Mar 08 04:12:04.968005 master-0 kubenswrapper[18592]: I0308 04:12:04.966648 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dc4dec3e-6f82-404f-9aad-9bad08ca306c-console-serving-cert\") pod \"console-5795766c6-lq2d8\" (UID: \"dc4dec3e-6f82-404f-9aad-9bad08ca306c\") " pod="openshift-console/console-5795766c6-lq2d8"
Mar 08 04:12:04.968005 master-0 kubenswrapper[18592]: I0308 04:12:04.966733 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9z27\" (UniqueName: \"kubernetes.io/projected/dc4dec3e-6f82-404f-9aad-9bad08ca306c-kube-api-access-w9z27\") pod \"console-5795766c6-lq2d8\" (UID: \"dc4dec3e-6f82-404f-9aad-9bad08ca306c\") " pod="openshift-console/console-5795766c6-lq2d8"
Mar 08 04:12:04.968005 master-0 kubenswrapper[18592]: I0308 04:12:04.966757 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dc4dec3e-6f82-404f-9aad-9bad08ca306c-console-config\") pod \"console-5795766c6-lq2d8\" (UID: \"dc4dec3e-6f82-404f-9aad-9bad08ca306c\") " pod="openshift-console/console-5795766c6-lq2d8"
Mar 08 04:12:04.968005 master-0 kubenswrapper[18592]: I0308 04:12:04.966800 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dc4dec3e-6f82-404f-9aad-9bad08ca306c-oauth-serving-cert\") pod \"console-5795766c6-lq2d8\" (UID: \"dc4dec3e-6f82-404f-9aad-9bad08ca306c\") " pod="openshift-console/console-5795766c6-lq2d8"
Mar 08 04:12:04.968005 master-0 kubenswrapper[18592]: I0308 04:12:04.967045 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dc4dec3e-6f82-404f-9aad-9bad08ca306c-service-ca\") pod \"console-5795766c6-lq2d8\" (UID: \"dc4dec3e-6f82-404f-9aad-9bad08ca306c\") " pod="openshift-console/console-5795766c6-lq2d8"
Mar 08 04:12:04.968005 master-0 kubenswrapper[18592]: I0308 04:12:04.967124 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc4dec3e-6f82-404f-9aad-9bad08ca306c-trusted-ca-bundle\") pod \"console-5795766c6-lq2d8\" (UID: \"dc4dec3e-6f82-404f-9aad-9bad08ca306c\") " pod="openshift-console/console-5795766c6-lq2d8"
Mar 08 04:12:04.968005 master-0 kubenswrapper[18592]: I0308 04:12:04.967296 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dc4dec3e-6f82-404f-9aad-9bad08ca306c-console-oauth-config\") pod \"console-5795766c6-lq2d8\" (UID: \"dc4dec3e-6f82-404f-9aad-9bad08ca306c\") " pod="openshift-console/console-5795766c6-lq2d8"
Mar 08 04:12:04.993468 master-0 kubenswrapper[18592]: I0308 04:12:04.993409 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-2c54k" event={"ID":"ff56c314-c380-421c-93d1-4e3bb9dd6b08","Type":"ContainerStarted","Data":"54c5cbf3d5d8a30a1a395eab5d892e3dc4935d97d168ad0194dee23a43f23bf2"}
Mar 08 04:12:04.994561 master-0 kubenswrapper[18592]: I0308 04:12:04.994526 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-msbrs" event={"ID":"423b03ec-93e1-4baa-8530-8e3bba6eccb0","Type":"ContainerStarted","Data":"f58c937d3dff783c8db9c906718e3b5bc515065442c8ce9ec2f65b605a558796"}
Mar 08 04:12:04.994561 master-0 kubenswrapper[18592]: I0308 04:12:04.994557 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-msbrs" event={"ID":"423b03ec-93e1-4baa-8530-8e3bba6eccb0","Type":"ContainerStarted","Data":"d13283769362de4d08b580e7687032fe51ecaab0dded46e6c8316db40b96cbed"}
Mar 08 04:12:05.062361 master-0 kubenswrapper[18592]: I0308 04:12:05.062250 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-fhftn"
Mar 08 04:12:05.068816 master-0 kubenswrapper[18592]: I0308 04:12:05.068779 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dc4dec3e-6f82-404f-9aad-9bad08ca306c-service-ca\") pod \"console-5795766c6-lq2d8\" (UID: \"dc4dec3e-6f82-404f-9aad-9bad08ca306c\") " pod="openshift-console/console-5795766c6-lq2d8"
Mar 08 04:12:05.068929 master-0 kubenswrapper[18592]: I0308 04:12:05.068842 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc4dec3e-6f82-404f-9aad-9bad08ca306c-trusted-ca-bundle\") pod \"console-5795766c6-lq2d8\" (UID: \"dc4dec3e-6f82-404f-9aad-9bad08ca306c\") " pod="openshift-console/console-5795766c6-lq2d8"
Mar 08 04:12:05.068929 master-0 kubenswrapper[18592]: I0308 04:12:05.068887 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/8aa28996-0851-498b-ba48-9c51fc1676cd-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-ssvrb\" (UID: \"8aa28996-0851-498b-ba48-9c51fc1676cd\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-ssvrb"
Mar 08 04:12:05.068929 master-0 kubenswrapper[18592]: I0308 04:12:05.068914 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dc4dec3e-6f82-404f-9aad-9bad08ca306c-console-oauth-config\") pod \"console-5795766c6-lq2d8\" (UID: \"dc4dec3e-6f82-404f-9aad-9bad08ca306c\") " pod="openshift-console/console-5795766c6-lq2d8"
Mar 08 04:12:05.069034 master-0 kubenswrapper[18592]: I0308 04:12:05.068933 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dc4dec3e-6f82-404f-9aad-9bad08ca306c-console-serving-cert\") pod \"console-5795766c6-lq2d8\" (UID: \"dc4dec3e-6f82-404f-9aad-9bad08ca306c\") " pod="openshift-console/console-5795766c6-lq2d8"
Mar 08 04:12:05.069034 master-0 kubenswrapper[18592]: I0308 04:12:05.068967 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9z27\" (UniqueName: \"kubernetes.io/projected/dc4dec3e-6f82-404f-9aad-9bad08ca306c-kube-api-access-w9z27\") pod \"console-5795766c6-lq2d8\" (UID: \"dc4dec3e-6f82-404f-9aad-9bad08ca306c\") " pod="openshift-console/console-5795766c6-lq2d8"
Mar 08 04:12:05.069034 master-0 kubenswrapper[18592]: I0308 04:12:05.068986 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dc4dec3e-6f82-404f-9aad-9bad08ca306c-console-config\") pod \"console-5795766c6-lq2d8\" (UID: \"dc4dec3e-6f82-404f-9aad-9bad08ca306c\") " pod="openshift-console/console-5795766c6-lq2d8"
Mar 08 04:12:05.069034 master-0 kubenswrapper[18592]: I0308 04:12:05.069022 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dc4dec3e-6f82-404f-9aad-9bad08ca306c-oauth-serving-cert\") pod \"console-5795766c6-lq2d8\" (UID: \"dc4dec3e-6f82-404f-9aad-9bad08ca306c\") " pod="openshift-console/console-5795766c6-lq2d8"
Mar 08 04:12:05.069739 master-0 kubenswrapper[18592]: I0308 04:12:05.069702 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dc4dec3e-6f82-404f-9aad-9bad08ca306c-service-ca\") pod \"console-5795766c6-lq2d8\" (UID: \"dc4dec3e-6f82-404f-9aad-9bad08ca306c\") " pod="openshift-console/console-5795766c6-lq2d8"
Mar 08 04:12:05.069797 master-0 kubenswrapper[18592]: I0308 04:12:05.069780 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dc4dec3e-6f82-404f-9aad-9bad08ca306c-oauth-serving-cert\") pod \"console-5795766c6-lq2d8\" (UID: \"dc4dec3e-6f82-404f-9aad-9bad08ca306c\") " pod="openshift-console/console-5795766c6-lq2d8"
Mar 08 04:12:05.070572 master-0 kubenswrapper[18592]: I0308 04:12:05.070525 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc4dec3e-6f82-404f-9aad-9bad08ca306c-trusted-ca-bundle\") pod \"console-5795766c6-lq2d8\" (UID: \"dc4dec3e-6f82-404f-9aad-9bad08ca306c\") " pod="openshift-console/console-5795766c6-lq2d8"
Mar 08 04:12:05.070714 master-0 kubenswrapper[18592]: I0308 04:12:05.070671 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dc4dec3e-6f82-404f-9aad-9bad08ca306c-console-config\") pod \"console-5795766c6-lq2d8\" (UID: \"dc4dec3e-6f82-404f-9aad-9bad08ca306c\") " pod="openshift-console/console-5795766c6-lq2d8"
Mar 08 04:12:05.072556 master-0 kubenswrapper[18592]: I0308 04:12:05.072499 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dc4dec3e-6f82-404f-9aad-9bad08ca306c-console-serving-cert\") pod \"console-5795766c6-lq2d8\" (UID: \"dc4dec3e-6f82-404f-9aad-9bad08ca306c\") " pod="openshift-console/console-5795766c6-lq2d8"
Mar 08 04:12:05.073109 master-0 kubenswrapper[18592]: I0308 04:12:05.073078 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/8aa28996-0851-498b-ba48-9c51fc1676cd-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-ssvrb\" (UID: \"8aa28996-0851-498b-ba48-9c51fc1676cd\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-ssvrb"
Mar 08 04:12:05.075885 master-0 kubenswrapper[18592]: I0308 04:12:05.074624 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dc4dec3e-6f82-404f-9aad-9bad08ca306c-console-oauth-config\") pod \"console-5795766c6-lq2d8\" (UID: \"dc4dec3e-6f82-404f-9aad-9bad08ca306c\") " pod="openshift-console/console-5795766c6-lq2d8"
Mar 08 04:12:05.087281 master-0 kubenswrapper[18592]: I0308 04:12:05.087244 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9z27\" (UniqueName: \"kubernetes.io/projected/dc4dec3e-6f82-404f-9aad-9bad08ca306c-kube-api-access-w9z27\") pod \"console-5795766c6-lq2d8\" (UID: \"dc4dec3e-6f82-404f-9aad-9bad08ca306c\") " pod="openshift-console/console-5795766c6-lq2d8"
Mar 08 04:12:05.153912 master-0 kubenswrapper[18592]: I0308 04:12:05.152608 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5795766c6-lq2d8"
Mar 08 04:12:05.192938 master-0 kubenswrapper[18592]: I0308 04:12:05.188868 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-ssvrb"
Mar 08 04:12:05.257806 master-0 kubenswrapper[18592]: I0308 04:12:05.256272 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-6ddmc"]
Mar 08 04:12:05.267040 master-0 kubenswrapper[18592]: W0308 04:12:05.266992 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c2a57e8_799a_43c3_8aa2_3638e37db81a.slice/crio-ea159b5085b86be802b7a27b375bc8aa825b15d46f804d2c8e16d24a650417f5 WatchSource:0}: Error finding container ea159b5085b86be802b7a27b375bc8aa825b15d46f804d2c8e16d24a650417f5: Status 404 returned error can't find the container with id ea159b5085b86be802b7a27b375bc8aa825b15d46f804d2c8e16d24a650417f5
Mar 08 04:12:05.525879 master-0 kubenswrapper[18592]: I0308 04:12:05.525349 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-fhftn"]
Mar 08 04:12:05.645279 master-0 kubenswrapper[18592]: I0308 04:12:05.643513 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5795766c6-lq2d8"]
Mar 08 04:12:05.815721 master-0 kubenswrapper[18592]: I0308 04:12:05.815665 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-ssvrb"]
Mar 08 04:12:05.818560 master-0 kubenswrapper[18592]: W0308 04:12:05.818511 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8aa28996_0851_498b_ba48_9c51fc1676cd.slice/crio-b40a865d5b41b67c8c1faf17ef627e81f1c02acfa83e2bce7e0307e7a476984d WatchSource:0}: Error finding container b40a865d5b41b67c8c1faf17ef627e81f1c02acfa83e2bce7e0307e7a476984d: Status 404 returned error can't find the container with id b40a865d5b41b67c8c1faf17ef627e81f1c02acfa83e2bce7e0307e7a476984d
Mar 08 04:12:06.004094 master-0 kubenswrapper[18592]: I0308 04:12:06.004047 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-wxdl4" event={"ID":"51a5fdc3-c642-430a-9da8-448bd0cceae0","Type":"ContainerStarted","Data":"f19b8dfadd76614f78816e41c171097fc69fc4b7fa1e38c2c578a5286098bb42"}
Mar 08 04:12:06.005122 master-0 kubenswrapper[18592]: I0308 04:12:06.005079 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-86ddb6bd46-wxdl4"
Mar 08 04:12:06.007647 master-0 kubenswrapper[18592]: I0308 04:12:06.007594 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-fhftn" event={"ID":"6d20e92b-2799-4d90-9bfb-6175bebe39b3","Type":"ContainerStarted","Data":"c7302f2dc116743be704ad6ebf4243ce618a29ecec0ac591b803bf9709eb4fab"}
Mar 08 04:12:06.009390 master-0 kubenswrapper[18592]: I0308 04:12:06.009361 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5795766c6-lq2d8" event={"ID":"dc4dec3e-6f82-404f-9aad-9bad08ca306c","Type":"ContainerStarted","Data":"cd261887518d89f0c7a081080829bca7c4cb5261ae172654101a3932aef074aa"}
Mar 08 04:12:06.009524 master-0 kubenswrapper[18592]: I0308 04:12:06.009506 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5795766c6-lq2d8" event={"ID":"dc4dec3e-6f82-404f-9aad-9bad08ca306c","Type":"ContainerStarted","Data":"26819f0ba9f26f28631665a78282f0b8a17fd6932d554814f701e55de2250ac6"}
Mar 08 04:12:06.011235 master-0 kubenswrapper[18592]: I0308 04:12:06.011209 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-ssvrb" event={"ID":"8aa28996-0851-498b-ba48-9c51fc1676cd","Type":"ContainerStarted","Data":"b40a865d5b41b67c8c1faf17ef627e81f1c02acfa83e2bce7e0307e7a476984d"}
Mar 08 04:12:06.012628 master-0 kubenswrapper[18592]: I0308 04:12:06.012583 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-6ddmc" event={"ID":"9c2a57e8-799a-43c3-8aa2-3638e37db81a","Type":"ContainerStarted","Data":"ea159b5085b86be802b7a27b375bc8aa825b15d46f804d2c8e16d24a650417f5"}
Mar 08 04:12:06.034405 master-0 kubenswrapper[18592]: I0308 04:12:06.034332 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-86ddb6bd46-wxdl4" podStartSLOduration=2.487047972 podStartE2EDuration="4.0343105s" podCreationTimestamp="2026-03-08 04:12:02 +0000 UTC" firstStartedPulling="2026-03-08 04:12:03.47765452 +0000 UTC m=+1135.576408880" lastFinishedPulling="2026-03-08 04:12:05.024917058 +0000 UTC m=+1137.123671408" observedRunningTime="2026-03-08 04:12:06.02641302 +0000 UTC m=+1138.125167370" watchObservedRunningTime="2026-03-08 04:12:06.0343105 +0000 UTC m=+1138.133064860"
Mar 08 04:12:06.050351 master-0 kubenswrapper[18592]: I0308 04:12:06.050082 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5795766c6-lq2d8" podStartSLOduration=2.050061829 podStartE2EDuration="2.050061829s" podCreationTimestamp="2026-03-08 04:12:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 04:12:06.042460797 +0000 UTC m=+1138.141215147" watchObservedRunningTime="2026-03-08 04:12:06.050061829 +0000 UTC m=+1138.148816189"
Mar 08 04:12:07.029946 master-0 kubenswrapper[18592]: I0308 04:12:07.028173 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-msbrs" event={"ID":"423b03ec-93e1-4baa-8530-8e3bba6eccb0","Type":"ContainerStarted","Data":"3f2e3255259797d2ae603e96989f17875067d5c8b397ed4352e97eb8b730ad26"}
Mar 08 04:12:07.030494 master-0 kubenswrapper[18592]: I0308 04:12:07.030003 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-msbrs"
Mar 08 04:12:07.048086 master-0 kubenswrapper[18592]: I0308 04:12:07.047680 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-msbrs" podStartSLOduration=3.937220721 podStartE2EDuration="5.047664039s" podCreationTimestamp="2026-03-08 04:12:02 +0000 UTC" firstStartedPulling="2026-03-08 04:12:04.822082189 +0000 UTC m=+1136.920836539" lastFinishedPulling="2026-03-08 04:12:05.932525507 +0000 UTC m=+1138.031279857" observedRunningTime="2026-03-08 04:12:07.047580597 +0000 UTC m=+1139.146334957" watchObservedRunningTime="2026-03-08 04:12:07.047664039 +0000 UTC m=+1139.146418389"
Mar 08 04:12:12.112656 master-0 kubenswrapper[18592]: I0308 04:12:12.112508 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-lwx48" event={"ID":"9097e49a-572d-40cd-8657-4dacb3d0f33b","Type":"ContainerStarted","Data":"7d2eee9a9e3628c4ec78a4334f3106f042dc059f85b2c182dfc61180b77c60cd"}
Mar 08 04:12:12.112656 master-0 kubenswrapper[18592]: I0308 04:12:12.112595 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-lwx48"
Mar 08 04:12:12.116336 master-0 kubenswrapper[18592]: I0308 04:12:12.114638 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-ssvrb" event={"ID":"8aa28996-0851-498b-ba48-9c51fc1676cd","Type":"ContainerStarted","Data":"7095bc07c18eec1c9b75f991a5a657a65192cbc0d90f4dbd045753bc56039c51"}
Mar 08 04:12:12.116336 master-0 kubenswrapper[18592]: I0308 04:12:12.115628 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-786f45cff4-ssvrb"
Mar 08 04:12:12.120336 master-0 kubenswrapper[18592]: I0308 04:12:12.120275 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-6ddmc" event={"ID":"9c2a57e8-799a-43c3-8aa2-3638e37db81a","Type":"ContainerStarted","Data":"f39881e6826e9457ad9be941537e4204be4163f250843826a91dec8bb79bf5af"}
Mar 08 04:12:12.120336 master-0 kubenswrapper[18592]: I0308 04:12:12.120333 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-6ddmc" event={"ID":"9c2a57e8-799a-43c3-8aa2-3638e37db81a","Type":"ContainerStarted","Data":"20aef96b685d3cc3045bba362760dda1631d8408f491790ccf7efebd56c0d9f6"}
Mar 08 04:12:12.124481 master-0 kubenswrapper[18592]: I0308 04:12:12.124416 18592 generic.go:334] "Generic (PLEG): container finished" podID="e1532f83-fda2-443b-9bdb-aa4de5c66a13" containerID="0bf2562c5ee7d65f663ba28e57d090bced8c3e79e0da447f93d11ae32ab853fb" exitCode=0
Mar 08 04:12:12.124755 master-0 kubenswrapper[18592]: I0308 04:12:12.124547 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-f69bb" event={"ID":"e1532f83-fda2-443b-9bdb-aa4de5c66a13","Type":"ContainerDied","Data":"0bf2562c5ee7d65f663ba28e57d090bced8c3e79e0da447f93d11ae32ab853fb"}
Mar 08 04:12:12.128746 master-0 kubenswrapper[18592]: I0308 04:12:12.128321 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-fhftn" event={"ID":"6d20e92b-2799-4d90-9bfb-6175bebe39b3","Type":"ContainerStarted","Data":"43b874454e07fa2f213710a9805463633187138673ea20c7efcb9595aa0260b7"}
Mar 08 04:12:12.148696 master-0 kubenswrapper[18592]: I0308 04:12:12.147543 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-lwx48" podStartSLOduration=2.83454377 podStartE2EDuration="10.147517859s" podCreationTimestamp="2026-03-08 04:12:02 +0000 UTC" firstStartedPulling="2026-03-08 04:12:03.838192864 +0000 UTC m=+1135.936947224" lastFinishedPulling="2026-03-08 04:12:11.151166953 +0000 UTC m=+1143.249921313" observedRunningTime="2026-03-08 04:12:12.129926422 +0000 UTC m=+1144.228680782"
watchObservedRunningTime="2026-03-08 04:12:12.147517859 +0000 UTC m=+1144.246272219" Mar 08 04:12:12.173931 master-0 kubenswrapper[18592]: I0308 04:12:12.173863 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-2c54k" event={"ID":"ff56c314-c380-421c-93d1-4e3bb9dd6b08","Type":"ContainerStarted","Data":"89361fe6b03ff0a83e1edb7b370aba5a9c656d89d55f30da74dd890fc189bb8f"} Mar 08 04:12:12.173931 master-0 kubenswrapper[18592]: I0308 04:12:12.173942 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-2c54k" Mar 08 04:12:12.176646 master-0 kubenswrapper[18592]: I0308 04:12:12.176571 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-69594cc75-6ddmc" podStartSLOduration=2.360380322 podStartE2EDuration="8.17654421s" podCreationTimestamp="2026-03-08 04:12:04 +0000 UTC" firstStartedPulling="2026-03-08 04:12:05.271228091 +0000 UTC m=+1137.369982441" lastFinishedPulling="2026-03-08 04:12:11.087391979 +0000 UTC m=+1143.186146329" observedRunningTime="2026-03-08 04:12:12.155659345 +0000 UTC m=+1144.254413705" watchObservedRunningTime="2026-03-08 04:12:12.17654421 +0000 UTC m=+1144.275298570" Mar 08 04:12:12.254268 master-0 kubenswrapper[18592]: I0308 04:12:12.254175 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-fhftn" podStartSLOduration=2.6912815119999998 podStartE2EDuration="8.254133371s" podCreationTimestamp="2026-03-08 04:12:04 +0000 UTC" firstStartedPulling="2026-03-08 04:12:05.530485327 +0000 UTC m=+1137.629239677" lastFinishedPulling="2026-03-08 04:12:11.093337146 +0000 UTC m=+1143.192091536" observedRunningTime="2026-03-08 04:12:12.209863205 +0000 UTC m=+1144.308617565" watchObservedRunningTime="2026-03-08 04:12:12.254133371 +0000 UTC m=+1144.352887711" Mar 08 04:12:12.267849 master-0 kubenswrapper[18592]: I0308 04:12:12.266351 
18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-786f45cff4-ssvrb" podStartSLOduration=2.991708562 podStartE2EDuration="8.266325585s" podCreationTimestamp="2026-03-08 04:12:04 +0000 UTC" firstStartedPulling="2026-03-08 04:12:05.820652505 +0000 UTC m=+1137.919406855" lastFinishedPulling="2026-03-08 04:12:11.095269488 +0000 UTC m=+1143.194023878" observedRunningTime="2026-03-08 04:12:12.251591433 +0000 UTC m=+1144.350345793" watchObservedRunningTime="2026-03-08 04:12:12.266325585 +0000 UTC m=+1144.365079925" Mar 08 04:12:12.282914 master-0 kubenswrapper[18592]: I0308 04:12:12.282318 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-2c54k" podStartSLOduration=2.151041281 podStartE2EDuration="8.282300349s" podCreationTimestamp="2026-03-08 04:12:04 +0000 UTC" firstStartedPulling="2026-03-08 04:12:04.962220262 +0000 UTC m=+1137.060974612" lastFinishedPulling="2026-03-08 04:12:11.09347932 +0000 UTC m=+1143.192233680" observedRunningTime="2026-03-08 04:12:12.27631887 +0000 UTC m=+1144.375073220" watchObservedRunningTime="2026-03-08 04:12:12.282300349 +0000 UTC m=+1144.381054699" Mar 08 04:12:13.158566 master-0 kubenswrapper[18592]: I0308 04:12:13.158465 18592 generic.go:334] "Generic (PLEG): container finished" podID="e1532f83-fda2-443b-9bdb-aa4de5c66a13" containerID="71e07a5457c7977bb84a6fb8d01f03ddbd69115ea7e66dbfdc86c91d78cb40c6" exitCode=0 Mar 08 04:12:13.159615 master-0 kubenswrapper[18592]: I0308 04:12:13.158645 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-f69bb" event={"ID":"e1532f83-fda2-443b-9bdb-aa4de5c66a13","Type":"ContainerDied","Data":"71e07a5457c7977bb84a6fb8d01f03ddbd69115ea7e66dbfdc86c91d78cb40c6"} Mar 08 04:12:14.271741 master-0 kubenswrapper[18592]: I0308 04:12:14.271651 18592 generic.go:334] "Generic (PLEG): container finished" podID="e1532f83-fda2-443b-9bdb-aa4de5c66a13" 
containerID="d24a64a584305f8cdf85ecacbe5ffb2c199d67013185e1879b47555ee50bf1d6" exitCode=0 Mar 08 04:12:14.281185 master-0 kubenswrapper[18592]: I0308 04:12:14.281133 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-f69bb" event={"ID":"e1532f83-fda2-443b-9bdb-aa4de5c66a13","Type":"ContainerDied","Data":"d24a64a584305f8cdf85ecacbe5ffb2c199d67013185e1879b47555ee50bf1d6"} Mar 08 04:12:14.316717 master-0 kubenswrapper[18592]: I0308 04:12:14.316500 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-msbrs" Mar 08 04:12:15.153771 master-0 kubenswrapper[18592]: I0308 04:12:15.153697 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5795766c6-lq2d8" Mar 08 04:12:15.153771 master-0 kubenswrapper[18592]: I0308 04:12:15.153755 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-5795766c6-lq2d8" Mar 08 04:12:15.159909 master-0 kubenswrapper[18592]: I0308 04:12:15.159267 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5795766c6-lq2d8" Mar 08 04:12:15.297923 master-0 kubenswrapper[18592]: I0308 04:12:15.297833 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-f69bb" event={"ID":"e1532f83-fda2-443b-9bdb-aa4de5c66a13","Type":"ContainerStarted","Data":"21f47b390e8de01c86d31b3ff1e6a2fa5bd2e4dfe3349bb9e194859efc7fe3d9"} Mar 08 04:12:15.297923 master-0 kubenswrapper[18592]: I0308 04:12:15.297906 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-f69bb" event={"ID":"e1532f83-fda2-443b-9bdb-aa4de5c66a13","Type":"ContainerStarted","Data":"ebd9e5267cf969f6fe95d1eddeebd60327a41fcdf1a733255a7c05d7fe4cba2f"} Mar 08 04:12:15.297923 master-0 kubenswrapper[18592]: I0308 04:12:15.297921 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-f69bb" 
event={"ID":"e1532f83-fda2-443b-9bdb-aa4de5c66a13","Type":"ContainerStarted","Data":"ec83d1f298c4799393144248c3fdd25403b0f65775cf9331d7f1ed57ca3a4594"} Mar 08 04:12:15.299100 master-0 kubenswrapper[18592]: I0308 04:12:15.297950 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-f69bb" event={"ID":"e1532f83-fda2-443b-9bdb-aa4de5c66a13","Type":"ContainerStarted","Data":"b9e50eaff694b3e566df5828ad7ed2e62bb35faef21f77a60498a622f2b8bc82"} Mar 08 04:12:15.299100 master-0 kubenswrapper[18592]: I0308 04:12:15.297964 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-f69bb" event={"ID":"e1532f83-fda2-443b-9bdb-aa4de5c66a13","Type":"ContainerStarted","Data":"a71dd3d7a4c95789225573f70357dd81fdf4151b742cc691b876e4ee5b37d490"} Mar 08 04:12:15.302352 master-0 kubenswrapper[18592]: I0308 04:12:15.301236 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5795766c6-lq2d8" Mar 08 04:12:15.409713 master-0 kubenswrapper[18592]: I0308 04:12:15.408392 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7b94585997-5czft"] Mar 08 04:12:16.317356 master-0 kubenswrapper[18592]: I0308 04:12:16.317251 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-f69bb" event={"ID":"e1532f83-fda2-443b-9bdb-aa4de5c66a13","Type":"ContainerStarted","Data":"f3aebe88f0e519e0109ebf4a2d376f5d243875f6d33615c8a2c07cfe90866e4f"} Mar 08 04:12:16.317356 master-0 kubenswrapper[18592]: I0308 04:12:16.317350 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-f69bb" Mar 08 04:12:16.361324 master-0 kubenswrapper[18592]: I0308 04:12:16.361195 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-f69bb" podStartSLOduration=6.426265224 podStartE2EDuration="14.361167728s" podCreationTimestamp="2026-03-08 04:12:02 +0000 UTC" 
firstStartedPulling="2026-03-08 04:12:03.20050957 +0000 UTC m=+1135.299263930" lastFinishedPulling="2026-03-08 04:12:11.135412054 +0000 UTC m=+1143.234166434" observedRunningTime="2026-03-08 04:12:16.350583167 +0000 UTC m=+1148.449337557" watchObservedRunningTime="2026-03-08 04:12:16.361167728 +0000 UTC m=+1148.459922118" Mar 08 04:12:18.075055 master-0 kubenswrapper[18592]: I0308 04:12:18.074663 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-f69bb" Mar 08 04:12:18.158654 master-0 kubenswrapper[18592]: I0308 04:12:18.158592 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-f69bb" Mar 08 04:12:19.955187 master-0 kubenswrapper[18592]: I0308 04:12:19.955087 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-2c54k" Mar 08 04:12:22.825194 master-0 kubenswrapper[18592]: I0308 04:12:22.825085 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-86ddb6bd46-wxdl4" Mar 08 04:12:23.283466 master-0 kubenswrapper[18592]: I0308 04:12:23.283359 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-lwx48" Mar 08 04:12:25.197437 master-0 kubenswrapper[18592]: I0308 04:12:25.197350 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-786f45cff4-ssvrb" Mar 08 04:12:30.250421 master-0 kubenswrapper[18592]: I0308 04:12:30.250339 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-storage/vg-manager-czdzp"] Mar 08 04:12:30.263298 master-0 kubenswrapper[18592]: I0308 04:12:30.263220 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-storage/vg-manager-czdzp" Mar 08 04:12:30.268789 master-0 kubenswrapper[18592]: I0308 04:12:30.268740 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-storage"/"vg-manager-metrics-cert" Mar 08 04:12:30.296813 master-0 kubenswrapper[18592]: I0308 04:12:30.294637 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-storage/vg-manager-czdzp"] Mar 08 04:12:30.315012 master-0 kubenswrapper[18592]: I0308 04:12:30.314714 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lvmd-config\" (UniqueName: \"kubernetes.io/host-path/99655eb4-10c0-4fa1-a87a-d05ed22409e0-lvmd-config\") pod \"vg-manager-czdzp\" (UID: \"99655eb4-10c0-4fa1-a87a-d05ed22409e0\") " pod="openshift-storage/vg-manager-czdzp" Mar 08 04:12:30.315012 master-0 kubenswrapper[18592]: I0308 04:12:30.314794 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/99655eb4-10c0-4fa1-a87a-d05ed22409e0-registration-dir\") pod \"vg-manager-czdzp\" (UID: \"99655eb4-10c0-4fa1-a87a-d05ed22409e0\") " pod="openshift-storage/vg-manager-czdzp" Mar 08 04:12:30.315963 master-0 kubenswrapper[18592]: I0308 04:12:30.315393 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/99655eb4-10c0-4fa1-a87a-d05ed22409e0-node-plugin-dir\") pod \"vg-manager-czdzp\" (UID: \"99655eb4-10c0-4fa1-a87a-d05ed22409e0\") " pod="openshift-storage/vg-manager-czdzp" Mar 08 04:12:30.315963 master-0 kubenswrapper[18592]: I0308 04:12:30.315505 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkkts\" (UniqueName: \"kubernetes.io/projected/99655eb4-10c0-4fa1-a87a-d05ed22409e0-kube-api-access-bkkts\") pod \"vg-manager-czdzp\" (UID: 
\"99655eb4-10c0-4fa1-a87a-d05ed22409e0\") " pod="openshift-storage/vg-manager-czdzp" Mar 08 04:12:30.316912 master-0 kubenswrapper[18592]: I0308 04:12:30.316882 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/99655eb4-10c0-4fa1-a87a-d05ed22409e0-csi-plugin-dir\") pod \"vg-manager-czdzp\" (UID: \"99655eb4-10c0-4fa1-a87a-d05ed22409e0\") " pod="openshift-storage/vg-manager-czdzp" Mar 08 04:12:30.317009 master-0 kubenswrapper[18592]: I0308 04:12:30.316981 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-volumes-dir\" (UniqueName: \"kubernetes.io/host-path/99655eb4-10c0-4fa1-a87a-d05ed22409e0-pod-volumes-dir\") pod \"vg-manager-czdzp\" (UID: \"99655eb4-10c0-4fa1-a87a-d05ed22409e0\") " pod="openshift-storage/vg-manager-czdzp" Mar 08 04:12:30.317124 master-0 kubenswrapper[18592]: I0308 04:12:30.317088 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/99655eb4-10c0-4fa1-a87a-d05ed22409e0-metrics-cert\") pod \"vg-manager-czdzp\" (UID: \"99655eb4-10c0-4fa1-a87a-d05ed22409e0\") " pod="openshift-storage/vg-manager-czdzp" Mar 08 04:12:30.317227 master-0 kubenswrapper[18592]: I0308 04:12:30.317190 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/99655eb4-10c0-4fa1-a87a-d05ed22409e0-device-dir\") pod \"vg-manager-czdzp\" (UID: \"99655eb4-10c0-4fa1-a87a-d05ed22409e0\") " pod="openshift-storage/vg-manager-czdzp" Mar 08 04:12:30.317322 master-0 kubenswrapper[18592]: I0308 04:12:30.317296 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/99655eb4-10c0-4fa1-a87a-d05ed22409e0-sys\") pod \"vg-manager-czdzp\" (UID: 
\"99655eb4-10c0-4fa1-a87a-d05ed22409e0\") " pod="openshift-storage/vg-manager-czdzp" Mar 08 04:12:30.317374 master-0 kubenswrapper[18592]: I0308 04:12:30.317346 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-udev\" (UniqueName: \"kubernetes.io/host-path/99655eb4-10c0-4fa1-a87a-d05ed22409e0-run-udev\") pod \"vg-manager-czdzp\" (UID: \"99655eb4-10c0-4fa1-a87a-d05ed22409e0\") " pod="openshift-storage/vg-manager-czdzp" Mar 08 04:12:30.317511 master-0 kubenswrapper[18592]: I0308 04:12:30.317450 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"file-lock-dir\" (UniqueName: \"kubernetes.io/host-path/99655eb4-10c0-4fa1-a87a-d05ed22409e0-file-lock-dir\") pod \"vg-manager-czdzp\" (UID: \"99655eb4-10c0-4fa1-a87a-d05ed22409e0\") " pod="openshift-storage/vg-manager-czdzp" Mar 08 04:12:30.419478 master-0 kubenswrapper[18592]: I0308 04:12:30.419416 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-volumes-dir\" (UniqueName: \"kubernetes.io/host-path/99655eb4-10c0-4fa1-a87a-d05ed22409e0-pod-volumes-dir\") pod \"vg-manager-czdzp\" (UID: \"99655eb4-10c0-4fa1-a87a-d05ed22409e0\") " pod="openshift-storage/vg-manager-czdzp" Mar 08 04:12:30.419740 master-0 kubenswrapper[18592]: I0308 04:12:30.419500 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/99655eb4-10c0-4fa1-a87a-d05ed22409e0-metrics-cert\") pod \"vg-manager-czdzp\" (UID: \"99655eb4-10c0-4fa1-a87a-d05ed22409e0\") " pod="openshift-storage/vg-manager-czdzp" Mar 08 04:12:30.419740 master-0 kubenswrapper[18592]: I0308 04:12:30.419544 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/99655eb4-10c0-4fa1-a87a-d05ed22409e0-device-dir\") pod \"vg-manager-czdzp\" (UID: \"99655eb4-10c0-4fa1-a87a-d05ed22409e0\") " 
pod="openshift-storage/vg-manager-czdzp" Mar 08 04:12:30.419740 master-0 kubenswrapper[18592]: I0308 04:12:30.419595 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/99655eb4-10c0-4fa1-a87a-d05ed22409e0-sys\") pod \"vg-manager-czdzp\" (UID: \"99655eb4-10c0-4fa1-a87a-d05ed22409e0\") " pod="openshift-storage/vg-manager-czdzp" Mar 08 04:12:30.419740 master-0 kubenswrapper[18592]: I0308 04:12:30.419622 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-udev\" (UniqueName: \"kubernetes.io/host-path/99655eb4-10c0-4fa1-a87a-d05ed22409e0-run-udev\") pod \"vg-manager-czdzp\" (UID: \"99655eb4-10c0-4fa1-a87a-d05ed22409e0\") " pod="openshift-storage/vg-manager-czdzp" Mar 08 04:12:30.419740 master-0 kubenswrapper[18592]: I0308 04:12:30.419632 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-volumes-dir\" (UniqueName: \"kubernetes.io/host-path/99655eb4-10c0-4fa1-a87a-d05ed22409e0-pod-volumes-dir\") pod \"vg-manager-czdzp\" (UID: \"99655eb4-10c0-4fa1-a87a-d05ed22409e0\") " pod="openshift-storage/vg-manager-czdzp" Mar 08 04:12:30.419740 master-0 kubenswrapper[18592]: I0308 04:12:30.419650 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"file-lock-dir\" (UniqueName: \"kubernetes.io/host-path/99655eb4-10c0-4fa1-a87a-d05ed22409e0-file-lock-dir\") pod \"vg-manager-czdzp\" (UID: \"99655eb4-10c0-4fa1-a87a-d05ed22409e0\") " pod="openshift-storage/vg-manager-czdzp" Mar 08 04:12:30.419740 master-0 kubenswrapper[18592]: I0308 04:12:30.419708 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lvmd-config\" (UniqueName: \"kubernetes.io/host-path/99655eb4-10c0-4fa1-a87a-d05ed22409e0-lvmd-config\") pod \"vg-manager-czdzp\" (UID: \"99655eb4-10c0-4fa1-a87a-d05ed22409e0\") " pod="openshift-storage/vg-manager-czdzp" Mar 08 04:12:30.419740 master-0 kubenswrapper[18592]: I0308 
04:12:30.419734 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/99655eb4-10c0-4fa1-a87a-d05ed22409e0-registration-dir\") pod \"vg-manager-czdzp\" (UID: \"99655eb4-10c0-4fa1-a87a-d05ed22409e0\") " pod="openshift-storage/vg-manager-czdzp" Mar 08 04:12:30.420125 master-0 kubenswrapper[18592]: I0308 04:12:30.419761 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/99655eb4-10c0-4fa1-a87a-d05ed22409e0-node-plugin-dir\") pod \"vg-manager-czdzp\" (UID: \"99655eb4-10c0-4fa1-a87a-d05ed22409e0\") " pod="openshift-storage/vg-manager-czdzp" Mar 08 04:12:30.420125 master-0 kubenswrapper[18592]: I0308 04:12:30.419783 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkkts\" (UniqueName: \"kubernetes.io/projected/99655eb4-10c0-4fa1-a87a-d05ed22409e0-kube-api-access-bkkts\") pod \"vg-manager-czdzp\" (UID: \"99655eb4-10c0-4fa1-a87a-d05ed22409e0\") " pod="openshift-storage/vg-manager-czdzp" Mar 08 04:12:30.420125 master-0 kubenswrapper[18592]: I0308 04:12:30.419846 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/99655eb4-10c0-4fa1-a87a-d05ed22409e0-csi-plugin-dir\") pod \"vg-manager-czdzp\" (UID: \"99655eb4-10c0-4fa1-a87a-d05ed22409e0\") " pod="openshift-storage/vg-manager-czdzp" Mar 08 04:12:30.420125 master-0 kubenswrapper[18592]: I0308 04:12:30.419957 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"file-lock-dir\" (UniqueName: \"kubernetes.io/host-path/99655eb4-10c0-4fa1-a87a-d05ed22409e0-file-lock-dir\") pod \"vg-manager-czdzp\" (UID: \"99655eb4-10c0-4fa1-a87a-d05ed22409e0\") " pod="openshift-storage/vg-manager-czdzp" Mar 08 04:12:30.420125 master-0 kubenswrapper[18592]: I0308 04:12:30.420055 18592 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"csi-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/99655eb4-10c0-4fa1-a87a-d05ed22409e0-csi-plugin-dir\") pod \"vg-manager-czdzp\" (UID: \"99655eb4-10c0-4fa1-a87a-d05ed22409e0\") " pod="openshift-storage/vg-manager-czdzp" Mar 08 04:12:30.420349 master-0 kubenswrapper[18592]: I0308 04:12:30.420154 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lvmd-config\" (UniqueName: \"kubernetes.io/host-path/99655eb4-10c0-4fa1-a87a-d05ed22409e0-lvmd-config\") pod \"vg-manager-czdzp\" (UID: \"99655eb4-10c0-4fa1-a87a-d05ed22409e0\") " pod="openshift-storage/vg-manager-czdzp" Mar 08 04:12:30.420349 master-0 kubenswrapper[18592]: I0308 04:12:30.420198 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/99655eb4-10c0-4fa1-a87a-d05ed22409e0-registration-dir\") pod \"vg-manager-czdzp\" (UID: \"99655eb4-10c0-4fa1-a87a-d05ed22409e0\") " pod="openshift-storage/vg-manager-czdzp" Mar 08 04:12:30.420349 master-0 kubenswrapper[18592]: I0308 04:12:30.420290 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/99655eb4-10c0-4fa1-a87a-d05ed22409e0-node-plugin-dir\") pod \"vg-manager-czdzp\" (UID: \"99655eb4-10c0-4fa1-a87a-d05ed22409e0\") " pod="openshift-storage/vg-manager-czdzp" Mar 08 04:12:30.420600 master-0 kubenswrapper[18592]: I0308 04:12:30.420490 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/99655eb4-10c0-4fa1-a87a-d05ed22409e0-sys\") pod \"vg-manager-czdzp\" (UID: \"99655eb4-10c0-4fa1-a87a-d05ed22409e0\") " pod="openshift-storage/vg-manager-czdzp" Mar 08 04:12:30.420600 master-0 kubenswrapper[18592]: I0308 04:12:30.420537 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: 
\"kubernetes.io/host-path/99655eb4-10c0-4fa1-a87a-d05ed22409e0-device-dir\") pod \"vg-manager-czdzp\" (UID: \"99655eb4-10c0-4fa1-a87a-d05ed22409e0\") " pod="openshift-storage/vg-manager-czdzp" Mar 08 04:12:30.420600 master-0 kubenswrapper[18592]: I0308 04:12:30.420517 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-udev\" (UniqueName: \"kubernetes.io/host-path/99655eb4-10c0-4fa1-a87a-d05ed22409e0-run-udev\") pod \"vg-manager-czdzp\" (UID: \"99655eb4-10c0-4fa1-a87a-d05ed22409e0\") " pod="openshift-storage/vg-manager-czdzp" Mar 08 04:12:30.423090 master-0 kubenswrapper[18592]: I0308 04:12:30.423053 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/99655eb4-10c0-4fa1-a87a-d05ed22409e0-metrics-cert\") pod \"vg-manager-czdzp\" (UID: \"99655eb4-10c0-4fa1-a87a-d05ed22409e0\") " pod="openshift-storage/vg-manager-czdzp" Mar 08 04:12:30.439722 master-0 kubenswrapper[18592]: I0308 04:12:30.439685 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkkts\" (UniqueName: \"kubernetes.io/projected/99655eb4-10c0-4fa1-a87a-d05ed22409e0-kube-api-access-bkkts\") pod \"vg-manager-czdzp\" (UID: \"99655eb4-10c0-4fa1-a87a-d05ed22409e0\") " pod="openshift-storage/vg-manager-czdzp" Mar 08 04:12:30.589431 master-0 kubenswrapper[18592]: I0308 04:12:30.589258 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-storage/vg-manager-czdzp" Mar 08 04:12:31.121264 master-0 kubenswrapper[18592]: W0308 04:12:31.121195 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99655eb4_10c0_4fa1_a87a_d05ed22409e0.slice/crio-0f4ddbde45f8a867f8161cca6ff362c4259e20b9bed6c9919f9ba8e98e212a9c WatchSource:0}: Error finding container 0f4ddbde45f8a867f8161cca6ff362c4259e20b9bed6c9919f9ba8e98e212a9c: Status 404 returned error can't find the container with id 0f4ddbde45f8a867f8161cca6ff362c4259e20b9bed6c9919f9ba8e98e212a9c Mar 08 04:12:31.128589 master-0 kubenswrapper[18592]: I0308 04:12:31.128530 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-storage/vg-manager-czdzp"] Mar 08 04:12:31.493914 master-0 kubenswrapper[18592]: I0308 04:12:31.493793 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/vg-manager-czdzp" event={"ID":"99655eb4-10c0-4fa1-a87a-d05ed22409e0","Type":"ContainerStarted","Data":"22739d01f091f6c87e31bbfead9c6fc4bcfda6e98469eab3c91ea5c4fb10afd0"} Mar 08 04:12:31.493914 master-0 kubenswrapper[18592]: I0308 04:12:31.493907 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/vg-manager-czdzp" event={"ID":"99655eb4-10c0-4fa1-a87a-d05ed22409e0","Type":"ContainerStarted","Data":"0f4ddbde45f8a867f8161cca6ff362c4259e20b9bed6c9919f9ba8e98e212a9c"} Mar 08 04:12:31.530640 master-0 kubenswrapper[18592]: I0308 04:12:31.530531 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-storage/vg-manager-czdzp" podStartSLOduration=1.530499817 podStartE2EDuration="1.530499817s" podCreationTimestamp="2026-03-08 04:12:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 04:12:31.52796397 +0000 UTC m=+1163.626718330" watchObservedRunningTime="2026-03-08 04:12:31.530499817 +0000 
UTC m=+1163.629254207" Mar 08 04:12:33.079464 master-0 kubenswrapper[18592]: I0308 04:12:33.078185 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-f69bb" Mar 08 04:12:33.518599 master-0 kubenswrapper[18592]: I0308 04:12:33.518547 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-storage_vg-manager-czdzp_99655eb4-10c0-4fa1-a87a-d05ed22409e0/vg-manager/0.log" Mar 08 04:12:33.518800 master-0 kubenswrapper[18592]: I0308 04:12:33.518604 18592 generic.go:334] "Generic (PLEG): container finished" podID="99655eb4-10c0-4fa1-a87a-d05ed22409e0" containerID="22739d01f091f6c87e31bbfead9c6fc4bcfda6e98469eab3c91ea5c4fb10afd0" exitCode=1 Mar 08 04:12:33.518800 master-0 kubenswrapper[18592]: I0308 04:12:33.518633 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/vg-manager-czdzp" event={"ID":"99655eb4-10c0-4fa1-a87a-d05ed22409e0","Type":"ContainerDied","Data":"22739d01f091f6c87e31bbfead9c6fc4bcfda6e98469eab3c91ea5c4fb10afd0"} Mar 08 04:12:33.519166 master-0 kubenswrapper[18592]: I0308 04:12:33.519139 18592 scope.go:117] "RemoveContainer" containerID="22739d01f091f6c87e31bbfead9c6fc4bcfda6e98469eab3c91ea5c4fb10afd0" Mar 08 04:12:33.853679 master-0 kubenswrapper[18592]: I0308 04:12:33.853533 18592 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/topolvm.io-reg.sock" Mar 08 04:12:34.536867 master-0 kubenswrapper[18592]: I0308 04:12:34.533806 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-storage_vg-manager-czdzp_99655eb4-10c0-4fa1-a87a-d05ed22409e0/vg-manager/0.log" Mar 08 04:12:34.536867 master-0 kubenswrapper[18592]: I0308 04:12:34.533991 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/vg-manager-czdzp" 
event={"ID":"99655eb4-10c0-4fa1-a87a-d05ed22409e0","Type":"ContainerStarted","Data":"82d6aa12706e1179056a67c04e163be76107f7d0e55dfa54aa5845e4f7f0a1c3"} Mar 08 04:12:34.603443 master-0 kubenswrapper[18592]: I0308 04:12:34.603183 18592 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/topolvm.io-reg.sock","Timestamp":"2026-03-08T04:12:33.853571187Z","Handler":null,"Name":""} Mar 08 04:12:34.605720 master-0 kubenswrapper[18592]: I0308 04:12:34.605678 18592 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: topolvm.io endpoint: /var/lib/kubelet/plugins/topolvm.io/node/csi-topolvm.sock versions: 1.0.0 Mar 08 04:12:34.605775 master-0 kubenswrapper[18592]: I0308 04:12:34.605727 18592 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: topolvm.io at endpoint: /var/lib/kubelet/plugins/topolvm.io/node/csi-topolvm.sock Mar 08 04:12:40.462939 master-0 kubenswrapper[18592]: I0308 04:12:40.462758 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-7b94585997-5czft" podUID="22aef614-003c-4e25-97fc-7386104c00d4" containerName="console" containerID="cri-o://deef64b51c530954aa19af5dc030e4d742635c0cf831046cf3fec50b1f6b4f3a" gracePeriod=15 Mar 08 04:12:40.591799 master-0 kubenswrapper[18592]: I0308 04:12:40.591699 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-storage/vg-manager-czdzp" Mar 08 04:12:40.594064 master-0 kubenswrapper[18592]: I0308 04:12:40.594008 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-storage/vg-manager-czdzp" Mar 08 04:12:40.620015 master-0 kubenswrapper[18592]: I0308 04:12:40.619970 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7b94585997-5czft_22aef614-003c-4e25-97fc-7386104c00d4/console/0.log" Mar 08 04:12:40.620321 master-0 kubenswrapper[18592]: I0308 
04:12:40.620262 18592 generic.go:334] "Generic (PLEG): container finished" podID="22aef614-003c-4e25-97fc-7386104c00d4" containerID="deef64b51c530954aa19af5dc030e4d742635c0cf831046cf3fec50b1f6b4f3a" exitCode=2 Mar 08 04:12:40.620526 master-0 kubenswrapper[18592]: I0308 04:12:40.620321 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b94585997-5czft" event={"ID":"22aef614-003c-4e25-97fc-7386104c00d4","Type":"ContainerDied","Data":"deef64b51c530954aa19af5dc030e4d742635c0cf831046cf3fec50b1f6b4f3a"} Mar 08 04:12:40.621083 master-0 kubenswrapper[18592]: I0308 04:12:40.621034 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-storage/vg-manager-czdzp" Mar 08 04:12:40.622868 master-0 kubenswrapper[18592]: I0308 04:12:40.622770 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-storage/vg-manager-czdzp" Mar 08 04:12:40.998332 master-0 kubenswrapper[18592]: I0308 04:12:40.998152 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7b94585997-5czft_22aef614-003c-4e25-97fc-7386104c00d4/console/0.log" Mar 08 04:12:40.998332 master-0 kubenswrapper[18592]: I0308 04:12:40.998263 18592 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7b94585997-5czft" Mar 08 04:12:41.008472 master-0 kubenswrapper[18592]: I0308 04:12:41.008384 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/22aef614-003c-4e25-97fc-7386104c00d4-oauth-serving-cert\") pod \"22aef614-003c-4e25-97fc-7386104c00d4\" (UID: \"22aef614-003c-4e25-97fc-7386104c00d4\") " Mar 08 04:12:41.008707 master-0 kubenswrapper[18592]: I0308 04:12:41.008506 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22aef614-003c-4e25-97fc-7386104c00d4-trusted-ca-bundle\") pod \"22aef614-003c-4e25-97fc-7386104c00d4\" (UID: \"22aef614-003c-4e25-97fc-7386104c00d4\") " Mar 08 04:12:41.008707 master-0 kubenswrapper[18592]: I0308 04:12:41.008574 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/22aef614-003c-4e25-97fc-7386104c00d4-console-serving-cert\") pod \"22aef614-003c-4e25-97fc-7386104c00d4\" (UID: \"22aef614-003c-4e25-97fc-7386104c00d4\") " Mar 08 04:12:41.008707 master-0 kubenswrapper[18592]: I0308 04:12:41.008654 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/22aef614-003c-4e25-97fc-7386104c00d4-service-ca\") pod \"22aef614-003c-4e25-97fc-7386104c00d4\" (UID: \"22aef614-003c-4e25-97fc-7386104c00d4\") " Mar 08 04:12:41.009425 master-0 kubenswrapper[18592]: I0308 04:12:41.008763 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/22aef614-003c-4e25-97fc-7386104c00d4-console-config\") pod \"22aef614-003c-4e25-97fc-7386104c00d4\" (UID: \"22aef614-003c-4e25-97fc-7386104c00d4\") " Mar 08 04:12:41.009803 master-0 kubenswrapper[18592]: I0308 
04:12:41.009462 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/22aef614-003c-4e25-97fc-7386104c00d4-console-oauth-config\") pod \"22aef614-003c-4e25-97fc-7386104c00d4\" (UID: \"22aef614-003c-4e25-97fc-7386104c00d4\") " Mar 08 04:12:41.009886 master-0 kubenswrapper[18592]: I0308 04:12:41.009855 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jb9l2\" (UniqueName: \"kubernetes.io/projected/22aef614-003c-4e25-97fc-7386104c00d4-kube-api-access-jb9l2\") pod \"22aef614-003c-4e25-97fc-7386104c00d4\" (UID: \"22aef614-003c-4e25-97fc-7386104c00d4\") " Mar 08 04:12:41.009932 master-0 kubenswrapper[18592]: I0308 04:12:41.008924 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22aef614-003c-4e25-97fc-7386104c00d4-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "22aef614-003c-4e25-97fc-7386104c00d4" (UID: "22aef614-003c-4e25-97fc-7386104c00d4"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:12:41.009976 master-0 kubenswrapper[18592]: I0308 04:12:41.009278 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22aef614-003c-4e25-97fc-7386104c00d4-console-config" (OuterVolumeSpecName: "console-config") pod "22aef614-003c-4e25-97fc-7386104c00d4" (UID: "22aef614-003c-4e25-97fc-7386104c00d4"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:12:41.009976 master-0 kubenswrapper[18592]: I0308 04:12:41.009365 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22aef614-003c-4e25-97fc-7386104c00d4-service-ca" (OuterVolumeSpecName: "service-ca") pod "22aef614-003c-4e25-97fc-7386104c00d4" (UID: "22aef614-003c-4e25-97fc-7386104c00d4"). 
InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:12:41.009976 master-0 kubenswrapper[18592]: I0308 04:12:41.009580 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22aef614-003c-4e25-97fc-7386104c00d4-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "22aef614-003c-4e25-97fc-7386104c00d4" (UID: "22aef614-003c-4e25-97fc-7386104c00d4"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:12:41.011962 master-0 kubenswrapper[18592]: I0308 04:12:41.011927 18592 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/22aef614-003c-4e25-97fc-7386104c00d4-oauth-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 08 04:12:41.011962 master-0 kubenswrapper[18592]: I0308 04:12:41.011957 18592 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22aef614-003c-4e25-97fc-7386104c00d4-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 04:12:41.012064 master-0 kubenswrapper[18592]: I0308 04:12:41.011971 18592 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/22aef614-003c-4e25-97fc-7386104c00d4-service-ca\") on node \"master-0\" DevicePath \"\"" Mar 08 04:12:41.012064 master-0 kubenswrapper[18592]: I0308 04:12:41.011984 18592 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/22aef614-003c-4e25-97fc-7386104c00d4-console-config\") on node \"master-0\" DevicePath \"\"" Mar 08 04:12:41.013122 master-0 kubenswrapper[18592]: I0308 04:12:41.013055 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22aef614-003c-4e25-97fc-7386104c00d4-kube-api-access-jb9l2" (OuterVolumeSpecName: "kube-api-access-jb9l2") pod 
"22aef614-003c-4e25-97fc-7386104c00d4" (UID: "22aef614-003c-4e25-97fc-7386104c00d4"). InnerVolumeSpecName "kube-api-access-jb9l2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 04:12:41.014388 master-0 kubenswrapper[18592]: I0308 04:12:41.014353 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22aef614-003c-4e25-97fc-7386104c00d4-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "22aef614-003c-4e25-97fc-7386104c00d4" (UID: "22aef614-003c-4e25-97fc-7386104c00d4"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:12:41.016106 master-0 kubenswrapper[18592]: I0308 04:12:41.016065 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22aef614-003c-4e25-97fc-7386104c00d4-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "22aef614-003c-4e25-97fc-7386104c00d4" (UID: "22aef614-003c-4e25-97fc-7386104c00d4"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:12:41.113150 master-0 kubenswrapper[18592]: I0308 04:12:41.113088 18592 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/22aef614-003c-4e25-97fc-7386104c00d4-console-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 08 04:12:41.113150 master-0 kubenswrapper[18592]: I0308 04:12:41.113130 18592 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/22aef614-003c-4e25-97fc-7386104c00d4-console-oauth-config\") on node \"master-0\" DevicePath \"\"" Mar 08 04:12:41.113150 master-0 kubenswrapper[18592]: I0308 04:12:41.113143 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jb9l2\" (UniqueName: \"kubernetes.io/projected/22aef614-003c-4e25-97fc-7386104c00d4-kube-api-access-jb9l2\") on node \"master-0\" DevicePath \"\"" Mar 08 04:12:41.636173 master-0 kubenswrapper[18592]: I0308 04:12:41.633882 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7b94585997-5czft_22aef614-003c-4e25-97fc-7386104c00d4/console/0.log" Mar 08 04:12:41.636173 master-0 kubenswrapper[18592]: I0308 04:12:41.634035 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b94585997-5czft" event={"ID":"22aef614-003c-4e25-97fc-7386104c00d4","Type":"ContainerDied","Data":"1ab84d6571aac061317d69a9d313d89b7992fa815baece92ecd2c6bf8ca6e059"} Mar 08 04:12:41.636173 master-0 kubenswrapper[18592]: I0308 04:12:41.634085 18592 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7b94585997-5czft" Mar 08 04:12:41.636173 master-0 kubenswrapper[18592]: I0308 04:12:41.634133 18592 scope.go:117] "RemoveContainer" containerID="deef64b51c530954aa19af5dc030e4d742635c0cf831046cf3fec50b1f6b4f3a" Mar 08 04:12:41.699982 master-0 kubenswrapper[18592]: I0308 04:12:41.699394 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7b94585997-5czft"] Mar 08 04:12:41.712929 master-0 kubenswrapper[18592]: I0308 04:12:41.712791 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7b94585997-5czft"] Mar 08 04:12:42.162796 master-0 kubenswrapper[18592]: I0308 04:12:42.162683 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22aef614-003c-4e25-97fc-7386104c00d4" path="/var/lib/kubelet/pods/22aef614-003c-4e25-97fc-7386104c00d4/volumes" Mar 08 04:12:42.734280 master-0 kubenswrapper[18592]: I0308 04:12:42.734217 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-jtfgj"] Mar 08 04:12:42.734805 master-0 kubenswrapper[18592]: E0308 04:12:42.734576 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22aef614-003c-4e25-97fc-7386104c00d4" containerName="console" Mar 08 04:12:42.734805 master-0 kubenswrapper[18592]: I0308 04:12:42.734590 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="22aef614-003c-4e25-97fc-7386104c00d4" containerName="console" Mar 08 04:12:42.734805 master-0 kubenswrapper[18592]: I0308 04:12:42.734775 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="22aef614-003c-4e25-97fc-7386104c00d4" containerName="console" Mar 08 04:12:42.736133 master-0 kubenswrapper[18592]: I0308 04:12:42.735340 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-jtfgj" Mar 08 04:12:42.739888 master-0 kubenswrapper[18592]: I0308 04:12:42.739509 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 08 04:12:42.739888 master-0 kubenswrapper[18592]: I0308 04:12:42.739844 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 08 04:12:42.781844 master-0 kubenswrapper[18592]: I0308 04:12:42.780337 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-jtfgj"] Mar 08 04:12:42.842481 master-0 kubenswrapper[18592]: I0308 04:12:42.842426 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzkgc\" (UniqueName: \"kubernetes.io/projected/8e8d50dd-8695-40fd-b24f-110e4e3477bf-kube-api-access-tzkgc\") pod \"openstack-operator-index-jtfgj\" (UID: \"8e8d50dd-8695-40fd-b24f-110e4e3477bf\") " pod="openstack-operators/openstack-operator-index-jtfgj" Mar 08 04:12:42.944111 master-0 kubenswrapper[18592]: I0308 04:12:42.944030 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzkgc\" (UniqueName: \"kubernetes.io/projected/8e8d50dd-8695-40fd-b24f-110e4e3477bf-kube-api-access-tzkgc\") pod \"openstack-operator-index-jtfgj\" (UID: \"8e8d50dd-8695-40fd-b24f-110e4e3477bf\") " pod="openstack-operators/openstack-operator-index-jtfgj" Mar 08 04:12:42.958439 master-0 kubenswrapper[18592]: I0308 04:12:42.958382 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzkgc\" (UniqueName: \"kubernetes.io/projected/8e8d50dd-8695-40fd-b24f-110e4e3477bf-kube-api-access-tzkgc\") pod \"openstack-operator-index-jtfgj\" (UID: \"8e8d50dd-8695-40fd-b24f-110e4e3477bf\") " pod="openstack-operators/openstack-operator-index-jtfgj" Mar 08 04:12:43.107935 master-0 
kubenswrapper[18592]: I0308 04:12:43.107389 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-jtfgj" Mar 08 04:12:43.601972 master-0 kubenswrapper[18592]: I0308 04:12:43.601898 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-jtfgj"] Mar 08 04:12:43.664740 master-0 kubenswrapper[18592]: I0308 04:12:43.664645 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-jtfgj" event={"ID":"8e8d50dd-8695-40fd-b24f-110e4e3477bf","Type":"ContainerStarted","Data":"59ecffe6abe5b30ddf5dddff016d497642ab6772eb5bab5150ca17c3d6b2f9da"} Mar 08 04:12:46.869272 master-0 kubenswrapper[18592]: I0308 04:12:46.869205 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-jtfgj"] Mar 08 04:12:47.481488 master-0 kubenswrapper[18592]: I0308 04:12:47.481389 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-rjv2t"] Mar 08 04:12:47.482424 master-0 kubenswrapper[18592]: I0308 04:12:47.482376 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-rjv2t" Mar 08 04:12:47.502070 master-0 kubenswrapper[18592]: I0308 04:12:47.501958 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-rjv2t"] Mar 08 04:12:47.546724 master-0 kubenswrapper[18592]: I0308 04:12:47.546623 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2s8n6\" (UniqueName: \"kubernetes.io/projected/30315ebc-2caa-4310-a7c0-338b80433779-kube-api-access-2s8n6\") pod \"openstack-operator-index-rjv2t\" (UID: \"30315ebc-2caa-4310-a7c0-338b80433779\") " pod="openstack-operators/openstack-operator-index-rjv2t" Mar 08 04:12:47.648741 master-0 kubenswrapper[18592]: I0308 04:12:47.648680 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2s8n6\" (UniqueName: \"kubernetes.io/projected/30315ebc-2caa-4310-a7c0-338b80433779-kube-api-access-2s8n6\") pod \"openstack-operator-index-rjv2t\" (UID: \"30315ebc-2caa-4310-a7c0-338b80433779\") " pod="openstack-operators/openstack-operator-index-rjv2t" Mar 08 04:12:47.678935 master-0 kubenswrapper[18592]: I0308 04:12:47.678870 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2s8n6\" (UniqueName: \"kubernetes.io/projected/30315ebc-2caa-4310-a7c0-338b80433779-kube-api-access-2s8n6\") pod \"openstack-operator-index-rjv2t\" (UID: \"30315ebc-2caa-4310-a7c0-338b80433779\") " pod="openstack-operators/openstack-operator-index-rjv2t" Mar 08 04:12:47.747033 master-0 kubenswrapper[18592]: I0308 04:12:47.746854 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-jtfgj" event={"ID":"8e8d50dd-8695-40fd-b24f-110e4e3477bf","Type":"ContainerStarted","Data":"b757d1ef4250ac2306f777d96f4df983b31f1fd114fc47a4ce69c91f74afeb4e"} Mar 08 04:12:47.747283 master-0 kubenswrapper[18592]: I0308 04:12:47.747064 18592 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-jtfgj" podUID="8e8d50dd-8695-40fd-b24f-110e4e3477bf" containerName="registry-server" containerID="cri-o://b757d1ef4250ac2306f777d96f4df983b31f1fd114fc47a4ce69c91f74afeb4e" gracePeriod=2 Mar 08 04:12:47.804913 master-0 kubenswrapper[18592]: I0308 04:12:47.803404 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-rjv2t" Mar 08 04:12:47.821593 master-0 kubenswrapper[18592]: I0308 04:12:47.814264 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-jtfgj" podStartSLOduration=2.409894659 podStartE2EDuration="5.814241242s" podCreationTimestamp="2026-03-08 04:12:42 +0000 UTC" firstStartedPulling="2026-03-08 04:12:43.61294699 +0000 UTC m=+1175.711701370" lastFinishedPulling="2026-03-08 04:12:47.017293603 +0000 UTC m=+1179.116047953" observedRunningTime="2026-03-08 04:12:47.80326046 +0000 UTC m=+1179.902014830" watchObservedRunningTime="2026-03-08 04:12:47.814241242 +0000 UTC m=+1179.912995612" Mar 08 04:12:48.242112 master-0 kubenswrapper[18592]: I0308 04:12:48.242064 18592 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-jtfgj" Mar 08 04:12:48.361434 master-0 kubenswrapper[18592]: I0308 04:12:48.361301 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzkgc\" (UniqueName: \"kubernetes.io/projected/8e8d50dd-8695-40fd-b24f-110e4e3477bf-kube-api-access-tzkgc\") pod \"8e8d50dd-8695-40fd-b24f-110e4e3477bf\" (UID: \"8e8d50dd-8695-40fd-b24f-110e4e3477bf\") " Mar 08 04:12:48.381037 master-0 kubenswrapper[18592]: I0308 04:12:48.365477 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e8d50dd-8695-40fd-b24f-110e4e3477bf-kube-api-access-tzkgc" (OuterVolumeSpecName: "kube-api-access-tzkgc") pod "8e8d50dd-8695-40fd-b24f-110e4e3477bf" (UID: "8e8d50dd-8695-40fd-b24f-110e4e3477bf"). InnerVolumeSpecName "kube-api-access-tzkgc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 04:12:48.381037 master-0 kubenswrapper[18592]: I0308 04:12:48.376320 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-rjv2t"] Mar 08 04:12:48.464189 master-0 kubenswrapper[18592]: I0308 04:12:48.464097 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzkgc\" (UniqueName: \"kubernetes.io/projected/8e8d50dd-8695-40fd-b24f-110e4e3477bf-kube-api-access-tzkgc\") on node \"master-0\" DevicePath \"\"" Mar 08 04:12:48.757427 master-0 kubenswrapper[18592]: I0308 04:12:48.757310 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-rjv2t" event={"ID":"30315ebc-2caa-4310-a7c0-338b80433779","Type":"ContainerStarted","Data":"963a20ef01c1028b93ac62cf9b8607299ee243f3b584ada18a5f42dfce255caa"} Mar 08 04:12:48.757427 master-0 kubenswrapper[18592]: I0308 04:12:48.757396 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-rjv2t" 
event={"ID":"30315ebc-2caa-4310-a7c0-338b80433779","Type":"ContainerStarted","Data":"59cb0bec12fdd9a24fb33bee1275e0191e14ff82f0124571dafd185dce3c1e9a"} Mar 08 04:12:48.760431 master-0 kubenswrapper[18592]: I0308 04:12:48.760380 18592 generic.go:334] "Generic (PLEG): container finished" podID="8e8d50dd-8695-40fd-b24f-110e4e3477bf" containerID="b757d1ef4250ac2306f777d96f4df983b31f1fd114fc47a4ce69c91f74afeb4e" exitCode=0 Mar 08 04:12:48.760561 master-0 kubenswrapper[18592]: I0308 04:12:48.760425 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-jtfgj" event={"ID":"8e8d50dd-8695-40fd-b24f-110e4e3477bf","Type":"ContainerDied","Data":"b757d1ef4250ac2306f777d96f4df983b31f1fd114fc47a4ce69c91f74afeb4e"} Mar 08 04:12:48.760561 master-0 kubenswrapper[18592]: I0308 04:12:48.760486 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-jtfgj" event={"ID":"8e8d50dd-8695-40fd-b24f-110e4e3477bf","Type":"ContainerDied","Data":"59ecffe6abe5b30ddf5dddff016d497642ab6772eb5bab5150ca17c3d6b2f9da"} Mar 08 04:12:48.760561 master-0 kubenswrapper[18592]: I0308 04:12:48.760506 18592 scope.go:117] "RemoveContainer" containerID="b757d1ef4250ac2306f777d96f4df983b31f1fd114fc47a4ce69c91f74afeb4e" Mar 08 04:12:48.760776 master-0 kubenswrapper[18592]: I0308 04:12:48.760442 18592 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-jtfgj" Mar 08 04:12:48.791379 master-0 kubenswrapper[18592]: I0308 04:12:48.791276 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-rjv2t" podStartSLOduration=1.738845742 podStartE2EDuration="1.791247115s" podCreationTimestamp="2026-03-08 04:12:47 +0000 UTC" firstStartedPulling="2026-03-08 04:12:48.38826938 +0000 UTC m=+1180.487023730" lastFinishedPulling="2026-03-08 04:12:48.440670752 +0000 UTC m=+1180.539425103" observedRunningTime="2026-03-08 04:12:48.784911027 +0000 UTC m=+1180.883665417" watchObservedRunningTime="2026-03-08 04:12:48.791247115 +0000 UTC m=+1180.890001475" Mar 08 04:12:48.793173 master-0 kubenswrapper[18592]: I0308 04:12:48.793117 18592 scope.go:117] "RemoveContainer" containerID="b757d1ef4250ac2306f777d96f4df983b31f1fd114fc47a4ce69c91f74afeb4e" Mar 08 04:12:48.793755 master-0 kubenswrapper[18592]: E0308 04:12:48.793615 18592 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b757d1ef4250ac2306f777d96f4df983b31f1fd114fc47a4ce69c91f74afeb4e\": container with ID starting with b757d1ef4250ac2306f777d96f4df983b31f1fd114fc47a4ce69c91f74afeb4e not found: ID does not exist" containerID="b757d1ef4250ac2306f777d96f4df983b31f1fd114fc47a4ce69c91f74afeb4e" Mar 08 04:12:48.793755 master-0 kubenswrapper[18592]: I0308 04:12:48.793714 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b757d1ef4250ac2306f777d96f4df983b31f1fd114fc47a4ce69c91f74afeb4e"} err="failed to get container status \"b757d1ef4250ac2306f777d96f4df983b31f1fd114fc47a4ce69c91f74afeb4e\": rpc error: code = NotFound desc = could not find container \"b757d1ef4250ac2306f777d96f4df983b31f1fd114fc47a4ce69c91f74afeb4e\": container with ID starting with b757d1ef4250ac2306f777d96f4df983b31f1fd114fc47a4ce69c91f74afeb4e not found: ID 
does not exist" Mar 08 04:12:48.823061 master-0 kubenswrapper[18592]: I0308 04:12:48.822981 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-jtfgj"] Mar 08 04:12:48.838442 master-0 kubenswrapper[18592]: I0308 04:12:48.838366 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-jtfgj"] Mar 08 04:12:50.160279 master-0 kubenswrapper[18592]: I0308 04:12:50.160178 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e8d50dd-8695-40fd-b24f-110e4e3477bf" path="/var/lib/kubelet/pods/8e8d50dd-8695-40fd-b24f-110e4e3477bf/volumes" Mar 08 04:12:57.804866 master-0 kubenswrapper[18592]: I0308 04:12:57.804742 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-rjv2t" Mar 08 04:12:57.806032 master-0 kubenswrapper[18592]: I0308 04:12:57.804884 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-rjv2t" Mar 08 04:12:57.843780 master-0 kubenswrapper[18592]: I0308 04:12:57.843712 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-rjv2t" Mar 08 04:12:57.913433 master-0 kubenswrapper[18592]: I0308 04:12:57.913345 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-rjv2t" Mar 08 04:12:58.959336 master-0 kubenswrapper[18592]: I0308 04:12:58.959219 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/084a9f42e05a600999ddea68e965db4422238b5f1e47c90a13e7441bc7j2tq7"] Mar 08 04:12:58.960229 master-0 kubenswrapper[18592]: E0308 04:12:58.959779 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e8d50dd-8695-40fd-b24f-110e4e3477bf" containerName="registry-server" Mar 08 04:12:58.960229 master-0 kubenswrapper[18592]: I0308 
04:12:58.959803 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e8d50dd-8695-40fd-b24f-110e4e3477bf" containerName="registry-server" Mar 08 04:12:58.960229 master-0 kubenswrapper[18592]: I0308 04:12:58.960131 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e8d50dd-8695-40fd-b24f-110e4e3477bf" containerName="registry-server" Mar 08 04:12:58.962022 master-0 kubenswrapper[18592]: I0308 04:12:58.961980 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/084a9f42e05a600999ddea68e965db4422238b5f1e47c90a13e7441bc7j2tq7" Mar 08 04:12:58.995953 master-0 kubenswrapper[18592]: I0308 04:12:58.995328 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/084a9f42e05a600999ddea68e965db4422238b5f1e47c90a13e7441bc7j2tq7"] Mar 08 04:12:59.094336 master-0 kubenswrapper[18592]: I0308 04:12:59.094239 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1e476107-935e-4d00-9290-a8e242341c5a-bundle\") pod \"084a9f42e05a600999ddea68e965db4422238b5f1e47c90a13e7441bc7j2tq7\" (UID: \"1e476107-935e-4d00-9290-a8e242341c5a\") " pod="openstack-operators/084a9f42e05a600999ddea68e965db4422238b5f1e47c90a13e7441bc7j2tq7" Mar 08 04:12:59.094727 master-0 kubenswrapper[18592]: I0308 04:12:59.094690 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4t5b\" (UniqueName: \"kubernetes.io/projected/1e476107-935e-4d00-9290-a8e242341c5a-kube-api-access-b4t5b\") pod \"084a9f42e05a600999ddea68e965db4422238b5f1e47c90a13e7441bc7j2tq7\" (UID: \"1e476107-935e-4d00-9290-a8e242341c5a\") " pod="openstack-operators/084a9f42e05a600999ddea68e965db4422238b5f1e47c90a13e7441bc7j2tq7" Mar 08 04:12:59.094952 master-0 kubenswrapper[18592]: I0308 04:12:59.094929 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1e476107-935e-4d00-9290-a8e242341c5a-util\") pod \"084a9f42e05a600999ddea68e965db4422238b5f1e47c90a13e7441bc7j2tq7\" (UID: \"1e476107-935e-4d00-9290-a8e242341c5a\") " pod="openstack-operators/084a9f42e05a600999ddea68e965db4422238b5f1e47c90a13e7441bc7j2tq7" Mar 08 04:12:59.197007 master-0 kubenswrapper[18592]: I0308 04:12:59.196946 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1e476107-935e-4d00-9290-a8e242341c5a-bundle\") pod \"084a9f42e05a600999ddea68e965db4422238b5f1e47c90a13e7441bc7j2tq7\" (UID: \"1e476107-935e-4d00-9290-a8e242341c5a\") " pod="openstack-operators/084a9f42e05a600999ddea68e965db4422238b5f1e47c90a13e7441bc7j2tq7" Mar 08 04:12:59.197222 master-0 kubenswrapper[18592]: I0308 04:12:59.197088 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4t5b\" (UniqueName: \"kubernetes.io/projected/1e476107-935e-4d00-9290-a8e242341c5a-kube-api-access-b4t5b\") pod \"084a9f42e05a600999ddea68e965db4422238b5f1e47c90a13e7441bc7j2tq7\" (UID: \"1e476107-935e-4d00-9290-a8e242341c5a\") " pod="openstack-operators/084a9f42e05a600999ddea68e965db4422238b5f1e47c90a13e7441bc7j2tq7" Mar 08 04:12:59.197222 master-0 kubenswrapper[18592]: I0308 04:12:59.197170 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1e476107-935e-4d00-9290-a8e242341c5a-util\") pod \"084a9f42e05a600999ddea68e965db4422238b5f1e47c90a13e7441bc7j2tq7\" (UID: \"1e476107-935e-4d00-9290-a8e242341c5a\") " pod="openstack-operators/084a9f42e05a600999ddea68e965db4422238b5f1e47c90a13e7441bc7j2tq7" Mar 08 04:12:59.197985 master-0 kubenswrapper[18592]: I0308 04:12:59.197924 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1e476107-935e-4d00-9290-a8e242341c5a-bundle\") pod 
\"084a9f42e05a600999ddea68e965db4422238b5f1e47c90a13e7441bc7j2tq7\" (UID: \"1e476107-935e-4d00-9290-a8e242341c5a\") " pod="openstack-operators/084a9f42e05a600999ddea68e965db4422238b5f1e47c90a13e7441bc7j2tq7"
Mar 08 04:12:59.198068 master-0 kubenswrapper[18592]: I0308 04:12:59.197983 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1e476107-935e-4d00-9290-a8e242341c5a-util\") pod \"084a9f42e05a600999ddea68e965db4422238b5f1e47c90a13e7441bc7j2tq7\" (UID: \"1e476107-935e-4d00-9290-a8e242341c5a\") " pod="openstack-operators/084a9f42e05a600999ddea68e965db4422238b5f1e47c90a13e7441bc7j2tq7"
Mar 08 04:12:59.222037 master-0 kubenswrapper[18592]: I0308 04:12:59.221931 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4t5b\" (UniqueName: \"kubernetes.io/projected/1e476107-935e-4d00-9290-a8e242341c5a-kube-api-access-b4t5b\") pod \"084a9f42e05a600999ddea68e965db4422238b5f1e47c90a13e7441bc7j2tq7\" (UID: \"1e476107-935e-4d00-9290-a8e242341c5a\") " pod="openstack-operators/084a9f42e05a600999ddea68e965db4422238b5f1e47c90a13e7441bc7j2tq7"
Mar 08 04:12:59.291576 master-0 kubenswrapper[18592]: I0308 04:12:59.291495 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/084a9f42e05a600999ddea68e965db4422238b5f1e47c90a13e7441bc7j2tq7"
Mar 08 04:12:59.837972 master-0 kubenswrapper[18592]: I0308 04:12:59.833518 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/084a9f42e05a600999ddea68e965db4422238b5f1e47c90a13e7441bc7j2tq7"]
Mar 08 04:12:59.904022 master-0 kubenswrapper[18592]: I0308 04:12:59.903508 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/084a9f42e05a600999ddea68e965db4422238b5f1e47c90a13e7441bc7j2tq7" event={"ID":"1e476107-935e-4d00-9290-a8e242341c5a","Type":"ContainerStarted","Data":"70ba18070c8e054cb72be7b871f980b61b9541686db6bb1582a6f04f97b568f9"}
Mar 08 04:13:00.939753 master-0 kubenswrapper[18592]: I0308 04:13:00.939610 18592 generic.go:334] "Generic (PLEG): container finished" podID="1e476107-935e-4d00-9290-a8e242341c5a" containerID="9d36a53fc31e9b760622a8063cfc7422b0a1640a71270d744ef96204d363dbf7" exitCode=0
Mar 08 04:13:00.941091 master-0 kubenswrapper[18592]: I0308 04:13:00.939719 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/084a9f42e05a600999ddea68e965db4422238b5f1e47c90a13e7441bc7j2tq7" event={"ID":"1e476107-935e-4d00-9290-a8e242341c5a","Type":"ContainerDied","Data":"9d36a53fc31e9b760622a8063cfc7422b0a1640a71270d744ef96204d363dbf7"}
Mar 08 04:13:01.956503 master-0 kubenswrapper[18592]: I0308 04:13:01.956370 18592 generic.go:334] "Generic (PLEG): container finished" podID="1e476107-935e-4d00-9290-a8e242341c5a" containerID="50cf288baf65310fc6d8fd29fe98cc384df2df3e538cecb8704a59ecb3dcdca5" exitCode=0
Mar 08 04:13:01.957317 master-0 kubenswrapper[18592]: I0308 04:13:01.956476 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/084a9f42e05a600999ddea68e965db4422238b5f1e47c90a13e7441bc7j2tq7" event={"ID":"1e476107-935e-4d00-9290-a8e242341c5a","Type":"ContainerDied","Data":"50cf288baf65310fc6d8fd29fe98cc384df2df3e538cecb8704a59ecb3dcdca5"}
Mar 08 04:13:02.970418 master-0 kubenswrapper[18592]: I0308 04:13:02.970324 18592 generic.go:334] "Generic (PLEG): container finished" podID="1e476107-935e-4d00-9290-a8e242341c5a" containerID="5d0ac6e0ee7b279cadf1fce095b597a983dc1fe59a6de59dbc3aeecbac95821b" exitCode=0
Mar 08 04:13:02.971242 master-0 kubenswrapper[18592]: I0308 04:13:02.970434 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/084a9f42e05a600999ddea68e965db4422238b5f1e47c90a13e7441bc7j2tq7" event={"ID":"1e476107-935e-4d00-9290-a8e242341c5a","Type":"ContainerDied","Data":"5d0ac6e0ee7b279cadf1fce095b597a983dc1fe59a6de59dbc3aeecbac95821b"}
Mar 08 04:13:04.541865 master-0 kubenswrapper[18592]: I0308 04:13:04.541787 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/084a9f42e05a600999ddea68e965db4422238b5f1e47c90a13e7441bc7j2tq7"
Mar 08 04:13:04.607723 master-0 kubenswrapper[18592]: I0308 04:13:04.607637 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1e476107-935e-4d00-9290-a8e242341c5a-bundle\") pod \"1e476107-935e-4d00-9290-a8e242341c5a\" (UID: \"1e476107-935e-4d00-9290-a8e242341c5a\") "
Mar 08 04:13:04.607986 master-0 kubenswrapper[18592]: I0308 04:13:04.607792 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1e476107-935e-4d00-9290-a8e242341c5a-util\") pod \"1e476107-935e-4d00-9290-a8e242341c5a\" (UID: \"1e476107-935e-4d00-9290-a8e242341c5a\") "
Mar 08 04:13:04.607986 master-0 kubenswrapper[18592]: I0308 04:13:04.607945 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b4t5b\" (UniqueName: \"kubernetes.io/projected/1e476107-935e-4d00-9290-a8e242341c5a-kube-api-access-b4t5b\") pod \"1e476107-935e-4d00-9290-a8e242341c5a\" (UID: \"1e476107-935e-4d00-9290-a8e242341c5a\") "
Mar 08 04:13:04.609532 master-0 kubenswrapper[18592]: I0308 04:13:04.609464 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e476107-935e-4d00-9290-a8e242341c5a-bundle" (OuterVolumeSpecName: "bundle") pod "1e476107-935e-4d00-9290-a8e242341c5a" (UID: "1e476107-935e-4d00-9290-a8e242341c5a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 04:13:04.612776 master-0 kubenswrapper[18592]: I0308 04:13:04.612707 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e476107-935e-4d00-9290-a8e242341c5a-kube-api-access-b4t5b" (OuterVolumeSpecName: "kube-api-access-b4t5b") pod "1e476107-935e-4d00-9290-a8e242341c5a" (UID: "1e476107-935e-4d00-9290-a8e242341c5a"). InnerVolumeSpecName "kube-api-access-b4t5b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 04:13:04.629613 master-0 kubenswrapper[18592]: I0308 04:13:04.629531 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e476107-935e-4d00-9290-a8e242341c5a-util" (OuterVolumeSpecName: "util") pod "1e476107-935e-4d00-9290-a8e242341c5a" (UID: "1e476107-935e-4d00-9290-a8e242341c5a"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 04:13:04.711640 master-0 kubenswrapper[18592]: I0308 04:13:04.711486 18592 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1e476107-935e-4d00-9290-a8e242341c5a-bundle\") on node \"master-0\" DevicePath \"\""
Mar 08 04:13:04.711884 master-0 kubenswrapper[18592]: I0308 04:13:04.711789 18592 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1e476107-935e-4d00-9290-a8e242341c5a-util\") on node \"master-0\" DevicePath \"\""
Mar 08 04:13:04.711884 master-0 kubenswrapper[18592]: I0308 04:13:04.711816 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b4t5b\" (UniqueName: \"kubernetes.io/projected/1e476107-935e-4d00-9290-a8e242341c5a-kube-api-access-b4t5b\") on node \"master-0\" DevicePath \"\""
Mar 08 04:13:05.000336 master-0 kubenswrapper[18592]: I0308 04:13:05.000148 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/084a9f42e05a600999ddea68e965db4422238b5f1e47c90a13e7441bc7j2tq7" event={"ID":"1e476107-935e-4d00-9290-a8e242341c5a","Type":"ContainerDied","Data":"70ba18070c8e054cb72be7b871f980b61b9541686db6bb1582a6f04f97b568f9"}
Mar 08 04:13:05.000336 master-0 kubenswrapper[18592]: I0308 04:13:05.000214 18592 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70ba18070c8e054cb72be7b871f980b61b9541686db6bb1582a6f04f97b568f9"
Mar 08 04:13:05.000336 master-0 kubenswrapper[18592]: I0308 04:13:05.000294 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/084a9f42e05a600999ddea68e965db4422238b5f1e47c90a13e7441bc7j2tq7"
Mar 08 04:13:11.237266 master-0 kubenswrapper[18592]: I0308 04:13:11.237153 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-7b5776d56c-lf2qx"]
Mar 08 04:13:11.238210 master-0 kubenswrapper[18592]: E0308 04:13:11.237486 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e476107-935e-4d00-9290-a8e242341c5a" containerName="extract"
Mar 08 04:13:11.238210 master-0 kubenswrapper[18592]: I0308 04:13:11.237500 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e476107-935e-4d00-9290-a8e242341c5a" containerName="extract"
Mar 08 04:13:11.238210 master-0 kubenswrapper[18592]: E0308 04:13:11.237518 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e476107-935e-4d00-9290-a8e242341c5a" containerName="util"
Mar 08 04:13:11.238210 master-0 kubenswrapper[18592]: I0308 04:13:11.237524 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e476107-935e-4d00-9290-a8e242341c5a" containerName="util"
Mar 08 04:13:11.238210 master-0 kubenswrapper[18592]: E0308 04:13:11.237545 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e476107-935e-4d00-9290-a8e242341c5a" containerName="pull"
Mar 08 04:13:11.238210 master-0 kubenswrapper[18592]: I0308 04:13:11.237552 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e476107-935e-4d00-9290-a8e242341c5a" containerName="pull"
Mar 08 04:13:11.238210 master-0 kubenswrapper[18592]: I0308 04:13:11.237712 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e476107-935e-4d00-9290-a8e242341c5a" containerName="extract"
Mar 08 04:13:11.238751 master-0 kubenswrapper[18592]: I0308 04:13:11.238250 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-7b5776d56c-lf2qx"
Mar 08 04:13:11.270737 master-0 kubenswrapper[18592]: I0308 04:13:11.270658 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-7b5776d56c-lf2qx"]
Mar 08 04:13:11.338619 master-0 kubenswrapper[18592]: I0308 04:13:11.338581 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wrss\" (UniqueName: \"kubernetes.io/projected/00ae769f-71d3-44bb-8454-798e4bafa365-kube-api-access-5wrss\") pod \"openstack-operator-controller-init-7b5776d56c-lf2qx\" (UID: \"00ae769f-71d3-44bb-8454-798e4bafa365\") " pod="openstack-operators/openstack-operator-controller-init-7b5776d56c-lf2qx"
Mar 08 04:13:11.441044 master-0 kubenswrapper[18592]: I0308 04:13:11.440970 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wrss\" (UniqueName: \"kubernetes.io/projected/00ae769f-71d3-44bb-8454-798e4bafa365-kube-api-access-5wrss\") pod \"openstack-operator-controller-init-7b5776d56c-lf2qx\" (UID: \"00ae769f-71d3-44bb-8454-798e4bafa365\") " pod="openstack-operators/openstack-operator-controller-init-7b5776d56c-lf2qx"
Mar 08 04:13:11.458006 master-0 kubenswrapper[18592]: I0308 04:13:11.457950 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wrss\" (UniqueName: \"kubernetes.io/projected/00ae769f-71d3-44bb-8454-798e4bafa365-kube-api-access-5wrss\") pod \"openstack-operator-controller-init-7b5776d56c-lf2qx\" (UID: \"00ae769f-71d3-44bb-8454-798e4bafa365\") " pod="openstack-operators/openstack-operator-controller-init-7b5776d56c-lf2qx"
Mar 08 04:13:11.555276 master-0 kubenswrapper[18592]: I0308 04:13:11.555147 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-7b5776d56c-lf2qx"
Mar 08 04:13:12.037289 master-0 kubenswrapper[18592]: I0308 04:13:12.037203 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-7b5776d56c-lf2qx"]
Mar 08 04:13:12.078075 master-0 kubenswrapper[18592]: I0308 04:13:12.078011 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-7b5776d56c-lf2qx" event={"ID":"00ae769f-71d3-44bb-8454-798e4bafa365","Type":"ContainerStarted","Data":"55a82662b2c4be4dc05e5cd6227d62bcc9e8159e8857837e59ab088dfc80e1c7"}
Mar 08 04:13:17.121948 master-0 kubenswrapper[18592]: I0308 04:13:17.121808 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-7b5776d56c-lf2qx" event={"ID":"00ae769f-71d3-44bb-8454-798e4bafa365","Type":"ContainerStarted","Data":"896f9ce6b3b92d59ca6245ae7d7252db6b9eedb5cdfb9c7550e86d3b7bcf77f3"}
Mar 08 04:13:17.121948 master-0 kubenswrapper[18592]: I0308 04:13:17.121903 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-7b5776d56c-lf2qx"
Mar 08 04:13:17.164092 master-0 kubenswrapper[18592]: I0308 04:13:17.163996 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-7b5776d56c-lf2qx" podStartSLOduration=1.842046477 podStartE2EDuration="6.163976783s" podCreationTimestamp="2026-03-08 04:13:11 +0000 UTC" firstStartedPulling="2026-03-08 04:13:12.03842962 +0000 UTC m=+1204.137183980" lastFinishedPulling="2026-03-08 04:13:16.360359936 +0000 UTC m=+1208.459114286" observedRunningTime="2026-03-08 04:13:17.163511181 +0000 UTC m=+1209.262265541" watchObservedRunningTime="2026-03-08 04:13:17.163976783 +0000 UTC m=+1209.262731153"
Mar 08 04:13:21.559384 master-0 kubenswrapper[18592]: I0308 04:13:21.559306 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-7b5776d56c-lf2qx"
Mar 08 04:13:42.344490 master-0 kubenswrapper[18592]: I0308 04:13:42.344402 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6db6876945-j8xwt"]
Mar 08 04:13:42.346224 master-0 kubenswrapper[18592]: I0308 04:13:42.345750 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-j8xwt"
Mar 08 04:13:42.400841 master-0 kubenswrapper[18592]: I0308 04:13:42.398384 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-hq9nx"]
Mar 08 04:13:42.405901 master-0 kubenswrapper[18592]: I0308 04:13:42.402535 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6db6876945-j8xwt"]
Mar 08 04:13:42.405901 master-0 kubenswrapper[18592]: I0308 04:13:42.402640 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-hq9nx"
Mar 08 04:13:42.422353 master-0 kubenswrapper[18592]: I0308 04:13:42.421114 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-hq9nx"]
Mar 08 04:13:42.452721 master-0 kubenswrapper[18592]: I0308 04:13:42.448249 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bn9c5\" (UniqueName: \"kubernetes.io/projected/73cadee4-f498-45a6-93be-258e903c9a53-kube-api-access-bn9c5\") pod \"barbican-operator-controller-manager-6db6876945-j8xwt\" (UID: \"73cadee4-f498-45a6-93be-258e903c9a53\") " pod="openstack-operators/barbican-operator-controller-manager-6db6876945-j8xwt"
Mar 08 04:13:42.452721 master-0 kubenswrapper[18592]: I0308 04:13:42.451455 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-5d87c9d997-jtbw5"]
Mar 08 04:13:42.452721 master-0 kubenswrapper[18592]: I0308 04:13:42.452460 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-jtbw5"
Mar 08 04:13:42.532868 master-0 kubenswrapper[18592]: I0308 04:13:42.525597 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-5d87c9d997-jtbw5"]
Mar 08 04:13:42.551488 master-0 kubenswrapper[18592]: I0308 04:13:42.549712 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-64db6967f8-nmjrc"]
Mar 08 04:13:42.551488 master-0 kubenswrapper[18592]: I0308 04:13:42.550997 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-nmjrc"
Mar 08 04:13:42.556047 master-0 kubenswrapper[18592]: I0308 04:13:42.552236 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bn9c5\" (UniqueName: \"kubernetes.io/projected/73cadee4-f498-45a6-93be-258e903c9a53-kube-api-access-bn9c5\") pod \"barbican-operator-controller-manager-6db6876945-j8xwt\" (UID: \"73cadee4-f498-45a6-93be-258e903c9a53\") " pod="openstack-operators/barbican-operator-controller-manager-6db6876945-j8xwt"
Mar 08 04:13:42.556047 master-0 kubenswrapper[18592]: I0308 04:13:42.552356 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcxvc\" (UniqueName: \"kubernetes.io/projected/eaa663c3-d979-4bf2-aa45-def619616016-kube-api-access-vcxvc\") pod \"designate-operator-controller-manager-5d87c9d997-jtbw5\" (UID: \"eaa663c3-d979-4bf2-aa45-def619616016\") " pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-jtbw5"
Mar 08 04:13:42.556047 master-0 kubenswrapper[18592]: I0308 04:13:42.552382 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4g59l\" (UniqueName: \"kubernetes.io/projected/ee358d70-7ec7-4e97-8e8c-b9144c6f1bff-kube-api-access-4g59l\") pod \"cinder-operator-controller-manager-55d77d7b5c-hq9nx\" (UID: \"ee358d70-7ec7-4e97-8e8c-b9144c6f1bff\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-hq9nx"
Mar 08 04:13:42.571413 master-0 kubenswrapper[18592]: I0308 04:13:42.571310 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-64db6967f8-nmjrc"]
Mar 08 04:13:42.597259 master-0 kubenswrapper[18592]: I0308 04:13:42.596481 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-cf99c678f-6w8p8"]
Mar 08 04:13:42.598055 master-0 kubenswrapper[18592]: I0308 04:13:42.597519 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-6w8p8"
Mar 08 04:13:42.621879 master-0 kubenswrapper[18592]: I0308 04:13:42.621842 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bn9c5\" (UniqueName: \"kubernetes.io/projected/73cadee4-f498-45a6-93be-258e903c9a53-kube-api-access-bn9c5\") pod \"barbican-operator-controller-manager-6db6876945-j8xwt\" (UID: \"73cadee4-f498-45a6-93be-258e903c9a53\") " pod="openstack-operators/barbican-operator-controller-manager-6db6876945-j8xwt"
Mar 08 04:13:42.658077 master-0 kubenswrapper[18592]: I0308 04:13:42.657840 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkw8d\" (UniqueName: \"kubernetes.io/projected/7a78dbb5-7df9-47bc-b7b1-a0ea7d42f8c3-kube-api-access-bkw8d\") pod \"glance-operator-controller-manager-64db6967f8-nmjrc\" (UID: \"7a78dbb5-7df9-47bc-b7b1-a0ea7d42f8c3\") " pod="openstack-operators/glance-operator-controller-manager-64db6967f8-nmjrc"
Mar 08 04:13:42.658077 master-0 kubenswrapper[18592]: I0308 04:13:42.658068 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcxvc\" (UniqueName: \"kubernetes.io/projected/eaa663c3-d979-4bf2-aa45-def619616016-kube-api-access-vcxvc\") pod \"designate-operator-controller-manager-5d87c9d997-jtbw5\" (UID: \"eaa663c3-d979-4bf2-aa45-def619616016\") " pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-jtbw5"
Mar 08 04:13:42.658327 master-0 kubenswrapper[18592]: I0308 04:13:42.658092 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4g59l\" (UniqueName: \"kubernetes.io/projected/ee358d70-7ec7-4e97-8e8c-b9144c6f1bff-kube-api-access-4g59l\") pod \"cinder-operator-controller-manager-55d77d7b5c-hq9nx\" (UID: \"ee358d70-7ec7-4e97-8e8c-b9144c6f1bff\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-hq9nx"
Mar 08 04:13:42.684351 master-0 kubenswrapper[18592]: I0308 04:13:42.684135 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4g59l\" (UniqueName: \"kubernetes.io/projected/ee358d70-7ec7-4e97-8e8c-b9144c6f1bff-kube-api-access-4g59l\") pod \"cinder-operator-controller-manager-55d77d7b5c-hq9nx\" (UID: \"ee358d70-7ec7-4e97-8e8c-b9144c6f1bff\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-hq9nx"
Mar 08 04:13:42.694796 master-0 kubenswrapper[18592]: I0308 04:13:42.688206 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcxvc\" (UniqueName: \"kubernetes.io/projected/eaa663c3-d979-4bf2-aa45-def619616016-kube-api-access-vcxvc\") pod \"designate-operator-controller-manager-5d87c9d997-jtbw5\" (UID: \"eaa663c3-d979-4bf2-aa45-def619616016\") " pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-jtbw5"
Mar 08 04:13:42.715376 master-0 kubenswrapper[18592]: I0308 04:13:42.712464 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-j8xwt"
Mar 08 04:13:42.800875 master-0 kubenswrapper[18592]: I0308 04:13:42.796691 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hj8m8\" (UniqueName: \"kubernetes.io/projected/2fa88bb5-6717-4aeb-b4b5-fb838c711a9a-kube-api-access-hj8m8\") pod \"heat-operator-controller-manager-cf99c678f-6w8p8\" (UID: \"2fa88bb5-6717-4aeb-b4b5-fb838c711a9a\") " pod="openstack-operators/heat-operator-controller-manager-cf99c678f-6w8p8"
Mar 08 04:13:42.800875 master-0 kubenswrapper[18592]: I0308 04:13:42.796854 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkw8d\" (UniqueName: \"kubernetes.io/projected/7a78dbb5-7df9-47bc-b7b1-a0ea7d42f8c3-kube-api-access-bkw8d\") pod \"glance-operator-controller-manager-64db6967f8-nmjrc\" (UID: \"7a78dbb5-7df9-47bc-b7b1-a0ea7d42f8c3\") " pod="openstack-operators/glance-operator-controller-manager-64db6967f8-nmjrc"
Mar 08 04:13:42.886363 master-0 kubenswrapper[18592]: I0308 04:13:42.806103 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-jtbw5"
Mar 08 04:13:42.886363 master-0 kubenswrapper[18592]: I0308 04:13:42.806933 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-hq9nx"
Mar 08 04:13:42.886363 master-0 kubenswrapper[18592]: I0308 04:13:42.863842 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkw8d\" (UniqueName: \"kubernetes.io/projected/7a78dbb5-7df9-47bc-b7b1-a0ea7d42f8c3-kube-api-access-bkw8d\") pod \"glance-operator-controller-manager-64db6967f8-nmjrc\" (UID: \"7a78dbb5-7df9-47bc-b7b1-a0ea7d42f8c3\") " pod="openstack-operators/glance-operator-controller-manager-64db6967f8-nmjrc"
Mar 08 04:13:42.899132 master-0 kubenswrapper[18592]: I0308 04:13:42.898115 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-nmjrc"
Mar 08 04:13:42.900599 master-0 kubenswrapper[18592]: I0308 04:13:42.900178 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hj8m8\" (UniqueName: \"kubernetes.io/projected/2fa88bb5-6717-4aeb-b4b5-fb838c711a9a-kube-api-access-hj8m8\") pod \"heat-operator-controller-manager-cf99c678f-6w8p8\" (UID: \"2fa88bb5-6717-4aeb-b4b5-fb838c711a9a\") " pod="openstack-operators/heat-operator-controller-manager-cf99c678f-6w8p8"
Mar 08 04:13:42.960845 master-0 kubenswrapper[18592]: I0308 04:13:42.959876 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-cf99c678f-6w8p8"]
Mar 08 04:13:42.979134 master-0 kubenswrapper[18592]: I0308 04:13:42.972567 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hj8m8\" (UniqueName: \"kubernetes.io/projected/2fa88bb5-6717-4aeb-b4b5-fb838c711a9a-kube-api-access-hj8m8\") pod \"heat-operator-controller-manager-cf99c678f-6w8p8\" (UID: \"2fa88bb5-6717-4aeb-b4b5-fb838c711a9a\") " pod="openstack-operators/heat-operator-controller-manager-cf99c678f-6w8p8"
Mar 08 04:13:43.089742 master-0 kubenswrapper[18592]: I0308 04:13:43.089681 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-27xxz"]
Mar 08 04:13:43.163206 master-0 kubenswrapper[18592]: I0308 04:13:43.163175 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-b8c8d7cc8-7vpk7"]
Mar 08 04:13:43.165417 master-0 kubenswrapper[18592]: I0308 04:13:43.165379 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-27xxz"
Mar 08 04:13:43.177936 master-0 kubenswrapper[18592]: I0308 04:13:43.171597 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-b8c8d7cc8-7vpk7"
Mar 08 04:13:43.177936 master-0 kubenswrapper[18592]: I0308 04:13:43.172581 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-27xxz"]
Mar 08 04:13:43.180034 master-0 kubenswrapper[18592]: I0308 04:13:43.179722 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert"
Mar 08 04:13:43.275090 master-0 kubenswrapper[18592]: I0308 04:13:43.251393 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-6w8p8"
Mar 08 04:13:43.275090 master-0 kubenswrapper[18592]: I0308 04:13:43.270742 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-585d849c57-sh9z8"]
Mar 08 04:13:43.275090 master-0 kubenswrapper[18592]: I0308 04:13:43.271742 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-585d849c57-sh9z8"
Mar 08 04:13:43.275090 master-0 kubenswrapper[18592]: I0308 04:13:43.273360 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-b8c8d7cc8-7vpk7"]
Mar 08 04:13:43.276646 master-0 kubenswrapper[18592]: I0308 04:13:43.276565 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/42c8b424-b12a-48b3-94e7-22ba1d126949-cert\") pod \"infra-operator-controller-manager-b8c8d7cc8-7vpk7\" (UID: \"42c8b424-b12a-48b3-94e7-22ba1d126949\") " pod="openstack-operators/infra-operator-controller-manager-b8c8d7cc8-7vpk7"
Mar 08 04:13:43.277571 master-0 kubenswrapper[18592]: I0308 04:13:43.276656 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6bxq\" (UniqueName: \"kubernetes.io/projected/f20011d2-eebe-44f5-a184-a79992c664ff-kube-api-access-w6bxq\") pod \"ironic-operator-controller-manager-585d849c57-sh9z8\" (UID: \"f20011d2-eebe-44f5-a184-a79992c664ff\") " pod="openstack-operators/ironic-operator-controller-manager-585d849c57-sh9z8"
Mar 08 04:13:43.277571 master-0 kubenswrapper[18592]: I0308 04:13:43.276676 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jxn6\" (UniqueName: \"kubernetes.io/projected/42c8b424-b12a-48b3-94e7-22ba1d126949-kube-api-access-2jxn6\") pod \"infra-operator-controller-manager-b8c8d7cc8-7vpk7\" (UID: \"42c8b424-b12a-48b3-94e7-22ba1d126949\") " pod="openstack-operators/infra-operator-controller-manager-b8c8d7cc8-7vpk7"
Mar 08 04:13:43.277571 master-0 kubenswrapper[18592]: I0308 04:13:43.276712 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46cwt\" (UniqueName: \"kubernetes.io/projected/64afb73a-1b22-4506-98d4-98d7ddf38a93-kube-api-access-46cwt\") pod \"horizon-operator-controller-manager-78bc7f9bd9-27xxz\" (UID: \"64afb73a-1b22-4506-98d4-98d7ddf38a93\") " pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-27xxz"
Mar 08 04:13:43.314776 master-0 kubenswrapper[18592]: I0308 04:13:43.294905 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-585d849c57-sh9z8"]
Mar 08 04:13:43.320355 master-0 kubenswrapper[18592]: I0308 04:13:43.317384 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7c789f89c6-wgmst"]
Mar 08 04:13:43.320355 master-0 kubenswrapper[18592]: I0308 04:13:43.318539 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wgmst"
Mar 08 04:13:43.350362 master-0 kubenswrapper[18592]: I0308 04:13:43.349687 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7c789f89c6-wgmst"]
Mar 08 04:13:43.361081 master-0 kubenswrapper[18592]: I0308 04:13:43.361030 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-cgc7b"]
Mar 08 04:13:43.363416 master-0 kubenswrapper[18592]: I0308 04:13:43.362132 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-67d996989d-cgc7b"
Mar 08 04:13:43.382891 master-0 kubenswrapper[18592]: I0308 04:13:43.378644 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6bxq\" (UniqueName: \"kubernetes.io/projected/f20011d2-eebe-44f5-a184-a79992c664ff-kube-api-access-w6bxq\") pod \"ironic-operator-controller-manager-585d849c57-sh9z8\" (UID: \"f20011d2-eebe-44f5-a184-a79992c664ff\") " pod="openstack-operators/ironic-operator-controller-manager-585d849c57-sh9z8"
Mar 08 04:13:43.382891 master-0 kubenswrapper[18592]: I0308 04:13:43.378718 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jxn6\" (UniqueName: \"kubernetes.io/projected/42c8b424-b12a-48b3-94e7-22ba1d126949-kube-api-access-2jxn6\") pod \"infra-operator-controller-manager-b8c8d7cc8-7vpk7\" (UID: \"42c8b424-b12a-48b3-94e7-22ba1d126949\") " pod="openstack-operators/infra-operator-controller-manager-b8c8d7cc8-7vpk7"
Mar 08 04:13:43.382891 master-0 kubenswrapper[18592]: I0308 04:13:43.378746 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46cwt\" (UniqueName: \"kubernetes.io/projected/64afb73a-1b22-4506-98d4-98d7ddf38a93-kube-api-access-46cwt\") pod \"horizon-operator-controller-manager-78bc7f9bd9-27xxz\" (UID: \"64afb73a-1b22-4506-98d4-98d7ddf38a93\") " pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-27xxz"
Mar 08 04:13:43.382891 master-0 kubenswrapper[18592]: I0308 04:13:43.378809 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgvxj\" (UniqueName: \"kubernetes.io/projected/9b73e51b-5898-4d03-b81e-977e96f47ae5-kube-api-access-wgvxj\") pod \"manila-operator-controller-manager-67d996989d-cgc7b\" (UID: \"9b73e51b-5898-4d03-b81e-977e96f47ae5\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-cgc7b"
Mar 08 04:13:43.382891 master-0 kubenswrapper[18592]: I0308 04:13:43.378900 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvm7d\" (UniqueName: \"kubernetes.io/projected/b0a44319-b717-43ff-9799-15d4191de116-kube-api-access-tvm7d\") pod \"keystone-operator-controller-manager-7c789f89c6-wgmst\" (UID: \"b0a44319-b717-43ff-9799-15d4191de116\") " pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wgmst"
Mar 08 04:13:43.382891 master-0 kubenswrapper[18592]: I0308 04:13:43.378960 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/42c8b424-b12a-48b3-94e7-22ba1d126949-cert\") pod \"infra-operator-controller-manager-b8c8d7cc8-7vpk7\" (UID: \"42c8b424-b12a-48b3-94e7-22ba1d126949\") " pod="openstack-operators/infra-operator-controller-manager-b8c8d7cc8-7vpk7"
Mar 08 04:13:43.382891 master-0 kubenswrapper[18592]: E0308 04:13:43.379112 18592 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Mar 08 04:13:43.382891 master-0 kubenswrapper[18592]: E0308 04:13:43.379179 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42c8b424-b12a-48b3-94e7-22ba1d126949-cert podName:42c8b424-b12a-48b3-94e7-22ba1d126949 nodeName:}" failed. No retries permitted until 2026-03-08 04:13:43.879158672 +0000 UTC m=+1235.977913022 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/42c8b424-b12a-48b3-94e7-22ba1d126949-cert") pod "infra-operator-controller-manager-b8c8d7cc8-7vpk7" (UID: "42c8b424-b12a-48b3-94e7-22ba1d126949") : secret "infra-operator-webhook-server-cert" not found
Mar 08 04:13:43.382891 master-0 kubenswrapper[18592]: I0308 04:13:43.379591 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-cgc7b"]
Mar 08 04:13:43.399062 master-0 kubenswrapper[18592]: W0308 04:13:43.399019 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73cadee4_f498_45a6_93be_258e903c9a53.slice/crio-b157f7668e686d9f0bfe95a11433bf3571c1ede209f43bb04da6dd304ba54b37 WatchSource:0}: Error finding container b157f7668e686d9f0bfe95a11433bf3571c1ede209f43bb04da6dd304ba54b37: Status 404 returned error can't find the container with id b157f7668e686d9f0bfe95a11433bf3571c1ede209f43bb04da6dd304ba54b37
Mar 08 04:13:43.399170 master-0 kubenswrapper[18592]: I0308 04:13:43.399075 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-rtchq"]
Mar 08 04:13:43.400219 master-0 kubenswrapper[18592]: I0308 04:13:43.400195 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-rtchq"
Mar 08 04:13:43.402981 master-0 kubenswrapper[18592]: I0308 04:13:43.402916 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6bxq\" (UniqueName: \"kubernetes.io/projected/f20011d2-eebe-44f5-a184-a79992c664ff-kube-api-access-w6bxq\") pod \"ironic-operator-controller-manager-585d849c57-sh9z8\" (UID: \"f20011d2-eebe-44f5-a184-a79992c664ff\") " pod="openstack-operators/ironic-operator-controller-manager-585d849c57-sh9z8"
Mar 08 04:13:43.404741 master-0 kubenswrapper[18592]: I0308 04:13:43.404691 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jxn6\" (UniqueName: \"kubernetes.io/projected/42c8b424-b12a-48b3-94e7-22ba1d126949-kube-api-access-2jxn6\") pod \"infra-operator-controller-manager-b8c8d7cc8-7vpk7\" (UID: \"42c8b424-b12a-48b3-94e7-22ba1d126949\") " pod="openstack-operators/infra-operator-controller-manager-b8c8d7cc8-7vpk7"
Mar 08 04:13:43.437923 master-0 kubenswrapper[18592]: I0308 04:13:43.424848 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-rtchq"]
Mar 08 04:13:43.437923 master-0 kubenswrapper[18592]: I0308 04:13:43.425752 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-j8xwt" event={"ID":"73cadee4-f498-45a6-93be-258e903c9a53","Type":"ContainerStarted","Data":"b157f7668e686d9f0bfe95a11433bf3571c1ede209f43bb04da6dd304ba54b37"}
Mar 08 04:13:43.439034 master-0 kubenswrapper[18592]: I0308 04:13:43.439008 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46cwt\" (UniqueName: \"kubernetes.io/projected/64afb73a-1b22-4506-98d4-98d7ddf38a93-kube-api-access-46cwt\") pod \"horizon-operator-controller-manager-78bc7f9bd9-27xxz\" (UID: \"64afb73a-1b22-4506-98d4-98d7ddf38a93\") " pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-27xxz"
Mar 08 04:13:43.445980 master-0 kubenswrapper[18592]: I0308 04:13:43.440745 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54688575f-ssj26"]
Mar 08 04:13:43.445980 master-0 kubenswrapper[18592]: I0308 04:13:43.441835 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-54688575f-ssj26"
Mar 08 04:13:43.458209 master-0 kubenswrapper[18592]: I0308 04:13:43.458182 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54688575f-ssj26"]
Mar 08 04:13:43.467260 master-0 kubenswrapper[18592]: I0308 04:13:43.467231 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-llvv7"]
Mar 08 04:13:43.468761 master-0 kubenswrapper[18592]: I0308 04:13:43.468738 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-llvv7"
Mar 08 04:13:43.476004 master-0 kubenswrapper[18592]: I0308 04:13:43.475732 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-74b6b5dc96-6jmjm"]
Mar 08 04:13:43.477814 master-0 kubenswrapper[18592]: I0308 04:13:43.476788 18592 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-6jmjm" Mar 08 04:13:43.481603 master-0 kubenswrapper[18592]: I0308 04:13:43.481152 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scfp7\" (UniqueName: \"kubernetes.io/projected/987df199-3a6e-40aa-8736-deadb057262a-kube-api-access-scfp7\") pod \"neutron-operator-controller-manager-54688575f-ssj26\" (UID: \"987df199-3a6e-40aa-8736-deadb057262a\") " pod="openstack-operators/neutron-operator-controller-manager-54688575f-ssj26" Mar 08 04:13:43.481603 master-0 kubenswrapper[18592]: I0308 04:13:43.481277 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgvxj\" (UniqueName: \"kubernetes.io/projected/9b73e51b-5898-4d03-b81e-977e96f47ae5-kube-api-access-wgvxj\") pod \"manila-operator-controller-manager-67d996989d-cgc7b\" (UID: \"9b73e51b-5898-4d03-b81e-977e96f47ae5\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-cgc7b" Mar 08 04:13:43.481603 master-0 kubenswrapper[18592]: I0308 04:13:43.481313 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7jdh\" (UniqueName: \"kubernetes.io/projected/97dbb8f0-af27-4e9a-ae6f-8311025a0168-kube-api-access-j7jdh\") pod \"octavia-operator-controller-manager-5d86c7ddb7-llvv7\" (UID: \"97dbb8f0-af27-4e9a-ae6f-8311025a0168\") " pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-llvv7" Mar 08 04:13:43.481603 master-0 kubenswrapper[18592]: I0308 04:13:43.481371 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkfhw\" (UniqueName: \"kubernetes.io/projected/64139e00-f7e0-4225-9ca7-0942e38b5455-kube-api-access-jkfhw\") pod \"mariadb-operator-controller-manager-7b6bfb6475-rtchq\" (UID: \"64139e00-f7e0-4225-9ca7-0942e38b5455\") " 
pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-rtchq" Mar 08 04:13:43.481603 master-0 kubenswrapper[18592]: I0308 04:13:43.481395 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvm7d\" (UniqueName: \"kubernetes.io/projected/b0a44319-b717-43ff-9799-15d4191de116-kube-api-access-tvm7d\") pod \"keystone-operator-controller-manager-7c789f89c6-wgmst\" (UID: \"b0a44319-b717-43ff-9799-15d4191de116\") " pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wgmst" Mar 08 04:13:43.484967 master-0 kubenswrapper[18592]: I0308 04:13:43.484926 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-74b6b5dc96-6jmjm"] Mar 08 04:13:43.506877 master-0 kubenswrapper[18592]: I0308 04:13:43.506807 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-llvv7"] Mar 08 04:13:43.516764 master-0 kubenswrapper[18592]: I0308 04:13:43.516727 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvm7d\" (UniqueName: \"kubernetes.io/projected/b0a44319-b717-43ff-9799-15d4191de116-kube-api-access-tvm7d\") pod \"keystone-operator-controller-manager-7c789f89c6-wgmst\" (UID: \"b0a44319-b717-43ff-9799-15d4191de116\") " pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wgmst" Mar 08 04:13:43.520619 master-0 kubenswrapper[18592]: I0308 04:13:43.516961 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgvxj\" (UniqueName: \"kubernetes.io/projected/9b73e51b-5898-4d03-b81e-977e96f47ae5-kube-api-access-wgvxj\") pod \"manila-operator-controller-manager-67d996989d-cgc7b\" (UID: \"9b73e51b-5898-4d03-b81e-977e96f47ae5\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-cgc7b" Mar 08 04:13:43.520619 master-0 kubenswrapper[18592]: I0308 04:13:43.518867 18592 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cf849j"] Mar 08 04:13:43.520619 master-0 kubenswrapper[18592]: I0308 04:13:43.520016 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cf849j" Mar 08 04:13:43.521976 master-0 kubenswrapper[18592]: I0308 04:13:43.521491 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 08 04:13:43.545866 master-0 kubenswrapper[18592]: I0308 04:13:43.544327 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-27xxz" Mar 08 04:13:43.556304 master-0 kubenswrapper[18592]: I0308 04:13:43.556044 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-75684d597f-mtlx8"] Mar 08 04:13:43.558857 master-0 kubenswrapper[18592]: I0308 04:13:43.557403 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-mtlx8" Mar 08 04:13:43.575318 master-0 kubenswrapper[18592]: I0308 04:13:43.575256 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-648564c9fc-lpjvx"] Mar 08 04:13:43.579884 master-0 kubenswrapper[18592]: I0308 04:13:43.579517 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-75684d597f-mtlx8"] Mar 08 04:13:43.579884 master-0 kubenswrapper[18592]: I0308 04:13:43.579583 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-lpjvx" Mar 08 04:13:43.583002 master-0 kubenswrapper[18592]: I0308 04:13:43.582869 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scfp7\" (UniqueName: \"kubernetes.io/projected/987df199-3a6e-40aa-8736-deadb057262a-kube-api-access-scfp7\") pod \"neutron-operator-controller-manager-54688575f-ssj26\" (UID: \"987df199-3a6e-40aa-8736-deadb057262a\") " pod="openstack-operators/neutron-operator-controller-manager-54688575f-ssj26" Mar 08 04:13:43.583002 master-0 kubenswrapper[18592]: I0308 04:13:43.582928 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7jdh\" (UniqueName: \"kubernetes.io/projected/97dbb8f0-af27-4e9a-ae6f-8311025a0168-kube-api-access-j7jdh\") pod \"octavia-operator-controller-manager-5d86c7ddb7-llvv7\" (UID: \"97dbb8f0-af27-4e9a-ae6f-8311025a0168\") " pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-llvv7" Mar 08 04:13:43.583002 master-0 kubenswrapper[18592]: I0308 04:13:43.582951 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqwfg\" (UniqueName: \"kubernetes.io/projected/393711cd-0c1e-4052-89b0-498e0e2a09ba-kube-api-access-xqwfg\") pod \"nova-operator-controller-manager-74b6b5dc96-6jmjm\" (UID: \"393711cd-0c1e-4052-89b0-498e0e2a09ba\") " pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-6jmjm" Mar 08 04:13:43.583158 master-0 kubenswrapper[18592]: I0308 04:13:43.583005 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkfhw\" (UniqueName: \"kubernetes.io/projected/64139e00-f7e0-4225-9ca7-0942e38b5455-kube-api-access-jkfhw\") pod \"mariadb-operator-controller-manager-7b6bfb6475-rtchq\" (UID: \"64139e00-f7e0-4225-9ca7-0942e38b5455\") " 
pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-rtchq" Mar 08 04:13:43.583158 master-0 kubenswrapper[18592]: I0308 04:13:43.583035 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/156ae5e7-80bd-4788-9fbe-94fb9a036161-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cf849j\" (UID: \"156ae5e7-80bd-4788-9fbe-94fb9a036161\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cf849j" Mar 08 04:13:43.583158 master-0 kubenswrapper[18592]: I0308 04:13:43.583065 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzvjw\" (UniqueName: \"kubernetes.io/projected/156ae5e7-80bd-4788-9fbe-94fb9a036161-kube-api-access-fzvjw\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cf849j\" (UID: \"156ae5e7-80bd-4788-9fbe-94fb9a036161\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cf849j" Mar 08 04:13:43.583158 master-0 kubenswrapper[18592]: I0308 04:13:43.583087 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85ml5\" (UniqueName: \"kubernetes.io/projected/2723e73b-52d4-4d1a-9778-2c14ffa390c2-kube-api-access-85ml5\") pod \"ovn-operator-controller-manager-75684d597f-mtlx8\" (UID: \"2723e73b-52d4-4d1a-9778-2c14ffa390c2\") " pod="openstack-operators/ovn-operator-controller-manager-75684d597f-mtlx8" Mar 08 04:13:43.604635 master-0 kubenswrapper[18592]: I0308 04:13:43.604590 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkfhw\" (UniqueName: \"kubernetes.io/projected/64139e00-f7e0-4225-9ca7-0942e38b5455-kube-api-access-jkfhw\") pod \"mariadb-operator-controller-manager-7b6bfb6475-rtchq\" (UID: \"64139e00-f7e0-4225-9ca7-0942e38b5455\") " 
pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-rtchq" Mar 08 04:13:43.605012 master-0 kubenswrapper[18592]: I0308 04:13:43.604967 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cf849j"] Mar 08 04:13:43.607271 master-0 kubenswrapper[18592]: I0308 04:13:43.607223 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7jdh\" (UniqueName: \"kubernetes.io/projected/97dbb8f0-af27-4e9a-ae6f-8311025a0168-kube-api-access-j7jdh\") pod \"octavia-operator-controller-manager-5d86c7ddb7-llvv7\" (UID: \"97dbb8f0-af27-4e9a-ae6f-8311025a0168\") " pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-llvv7" Mar 08 04:13:43.607607 master-0 kubenswrapper[18592]: I0308 04:13:43.607589 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scfp7\" (UniqueName: \"kubernetes.io/projected/987df199-3a6e-40aa-8736-deadb057262a-kube-api-access-scfp7\") pod \"neutron-operator-controller-manager-54688575f-ssj26\" (UID: \"987df199-3a6e-40aa-8736-deadb057262a\") " pod="openstack-operators/neutron-operator-controller-manager-54688575f-ssj26" Mar 08 04:13:43.622288 master-0 kubenswrapper[18592]: I0308 04:13:43.622257 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-585d849c57-sh9z8" Mar 08 04:13:43.625105 master-0 kubenswrapper[18592]: W0308 04:13:43.625065 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeaa663c3_d979_4bf2_aa45_def619616016.slice/crio-c56f5d2e68019472dc2ecf78ed2afa36424275236a15c8c382e181156fe0b67b WatchSource:0}: Error finding container c56f5d2e68019472dc2ecf78ed2afa36424275236a15c8c382e181156fe0b67b: Status 404 returned error can't find the container with id c56f5d2e68019472dc2ecf78ed2afa36424275236a15c8c382e181156fe0b67b Mar 08 04:13:43.636356 master-0 kubenswrapper[18592]: I0308 04:13:43.636032 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-648564c9fc-lpjvx"] Mar 08 04:13:43.648107 master-0 kubenswrapper[18592]: I0308 04:13:43.646180 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-9b9ff9f4d-c4wjd"] Mar 08 04:13:43.648107 master-0 kubenswrapper[18592]: I0308 04:13:43.647763 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-c4wjd" Mar 08 04:13:43.658552 master-0 kubenswrapper[18592]: I0308 04:13:43.658513 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5fdb694969-sbvfq"] Mar 08 04:13:43.660096 master-0 kubenswrapper[18592]: I0308 04:13:43.660066 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-sbvfq" Mar 08 04:13:43.673166 master-0 kubenswrapper[18592]: I0308 04:13:43.673115 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wgmst" Mar 08 04:13:43.674769 master-0 kubenswrapper[18592]: I0308 04:13:43.673796 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-9b9ff9f4d-c4wjd"] Mar 08 04:13:43.687845 master-0 kubenswrapper[18592]: I0308 04:13:43.687478 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7h746\" (UniqueName: \"kubernetes.io/projected/dfa2bccb-a39e-4939-bdd9-e6916672e53e-kube-api-access-7h746\") pod \"swift-operator-controller-manager-9b9ff9f4d-c4wjd\" (UID: \"dfa2bccb-a39e-4939-bdd9-e6916672e53e\") " pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-c4wjd" Mar 08 04:13:43.687845 master-0 kubenswrapper[18592]: I0308 04:13:43.687552 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnpk8\" (UniqueName: \"kubernetes.io/projected/50e962ee-4845-422a-948b-05e4f5b1097a-kube-api-access-vnpk8\") pod \"placement-operator-controller-manager-648564c9fc-lpjvx\" (UID: \"50e962ee-4845-422a-948b-05e4f5b1097a\") " pod="openstack-operators/placement-operator-controller-manager-648564c9fc-lpjvx" Mar 08 04:13:43.687845 master-0 kubenswrapper[18592]: I0308 04:13:43.687606 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqwfg\" (UniqueName: \"kubernetes.io/projected/393711cd-0c1e-4052-89b0-498e0e2a09ba-kube-api-access-xqwfg\") pod \"nova-operator-controller-manager-74b6b5dc96-6jmjm\" (UID: \"393711cd-0c1e-4052-89b0-498e0e2a09ba\") " pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-6jmjm" Mar 08 04:13:43.687845 master-0 kubenswrapper[18592]: I0308 04:13:43.687662 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/156ae5e7-80bd-4788-9fbe-94fb9a036161-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cf849j\" (UID: \"156ae5e7-80bd-4788-9fbe-94fb9a036161\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cf849j" Mar 08 04:13:43.687845 master-0 kubenswrapper[18592]: I0308 04:13:43.687725 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzvjw\" (UniqueName: \"kubernetes.io/projected/156ae5e7-80bd-4788-9fbe-94fb9a036161-kube-api-access-fzvjw\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cf849j\" (UID: \"156ae5e7-80bd-4788-9fbe-94fb9a036161\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cf849j" Mar 08 04:13:43.687845 master-0 kubenswrapper[18592]: I0308 04:13:43.687777 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85ml5\" (UniqueName: \"kubernetes.io/projected/2723e73b-52d4-4d1a-9778-2c14ffa390c2-kube-api-access-85ml5\") pod \"ovn-operator-controller-manager-75684d597f-mtlx8\" (UID: \"2723e73b-52d4-4d1a-9778-2c14ffa390c2\") " pod="openstack-operators/ovn-operator-controller-manager-75684d597f-mtlx8" Mar 08 04:13:43.688111 master-0 kubenswrapper[18592]: I0308 04:13:43.687937 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5fdb694969-sbvfq"] Mar 08 04:13:43.688111 master-0 kubenswrapper[18592]: I0308 04:13:43.687852 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fq4x4\" (UniqueName: \"kubernetes.io/projected/6f82adce-b4bf-4a3e-a31e-c3c33c669501-kube-api-access-fq4x4\") pod \"telemetry-operator-controller-manager-5fdb694969-sbvfq\" (UID: \"6f82adce-b4bf-4a3e-a31e-c3c33c669501\") " pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-sbvfq" Mar 08 04:13:43.688167 master-0 
kubenswrapper[18592]: E0308 04:13:43.688141 18592 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 08 04:13:43.688202 master-0 kubenswrapper[18592]: E0308 04:13:43.688184 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/156ae5e7-80bd-4788-9fbe-94fb9a036161-cert podName:156ae5e7-80bd-4788-9fbe-94fb9a036161 nodeName:}" failed. No retries permitted until 2026-03-08 04:13:44.18816867 +0000 UTC m=+1236.286923020 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/156ae5e7-80bd-4788-9fbe-94fb9a036161-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cf849j" (UID: "156ae5e7-80bd-4788-9fbe-94fb9a036161") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 08 04:13:43.690592 master-0 kubenswrapper[18592]: I0308 04:13:43.690556 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-67d996989d-cgc7b" Mar 08 04:13:43.696638 master-0 kubenswrapper[18592]: I0308 04:13:43.696596 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-55b5ff4dbb-xk2sd"] Mar 08 04:13:43.697745 master-0 kubenswrapper[18592]: I0308 04:13:43.697717 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-xk2sd" Mar 08 04:13:43.709033 master-0 kubenswrapper[18592]: I0308 04:13:43.706449 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-55b5ff4dbb-xk2sd"] Mar 08 04:13:43.714056 master-0 kubenswrapper[18592]: I0308 04:13:43.713984 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-rhdgk"] Mar 08 04:13:43.725919 master-0 kubenswrapper[18592]: I0308 04:13:43.715731 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-rhdgk" Mar 08 04:13:43.725919 master-0 kubenswrapper[18592]: I0308 04:13:43.723385 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-rhdgk"] Mar 08 04:13:43.729109 master-0 kubenswrapper[18592]: I0308 04:13:43.729052 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85ml5\" (UniqueName: \"kubernetes.io/projected/2723e73b-52d4-4d1a-9778-2c14ffa390c2-kube-api-access-85ml5\") pod \"ovn-operator-controller-manager-75684d597f-mtlx8\" (UID: \"2723e73b-52d4-4d1a-9778-2c14ffa390c2\") " pod="openstack-operators/ovn-operator-controller-manager-75684d597f-mtlx8" Mar 08 04:13:43.729707 master-0 kubenswrapper[18592]: I0308 04:13:43.729585 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzvjw\" (UniqueName: \"kubernetes.io/projected/156ae5e7-80bd-4788-9fbe-94fb9a036161-kube-api-access-fzvjw\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cf849j\" (UID: \"156ae5e7-80bd-4788-9fbe-94fb9a036161\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cf849j" Mar 08 04:13:43.736590 master-0 kubenswrapper[18592]: I0308 04:13:43.736418 18592 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqwfg\" (UniqueName: \"kubernetes.io/projected/393711cd-0c1e-4052-89b0-498e0e2a09ba-kube-api-access-xqwfg\") pod \"nova-operator-controller-manager-74b6b5dc96-6jmjm\" (UID: \"393711cd-0c1e-4052-89b0-498e0e2a09ba\") " pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-6jmjm" Mar 08 04:13:43.744871 master-0 kubenswrapper[18592]: I0308 04:13:43.740256 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-rtchq" Mar 08 04:13:43.744871 master-0 kubenswrapper[18592]: I0308 04:13:43.743336 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6bbb7bcd9-l7q5f"] Mar 08 04:13:43.760946 master-0 kubenswrapper[18592]: I0308 04:13:43.751468 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6bbb7bcd9-l7q5f" Mar 08 04:13:43.760946 master-0 kubenswrapper[18592]: I0308 04:13:43.758347 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 08 04:13:43.760946 master-0 kubenswrapper[18592]: I0308 04:13:43.758671 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 08 04:13:43.763533 master-0 kubenswrapper[18592]: I0308 04:13:43.762858 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-54688575f-ssj26" Mar 08 04:13:43.781995 master-0 kubenswrapper[18592]: I0308 04:13:43.781949 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6bbb7bcd9-l7q5f"] Mar 08 04:13:43.784593 master-0 kubenswrapper[18592]: I0308 04:13:43.784534 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-llvv7" Mar 08 04:13:43.792577 master-0 kubenswrapper[18592]: I0308 04:13:43.791787 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/09a67a89-922f-4e74-8621-27fbfb770f0b-metrics-certs\") pod \"openstack-operator-controller-manager-6bbb7bcd9-l7q5f\" (UID: \"09a67a89-922f-4e74-8621-27fbfb770f0b\") " pod="openstack-operators/openstack-operator-controller-manager-6bbb7bcd9-l7q5f" Mar 08 04:13:43.793961 master-0 kubenswrapper[18592]: I0308 04:13:43.793874 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fq4x4\" (UniqueName: \"kubernetes.io/projected/6f82adce-b4bf-4a3e-a31e-c3c33c669501-kube-api-access-fq4x4\") pod \"telemetry-operator-controller-manager-5fdb694969-sbvfq\" (UID: \"6f82adce-b4bf-4a3e-a31e-c3c33c669501\") " pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-sbvfq" Mar 08 04:13:43.794092 master-0 kubenswrapper[18592]: I0308 04:13:43.794034 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7h746\" (UniqueName: \"kubernetes.io/projected/dfa2bccb-a39e-4939-bdd9-e6916672e53e-kube-api-access-7h746\") pod \"swift-operator-controller-manager-9b9ff9f4d-c4wjd\" (UID: \"dfa2bccb-a39e-4939-bdd9-e6916672e53e\") " pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-c4wjd" Mar 08 04:13:43.794092 
master-0 kubenswrapper[18592]: I0308 04:13:43.794064 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/09a67a89-922f-4e74-8621-27fbfb770f0b-webhook-certs\") pod \"openstack-operator-controller-manager-6bbb7bcd9-l7q5f\" (UID: \"09a67a89-922f-4e74-8621-27fbfb770f0b\") " pod="openstack-operators/openstack-operator-controller-manager-6bbb7bcd9-l7q5f" Mar 08 04:13:43.794234 master-0 kubenswrapper[18592]: I0308 04:13:43.794146 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnpk8\" (UniqueName: \"kubernetes.io/projected/50e962ee-4845-422a-948b-05e4f5b1097a-kube-api-access-vnpk8\") pod \"placement-operator-controller-manager-648564c9fc-lpjvx\" (UID: \"50e962ee-4845-422a-948b-05e4f5b1097a\") " pod="openstack-operators/placement-operator-controller-manager-648564c9fc-lpjvx" Mar 08 04:13:43.794458 master-0 kubenswrapper[18592]: I0308 04:13:43.794214 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wt4cn\" (UniqueName: \"kubernetes.io/projected/5c5e5398-174a-4dfd-b5ad-27496731960b-kube-api-access-wt4cn\") pod \"test-operator-controller-manager-55b5ff4dbb-xk2sd\" (UID: \"5c5e5398-174a-4dfd-b5ad-27496731960b\") " pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-xk2sd" Mar 08 04:13:43.797854 master-0 kubenswrapper[18592]: I0308 04:13:43.797775 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vplh\" (UniqueName: \"kubernetes.io/projected/cbed35cc-1e50-495e-8e4e-5b3765218089-kube-api-access-5vplh\") pod \"watcher-operator-controller-manager-bccc79885-rhdgk\" (UID: \"cbed35cc-1e50-495e-8e4e-5b3765218089\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-rhdgk" Mar 08 04:13:43.798047 master-0 kubenswrapper[18592]: I0308 04:13:43.798030 18592 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-529tc\" (UniqueName: \"kubernetes.io/projected/09a67a89-922f-4e74-8621-27fbfb770f0b-kube-api-access-529tc\") pod \"openstack-operator-controller-manager-6bbb7bcd9-l7q5f\" (UID: \"09a67a89-922f-4e74-8621-27fbfb770f0b\") " pod="openstack-operators/openstack-operator-controller-manager-6bbb7bcd9-l7q5f" Mar 08 04:13:43.819763 master-0 kubenswrapper[18592]: I0308 04:13:43.806544 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-6jmjm" Mar 08 04:13:43.903668 master-0 kubenswrapper[18592]: I0308 04:13:43.903626 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-fr47r"] Mar 08 04:13:43.907677 master-0 kubenswrapper[18592]: I0308 04:13:43.907650 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-fr47r" Mar 08 04:13:43.926543 master-0 kubenswrapper[18592]: I0308 04:13:43.925822 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-mtlx8"
Mar 08 04:13:43.927504 master-0 kubenswrapper[18592]: I0308 04:13:43.927429 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/09a67a89-922f-4e74-8621-27fbfb770f0b-metrics-certs\") pod \"openstack-operator-controller-manager-6bbb7bcd9-l7q5f\" (UID: \"09a67a89-922f-4e74-8621-27fbfb770f0b\") " pod="openstack-operators/openstack-operator-controller-manager-6bbb7bcd9-l7q5f"
Mar 08 04:13:43.927591 master-0 kubenswrapper[18592]: I0308 04:13:43.927521 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/42c8b424-b12a-48b3-94e7-22ba1d126949-cert\") pod \"infra-operator-controller-manager-b8c8d7cc8-7vpk7\" (UID: \"42c8b424-b12a-48b3-94e7-22ba1d126949\") " pod="openstack-operators/infra-operator-controller-manager-b8c8d7cc8-7vpk7"
Mar 08 04:13:43.927639 master-0 kubenswrapper[18592]: I0308 04:13:43.927618 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/09a67a89-922f-4e74-8621-27fbfb770f0b-webhook-certs\") pod \"openstack-operator-controller-manager-6bbb7bcd9-l7q5f\" (UID: \"09a67a89-922f-4e74-8621-27fbfb770f0b\") " pod="openstack-operators/openstack-operator-controller-manager-6bbb7bcd9-l7q5f"
Mar 08 04:13:43.927749 master-0 kubenswrapper[18592]: I0308 04:13:43.927714 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wt4cn\" (UniqueName: \"kubernetes.io/projected/5c5e5398-174a-4dfd-b5ad-27496731960b-kube-api-access-wt4cn\") pod \"test-operator-controller-manager-55b5ff4dbb-xk2sd\" (UID: \"5c5e5398-174a-4dfd-b5ad-27496731960b\") " pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-xk2sd"
Mar 08 04:13:43.927927 master-0 kubenswrapper[18592]: I0308 04:13:43.927797 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vplh\" (UniqueName: \"kubernetes.io/projected/cbed35cc-1e50-495e-8e4e-5b3765218089-kube-api-access-5vplh\") pod \"watcher-operator-controller-manager-bccc79885-rhdgk\" (UID: \"cbed35cc-1e50-495e-8e4e-5b3765218089\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-rhdgk"
Mar 08 04:13:43.927927 master-0 kubenswrapper[18592]: I0308 04:13:43.927865 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-529tc\" (UniqueName: \"kubernetes.io/projected/09a67a89-922f-4e74-8621-27fbfb770f0b-kube-api-access-529tc\") pod \"openstack-operator-controller-manager-6bbb7bcd9-l7q5f\" (UID: \"09a67a89-922f-4e74-8621-27fbfb770f0b\") " pod="openstack-operators/openstack-operator-controller-manager-6bbb7bcd9-l7q5f"
Mar 08 04:13:43.931936 master-0 kubenswrapper[18592]: E0308 04:13:43.928804 18592 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Mar 08 04:13:43.931936 master-0 kubenswrapper[18592]: E0308 04:13:43.928875 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09a67a89-922f-4e74-8621-27fbfb770f0b-metrics-certs podName:09a67a89-922f-4e74-8621-27fbfb770f0b nodeName:}" failed. No retries permitted until 2026-03-08 04:13:44.428855554 +0000 UTC m=+1236.527609894 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/09a67a89-922f-4e74-8621-27fbfb770f0b-metrics-certs") pod "openstack-operator-controller-manager-6bbb7bcd9-l7q5f" (UID: "09a67a89-922f-4e74-8621-27fbfb770f0b") : secret "metrics-server-cert" not found
Mar 08 04:13:43.931936 master-0 kubenswrapper[18592]: E0308 04:13:43.929194 18592 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Mar 08 04:13:43.931936 master-0 kubenswrapper[18592]: E0308 04:13:43.929254 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42c8b424-b12a-48b3-94e7-22ba1d126949-cert podName:42c8b424-b12a-48b3-94e7-22ba1d126949 nodeName:}" failed. No retries permitted until 2026-03-08 04:13:44.929238024 +0000 UTC m=+1237.027992374 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/42c8b424-b12a-48b3-94e7-22ba1d126949-cert") pod "infra-operator-controller-manager-b8c8d7cc8-7vpk7" (UID: "42c8b424-b12a-48b3-94e7-22ba1d126949") : secret "infra-operator-webhook-server-cert" not found
Mar 08 04:13:43.931936 master-0 kubenswrapper[18592]: E0308 04:13:43.929299 18592 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Mar 08 04:13:43.931936 master-0 kubenswrapper[18592]: E0308 04:13:43.929329 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09a67a89-922f-4e74-8621-27fbfb770f0b-webhook-certs podName:09a67a89-922f-4e74-8621-27fbfb770f0b nodeName:}" failed. No retries permitted until 2026-03-08 04:13:44.429322046 +0000 UTC m=+1236.528076396 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/09a67a89-922f-4e74-8621-27fbfb770f0b-webhook-certs") pod "openstack-operator-controller-manager-6bbb7bcd9-l7q5f" (UID: "09a67a89-922f-4e74-8621-27fbfb770f0b") : secret "webhook-server-cert" not found
Mar 08 04:13:43.944083 master-0 kubenswrapper[18592]: I0308 04:13:43.941661 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-fr47r"]
Mar 08 04:13:43.944892 master-0 kubenswrapper[18592]: I0308 04:13:43.944421 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnpk8\" (UniqueName: \"kubernetes.io/projected/50e962ee-4845-422a-948b-05e4f5b1097a-kube-api-access-vnpk8\") pod \"placement-operator-controller-manager-648564c9fc-lpjvx\" (UID: \"50e962ee-4845-422a-948b-05e4f5b1097a\") " pod="openstack-operators/placement-operator-controller-manager-648564c9fc-lpjvx"
Mar 08 04:13:43.980062 master-0 kubenswrapper[18592]: I0308 04:13:43.977795 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-529tc\" (UniqueName: \"kubernetes.io/projected/09a67a89-922f-4e74-8621-27fbfb770f0b-kube-api-access-529tc\") pod \"openstack-operator-controller-manager-6bbb7bcd9-l7q5f\" (UID: \"09a67a89-922f-4e74-8621-27fbfb770f0b\") " pod="openstack-operators/openstack-operator-controller-manager-6bbb7bcd9-l7q5f"
Mar 08 04:13:44.020932 master-0 kubenswrapper[18592]: W0308 04:13:44.017561 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a78dbb5_7df9_47bc_b7b1_a0ea7d42f8c3.slice/crio-fb7f73fbcc8bff8dbbd5d77a0e09d37eb667c1a13cd8d3d4a1ba2dbab2e60ac7 WatchSource:0}: Error finding container fb7f73fbcc8bff8dbbd5d77a0e09d37eb667c1a13cd8d3d4a1ba2dbab2e60ac7: Status 404 returned error can't find the container with id fb7f73fbcc8bff8dbbd5d77a0e09d37eb667c1a13cd8d3d4a1ba2dbab2e60ac7
Mar 08 04:13:44.028229 master-0 kubenswrapper[18592]: I0308 04:13:44.028192 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fq4x4\" (UniqueName: \"kubernetes.io/projected/6f82adce-b4bf-4a3e-a31e-c3c33c669501-kube-api-access-fq4x4\") pod \"telemetry-operator-controller-manager-5fdb694969-sbvfq\" (UID: \"6f82adce-b4bf-4a3e-a31e-c3c33c669501\") " pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-sbvfq"
Mar 08 04:13:44.031520 master-0 kubenswrapper[18592]: I0308 04:13:44.030646 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7h746\" (UniqueName: \"kubernetes.io/projected/dfa2bccb-a39e-4939-bdd9-e6916672e53e-kube-api-access-7h746\") pod \"swift-operator-controller-manager-9b9ff9f4d-c4wjd\" (UID: \"dfa2bccb-a39e-4939-bdd9-e6916672e53e\") " pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-c4wjd"
Mar 08 04:13:44.033876 master-0 kubenswrapper[18592]: I0308 04:13:44.033789 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6db6876945-j8xwt"]
Mar 08 04:13:44.040428 master-0 kubenswrapper[18592]: I0308 04:13:44.040395 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vplh\" (UniqueName: \"kubernetes.io/projected/cbed35cc-1e50-495e-8e4e-5b3765218089-kube-api-access-5vplh\") pod \"watcher-operator-controller-manager-bccc79885-rhdgk\" (UID: \"cbed35cc-1e50-495e-8e4e-5b3765218089\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-rhdgk"
Mar 08 04:13:44.044109 master-0 kubenswrapper[18592]: I0308 04:13:44.041467 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wt4cn\" (UniqueName: \"kubernetes.io/projected/5c5e5398-174a-4dfd-b5ad-27496731960b-kube-api-access-wt4cn\") pod \"test-operator-controller-manager-55b5ff4dbb-xk2sd\" (UID: \"5c5e5398-174a-4dfd-b5ad-27496731960b\") " pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-xk2sd"
Mar 08 04:13:44.072888 master-0 kubenswrapper[18592]: I0308 04:13:44.071008 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-lpjvx"
Mar 08 04:13:44.086541 master-0 kubenswrapper[18592]: I0308 04:13:44.084078 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-c4wjd"
Mar 08 04:13:44.117119 master-0 kubenswrapper[18592]: I0308 04:13:44.116146 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-sbvfq"
Mar 08 04:13:44.120037 master-0 kubenswrapper[18592]: I0308 04:13:44.120007 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-5d87c9d997-jtbw5"]
Mar 08 04:13:44.142705 master-0 kubenswrapper[18592]: I0308 04:13:44.142645 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqngh\" (UniqueName: \"kubernetes.io/projected/d56131ac-d36a-4ee0-b997-ee7640a1fd15-kube-api-access-cqngh\") pod \"rabbitmq-cluster-operator-manager-668c99d594-fr47r\" (UID: \"d56131ac-d36a-4ee0-b997-ee7640a1fd15\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-fr47r"
Mar 08 04:13:44.207924 master-0 kubenswrapper[18592]: I0308 04:13:44.203180 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-xk2sd"
Mar 08 04:13:44.226338 master-0 kubenswrapper[18592]: I0308 04:13:44.224130 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-hq9nx"]
Mar 08 04:13:44.226338 master-0 kubenswrapper[18592]: I0308 04:13:44.224558 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-rhdgk"
Mar 08 04:13:44.244605 master-0 kubenswrapper[18592]: I0308 04:13:44.244542 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqngh\" (UniqueName: \"kubernetes.io/projected/d56131ac-d36a-4ee0-b997-ee7640a1fd15-kube-api-access-cqngh\") pod \"rabbitmq-cluster-operator-manager-668c99d594-fr47r\" (UID: \"d56131ac-d36a-4ee0-b997-ee7640a1fd15\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-fr47r"
Mar 08 04:13:44.244694 master-0 kubenswrapper[18592]: I0308 04:13:44.244668 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/156ae5e7-80bd-4788-9fbe-94fb9a036161-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cf849j\" (UID: \"156ae5e7-80bd-4788-9fbe-94fb9a036161\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cf849j"
Mar 08 04:13:44.244915 master-0 kubenswrapper[18592]: E0308 04:13:44.244895 18592 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 08 04:13:44.245058 master-0 kubenswrapper[18592]: E0308 04:13:44.245042 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/156ae5e7-80bd-4788-9fbe-94fb9a036161-cert podName:156ae5e7-80bd-4788-9fbe-94fb9a036161 nodeName:}" failed. No retries permitted until 2026-03-08 04:13:45.245026512 +0000 UTC m=+1237.343780862 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/156ae5e7-80bd-4788-9fbe-94fb9a036161-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cf849j" (UID: "156ae5e7-80bd-4788-9fbe-94fb9a036161") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 08 04:13:44.270770 master-0 kubenswrapper[18592]: I0308 04:13:44.270725 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqngh\" (UniqueName: \"kubernetes.io/projected/d56131ac-d36a-4ee0-b997-ee7640a1fd15-kube-api-access-cqngh\") pod \"rabbitmq-cluster-operator-manager-668c99d594-fr47r\" (UID: \"d56131ac-d36a-4ee0-b997-ee7640a1fd15\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-fr47r"
Mar 08 04:13:44.271957 master-0 kubenswrapper[18592]: W0308 04:13:44.271382 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf20011d2_eebe_44f5_a184_a79992c664ff.slice/crio-10cfb47629cbf966c8806b1ec078ffb3c115aa8fb6f3ce832fffcafbf1293c1a WatchSource:0}: Error finding container 10cfb47629cbf966c8806b1ec078ffb3c115aa8fb6f3ce832fffcafbf1293c1a: Status 404 returned error can't find the container with id 10cfb47629cbf966c8806b1ec078ffb3c115aa8fb6f3ce832fffcafbf1293c1a
Mar 08 04:13:44.275560 master-0 kubenswrapper[18592]: I0308 04:13:44.275527 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-fr47r"
Mar 08 04:13:44.287716 master-0 kubenswrapper[18592]: I0308 04:13:44.287671 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-64db6967f8-nmjrc"]
Mar 08 04:13:44.308455 master-0 kubenswrapper[18592]: I0308 04:13:44.308320 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-cf99c678f-6w8p8"]
Mar 08 04:13:44.344686 master-0 kubenswrapper[18592]: I0308 04:13:44.344529 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-27xxz"]
Mar 08 04:13:44.357491 master-0 kubenswrapper[18592]: I0308 04:13:44.357452 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-585d849c57-sh9z8"]
Mar 08 04:13:44.469476 master-0 kubenswrapper[18592]: I0308 04:13:44.469192 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/09a67a89-922f-4e74-8621-27fbfb770f0b-metrics-certs\") pod \"openstack-operator-controller-manager-6bbb7bcd9-l7q5f\" (UID: \"09a67a89-922f-4e74-8621-27fbfb770f0b\") " pod="openstack-operators/openstack-operator-controller-manager-6bbb7bcd9-l7q5f"
Mar 08 04:13:44.469476 master-0 kubenswrapper[18592]: I0308 04:13:44.469300 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/09a67a89-922f-4e74-8621-27fbfb770f0b-webhook-certs\") pod \"openstack-operator-controller-manager-6bbb7bcd9-l7q5f\" (UID: \"09a67a89-922f-4e74-8621-27fbfb770f0b\") " pod="openstack-operators/openstack-operator-controller-manager-6bbb7bcd9-l7q5f"
Mar 08 04:13:44.469476 master-0 kubenswrapper[18592]: E0308 04:13:44.469312 18592 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Mar 08 04:13:44.469476 master-0 kubenswrapper[18592]: E0308 04:13:44.469379 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09a67a89-922f-4e74-8621-27fbfb770f0b-metrics-certs podName:09a67a89-922f-4e74-8621-27fbfb770f0b nodeName:}" failed. No retries permitted until 2026-03-08 04:13:45.469362601 +0000 UTC m=+1237.568116951 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/09a67a89-922f-4e74-8621-27fbfb770f0b-metrics-certs") pod "openstack-operator-controller-manager-6bbb7bcd9-l7q5f" (UID: "09a67a89-922f-4e74-8621-27fbfb770f0b") : secret "metrics-server-cert" not found
Mar 08 04:13:44.469476 master-0 kubenswrapper[18592]: E0308 04:13:44.469471 18592 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Mar 08 04:13:44.469762 master-0 kubenswrapper[18592]: E0308 04:13:44.469522 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09a67a89-922f-4e74-8621-27fbfb770f0b-webhook-certs podName:09a67a89-922f-4e74-8621-27fbfb770f0b nodeName:}" failed. No retries permitted until 2026-03-08 04:13:45.469506875 +0000 UTC m=+1237.568261225 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/09a67a89-922f-4e74-8621-27fbfb770f0b-webhook-certs") pod "openstack-operator-controller-manager-6bbb7bcd9-l7q5f" (UID: "09a67a89-922f-4e74-8621-27fbfb770f0b") : secret "webhook-server-cert" not found
Mar 08 04:13:44.476307 master-0 kubenswrapper[18592]: I0308 04:13:44.476257 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-27xxz" event={"ID":"64afb73a-1b22-4506-98d4-98d7ddf38a93","Type":"ContainerStarted","Data":"0f67f6a175840abfd51dc886d58cf310e398c4a3203ab4b189408f98248c8d29"}
Mar 08 04:13:44.480379 master-0 kubenswrapper[18592]: I0308 04:13:44.480339 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-nmjrc" event={"ID":"7a78dbb5-7df9-47bc-b7b1-a0ea7d42f8c3","Type":"ContainerStarted","Data":"fb7f73fbcc8bff8dbbd5d77a0e09d37eb667c1a13cd8d3d4a1ba2dbab2e60ac7"}
Mar 08 04:13:44.481652 master-0 kubenswrapper[18592]: I0308 04:13:44.481620 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-hq9nx" event={"ID":"ee358d70-7ec7-4e97-8e8c-b9144c6f1bff","Type":"ContainerStarted","Data":"6e821953a907b17bfc1658448a821c0ca702cd94f2d2fa9b46a91f870d127b8e"}
Mar 08 04:13:44.482935 master-0 kubenswrapper[18592]: I0308 04:13:44.482909 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-jtbw5" event={"ID":"eaa663c3-d979-4bf2-aa45-def619616016","Type":"ContainerStarted","Data":"c56f5d2e68019472dc2ecf78ed2afa36424275236a15c8c382e181156fe0b67b"}
Mar 08 04:13:44.485775 master-0 kubenswrapper[18592]: I0308 04:13:44.485726 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-6w8p8" event={"ID":"2fa88bb5-6717-4aeb-b4b5-fb838c711a9a","Type":"ContainerStarted","Data":"608cde8aa69a1d989510daad1eda9e471b2cf7d9b7e6a55e044c041d665546c8"}
Mar 08 04:13:44.507919 master-0 kubenswrapper[18592]: I0308 04:13:44.501063 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-585d849c57-sh9z8" event={"ID":"f20011d2-eebe-44f5-a184-a79992c664ff","Type":"ContainerStarted","Data":"10cfb47629cbf966c8806b1ec078ffb3c115aa8fb6f3ce832fffcafbf1293c1a"}
Mar 08 04:13:44.709816 master-0 kubenswrapper[18592]: W0308 04:13:44.709545 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97dbb8f0_af27_4e9a_ae6f_8311025a0168.slice/crio-7c6f7b46135a001cfbbf33e647b425ddd875f8c321dab57211d68f97951c44b4 WatchSource:0}: Error finding container 7c6f7b46135a001cfbbf33e647b425ddd875f8c321dab57211d68f97951c44b4: Status 404 returned error can't find the container with id 7c6f7b46135a001cfbbf33e647b425ddd875f8c321dab57211d68f97951c44b4
Mar 08 04:13:44.714381 master-0 kubenswrapper[18592]: I0308 04:13:44.714251 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-llvv7"]
Mar 08 04:13:44.729527 master-0 kubenswrapper[18592]: I0308 04:13:44.729494 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-cgc7b"]
Mar 08 04:13:44.765015 master-0 kubenswrapper[18592]: I0308 04:13:44.764941 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-rtchq"]
Mar 08 04:13:44.775356 master-0 kubenswrapper[18592]: I0308 04:13:44.775319 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7c789f89c6-wgmst"]
Mar 08 04:13:44.787841 master-0 kubenswrapper[18592]: W0308 04:13:44.787783 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0a44319_b717_43ff_9799_15d4191de116.slice/crio-6fc971794e49971f3ea4ab09e28a2bf61608a70a6c74f94dfd405a86c88abe58 WatchSource:0}: Error finding container 6fc971794e49971f3ea4ab09e28a2bf61608a70a6c74f94dfd405a86c88abe58: Status 404 returned error can't find the container with id 6fc971794e49971f3ea4ab09e28a2bf61608a70a6c74f94dfd405a86c88abe58
Mar 08 04:13:44.792669 master-0 kubenswrapper[18592]: I0308 04:13:44.792585 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-74b6b5dc96-6jmjm"]
Mar 08 04:13:44.981443 master-0 kubenswrapper[18592]: I0308 04:13:44.981349 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/42c8b424-b12a-48b3-94e7-22ba1d126949-cert\") pod \"infra-operator-controller-manager-b8c8d7cc8-7vpk7\" (UID: \"42c8b424-b12a-48b3-94e7-22ba1d126949\") " pod="openstack-operators/infra-operator-controller-manager-b8c8d7cc8-7vpk7"
Mar 08 04:13:44.981604 master-0 kubenswrapper[18592]: E0308 04:13:44.981544 18592 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Mar 08 04:13:44.981604 master-0 kubenswrapper[18592]: E0308 04:13:44.981590 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42c8b424-b12a-48b3-94e7-22ba1d126949-cert podName:42c8b424-b12a-48b3-94e7-22ba1d126949 nodeName:}" failed. No retries permitted until 2026-03-08 04:13:46.981575548 +0000 UTC m=+1239.080329898 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/42c8b424-b12a-48b3-94e7-22ba1d126949-cert") pod "infra-operator-controller-manager-b8c8d7cc8-7vpk7" (UID: "42c8b424-b12a-48b3-94e7-22ba1d126949") : secret "infra-operator-webhook-server-cert" not found
Mar 08 04:13:45.146351 master-0 kubenswrapper[18592]: I0308 04:13:45.146290 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-9b9ff9f4d-c4wjd"]
Mar 08 04:13:45.157810 master-0 kubenswrapper[18592]: W0308 04:13:45.156963 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddfa2bccb_a39e_4939_bdd9_e6916672e53e.slice/crio-582fa845a4c1a68c799ea9ed3327d5008a8dbe3c0d8f4fa493a47e5bb8386c75 WatchSource:0}: Error finding container 582fa845a4c1a68c799ea9ed3327d5008a8dbe3c0d8f4fa493a47e5bb8386c75: Status 404 returned error can't find the container with id 582fa845a4c1a68c799ea9ed3327d5008a8dbe3c0d8f4fa493a47e5bb8386c75
Mar 08 04:13:45.210965 master-0 kubenswrapper[18592]: I0308 04:13:45.210495 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54688575f-ssj26"]
Mar 08 04:13:45.222026 master-0 kubenswrapper[18592]: I0308 04:13:45.221983 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5fdb694969-sbvfq"]
Mar 08 04:13:45.246729 master-0 kubenswrapper[18592]: I0308 04:13:45.246187 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-75684d597f-mtlx8"]
Mar 08 04:13:45.258561 master-0 kubenswrapper[18592]: I0308 04:13:45.258294 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-648564c9fc-lpjvx"]
Mar 08 04:13:45.293466 master-0 kubenswrapper[18592]: I0308 04:13:45.293123 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/156ae5e7-80bd-4788-9fbe-94fb9a036161-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cf849j\" (UID: \"156ae5e7-80bd-4788-9fbe-94fb9a036161\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cf849j"
Mar 08 04:13:45.293466 master-0 kubenswrapper[18592]: E0308 04:13:45.293358 18592 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 08 04:13:45.293466 master-0 kubenswrapper[18592]: E0308 04:13:45.293415 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/156ae5e7-80bd-4788-9fbe-94fb9a036161-cert podName:156ae5e7-80bd-4788-9fbe-94fb9a036161 nodeName:}" failed. No retries permitted until 2026-03-08 04:13:47.293399121 +0000 UTC m=+1239.392153471 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/156ae5e7-80bd-4788-9fbe-94fb9a036161-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cf849j" (UID: "156ae5e7-80bd-4788-9fbe-94fb9a036161") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 08 04:13:45.500972 master-0 kubenswrapper[18592]: I0308 04:13:45.496434 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/09a67a89-922f-4e74-8621-27fbfb770f0b-metrics-certs\") pod \"openstack-operator-controller-manager-6bbb7bcd9-l7q5f\" (UID: \"09a67a89-922f-4e74-8621-27fbfb770f0b\") " pod="openstack-operators/openstack-operator-controller-manager-6bbb7bcd9-l7q5f"
Mar 08 04:13:45.500972 master-0 kubenswrapper[18592]: I0308 04:13:45.496638 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/09a67a89-922f-4e74-8621-27fbfb770f0b-webhook-certs\") pod \"openstack-operator-controller-manager-6bbb7bcd9-l7q5f\" (UID: \"09a67a89-922f-4e74-8621-27fbfb770f0b\") " pod="openstack-operators/openstack-operator-controller-manager-6bbb7bcd9-l7q5f"
Mar 08 04:13:45.500972 master-0 kubenswrapper[18592]: E0308 04:13:45.496814 18592 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Mar 08 04:13:45.500972 master-0 kubenswrapper[18592]: E0308 04:13:45.496910 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09a67a89-922f-4e74-8621-27fbfb770f0b-webhook-certs podName:09a67a89-922f-4e74-8621-27fbfb770f0b nodeName:}" failed. No retries permitted until 2026-03-08 04:13:47.496895797 +0000 UTC m=+1239.595650147 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/09a67a89-922f-4e74-8621-27fbfb770f0b-webhook-certs") pod "openstack-operator-controller-manager-6bbb7bcd9-l7q5f" (UID: "09a67a89-922f-4e74-8621-27fbfb770f0b") : secret "webhook-server-cert" not found
Mar 08 04:13:45.500972 master-0 kubenswrapper[18592]: E0308 04:13:45.497238 18592 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Mar 08 04:13:45.500972 master-0 kubenswrapper[18592]: E0308 04:13:45.497265 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09a67a89-922f-4e74-8621-27fbfb770f0b-metrics-certs podName:09a67a89-922f-4e74-8621-27fbfb770f0b nodeName:}" failed. No retries permitted until 2026-03-08 04:13:47.497258496 +0000 UTC m=+1239.596012846 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/09a67a89-922f-4e74-8621-27fbfb770f0b-metrics-certs") pod "openstack-operator-controller-manager-6bbb7bcd9-l7q5f" (UID: "09a67a89-922f-4e74-8621-27fbfb770f0b") : secret "metrics-server-cert" not found
Mar 08 04:13:45.532205 master-0 kubenswrapper[18592]: I0308 04:13:45.531982 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-llvv7" event={"ID":"97dbb8f0-af27-4e9a-ae6f-8311025a0168","Type":"ContainerStarted","Data":"7c6f7b46135a001cfbbf33e647b425ddd875f8c321dab57211d68f97951c44b4"}
Mar 08 04:13:45.544351 master-0 kubenswrapper[18592]: I0308 04:13:45.544281 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-67d996989d-cgc7b" event={"ID":"9b73e51b-5898-4d03-b81e-977e96f47ae5","Type":"ContainerStarted","Data":"8523d02b2e69fa96a113e78fe3b5daa57949693a2fd870a17578b31eb22d7ab0"}
Mar 08 04:13:45.546568 master-0 kubenswrapper[18592]: I0308 04:13:45.546489 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-54688575f-ssj26" event={"ID":"987df199-3a6e-40aa-8736-deadb057262a","Type":"ContainerStarted","Data":"559779da5464412954024beeb69406d35f84c1f51a19f2127ab05bf3c18e59d4"}
Mar 08 04:13:45.549020 master-0 kubenswrapper[18592]: I0308 04:13:45.548946 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-lpjvx" event={"ID":"50e962ee-4845-422a-948b-05e4f5b1097a","Type":"ContainerStarted","Data":"04103595626b839227f7b456b5d8f4ed043990d85bd2b240c1665be4b627d2f3"}
Mar 08 04:13:45.550930 master-0 kubenswrapper[18592]: I0308 04:13:45.550484 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-mtlx8" event={"ID":"2723e73b-52d4-4d1a-9778-2c14ffa390c2","Type":"ContainerStarted","Data":"a47ce9321fb00c0dfec6146230d82d31a9456ae2cca2a89a1281385f87c9f174"}
Mar 08 04:13:45.554871 master-0 kubenswrapper[18592]: I0308 04:13:45.554798 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-6jmjm" event={"ID":"393711cd-0c1e-4052-89b0-498e0e2a09ba","Type":"ContainerStarted","Data":"2d084d92e6f1c9891a334cf3d971903f322752f7ff0ad54ded33ffb97ad5a5ee"}
Mar 08 04:13:45.559176 master-0 kubenswrapper[18592]: I0308 04:13:45.559104 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-c4wjd" event={"ID":"dfa2bccb-a39e-4939-bdd9-e6916672e53e","Type":"ContainerStarted","Data":"582fa845a4c1a68c799ea9ed3327d5008a8dbe3c0d8f4fa493a47e5bb8386c75"}
Mar 08 04:13:45.567974 master-0 kubenswrapper[18592]: I0308 04:13:45.561622 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-rhdgk"]
Mar 08 04:13:45.567974 master-0 kubenswrapper[18592]: I0308 04:13:45.565163 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-sbvfq" event={"ID":"6f82adce-b4bf-4a3e-a31e-c3c33c669501","Type":"ContainerStarted","Data":"c8984d002836ed34c9881b74b4cb999b5c8eca65875cea797be5df3d510647b4"}
Mar 08 04:13:45.585668 master-0 kubenswrapper[18592]: I0308 04:13:45.571581 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-rtchq" event={"ID":"64139e00-f7e0-4225-9ca7-0942e38b5455","Type":"ContainerStarted","Data":"434e10ba6e739d3ce004081e6dbea9227fce8d979a59b16c3875988ebbb11125"}
Mar 08 04:13:45.585668 master-0 kubenswrapper[18592]: I0308 04:13:45.572684 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-55b5ff4dbb-xk2sd"]
Mar 08 04:13:45.585668 master-0 kubenswrapper[18592]: I0308 04:13:45.574642 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wgmst" event={"ID":"b0a44319-b717-43ff-9799-15d4191de116","Type":"ContainerStarted","Data":"6fc971794e49971f3ea4ab09e28a2bf61608a70a6c74f94dfd405a86c88abe58"}
Mar 08 04:13:45.594813 master-0 kubenswrapper[18592]: I0308 04:13:45.589881 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-fr47r"]
Mar 08 04:13:46.585580 master-0 kubenswrapper[18592]: I0308 04:13:46.585520 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-fr47r" event={"ID":"d56131ac-d36a-4ee0-b997-ee7640a1fd15","Type":"ContainerStarted","Data":"ee7dcd79f70868f086ba2b8fc7aaf3d25f6fcaeec25b88c2fe183b0dbe702cd6"}
Mar 08 04:13:46.589601 master-0 kubenswrapper[18592]: I0308 04:13:46.589566 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-rhdgk" event={"ID":"cbed35cc-1e50-495e-8e4e-5b3765218089","Type":"ContainerStarted","Data":"4c4846392f0924e4dd9d9a7801edfe30ce4ca75a461942e20b3f47cb46028ed4"}
Mar 08 04:13:46.593671 master-0 kubenswrapper[18592]: I0308 04:13:46.593634 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-xk2sd" event={"ID":"5c5e5398-174a-4dfd-b5ad-27496731960b","Type":"ContainerStarted","Data":"125fbebee3e13ccd0c55c24bf16c0052ae4b7bead6c0d4e02dd760e90e8c0b59"}
Mar 08 04:13:47.034939 master-0 kubenswrapper[18592]: I0308 04:13:47.034877 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/42c8b424-b12a-48b3-94e7-22ba1d126949-cert\") pod \"infra-operator-controller-manager-b8c8d7cc8-7vpk7\" (UID: \"42c8b424-b12a-48b3-94e7-22ba1d126949\") " pod="openstack-operators/infra-operator-controller-manager-b8c8d7cc8-7vpk7"
Mar 08 04:13:47.035258 master-0 kubenswrapper[18592]: E0308 04:13:47.035087 18592 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Mar 08 04:13:47.035258 master-0 kubenswrapper[18592]: E0308 04:13:47.035178 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42c8b424-b12a-48b3-94e7-22ba1d126949-cert podName:42c8b424-b12a-48b3-94e7-22ba1d126949 nodeName:}" failed. No retries permitted until 2026-03-08 04:13:51.035154688 +0000 UTC m=+1243.133909038 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/42c8b424-b12a-48b3-94e7-22ba1d126949-cert") pod "infra-operator-controller-manager-b8c8d7cc8-7vpk7" (UID: "42c8b424-b12a-48b3-94e7-22ba1d126949") : secret "infra-operator-webhook-server-cert" not found
Mar 08 04:13:47.341359 master-0 kubenswrapper[18592]: I0308 04:13:47.341056 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/156ae5e7-80bd-4788-9fbe-94fb9a036161-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cf849j\" (UID: \"156ae5e7-80bd-4788-9fbe-94fb9a036161\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cf849j"
Mar 08 04:13:47.341533 master-0 kubenswrapper[18592]: E0308 04:13:47.341440 18592 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 08 04:13:47.341533 master-0 kubenswrapper[18592]: E0308 04:13:47.341489 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/156ae5e7-80bd-4788-9fbe-94fb9a036161-cert podName:156ae5e7-80bd-4788-9fbe-94fb9a036161 nodeName:}" failed. No retries permitted until 2026-03-08 04:13:51.341473075 +0000 UTC m=+1243.440227425 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/156ae5e7-80bd-4788-9fbe-94fb9a036161-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cf849j" (UID: "156ae5e7-80bd-4788-9fbe-94fb9a036161") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 08 04:13:47.545922 master-0 kubenswrapper[18592]: I0308 04:13:47.544266 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/09a67a89-922f-4e74-8621-27fbfb770f0b-webhook-certs\") pod \"openstack-operator-controller-manager-6bbb7bcd9-l7q5f\" (UID: \"09a67a89-922f-4e74-8621-27fbfb770f0b\") " pod="openstack-operators/openstack-operator-controller-manager-6bbb7bcd9-l7q5f" Mar 08 04:13:47.545922 master-0 kubenswrapper[18592]: I0308 04:13:47.544404 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/09a67a89-922f-4e74-8621-27fbfb770f0b-metrics-certs\") pod \"openstack-operator-controller-manager-6bbb7bcd9-l7q5f\" (UID: \"09a67a89-922f-4e74-8621-27fbfb770f0b\") " pod="openstack-operators/openstack-operator-controller-manager-6bbb7bcd9-l7q5f" Mar 08 04:13:47.545922 master-0 kubenswrapper[18592]: E0308 04:13:47.544471 18592 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 08 04:13:47.545922 master-0 kubenswrapper[18592]: E0308 04:13:47.544566 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09a67a89-922f-4e74-8621-27fbfb770f0b-webhook-certs podName:09a67a89-922f-4e74-8621-27fbfb770f0b nodeName:}" failed. No retries permitted until 2026-03-08 04:13:51.544542139 +0000 UTC m=+1243.643296479 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/09a67a89-922f-4e74-8621-27fbfb770f0b-webhook-certs") pod "openstack-operator-controller-manager-6bbb7bcd9-l7q5f" (UID: "09a67a89-922f-4e74-8621-27fbfb770f0b") : secret "webhook-server-cert" not found Mar 08 04:13:47.545922 master-0 kubenswrapper[18592]: E0308 04:13:47.544628 18592 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 08 04:13:47.545922 master-0 kubenswrapper[18592]: E0308 04:13:47.544741 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09a67a89-922f-4e74-8621-27fbfb770f0b-metrics-certs podName:09a67a89-922f-4e74-8621-27fbfb770f0b nodeName:}" failed. No retries permitted until 2026-03-08 04:13:51.544712484 +0000 UTC m=+1243.643466894 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/09a67a89-922f-4e74-8621-27fbfb770f0b-metrics-certs") pod "openstack-operator-controller-manager-6bbb7bcd9-l7q5f" (UID: "09a67a89-922f-4e74-8621-27fbfb770f0b") : secret "metrics-server-cert" not found Mar 08 04:13:51.043417 master-0 kubenswrapper[18592]: I0308 04:13:51.043356 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/42c8b424-b12a-48b3-94e7-22ba1d126949-cert\") pod \"infra-operator-controller-manager-b8c8d7cc8-7vpk7\" (UID: \"42c8b424-b12a-48b3-94e7-22ba1d126949\") " pod="openstack-operators/infra-operator-controller-manager-b8c8d7cc8-7vpk7" Mar 08 04:13:51.043993 master-0 kubenswrapper[18592]: E0308 04:13:51.043588 18592 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 08 04:13:51.043993 master-0 kubenswrapper[18592]: E0308 04:13:51.043699 18592 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/42c8b424-b12a-48b3-94e7-22ba1d126949-cert podName:42c8b424-b12a-48b3-94e7-22ba1d126949 nodeName:}" failed. No retries permitted until 2026-03-08 04:13:59.043674209 +0000 UTC m=+1251.142428569 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/42c8b424-b12a-48b3-94e7-22ba1d126949-cert") pod "infra-operator-controller-manager-b8c8d7cc8-7vpk7" (UID: "42c8b424-b12a-48b3-94e7-22ba1d126949") : secret "infra-operator-webhook-server-cert" not found Mar 08 04:13:51.350154 master-0 kubenswrapper[18592]: I0308 04:13:51.350019 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/156ae5e7-80bd-4788-9fbe-94fb9a036161-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cf849j\" (UID: \"156ae5e7-80bd-4788-9fbe-94fb9a036161\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cf849j" Mar 08 04:13:51.350485 master-0 kubenswrapper[18592]: E0308 04:13:51.350211 18592 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 08 04:13:51.350485 master-0 kubenswrapper[18592]: E0308 04:13:51.350310 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/156ae5e7-80bd-4788-9fbe-94fb9a036161-cert podName:156ae5e7-80bd-4788-9fbe-94fb9a036161 nodeName:}" failed. No retries permitted until 2026-03-08 04:13:59.350286403 +0000 UTC m=+1251.449040753 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/156ae5e7-80bd-4788-9fbe-94fb9a036161-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cf849j" (UID: "156ae5e7-80bd-4788-9fbe-94fb9a036161") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 08 04:13:51.555333 master-0 kubenswrapper[18592]: I0308 04:13:51.555228 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/09a67a89-922f-4e74-8621-27fbfb770f0b-metrics-certs\") pod \"openstack-operator-controller-manager-6bbb7bcd9-l7q5f\" (UID: \"09a67a89-922f-4e74-8621-27fbfb770f0b\") " pod="openstack-operators/openstack-operator-controller-manager-6bbb7bcd9-l7q5f" Mar 08 04:13:51.555575 master-0 kubenswrapper[18592]: E0308 04:13:51.555451 18592 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 08 04:13:51.555575 master-0 kubenswrapper[18592]: I0308 04:13:51.555483 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/09a67a89-922f-4e74-8621-27fbfb770f0b-webhook-certs\") pod \"openstack-operator-controller-manager-6bbb7bcd9-l7q5f\" (UID: \"09a67a89-922f-4e74-8621-27fbfb770f0b\") " pod="openstack-operators/openstack-operator-controller-manager-6bbb7bcd9-l7q5f" Mar 08 04:13:51.555575 master-0 kubenswrapper[18592]: E0308 04:13:51.555516 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09a67a89-922f-4e74-8621-27fbfb770f0b-metrics-certs podName:09a67a89-922f-4e74-8621-27fbfb770f0b nodeName:}" failed. No retries permitted until 2026-03-08 04:13:59.555499045 +0000 UTC m=+1251.654253395 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/09a67a89-922f-4e74-8621-27fbfb770f0b-metrics-certs") pod "openstack-operator-controller-manager-6bbb7bcd9-l7q5f" (UID: "09a67a89-922f-4e74-8621-27fbfb770f0b") : secret "metrics-server-cert" not found Mar 08 04:13:51.555710 master-0 kubenswrapper[18592]: E0308 04:13:51.555652 18592 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 08 04:13:51.555710 master-0 kubenswrapper[18592]: E0308 04:13:51.555692 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09a67a89-922f-4e74-8621-27fbfb770f0b-webhook-certs podName:09a67a89-922f-4e74-8621-27fbfb770f0b nodeName:}" failed. No retries permitted until 2026-03-08 04:13:59.55568018 +0000 UTC m=+1251.654434540 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/09a67a89-922f-4e74-8621-27fbfb770f0b-webhook-certs") pod "openstack-operator-controller-manager-6bbb7bcd9-l7q5f" (UID: "09a67a89-922f-4e74-8621-27fbfb770f0b") : secret "webhook-server-cert" not found Mar 08 04:13:59.114537 master-0 kubenswrapper[18592]: I0308 04:13:59.114297 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/42c8b424-b12a-48b3-94e7-22ba1d126949-cert\") pod \"infra-operator-controller-manager-b8c8d7cc8-7vpk7\" (UID: \"42c8b424-b12a-48b3-94e7-22ba1d126949\") " pod="openstack-operators/infra-operator-controller-manager-b8c8d7cc8-7vpk7" Mar 08 04:13:59.114537 master-0 kubenswrapper[18592]: E0308 04:13:59.114520 18592 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 08 04:13:59.116561 master-0 kubenswrapper[18592]: E0308 04:13:59.116472 18592 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/42c8b424-b12a-48b3-94e7-22ba1d126949-cert podName:42c8b424-b12a-48b3-94e7-22ba1d126949 nodeName:}" failed. No retries permitted until 2026-03-08 04:14:15.114615462 +0000 UTC m=+1267.213369822 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/42c8b424-b12a-48b3-94e7-22ba1d126949-cert") pod "infra-operator-controller-manager-b8c8d7cc8-7vpk7" (UID: "42c8b424-b12a-48b3-94e7-22ba1d126949") : secret "infra-operator-webhook-server-cert" not found Mar 08 04:13:59.420636 master-0 kubenswrapper[18592]: I0308 04:13:59.420346 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/156ae5e7-80bd-4788-9fbe-94fb9a036161-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cf849j\" (UID: \"156ae5e7-80bd-4788-9fbe-94fb9a036161\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cf849j" Mar 08 04:13:59.425532 master-0 kubenswrapper[18592]: I0308 04:13:59.425485 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/156ae5e7-80bd-4788-9fbe-94fb9a036161-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cf849j\" (UID: \"156ae5e7-80bd-4788-9fbe-94fb9a036161\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cf849j" Mar 08 04:13:59.459550 master-0 kubenswrapper[18592]: I0308 04:13:59.459481 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cf849j" Mar 08 04:13:59.622819 master-0 kubenswrapper[18592]: I0308 04:13:59.622755 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/09a67a89-922f-4e74-8621-27fbfb770f0b-webhook-certs\") pod \"openstack-operator-controller-manager-6bbb7bcd9-l7q5f\" (UID: \"09a67a89-922f-4e74-8621-27fbfb770f0b\") " pod="openstack-operators/openstack-operator-controller-manager-6bbb7bcd9-l7q5f" Mar 08 04:13:59.623198 master-0 kubenswrapper[18592]: I0308 04:13:59.622967 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/09a67a89-922f-4e74-8621-27fbfb770f0b-metrics-certs\") pod \"openstack-operator-controller-manager-6bbb7bcd9-l7q5f\" (UID: \"09a67a89-922f-4e74-8621-27fbfb770f0b\") " pod="openstack-operators/openstack-operator-controller-manager-6bbb7bcd9-l7q5f" Mar 08 04:13:59.627880 master-0 kubenswrapper[18592]: I0308 04:13:59.627805 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/09a67a89-922f-4e74-8621-27fbfb770f0b-metrics-certs\") pod \"openstack-operator-controller-manager-6bbb7bcd9-l7q5f\" (UID: \"09a67a89-922f-4e74-8621-27fbfb770f0b\") " pod="openstack-operators/openstack-operator-controller-manager-6bbb7bcd9-l7q5f" Mar 08 04:13:59.632532 master-0 kubenswrapper[18592]: I0308 04:13:59.632476 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/09a67a89-922f-4e74-8621-27fbfb770f0b-webhook-certs\") pod \"openstack-operator-controller-manager-6bbb7bcd9-l7q5f\" (UID: \"09a67a89-922f-4e74-8621-27fbfb770f0b\") " pod="openstack-operators/openstack-operator-controller-manager-6bbb7bcd9-l7q5f" Mar 08 04:13:59.860338 master-0 kubenswrapper[18592]: I0308 
04:13:59.860252 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6bbb7bcd9-l7q5f" Mar 08 04:14:05.881908 master-0 kubenswrapper[18592]: I0308 04:14:05.879413 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-j8xwt" Mar 08 04:14:05.901358 master-0 kubenswrapper[18592]: I0308 04:14:05.901287 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-j8xwt" podStartSLOduration=1.883386256 podStartE2EDuration="23.90126673s" podCreationTimestamp="2026-03-08 04:13:42 +0000 UTC" firstStartedPulling="2026-03-08 04:13:43.402172813 +0000 UTC m=+1235.500927163" lastFinishedPulling="2026-03-08 04:14:05.420053257 +0000 UTC m=+1257.518807637" observedRunningTime="2026-03-08 04:14:05.891049448 +0000 UTC m=+1257.989803808" watchObservedRunningTime="2026-03-08 04:14:05.90126673 +0000 UTC m=+1258.000021080" Mar 08 04:14:06.089040 master-0 kubenswrapper[18592]: I0308 04:14:06.088635 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cf849j"] Mar 08 04:14:06.269643 master-0 kubenswrapper[18592]: I0308 04:14:06.269589 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6bbb7bcd9-l7q5f"] Mar 08 04:14:06.317962 master-0 kubenswrapper[18592]: W0308 04:14:06.315002 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09a67a89_922f_4e74_8621_27fbfb770f0b.slice/crio-a0d3fa646f99d07b57c626b2f772bdf96b37aae7cdfe8bf14c57f7a505f36a1b WatchSource:0}: Error finding container a0d3fa646f99d07b57c626b2f772bdf96b37aae7cdfe8bf14c57f7a505f36a1b: Status 404 returned error can't find the container with id 
a0d3fa646f99d07b57c626b2f772bdf96b37aae7cdfe8bf14c57f7a505f36a1b Mar 08 04:14:06.901320 master-0 kubenswrapper[18592]: I0308 04:14:06.901247 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-llvv7" event={"ID":"97dbb8f0-af27-4e9a-ae6f-8311025a0168","Type":"ContainerStarted","Data":"730cfff4c29f27766af490d99ba0498dc14c33cd82f5454893baa4521133b590"} Mar 08 04:14:06.902066 master-0 kubenswrapper[18592]: I0308 04:14:06.901693 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-llvv7" Mar 08 04:14:06.916164 master-0 kubenswrapper[18592]: I0308 04:14:06.916088 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-c4wjd" event={"ID":"dfa2bccb-a39e-4939-bdd9-e6916672e53e","Type":"ContainerStarted","Data":"ee3986df0d5e056085d500be0b5378acbe7f9cfbfdb6e8c6cf61b1d4d55abb01"} Mar 08 04:14:06.917267 master-0 kubenswrapper[18592]: I0308 04:14:06.917231 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-c4wjd" Mar 08 04:14:06.952035 master-0 kubenswrapper[18592]: I0308 04:14:06.951909 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-llvv7" podStartSLOduration=4.160559657 podStartE2EDuration="24.951882248s" podCreationTimestamp="2026-03-08 04:13:42 +0000 UTC" firstStartedPulling="2026-03-08 04:13:44.732782519 +0000 UTC m=+1236.831536869" lastFinishedPulling="2026-03-08 04:14:05.52410508 +0000 UTC m=+1257.622859460" observedRunningTime="2026-03-08 04:14:06.944174793 +0000 UTC m=+1259.042929143" watchObservedRunningTime="2026-03-08 04:14:06.951882248 +0000 UTC m=+1259.050636598" Mar 08 04:14:06.963962 master-0 kubenswrapper[18592]: I0308 04:14:06.957182 18592 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-54688575f-ssj26" event={"ID":"987df199-3a6e-40aa-8736-deadb057262a","Type":"ContainerStarted","Data":"bb746501b08494956539170b54a57283611e559e9581f7b724a80e179aa94257"} Mar 08 04:14:06.963962 master-0 kubenswrapper[18592]: I0308 04:14:06.957602 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-54688575f-ssj26" Mar 08 04:14:07.020001 master-0 kubenswrapper[18592]: I0308 04:14:07.019931 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-rhdgk" event={"ID":"cbed35cc-1e50-495e-8e4e-5b3765218089","Type":"ContainerStarted","Data":"7effbc91a9880e544a4a983a30fb763373aee5387f4717ab11e577bbd9cb9e01"} Mar 08 04:14:07.021411 master-0 kubenswrapper[18592]: I0308 04:14:07.021325 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-c4wjd" podStartSLOduration=4.620190334 podStartE2EDuration="25.021299081s" podCreationTimestamp="2026-03-08 04:13:42 +0000 UTC" firstStartedPulling="2026-03-08 04:13:45.17632253 +0000 UTC m=+1237.275076880" lastFinishedPulling="2026-03-08 04:14:05.577431257 +0000 UTC m=+1257.676185627" observedRunningTime="2026-03-08 04:14:06.992797014 +0000 UTC m=+1259.091551364" watchObservedRunningTime="2026-03-08 04:14:07.021299081 +0000 UTC m=+1259.120053431" Mar 08 04:14:07.021598 master-0 kubenswrapper[18592]: I0308 04:14:07.021579 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-rhdgk" Mar 08 04:14:07.043994 master-0 kubenswrapper[18592]: I0308 04:14:07.043220 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-54688575f-ssj26" 
podStartSLOduration=4.6483161410000005 podStartE2EDuration="25.043201303s" podCreationTimestamp="2026-03-08 04:13:42 +0000 UTC" firstStartedPulling="2026-03-08 04:13:45.198495899 +0000 UTC m=+1237.297250249" lastFinishedPulling="2026-03-08 04:14:05.593381051 +0000 UTC m=+1257.692135411" observedRunningTime="2026-03-08 04:14:07.022061242 +0000 UTC m=+1259.120815592" watchObservedRunningTime="2026-03-08 04:14:07.043201303 +0000 UTC m=+1259.141955643" Mar 08 04:14:07.072679 master-0 kubenswrapper[18592]: I0308 04:14:07.072617 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-xk2sd" event={"ID":"5c5e5398-174a-4dfd-b5ad-27496731960b","Type":"ContainerStarted","Data":"d6b0eaacac34f031605830572b789522a6104d8aa3cdd72eda0951b93ad3ea5a"} Mar 08 04:14:07.073166 master-0 kubenswrapper[18592]: I0308 04:14:07.073137 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-xk2sd" Mar 08 04:14:07.123864 master-0 kubenswrapper[18592]: I0308 04:14:07.119608 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-rhdgk" podStartSLOduration=5.174537231 podStartE2EDuration="25.119581842s" podCreationTimestamp="2026-03-08 04:13:42 +0000 UTC" firstStartedPulling="2026-03-08 04:13:45.579513141 +0000 UTC m=+1237.678267481" lastFinishedPulling="2026-03-08 04:14:05.524557732 +0000 UTC m=+1257.623312092" observedRunningTime="2026-03-08 04:14:07.108994721 +0000 UTC m=+1259.207749071" watchObservedRunningTime="2026-03-08 04:14:07.119581842 +0000 UTC m=+1259.218336192" Mar 08 04:14:07.123864 master-0 kubenswrapper[18592]: I0308 04:14:07.123213 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-mtlx8" 
event={"ID":"2723e73b-52d4-4d1a-9778-2c14ffa390c2","Type":"ContainerStarted","Data":"94104579dc4a5efff0ea97c3049db5f7f0dbc7b67de42f389ca4e9ae803e8395"} Mar 08 04:14:07.124251 master-0 kubenswrapper[18592]: I0308 04:14:07.124109 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-mtlx8" Mar 08 04:14:07.151483 master-0 kubenswrapper[18592]: I0308 04:14:07.150043 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-hq9nx" event={"ID":"ee358d70-7ec7-4e97-8e8c-b9144c6f1bff","Type":"ContainerStarted","Data":"35267c27895de3c0c62014a99cce20e463234cb90b503aacba5be82e21a9c97e"} Mar 08 04:14:07.151483 master-0 kubenswrapper[18592]: I0308 04:14:07.151428 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-hq9nx" Mar 08 04:14:07.186859 master-0 kubenswrapper[18592]: I0308 04:14:07.176133 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-6w8p8" event={"ID":"2fa88bb5-6717-4aeb-b4b5-fb838c711a9a","Type":"ContainerStarted","Data":"c1037835c75d5237827cb3bbee0ee3b37088d94402b0f333a1d4929f833fadde"} Mar 08 04:14:07.186859 master-0 kubenswrapper[18592]: I0308 04:14:07.177371 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-6w8p8" Mar 08 04:14:07.186859 master-0 kubenswrapper[18592]: I0308 04:14:07.185295 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cf849j" event={"ID":"156ae5e7-80bd-4788-9fbe-94fb9a036161","Type":"ContainerStarted","Data":"5e364afb18de57fce626fc8a0117a408f1783a924aa4b626397b47fd418ff15e"} Mar 08 04:14:07.204592 master-0 kubenswrapper[18592]: I0308 04:14:07.204507 18592 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6bbb7bcd9-l7q5f" event={"ID":"09a67a89-922f-4e74-8621-27fbfb770f0b","Type":"ContainerStarted","Data":"a0d3fa646f99d07b57c626b2f772bdf96b37aae7cdfe8bf14c57f7a505f36a1b"} Mar 08 04:14:07.204592 master-0 kubenswrapper[18592]: I0308 04:14:07.204558 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-6bbb7bcd9-l7q5f" Mar 08 04:14:07.219592 master-0 kubenswrapper[18592]: I0308 04:14:07.218157 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-lpjvx" event={"ID":"50e962ee-4845-422a-948b-05e4f5b1097a","Type":"ContainerStarted","Data":"4c4a3d8938e162e9439a43c0dd9b82112812c322099ee63f6764116174687da2"} Mar 08 04:14:07.219592 master-0 kubenswrapper[18592]: I0308 04:14:07.219459 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-lpjvx" Mar 08 04:14:07.231596 master-0 kubenswrapper[18592]: I0308 04:14:07.231309 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-j8xwt" event={"ID":"73cadee4-f498-45a6-93be-258e903c9a53","Type":"ContainerStarted","Data":"e371cce93d283b85d3f1e82dd4dc60d88a383711b6fdf4f05691b679d866d9b5"} Mar 08 04:14:07.259629 master-0 kubenswrapper[18592]: I0308 04:14:07.259556 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-jtbw5" event={"ID":"eaa663c3-d979-4bf2-aa45-def619616016","Type":"ContainerStarted","Data":"d2e7f12c701f81832941266b00e73833bc6ab38f648ff2a12195ff544a1dedf2"} Mar 08 04:14:07.260629 master-0 kubenswrapper[18592]: I0308 04:14:07.260595 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-jtbw5" Mar 08 04:14:07.291859 master-0 kubenswrapper[18592]: I0308 04:14:07.291753 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-585d849c57-sh9z8" event={"ID":"f20011d2-eebe-44f5-a184-a79992c664ff","Type":"ContainerStarted","Data":"62814b8ed1ffe0f782b52475b0b4fbe1318ce6ff139e1831e5e757295bdd5bb8"} Mar 08 04:14:07.293021 master-0 kubenswrapper[18592]: I0308 04:14:07.292915 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-585d849c57-sh9z8" Mar 08 04:14:07.310243 master-0 kubenswrapper[18592]: I0308 04:14:07.309405 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-sbvfq" event={"ID":"6f82adce-b4bf-4a3e-a31e-c3c33c669501","Type":"ContainerStarted","Data":"33ed99dc70b5b1d725fedb47a5f2ee83bbc1e0cc219066876dc8333ff1ae99b1"} Mar 08 04:14:07.310243 master-0 kubenswrapper[18592]: I0308 04:14:07.310244 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-sbvfq" Mar 08 04:14:07.311664 master-0 kubenswrapper[18592]: I0308 04:14:07.311593 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-67d996989d-cgc7b" event={"ID":"9b73e51b-5898-4d03-b81e-977e96f47ae5","Type":"ContainerStarted","Data":"8f7a13e0d307518bccbfd781a195b401a3af4bda7c4f94519113a2835481da2f"} Mar 08 04:14:07.312216 master-0 kubenswrapper[18592]: I0308 04:14:07.312157 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-67d996989d-cgc7b" Mar 08 04:14:07.352245 master-0 kubenswrapper[18592]: I0308 04:14:07.352177 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-27xxz" event={"ID":"64afb73a-1b22-4506-98d4-98d7ddf38a93","Type":"ContainerStarted","Data":"6b02a5589915299d4b401288cf627acc986fadebccceab0f1516833807494a83"} Mar 08 04:14:07.354043 master-0 kubenswrapper[18592]: I0308 04:14:07.352950 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-27xxz" Mar 08 04:14:07.364337 master-0 kubenswrapper[18592]: I0308 04:14:07.363929 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wgmst" event={"ID":"b0a44319-b717-43ff-9799-15d4191de116","Type":"ContainerStarted","Data":"011406d1d98138d77b16ebe2f1e2cc16ffba416b0452807b8e0eaf881fdf18dd"} Mar 08 04:14:07.364718 master-0 kubenswrapper[18592]: I0308 04:14:07.364551 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wgmst" Mar 08 04:14:07.371431 master-0 kubenswrapper[18592]: I0308 04:14:07.370725 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-nmjrc" event={"ID":"7a78dbb5-7df9-47bc-b7b1-a0ea7d42f8c3","Type":"ContainerStarted","Data":"28ed6f12f7ce62c54679380a4fecf5c65ebf0876080981731052153c0528ca37"} Mar 08 04:14:07.372803 master-0 kubenswrapper[18592]: I0308 04:14:07.371813 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-nmjrc" Mar 08 04:14:07.372982 master-0 kubenswrapper[18592]: I0308 04:14:07.372901 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-rtchq" event={"ID":"64139e00-f7e0-4225-9ca7-0942e38b5455","Type":"ContainerStarted","Data":"7f0039fbd5e49874f33931d5fa52b60d754967a45ab818324937ca5e09ef80f2"} Mar 
08 04:14:07.373346 master-0 kubenswrapper[18592]: I0308 04:14:07.373297 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-rtchq" Mar 08 04:14:07.374303 master-0 kubenswrapper[18592]: I0308 04:14:07.374247 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-fr47r" event={"ID":"d56131ac-d36a-4ee0-b997-ee7640a1fd15","Type":"ContainerStarted","Data":"bbda1fc3949b62bc27cc56c4dd443836a01f24edb64c8e73072b957752dd4b9c"} Mar 08 04:14:07.375702 master-0 kubenswrapper[18592]: I0308 04:14:07.375500 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-6jmjm" event={"ID":"393711cd-0c1e-4052-89b0-498e0e2a09ba","Type":"ContainerStarted","Data":"8e9179eac42cf48dd26be07907a9d25948e3a423e1e4d5339e427d999cb7c2a4"} Mar 08 04:14:07.376115 master-0 kubenswrapper[18592]: I0308 04:14:07.376037 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-6jmjm" Mar 08 04:14:07.459867 master-0 kubenswrapper[18592]: I0308 04:14:07.457233 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-xk2sd" podStartSLOduration=5.458456454 podStartE2EDuration="25.457208631s" podCreationTimestamp="2026-03-08 04:13:42 +0000 UTC" firstStartedPulling="2026-03-08 04:13:45.580043586 +0000 UTC m=+1237.678797936" lastFinishedPulling="2026-03-08 04:14:05.578795753 +0000 UTC m=+1257.677550113" observedRunningTime="2026-03-08 04:14:07.191453031 +0000 UTC m=+1259.290207381" watchObservedRunningTime="2026-03-08 04:14:07.457208631 +0000 UTC m=+1259.555962981" Mar 08 04:14:07.842854 master-0 kubenswrapper[18592]: I0308 04:14:07.842727 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/heat-operator-controller-manager-cf99c678f-6w8p8" podStartSLOduration=4.456434354 podStartE2EDuration="25.842692991s" podCreationTimestamp="2026-03-08 04:13:42 +0000 UTC" firstStartedPulling="2026-03-08 04:13:44.123080022 +0000 UTC m=+1236.221834372" lastFinishedPulling="2026-03-08 04:14:05.509338629 +0000 UTC m=+1257.608093009" observedRunningTime="2026-03-08 04:14:07.45494257 +0000 UTC m=+1259.553696930" watchObservedRunningTime="2026-03-08 04:14:07.842692991 +0000 UTC m=+1259.941447341" Mar 08 04:14:07.848609 master-0 kubenswrapper[18592]: I0308 04:14:07.848549 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-mtlx8" podStartSLOduration=5.496052011 podStartE2EDuration="25.848531536s" podCreationTimestamp="2026-03-08 04:13:42 +0000 UTC" firstStartedPulling="2026-03-08 04:13:45.240606548 +0000 UTC m=+1237.339360898" lastFinishedPulling="2026-03-08 04:14:05.593086083 +0000 UTC m=+1257.691840423" observedRunningTime="2026-03-08 04:14:07.829152711 +0000 UTC m=+1259.927907061" watchObservedRunningTime="2026-03-08 04:14:07.848531536 +0000 UTC m=+1259.947285886" Mar 08 04:14:08.385817 master-0 kubenswrapper[18592]: I0308 04:14:08.385729 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6bbb7bcd9-l7q5f" event={"ID":"09a67a89-922f-4e74-8621-27fbfb770f0b","Type":"ContainerStarted","Data":"84c8966e3e56913e1eb8b8f99233f13a5d7606113211f9a1420fecdeb4253172"} Mar 08 04:14:08.820487 master-0 kubenswrapper[18592]: I0308 04:14:08.820366 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-hq9nx" podStartSLOduration=4.901838118 podStartE2EDuration="26.820347371s" podCreationTimestamp="2026-03-08 04:13:42 +0000 UTC" firstStartedPulling="2026-03-08 04:13:43.67495099 +0000 UTC m=+1235.773705340" 
lastFinishedPulling="2026-03-08 04:14:05.593460233 +0000 UTC m=+1257.692214593" observedRunningTime="2026-03-08 04:14:08.814246098 +0000 UTC m=+1260.913000448" watchObservedRunningTime="2026-03-08 04:14:08.820347371 +0000 UTC m=+1260.919101711" Mar 08 04:14:10.455565 master-0 kubenswrapper[18592]: I0308 04:14:10.455486 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-6jmjm" podStartSLOduration=7.74757712 podStartE2EDuration="28.455463155s" podCreationTimestamp="2026-03-08 04:13:42 +0000 UTC" firstStartedPulling="2026-03-08 04:13:44.800550949 +0000 UTC m=+1236.899305289" lastFinishedPulling="2026-03-08 04:14:05.508436944 +0000 UTC m=+1257.607191324" observedRunningTime="2026-03-08 04:14:10.45337337 +0000 UTC m=+1262.552127740" watchObservedRunningTime="2026-03-08 04:14:10.455463155 +0000 UTC m=+1262.554217525" Mar 08 04:14:10.459427 master-0 kubenswrapper[18592]: I0308 04:14:10.459381 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-rtchq" podStartSLOduration=7.696084722 podStartE2EDuration="28.459370879s" podCreationTimestamp="2026-03-08 04:13:42 +0000 UTC" firstStartedPulling="2026-03-08 04:13:44.761695916 +0000 UTC m=+1236.860450266" lastFinishedPulling="2026-03-08 04:14:05.524982063 +0000 UTC m=+1257.623736423" observedRunningTime="2026-03-08 04:14:09.206335103 +0000 UTC m=+1261.305089493" watchObservedRunningTime="2026-03-08 04:14:10.459370879 +0000 UTC m=+1262.558125239" Mar 08 04:14:10.700778 master-0 kubenswrapper[18592]: I0308 04:14:10.700678 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-67d996989d-cgc7b" podStartSLOduration=7.935545472 podStartE2EDuration="28.700649698s" podCreationTimestamp="2026-03-08 04:13:42 +0000 UTC" firstStartedPulling="2026-03-08 04:13:44.744017037 +0000 UTC 
m=+1236.842771387" lastFinishedPulling="2026-03-08 04:14:05.509121273 +0000 UTC m=+1257.607875613" observedRunningTime="2026-03-08 04:14:10.683298177 +0000 UTC m=+1262.782052607" watchObservedRunningTime="2026-03-08 04:14:10.700649698 +0000 UTC m=+1262.799404088" Mar 08 04:14:10.727328 master-0 kubenswrapper[18592]: I0308 04:14:10.727022 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-fr47r" podStartSLOduration=7.724968553 podStartE2EDuration="27.727002978s" podCreationTimestamp="2026-03-08 04:13:43 +0000 UTC" firstStartedPulling="2026-03-08 04:13:45.594265913 +0000 UTC m=+1237.693020263" lastFinishedPulling="2026-03-08 04:14:05.596300298 +0000 UTC m=+1257.695054688" observedRunningTime="2026-03-08 04:14:10.713214261 +0000 UTC m=+1262.811968631" watchObservedRunningTime="2026-03-08 04:14:10.727002978 +0000 UTC m=+1262.825757338" Mar 08 04:14:10.750096 master-0 kubenswrapper[18592]: I0308 04:14:10.749916 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-585d849c57-sh9z8" podStartSLOduration=8.094542475 podStartE2EDuration="28.749891845s" podCreationTimestamp="2026-03-08 04:13:42 +0000 UTC" firstStartedPulling="2026-03-08 04:13:44.281964543 +0000 UTC m=+1236.380718893" lastFinishedPulling="2026-03-08 04:14:04.937313913 +0000 UTC m=+1257.036068263" observedRunningTime="2026-03-08 04:14:10.734286381 +0000 UTC m=+1262.833040771" watchObservedRunningTime="2026-03-08 04:14:10.749891845 +0000 UTC m=+1262.848646205" Mar 08 04:14:10.825845 master-0 kubenswrapper[18592]: I0308 04:14:10.823610 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wgmst" podStartSLOduration=8.011162912 podStartE2EDuration="28.823561193s" podCreationTimestamp="2026-03-08 04:13:42 +0000 UTC" firstStartedPulling="2026-03-08 
04:13:44.797097467 +0000 UTC m=+1236.895851817" lastFinishedPulling="2026-03-08 04:14:05.609495738 +0000 UTC m=+1257.708250098" observedRunningTime="2026-03-08 04:14:10.81252613 +0000 UTC m=+1262.911280500" watchObservedRunningTime="2026-03-08 04:14:10.823561193 +0000 UTC m=+1262.922315553" Mar 08 04:14:10.846673 master-0 kubenswrapper[18592]: I0308 04:14:10.846599 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-jtbw5" podStartSLOduration=6.962072804 podStartE2EDuration="28.846581924s" podCreationTimestamp="2026-03-08 04:13:42 +0000 UTC" firstStartedPulling="2026-03-08 04:13:43.639466397 +0000 UTC m=+1235.738220747" lastFinishedPulling="2026-03-08 04:14:05.523975517 +0000 UTC m=+1257.622729867" observedRunningTime="2026-03-08 04:14:10.839379433 +0000 UTC m=+1262.938133783" watchObservedRunningTime="2026-03-08 04:14:10.846581924 +0000 UTC m=+1262.945336274" Mar 08 04:14:10.864900 master-0 kubenswrapper[18592]: I0308 04:14:10.864590 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-nmjrc" podStartSLOduration=7.329767021 podStartE2EDuration="28.864570822s" podCreationTimestamp="2026-03-08 04:13:42 +0000 UTC" firstStartedPulling="2026-03-08 04:13:44.042614516 +0000 UTC m=+1236.141368866" lastFinishedPulling="2026-03-08 04:14:05.577418287 +0000 UTC m=+1257.676172667" observedRunningTime="2026-03-08 04:14:10.860692709 +0000 UTC m=+1262.959447059" watchObservedRunningTime="2026-03-08 04:14:10.864570822 +0000 UTC m=+1262.963325182" Mar 08 04:14:10.901567 master-0 kubenswrapper[18592]: I0308 04:14:10.901487 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-6bbb7bcd9-l7q5f" podStartSLOduration=27.901467722 podStartE2EDuration="27.901467722s" podCreationTimestamp="2026-03-08 04:13:43 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 04:14:10.900305661 +0000 UTC m=+1262.999060031" watchObservedRunningTime="2026-03-08 04:14:10.901467722 +0000 UTC m=+1263.000222072" Mar 08 04:14:10.928804 master-0 kubenswrapper[18592]: I0308 04:14:10.928721 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-lpjvx" podStartSLOduration=8.644829542 podStartE2EDuration="28.928703855s" podCreationTimestamp="2026-03-08 04:13:42 +0000 UTC" firstStartedPulling="2026-03-08 04:13:45.240578357 +0000 UTC m=+1237.339332707" lastFinishedPulling="2026-03-08 04:14:05.52445263 +0000 UTC m=+1257.623207020" observedRunningTime="2026-03-08 04:14:10.927691919 +0000 UTC m=+1263.026446269" watchObservedRunningTime="2026-03-08 04:14:10.928703855 +0000 UTC m=+1263.027458205" Mar 08 04:14:10.963137 master-0 kubenswrapper[18592]: I0308 04:14:10.963041 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-27xxz" podStartSLOduration=7.747816924 podStartE2EDuration="28.963017857s" podCreationTimestamp="2026-03-08 04:13:42 +0000 UTC" firstStartedPulling="2026-03-08 04:13:44.293951411 +0000 UTC m=+1236.392705761" lastFinishedPulling="2026-03-08 04:14:05.509152304 +0000 UTC m=+1257.607906694" observedRunningTime="2026-03-08 04:14:10.949415696 +0000 UTC m=+1263.048170056" watchObservedRunningTime="2026-03-08 04:14:10.963017857 +0000 UTC m=+1263.061772207" Mar 08 04:14:10.979904 master-0 kubenswrapper[18592]: I0308 04:14:10.979371 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-sbvfq" podStartSLOduration=8.621821212 podStartE2EDuration="28.979354191s" podCreationTimestamp="2026-03-08 04:13:42 +0000 UTC" 
firstStartedPulling="2026-03-08 04:13:45.231766693 +0000 UTC m=+1237.330521043" lastFinishedPulling="2026-03-08 04:14:05.589299632 +0000 UTC m=+1257.688054022" observedRunningTime="2026-03-08 04:14:10.97366454 +0000 UTC m=+1263.072418890" watchObservedRunningTime="2026-03-08 04:14:10.979354191 +0000 UTC m=+1263.078108541" Mar 08 04:14:11.422637 master-0 kubenswrapper[18592]: I0308 04:14:11.422459 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cf849j" event={"ID":"156ae5e7-80bd-4788-9fbe-94fb9a036161","Type":"ContainerStarted","Data":"d5f59fdf709ffb1b3c946985a5cd77cf67bf28cf74e0ae9a3d892893e825cdd1"} Mar 08 04:14:11.423005 master-0 kubenswrapper[18592]: I0308 04:14:11.422972 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cf849j" Mar 08 04:14:11.451104 master-0 kubenswrapper[18592]: I0308 04:14:11.450978 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cf849j" podStartSLOduration=24.914963998 podStartE2EDuration="29.450955039s" podCreationTimestamp="2026-03-08 04:13:42 +0000 UTC" firstStartedPulling="2026-03-08 04:14:06.216687028 +0000 UTC m=+1258.315441378" lastFinishedPulling="2026-03-08 04:14:10.752678049 +0000 UTC m=+1262.851432419" observedRunningTime="2026-03-08 04:14:11.447429194 +0000 UTC m=+1263.546183584" watchObservedRunningTime="2026-03-08 04:14:11.450955039 +0000 UTC m=+1263.549709429" Mar 08 04:14:12.718152 master-0 kubenswrapper[18592]: I0308 04:14:12.718061 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-j8xwt" Mar 08 04:14:12.821238 master-0 kubenswrapper[18592]: I0308 04:14:12.820968 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-hq9nx" Mar 08 04:14:12.821614 master-0 kubenswrapper[18592]: I0308 04:14:12.821582 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-jtbw5" Mar 08 04:14:12.918336 master-0 kubenswrapper[18592]: I0308 04:14:12.918252 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-nmjrc" Mar 08 04:14:13.256895 master-0 kubenswrapper[18592]: I0308 04:14:13.256780 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-6w8p8" Mar 08 04:14:13.548590 master-0 kubenswrapper[18592]: I0308 04:14:13.548459 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-27xxz" Mar 08 04:14:13.627588 master-0 kubenswrapper[18592]: I0308 04:14:13.627523 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-585d849c57-sh9z8" Mar 08 04:14:13.689928 master-0 kubenswrapper[18592]: I0308 04:14:13.689868 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wgmst" Mar 08 04:14:13.694425 master-0 kubenswrapper[18592]: I0308 04:14:13.694386 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-67d996989d-cgc7b" Mar 08 04:14:13.756317 master-0 kubenswrapper[18592]: I0308 04:14:13.756256 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-rtchq" Mar 08 04:14:13.765642 master-0 kubenswrapper[18592]: I0308 
04:14:13.765538 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-54688575f-ssj26" Mar 08 04:14:13.818925 master-0 kubenswrapper[18592]: I0308 04:14:13.818514 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-6jmjm" Mar 08 04:14:13.822154 master-0 kubenswrapper[18592]: I0308 04:14:13.819574 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-llvv7" Mar 08 04:14:13.929538 master-0 kubenswrapper[18592]: I0308 04:14:13.929502 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-mtlx8" Mar 08 04:14:14.073504 master-0 kubenswrapper[18592]: I0308 04:14:14.073338 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-lpjvx" Mar 08 04:14:14.088148 master-0 kubenswrapper[18592]: I0308 04:14:14.088091 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-c4wjd" Mar 08 04:14:14.126533 master-0 kubenswrapper[18592]: I0308 04:14:14.126466 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-sbvfq" Mar 08 04:14:14.219813 master-0 kubenswrapper[18592]: I0308 04:14:14.219729 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-xk2sd" Mar 08 04:14:14.256152 master-0 kubenswrapper[18592]: I0308 04:14:14.256100 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-rhdgk" 
Mar 08 04:14:15.180138 master-0 kubenswrapper[18592]: I0308 04:14:15.180020 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/42c8b424-b12a-48b3-94e7-22ba1d126949-cert\") pod \"infra-operator-controller-manager-b8c8d7cc8-7vpk7\" (UID: \"42c8b424-b12a-48b3-94e7-22ba1d126949\") " pod="openstack-operators/infra-operator-controller-manager-b8c8d7cc8-7vpk7" Mar 08 04:14:15.186898 master-0 kubenswrapper[18592]: I0308 04:14:15.183566 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/42c8b424-b12a-48b3-94e7-22ba1d126949-cert\") pod \"infra-operator-controller-manager-b8c8d7cc8-7vpk7\" (UID: \"42c8b424-b12a-48b3-94e7-22ba1d126949\") " pod="openstack-operators/infra-operator-controller-manager-b8c8d7cc8-7vpk7" Mar 08 04:14:15.406415 master-0 kubenswrapper[18592]: I0308 04:14:15.406336 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-b8c8d7cc8-7vpk7" Mar 08 04:14:15.916693 master-0 kubenswrapper[18592]: I0308 04:14:15.916589 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-b8c8d7cc8-7vpk7"] Mar 08 04:14:16.483588 master-0 kubenswrapper[18592]: I0308 04:14:16.483519 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-b8c8d7cc8-7vpk7" event={"ID":"42c8b424-b12a-48b3-94e7-22ba1d126949","Type":"ContainerStarted","Data":"88abda43f5a46f4264eac6b492811e73fc1275d03da120856a7d62be82d14425"} Mar 08 04:14:19.470520 master-0 kubenswrapper[18592]: I0308 04:14:19.470406 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cf849j" Mar 08 04:14:19.532386 master-0 kubenswrapper[18592]: I0308 04:14:19.532298 18592 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-b8c8d7cc8-7vpk7" event={"ID":"42c8b424-b12a-48b3-94e7-22ba1d126949","Type":"ContainerStarted","Data":"2e45686f75c754d2d9ee93f835b6e5b54fa410d32fba9c5ffc886494fa6044d8"} Mar 08 04:14:19.532679 master-0 kubenswrapper[18592]: I0308 04:14:19.532634 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-b8c8d7cc8-7vpk7" Mar 08 04:14:19.623346 master-0 kubenswrapper[18592]: I0308 04:14:19.622589 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-b8c8d7cc8-7vpk7" podStartSLOduration=35.142386092 podStartE2EDuration="37.622564075s" podCreationTimestamp="2026-03-08 04:13:42 +0000 UTC" firstStartedPulling="2026-03-08 04:14:15.928328326 +0000 UTC m=+1268.027082716" lastFinishedPulling="2026-03-08 04:14:18.408506309 +0000 UTC m=+1270.507260699" observedRunningTime="2026-03-08 04:14:19.583016137 +0000 UTC m=+1271.681770497" watchObservedRunningTime="2026-03-08 04:14:19.622564075 +0000 UTC m=+1271.721318415" Mar 08 04:14:19.868882 master-0 kubenswrapper[18592]: I0308 04:14:19.868727 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-6bbb7bcd9-l7q5f" Mar 08 04:14:25.412788 master-0 kubenswrapper[18592]: I0308 04:14:25.412690 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-b8c8d7cc8-7vpk7" Mar 08 04:15:07.391985 master-0 kubenswrapper[18592]: I0308 04:15:07.391901 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55994974c5-gb4wt"] Mar 08 04:15:07.393529 master-0 kubenswrapper[18592]: I0308 04:15:07.393492 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55994974c5-gb4wt" Mar 08 04:15:07.409179 master-0 kubenswrapper[18592]: I0308 04:15:07.409128 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 08 04:15:07.409618 master-0 kubenswrapper[18592]: I0308 04:15:07.409589 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 08 04:15:07.409753 master-0 kubenswrapper[18592]: I0308 04:15:07.409727 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 08 04:15:07.511151 master-0 kubenswrapper[18592]: I0308 04:15:07.506743 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88pwx\" (UniqueName: \"kubernetes.io/projected/651222ab-0f62-459d-b8a2-4897537cb868-kube-api-access-88pwx\") pod \"dnsmasq-dns-55994974c5-gb4wt\" (UID: \"651222ab-0f62-459d-b8a2-4897537cb868\") " pod="openstack/dnsmasq-dns-55994974c5-gb4wt" Mar 08 04:15:07.511151 master-0 kubenswrapper[18592]: I0308 04:15:07.506838 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/651222ab-0f62-459d-b8a2-4897537cb868-config\") pod \"dnsmasq-dns-55994974c5-gb4wt\" (UID: \"651222ab-0f62-459d-b8a2-4897537cb868\") " pod="openstack/dnsmasq-dns-55994974c5-gb4wt" Mar 08 04:15:07.526082 master-0 kubenswrapper[18592]: I0308 04:15:07.526015 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55994974c5-gb4wt"] Mar 08 04:15:07.584758 master-0 kubenswrapper[18592]: I0308 04:15:07.584702 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d859fb5df-tch88"] Mar 08 04:15:07.603107 master-0 kubenswrapper[18592]: I0308 04:15:07.603052 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d859fb5df-tch88" Mar 08 04:15:07.603318 master-0 kubenswrapper[18592]: I0308 04:15:07.603264 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d859fb5df-tch88"] Mar 08 04:15:07.605092 master-0 kubenswrapper[18592]: I0308 04:15:07.605067 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 08 04:15:07.614812 master-0 kubenswrapper[18592]: I0308 04:15:07.614704 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88pwx\" (UniqueName: \"kubernetes.io/projected/651222ab-0f62-459d-b8a2-4897537cb868-kube-api-access-88pwx\") pod \"dnsmasq-dns-55994974c5-gb4wt\" (UID: \"651222ab-0f62-459d-b8a2-4897537cb868\") " pod="openstack/dnsmasq-dns-55994974c5-gb4wt" Mar 08 04:15:07.614812 master-0 kubenswrapper[18592]: I0308 04:15:07.614793 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/651222ab-0f62-459d-b8a2-4897537cb868-config\") pod \"dnsmasq-dns-55994974c5-gb4wt\" (UID: \"651222ab-0f62-459d-b8a2-4897537cb868\") " pod="openstack/dnsmasq-dns-55994974c5-gb4wt" Mar 08 04:15:07.615733 master-0 kubenswrapper[18592]: I0308 04:15:07.615659 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/651222ab-0f62-459d-b8a2-4897537cb868-config\") pod \"dnsmasq-dns-55994974c5-gb4wt\" (UID: \"651222ab-0f62-459d-b8a2-4897537cb868\") " pod="openstack/dnsmasq-dns-55994974c5-gb4wt" Mar 08 04:15:07.640712 master-0 kubenswrapper[18592]: I0308 04:15:07.640666 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88pwx\" (UniqueName: \"kubernetes.io/projected/651222ab-0f62-459d-b8a2-4897537cb868-kube-api-access-88pwx\") pod \"dnsmasq-dns-55994974c5-gb4wt\" (UID: \"651222ab-0f62-459d-b8a2-4897537cb868\") " 
pod="openstack/dnsmasq-dns-55994974c5-gb4wt" Mar 08 04:15:07.716100 master-0 kubenswrapper[18592]: I0308 04:15:07.716046 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bdsk\" (UniqueName: \"kubernetes.io/projected/a117758c-2170-460f-8db5-251a9ca2006f-kube-api-access-9bdsk\") pod \"dnsmasq-dns-5d859fb5df-tch88\" (UID: \"a117758c-2170-460f-8db5-251a9ca2006f\") " pod="openstack/dnsmasq-dns-5d859fb5df-tch88" Mar 08 04:15:07.716292 master-0 kubenswrapper[18592]: I0308 04:15:07.716235 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a117758c-2170-460f-8db5-251a9ca2006f-dns-svc\") pod \"dnsmasq-dns-5d859fb5df-tch88\" (UID: \"a117758c-2170-460f-8db5-251a9ca2006f\") " pod="openstack/dnsmasq-dns-5d859fb5df-tch88" Mar 08 04:15:07.716408 master-0 kubenswrapper[18592]: I0308 04:15:07.716387 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a117758c-2170-460f-8db5-251a9ca2006f-config\") pod \"dnsmasq-dns-5d859fb5df-tch88\" (UID: \"a117758c-2170-460f-8db5-251a9ca2006f\") " pod="openstack/dnsmasq-dns-5d859fb5df-tch88" Mar 08 04:15:07.775186 master-0 kubenswrapper[18592]: I0308 04:15:07.775139 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55994974c5-gb4wt" Mar 08 04:15:07.818567 master-0 kubenswrapper[18592]: I0308 04:15:07.818522 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bdsk\" (UniqueName: \"kubernetes.io/projected/a117758c-2170-460f-8db5-251a9ca2006f-kube-api-access-9bdsk\") pod \"dnsmasq-dns-5d859fb5df-tch88\" (UID: \"a117758c-2170-460f-8db5-251a9ca2006f\") " pod="openstack/dnsmasq-dns-5d859fb5df-tch88" Mar 08 04:15:07.818740 master-0 kubenswrapper[18592]: I0308 04:15:07.818610 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a117758c-2170-460f-8db5-251a9ca2006f-dns-svc\") pod \"dnsmasq-dns-5d859fb5df-tch88\" (UID: \"a117758c-2170-460f-8db5-251a9ca2006f\") " pod="openstack/dnsmasq-dns-5d859fb5df-tch88" Mar 08 04:15:07.818740 master-0 kubenswrapper[18592]: I0308 04:15:07.818653 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a117758c-2170-460f-8db5-251a9ca2006f-config\") pod \"dnsmasq-dns-5d859fb5df-tch88\" (UID: \"a117758c-2170-460f-8db5-251a9ca2006f\") " pod="openstack/dnsmasq-dns-5d859fb5df-tch88" Mar 08 04:15:07.819430 master-0 kubenswrapper[18592]: I0308 04:15:07.819404 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a117758c-2170-460f-8db5-251a9ca2006f-config\") pod \"dnsmasq-dns-5d859fb5df-tch88\" (UID: \"a117758c-2170-460f-8db5-251a9ca2006f\") " pod="openstack/dnsmasq-dns-5d859fb5df-tch88" Mar 08 04:15:07.819671 master-0 kubenswrapper[18592]: I0308 04:15:07.819654 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a117758c-2170-460f-8db5-251a9ca2006f-dns-svc\") pod \"dnsmasq-dns-5d859fb5df-tch88\" (UID: \"a117758c-2170-460f-8db5-251a9ca2006f\") " 
pod="openstack/dnsmasq-dns-5d859fb5df-tch88" Mar 08 04:15:07.836975 master-0 kubenswrapper[18592]: I0308 04:15:07.836243 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bdsk\" (UniqueName: \"kubernetes.io/projected/a117758c-2170-460f-8db5-251a9ca2006f-kube-api-access-9bdsk\") pod \"dnsmasq-dns-5d859fb5df-tch88\" (UID: \"a117758c-2170-460f-8db5-251a9ca2006f\") " pod="openstack/dnsmasq-dns-5d859fb5df-tch88" Mar 08 04:15:07.929162 master-0 kubenswrapper[18592]: I0308 04:15:07.928739 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d859fb5df-tch88" Mar 08 04:15:08.245869 master-0 kubenswrapper[18592]: I0308 04:15:08.245127 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55994974c5-gb4wt"] Mar 08 04:15:08.432936 master-0 kubenswrapper[18592]: I0308 04:15:08.432874 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d859fb5df-tch88"] Mar 08 04:15:09.237928 master-0 kubenswrapper[18592]: I0308 04:15:09.237837 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55994974c5-gb4wt" event={"ID":"651222ab-0f62-459d-b8a2-4897537cb868","Type":"ContainerStarted","Data":"42b4d92ca29aea50e5c5ed37429e2638d90231bc42aacbfceb4df8a00d38bf35"} Mar 08 04:15:09.239428 master-0 kubenswrapper[18592]: I0308 04:15:09.239397 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d859fb5df-tch88" event={"ID":"a117758c-2170-460f-8db5-251a9ca2006f","Type":"ContainerStarted","Data":"dc725511c655099fc1670c36fc1ea7f438387a8654ab97a1de9d5a4f0165605a"} Mar 08 04:15:10.105307 master-0 kubenswrapper[18592]: I0308 04:15:10.104999 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55994974c5-gb4wt"] Mar 08 04:15:10.225927 master-0 kubenswrapper[18592]: I0308 04:15:10.225688 18592 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-6877bbfb4f-ht5gv"] Mar 08 04:15:10.227686 master-0 kubenswrapper[18592]: I0308 04:15:10.227668 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6877bbfb4f-ht5gv"] Mar 08 04:15:10.227849 master-0 kubenswrapper[18592]: I0308 04:15:10.227836 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6877bbfb4f-ht5gv" Mar 08 04:15:10.287033 master-0 kubenswrapper[18592]: I0308 04:15:10.286850 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfc42b5c-d187-4d48-8c06-f13c9eedeb0f-config\") pod \"dnsmasq-dns-6877bbfb4f-ht5gv\" (UID: \"bfc42b5c-d187-4d48-8c06-f13c9eedeb0f\") " pod="openstack/dnsmasq-dns-6877bbfb4f-ht5gv" Mar 08 04:15:10.287430 master-0 kubenswrapper[18592]: I0308 04:15:10.287383 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bfc42b5c-d187-4d48-8c06-f13c9eedeb0f-dns-svc\") pod \"dnsmasq-dns-6877bbfb4f-ht5gv\" (UID: \"bfc42b5c-d187-4d48-8c06-f13c9eedeb0f\") " pod="openstack/dnsmasq-dns-6877bbfb4f-ht5gv" Mar 08 04:15:10.287914 master-0 kubenswrapper[18592]: I0308 04:15:10.287891 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8wc7\" (UniqueName: \"kubernetes.io/projected/bfc42b5c-d187-4d48-8c06-f13c9eedeb0f-kube-api-access-l8wc7\") pod \"dnsmasq-dns-6877bbfb4f-ht5gv\" (UID: \"bfc42b5c-d187-4d48-8c06-f13c9eedeb0f\") " pod="openstack/dnsmasq-dns-6877bbfb4f-ht5gv" Mar 08 04:15:10.389682 master-0 kubenswrapper[18592]: I0308 04:15:10.389618 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfc42b5c-d187-4d48-8c06-f13c9eedeb0f-config\") pod \"dnsmasq-dns-6877bbfb4f-ht5gv\" (UID: 
\"bfc42b5c-d187-4d48-8c06-f13c9eedeb0f\") " pod="openstack/dnsmasq-dns-6877bbfb4f-ht5gv" Mar 08 04:15:10.390501 master-0 kubenswrapper[18592]: I0308 04:15:10.390449 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfc42b5c-d187-4d48-8c06-f13c9eedeb0f-config\") pod \"dnsmasq-dns-6877bbfb4f-ht5gv\" (UID: \"bfc42b5c-d187-4d48-8c06-f13c9eedeb0f\") " pod="openstack/dnsmasq-dns-6877bbfb4f-ht5gv" Mar 08 04:15:10.390551 master-0 kubenswrapper[18592]: I0308 04:15:10.390517 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bfc42b5c-d187-4d48-8c06-f13c9eedeb0f-dns-svc\") pod \"dnsmasq-dns-6877bbfb4f-ht5gv\" (UID: \"bfc42b5c-d187-4d48-8c06-f13c9eedeb0f\") " pod="openstack/dnsmasq-dns-6877bbfb4f-ht5gv" Mar 08 04:15:10.390799 master-0 kubenswrapper[18592]: I0308 04:15:10.390775 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8wc7\" (UniqueName: \"kubernetes.io/projected/bfc42b5c-d187-4d48-8c06-f13c9eedeb0f-kube-api-access-l8wc7\") pod \"dnsmasq-dns-6877bbfb4f-ht5gv\" (UID: \"bfc42b5c-d187-4d48-8c06-f13c9eedeb0f\") " pod="openstack/dnsmasq-dns-6877bbfb4f-ht5gv" Mar 08 04:15:10.391700 master-0 kubenswrapper[18592]: I0308 04:15:10.391670 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bfc42b5c-d187-4d48-8c06-f13c9eedeb0f-dns-svc\") pod \"dnsmasq-dns-6877bbfb4f-ht5gv\" (UID: \"bfc42b5c-d187-4d48-8c06-f13c9eedeb0f\") " pod="openstack/dnsmasq-dns-6877bbfb4f-ht5gv" Mar 08 04:15:11.118217 master-0 kubenswrapper[18592]: I0308 04:15:11.117719 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8wc7\" (UniqueName: \"kubernetes.io/projected/bfc42b5c-d187-4d48-8c06-f13c9eedeb0f-kube-api-access-l8wc7\") pod \"dnsmasq-dns-6877bbfb4f-ht5gv\" (UID: 
\"bfc42b5c-d187-4d48-8c06-f13c9eedeb0f\") " pod="openstack/dnsmasq-dns-6877bbfb4f-ht5gv" Mar 08 04:15:11.145804 master-0 kubenswrapper[18592]: I0308 04:15:11.145750 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6877bbfb4f-ht5gv" Mar 08 04:15:11.541891 master-0 kubenswrapper[18592]: I0308 04:15:11.541119 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d859fb5df-tch88"] Mar 08 04:15:11.607846 master-0 kubenswrapper[18592]: I0308 04:15:11.603283 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6f75dd7cd9-2zs4f"] Mar 08 04:15:11.607846 master-0 kubenswrapper[18592]: I0308 04:15:11.605966 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f75dd7cd9-2zs4f" Mar 08 04:15:11.645897 master-0 kubenswrapper[18592]: I0308 04:15:11.641760 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8ttq\" (UniqueName: \"kubernetes.io/projected/d021c6b4-8118-43d3-a703-8c2f73e6e077-kube-api-access-h8ttq\") pod \"dnsmasq-dns-6f75dd7cd9-2zs4f\" (UID: \"d021c6b4-8118-43d3-a703-8c2f73e6e077\") " pod="openstack/dnsmasq-dns-6f75dd7cd9-2zs4f" Mar 08 04:15:11.645897 master-0 kubenswrapper[18592]: I0308 04:15:11.641975 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d021c6b4-8118-43d3-a703-8c2f73e6e077-config\") pod \"dnsmasq-dns-6f75dd7cd9-2zs4f\" (UID: \"d021c6b4-8118-43d3-a703-8c2f73e6e077\") " pod="openstack/dnsmasq-dns-6f75dd7cd9-2zs4f" Mar 08 04:15:11.645897 master-0 kubenswrapper[18592]: I0308 04:15:11.642035 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d021c6b4-8118-43d3-a703-8c2f73e6e077-dns-svc\") pod \"dnsmasq-dns-6f75dd7cd9-2zs4f\" (UID: 
\"d021c6b4-8118-43d3-a703-8c2f73e6e077\") " pod="openstack/dnsmasq-dns-6f75dd7cd9-2zs4f" Mar 08 04:15:11.675506 master-0 kubenswrapper[18592]: I0308 04:15:11.674617 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f75dd7cd9-2zs4f"] Mar 08 04:15:11.759212 master-0 kubenswrapper[18592]: I0308 04:15:11.758832 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8ttq\" (UniqueName: \"kubernetes.io/projected/d021c6b4-8118-43d3-a703-8c2f73e6e077-kube-api-access-h8ttq\") pod \"dnsmasq-dns-6f75dd7cd9-2zs4f\" (UID: \"d021c6b4-8118-43d3-a703-8c2f73e6e077\") " pod="openstack/dnsmasq-dns-6f75dd7cd9-2zs4f" Mar 08 04:15:11.759691 master-0 kubenswrapper[18592]: I0308 04:15:11.759673 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d021c6b4-8118-43d3-a703-8c2f73e6e077-config\") pod \"dnsmasq-dns-6f75dd7cd9-2zs4f\" (UID: \"d021c6b4-8118-43d3-a703-8c2f73e6e077\") " pod="openstack/dnsmasq-dns-6f75dd7cd9-2zs4f" Mar 08 04:15:11.759843 master-0 kubenswrapper[18592]: I0308 04:15:11.759818 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d021c6b4-8118-43d3-a703-8c2f73e6e077-dns-svc\") pod \"dnsmasq-dns-6f75dd7cd9-2zs4f\" (UID: \"d021c6b4-8118-43d3-a703-8c2f73e6e077\") " pod="openstack/dnsmasq-dns-6f75dd7cd9-2zs4f" Mar 08 04:15:11.760468 master-0 kubenswrapper[18592]: I0308 04:15:11.760430 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d021c6b4-8118-43d3-a703-8c2f73e6e077-config\") pod \"dnsmasq-dns-6f75dd7cd9-2zs4f\" (UID: \"d021c6b4-8118-43d3-a703-8c2f73e6e077\") " pod="openstack/dnsmasq-dns-6f75dd7cd9-2zs4f" Mar 08 04:15:11.764393 master-0 kubenswrapper[18592]: I0308 04:15:11.764348 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d021c6b4-8118-43d3-a703-8c2f73e6e077-dns-svc\") pod \"dnsmasq-dns-6f75dd7cd9-2zs4f\" (UID: \"d021c6b4-8118-43d3-a703-8c2f73e6e077\") " pod="openstack/dnsmasq-dns-6f75dd7cd9-2zs4f" Mar 08 04:15:11.790767 master-0 kubenswrapper[18592]: I0308 04:15:11.790712 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8ttq\" (UniqueName: \"kubernetes.io/projected/d021c6b4-8118-43d3-a703-8c2f73e6e077-kube-api-access-h8ttq\") pod \"dnsmasq-dns-6f75dd7cd9-2zs4f\" (UID: \"d021c6b4-8118-43d3-a703-8c2f73e6e077\") " pod="openstack/dnsmasq-dns-6f75dd7cd9-2zs4f" Mar 08 04:15:11.995906 master-0 kubenswrapper[18592]: I0308 04:15:11.993306 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f75dd7cd9-2zs4f" Mar 08 04:15:14.441060 master-0 kubenswrapper[18592]: I0308 04:15:14.430143 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 08 04:15:14.441060 master-0 kubenswrapper[18592]: I0308 04:15:14.440202 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 08 04:15:14.467124 master-0 kubenswrapper[18592]: I0308 04:15:14.467084 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 08 04:15:14.477063 master-0 kubenswrapper[18592]: I0308 04:15:14.475623 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 08 04:15:14.478407 master-0 kubenswrapper[18592]: I0308 04:15:14.478375 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 08 04:15:14.485420 master-0 kubenswrapper[18592]: I0308 04:15:14.485378 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 08 04:15:14.522853 master-0 kubenswrapper[18592]: I0308 04:15:14.522310 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/d174d635-30c7-4d7d-a077-aa6436a5675a-memcached-tls-certs\") pod \"memcached-0\" (UID: \"d174d635-30c7-4d7d-a077-aa6436a5675a\") " pod="openstack/memcached-0" Mar 08 04:15:14.522853 master-0 kubenswrapper[18592]: I0308 04:15:14.522356 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d174d635-30c7-4d7d-a077-aa6436a5675a-combined-ca-bundle\") pod \"memcached-0\" (UID: \"d174d635-30c7-4d7d-a077-aa6436a5675a\") " pod="openstack/memcached-0" Mar 08 04:15:14.522853 master-0 kubenswrapper[18592]: I0308 04:15:14.522378 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d174d635-30c7-4d7d-a077-aa6436a5675a-config-data\") pod \"memcached-0\" (UID: \"d174d635-30c7-4d7d-a077-aa6436a5675a\") " pod="openstack/memcached-0" Mar 08 04:15:14.522853 master-0 kubenswrapper[18592]: I0308 04:15:14.522401 
18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d174d635-30c7-4d7d-a077-aa6436a5675a-kolla-config\") pod \"memcached-0\" (UID: \"d174d635-30c7-4d7d-a077-aa6436a5675a\") " pod="openstack/memcached-0" Mar 08 04:15:14.522853 master-0 kubenswrapper[18592]: I0308 04:15:14.522417 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4q5wq\" (UniqueName: \"kubernetes.io/projected/d174d635-30c7-4d7d-a077-aa6436a5675a-kube-api-access-4q5wq\") pod \"memcached-0\" (UID: \"d174d635-30c7-4d7d-a077-aa6436a5675a\") " pod="openstack/memcached-0" Mar 08 04:15:14.624151 master-0 kubenswrapper[18592]: I0308 04:15:14.624098 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/d174d635-30c7-4d7d-a077-aa6436a5675a-memcached-tls-certs\") pod \"memcached-0\" (UID: \"d174d635-30c7-4d7d-a077-aa6436a5675a\") " pod="openstack/memcached-0" Mar 08 04:15:14.624340 master-0 kubenswrapper[18592]: I0308 04:15:14.624240 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d174d635-30c7-4d7d-a077-aa6436a5675a-combined-ca-bundle\") pod \"memcached-0\" (UID: \"d174d635-30c7-4d7d-a077-aa6436a5675a\") " pod="openstack/memcached-0" Mar 08 04:15:14.624340 master-0 kubenswrapper[18592]: I0308 04:15:14.624265 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d174d635-30c7-4d7d-a077-aa6436a5675a-config-data\") pod \"memcached-0\" (UID: \"d174d635-30c7-4d7d-a077-aa6436a5675a\") " pod="openstack/memcached-0" Mar 08 04:15:14.624340 master-0 kubenswrapper[18592]: I0308 04:15:14.624311 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" 
(UniqueName: \"kubernetes.io/configmap/d174d635-30c7-4d7d-a077-aa6436a5675a-kolla-config\") pod \"memcached-0\" (UID: \"d174d635-30c7-4d7d-a077-aa6436a5675a\") " pod="openstack/memcached-0" Mar 08 04:15:14.624340 master-0 kubenswrapper[18592]: I0308 04:15:14.624328 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4q5wq\" (UniqueName: \"kubernetes.io/projected/d174d635-30c7-4d7d-a077-aa6436a5675a-kube-api-access-4q5wq\") pod \"memcached-0\" (UID: \"d174d635-30c7-4d7d-a077-aa6436a5675a\") " pod="openstack/memcached-0" Mar 08 04:15:14.627291 master-0 kubenswrapper[18592]: I0308 04:15:14.625413 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d174d635-30c7-4d7d-a077-aa6436a5675a-config-data\") pod \"memcached-0\" (UID: \"d174d635-30c7-4d7d-a077-aa6436a5675a\") " pod="openstack/memcached-0" Mar 08 04:15:14.627291 master-0 kubenswrapper[18592]: I0308 04:15:14.626504 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d174d635-30c7-4d7d-a077-aa6436a5675a-kolla-config\") pod \"memcached-0\" (UID: \"d174d635-30c7-4d7d-a077-aa6436a5675a\") " pod="openstack/memcached-0" Mar 08 04:15:14.634413 master-0 kubenswrapper[18592]: I0308 04:15:14.634375 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d174d635-30c7-4d7d-a077-aa6436a5675a-combined-ca-bundle\") pod \"memcached-0\" (UID: \"d174d635-30c7-4d7d-a077-aa6436a5675a\") " pod="openstack/memcached-0" Mar 08 04:15:14.647038 master-0 kubenswrapper[18592]: I0308 04:15:14.647003 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4q5wq\" (UniqueName: \"kubernetes.io/projected/d174d635-30c7-4d7d-a077-aa6436a5675a-kube-api-access-4q5wq\") pod \"memcached-0\" (UID: \"d174d635-30c7-4d7d-a077-aa6436a5675a\") " 
pod="openstack/memcached-0" Mar 08 04:15:14.649113 master-0 kubenswrapper[18592]: I0308 04:15:14.649090 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/d174d635-30c7-4d7d-a077-aa6436a5675a-memcached-tls-certs\") pod \"memcached-0\" (UID: \"d174d635-30c7-4d7d-a077-aa6436a5675a\") " pod="openstack/memcached-0" Mar 08 04:15:14.757834 master-0 kubenswrapper[18592]: I0308 04:15:14.757707 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 08 04:15:15.249166 master-0 kubenswrapper[18592]: I0308 04:15:15.248960 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 08 04:15:15.250654 master-0 kubenswrapper[18592]: I0308 04:15:15.250608 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 08 04:15:15.261631 master-0 kubenswrapper[18592]: I0308 04:15:15.259401 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 08 04:15:15.261631 master-0 kubenswrapper[18592]: I0308 04:15:15.259532 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 08 04:15:15.285322 master-0 kubenswrapper[18592]: I0308 04:15:15.266219 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 08 04:15:15.285322 master-0 kubenswrapper[18592]: I0308 04:15:15.266459 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 08 04:15:15.285322 master-0 kubenswrapper[18592]: I0308 04:15:15.267610 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 08 04:15:15.285322 master-0 kubenswrapper[18592]: I0308 04:15:15.271913 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] 
Mar 08 04:15:15.285322 master-0 kubenswrapper[18592]: I0308 04:15:15.276673 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 08 04:15:15.443848 master-0 kubenswrapper[18592]: I0308 04:15:15.443372 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6cc32df9-dcb4-43f3-b78d-f992b0488bf1-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6cc32df9-dcb4-43f3-b78d-f992b0488bf1\") " pod="openstack/rabbitmq-server-0" Mar 08 04:15:15.443848 master-0 kubenswrapper[18592]: I0308 04:15:15.443423 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6cc32df9-dcb4-43f3-b78d-f992b0488bf1-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6cc32df9-dcb4-43f3-b78d-f992b0488bf1\") " pod="openstack/rabbitmq-server-0" Mar 08 04:15:15.443848 master-0 kubenswrapper[18592]: I0308 04:15:15.443455 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6cc32df9-dcb4-43f3-b78d-f992b0488bf1-config-data\") pod \"rabbitmq-server-0\" (UID: \"6cc32df9-dcb4-43f3-b78d-f992b0488bf1\") " pod="openstack/rabbitmq-server-0" Mar 08 04:15:15.443848 master-0 kubenswrapper[18592]: I0308 04:15:15.443613 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-dded983b-d4d2-4809-a2d4-d6c7d33f32e9\" (UniqueName: \"kubernetes.io/csi/topolvm.io^20943431-fb7a-4ade-8783-9ac9fa584eec\") pod \"rabbitmq-server-0\" (UID: \"6cc32df9-dcb4-43f3-b78d-f992b0488bf1\") " pod="openstack/rabbitmq-server-0" Mar 08 04:15:15.443848 master-0 kubenswrapper[18592]: I0308 04:15:15.443717 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b224w\" (UniqueName: 
\"kubernetes.io/projected/6cc32df9-dcb4-43f3-b78d-f992b0488bf1-kube-api-access-b224w\") pod \"rabbitmq-server-0\" (UID: \"6cc32df9-dcb4-43f3-b78d-f992b0488bf1\") " pod="openstack/rabbitmq-server-0" Mar 08 04:15:15.443848 master-0 kubenswrapper[18592]: I0308 04:15:15.443785 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6cc32df9-dcb4-43f3-b78d-f992b0488bf1-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6cc32df9-dcb4-43f3-b78d-f992b0488bf1\") " pod="openstack/rabbitmq-server-0" Mar 08 04:15:15.443848 master-0 kubenswrapper[18592]: I0308 04:15:15.443805 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6cc32df9-dcb4-43f3-b78d-f992b0488bf1-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"6cc32df9-dcb4-43f3-b78d-f992b0488bf1\") " pod="openstack/rabbitmq-server-0" Mar 08 04:15:15.444633 master-0 kubenswrapper[18592]: I0308 04:15:15.443906 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6cc32df9-dcb4-43f3-b78d-f992b0488bf1-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6cc32df9-dcb4-43f3-b78d-f992b0488bf1\") " pod="openstack/rabbitmq-server-0" Mar 08 04:15:15.444633 master-0 kubenswrapper[18592]: I0308 04:15:15.443950 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6cc32df9-dcb4-43f3-b78d-f992b0488bf1-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6cc32df9-dcb4-43f3-b78d-f992b0488bf1\") " pod="openstack/rabbitmq-server-0" Mar 08 04:15:15.444633 master-0 kubenswrapper[18592]: I0308 04:15:15.443983 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"server-conf\" (UniqueName: \"kubernetes.io/configmap/6cc32df9-dcb4-43f3-b78d-f992b0488bf1-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6cc32df9-dcb4-43f3-b78d-f992b0488bf1\") " pod="openstack/rabbitmq-server-0" Mar 08 04:15:15.444633 master-0 kubenswrapper[18592]: I0308 04:15:15.444083 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6cc32df9-dcb4-43f3-b78d-f992b0488bf1-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6cc32df9-dcb4-43f3-b78d-f992b0488bf1\") " pod="openstack/rabbitmq-server-0" Mar 08 04:15:15.548911 master-0 kubenswrapper[18592]: I0308 04:15:15.548712 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-dded983b-d4d2-4809-a2d4-d6c7d33f32e9\" (UniqueName: \"kubernetes.io/csi/topolvm.io^20943431-fb7a-4ade-8783-9ac9fa584eec\") pod \"rabbitmq-server-0\" (UID: \"6cc32df9-dcb4-43f3-b78d-f992b0488bf1\") " pod="openstack/rabbitmq-server-0" Mar 08 04:15:15.548911 master-0 kubenswrapper[18592]: I0308 04:15:15.548786 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b224w\" (UniqueName: \"kubernetes.io/projected/6cc32df9-dcb4-43f3-b78d-f992b0488bf1-kube-api-access-b224w\") pod \"rabbitmq-server-0\" (UID: \"6cc32df9-dcb4-43f3-b78d-f992b0488bf1\") " pod="openstack/rabbitmq-server-0" Mar 08 04:15:15.548911 master-0 kubenswrapper[18592]: I0308 04:15:15.548810 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6cc32df9-dcb4-43f3-b78d-f992b0488bf1-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6cc32df9-dcb4-43f3-b78d-f992b0488bf1\") " pod="openstack/rabbitmq-server-0" Mar 08 04:15:15.548911 master-0 kubenswrapper[18592]: I0308 04:15:15.548839 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/6cc32df9-dcb4-43f3-b78d-f992b0488bf1-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"6cc32df9-dcb4-43f3-b78d-f992b0488bf1\") " pod="openstack/rabbitmq-server-0" Mar 08 04:15:15.548911 master-0 kubenswrapper[18592]: I0308 04:15:15.548910 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6cc32df9-dcb4-43f3-b78d-f992b0488bf1-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6cc32df9-dcb4-43f3-b78d-f992b0488bf1\") " pod="openstack/rabbitmq-server-0" Mar 08 04:15:15.549215 master-0 kubenswrapper[18592]: I0308 04:15:15.548934 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6cc32df9-dcb4-43f3-b78d-f992b0488bf1-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6cc32df9-dcb4-43f3-b78d-f992b0488bf1\") " pod="openstack/rabbitmq-server-0" Mar 08 04:15:15.549215 master-0 kubenswrapper[18592]: I0308 04:15:15.548958 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6cc32df9-dcb4-43f3-b78d-f992b0488bf1-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6cc32df9-dcb4-43f3-b78d-f992b0488bf1\") " pod="openstack/rabbitmq-server-0" Mar 08 04:15:15.561871 master-0 kubenswrapper[18592]: I0308 04:15:15.549549 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6cc32df9-dcb4-43f3-b78d-f992b0488bf1-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6cc32df9-dcb4-43f3-b78d-f992b0488bf1\") " pod="openstack/rabbitmq-server-0" Mar 08 04:15:15.561871 master-0 kubenswrapper[18592]: I0308 04:15:15.550242 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6cc32df9-dcb4-43f3-b78d-f992b0488bf1-erlang-cookie-secret\") 
pod \"rabbitmq-server-0\" (UID: \"6cc32df9-dcb4-43f3-b78d-f992b0488bf1\") " pod="openstack/rabbitmq-server-0" Mar 08 04:15:15.561871 master-0 kubenswrapper[18592]: I0308 04:15:15.550311 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6cc32df9-dcb4-43f3-b78d-f992b0488bf1-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6cc32df9-dcb4-43f3-b78d-f992b0488bf1\") " pod="openstack/rabbitmq-server-0" Mar 08 04:15:15.561871 master-0 kubenswrapper[18592]: I0308 04:15:15.550332 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6cc32df9-dcb4-43f3-b78d-f992b0488bf1-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6cc32df9-dcb4-43f3-b78d-f992b0488bf1\") " pod="openstack/rabbitmq-server-0" Mar 08 04:15:15.561871 master-0 kubenswrapper[18592]: I0308 04:15:15.550357 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6cc32df9-dcb4-43f3-b78d-f992b0488bf1-config-data\") pod \"rabbitmq-server-0\" (UID: \"6cc32df9-dcb4-43f3-b78d-f992b0488bf1\") " pod="openstack/rabbitmq-server-0" Mar 08 04:15:15.561871 master-0 kubenswrapper[18592]: I0308 04:15:15.551149 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6cc32df9-dcb4-43f3-b78d-f992b0488bf1-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6cc32df9-dcb4-43f3-b78d-f992b0488bf1\") " pod="openstack/rabbitmq-server-0" Mar 08 04:15:15.561871 master-0 kubenswrapper[18592]: I0308 04:15:15.551912 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6cc32df9-dcb4-43f3-b78d-f992b0488bf1-config-data\") pod \"rabbitmq-server-0\" (UID: \"6cc32df9-dcb4-43f3-b78d-f992b0488bf1\") " pod="openstack/rabbitmq-server-0" Mar 08 
04:15:15.561871 master-0 kubenswrapper[18592]: I0308 04:15:15.554081 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6cc32df9-dcb4-43f3-b78d-f992b0488bf1-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6cc32df9-dcb4-43f3-b78d-f992b0488bf1\") " pod="openstack/rabbitmq-server-0" Mar 08 04:15:15.561871 master-0 kubenswrapper[18592]: I0308 04:15:15.557566 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6cc32df9-dcb4-43f3-b78d-f992b0488bf1-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"6cc32df9-dcb4-43f3-b78d-f992b0488bf1\") " pod="openstack/rabbitmq-server-0" Mar 08 04:15:15.561871 master-0 kubenswrapper[18592]: I0308 04:15:15.559426 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6cc32df9-dcb4-43f3-b78d-f992b0488bf1-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6cc32df9-dcb4-43f3-b78d-f992b0488bf1\") " pod="openstack/rabbitmq-server-0" Mar 08 04:15:15.564360 master-0 kubenswrapper[18592]: I0308 04:15:15.562641 18592 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 08 04:15:15.564360 master-0 kubenswrapper[18592]: I0308 04:15:15.562686 18592 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-dded983b-d4d2-4809-a2d4-d6c7d33f32e9\" (UniqueName: \"kubernetes.io/csi/topolvm.io^20943431-fb7a-4ade-8783-9ac9fa584eec\") pod \"rabbitmq-server-0\" (UID: \"6cc32df9-dcb4-43f3-b78d-f992b0488bf1\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/0b79729cbaadfbfa1c84abef9bfe80ad9dece7b27e50267f0a194c6251522e6f/globalmount\"" pod="openstack/rabbitmq-server-0" Mar 08 04:15:15.571106 master-0 kubenswrapper[18592]: I0308 04:15:15.568035 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6cc32df9-dcb4-43f3-b78d-f992b0488bf1-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6cc32df9-dcb4-43f3-b78d-f992b0488bf1\") " pod="openstack/rabbitmq-server-0" Mar 08 04:15:15.592847 master-0 kubenswrapper[18592]: I0308 04:15:15.577734 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6cc32df9-dcb4-43f3-b78d-f992b0488bf1-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6cc32df9-dcb4-43f3-b78d-f992b0488bf1\") " pod="openstack/rabbitmq-server-0" Mar 08 04:15:15.592847 master-0 kubenswrapper[18592]: I0308 04:15:15.580953 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6cc32df9-dcb4-43f3-b78d-f992b0488bf1-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6cc32df9-dcb4-43f3-b78d-f992b0488bf1\") " pod="openstack/rabbitmq-server-0" Mar 08 04:15:15.600853 master-0 kubenswrapper[18592]: I0308 04:15:15.593992 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b224w\" (UniqueName: \"kubernetes.io/projected/6cc32df9-dcb4-43f3-b78d-f992b0488bf1-kube-api-access-b224w\") pod \"rabbitmq-server-0\" (UID: 
\"6cc32df9-dcb4-43f3-b78d-f992b0488bf1\") " pod="openstack/rabbitmq-server-0" Mar 08 04:15:15.790242 master-0 kubenswrapper[18592]: I0308 04:15:15.790177 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 08 04:15:15.792584 master-0 kubenswrapper[18592]: I0308 04:15:15.792557 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 08 04:15:15.795059 master-0 kubenswrapper[18592]: I0308 04:15:15.795027 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 08 04:15:15.795206 master-0 kubenswrapper[18592]: I0308 04:15:15.795163 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 08 04:15:15.795886 master-0 kubenswrapper[18592]: I0308 04:15:15.795858 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 08 04:15:15.796002 master-0 kubenswrapper[18592]: I0308 04:15:15.795978 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 08 04:15:15.796243 master-0 kubenswrapper[18592]: I0308 04:15:15.796217 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 08 04:15:15.796373 master-0 kubenswrapper[18592]: I0308 04:15:15.796350 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 08 04:15:15.923523 master-0 kubenswrapper[18592]: I0308 04:15:15.921751 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 08 04:15:15.986905 master-0 kubenswrapper[18592]: I0308 04:15:15.982558 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/466c2b13-2b27-4a83-911c-db97d66490a5-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"466c2b13-2b27-4a83-911c-db97d66490a5\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 08 04:15:15.986905 master-0 kubenswrapper[18592]: I0308 04:15:15.982622 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/466c2b13-2b27-4a83-911c-db97d66490a5-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"466c2b13-2b27-4a83-911c-db97d66490a5\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 08 04:15:15.986905 master-0 kubenswrapper[18592]: I0308 04:15:15.982645 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/466c2b13-2b27-4a83-911c-db97d66490a5-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"466c2b13-2b27-4a83-911c-db97d66490a5\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 08 04:15:15.986905 master-0 kubenswrapper[18592]: I0308 04:15:15.982669 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/466c2b13-2b27-4a83-911c-db97d66490a5-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"466c2b13-2b27-4a83-911c-db97d66490a5\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 08 04:15:15.986905 master-0 kubenswrapper[18592]: I0308 04:15:15.982696 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/466c2b13-2b27-4a83-911c-db97d66490a5-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"466c2b13-2b27-4a83-911c-db97d66490a5\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 08 04:15:15.986905 master-0 kubenswrapper[18592]: I0308 04:15:15.982720 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/466c2b13-2b27-4a83-911c-db97d66490a5-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"466c2b13-2b27-4a83-911c-db97d66490a5\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 08 04:15:15.986905 master-0 kubenswrapper[18592]: I0308 04:15:15.982755 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trqms\" (UniqueName: \"kubernetes.io/projected/466c2b13-2b27-4a83-911c-db97d66490a5-kube-api-access-trqms\") pod \"rabbitmq-cell1-server-0\" (UID: \"466c2b13-2b27-4a83-911c-db97d66490a5\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 08 04:15:15.986905 master-0 kubenswrapper[18592]: I0308 04:15:15.982795 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/466c2b13-2b27-4a83-911c-db97d66490a5-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"466c2b13-2b27-4a83-911c-db97d66490a5\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 08 04:15:15.986905 master-0 kubenswrapper[18592]: I0308 04:15:15.982837 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/466c2b13-2b27-4a83-911c-db97d66490a5-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"466c2b13-2b27-4a83-911c-db97d66490a5\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 08 04:15:15.986905 master-0 kubenswrapper[18592]: I0308 04:15:15.982875 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-e761f4d0-ad04-4858-a2d2-62c2dc0e24a0\" (UniqueName: \"kubernetes.io/csi/topolvm.io^3e47d08b-74b3-4f4c-84d1-0ae0c231cb6b\") pod \"rabbitmq-cell1-server-0\" (UID: \"466c2b13-2b27-4a83-911c-db97d66490a5\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 08 04:15:15.986905 master-0 kubenswrapper[18592]: I0308 04:15:15.982903 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/466c2b13-2b27-4a83-911c-db97d66490a5-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"466c2b13-2b27-4a83-911c-db97d66490a5\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 08 04:15:16.087591 master-0 kubenswrapper[18592]: I0308 04:15:16.083927 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/466c2b13-2b27-4a83-911c-db97d66490a5-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"466c2b13-2b27-4a83-911c-db97d66490a5\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 08 04:15:16.087591 master-0 kubenswrapper[18592]: I0308 04:15:16.083987 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/466c2b13-2b27-4a83-911c-db97d66490a5-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"466c2b13-2b27-4a83-911c-db97d66490a5\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 08 04:15:16.087591 master-0 kubenswrapper[18592]: I0308 04:15:16.084029 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-e761f4d0-ad04-4858-a2d2-62c2dc0e24a0\" (UniqueName: \"kubernetes.io/csi/topolvm.io^3e47d08b-74b3-4f4c-84d1-0ae0c231cb6b\") pod \"rabbitmq-cell1-server-0\" (UID: \"466c2b13-2b27-4a83-911c-db97d66490a5\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 08 04:15:16.087591 master-0 kubenswrapper[18592]: I0308 04:15:16.084062 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/466c2b13-2b27-4a83-911c-db97d66490a5-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"466c2b13-2b27-4a83-911c-db97d66490a5\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 08 04:15:16.087591 master-0 kubenswrapper[18592]: I0308 04:15:16.084085 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/466c2b13-2b27-4a83-911c-db97d66490a5-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"466c2b13-2b27-4a83-911c-db97d66490a5\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 08 04:15:16.087591 master-0 kubenswrapper[18592]: I0308 04:15:16.084115 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/466c2b13-2b27-4a83-911c-db97d66490a5-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"466c2b13-2b27-4a83-911c-db97d66490a5\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 08 04:15:16.087591 master-0 kubenswrapper[18592]: I0308 04:15:16.084132 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/466c2b13-2b27-4a83-911c-db97d66490a5-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"466c2b13-2b27-4a83-911c-db97d66490a5\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 08 04:15:16.087591 master-0 kubenswrapper[18592]: I0308 04:15:16.084157 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/466c2b13-2b27-4a83-911c-db97d66490a5-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"466c2b13-2b27-4a83-911c-db97d66490a5\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 08 04:15:16.087591 master-0 kubenswrapper[18592]: I0308 04:15:16.084180 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/466c2b13-2b27-4a83-911c-db97d66490a5-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"466c2b13-2b27-4a83-911c-db97d66490a5\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 08 04:15:16.087591 master-0 kubenswrapper[18592]: I0308 04:15:16.084203 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/466c2b13-2b27-4a83-911c-db97d66490a5-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"466c2b13-2b27-4a83-911c-db97d66490a5\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 08 04:15:16.087591 master-0 kubenswrapper[18592]: I0308 04:15:16.084236 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trqms\" (UniqueName: \"kubernetes.io/projected/466c2b13-2b27-4a83-911c-db97d66490a5-kube-api-access-trqms\") pod \"rabbitmq-cell1-server-0\" (UID: \"466c2b13-2b27-4a83-911c-db97d66490a5\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 08 04:15:16.087591 master-0 kubenswrapper[18592]: I0308 04:15:16.085108 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/466c2b13-2b27-4a83-911c-db97d66490a5-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"466c2b13-2b27-4a83-911c-db97d66490a5\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 08 04:15:16.087591 master-0 kubenswrapper[18592]: I0308 04:15:16.085674 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/466c2b13-2b27-4a83-911c-db97d66490a5-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"466c2b13-2b27-4a83-911c-db97d66490a5\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 08 04:15:16.087591 master-0 kubenswrapper[18592]: I0308 04:15:16.085992 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/466c2b13-2b27-4a83-911c-db97d66490a5-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"466c2b13-2b27-4a83-911c-db97d66490a5\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 08 04:15:16.087591 master-0 kubenswrapper[18592]: I0308 04:15:16.087223 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/466c2b13-2b27-4a83-911c-db97d66490a5-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"466c2b13-2b27-4a83-911c-db97d66490a5\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 08 04:15:16.095377 master-0 kubenswrapper[18592]: I0308 04:15:16.089068 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/466c2b13-2b27-4a83-911c-db97d66490a5-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"466c2b13-2b27-4a83-911c-db97d66490a5\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 08 04:15:16.096153 master-0 kubenswrapper[18592]: I0308 04:15:16.096054 18592 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 08 04:15:16.096153 master-0 kubenswrapper[18592]: I0308 04:15:16.096089 18592 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-e761f4d0-ad04-4858-a2d2-62c2dc0e24a0\" (UniqueName: \"kubernetes.io/csi/topolvm.io^3e47d08b-74b3-4f4c-84d1-0ae0c231cb6b\") pod \"rabbitmq-cell1-server-0\" (UID: \"466c2b13-2b27-4a83-911c-db97d66490a5\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/c22e0b850b6a5a9f79cbdd4b779f0db44ec916d85f0c999da852ff954c3deb15/globalmount\"" pod="openstack/rabbitmq-cell1-server-0"
Mar 08 04:15:16.096153 master-0 kubenswrapper[18592]: I0308 04:15:16.096098 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/466c2b13-2b27-4a83-911c-db97d66490a5-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"466c2b13-2b27-4a83-911c-db97d66490a5\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 08 04:15:16.099900 master-0 kubenswrapper[18592]: I0308 04:15:16.098310 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/466c2b13-2b27-4a83-911c-db97d66490a5-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"466c2b13-2b27-4a83-911c-db97d66490a5\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 08 04:15:16.099900 master-0 kubenswrapper[18592]: I0308 04:15:16.099029 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/466c2b13-2b27-4a83-911c-db97d66490a5-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"466c2b13-2b27-4a83-911c-db97d66490a5\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 08 04:15:16.101011 master-0 kubenswrapper[18592]: I0308 04:15:16.100958 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trqms\" (UniqueName: \"kubernetes.io/projected/466c2b13-2b27-4a83-911c-db97d66490a5-kube-api-access-trqms\") pod \"rabbitmq-cell1-server-0\" (UID: \"466c2b13-2b27-4a83-911c-db97d66490a5\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 08 04:15:16.101209 master-0 kubenswrapper[18592]: I0308 04:15:16.101173 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/466c2b13-2b27-4a83-911c-db97d66490a5-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"466c2b13-2b27-4a83-911c-db97d66490a5\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 08 04:15:17.113178 master-0 kubenswrapper[18592]: I0308 04:15:17.102469 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"]
Mar 08 04:15:17.113178 master-0 kubenswrapper[18592]: I0308 04:15:17.104039 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Mar 08 04:15:17.113178 master-0 kubenswrapper[18592]: I0308 04:15:17.110813 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Mar 08 04:15:17.113178 master-0 kubenswrapper[18592]: I0308 04:15:17.111951 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc"
Mar 08 04:15:17.113178 master-0 kubenswrapper[18592]: I0308 04:15:17.112283 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data"
Mar 08 04:15:17.113178 master-0 kubenswrapper[18592]: I0308 04:15:17.112457 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts"
Mar 08 04:15:17.212592 master-0 kubenswrapper[18592]: I0308 04:15:17.212551 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d8ff61a-7e75-41ec-9314-40ff5a0fea03-operator-scripts\") pod \"openstack-galera-0\" (UID: \"7d8ff61a-7e75-41ec-9314-40ff5a0fea03\") " pod="openstack/openstack-galera-0"
Mar 08 04:15:17.213872 master-0 kubenswrapper[18592]: I0308 04:15:17.213814 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d8ff61a-7e75-41ec-9314-40ff5a0fea03-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"7d8ff61a-7e75-41ec-9314-40ff5a0fea03\") " pod="openstack/openstack-galera-0"
Mar 08 04:15:17.213872 master-0 kubenswrapper[18592]: I0308 04:15:17.213862 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a0aee878-f068-481f-98af-59d94bff27c4\" (UniqueName: \"kubernetes.io/csi/topolvm.io^6bc048da-778d-4d44-8ca8-abb95f9deb0b\") pod \"openstack-galera-0\" (UID: \"7d8ff61a-7e75-41ec-9314-40ff5a0fea03\") " pod="openstack/openstack-galera-0"
Mar 08 04:15:17.214065 master-0 kubenswrapper[18592]: I0308 04:15:17.213946 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d8ff61a-7e75-41ec-9314-40ff5a0fea03-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"7d8ff61a-7e75-41ec-9314-40ff5a0fea03\") " pod="openstack/openstack-galera-0"
Mar 08 04:15:17.214119 master-0 kubenswrapper[18592]: I0308 04:15:17.214100 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7d8ff61a-7e75-41ec-9314-40ff5a0fea03-config-data-generated\") pod \"openstack-galera-0\" (UID: \"7d8ff61a-7e75-41ec-9314-40ff5a0fea03\") " pod="openstack/openstack-galera-0"
Mar 08 04:15:17.214159 master-0 kubenswrapper[18592]: I0308 04:15:17.214134 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7d8ff61a-7e75-41ec-9314-40ff5a0fea03-kolla-config\") pod \"openstack-galera-0\" (UID: \"7d8ff61a-7e75-41ec-9314-40ff5a0fea03\") " pod="openstack/openstack-galera-0"
Mar 08 04:15:17.214196 master-0 kubenswrapper[18592]: I0308 04:15:17.214156 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7d8ff61a-7e75-41ec-9314-40ff5a0fea03-config-data-default\") pod \"openstack-galera-0\" (UID: \"7d8ff61a-7e75-41ec-9314-40ff5a0fea03\") " pod="openstack/openstack-galera-0"
Mar 08 04:15:17.214196 master-0 kubenswrapper[18592]: I0308 04:15:17.214193 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nmpc\" (UniqueName: \"kubernetes.io/projected/7d8ff61a-7e75-41ec-9314-40ff5a0fea03-kube-api-access-8nmpc\") pod \"openstack-galera-0\" (UID: \"7d8ff61a-7e75-41ec-9314-40ff5a0fea03\") " pod="openstack/openstack-galera-0"
Mar 08 04:15:17.223632 master-0 kubenswrapper[18592]: I0308 04:15:17.223576 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-dded983b-d4d2-4809-a2d4-d6c7d33f32e9\" (UniqueName: \"kubernetes.io/csi/topolvm.io^20943431-fb7a-4ade-8783-9ac9fa584eec\") pod \"rabbitmq-server-0\" (UID: \"6cc32df9-dcb4-43f3-b78d-f992b0488bf1\") " pod="openstack/rabbitmq-server-0"
Mar 08 04:15:17.315344 master-0 kubenswrapper[18592]: I0308 04:15:17.315258 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d8ff61a-7e75-41ec-9314-40ff5a0fea03-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"7d8ff61a-7e75-41ec-9314-40ff5a0fea03\") " pod="openstack/openstack-galera-0"
Mar 08 04:15:17.315517 master-0 kubenswrapper[18592]: I0308 04:15:17.315342 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a0aee878-f068-481f-98af-59d94bff27c4\" (UniqueName: \"kubernetes.io/csi/topolvm.io^6bc048da-778d-4d44-8ca8-abb95f9deb0b\") pod \"openstack-galera-0\" (UID: \"7d8ff61a-7e75-41ec-9314-40ff5a0fea03\") " pod="openstack/openstack-galera-0"
Mar 08 04:15:17.315517 master-0 kubenswrapper[18592]: I0308 04:15:17.315397 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d8ff61a-7e75-41ec-9314-40ff5a0fea03-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"7d8ff61a-7e75-41ec-9314-40ff5a0fea03\") " pod="openstack/openstack-galera-0"
Mar 08 04:15:17.315517 master-0 kubenswrapper[18592]: I0308 04:15:17.315446 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7d8ff61a-7e75-41ec-9314-40ff5a0fea03-config-data-generated\") pod \"openstack-galera-0\" (UID: \"7d8ff61a-7e75-41ec-9314-40ff5a0fea03\") " pod="openstack/openstack-galera-0"
Mar 08 04:15:17.315517 master-0 kubenswrapper[18592]: I0308 04:15:17.315466 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7d8ff61a-7e75-41ec-9314-40ff5a0fea03-kolla-config\") pod \"openstack-galera-0\" (UID: \"7d8ff61a-7e75-41ec-9314-40ff5a0fea03\") " pod="openstack/openstack-galera-0"
Mar 08 04:15:17.315517 master-0 kubenswrapper[18592]: I0308 04:15:17.315485 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7d8ff61a-7e75-41ec-9314-40ff5a0fea03-config-data-default\") pod \"openstack-galera-0\" (UID: \"7d8ff61a-7e75-41ec-9314-40ff5a0fea03\") " pod="openstack/openstack-galera-0"
Mar 08 04:15:17.315672 master-0 kubenswrapper[18592]: I0308 04:15:17.315523 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nmpc\" (UniqueName: \"kubernetes.io/projected/7d8ff61a-7e75-41ec-9314-40ff5a0fea03-kube-api-access-8nmpc\") pod \"openstack-galera-0\" (UID: \"7d8ff61a-7e75-41ec-9314-40ff5a0fea03\") " pod="openstack/openstack-galera-0"
Mar 08 04:15:17.315672 master-0 kubenswrapper[18592]: I0308 04:15:17.315559 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d8ff61a-7e75-41ec-9314-40ff5a0fea03-operator-scripts\") pod \"openstack-galera-0\" (UID: \"7d8ff61a-7e75-41ec-9314-40ff5a0fea03\") " pod="openstack/openstack-galera-0"
Mar 08 04:15:17.316606 master-0 kubenswrapper[18592]: I0308 04:15:17.316557 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7d8ff61a-7e75-41ec-9314-40ff5a0fea03-config-data-generated\") pod \"openstack-galera-0\" (UID: \"7d8ff61a-7e75-41ec-9314-40ff5a0fea03\") " pod="openstack/openstack-galera-0"
Mar 08 04:15:17.317043 master-0 kubenswrapper[18592]: I0308 04:15:17.317018 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d8ff61a-7e75-41ec-9314-40ff5a0fea03-operator-scripts\") pod \"openstack-galera-0\" (UID: \"7d8ff61a-7e75-41ec-9314-40ff5a0fea03\") " pod="openstack/openstack-galera-0"
Mar 08 04:15:17.318951 master-0 kubenswrapper[18592]: I0308 04:15:17.318534 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7d8ff61a-7e75-41ec-9314-40ff5a0fea03-config-data-default\") pod \"openstack-galera-0\" (UID: \"7d8ff61a-7e75-41ec-9314-40ff5a0fea03\") " pod="openstack/openstack-galera-0"
Mar 08 04:15:17.319449 master-0 kubenswrapper[18592]: I0308 04:15:17.319231 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7d8ff61a-7e75-41ec-9314-40ff5a0fea03-kolla-config\") pod \"openstack-galera-0\" (UID: \"7d8ff61a-7e75-41ec-9314-40ff5a0fea03\") " pod="openstack/openstack-galera-0"
Mar 08 04:15:17.319449 master-0 kubenswrapper[18592]: I0308 04:15:17.319307 18592 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 08 04:15:17.319449 master-0 kubenswrapper[18592]: I0308 04:15:17.319353 18592 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a0aee878-f068-481f-98af-59d94bff27c4\" (UniqueName: \"kubernetes.io/csi/topolvm.io^6bc048da-778d-4d44-8ca8-abb95f9deb0b\") pod \"openstack-galera-0\" (UID: \"7d8ff61a-7e75-41ec-9314-40ff5a0fea03\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/11d932fa895ef3d5637158815e06d1807a829cb93c3abc2217ed758823217884/globalmount\"" pod="openstack/openstack-galera-0"
Mar 08 04:15:17.320698 master-0 kubenswrapper[18592]: I0308 04:15:17.320486 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d8ff61a-7e75-41ec-9314-40ff5a0fea03-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"7d8ff61a-7e75-41ec-9314-40ff5a0fea03\") " pod="openstack/openstack-galera-0"
Mar 08 04:15:17.324644 master-0 kubenswrapper[18592]: I0308 04:15:17.324606 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d8ff61a-7e75-41ec-9314-40ff5a0fea03-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"7d8ff61a-7e75-41ec-9314-40ff5a0fea03\") " pod="openstack/openstack-galera-0"
Mar 08 04:15:17.334082 master-0 kubenswrapper[18592]: I0308 04:15:17.334035 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nmpc\" (UniqueName: \"kubernetes.io/projected/7d8ff61a-7e75-41ec-9314-40ff5a0fea03-kube-api-access-8nmpc\") pod \"openstack-galera-0\" (UID: \"7d8ff61a-7e75-41ec-9314-40ff5a0fea03\") " pod="openstack/openstack-galera-0"
Mar 08 04:15:17.389380 master-0 kubenswrapper[18592]: I0308 04:15:17.389231 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 08 04:15:18.019265 master-0 kubenswrapper[18592]: I0308 04:15:18.019220 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"]
Mar 08 04:15:18.020975 master-0 kubenswrapper[18592]: I0308 04:15:18.020957 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Mar 08 04:15:18.023815 master-0 kubenswrapper[18592]: I0308 04:15:18.023751 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc"
Mar 08 04:15:18.024009 master-0 kubenswrapper[18592]: I0308 04:15:18.023782 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts"
Mar 08 04:15:18.027666 master-0 kubenswrapper[18592]: I0308 04:15:18.027648 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data"
Mar 08 04:15:18.040301 master-0 kubenswrapper[18592]: I0308 04:15:18.040236 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Mar 08 04:15:18.176461 master-0 kubenswrapper[18592]: I0308 04:15:18.176289 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/17b43cd8-4413-4958-9473-bbc5448585dc-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"17b43cd8-4413-4958-9473-bbc5448585dc\") " pod="openstack/openstack-cell1-galera-0"
Mar 08 04:15:18.176461 master-0 kubenswrapper[18592]: I0308 04:15:18.176356 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-673d379c-abea-4f56-83cd-dd88a13454fa\" (UniqueName: \"kubernetes.io/csi/topolvm.io^69919929-dd7e-4b11-baa5-ae540c5902b5\") pod \"openstack-cell1-galera-0\" (UID: \"17b43cd8-4413-4958-9473-bbc5448585dc\") " pod="openstack/openstack-cell1-galera-0"
Mar 08 04:15:18.176461 master-0 kubenswrapper[18592]: I0308 04:15:18.176395 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17b43cd8-4413-4958-9473-bbc5448585dc-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"17b43cd8-4413-4958-9473-bbc5448585dc\") " pod="openstack/openstack-cell1-galera-0"
Mar 08 04:15:18.176461 master-0 kubenswrapper[18592]: I0308 04:15:18.176416 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hwsg\" (UniqueName: \"kubernetes.io/projected/17b43cd8-4413-4958-9473-bbc5448585dc-kube-api-access-2hwsg\") pod \"openstack-cell1-galera-0\" (UID: \"17b43cd8-4413-4958-9473-bbc5448585dc\") " pod="openstack/openstack-cell1-galera-0"
Mar 08 04:15:18.176461 master-0 kubenswrapper[18592]: I0308 04:15:18.176463 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/17b43cd8-4413-4958-9473-bbc5448585dc-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"17b43cd8-4413-4958-9473-bbc5448585dc\") " pod="openstack/openstack-cell1-galera-0"
Mar 08 04:15:18.178521 master-0 kubenswrapper[18592]: I0308 04:15:18.176492 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17b43cd8-4413-4958-9473-bbc5448585dc-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"17b43cd8-4413-4958-9473-bbc5448585dc\") " pod="openstack/openstack-cell1-galera-0"
Mar 08 04:15:18.178521 master-0 kubenswrapper[18592]: I0308 04:15:18.176520 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/17b43cd8-4413-4958-9473-bbc5448585dc-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"17b43cd8-4413-4958-9473-bbc5448585dc\") " pod="openstack/openstack-cell1-galera-0"
Mar 08 04:15:18.178521 master-0 kubenswrapper[18592]: I0308 04:15:18.176547 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/17b43cd8-4413-4958-9473-bbc5448585dc-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"17b43cd8-4413-4958-9473-bbc5448585dc\") " pod="openstack/openstack-cell1-galera-0"
Mar 08 04:15:18.277867 master-0 kubenswrapper[18592]: I0308 04:15:18.277808 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/17b43cd8-4413-4958-9473-bbc5448585dc-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"17b43cd8-4413-4958-9473-bbc5448585dc\") " pod="openstack/openstack-cell1-galera-0"
Mar 08 04:15:18.278063 master-0 kubenswrapper[18592]: I0308 04:15:18.277887 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-673d379c-abea-4f56-83cd-dd88a13454fa\" (UniqueName: \"kubernetes.io/csi/topolvm.io^69919929-dd7e-4b11-baa5-ae540c5902b5\") pod \"openstack-cell1-galera-0\" (UID: \"17b43cd8-4413-4958-9473-bbc5448585dc\") " pod="openstack/openstack-cell1-galera-0"
Mar 08 04:15:18.278063 master-0 kubenswrapper[18592]: I0308 04:15:18.277925 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17b43cd8-4413-4958-9473-bbc5448585dc-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"17b43cd8-4413-4958-9473-bbc5448585dc\") " pod="openstack/openstack-cell1-galera-0"
Mar 08 04:15:18.278063 master-0 kubenswrapper[18592]: I0308 04:15:18.277943 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hwsg\" (UniqueName: \"kubernetes.io/projected/17b43cd8-4413-4958-9473-bbc5448585dc-kube-api-access-2hwsg\") pod \"openstack-cell1-galera-0\" (UID: \"17b43cd8-4413-4958-9473-bbc5448585dc\") " pod="openstack/openstack-cell1-galera-0"
Mar 08 04:15:18.278724 master-0 kubenswrapper[18592]: I0308 04:15:18.278686 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/17b43cd8-4413-4958-9473-bbc5448585dc-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"17b43cd8-4413-4958-9473-bbc5448585dc\") " pod="openstack/openstack-cell1-galera-0"
Mar 08 04:15:18.279904 master-0 kubenswrapper[18592]: I0308 04:15:18.278802 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/17b43cd8-4413-4958-9473-bbc5448585dc-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"17b43cd8-4413-4958-9473-bbc5448585dc\") " pod="openstack/openstack-cell1-galera-0"
Mar 08 04:15:18.279904 master-0 kubenswrapper[18592]: I0308 04:15:18.279354 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17b43cd8-4413-4958-9473-bbc5448585dc-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"17b43cd8-4413-4958-9473-bbc5448585dc\") " pod="openstack/openstack-cell1-galera-0"
Mar 08 04:15:18.279904 master-0 kubenswrapper[18592]: I0308 04:15:18.279576 18592 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 08 04:15:18.279904 master-0 kubenswrapper[18592]: I0308 04:15:18.279612 18592 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-673d379c-abea-4f56-83cd-dd88a13454fa\" (UniqueName: \"kubernetes.io/csi/topolvm.io^69919929-dd7e-4b11-baa5-ae540c5902b5\") pod \"openstack-cell1-galera-0\" (UID: \"17b43cd8-4413-4958-9473-bbc5448585dc\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/a39f0329ad7a91e028f2f6e44dc76a0e24f1660ed2b9f548af76cc3886903e59/globalmount\"" pod="openstack/openstack-cell1-galera-0"
Mar 08 04:15:18.279904 master-0 kubenswrapper[18592]: I0308 04:15:18.279650 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/17b43cd8-4413-4958-9473-bbc5448585dc-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"17b43cd8-4413-4958-9473-bbc5448585dc\") " pod="openstack/openstack-cell1-galera-0"
Mar 08 04:15:18.279904 master-0 kubenswrapper[18592]: I0308 04:15:18.279809 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/17b43cd8-4413-4958-9473-bbc5448585dc-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"17b43cd8-4413-4958-9473-bbc5448585dc\") " pod="openstack/openstack-cell1-galera-0"
Mar 08 04:15:18.280420 master-0 kubenswrapper[18592]: I0308 04:15:18.280319 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/17b43cd8-4413-4958-9473-bbc5448585dc-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"17b43cd8-4413-4958-9473-bbc5448585dc\") " pod="openstack/openstack-cell1-galera-0"
Mar 08 04:15:18.280703 master-0 kubenswrapper[18592]: I0308 04:15:18.280680 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17b43cd8-4413-4958-9473-bbc5448585dc-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"17b43cd8-4413-4958-9473-bbc5448585dc\") " pod="openstack/openstack-cell1-galera-0"
Mar 08 04:15:18.284255 master-0 kubenswrapper[18592]: I0308 04:15:18.283810 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/17b43cd8-4413-4958-9473-bbc5448585dc-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"17b43cd8-4413-4958-9473-bbc5448585dc\") " pod="openstack/openstack-cell1-galera-0"
Mar 08 04:15:18.284466 master-0 kubenswrapper[18592]: I0308 04:15:18.284414 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17b43cd8-4413-4958-9473-bbc5448585dc-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"17b43cd8-4413-4958-9473-bbc5448585dc\") " pod="openstack/openstack-cell1-galera-0"
Mar 08 04:15:18.293458 master-0 kubenswrapper[18592]: I0308 04:15:18.293295 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/17b43cd8-4413-4958-9473-bbc5448585dc-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"17b43cd8-4413-4958-9473-bbc5448585dc\") " pod="openstack/openstack-cell1-galera-0"
Mar 08 04:15:18.293458 master-0 kubenswrapper[18592]: I0308 04:15:18.293418 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hwsg\" (UniqueName: \"kubernetes.io/projected/17b43cd8-4413-4958-9473-bbc5448585dc-kube-api-access-2hwsg\") pod \"openstack-cell1-galera-0\" (UID: \"17b43cd8-4413-4958-9473-bbc5448585dc\") " pod="openstack/openstack-cell1-galera-0"
Mar 08 04:15:18.506402 master-0 kubenswrapper[18592]: I0308 04:15:18.506351 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-e761f4d0-ad04-4858-a2d2-62c2dc0e24a0\" (UniqueName: \"kubernetes.io/csi/topolvm.io^3e47d08b-74b3-4f4c-84d1-0ae0c231cb6b\") pod \"rabbitmq-cell1-server-0\" (UID: \"466c2b13-2b27-4a83-911c-db97d66490a5\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 08 04:15:18.521726 master-0 kubenswrapper[18592]: I0308 04:15:18.521674 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 08 04:15:19.552035 master-0 kubenswrapper[18592]: I0308 04:15:19.551991 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a0aee878-f068-481f-98af-59d94bff27c4\" (UniqueName: \"kubernetes.io/csi/topolvm.io^6bc048da-778d-4d44-8ca8-abb95f9deb0b\") pod \"openstack-galera-0\" (UID: \"7d8ff61a-7e75-41ec-9314-40ff5a0fea03\") " pod="openstack/openstack-galera-0"
Mar 08 04:15:19.829046 master-0 kubenswrapper[18592]: I0308 04:15:19.827555 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-hkfg8"]
Mar 08 04:15:19.835803 master-0 kubenswrapper[18592]: I0308 04:15:19.831580 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-hkfg8"
Mar 08 04:15:19.835803 master-0 kubenswrapper[18592]: I0308 04:15:19.833593 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Mar 08 04:15:19.836848 master-0 kubenswrapper[18592]: I0308 04:15:19.836451 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts"
Mar 08 04:15:19.836848 master-0 kubenswrapper[18592]: I0308 04:15:19.836623 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs"
Mar 08 04:15:19.875904 master-0 kubenswrapper[18592]: I0308 04:15:19.875864 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-2lfv5"]
Mar 08 04:15:19.886119 master-0 kubenswrapper[18592]: I0308 04:15:19.886077 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-2lfv5"
Mar 08 04:15:19.959001 master-0 kubenswrapper[18592]: I0308 04:15:19.958934 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-hkfg8"]
Mar 08 04:15:19.977447 master-0 kubenswrapper[18592]: I0308 04:15:19.977405 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-2lfv5"]
Mar 08 04:15:20.032838 master-0 kubenswrapper[18592]: I0308 04:15:20.032754 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w76bc\" (UniqueName: \"kubernetes.io/projected/9fb36151-5aa5-462a-b8da-a082585a5a26-kube-api-access-w76bc\") pod \"ovn-controller-ovs-2lfv5\" (UID: \"9fb36151-5aa5-462a-b8da-a082585a5a26\") " pod="openstack/ovn-controller-ovs-2lfv5"
Mar 08 04:15:20.033098 master-0 kubenswrapper[18592]: I0308 04:15:20.032843 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/bf9606de-37b2-4dc9-8b6b-adfb1d81f6d1-var-run\") pod \"ovn-controller-hkfg8\" (UID: \"bf9606de-37b2-4dc9-8b6b-adfb1d81f6d1\") " pod="openstack/ovn-controller-hkfg8"
Mar 08 04:15:20.033098 master-0 kubenswrapper[18592]: I0308 04:15:20.032885 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/9fb36151-5aa5-462a-b8da-a082585a5a26-var-log\") pod \"ovn-controller-ovs-2lfv5\" (UID: \"9fb36151-5aa5-462a-b8da-a082585a5a26\") " pod="openstack/ovn-controller-ovs-2lfv5"
Mar 08 04:15:20.033098 master-0 kubenswrapper[18592]: I0308 04:15:20.032914 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bf9606de-37b2-4dc9-8b6b-adfb1d81f6d1-scripts\") pod \"ovn-controller-hkfg8\" (UID: \"bf9606de-37b2-4dc9-8b6b-adfb1d81f6d1\") "
pod="openstack/ovn-controller-hkfg8" Mar 08 04:15:20.033098 master-0 kubenswrapper[18592]: I0308 04:15:20.032997 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/bf9606de-37b2-4dc9-8b6b-adfb1d81f6d1-var-log-ovn\") pod \"ovn-controller-hkfg8\" (UID: \"bf9606de-37b2-4dc9-8b6b-adfb1d81f6d1\") " pod="openstack/ovn-controller-hkfg8" Mar 08 04:15:20.033309 master-0 kubenswrapper[18592]: I0308 04:15:20.033234 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9fb36151-5aa5-462a-b8da-a082585a5a26-var-run\") pod \"ovn-controller-ovs-2lfv5\" (UID: \"9fb36151-5aa5-462a-b8da-a082585a5a26\") " pod="openstack/ovn-controller-ovs-2lfv5" Mar 08 04:15:20.033361 master-0 kubenswrapper[18592]: I0308 04:15:20.033345 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/9fb36151-5aa5-462a-b8da-a082585a5a26-var-lib\") pod \"ovn-controller-ovs-2lfv5\" (UID: \"9fb36151-5aa5-462a-b8da-a082585a5a26\") " pod="openstack/ovn-controller-ovs-2lfv5" Mar 08 04:15:20.033502 master-0 kubenswrapper[18592]: I0308 04:15:20.033450 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9fb36151-5aa5-462a-b8da-a082585a5a26-scripts\") pod \"ovn-controller-ovs-2lfv5\" (UID: \"9fb36151-5aa5-462a-b8da-a082585a5a26\") " pod="openstack/ovn-controller-ovs-2lfv5" Mar 08 04:15:20.033635 master-0 kubenswrapper[18592]: I0308 04:15:20.033581 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/bf9606de-37b2-4dc9-8b6b-adfb1d81f6d1-var-run-ovn\") pod \"ovn-controller-hkfg8\" (UID: \"bf9606de-37b2-4dc9-8b6b-adfb1d81f6d1\") " 
pod="openstack/ovn-controller-hkfg8" Mar 08 04:15:20.033704 master-0 kubenswrapper[18592]: I0308 04:15:20.033657 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hstk7\" (UniqueName: \"kubernetes.io/projected/bf9606de-37b2-4dc9-8b6b-adfb1d81f6d1-kube-api-access-hstk7\") pod \"ovn-controller-hkfg8\" (UID: \"bf9606de-37b2-4dc9-8b6b-adfb1d81f6d1\") " pod="openstack/ovn-controller-hkfg8" Mar 08 04:15:20.034943 master-0 kubenswrapper[18592]: I0308 04:15:20.034901 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/9fb36151-5aa5-462a-b8da-a082585a5a26-etc-ovs\") pod \"ovn-controller-ovs-2lfv5\" (UID: \"9fb36151-5aa5-462a-b8da-a082585a5a26\") " pod="openstack/ovn-controller-ovs-2lfv5" Mar 08 04:15:20.035041 master-0 kubenswrapper[18592]: I0308 04:15:20.034961 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf9606de-37b2-4dc9-8b6b-adfb1d81f6d1-combined-ca-bundle\") pod \"ovn-controller-hkfg8\" (UID: \"bf9606de-37b2-4dc9-8b6b-adfb1d81f6d1\") " pod="openstack/ovn-controller-hkfg8" Mar 08 04:15:20.035097 master-0 kubenswrapper[18592]: I0308 04:15:20.035072 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf9606de-37b2-4dc9-8b6b-adfb1d81f6d1-ovn-controller-tls-certs\") pod \"ovn-controller-hkfg8\" (UID: \"bf9606de-37b2-4dc9-8b6b-adfb1d81f6d1\") " pod="openstack/ovn-controller-hkfg8" Mar 08 04:15:20.137234 master-0 kubenswrapper[18592]: I0308 04:15:20.137111 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9fb36151-5aa5-462a-b8da-a082585a5a26-scripts\") pod \"ovn-controller-ovs-2lfv5\" (UID: 
\"9fb36151-5aa5-462a-b8da-a082585a5a26\") " pod="openstack/ovn-controller-ovs-2lfv5" Mar 08 04:15:20.137437 master-0 kubenswrapper[18592]: I0308 04:15:20.137265 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/bf9606de-37b2-4dc9-8b6b-adfb1d81f6d1-var-run-ovn\") pod \"ovn-controller-hkfg8\" (UID: \"bf9606de-37b2-4dc9-8b6b-adfb1d81f6d1\") " pod="openstack/ovn-controller-hkfg8" Mar 08 04:15:20.137437 master-0 kubenswrapper[18592]: I0308 04:15:20.137344 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hstk7\" (UniqueName: \"kubernetes.io/projected/bf9606de-37b2-4dc9-8b6b-adfb1d81f6d1-kube-api-access-hstk7\") pod \"ovn-controller-hkfg8\" (UID: \"bf9606de-37b2-4dc9-8b6b-adfb1d81f6d1\") " pod="openstack/ovn-controller-hkfg8" Mar 08 04:15:20.137437 master-0 kubenswrapper[18592]: I0308 04:15:20.137415 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/9fb36151-5aa5-462a-b8da-a082585a5a26-etc-ovs\") pod \"ovn-controller-ovs-2lfv5\" (UID: \"9fb36151-5aa5-462a-b8da-a082585a5a26\") " pod="openstack/ovn-controller-ovs-2lfv5" Mar 08 04:15:20.137437 master-0 kubenswrapper[18592]: I0308 04:15:20.137437 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf9606de-37b2-4dc9-8b6b-adfb1d81f6d1-combined-ca-bundle\") pod \"ovn-controller-hkfg8\" (UID: \"bf9606de-37b2-4dc9-8b6b-adfb1d81f6d1\") " pod="openstack/ovn-controller-hkfg8" Mar 08 04:15:20.137618 master-0 kubenswrapper[18592]: I0308 04:15:20.137523 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf9606de-37b2-4dc9-8b6b-adfb1d81f6d1-ovn-controller-tls-certs\") pod \"ovn-controller-hkfg8\" (UID: 
\"bf9606de-37b2-4dc9-8b6b-adfb1d81f6d1\") " pod="openstack/ovn-controller-hkfg8" Mar 08 04:15:20.137618 master-0 kubenswrapper[18592]: I0308 04:15:20.137599 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w76bc\" (UniqueName: \"kubernetes.io/projected/9fb36151-5aa5-462a-b8da-a082585a5a26-kube-api-access-w76bc\") pod \"ovn-controller-ovs-2lfv5\" (UID: \"9fb36151-5aa5-462a-b8da-a082585a5a26\") " pod="openstack/ovn-controller-ovs-2lfv5" Mar 08 04:15:20.137715 master-0 kubenswrapper[18592]: I0308 04:15:20.137628 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/bf9606de-37b2-4dc9-8b6b-adfb1d81f6d1-var-run\") pod \"ovn-controller-hkfg8\" (UID: \"bf9606de-37b2-4dc9-8b6b-adfb1d81f6d1\") " pod="openstack/ovn-controller-hkfg8" Mar 08 04:15:20.138639 master-0 kubenswrapper[18592]: I0308 04:15:20.138066 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/9fb36151-5aa5-462a-b8da-a082585a5a26-var-log\") pod \"ovn-controller-ovs-2lfv5\" (UID: \"9fb36151-5aa5-462a-b8da-a082585a5a26\") " pod="openstack/ovn-controller-ovs-2lfv5" Mar 08 04:15:20.138639 master-0 kubenswrapper[18592]: I0308 04:15:20.137753 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/bf9606de-37b2-4dc9-8b6b-adfb1d81f6d1-var-run-ovn\") pod \"ovn-controller-hkfg8\" (UID: \"bf9606de-37b2-4dc9-8b6b-adfb1d81f6d1\") " pod="openstack/ovn-controller-hkfg8" Mar 08 04:15:20.138639 master-0 kubenswrapper[18592]: I0308 04:15:20.137993 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/bf9606de-37b2-4dc9-8b6b-adfb1d81f6d1-var-run\") pod \"ovn-controller-hkfg8\" (UID: \"bf9606de-37b2-4dc9-8b6b-adfb1d81f6d1\") " pod="openstack/ovn-controller-hkfg8" Mar 08 
04:15:20.138639 master-0 kubenswrapper[18592]: I0308 04:15:20.137943 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/9fb36151-5aa5-462a-b8da-a082585a5a26-etc-ovs\") pod \"ovn-controller-ovs-2lfv5\" (UID: \"9fb36151-5aa5-462a-b8da-a082585a5a26\") " pod="openstack/ovn-controller-ovs-2lfv5" Mar 08 04:15:20.138639 master-0 kubenswrapper[18592]: I0308 04:15:20.138200 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bf9606de-37b2-4dc9-8b6b-adfb1d81f6d1-scripts\") pod \"ovn-controller-hkfg8\" (UID: \"bf9606de-37b2-4dc9-8b6b-adfb1d81f6d1\") " pod="openstack/ovn-controller-hkfg8" Mar 08 04:15:20.138639 master-0 kubenswrapper[18592]: I0308 04:15:20.138247 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/bf9606de-37b2-4dc9-8b6b-adfb1d81f6d1-var-log-ovn\") pod \"ovn-controller-hkfg8\" (UID: \"bf9606de-37b2-4dc9-8b6b-adfb1d81f6d1\") " pod="openstack/ovn-controller-hkfg8" Mar 08 04:15:20.138639 master-0 kubenswrapper[18592]: I0308 04:15:20.138295 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9fb36151-5aa5-462a-b8da-a082585a5a26-var-run\") pod \"ovn-controller-ovs-2lfv5\" (UID: \"9fb36151-5aa5-462a-b8da-a082585a5a26\") " pod="openstack/ovn-controller-ovs-2lfv5" Mar 08 04:15:20.138639 master-0 kubenswrapper[18592]: I0308 04:15:20.138346 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/9fb36151-5aa5-462a-b8da-a082585a5a26-var-lib\") pod \"ovn-controller-ovs-2lfv5\" (UID: \"9fb36151-5aa5-462a-b8da-a082585a5a26\") " pod="openstack/ovn-controller-ovs-2lfv5" Mar 08 04:15:20.138639 master-0 kubenswrapper[18592]: I0308 04:15:20.138538 18592 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/9fb36151-5aa5-462a-b8da-a082585a5a26-var-lib\") pod \"ovn-controller-ovs-2lfv5\" (UID: \"9fb36151-5aa5-462a-b8da-a082585a5a26\") " pod="openstack/ovn-controller-ovs-2lfv5" Mar 08 04:15:20.138639 master-0 kubenswrapper[18592]: I0308 04:15:20.138494 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9fb36151-5aa5-462a-b8da-a082585a5a26-var-run\") pod \"ovn-controller-ovs-2lfv5\" (UID: \"9fb36151-5aa5-462a-b8da-a082585a5a26\") " pod="openstack/ovn-controller-ovs-2lfv5" Mar 08 04:15:20.140983 master-0 kubenswrapper[18592]: I0308 04:15:20.140065 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9fb36151-5aa5-462a-b8da-a082585a5a26-scripts\") pod \"ovn-controller-ovs-2lfv5\" (UID: \"9fb36151-5aa5-462a-b8da-a082585a5a26\") " pod="openstack/ovn-controller-ovs-2lfv5" Mar 08 04:15:20.140983 master-0 kubenswrapper[18592]: I0308 04:15:20.140187 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bf9606de-37b2-4dc9-8b6b-adfb1d81f6d1-scripts\") pod \"ovn-controller-hkfg8\" (UID: \"bf9606de-37b2-4dc9-8b6b-adfb1d81f6d1\") " pod="openstack/ovn-controller-hkfg8" Mar 08 04:15:20.140983 master-0 kubenswrapper[18592]: I0308 04:15:20.140262 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/9fb36151-5aa5-462a-b8da-a082585a5a26-var-log\") pod \"ovn-controller-ovs-2lfv5\" (UID: \"9fb36151-5aa5-462a-b8da-a082585a5a26\") " pod="openstack/ovn-controller-ovs-2lfv5" Mar 08 04:15:20.140983 master-0 kubenswrapper[18592]: I0308 04:15:20.140937 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/bf9606de-37b2-4dc9-8b6b-adfb1d81f6d1-var-log-ovn\") pod \"ovn-controller-hkfg8\" (UID: \"bf9606de-37b2-4dc9-8b6b-adfb1d81f6d1\") " pod="openstack/ovn-controller-hkfg8" Mar 08 04:15:20.147935 master-0 kubenswrapper[18592]: I0308 04:15:20.142539 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf9606de-37b2-4dc9-8b6b-adfb1d81f6d1-ovn-controller-tls-certs\") pod \"ovn-controller-hkfg8\" (UID: \"bf9606de-37b2-4dc9-8b6b-adfb1d81f6d1\") " pod="openstack/ovn-controller-hkfg8" Mar 08 04:15:20.150882 master-0 kubenswrapper[18592]: I0308 04:15:20.150846 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf9606de-37b2-4dc9-8b6b-adfb1d81f6d1-combined-ca-bundle\") pod \"ovn-controller-hkfg8\" (UID: \"bf9606de-37b2-4dc9-8b6b-adfb1d81f6d1\") " pod="openstack/ovn-controller-hkfg8" Mar 08 04:15:20.154561 master-0 kubenswrapper[18592]: I0308 04:15:20.154517 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w76bc\" (UniqueName: \"kubernetes.io/projected/9fb36151-5aa5-462a-b8da-a082585a5a26-kube-api-access-w76bc\") pod \"ovn-controller-ovs-2lfv5\" (UID: \"9fb36151-5aa5-462a-b8da-a082585a5a26\") " pod="openstack/ovn-controller-ovs-2lfv5" Mar 08 04:15:20.156846 master-0 kubenswrapper[18592]: I0308 04:15:20.156782 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hstk7\" (UniqueName: \"kubernetes.io/projected/bf9606de-37b2-4dc9-8b6b-adfb1d81f6d1-kube-api-access-hstk7\") pod \"ovn-controller-hkfg8\" (UID: \"bf9606de-37b2-4dc9-8b6b-adfb1d81f6d1\") " pod="openstack/ovn-controller-hkfg8" Mar 08 04:15:20.174072 master-0 kubenswrapper[18592]: I0308 04:15:20.174009 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-hkfg8" Mar 08 04:15:20.204992 master-0 kubenswrapper[18592]: I0308 04:15:20.204950 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-2lfv5" Mar 08 04:15:20.566107 master-0 kubenswrapper[18592]: I0308 04:15:20.566051 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-673d379c-abea-4f56-83cd-dd88a13454fa\" (UniqueName: \"kubernetes.io/csi/topolvm.io^69919929-dd7e-4b11-baa5-ae540c5902b5\") pod \"openstack-cell1-galera-0\" (UID: \"17b43cd8-4413-4958-9473-bbc5448585dc\") " pod="openstack/openstack-cell1-galera-0" Mar 08 04:15:20.782937 master-0 kubenswrapper[18592]: I0308 04:15:20.782227 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 08 04:15:23.869844 master-0 kubenswrapper[18592]: I0308 04:15:23.864876 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 08 04:15:23.869844 master-0 kubenswrapper[18592]: I0308 04:15:23.866620 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 08 04:15:23.872535 master-0 kubenswrapper[18592]: I0308 04:15:23.872138 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 08 04:15:23.872535 master-0 kubenswrapper[18592]: I0308 04:15:23.872252 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 08 04:15:23.872535 master-0 kubenswrapper[18592]: I0308 04:15:23.872318 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 08 04:15:23.872535 master-0 kubenswrapper[18592]: I0308 04:15:23.872142 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 08 04:15:23.881812 master-0 kubenswrapper[18592]: I0308 04:15:23.881763 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 08 04:15:24.048548 master-0 kubenswrapper[18592]: I0308 04:15:24.048463 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-2bc81af2-035c-4577-af27-93afee93376f\" (UniqueName: \"kubernetes.io/csi/topolvm.io^96ccbb96-3189-477d-bb36-ff31c61b2b28\") pod \"ovsdbserver-sb-0\" (UID: \"9d6e9be4-a15c-476b-a4e3-cbbbc2c3c264\") " pod="openstack/ovsdbserver-sb-0" Mar 08 04:15:24.048766 master-0 kubenswrapper[18592]: I0308 04:15:24.048560 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2ms4\" (UniqueName: \"kubernetes.io/projected/9d6e9be4-a15c-476b-a4e3-cbbbc2c3c264-kube-api-access-j2ms4\") pod \"ovsdbserver-sb-0\" (UID: \"9d6e9be4-a15c-476b-a4e3-cbbbc2c3c264\") " pod="openstack/ovsdbserver-sb-0" Mar 08 04:15:24.048766 master-0 kubenswrapper[18592]: I0308 04:15:24.048585 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/9d6e9be4-a15c-476b-a4e3-cbbbc2c3c264-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9d6e9be4-a15c-476b-a4e3-cbbbc2c3c264\") " pod="openstack/ovsdbserver-sb-0" Mar 08 04:15:24.048766 master-0 kubenswrapper[18592]: I0308 04:15:24.048613 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9d6e9be4-a15c-476b-a4e3-cbbbc2c3c264-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"9d6e9be4-a15c-476b-a4e3-cbbbc2c3c264\") " pod="openstack/ovsdbserver-sb-0" Mar 08 04:15:24.048766 master-0 kubenswrapper[18592]: I0308 04:15:24.048657 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d6e9be4-a15c-476b-a4e3-cbbbc2c3c264-config\") pod \"ovsdbserver-sb-0\" (UID: \"9d6e9be4-a15c-476b-a4e3-cbbbc2c3c264\") " pod="openstack/ovsdbserver-sb-0" Mar 08 04:15:24.048766 master-0 kubenswrapper[18592]: I0308 04:15:24.048695 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d6e9be4-a15c-476b-a4e3-cbbbc2c3c264-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9d6e9be4-a15c-476b-a4e3-cbbbc2c3c264\") " pod="openstack/ovsdbserver-sb-0" Mar 08 04:15:24.048949 master-0 kubenswrapper[18592]: I0308 04:15:24.048767 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9d6e9be4-a15c-476b-a4e3-cbbbc2c3c264-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"9d6e9be4-a15c-476b-a4e3-cbbbc2c3c264\") " pod="openstack/ovsdbserver-sb-0" Mar 08 04:15:24.048949 master-0 kubenswrapper[18592]: I0308 04:15:24.048872 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9d6e9be4-a15c-476b-a4e3-cbbbc2c3c264-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"9d6e9be4-a15c-476b-a4e3-cbbbc2c3c264\") " pod="openstack/ovsdbserver-sb-0" Mar 08 04:15:24.151258 master-0 kubenswrapper[18592]: I0308 04:15:24.150608 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d6e9be4-a15c-476b-a4e3-cbbbc2c3c264-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9d6e9be4-a15c-476b-a4e3-cbbbc2c3c264\") " pod="openstack/ovsdbserver-sb-0" Mar 08 04:15:24.151258 master-0 kubenswrapper[18592]: I0308 04:15:24.150715 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9d6e9be4-a15c-476b-a4e3-cbbbc2c3c264-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"9d6e9be4-a15c-476b-a4e3-cbbbc2c3c264\") " pod="openstack/ovsdbserver-sb-0" Mar 08 04:15:24.151258 master-0 kubenswrapper[18592]: I0308 04:15:24.150775 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d6e9be4-a15c-476b-a4e3-cbbbc2c3c264-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"9d6e9be4-a15c-476b-a4e3-cbbbc2c3c264\") " pod="openstack/ovsdbserver-sb-0" Mar 08 04:15:24.151258 master-0 kubenswrapper[18592]: I0308 04:15:24.151013 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-2bc81af2-035c-4577-af27-93afee93376f\" (UniqueName: \"kubernetes.io/csi/topolvm.io^96ccbb96-3189-477d-bb36-ff31c61b2b28\") pod \"ovsdbserver-sb-0\" (UID: \"9d6e9be4-a15c-476b-a4e3-cbbbc2c3c264\") " pod="openstack/ovsdbserver-sb-0" Mar 08 04:15:24.151258 master-0 kubenswrapper[18592]: I0308 04:15:24.151049 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2ms4\" (UniqueName: 
\"kubernetes.io/projected/9d6e9be4-a15c-476b-a4e3-cbbbc2c3c264-kube-api-access-j2ms4\") pod \"ovsdbserver-sb-0\" (UID: \"9d6e9be4-a15c-476b-a4e3-cbbbc2c3c264\") " pod="openstack/ovsdbserver-sb-0" Mar 08 04:15:24.151258 master-0 kubenswrapper[18592]: I0308 04:15:24.151072 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d6e9be4-a15c-476b-a4e3-cbbbc2c3c264-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9d6e9be4-a15c-476b-a4e3-cbbbc2c3c264\") " pod="openstack/ovsdbserver-sb-0" Mar 08 04:15:24.151258 master-0 kubenswrapper[18592]: I0308 04:15:24.151101 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9d6e9be4-a15c-476b-a4e3-cbbbc2c3c264-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"9d6e9be4-a15c-476b-a4e3-cbbbc2c3c264\") " pod="openstack/ovsdbserver-sb-0" Mar 08 04:15:24.151258 master-0 kubenswrapper[18592]: I0308 04:15:24.151135 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d6e9be4-a15c-476b-a4e3-cbbbc2c3c264-config\") pod \"ovsdbserver-sb-0\" (UID: \"9d6e9be4-a15c-476b-a4e3-cbbbc2c3c264\") " pod="openstack/ovsdbserver-sb-0" Mar 08 04:15:24.152202 master-0 kubenswrapper[18592]: I0308 04:15:24.152170 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9d6e9be4-a15c-476b-a4e3-cbbbc2c3c264-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"9d6e9be4-a15c-476b-a4e3-cbbbc2c3c264\") " pod="openstack/ovsdbserver-sb-0" Mar 08 04:15:24.152557 master-0 kubenswrapper[18592]: I0308 04:15:24.152529 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d6e9be4-a15c-476b-a4e3-cbbbc2c3c264-config\") pod \"ovsdbserver-sb-0\" (UID: 
\"9d6e9be4-a15c-476b-a4e3-cbbbc2c3c264\") " pod="openstack/ovsdbserver-sb-0" Mar 08 04:15:24.152782 master-0 kubenswrapper[18592]: I0308 04:15:24.152745 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9d6e9be4-a15c-476b-a4e3-cbbbc2c3c264-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"9d6e9be4-a15c-476b-a4e3-cbbbc2c3c264\") " pod="openstack/ovsdbserver-sb-0" Mar 08 04:15:24.156308 master-0 kubenswrapper[18592]: I0308 04:15:24.156260 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d6e9be4-a15c-476b-a4e3-cbbbc2c3c264-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9d6e9be4-a15c-476b-a4e3-cbbbc2c3c264\") " pod="openstack/ovsdbserver-sb-0" Mar 08 04:15:24.156706 master-0 kubenswrapper[18592]: I0308 04:15:24.156662 18592 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 08 04:15:24.156878 master-0 kubenswrapper[18592]: I0308 04:15:24.156856 18592 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-2bc81af2-035c-4577-af27-93afee93376f\" (UniqueName: \"kubernetes.io/csi/topolvm.io^96ccbb96-3189-477d-bb36-ff31c61b2b28\") pod \"ovsdbserver-sb-0\" (UID: \"9d6e9be4-a15c-476b-a4e3-cbbbc2c3c264\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/15f918159c5b70e2463dd4eed227bae7440420c0cc88557d6b92b92727e1006a/globalmount\"" pod="openstack/ovsdbserver-sb-0" Mar 08 04:15:24.157166 master-0 kubenswrapper[18592]: I0308 04:15:24.156680 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d6e9be4-a15c-476b-a4e3-cbbbc2c3c264-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9d6e9be4-a15c-476b-a4e3-cbbbc2c3c264\") " pod="openstack/ovsdbserver-sb-0" Mar 08 04:15:24.160886 master-0 kubenswrapper[18592]: I0308 04:15:24.160748 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d6e9be4-a15c-476b-a4e3-cbbbc2c3c264-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"9d6e9be4-a15c-476b-a4e3-cbbbc2c3c264\") " pod="openstack/ovsdbserver-sb-0" Mar 08 04:15:24.172962 master-0 kubenswrapper[18592]: I0308 04:15:24.166998 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2ms4\" (UniqueName: \"kubernetes.io/projected/9d6e9be4-a15c-476b-a4e3-cbbbc2c3c264-kube-api-access-j2ms4\") pod \"ovsdbserver-sb-0\" (UID: \"9d6e9be4-a15c-476b-a4e3-cbbbc2c3c264\") " pod="openstack/ovsdbserver-sb-0" Mar 08 04:15:24.693843 master-0 kubenswrapper[18592]: I0308 04:15:24.681215 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 08 04:15:24.693843 master-0 kubenswrapper[18592]: I0308 04:15:24.684431 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 08 04:15:24.693843 master-0 kubenswrapper[18592]: I0308 04:15:24.686752 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 08 04:15:24.693843 master-0 kubenswrapper[18592]: I0308 04:15:24.688106 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 08 04:15:24.693843 master-0 kubenswrapper[18592]: I0308 04:15:24.688229 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 08 04:15:24.693843 master-0 kubenswrapper[18592]: I0308 04:15:24.691793 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 08 04:15:24.869812 master-0 kubenswrapper[18592]: I0308 04:15:24.869744 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19d803f7-e454-4197-833f-539d8f1926ca-config\") pod \"ovsdbserver-nb-0\" (UID: \"19d803f7-e454-4197-833f-539d8f1926ca\") " pod="openstack/ovsdbserver-nb-0" Mar 08 04:15:24.870052 master-0 kubenswrapper[18592]: I0308 04:15:24.869918 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19d803f7-e454-4197-833f-539d8f1926ca-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"19d803f7-e454-4197-833f-539d8f1926ca\") " pod="openstack/ovsdbserver-nb-0" Mar 08 04:15:24.870052 master-0 kubenswrapper[18592]: I0308 04:15:24.869967 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lc6gw\" (UniqueName: \"kubernetes.io/projected/19d803f7-e454-4197-833f-539d8f1926ca-kube-api-access-lc6gw\") pod \"ovsdbserver-nb-0\" (UID: \"19d803f7-e454-4197-833f-539d8f1926ca\") " pod="openstack/ovsdbserver-nb-0" Mar 08 04:15:24.870052 
master-0 kubenswrapper[18592]: I0308 04:15:24.870023 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/19d803f7-e454-4197-833f-539d8f1926ca-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"19d803f7-e454-4197-833f-539d8f1926ca\") " pod="openstack/ovsdbserver-nb-0" Mar 08 04:15:24.871073 master-0 kubenswrapper[18592]: I0308 04:15:24.870175 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/19d803f7-e454-4197-833f-539d8f1926ca-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"19d803f7-e454-4197-833f-539d8f1926ca\") " pod="openstack/ovsdbserver-nb-0" Mar 08 04:15:24.871073 master-0 kubenswrapper[18592]: I0308 04:15:24.870394 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/19d803f7-e454-4197-833f-539d8f1926ca-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"19d803f7-e454-4197-833f-539d8f1926ca\") " pod="openstack/ovsdbserver-nb-0" Mar 08 04:15:24.871073 master-0 kubenswrapper[18592]: I0308 04:15:24.870549 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-19515682-1559-4e10-89c4-f3e1006f2c94\" (UniqueName: \"kubernetes.io/csi/topolvm.io^f689bcff-a1bf-49dd-90c3-77d4b0e0b636\") pod \"ovsdbserver-nb-0\" (UID: \"19d803f7-e454-4197-833f-539d8f1926ca\") " pod="openstack/ovsdbserver-nb-0" Mar 08 04:15:24.871073 master-0 kubenswrapper[18592]: I0308 04:15:24.870786 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/19d803f7-e454-4197-833f-539d8f1926ca-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"19d803f7-e454-4197-833f-539d8f1926ca\") " pod="openstack/ovsdbserver-nb-0" 
Mar 08 04:15:24.976295 master-0 kubenswrapper[18592]: I0308 04:15:24.976132 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19d803f7-e454-4197-833f-539d8f1926ca-config\") pod \"ovsdbserver-nb-0\" (UID: \"19d803f7-e454-4197-833f-539d8f1926ca\") " pod="openstack/ovsdbserver-nb-0" Mar 08 04:15:24.976542 master-0 kubenswrapper[18592]: I0308 04:15:24.976323 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19d803f7-e454-4197-833f-539d8f1926ca-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"19d803f7-e454-4197-833f-539d8f1926ca\") " pod="openstack/ovsdbserver-nb-0" Mar 08 04:15:24.976542 master-0 kubenswrapper[18592]: I0308 04:15:24.976361 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lc6gw\" (UniqueName: \"kubernetes.io/projected/19d803f7-e454-4197-833f-539d8f1926ca-kube-api-access-lc6gw\") pod \"ovsdbserver-nb-0\" (UID: \"19d803f7-e454-4197-833f-539d8f1926ca\") " pod="openstack/ovsdbserver-nb-0" Mar 08 04:15:24.976542 master-0 kubenswrapper[18592]: I0308 04:15:24.976395 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/19d803f7-e454-4197-833f-539d8f1926ca-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"19d803f7-e454-4197-833f-539d8f1926ca\") " pod="openstack/ovsdbserver-nb-0" Mar 08 04:15:24.976542 master-0 kubenswrapper[18592]: I0308 04:15:24.976419 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/19d803f7-e454-4197-833f-539d8f1926ca-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"19d803f7-e454-4197-833f-539d8f1926ca\") " pod="openstack/ovsdbserver-nb-0" Mar 08 04:15:24.976542 master-0 kubenswrapper[18592]: I0308 04:15:24.976473 18592 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/19d803f7-e454-4197-833f-539d8f1926ca-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"19d803f7-e454-4197-833f-539d8f1926ca\") " pod="openstack/ovsdbserver-nb-0" Mar 08 04:15:24.976907 master-0 kubenswrapper[18592]: I0308 04:15:24.976763 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-19515682-1559-4e10-89c4-f3e1006f2c94\" (UniqueName: \"kubernetes.io/csi/topolvm.io^f689bcff-a1bf-49dd-90c3-77d4b0e0b636\") pod \"ovsdbserver-nb-0\" (UID: \"19d803f7-e454-4197-833f-539d8f1926ca\") " pod="openstack/ovsdbserver-nb-0" Mar 08 04:15:24.976982 master-0 kubenswrapper[18592]: I0308 04:15:24.976938 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/19d803f7-e454-4197-833f-539d8f1926ca-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"19d803f7-e454-4197-833f-539d8f1926ca\") " pod="openstack/ovsdbserver-nb-0" Mar 08 04:15:24.978505 master-0 kubenswrapper[18592]: I0308 04:15:24.978423 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/19d803f7-e454-4197-833f-539d8f1926ca-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"19d803f7-e454-4197-833f-539d8f1926ca\") " pod="openstack/ovsdbserver-nb-0" Mar 08 04:15:24.978720 master-0 kubenswrapper[18592]: I0308 04:15:24.978567 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19d803f7-e454-4197-833f-539d8f1926ca-config\") pod \"ovsdbserver-nb-0\" (UID: \"19d803f7-e454-4197-833f-539d8f1926ca\") " pod="openstack/ovsdbserver-nb-0" Mar 08 04:15:24.980913 master-0 kubenswrapper[18592]: I0308 04:15:24.980871 18592 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. 
Skipping MountDevice... Mar 08 04:15:24.981051 master-0 kubenswrapper[18592]: I0308 04:15:24.980936 18592 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-19515682-1559-4e10-89c4-f3e1006f2c94\" (UniqueName: \"kubernetes.io/csi/topolvm.io^f689bcff-a1bf-49dd-90c3-77d4b0e0b636\") pod \"ovsdbserver-nb-0\" (UID: \"19d803f7-e454-4197-833f-539d8f1926ca\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/c402d77a6d950a3719d90727f934933700e8ff41ff4d87dc93322eeb88802116/globalmount\"" pod="openstack/ovsdbserver-nb-0" Mar 08 04:15:24.981131 master-0 kubenswrapper[18592]: I0308 04:15:24.981059 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/19d803f7-e454-4197-833f-539d8f1926ca-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"19d803f7-e454-4197-833f-539d8f1926ca\") " pod="openstack/ovsdbserver-nb-0" Mar 08 04:15:24.981994 master-0 kubenswrapper[18592]: I0308 04:15:24.981729 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/19d803f7-e454-4197-833f-539d8f1926ca-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"19d803f7-e454-4197-833f-539d8f1926ca\") " pod="openstack/ovsdbserver-nb-0" Mar 08 04:15:24.982312 master-0 kubenswrapper[18592]: I0308 04:15:24.982015 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/19d803f7-e454-4197-833f-539d8f1926ca-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"19d803f7-e454-4197-833f-539d8f1926ca\") " pod="openstack/ovsdbserver-nb-0" Mar 08 04:15:24.983965 master-0 kubenswrapper[18592]: I0308 04:15:24.983896 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19d803f7-e454-4197-833f-539d8f1926ca-combined-ca-bundle\") pod 
\"ovsdbserver-nb-0\" (UID: \"19d803f7-e454-4197-833f-539d8f1926ca\") " pod="openstack/ovsdbserver-nb-0" Mar 08 04:15:24.996769 master-0 kubenswrapper[18592]: I0308 04:15:24.996622 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lc6gw\" (UniqueName: \"kubernetes.io/projected/19d803f7-e454-4197-833f-539d8f1926ca-kube-api-access-lc6gw\") pod \"ovsdbserver-nb-0\" (UID: \"19d803f7-e454-4197-833f-539d8f1926ca\") " pod="openstack/ovsdbserver-nb-0" Mar 08 04:15:25.496934 master-0 kubenswrapper[18592]: I0308 04:15:25.494248 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-2bc81af2-035c-4577-af27-93afee93376f\" (UniqueName: \"kubernetes.io/csi/topolvm.io^96ccbb96-3189-477d-bb36-ff31c61b2b28\") pod \"ovsdbserver-sb-0\" (UID: \"9d6e9be4-a15c-476b-a4e3-cbbbc2c3c264\") " pod="openstack/ovsdbserver-sb-0" Mar 08 04:15:25.719455 master-0 kubenswrapper[18592]: I0308 04:15:25.719398 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 08 04:15:26.878053 master-0 kubenswrapper[18592]: I0308 04:15:26.878013 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-19515682-1559-4e10-89c4-f3e1006f2c94\" (UniqueName: \"kubernetes.io/csi/topolvm.io^f689bcff-a1bf-49dd-90c3-77d4b0e0b636\") pod \"ovsdbserver-nb-0\" (UID: \"19d803f7-e454-4197-833f-539d8f1926ca\") " pod="openstack/ovsdbserver-nb-0" Mar 08 04:15:26.986441 master-0 kubenswrapper[18592]: I0308 04:15:26.986368 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 08 04:15:27.012973 master-0 kubenswrapper[18592]: I0308 04:15:27.012805 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6877bbfb4f-ht5gv"] Mar 08 04:15:27.037367 master-0 kubenswrapper[18592]: I0308 04:15:27.037325 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 08 04:15:27.046089 master-0 kubenswrapper[18592]: I0308 04:15:27.045875 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 08 04:15:27.052710 master-0 kubenswrapper[18592]: W0308 04:15:27.052672 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6cc32df9_dcb4_43f3_b78d_f992b0488bf1.slice/crio-1501344506d57b65aded59794a13e4dfa0ee234297371a5e8726edf0f061d43c WatchSource:0}: Error finding container 1501344506d57b65aded59794a13e4dfa0ee234297371a5e8726edf0f061d43c: Status 404 returned error can't find the container with id 1501344506d57b65aded59794a13e4dfa0ee234297371a5e8726edf0f061d43c Mar 08 04:15:27.125901 master-0 kubenswrapper[18592]: I0308 04:15:27.116966 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 08 04:15:27.450569 master-0 kubenswrapper[18592]: I0308 04:15:27.450492 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6cc32df9-dcb4-43f3-b78d-f992b0488bf1","Type":"ContainerStarted","Data":"1501344506d57b65aded59794a13e4dfa0ee234297371a5e8726edf0f061d43c"} Mar 08 04:15:27.452304 master-0 kubenswrapper[18592]: I0308 04:15:27.452252 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"d174d635-30c7-4d7d-a077-aa6436a5675a","Type":"ContainerStarted","Data":"d5c502ee34598385134eca13421a6b6749d04710b09c25d278f88a7f2cd13892"} Mar 08 04:15:27.466284 master-0 kubenswrapper[18592]: I0308 04:15:27.466210 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"466c2b13-2b27-4a83-911c-db97d66490a5","Type":"ContainerStarted","Data":"c81ff84fefc8ffd9cf549066032d28ea3a8a9637d1578390571da46e37e0f421"} Mar 08 04:15:27.468931 master-0 kubenswrapper[18592]: I0308 04:15:27.468669 18592 generic.go:334] "Generic (PLEG): container finished" podID="651222ab-0f62-459d-b8a2-4897537cb868" containerID="b5a41486677b463369d89b8d803a6545363c3deeba115d7b0f7132d360e5dbc8" exitCode=0 Mar 08 04:15:27.468931 master-0 kubenswrapper[18592]: I0308 04:15:27.468783 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55994974c5-gb4wt" event={"ID":"651222ab-0f62-459d-b8a2-4897537cb868","Type":"ContainerDied","Data":"b5a41486677b463369d89b8d803a6545363c3deeba115d7b0f7132d360e5dbc8"} Mar 08 04:15:27.473642 master-0 kubenswrapper[18592]: I0308 04:15:27.473332 18592 generic.go:334] "Generic (PLEG): container finished" podID="a117758c-2170-460f-8db5-251a9ca2006f" containerID="21c1373d4b83bff60cc9c50af9de0e6b8ea4f76ac0618d1c25fc3b52b5d7a11d" exitCode=0 Mar 08 04:15:27.473642 master-0 kubenswrapper[18592]: I0308 04:15:27.473414 18592 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/dnsmasq-dns-5d859fb5df-tch88" event={"ID":"a117758c-2170-460f-8db5-251a9ca2006f","Type":"ContainerDied","Data":"21c1373d4b83bff60cc9c50af9de0e6b8ea4f76ac0618d1c25fc3b52b5d7a11d"} Mar 08 04:15:27.485899 master-0 kubenswrapper[18592]: I0308 04:15:27.485776 18592 generic.go:334] "Generic (PLEG): container finished" podID="bfc42b5c-d187-4d48-8c06-f13c9eedeb0f" containerID="aaec311876223841843dcd3f9aa2a1f47b6324e74198c6e495885189ec50d560" exitCode=0 Mar 08 04:15:27.486025 master-0 kubenswrapper[18592]: I0308 04:15:27.485903 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6877bbfb4f-ht5gv" event={"ID":"bfc42b5c-d187-4d48-8c06-f13c9eedeb0f","Type":"ContainerDied","Data":"aaec311876223841843dcd3f9aa2a1f47b6324e74198c6e495885189ec50d560"} Mar 08 04:15:27.486025 master-0 kubenswrapper[18592]: I0308 04:15:27.485960 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6877bbfb4f-ht5gv" event={"ID":"bfc42b5c-d187-4d48-8c06-f13c9eedeb0f","Type":"ContainerStarted","Data":"d8a6e82a1df2cf6279d2ced34377d7541e895791af15a51c957431163763b5c6"} Mar 08 04:15:27.721005 master-0 kubenswrapper[18592]: I0308 04:15:27.720197 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-hkfg8"] Mar 08 04:15:27.776732 master-0 kubenswrapper[18592]: I0308 04:15:27.776662 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f75dd7cd9-2zs4f"] Mar 08 04:15:27.799145 master-0 kubenswrapper[18592]: I0308 04:15:27.794054 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 08 04:15:27.848762 master-0 kubenswrapper[18592]: I0308 04:15:27.848715 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 08 04:15:27.859485 master-0 kubenswrapper[18592]: E0308 04:15:27.859425 18592 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Mar 08 
04:15:27.859485 master-0 kubenswrapper[18592]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/bfc42b5c-d187-4d48-8c06-f13c9eedeb0f/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 08 04:15:27.859485 master-0 kubenswrapper[18592]: > podSandboxID="d8a6e82a1df2cf6279d2ced34377d7541e895791af15a51c957431163763b5c6" Mar 08 04:15:27.859640 master-0 kubenswrapper[18592]: E0308 04:15:27.859614 18592 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 04:15:27.859640 master-0 kubenswrapper[18592]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nbchf8h696h5ffh5cdh585hc5hbfh597h58dhfh554h67bh9bh5c9hfch7dh5fbhbbh567h78h669hf8h65dh55dh588h5ddh88h694h669h95h8q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l8wc7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},Liven
essProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000800000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-6877bbfb4f-ht5gv_openstack(bfc42b5c-d187-4d48-8c06-f13c9eedeb0f): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/bfc42b5c-d187-4d48-8c06-f13c9eedeb0f/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 08 04:15:27.859640 master-0 kubenswrapper[18592]: > logger="UnhandledError" Mar 08 04:15:27.860938 master-0 kubenswrapper[18592]: E0308 04:15:27.860792 18592 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/bfc42b5c-d187-4d48-8c06-f13c9eedeb0f/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" 
pod="openstack/dnsmasq-dns-6877bbfb4f-ht5gv" podUID="bfc42b5c-d187-4d48-8c06-f13c9eedeb0f" Mar 08 04:15:28.058615 master-0 kubenswrapper[18592]: I0308 04:15:28.058519 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-2lfv5"] Mar 08 04:15:28.153885 master-0 kubenswrapper[18592]: I0308 04:15:28.153795 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55994974c5-gb4wt" Mar 08 04:15:28.186199 master-0 kubenswrapper[18592]: I0308 04:15:28.186145 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 08 04:15:28.196257 master-0 kubenswrapper[18592]: I0308 04:15:28.196201 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88pwx\" (UniqueName: \"kubernetes.io/projected/651222ab-0f62-459d-b8a2-4897537cb868-kube-api-access-88pwx\") pod \"651222ab-0f62-459d-b8a2-4897537cb868\" (UID: \"651222ab-0f62-459d-b8a2-4897537cb868\") " Mar 08 04:15:28.196394 master-0 kubenswrapper[18592]: I0308 04:15:28.196335 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/651222ab-0f62-459d-b8a2-4897537cb868-config\") pod \"651222ab-0f62-459d-b8a2-4897537cb868\" (UID: \"651222ab-0f62-459d-b8a2-4897537cb868\") " Mar 08 04:15:28.199104 master-0 kubenswrapper[18592]: I0308 04:15:28.199069 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d859fb5df-tch88" Mar 08 04:15:28.199551 master-0 kubenswrapper[18592]: I0308 04:15:28.199512 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/651222ab-0f62-459d-b8a2-4897537cb868-kube-api-access-88pwx" (OuterVolumeSpecName: "kube-api-access-88pwx") pod "651222ab-0f62-459d-b8a2-4897537cb868" (UID: "651222ab-0f62-459d-b8a2-4897537cb868"). InnerVolumeSpecName "kube-api-access-88pwx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 04:15:28.246926 master-0 kubenswrapper[18592]: I0308 04:15:28.246511 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/651222ab-0f62-459d-b8a2-4897537cb868-config" (OuterVolumeSpecName: "config") pod "651222ab-0f62-459d-b8a2-4897537cb868" (UID: "651222ab-0f62-459d-b8a2-4897537cb868"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:15:28.299030 master-0 kubenswrapper[18592]: I0308 04:15:28.298651 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9bdsk\" (UniqueName: \"kubernetes.io/projected/a117758c-2170-460f-8db5-251a9ca2006f-kube-api-access-9bdsk\") pod \"a117758c-2170-460f-8db5-251a9ca2006f\" (UID: \"a117758c-2170-460f-8db5-251a9ca2006f\") " Mar 08 04:15:28.299030 master-0 kubenswrapper[18592]: I0308 04:15:28.298780 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a117758c-2170-460f-8db5-251a9ca2006f-dns-svc\") pod \"a117758c-2170-460f-8db5-251a9ca2006f\" (UID: \"a117758c-2170-460f-8db5-251a9ca2006f\") " Mar 08 04:15:28.299274 master-0 kubenswrapper[18592]: I0308 04:15:28.299038 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a117758c-2170-460f-8db5-251a9ca2006f-config\") pod \"a117758c-2170-460f-8db5-251a9ca2006f\" (UID: \"a117758c-2170-460f-8db5-251a9ca2006f\") " Mar 08 04:15:28.299470 master-0 kubenswrapper[18592]: I0308 04:15:28.299436 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88pwx\" (UniqueName: \"kubernetes.io/projected/651222ab-0f62-459d-b8a2-4897537cb868-kube-api-access-88pwx\") on node \"master-0\" DevicePath \"\"" Mar 08 04:15:28.299470 master-0 kubenswrapper[18592]: I0308 04:15:28.299460 18592 reconciler_common.go:293] "Volume 
detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/651222ab-0f62-459d-b8a2-4897537cb868-config\") on node \"master-0\" DevicePath \"\"" Mar 08 04:15:28.305538 master-0 kubenswrapper[18592]: I0308 04:15:28.305499 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a117758c-2170-460f-8db5-251a9ca2006f-kube-api-access-9bdsk" (OuterVolumeSpecName: "kube-api-access-9bdsk") pod "a117758c-2170-460f-8db5-251a9ca2006f" (UID: "a117758c-2170-460f-8db5-251a9ca2006f"). InnerVolumeSpecName "kube-api-access-9bdsk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 04:15:28.333837 master-0 kubenswrapper[18592]: I0308 04:15:28.322140 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a117758c-2170-460f-8db5-251a9ca2006f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a117758c-2170-460f-8db5-251a9ca2006f" (UID: "a117758c-2170-460f-8db5-251a9ca2006f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:15:28.333837 master-0 kubenswrapper[18592]: I0308 04:15:28.323144 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a117758c-2170-460f-8db5-251a9ca2006f-config" (OuterVolumeSpecName: "config") pod "a117758c-2170-460f-8db5-251a9ca2006f" (UID: "a117758c-2170-460f-8db5-251a9ca2006f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:15:28.400834 master-0 kubenswrapper[18592]: I0308 04:15:28.400780 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9bdsk\" (UniqueName: \"kubernetes.io/projected/a117758c-2170-460f-8db5-251a9ca2006f-kube-api-access-9bdsk\") on node \"master-0\" DevicePath \"\"" Mar 08 04:15:28.400834 master-0 kubenswrapper[18592]: I0308 04:15:28.400817 18592 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a117758c-2170-460f-8db5-251a9ca2006f-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 08 04:15:28.400834 master-0 kubenswrapper[18592]: I0308 04:15:28.400838 18592 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a117758c-2170-460f-8db5-251a9ca2006f-config\") on node \"master-0\" DevicePath \"\"" Mar 08 04:15:28.494697 master-0 kubenswrapper[18592]: I0308 04:15:28.494657 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"19d803f7-e454-4197-833f-539d8f1926ca","Type":"ContainerStarted","Data":"d65a6f3f5cd90c00bab7b7bedc7a19db52b8ee6f79380b7d9b979775d26c03b1"} Mar 08 04:15:28.495693 master-0 kubenswrapper[18592]: I0308 04:15:28.495644 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-2lfv5" event={"ID":"9fb36151-5aa5-462a-b8da-a082585a5a26","Type":"ContainerStarted","Data":"32f8590a743377919bc13844190e18106a22a12188cbbcc80d895a2e5f596bcb"} Mar 08 04:15:28.496930 master-0 kubenswrapper[18592]: I0308 04:15:28.496900 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hkfg8" event={"ID":"bf9606de-37b2-4dc9-8b6b-adfb1d81f6d1","Type":"ContainerStarted","Data":"d8ae0423986363633c63f28f01493ed117cf0f519492de2b13d447ba0092c569"} Mar 08 04:15:28.498561 master-0 kubenswrapper[18592]: I0308 04:15:28.498133 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-55994974c5-gb4wt" event={"ID":"651222ab-0f62-459d-b8a2-4897537cb868","Type":"ContainerDied","Data":"42b4d92ca29aea50e5c5ed37429e2638d90231bc42aacbfceb4df8a00d38bf35"} Mar 08 04:15:28.498561 master-0 kubenswrapper[18592]: I0308 04:15:28.498164 18592 scope.go:117] "RemoveContainer" containerID="b5a41486677b463369d89b8d803a6545363c3deeba115d7b0f7132d360e5dbc8" Mar 08 04:15:28.498561 master-0 kubenswrapper[18592]: I0308 04:15:28.498268 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55994974c5-gb4wt" Mar 08 04:15:28.503581 master-0 kubenswrapper[18592]: I0308 04:15:28.503528 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d859fb5df-tch88" event={"ID":"a117758c-2170-460f-8db5-251a9ca2006f","Type":"ContainerDied","Data":"dc725511c655099fc1670c36fc1ea7f438387a8654ab97a1de9d5a4f0165605a"} Mar 08 04:15:28.503639 master-0 kubenswrapper[18592]: I0308 04:15:28.503548 18592 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d859fb5df-tch88" Mar 08 04:15:28.504975 master-0 kubenswrapper[18592]: I0308 04:15:28.504936 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"7d8ff61a-7e75-41ec-9314-40ff5a0fea03","Type":"ContainerStarted","Data":"509a64efa5ec48406d8918e3b4b3c0d5cd89ffc32e8f37f9f52677764740ab0c"} Mar 08 04:15:28.506437 master-0 kubenswrapper[18592]: I0308 04:15:28.506392 18592 generic.go:334] "Generic (PLEG): container finished" podID="d021c6b4-8118-43d3-a703-8c2f73e6e077" containerID="cad1d8c8bc57c2f7e4d71daaacd0e8e77c9fbdeb822bcb372dd43138c4083cd7" exitCode=0 Mar 08 04:15:28.506496 master-0 kubenswrapper[18592]: I0308 04:15:28.506454 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f75dd7cd9-2zs4f" event={"ID":"d021c6b4-8118-43d3-a703-8c2f73e6e077","Type":"ContainerDied","Data":"cad1d8c8bc57c2f7e4d71daaacd0e8e77c9fbdeb822bcb372dd43138c4083cd7"} Mar 08 04:15:28.506496 master-0 kubenswrapper[18592]: I0308 04:15:28.506479 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f75dd7cd9-2zs4f" event={"ID":"d021c6b4-8118-43d3-a703-8c2f73e6e077","Type":"ContainerStarted","Data":"077c5a7bab12e49ae5a70a01665cb59164ede9d1399cbb5470ea2541e4e0da41"} Mar 08 04:15:28.507455 master-0 kubenswrapper[18592]: I0308 04:15:28.507419 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"17b43cd8-4413-4958-9473-bbc5448585dc","Type":"ContainerStarted","Data":"a5b03ddc6d8c9f53a12bf9fc1a59736155b277957626badadc99e131bc8403da"} Mar 08 04:15:28.525044 master-0 kubenswrapper[18592]: I0308 04:15:28.525009 18592 scope.go:117] "RemoveContainer" containerID="21c1373d4b83bff60cc9c50af9de0e6b8ea4f76ac0618d1c25fc3b52b5d7a11d" Mar 08 04:15:31.876036 master-0 kubenswrapper[18592]: I0308 04:15:31.875880 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-5d859fb5df-tch88"] Mar 08 04:15:32.003383 master-0 kubenswrapper[18592]: I0308 04:15:32.001425 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5d859fb5df-tch88"] Mar 08 04:15:32.200738 master-0 kubenswrapper[18592]: I0308 04:15:32.200692 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a117758c-2170-460f-8db5-251a9ca2006f" path="/var/lib/kubelet/pods/a117758c-2170-460f-8db5-251a9ca2006f/volumes" Mar 08 04:15:32.273797 master-0 kubenswrapper[18592]: I0308 04:15:32.272607 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 08 04:15:32.324344 master-0 kubenswrapper[18592]: I0308 04:15:32.324287 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6877bbfb4f-ht5gv" event={"ID":"bfc42b5c-d187-4d48-8c06-f13c9eedeb0f","Type":"ContainerStarted","Data":"cf3aa9dc92db12abd2d6f7b72d8c866955bcdacb9eb2e781d134260a36379af8"} Mar 08 04:15:32.324888 master-0 kubenswrapper[18592]: I0308 04:15:32.324843 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6877bbfb4f-ht5gv" Mar 08 04:15:32.339894 master-0 kubenswrapper[18592]: I0308 04:15:32.329653 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f75dd7cd9-2zs4f" event={"ID":"d021c6b4-8118-43d3-a703-8c2f73e6e077","Type":"ContainerStarted","Data":"ee3298007e604218ca7104c92674c537834cf4a8e7b6a33c253d1de4de1e8351"} Mar 08 04:15:32.340033 master-0 kubenswrapper[18592]: I0308 04:15:32.340007 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6f75dd7cd9-2zs4f" Mar 08 04:15:32.462950 master-0 kubenswrapper[18592]: I0308 04:15:32.456553 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55994974c5-gb4wt"] Mar 08 04:15:32.490036 master-0 kubenswrapper[18592]: I0308 04:15:32.489971 18592 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/dnsmasq-dns-55994974c5-gb4wt"]
Mar 08 04:15:32.565621 master-0 kubenswrapper[18592]: I0308 04:15:32.565147 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6f75dd7cd9-2zs4f" podStartSLOduration=21.565115569 podStartE2EDuration="21.565115569s" podCreationTimestamp="2026-03-08 04:15:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 04:15:32.557678734 +0000 UTC m=+1344.656433124" watchObservedRunningTime="2026-03-08 04:15:32.565115569 +0000 UTC m=+1344.663869919"
Mar 08 04:15:32.622844 master-0 kubenswrapper[18592]: I0308 04:15:32.621409 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6877bbfb4f-ht5gv" podStartSLOduration=22.621392606 podStartE2EDuration="22.621392606s" podCreationTimestamp="2026-03-08 04:15:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 04:15:32.617674123 +0000 UTC m=+1344.716428493" watchObservedRunningTime="2026-03-08 04:15:32.621392606 +0000 UTC m=+1344.720146956"
Mar 08 04:15:33.342366 master-0 kubenswrapper[18592]: I0308 04:15:33.342112 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"9d6e9be4-a15c-476b-a4e3-cbbbc2c3c264","Type":"ContainerStarted","Data":"d682309e122e3748557f31f7c7cf6b929627aece38860b174401a60a80a07dd2"}
Mar 08 04:15:34.164391 master-0 kubenswrapper[18592]: I0308 04:15:34.164268 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="651222ab-0f62-459d-b8a2-4897537cb868" path="/var/lib/kubelet/pods/651222ab-0f62-459d-b8a2-4897537cb868/volumes"
Mar 08 04:15:36.158144 master-0 kubenswrapper[18592]: I0308 04:15:36.156949 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6877bbfb4f-ht5gv"
Mar 08 04:15:36.995090 master-0 kubenswrapper[18592]: I0308 04:15:36.995033 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6f75dd7cd9-2zs4f"
Mar 08 04:15:37.892908 master-0 kubenswrapper[18592]: I0308 04:15:37.892840 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6877bbfb4f-ht5gv"]
Mar 08 04:15:37.893551 master-0 kubenswrapper[18592]: I0308 04:15:37.893078 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6877bbfb4f-ht5gv" podUID="bfc42b5c-d187-4d48-8c06-f13c9eedeb0f" containerName="dnsmasq-dns" containerID="cri-o://cf3aa9dc92db12abd2d6f7b72d8c866955bcdacb9eb2e781d134260a36379af8" gracePeriod=10
Mar 08 04:15:40.277269 master-0 kubenswrapper[18592]: I0308 04:15:40.276887 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6877bbfb4f-ht5gv"
Mar 08 04:15:40.393067 master-0 kubenswrapper[18592]: I0308 04:15:40.393020 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bfc42b5c-d187-4d48-8c06-f13c9eedeb0f-dns-svc\") pod \"bfc42b5c-d187-4d48-8c06-f13c9eedeb0f\" (UID: \"bfc42b5c-d187-4d48-8c06-f13c9eedeb0f\") "
Mar 08 04:15:40.393278 master-0 kubenswrapper[18592]: I0308 04:15:40.393124 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8wc7\" (UniqueName: \"kubernetes.io/projected/bfc42b5c-d187-4d48-8c06-f13c9eedeb0f-kube-api-access-l8wc7\") pod \"bfc42b5c-d187-4d48-8c06-f13c9eedeb0f\" (UID: \"bfc42b5c-d187-4d48-8c06-f13c9eedeb0f\") "
Mar 08 04:15:40.393278 master-0 kubenswrapper[18592]: I0308 04:15:40.393213 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfc42b5c-d187-4d48-8c06-f13c9eedeb0f-config\") pod \"bfc42b5c-d187-4d48-8c06-f13c9eedeb0f\" (UID: \"bfc42b5c-d187-4d48-8c06-f13c9eedeb0f\") "
Mar 08 04:15:40.398501 master-0 kubenswrapper[18592]: I0308 04:15:40.398468 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfc42b5c-d187-4d48-8c06-f13c9eedeb0f-kube-api-access-l8wc7" (OuterVolumeSpecName: "kube-api-access-l8wc7") pod "bfc42b5c-d187-4d48-8c06-f13c9eedeb0f" (UID: "bfc42b5c-d187-4d48-8c06-f13c9eedeb0f"). InnerVolumeSpecName "kube-api-access-l8wc7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 04:15:40.425981 master-0 kubenswrapper[18592]: I0308 04:15:40.425934 18592 generic.go:334] "Generic (PLEG): container finished" podID="bfc42b5c-d187-4d48-8c06-f13c9eedeb0f" containerID="cf3aa9dc92db12abd2d6f7b72d8c866955bcdacb9eb2e781d134260a36379af8" exitCode=0
Mar 08 04:15:40.425981 master-0 kubenswrapper[18592]: I0308 04:15:40.425977 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6877bbfb4f-ht5gv" event={"ID":"bfc42b5c-d187-4d48-8c06-f13c9eedeb0f","Type":"ContainerDied","Data":"cf3aa9dc92db12abd2d6f7b72d8c866955bcdacb9eb2e781d134260a36379af8"}
Mar 08 04:15:40.426113 master-0 kubenswrapper[18592]: I0308 04:15:40.426000 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6877bbfb4f-ht5gv" event={"ID":"bfc42b5c-d187-4d48-8c06-f13c9eedeb0f","Type":"ContainerDied","Data":"d8a6e82a1df2cf6279d2ced34377d7541e895791af15a51c957431163763b5c6"}
Mar 08 04:15:40.426113 master-0 kubenswrapper[18592]: I0308 04:15:40.426016 18592 scope.go:117] "RemoveContainer" containerID="cf3aa9dc92db12abd2d6f7b72d8c866955bcdacb9eb2e781d134260a36379af8"
Mar 08 04:15:40.426113 master-0 kubenswrapper[18592]: I0308 04:15:40.426038 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6877bbfb4f-ht5gv"
Mar 08 04:15:40.471787 master-0 kubenswrapper[18592]: I0308 04:15:40.471746 18592 scope.go:117] "RemoveContainer" containerID="aaec311876223841843dcd3f9aa2a1f47b6324e74198c6e495885189ec50d560"
Mar 08 04:15:40.495049 master-0 kubenswrapper[18592]: I0308 04:15:40.495006 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8wc7\" (UniqueName: \"kubernetes.io/projected/bfc42b5c-d187-4d48-8c06-f13c9eedeb0f-kube-api-access-l8wc7\") on node \"master-0\" DevicePath \"\""
Mar 08 04:15:40.505458 master-0 kubenswrapper[18592]: I0308 04:15:40.505371 18592 scope.go:117] "RemoveContainer" containerID="cf3aa9dc92db12abd2d6f7b72d8c866955bcdacb9eb2e781d134260a36379af8"
Mar 08 04:15:40.506096 master-0 kubenswrapper[18592]: E0308 04:15:40.506063 18592 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf3aa9dc92db12abd2d6f7b72d8c866955bcdacb9eb2e781d134260a36379af8\": container with ID starting with cf3aa9dc92db12abd2d6f7b72d8c866955bcdacb9eb2e781d134260a36379af8 not found: ID does not exist" containerID="cf3aa9dc92db12abd2d6f7b72d8c866955bcdacb9eb2e781d134260a36379af8"
Mar 08 04:15:40.506159 master-0 kubenswrapper[18592]: I0308 04:15:40.506101 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf3aa9dc92db12abd2d6f7b72d8c866955bcdacb9eb2e781d134260a36379af8"} err="failed to get container status \"cf3aa9dc92db12abd2d6f7b72d8c866955bcdacb9eb2e781d134260a36379af8\": rpc error: code = NotFound desc = could not find container \"cf3aa9dc92db12abd2d6f7b72d8c866955bcdacb9eb2e781d134260a36379af8\": container with ID starting with cf3aa9dc92db12abd2d6f7b72d8c866955bcdacb9eb2e781d134260a36379af8 not found: ID does not exist"
Mar 08 04:15:40.506159 master-0 kubenswrapper[18592]: I0308 04:15:40.506129 18592 scope.go:117] "RemoveContainer" containerID="aaec311876223841843dcd3f9aa2a1f47b6324e74198c6e495885189ec50d560"
Mar 08 04:15:40.507503 master-0 kubenswrapper[18592]: E0308 04:15:40.506556 18592 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aaec311876223841843dcd3f9aa2a1f47b6324e74198c6e495885189ec50d560\": container with ID starting with aaec311876223841843dcd3f9aa2a1f47b6324e74198c6e495885189ec50d560 not found: ID does not exist" containerID="aaec311876223841843dcd3f9aa2a1f47b6324e74198c6e495885189ec50d560"
Mar 08 04:15:40.507503 master-0 kubenswrapper[18592]: I0308 04:15:40.506610 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aaec311876223841843dcd3f9aa2a1f47b6324e74198c6e495885189ec50d560"} err="failed to get container status \"aaec311876223841843dcd3f9aa2a1f47b6324e74198c6e495885189ec50d560\": rpc error: code = NotFound desc = could not find container \"aaec311876223841843dcd3f9aa2a1f47b6324e74198c6e495885189ec50d560\": container with ID starting with aaec311876223841843dcd3f9aa2a1f47b6324e74198c6e495885189ec50d560 not found: ID does not exist"
Mar 08 04:15:40.537493 master-0 kubenswrapper[18592]: I0308 04:15:40.537419 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfc42b5c-d187-4d48-8c06-f13c9eedeb0f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bfc42b5c-d187-4d48-8c06-f13c9eedeb0f" (UID: "bfc42b5c-d187-4d48-8c06-f13c9eedeb0f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 04:15:40.597850 master-0 kubenswrapper[18592]: I0308 04:15:40.596656 18592 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bfc42b5c-d187-4d48-8c06-f13c9eedeb0f-dns-svc\") on node \"master-0\" DevicePath \"\""
Mar 08 04:15:40.875327 master-0 kubenswrapper[18592]: I0308 04:15:40.875175 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfc42b5c-d187-4d48-8c06-f13c9eedeb0f-config" (OuterVolumeSpecName: "config") pod "bfc42b5c-d187-4d48-8c06-f13c9eedeb0f" (UID: "bfc42b5c-d187-4d48-8c06-f13c9eedeb0f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 04:15:40.902789 master-0 kubenswrapper[18592]: I0308 04:15:40.902680 18592 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfc42b5c-d187-4d48-8c06-f13c9eedeb0f-config\") on node \"master-0\" DevicePath \"\""
Mar 08 04:15:41.073601 master-0 kubenswrapper[18592]: I0308 04:15:41.073527 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6877bbfb4f-ht5gv"]
Mar 08 04:15:41.085368 master-0 kubenswrapper[18592]: I0308 04:15:41.085281 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6877bbfb4f-ht5gv"]
Mar 08 04:15:41.456886 master-0 kubenswrapper[18592]: I0308 04:15:41.456745 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"7d8ff61a-7e75-41ec-9314-40ff5a0fea03","Type":"ContainerStarted","Data":"7faa9f36c42f90faf8eed5684cd95ede0a16f91a8b6f5b3b82a9157b7aa718c4"}
Mar 08 04:15:41.463333 master-0 kubenswrapper[18592]: I0308 04:15:41.463224 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"17b43cd8-4413-4958-9473-bbc5448585dc","Type":"ContainerStarted","Data":"4996c8bd1eb456cd3cf55219f045758fb75f97c4f290518341ebca3f814b546a"}
Mar 08 04:15:41.467438 master-0 kubenswrapper[18592]: I0308 04:15:41.467326 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"d174d635-30c7-4d7d-a077-aa6436a5675a","Type":"ContainerStarted","Data":"e798c47661a2794bc0628df2427d1a0c025a2065d53a702f00592972038a90e4"}
Mar 08 04:15:41.467777 master-0 kubenswrapper[18592]: I0308 04:15:41.467753 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0"
Mar 08 04:15:41.472179 master-0 kubenswrapper[18592]: I0308 04:15:41.472091 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"19d803f7-e454-4197-833f-539d8f1926ca","Type":"ContainerStarted","Data":"61df43f8ff360cbaabb59865b8194cb0e94c1c2bb9a00cb2da1f539a35f600da"}
Mar 08 04:15:41.488639 master-0 kubenswrapper[18592]: I0308 04:15:41.484798 18592 generic.go:334] "Generic (PLEG): container finished" podID="9fb36151-5aa5-462a-b8da-a082585a5a26" containerID="7911ee54e84a12387c29dc193d9de3a66a37f06c5d1335443fc082be0ebef367" exitCode=0
Mar 08 04:15:41.488639 master-0 kubenswrapper[18592]: I0308 04:15:41.484870 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-2lfv5" event={"ID":"9fb36151-5aa5-462a-b8da-a082585a5a26","Type":"ContainerDied","Data":"7911ee54e84a12387c29dc193d9de3a66a37f06c5d1335443fc082be0ebef367"}
Mar 08 04:15:41.505736 master-0 kubenswrapper[18592]: I0308 04:15:41.505179 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hkfg8" event={"ID":"bf9606de-37b2-4dc9-8b6b-adfb1d81f6d1","Type":"ContainerStarted","Data":"05d6106205e70a4e0ad73917cb676067664b15682e091621dbf66e4790597ef7"}
Mar 08 04:15:41.505736 master-0 kubenswrapper[18592]: I0308 04:15:41.505252 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-hkfg8"
Mar 08 04:15:41.509929 master-0 kubenswrapper[18592]: I0308 04:15:41.509865 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"9d6e9be4-a15c-476b-a4e3-cbbbc2c3c264","Type":"ContainerStarted","Data":"dd2b4f72433a314603e9e8759f8c7aa5164eadc723693c177bdd065646fd7c92"}
Mar 08 04:15:41.585663 master-0 kubenswrapper[18592]: I0308 04:15:41.585504 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=14.431092621 podStartE2EDuration="27.58548397s" podCreationTimestamp="2026-03-08 04:15:14 +0000 UTC" firstStartedPulling="2026-03-08 04:15:26.996347267 +0000 UTC m=+1339.095101617" lastFinishedPulling="2026-03-08 04:15:40.150738616 +0000 UTC m=+1352.249492966" observedRunningTime="2026-03-08 04:15:41.522011004 +0000 UTC m=+1353.620765354" watchObservedRunningTime="2026-03-08 04:15:41.58548397 +0000 UTC m=+1353.684238320"
Mar 08 04:15:41.639732 master-0 kubenswrapper[18592]: I0308 04:15:41.639630 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-hkfg8" podStartSLOduration=10.15243073 podStartE2EDuration="22.639603027s" podCreationTimestamp="2026-03-08 04:15:19 +0000 UTC" firstStartedPulling="2026-03-08 04:15:27.758387126 +0000 UTC m=+1339.857141476" lastFinishedPulling="2026-03-08 04:15:40.245559423 +0000 UTC m=+1352.344313773" observedRunningTime="2026-03-08 04:15:41.572664517 +0000 UTC m=+1353.671418887" watchObservedRunningTime="2026-03-08 04:15:41.639603027 +0000 UTC m=+1353.738357377"
Mar 08 04:15:42.164540 master-0 kubenswrapper[18592]: I0308 04:15:42.162809 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfc42b5c-d187-4d48-8c06-f13c9eedeb0f" path="/var/lib/kubelet/pods/bfc42b5c-d187-4d48-8c06-f13c9eedeb0f/volumes"
Mar 08 04:15:42.520856 master-0 kubenswrapper[18592]: I0308 04:15:42.520776 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-2lfv5" event={"ID":"9fb36151-5aa5-462a-b8da-a082585a5a26","Type":"ContainerStarted","Data":"56b7a81692a7f41744c2763f98fc66e4f0abf38e8d7d724edc9807589e198265"}
Mar 08 04:15:42.521493 master-0 kubenswrapper[18592]: I0308 04:15:42.520864 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-2lfv5" event={"ID":"9fb36151-5aa5-462a-b8da-a082585a5a26","Type":"ContainerStarted","Data":"372501d2d4171e936eda33845ce34955ba59d14ff4eb3f3a305aef501ea701d4"}
Mar 08 04:15:42.521493 master-0 kubenswrapper[18592]: I0308 04:15:42.520949 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-2lfv5"
Mar 08 04:15:42.526330 master-0 kubenswrapper[18592]: I0308 04:15:42.526302 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6cc32df9-dcb4-43f3-b78d-f992b0488bf1","Type":"ContainerStarted","Data":"f547e9071c3e4ece9adb6ee184ffdb6d1ee7719b88141eda4fb0b5e8b1fdc8bb"}
Mar 08 04:15:42.535009 master-0 kubenswrapper[18592]: I0308 04:15:42.534958 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"466c2b13-2b27-4a83-911c-db97d66490a5","Type":"ContainerStarted","Data":"3547d58beca639c1dd05442bb3ec63e90c02312aa9592cb67cc9bcbf9a7f2eaa"}
Mar 08 04:15:42.551265 master-0 kubenswrapper[18592]: I0308 04:15:42.551155 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-2lfv5" podStartSLOduration=11.477620571 podStartE2EDuration="23.551135866s" podCreationTimestamp="2026-03-08 04:15:19 +0000 UTC" firstStartedPulling="2026-03-08 04:15:28.076987435 +0000 UTC m=+1340.175741785" lastFinishedPulling="2026-03-08 04:15:40.15050271 +0000 UTC m=+1352.249257080" observedRunningTime="2026-03-08 04:15:42.54038465 +0000 UTC m=+1354.639139010" watchObservedRunningTime="2026-03-08 04:15:42.551135866 +0000 UTC m=+1354.649890216"
Mar 08 04:15:43.053863 master-0 kubenswrapper[18592]: I0308 04:15:43.053015 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-5lz68"]
Mar 08 04:15:43.053863 master-0 kubenswrapper[18592]: E0308 04:15:43.053496 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfc42b5c-d187-4d48-8c06-f13c9eedeb0f" containerName="dnsmasq-dns"
Mar 08 04:15:43.053863 master-0 kubenswrapper[18592]: I0308 04:15:43.053509 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfc42b5c-d187-4d48-8c06-f13c9eedeb0f" containerName="dnsmasq-dns"
Mar 08 04:15:43.053863 master-0 kubenswrapper[18592]: E0308 04:15:43.053521 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="651222ab-0f62-459d-b8a2-4897537cb868" containerName="init"
Mar 08 04:15:43.053863 master-0 kubenswrapper[18592]: I0308 04:15:43.053527 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="651222ab-0f62-459d-b8a2-4897537cb868" containerName="init"
Mar 08 04:15:43.053863 master-0 kubenswrapper[18592]: E0308 04:15:43.053540 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a117758c-2170-460f-8db5-251a9ca2006f" containerName="init"
Mar 08 04:15:43.053863 master-0 kubenswrapper[18592]: I0308 04:15:43.053547 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="a117758c-2170-460f-8db5-251a9ca2006f" containerName="init"
Mar 08 04:15:43.053863 master-0 kubenswrapper[18592]: E0308 04:15:43.053567 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfc42b5c-d187-4d48-8c06-f13c9eedeb0f" containerName="init"
Mar 08 04:15:43.053863 master-0 kubenswrapper[18592]: I0308 04:15:43.053573 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfc42b5c-d187-4d48-8c06-f13c9eedeb0f" containerName="init"
Mar 08 04:15:43.053863 master-0 kubenswrapper[18592]: I0308 04:15:43.053757 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfc42b5c-d187-4d48-8c06-f13c9eedeb0f" containerName="dnsmasq-dns"
Mar 08 04:15:43.053863 master-0 kubenswrapper[18592]: I0308 04:15:43.053779 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="651222ab-0f62-459d-b8a2-4897537cb868" containerName="init"
Mar 08 04:15:43.053863 master-0 kubenswrapper[18592]: I0308 04:15:43.053800 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="a117758c-2170-460f-8db5-251a9ca2006f" containerName="init"
Mar 08 04:15:43.058487 master-0 kubenswrapper[18592]: I0308 04:15:43.054544 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-5lz68"
Mar 08 04:15:43.060261 master-0 kubenswrapper[18592]: I0308 04:15:43.059605 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config"
Mar 08 04:15:43.074259 master-0 kubenswrapper[18592]: I0308 04:15:43.073586 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-5lz68"]
Mar 08 04:15:43.146085 master-0 kubenswrapper[18592]: I0308 04:15:43.145464 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa276697-ebfd-42a0-b269-60e71b01056c-combined-ca-bundle\") pod \"ovn-controller-metrics-5lz68\" (UID: \"fa276697-ebfd-42a0-b269-60e71b01056c\") " pod="openstack/ovn-controller-metrics-5lz68"
Mar 08 04:15:43.146085 master-0 kubenswrapper[18592]: I0308 04:15:43.145522 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/fa276697-ebfd-42a0-b269-60e71b01056c-ovn-rundir\") pod \"ovn-controller-metrics-5lz68\" (UID: \"fa276697-ebfd-42a0-b269-60e71b01056c\") " pod="openstack/ovn-controller-metrics-5lz68"
Mar 08 04:15:43.146085 master-0 kubenswrapper[18592]: I0308 04:15:43.145800 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa276697-ebfd-42a0-b269-60e71b01056c-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-5lz68\" (UID: \"fa276697-ebfd-42a0-b269-60e71b01056c\") " pod="openstack/ovn-controller-metrics-5lz68"
Mar 08 04:15:43.146085 master-0 kubenswrapper[18592]: I0308 04:15:43.145989 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/fa276697-ebfd-42a0-b269-60e71b01056c-ovs-rundir\") pod \"ovn-controller-metrics-5lz68\" (UID: \"fa276697-ebfd-42a0-b269-60e71b01056c\") " pod="openstack/ovn-controller-metrics-5lz68"
Mar 08 04:15:43.146575 master-0 kubenswrapper[18592]: I0308 04:15:43.146170 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjsbb\" (UniqueName: \"kubernetes.io/projected/fa276697-ebfd-42a0-b269-60e71b01056c-kube-api-access-pjsbb\") pod \"ovn-controller-metrics-5lz68\" (UID: \"fa276697-ebfd-42a0-b269-60e71b01056c\") " pod="openstack/ovn-controller-metrics-5lz68"
Mar 08 04:15:43.146575 master-0 kubenswrapper[18592]: I0308 04:15:43.146239 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa276697-ebfd-42a0-b269-60e71b01056c-config\") pod \"ovn-controller-metrics-5lz68\" (UID: \"fa276697-ebfd-42a0-b269-60e71b01056c\") " pod="openstack/ovn-controller-metrics-5lz68"
Mar 08 04:15:43.251011 master-0 kubenswrapper[18592]: I0308 04:15:43.250937 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjsbb\" (UniqueName: \"kubernetes.io/projected/fa276697-ebfd-42a0-b269-60e71b01056c-kube-api-access-pjsbb\") pod \"ovn-controller-metrics-5lz68\" (UID: \"fa276697-ebfd-42a0-b269-60e71b01056c\") " pod="openstack/ovn-controller-metrics-5lz68"
Mar 08 04:15:43.251011 master-0 kubenswrapper[18592]: I0308 04:15:43.251019 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa276697-ebfd-42a0-b269-60e71b01056c-config\") pod \"ovn-controller-metrics-5lz68\" (UID: \"fa276697-ebfd-42a0-b269-60e71b01056c\") " pod="openstack/ovn-controller-metrics-5lz68"
Mar 08 04:15:43.251283 master-0 kubenswrapper[18592]: I0308 04:15:43.251095 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa276697-ebfd-42a0-b269-60e71b01056c-combined-ca-bundle\") pod \"ovn-controller-metrics-5lz68\" (UID: \"fa276697-ebfd-42a0-b269-60e71b01056c\") " pod="openstack/ovn-controller-metrics-5lz68"
Mar 08 04:15:43.251283 master-0 kubenswrapper[18592]: I0308 04:15:43.251116 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/fa276697-ebfd-42a0-b269-60e71b01056c-ovn-rundir\") pod \"ovn-controller-metrics-5lz68\" (UID: \"fa276697-ebfd-42a0-b269-60e71b01056c\") " pod="openstack/ovn-controller-metrics-5lz68"
Mar 08 04:15:43.251283 master-0 kubenswrapper[18592]: I0308 04:15:43.251232 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa276697-ebfd-42a0-b269-60e71b01056c-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-5lz68\" (UID: \"fa276697-ebfd-42a0-b269-60e71b01056c\") " pod="openstack/ovn-controller-metrics-5lz68"
Mar 08 04:15:43.251283 master-0 kubenswrapper[18592]: I0308 04:15:43.251263 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/fa276697-ebfd-42a0-b269-60e71b01056c-ovs-rundir\") pod \"ovn-controller-metrics-5lz68\" (UID: \"fa276697-ebfd-42a0-b269-60e71b01056c\") " pod="openstack/ovn-controller-metrics-5lz68"
Mar 08 04:15:43.252846 master-0 kubenswrapper[18592]: I0308 04:15:43.252784 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/fa276697-ebfd-42a0-b269-60e71b01056c-ovs-rundir\") pod \"ovn-controller-metrics-5lz68\" (UID: \"fa276697-ebfd-42a0-b269-60e71b01056c\") " pod="openstack/ovn-controller-metrics-5lz68"
Mar 08 04:15:43.253021 master-0 kubenswrapper[18592]: I0308 04:15:43.252989 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/fa276697-ebfd-42a0-b269-60e71b01056c-ovn-rundir\") pod \"ovn-controller-metrics-5lz68\" (UID: \"fa276697-ebfd-42a0-b269-60e71b01056c\") " pod="openstack/ovn-controller-metrics-5lz68"
Mar 08 04:15:43.253682 master-0 kubenswrapper[18592]: I0308 04:15:43.253648 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa276697-ebfd-42a0-b269-60e71b01056c-config\") pod \"ovn-controller-metrics-5lz68\" (UID: \"fa276697-ebfd-42a0-b269-60e71b01056c\") " pod="openstack/ovn-controller-metrics-5lz68"
Mar 08 04:15:43.255591 master-0 kubenswrapper[18592]: I0308 04:15:43.255520 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b59cd5bfc-fkx85"]
Mar 08 04:15:43.256481 master-0 kubenswrapper[18592]: I0308 04:15:43.256449 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa276697-ebfd-42a0-b269-60e71b01056c-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-5lz68\" (UID: \"fa276697-ebfd-42a0-b269-60e71b01056c\") " pod="openstack/ovn-controller-metrics-5lz68"
Mar 08 04:15:43.256580 master-0 kubenswrapper[18592]: I0308 04:15:43.256559 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa276697-ebfd-42a0-b269-60e71b01056c-combined-ca-bundle\") pod \"ovn-controller-metrics-5lz68\" (UID: \"fa276697-ebfd-42a0-b269-60e71b01056c\") " pod="openstack/ovn-controller-metrics-5lz68"
Mar 08 04:15:43.257600 master-0 kubenswrapper[18592]: I0308 04:15:43.257564 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b59cd5bfc-fkx85"
Mar 08 04:15:43.264251 master-0 kubenswrapper[18592]: I0308 04:15:43.261385 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb"
Mar 08 04:15:43.276338 master-0 kubenswrapper[18592]: I0308 04:15:43.276270 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b59cd5bfc-fkx85"]
Mar 08 04:15:43.284806 master-0 kubenswrapper[18592]: I0308 04:15:43.284757 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjsbb\" (UniqueName: \"kubernetes.io/projected/fa276697-ebfd-42a0-b269-60e71b01056c-kube-api-access-pjsbb\") pod \"ovn-controller-metrics-5lz68\" (UID: \"fa276697-ebfd-42a0-b269-60e71b01056c\") " pod="openstack/ovn-controller-metrics-5lz68"
Mar 08 04:15:43.366197 master-0 kubenswrapper[18592]: I0308 04:15:43.353403 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgtcb\" (UniqueName: \"kubernetes.io/projected/754e1319-1bad-4c83-92f1-f590c65509b4-kube-api-access-zgtcb\") pod \"dnsmasq-dns-b59cd5bfc-fkx85\" (UID: \"754e1319-1bad-4c83-92f1-f590c65509b4\") " pod="openstack/dnsmasq-dns-b59cd5bfc-fkx85"
Mar 08 04:15:43.366197 master-0 kubenswrapper[18592]: I0308 04:15:43.353468 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/754e1319-1bad-4c83-92f1-f590c65509b4-ovsdbserver-sb\") pod \"dnsmasq-dns-b59cd5bfc-fkx85\" (UID: \"754e1319-1bad-4c83-92f1-f590c65509b4\") " pod="openstack/dnsmasq-dns-b59cd5bfc-fkx85"
Mar 08 04:15:43.366197 master-0 kubenswrapper[18592]: I0308 04:15:43.353492 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/754e1319-1bad-4c83-92f1-f590c65509b4-config\") pod \"dnsmasq-dns-b59cd5bfc-fkx85\" (UID: \"754e1319-1bad-4c83-92f1-f590c65509b4\") " pod="openstack/dnsmasq-dns-b59cd5bfc-fkx85"
Mar 08 04:15:43.366197 master-0 kubenswrapper[18592]: I0308 04:15:43.353600 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/754e1319-1bad-4c83-92f1-f590c65509b4-dns-svc\") pod \"dnsmasq-dns-b59cd5bfc-fkx85\" (UID: \"754e1319-1bad-4c83-92f1-f590c65509b4\") " pod="openstack/dnsmasq-dns-b59cd5bfc-fkx85"
Mar 08 04:15:43.383113 master-0 kubenswrapper[18592]: I0308 04:15:43.383043 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-5lz68"
Mar 08 04:15:43.450491 master-0 kubenswrapper[18592]: I0308 04:15:43.450444 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b59cd5bfc-fkx85"]
Mar 08 04:15:43.451639 master-0 kubenswrapper[18592]: E0308 04:15:43.451614 18592 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-zgtcb ovsdbserver-sb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-b59cd5bfc-fkx85" podUID="754e1319-1bad-4c83-92f1-f590c65509b4"
Mar 08 04:15:43.457889 master-0 kubenswrapper[18592]: I0308 04:15:43.456842 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/754e1319-1bad-4c83-92f1-f590c65509b4-dns-svc\") pod \"dnsmasq-dns-b59cd5bfc-fkx85\" (UID: \"754e1319-1bad-4c83-92f1-f590c65509b4\") " pod="openstack/dnsmasq-dns-b59cd5bfc-fkx85"
Mar 08 04:15:43.457889 master-0 kubenswrapper[18592]: I0308 04:15:43.456930 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgtcb\" (UniqueName: \"kubernetes.io/projected/754e1319-1bad-4c83-92f1-f590c65509b4-kube-api-access-zgtcb\") pod \"dnsmasq-dns-b59cd5bfc-fkx85\" (UID: \"754e1319-1bad-4c83-92f1-f590c65509b4\") " pod="openstack/dnsmasq-dns-b59cd5bfc-fkx85"
Mar 08 04:15:43.457889 master-0 kubenswrapper[18592]: I0308 04:15:43.456962 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/754e1319-1bad-4c83-92f1-f590c65509b4-ovsdbserver-sb\") pod \"dnsmasq-dns-b59cd5bfc-fkx85\" (UID: \"754e1319-1bad-4c83-92f1-f590c65509b4\") " pod="openstack/dnsmasq-dns-b59cd5bfc-fkx85"
Mar 08 04:15:43.457889 master-0 kubenswrapper[18592]: I0308 04:15:43.456982 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/754e1319-1bad-4c83-92f1-f590c65509b4-config\") pod \"dnsmasq-dns-b59cd5bfc-fkx85\" (UID: \"754e1319-1bad-4c83-92f1-f590c65509b4\") " pod="openstack/dnsmasq-dns-b59cd5bfc-fkx85"
Mar 08 04:15:43.458165 master-0 kubenswrapper[18592]: I0308 04:15:43.457913 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/754e1319-1bad-4c83-92f1-f590c65509b4-config\") pod \"dnsmasq-dns-b59cd5bfc-fkx85\" (UID: \"754e1319-1bad-4c83-92f1-f590c65509b4\") " pod="openstack/dnsmasq-dns-b59cd5bfc-fkx85"
Mar 08 04:15:43.458484 master-0 kubenswrapper[18592]: I0308 04:15:43.458463 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/754e1319-1bad-4c83-92f1-f590c65509b4-dns-svc\") pod \"dnsmasq-dns-b59cd5bfc-fkx85\" (UID: \"754e1319-1bad-4c83-92f1-f590c65509b4\") " pod="openstack/dnsmasq-dns-b59cd5bfc-fkx85"
Mar 08 04:15:43.459015 master-0 kubenswrapper[18592]: I0308 04:15:43.458999 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/754e1319-1bad-4c83-92f1-f590c65509b4-ovsdbserver-sb\") pod \"dnsmasq-dns-b59cd5bfc-fkx85\" (UID: \"754e1319-1bad-4c83-92f1-f590c65509b4\") " pod="openstack/dnsmasq-dns-b59cd5bfc-fkx85"
Mar 08 04:15:43.487714 master-0 kubenswrapper[18592]: I0308 04:15:43.486486 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgtcb\" (UniqueName: \"kubernetes.io/projected/754e1319-1bad-4c83-92f1-f590c65509b4-kube-api-access-zgtcb\") pod \"dnsmasq-dns-b59cd5bfc-fkx85\" (UID: \"754e1319-1bad-4c83-92f1-f590c65509b4\") " pod="openstack/dnsmasq-dns-b59cd5bfc-fkx85"
Mar 08 04:15:43.526134 master-0 kubenswrapper[18592]: I0308 04:15:43.513161 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-689ddcfcf7-jh95m"]
Mar 08 04:15:43.526134 master-0 kubenswrapper[18592]: I0308 04:15:43.514985 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-689ddcfcf7-jh95m"
Mar 08 04:15:43.526134 master-0 kubenswrapper[18592]: I0308 04:15:43.520687 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-689ddcfcf7-jh95m"]
Mar 08 04:15:43.547866 master-0 kubenswrapper[18592]: I0308 04:15:43.536864 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb"
Mar 08 04:15:43.558367 master-0 kubenswrapper[18592]: I0308 04:15:43.558303 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff3c3806-caca-42ef-993a-85bc4e509640-config\") pod \"dnsmasq-dns-689ddcfcf7-jh95m\" (UID: \"ff3c3806-caca-42ef-993a-85bc4e509640\") " pod="openstack/dnsmasq-dns-689ddcfcf7-jh95m"
Mar 08 04:15:43.558461 master-0 kubenswrapper[18592]: I0308 04:15:43.558389 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfnc6\" (UniqueName: \"kubernetes.io/projected/ff3c3806-caca-42ef-993a-85bc4e509640-kube-api-access-xfnc6\") pod \"dnsmasq-dns-689ddcfcf7-jh95m\" (UID: \"ff3c3806-caca-42ef-993a-85bc4e509640\") " pod="openstack/dnsmasq-dns-689ddcfcf7-jh95m"
Mar 08 04:15:43.558461 master-0 kubenswrapper[18592]: I0308 04:15:43.558413 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff3c3806-caca-42ef-993a-85bc4e509640-ovsdbserver-sb\") pod \"dnsmasq-dns-689ddcfcf7-jh95m\" (UID: \"ff3c3806-caca-42ef-993a-85bc4e509640\") " pod="openstack/dnsmasq-dns-689ddcfcf7-jh95m"
Mar 08 04:15:43.558461 master-0 kubenswrapper[18592]: I0308 04:15:43.558447 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff3c3806-caca-42ef-993a-85bc4e509640-ovsdbserver-nb\") pod \"dnsmasq-dns-689ddcfcf7-jh95m\" (UID: \"ff3c3806-caca-42ef-993a-85bc4e509640\") " pod="openstack/dnsmasq-dns-689ddcfcf7-jh95m"
Mar 08 04:15:43.558619 master-0 kubenswrapper[18592]: I0308 04:15:43.558490 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff3c3806-caca-42ef-993a-85bc4e509640-dns-svc\") pod \"dnsmasq-dns-689ddcfcf7-jh95m\" (UID: \"ff3c3806-caca-42ef-993a-85bc4e509640\") " pod="openstack/dnsmasq-dns-689ddcfcf7-jh95m"
Mar 08 04:15:43.560988 master-0 kubenswrapper[18592]: I0308 04:15:43.560957 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b59cd5bfc-fkx85"
Mar 08 04:15:43.562608 master-0 kubenswrapper[18592]: I0308 04:15:43.562579 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-2lfv5"
Mar 08 04:15:43.596037 master-0 kubenswrapper[18592]: I0308 04:15:43.595951 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b59cd5bfc-fkx85"
Mar 08 04:15:43.670244 master-0 kubenswrapper[18592]: I0308 04:15:43.668639 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/754e1319-1bad-4c83-92f1-f590c65509b4-dns-svc\") pod \"754e1319-1bad-4c83-92f1-f590c65509b4\" (UID: \"754e1319-1bad-4c83-92f1-f590c65509b4\") "
Mar 08 04:15:43.670244 master-0 kubenswrapper[18592]: I0308 04:15:43.668745 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/754e1319-1bad-4c83-92f1-f590c65509b4-ovsdbserver-sb\") pod \"754e1319-1bad-4c83-92f1-f590c65509b4\" (UID: \"754e1319-1bad-4c83-92f1-f590c65509b4\") "
Mar 08 04:15:43.670244 master-0 kubenswrapper[18592]: I0308 04:15:43.668815 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/754e1319-1bad-4c83-92f1-f590c65509b4-config\") pod \"754e1319-1bad-4c83-92f1-f590c65509b4\" (UID: \"754e1319-1bad-4c83-92f1-f590c65509b4\") "
Mar 08 04:15:43.670244 master-0 kubenswrapper[18592]: I0308 04:15:43.668906 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgtcb\" (UniqueName: \"kubernetes.io/projected/754e1319-1bad-4c83-92f1-f590c65509b4-kube-api-access-zgtcb\") pod \"754e1319-1bad-4c83-92f1-f590c65509b4\" (UID: \"754e1319-1bad-4c83-92f1-f590c65509b4\") "
Mar 08 04:15:43.670244 master-0 kubenswrapper[18592]: I0308 04:15:43.669373 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff3c3806-caca-42ef-993a-85bc4e509640-config\") pod \"dnsmasq-dns-689ddcfcf7-jh95m\" (UID: \"ff3c3806-caca-42ef-993a-85bc4e509640\") " pod="openstack/dnsmasq-dns-689ddcfcf7-jh95m"
Mar 08 04:15:43.670244 master-0 kubenswrapper[18592]: I0308 04:15:43.669430 18592
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfnc6\" (UniqueName: \"kubernetes.io/projected/ff3c3806-caca-42ef-993a-85bc4e509640-kube-api-access-xfnc6\") pod \"dnsmasq-dns-689ddcfcf7-jh95m\" (UID: \"ff3c3806-caca-42ef-993a-85bc4e509640\") " pod="openstack/dnsmasq-dns-689ddcfcf7-jh95m" Mar 08 04:15:43.670244 master-0 kubenswrapper[18592]: I0308 04:15:43.669449 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff3c3806-caca-42ef-993a-85bc4e509640-ovsdbserver-sb\") pod \"dnsmasq-dns-689ddcfcf7-jh95m\" (UID: \"ff3c3806-caca-42ef-993a-85bc4e509640\") " pod="openstack/dnsmasq-dns-689ddcfcf7-jh95m" Mar 08 04:15:43.670244 master-0 kubenswrapper[18592]: I0308 04:15:43.669509 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff3c3806-caca-42ef-993a-85bc4e509640-ovsdbserver-nb\") pod \"dnsmasq-dns-689ddcfcf7-jh95m\" (UID: \"ff3c3806-caca-42ef-993a-85bc4e509640\") " pod="openstack/dnsmasq-dns-689ddcfcf7-jh95m" Mar 08 04:15:43.670244 master-0 kubenswrapper[18592]: I0308 04:15:43.669564 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff3c3806-caca-42ef-993a-85bc4e509640-dns-svc\") pod \"dnsmasq-dns-689ddcfcf7-jh95m\" (UID: \"ff3c3806-caca-42ef-993a-85bc4e509640\") " pod="openstack/dnsmasq-dns-689ddcfcf7-jh95m" Mar 08 04:15:43.670244 master-0 kubenswrapper[18592]: I0308 04:15:43.669974 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/754e1319-1bad-4c83-92f1-f590c65509b4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "754e1319-1bad-4c83-92f1-f590c65509b4" (UID: "754e1319-1bad-4c83-92f1-f590c65509b4"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:15:43.670700 master-0 kubenswrapper[18592]: I0308 04:15:43.670307 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/754e1319-1bad-4c83-92f1-f590c65509b4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "754e1319-1bad-4c83-92f1-f590c65509b4" (UID: "754e1319-1bad-4c83-92f1-f590c65509b4"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:15:43.670700 master-0 kubenswrapper[18592]: I0308 04:15:43.670624 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/754e1319-1bad-4c83-92f1-f590c65509b4-config" (OuterVolumeSpecName: "config") pod "754e1319-1bad-4c83-92f1-f590c65509b4" (UID: "754e1319-1bad-4c83-92f1-f590c65509b4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:15:43.670963 master-0 kubenswrapper[18592]: I0308 04:15:43.670913 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff3c3806-caca-42ef-993a-85bc4e509640-dns-svc\") pod \"dnsmasq-dns-689ddcfcf7-jh95m\" (UID: \"ff3c3806-caca-42ef-993a-85bc4e509640\") " pod="openstack/dnsmasq-dns-689ddcfcf7-jh95m" Mar 08 04:15:43.677029 master-0 kubenswrapper[18592]: I0308 04:15:43.672206 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff3c3806-caca-42ef-993a-85bc4e509640-ovsdbserver-sb\") pod \"dnsmasq-dns-689ddcfcf7-jh95m\" (UID: \"ff3c3806-caca-42ef-993a-85bc4e509640\") " pod="openstack/dnsmasq-dns-689ddcfcf7-jh95m" Mar 08 04:15:43.677029 master-0 kubenswrapper[18592]: I0308 04:15:43.672334 18592 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/754e1319-1bad-4c83-92f1-f590c65509b4-config\") on node \"master-0\" DevicePath \"\"" Mar 08 
04:15:43.677029 master-0 kubenswrapper[18592]: I0308 04:15:43.672345 18592 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/754e1319-1bad-4c83-92f1-f590c65509b4-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 08 04:15:43.677029 master-0 kubenswrapper[18592]: I0308 04:15:43.672354 18592 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/754e1319-1bad-4c83-92f1-f590c65509b4-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Mar 08 04:15:43.677029 master-0 kubenswrapper[18592]: I0308 04:15:43.673475 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff3c3806-caca-42ef-993a-85bc4e509640-ovsdbserver-nb\") pod \"dnsmasq-dns-689ddcfcf7-jh95m\" (UID: \"ff3c3806-caca-42ef-993a-85bc4e509640\") " pod="openstack/dnsmasq-dns-689ddcfcf7-jh95m" Mar 08 04:15:43.677644 master-0 kubenswrapper[18592]: I0308 04:15:43.677619 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff3c3806-caca-42ef-993a-85bc4e509640-config\") pod \"dnsmasq-dns-689ddcfcf7-jh95m\" (UID: \"ff3c3806-caca-42ef-993a-85bc4e509640\") " pod="openstack/dnsmasq-dns-689ddcfcf7-jh95m" Mar 08 04:15:43.701852 master-0 kubenswrapper[18592]: I0308 04:15:43.697882 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfnc6\" (UniqueName: \"kubernetes.io/projected/ff3c3806-caca-42ef-993a-85bc4e509640-kube-api-access-xfnc6\") pod \"dnsmasq-dns-689ddcfcf7-jh95m\" (UID: \"ff3c3806-caca-42ef-993a-85bc4e509640\") " pod="openstack/dnsmasq-dns-689ddcfcf7-jh95m" Mar 08 04:15:43.701852 master-0 kubenswrapper[18592]: I0308 04:15:43.700035 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/754e1319-1bad-4c83-92f1-f590c65509b4-kube-api-access-zgtcb" (OuterVolumeSpecName: 
"kube-api-access-zgtcb") pod "754e1319-1bad-4c83-92f1-f590c65509b4" (UID: "754e1319-1bad-4c83-92f1-f590c65509b4"). InnerVolumeSpecName "kube-api-access-zgtcb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 04:15:43.775844 master-0 kubenswrapper[18592]: I0308 04:15:43.774118 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgtcb\" (UniqueName: \"kubernetes.io/projected/754e1319-1bad-4c83-92f1-f590c65509b4-kube-api-access-zgtcb\") on node \"master-0\" DevicePath \"\"" Mar 08 04:15:43.859108 master-0 kubenswrapper[18592]: I0308 04:15:43.858982 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-689ddcfcf7-jh95m" Mar 08 04:15:44.577551 master-0 kubenswrapper[18592]: I0308 04:15:44.577489 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b59cd5bfc-fkx85" Mar 08 04:15:44.638310 master-0 kubenswrapper[18592]: I0308 04:15:44.632175 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b59cd5bfc-fkx85"] Mar 08 04:15:44.643084 master-0 kubenswrapper[18592]: I0308 04:15:44.642425 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b59cd5bfc-fkx85"] Mar 08 04:15:45.596971 master-0 kubenswrapper[18592]: I0308 04:15:45.596918 18592 generic.go:334] "Generic (PLEG): container finished" podID="7d8ff61a-7e75-41ec-9314-40ff5a0fea03" containerID="7faa9f36c42f90faf8eed5684cd95ede0a16f91a8b6f5b3b82a9157b7aa718c4" exitCode=0 Mar 08 04:15:45.597443 master-0 kubenswrapper[18592]: I0308 04:15:45.596943 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"7d8ff61a-7e75-41ec-9314-40ff5a0fea03","Type":"ContainerDied","Data":"7faa9f36c42f90faf8eed5684cd95ede0a16f91a8b6f5b3b82a9157b7aa718c4"} Mar 08 04:15:45.600002 master-0 kubenswrapper[18592]: I0308 04:15:45.599253 18592 generic.go:334] "Generic (PLEG): container finished" 
podID="17b43cd8-4413-4958-9473-bbc5448585dc" containerID="4996c8bd1eb456cd3cf55219f045758fb75f97c4f290518341ebca3f814b546a" exitCode=0 Mar 08 04:15:45.600002 master-0 kubenswrapper[18592]: I0308 04:15:45.599294 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"17b43cd8-4413-4958-9473-bbc5448585dc","Type":"ContainerDied","Data":"4996c8bd1eb456cd3cf55219f045758fb75f97c4f290518341ebca3f814b546a"} Mar 08 04:15:45.878686 master-0 kubenswrapper[18592]: I0308 04:15:45.878602 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-5lz68"] Mar 08 04:15:46.053474 master-0 kubenswrapper[18592]: I0308 04:15:46.053410 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-689ddcfcf7-jh95m"] Mar 08 04:15:46.055062 master-0 kubenswrapper[18592]: W0308 04:15:46.055017 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff3c3806_caca_42ef_993a_85bc4e509640.slice/crio-5c89f0b76ab9802411596ba2ae041f5d4c44edb91c44b252892bdfcb11170dce WatchSource:0}: Error finding container 5c89f0b76ab9802411596ba2ae041f5d4c44edb91c44b252892bdfcb11170dce: Status 404 returned error can't find the container with id 5c89f0b76ab9802411596ba2ae041f5d4c44edb91c44b252892bdfcb11170dce Mar 08 04:15:46.161578 master-0 kubenswrapper[18592]: I0308 04:15:46.158771 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="754e1319-1bad-4c83-92f1-f590c65509b4" path="/var/lib/kubelet/pods/754e1319-1bad-4c83-92f1-f590c65509b4/volumes" Mar 08 04:15:46.614186 master-0 kubenswrapper[18592]: I0308 04:15:46.613996 18592 generic.go:334] "Generic (PLEG): container finished" podID="ff3c3806-caca-42ef-993a-85bc4e509640" containerID="6fa67719324bab80394201b5cf7e864c9c6f746b3a1ad91fae10fe49168214c1" exitCode=0 Mar 08 04:15:46.614186 master-0 kubenswrapper[18592]: I0308 04:15:46.614070 18592 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-689ddcfcf7-jh95m" event={"ID":"ff3c3806-caca-42ef-993a-85bc4e509640","Type":"ContainerDied","Data":"6fa67719324bab80394201b5cf7e864c9c6f746b3a1ad91fae10fe49168214c1"} Mar 08 04:15:46.614186 master-0 kubenswrapper[18592]: I0308 04:15:46.614098 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-689ddcfcf7-jh95m" event={"ID":"ff3c3806-caca-42ef-993a-85bc4e509640","Type":"ContainerStarted","Data":"5c89f0b76ab9802411596ba2ae041f5d4c44edb91c44b252892bdfcb11170dce"} Mar 08 04:15:46.618079 master-0 kubenswrapper[18592]: I0308 04:15:46.618007 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"9d6e9be4-a15c-476b-a4e3-cbbbc2c3c264","Type":"ContainerStarted","Data":"02963c0fa42020d1e2119b3c6b7c210f5cd7353f6e3e68e4f6fefd3370532fd7"} Mar 08 04:15:46.620358 master-0 kubenswrapper[18592]: I0308 04:15:46.620309 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-5lz68" event={"ID":"fa276697-ebfd-42a0-b269-60e71b01056c","Type":"ContainerStarted","Data":"e3dd78196f244f9c32fff99b20a94dd8b770ed1da14edea629798ea08774b901"} Mar 08 04:15:46.620358 master-0 kubenswrapper[18592]: I0308 04:15:46.620348 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-5lz68" event={"ID":"fa276697-ebfd-42a0-b269-60e71b01056c","Type":"ContainerStarted","Data":"2023268fada70308cdd12342c47f462ade25f98347db094500fbc1b34739b903"} Mar 08 04:15:46.626787 master-0 kubenswrapper[18592]: I0308 04:15:46.626724 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"7d8ff61a-7e75-41ec-9314-40ff5a0fea03","Type":"ContainerStarted","Data":"f87620e29cb7643a520962e5561b9447f5397998015a482cc8032c46e28ea5eb"} Mar 08 04:15:46.632526 master-0 kubenswrapper[18592]: I0308 04:15:46.632441 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/openstack-cell1-galera-0" event={"ID":"17b43cd8-4413-4958-9473-bbc5448585dc","Type":"ContainerStarted","Data":"6434bb0b067dbcd129f353ed226a608d7df022b3f3b155050ccb37a5f793db78"} Mar 08 04:15:46.635434 master-0 kubenswrapper[18592]: I0308 04:15:46.635370 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"19d803f7-e454-4197-833f-539d8f1926ca","Type":"ContainerStarted","Data":"991913048f6ffa9e0aaec002211f0ab5b09287668dc16f36d49e65db310ae5a4"} Mar 08 04:15:46.696861 master-0 kubenswrapper[18592]: I0308 04:15:46.696660 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=22.424520395 podStartE2EDuration="34.6966042s" podCreationTimestamp="2026-03-08 04:15:12 +0000 UTC" firstStartedPulling="2026-03-08 04:15:27.877437388 +0000 UTC m=+1339.976191738" lastFinishedPulling="2026-03-08 04:15:40.149521193 +0000 UTC m=+1352.248275543" observedRunningTime="2026-03-08 04:15:46.689293279 +0000 UTC m=+1358.788047669" watchObservedRunningTime="2026-03-08 04:15:46.6966042 +0000 UTC m=+1358.795358570" Mar 08 04:15:46.720898 master-0 kubenswrapper[18592]: I0308 04:15:46.720814 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 08 04:15:46.762288 master-0 kubenswrapper[18592]: I0308 04:15:46.762127 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=12.578823822 podStartE2EDuration="25.76210815s" podCreationTimestamp="2026-03-08 04:15:21 +0000 UTC" firstStartedPulling="2026-03-08 04:15:32.3114082 +0000 UTC m=+1344.410162550" lastFinishedPulling="2026-03-08 04:15:45.494692528 +0000 UTC m=+1357.593446878" observedRunningTime="2026-03-08 04:15:46.717657789 +0000 UTC m=+1358.816412149" watchObservedRunningTime="2026-03-08 04:15:46.76210815 +0000 UTC m=+1358.860862510" Mar 08 04:15:46.772310 master-0 
kubenswrapper[18592]: I0308 04:15:46.771962 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=22.306486121 podStartE2EDuration="34.771936291s" podCreationTimestamp="2026-03-08 04:15:12 +0000 UTC" firstStartedPulling="2026-03-08 04:15:27.79417818 +0000 UTC m=+1339.892932530" lastFinishedPulling="2026-03-08 04:15:40.25962834 +0000 UTC m=+1352.358382700" observedRunningTime="2026-03-08 04:15:46.748090805 +0000 UTC m=+1358.846845195" watchObservedRunningTime="2026-03-08 04:15:46.771936291 +0000 UTC m=+1358.870690661" Mar 08 04:15:46.808519 master-0 kubenswrapper[18592]: I0308 04:15:46.805077 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-5lz68" podStartSLOduration=3.805052242 podStartE2EDuration="3.805052242s" podCreationTimestamp="2026-03-08 04:15:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 04:15:46.788563528 +0000 UTC m=+1358.887317888" watchObservedRunningTime="2026-03-08 04:15:46.805052242 +0000 UTC m=+1358.903806612" Mar 08 04:15:46.821563 master-0 kubenswrapper[18592]: I0308 04:15:46.821493 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 08 04:15:46.839910 master-0 kubenswrapper[18592]: I0308 04:15:46.839136 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=7.4719114730000005 podStartE2EDuration="24.839117128s" podCreationTimestamp="2026-03-08 04:15:22 +0000 UTC" firstStartedPulling="2026-03-08 04:15:28.167449111 +0000 UTC m=+1340.266203471" lastFinishedPulling="2026-03-08 04:15:45.534654776 +0000 UTC m=+1357.633409126" observedRunningTime="2026-03-08 04:15:46.827997212 +0000 UTC m=+1358.926751562" watchObservedRunningTime="2026-03-08 04:15:46.839117128 +0000 UTC 
m=+1358.937871478" Mar 08 04:15:47.118056 master-0 kubenswrapper[18592]: I0308 04:15:47.118006 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 08 04:15:47.648939 master-0 kubenswrapper[18592]: I0308 04:15:47.648744 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-689ddcfcf7-jh95m" event={"ID":"ff3c3806-caca-42ef-993a-85bc4e509640","Type":"ContainerStarted","Data":"9bf4910b9e2bde3ae21cefb3f6bf1431a41853ad07a96b9bec2ca95e4593ff20"} Mar 08 04:15:47.649815 master-0 kubenswrapper[18592]: I0308 04:15:47.648990 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-689ddcfcf7-jh95m" Mar 08 04:15:47.650120 master-0 kubenswrapper[18592]: I0308 04:15:47.650062 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 08 04:15:47.691134 master-0 kubenswrapper[18592]: I0308 04:15:47.690964 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-689ddcfcf7-jh95m" podStartSLOduration=4.690896384 podStartE2EDuration="4.690896384s" podCreationTimestamp="2026-03-08 04:15:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 04:15:47.674047031 +0000 UTC m=+1359.772801381" watchObservedRunningTime="2026-03-08 04:15:47.690896384 +0000 UTC m=+1359.789650774" Mar 08 04:15:47.731873 master-0 kubenswrapper[18592]: I0308 04:15:47.731503 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 08 04:15:48.118599 master-0 kubenswrapper[18592]: I0308 04:15:48.118505 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 08 04:15:48.179162 master-0 kubenswrapper[18592]: I0308 04:15:48.179089 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openstack/ovsdbserver-nb-0" Mar 08 04:15:48.685986 master-0 kubenswrapper[18592]: E0308 04:15:48.685642 18592 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 192.168.32.10:54156->192.168.32.10:42143: write tcp 192.168.32.10:54156->192.168.32.10:42143: write: broken pipe Mar 08 04:15:48.702954 master-0 kubenswrapper[18592]: I0308 04:15:48.702871 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 08 04:15:48.881204 master-0 kubenswrapper[18592]: I0308 04:15:48.881154 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 08 04:15:48.888885 master-0 kubenswrapper[18592]: I0308 04:15:48.888472 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 08 04:15:48.910201 master-0 kubenswrapper[18592]: I0308 04:15:48.909687 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 08 04:15:48.910201 master-0 kubenswrapper[18592]: I0308 04:15:48.909737 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 08 04:15:48.910201 master-0 kubenswrapper[18592]: I0308 04:15:48.910117 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 08 04:15:48.947402 master-0 kubenswrapper[18592]: I0308 04:15:48.942718 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 08 04:15:49.020338 master-0 kubenswrapper[18592]: I0308 04:15:49.020299 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e48f207-5d13-41af-9187-97f528daeb55-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"8e48f207-5d13-41af-9187-97f528daeb55\") " pod="openstack/ovn-northd-0" Mar 08 04:15:49.020539 master-0 kubenswrapper[18592]: 
I0308 04:15:49.020523 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxg96\" (UniqueName: \"kubernetes.io/projected/8e48f207-5d13-41af-9187-97f528daeb55-kube-api-access-zxg96\") pod \"ovn-northd-0\" (UID: \"8e48f207-5d13-41af-9187-97f528daeb55\") " pod="openstack/ovn-northd-0" Mar 08 04:15:49.020637 master-0 kubenswrapper[18592]: I0308 04:15:49.020622 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e48f207-5d13-41af-9187-97f528daeb55-config\") pod \"ovn-northd-0\" (UID: \"8e48f207-5d13-41af-9187-97f528daeb55\") " pod="openstack/ovn-northd-0" Mar 08 04:15:49.020748 master-0 kubenswrapper[18592]: I0308 04:15:49.020735 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e48f207-5d13-41af-9187-97f528daeb55-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"8e48f207-5d13-41af-9187-97f528daeb55\") " pod="openstack/ovn-northd-0" Mar 08 04:15:49.020844 master-0 kubenswrapper[18592]: I0308 04:15:49.020817 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8e48f207-5d13-41af-9187-97f528daeb55-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"8e48f207-5d13-41af-9187-97f528daeb55\") " pod="openstack/ovn-northd-0" Mar 08 04:15:49.020922 master-0 kubenswrapper[18592]: I0308 04:15:49.020911 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8e48f207-5d13-41af-9187-97f528daeb55-scripts\") pod \"ovn-northd-0\" (UID: \"8e48f207-5d13-41af-9187-97f528daeb55\") " pod="openstack/ovn-northd-0" Mar 08 04:15:49.020992 master-0 kubenswrapper[18592]: I0308 04:15:49.020979 18592 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e48f207-5d13-41af-9187-97f528daeb55-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"8e48f207-5d13-41af-9187-97f528daeb55\") " pod="openstack/ovn-northd-0" Mar 08 04:15:49.122372 master-0 kubenswrapper[18592]: I0308 04:15:49.122309 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e48f207-5d13-41af-9187-97f528daeb55-config\") pod \"ovn-northd-0\" (UID: \"8e48f207-5d13-41af-9187-97f528daeb55\") " pod="openstack/ovn-northd-0" Mar 08 04:15:49.122590 master-0 kubenswrapper[18592]: I0308 04:15:49.122406 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e48f207-5d13-41af-9187-97f528daeb55-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"8e48f207-5d13-41af-9187-97f528daeb55\") " pod="openstack/ovn-northd-0" Mar 08 04:15:49.122590 master-0 kubenswrapper[18592]: I0308 04:15:49.122440 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8e48f207-5d13-41af-9187-97f528daeb55-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"8e48f207-5d13-41af-9187-97f528daeb55\") " pod="openstack/ovn-northd-0" Mar 08 04:15:49.122590 master-0 kubenswrapper[18592]: I0308 04:15:49.122455 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8e48f207-5d13-41af-9187-97f528daeb55-scripts\") pod \"ovn-northd-0\" (UID: \"8e48f207-5d13-41af-9187-97f528daeb55\") " pod="openstack/ovn-northd-0" Mar 08 04:15:49.122590 master-0 kubenswrapper[18592]: I0308 04:15:49.122476 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8e48f207-5d13-41af-9187-97f528daeb55-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"8e48f207-5d13-41af-9187-97f528daeb55\") " pod="openstack/ovn-northd-0" Mar 08 04:15:49.122590 master-0 kubenswrapper[18592]: I0308 04:15:49.122566 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e48f207-5d13-41af-9187-97f528daeb55-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"8e48f207-5d13-41af-9187-97f528daeb55\") " pod="openstack/ovn-northd-0" Mar 08 04:15:49.122590 master-0 kubenswrapper[18592]: I0308 04:15:49.122582 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxg96\" (UniqueName: \"kubernetes.io/projected/8e48f207-5d13-41af-9187-97f528daeb55-kube-api-access-zxg96\") pod \"ovn-northd-0\" (UID: \"8e48f207-5d13-41af-9187-97f528daeb55\") " pod="openstack/ovn-northd-0" Mar 08 04:15:49.123690 master-0 kubenswrapper[18592]: I0308 04:15:49.123643 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8e48f207-5d13-41af-9187-97f528daeb55-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"8e48f207-5d13-41af-9187-97f528daeb55\") " pod="openstack/ovn-northd-0" Mar 08 04:15:49.123981 master-0 kubenswrapper[18592]: I0308 04:15:49.123956 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e48f207-5d13-41af-9187-97f528daeb55-config\") pod \"ovn-northd-0\" (UID: \"8e48f207-5d13-41af-9187-97f528daeb55\") " pod="openstack/ovn-northd-0" Mar 08 04:15:49.124841 master-0 kubenswrapper[18592]: I0308 04:15:49.124778 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8e48f207-5d13-41af-9187-97f528daeb55-scripts\") pod \"ovn-northd-0\" (UID: \"8e48f207-5d13-41af-9187-97f528daeb55\") " pod="openstack/ovn-northd-0" 
Mar 08 04:15:49.126258 master-0 kubenswrapper[18592]: I0308 04:15:49.126227 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e48f207-5d13-41af-9187-97f528daeb55-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"8e48f207-5d13-41af-9187-97f528daeb55\") " pod="openstack/ovn-northd-0"
Mar 08 04:15:49.132874 master-0 kubenswrapper[18592]: I0308 04:15:49.131090 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e48f207-5d13-41af-9187-97f528daeb55-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"8e48f207-5d13-41af-9187-97f528daeb55\") " pod="openstack/ovn-northd-0"
Mar 08 04:15:49.133579 master-0 kubenswrapper[18592]: I0308 04:15:49.133536 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e48f207-5d13-41af-9187-97f528daeb55-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"8e48f207-5d13-41af-9187-97f528daeb55\") " pod="openstack/ovn-northd-0"
Mar 08 04:15:49.150073 master-0 kubenswrapper[18592]: I0308 04:15:49.144293 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxg96\" (UniqueName: \"kubernetes.io/projected/8e48f207-5d13-41af-9187-97f528daeb55-kube-api-access-zxg96\") pod \"ovn-northd-0\" (UID: \"8e48f207-5d13-41af-9187-97f528daeb55\") " pod="openstack/ovn-northd-0"
Mar 08 04:15:49.235518 master-0 kubenswrapper[18592]: I0308 04:15:49.235399 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Mar 08 04:15:49.759462 master-0 kubenswrapper[18592]: I0308 04:15:49.759303 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0"
Mar 08 04:15:49.762647 master-0 kubenswrapper[18592]: W0308 04:15:49.762392 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e48f207_5d13_41af_9187_97f528daeb55.slice/crio-434ca6f4028dc1649fa4f37504af26f0b0763ef7040098319f109a89a607aa41 WatchSource:0}: Error finding container 434ca6f4028dc1649fa4f37504af26f0b0763ef7040098319f109a89a607aa41: Status 404 returned error can't find the container with id 434ca6f4028dc1649fa4f37504af26f0b0763ef7040098319f109a89a607aa41
Mar 08 04:15:49.763293 master-0 kubenswrapper[18592]: I0308 04:15:49.763258 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Mar 08 04:15:49.838559 master-0 kubenswrapper[18592]: I0308 04:15:49.838465 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0"
Mar 08 04:15:49.838559 master-0 kubenswrapper[18592]: I0308 04:15:49.838574 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0"
Mar 08 04:15:50.686677 master-0 kubenswrapper[18592]: I0308 04:15:50.686599 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"8e48f207-5d13-41af-9187-97f528daeb55","Type":"ContainerStarted","Data":"434ca6f4028dc1649fa4f37504af26f0b0763ef7040098319f109a89a607aa41"}
Mar 08 04:15:50.737581 master-0 kubenswrapper[18592]: I0308 04:15:50.737040 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0"
Mar 08 04:15:50.782953 master-0 kubenswrapper[18592]: I0308 04:15:50.782583 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0"
Mar 08 04:15:50.782953 master-0 kubenswrapper[18592]: I0308 04:15:50.782654 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0"
Mar 08 04:15:50.865644 master-0 kubenswrapper[18592]: I0308 04:15:50.864123 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0"
Mar 08 04:15:51.703725 master-0 kubenswrapper[18592]: I0308 04:15:51.703653 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"8e48f207-5d13-41af-9187-97f528daeb55","Type":"ContainerStarted","Data":"38e40b4efa25c9fa4b24f7cca25b189cd4893cae0172520a3db93e9e5204380b"}
Mar 08 04:15:51.703959 master-0 kubenswrapper[18592]: I0308 04:15:51.703734 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"8e48f207-5d13-41af-9187-97f528daeb55","Type":"ContainerStarted","Data":"2f646d15f9238bd55e0f3b7d30c28e60debb45d2dc1396b969182403dc584ba8"}
Mar 08 04:15:51.703959 master-0 kubenswrapper[18592]: I0308 04:15:51.703794 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0"
Mar 08 04:15:51.733657 master-0 kubenswrapper[18592]: I0308 04:15:51.733550 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.688286517 podStartE2EDuration="3.733525491s" podCreationTimestamp="2026-03-08 04:15:48 +0000 UTC" firstStartedPulling="2026-03-08 04:15:49.76847088 +0000 UTC m=+1361.867225270" lastFinishedPulling="2026-03-08 04:15:50.813709854 +0000 UTC m=+1362.912464244" observedRunningTime="2026-03-08 04:15:51.719998909 +0000 UTC m=+1363.818753269" watchObservedRunningTime="2026-03-08 04:15:51.733525491 +0000 UTC m=+1363.832279861"
Mar 08 04:15:52.200045 master-0 kubenswrapper[18592]: I0308 04:15:52.199973 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-k6c6g"]
Mar 08 04:15:52.202333 master-0 kubenswrapper[18592]: I0308 04:15:52.201376 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-k6c6g"
Mar 08 04:15:52.222417 master-0 kubenswrapper[18592]: I0308 04:15:52.222351 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-k6c6g"]
Mar 08 04:15:52.224533 master-0 kubenswrapper[18592]: I0308 04:15:52.224503 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4j66\" (UniqueName: \"kubernetes.io/projected/66da2c95-0991-4d14-900f-de90241dd987-kube-api-access-x4j66\") pod \"root-account-create-update-k6c6g\" (UID: \"66da2c95-0991-4d14-900f-de90241dd987\") " pod="openstack/root-account-create-update-k6c6g"
Mar 08 04:15:52.224639 master-0 kubenswrapper[18592]: I0308 04:15:52.224551 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66da2c95-0991-4d14-900f-de90241dd987-operator-scripts\") pod \"root-account-create-update-k6c6g\" (UID: \"66da2c95-0991-4d14-900f-de90241dd987\") " pod="openstack/root-account-create-update-k6c6g"
Mar 08 04:15:52.240899 master-0 kubenswrapper[18592]: I0308 04:15:52.238080 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret"
Mar 08 04:15:52.326541 master-0 kubenswrapper[18592]: I0308 04:15:52.326480 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4j66\" (UniqueName: \"kubernetes.io/projected/66da2c95-0991-4d14-900f-de90241dd987-kube-api-access-x4j66\") pod \"root-account-create-update-k6c6g\" (UID: \"66da2c95-0991-4d14-900f-de90241dd987\") " pod="openstack/root-account-create-update-k6c6g"
Mar 08 04:15:52.326938 master-0 kubenswrapper[18592]: I0308 04:15:52.326550 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66da2c95-0991-4d14-900f-de90241dd987-operator-scripts\") pod \"root-account-create-update-k6c6g\" (UID: \"66da2c95-0991-4d14-900f-de90241dd987\") " pod="openstack/root-account-create-update-k6c6g"
Mar 08 04:15:52.328145 master-0 kubenswrapper[18592]: I0308 04:15:52.328085 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66da2c95-0991-4d14-900f-de90241dd987-operator-scripts\") pod \"root-account-create-update-k6c6g\" (UID: \"66da2c95-0991-4d14-900f-de90241dd987\") " pod="openstack/root-account-create-update-k6c6g"
Mar 08 04:15:52.369940 master-0 kubenswrapper[18592]: I0308 04:15:52.359460 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4j66\" (UniqueName: \"kubernetes.io/projected/66da2c95-0991-4d14-900f-de90241dd987-kube-api-access-x4j66\") pod \"root-account-create-update-k6c6g\" (UID: \"66da2c95-0991-4d14-900f-de90241dd987\") " pod="openstack/root-account-create-update-k6c6g"
Mar 08 04:15:52.536772 master-0 kubenswrapper[18592]: I0308 04:15:52.536572 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-k6c6g"
Mar 08 04:15:53.157141 master-0 kubenswrapper[18592]: W0308 04:15:53.156744 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod66da2c95_0991_4d14_900f_de90241dd987.slice/crio-fe59d133e0e34e1cdc5dc50ff9efebe841d5de534df8b41f5a3737687bd8cc0c WatchSource:0}: Error finding container fe59d133e0e34e1cdc5dc50ff9efebe841d5de534df8b41f5a3737687bd8cc0c: Status 404 returned error can't find the container with id fe59d133e0e34e1cdc5dc50ff9efebe841d5de534df8b41f5a3737687bd8cc0c
Mar 08 04:15:53.160943 master-0 kubenswrapper[18592]: I0308 04:15:53.159347 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-k6c6g"]
Mar 08 04:15:53.375749 master-0 kubenswrapper[18592]: I0308 04:15:53.375685 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0"
Mar 08 04:15:53.515764 master-0 kubenswrapper[18592]: I0308 04:15:53.515702 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0"
Mar 08 04:15:53.723524 master-0 kubenswrapper[18592]: I0308 04:15:53.723465 18592 generic.go:334] "Generic (PLEG): container finished" podID="66da2c95-0991-4d14-900f-de90241dd987" containerID="de1e46bba5defdbd0fdb2eccd135173714e135192f882fbe3a7a4118754955b4" exitCode=0
Mar 08 04:15:53.723744 master-0 kubenswrapper[18592]: I0308 04:15:53.723547 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-k6c6g" event={"ID":"66da2c95-0991-4d14-900f-de90241dd987","Type":"ContainerDied","Data":"de1e46bba5defdbd0fdb2eccd135173714e135192f882fbe3a7a4118754955b4"}
Mar 08 04:15:53.723744 master-0 kubenswrapper[18592]: I0308 04:15:53.723613 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-k6c6g" event={"ID":"66da2c95-0991-4d14-900f-de90241dd987","Type":"ContainerStarted","Data":"fe59d133e0e34e1cdc5dc50ff9efebe841d5de534df8b41f5a3737687bd8cc0c"}
Mar 08 04:15:53.861282 master-0 kubenswrapper[18592]: I0308 04:15:53.861129 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-689ddcfcf7-jh95m"
Mar 08 04:15:53.967434 master-0 kubenswrapper[18592]: I0308 04:15:53.967363 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f75dd7cd9-2zs4f"]
Mar 08 04:15:53.968452 master-0 kubenswrapper[18592]: I0308 04:15:53.967727 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6f75dd7cd9-2zs4f" podUID="d021c6b4-8118-43d3-a703-8c2f73e6e077" containerName="dnsmasq-dns" containerID="cri-o://ee3298007e604218ca7104c92674c537834cf4a8e7b6a33c253d1de4de1e8351" gracePeriod=10
Mar 08 04:15:54.498183 master-0 kubenswrapper[18592]: I0308 04:15:54.498122 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f75dd7cd9-2zs4f"
Mar 08 04:15:54.599412 master-0 kubenswrapper[18592]: I0308 04:15:54.599358 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d021c6b4-8118-43d3-a703-8c2f73e6e077-dns-svc\") pod \"d021c6b4-8118-43d3-a703-8c2f73e6e077\" (UID: \"d021c6b4-8118-43d3-a703-8c2f73e6e077\") "
Mar 08 04:15:54.599534 master-0 kubenswrapper[18592]: I0308 04:15:54.599493 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d021c6b4-8118-43d3-a703-8c2f73e6e077-config\") pod \"d021c6b4-8118-43d3-a703-8c2f73e6e077\" (UID: \"d021c6b4-8118-43d3-a703-8c2f73e6e077\") "
Mar 08 04:15:54.599672 master-0 kubenswrapper[18592]: I0308 04:15:54.599645 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8ttq\" (UniqueName: \"kubernetes.io/projected/d021c6b4-8118-43d3-a703-8c2f73e6e077-kube-api-access-h8ttq\") pod \"d021c6b4-8118-43d3-a703-8c2f73e6e077\" (UID: \"d021c6b4-8118-43d3-a703-8c2f73e6e077\") "
Mar 08 04:15:54.602787 master-0 kubenswrapper[18592]: I0308 04:15:54.602747 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d021c6b4-8118-43d3-a703-8c2f73e6e077-kube-api-access-h8ttq" (OuterVolumeSpecName: "kube-api-access-h8ttq") pod "d021c6b4-8118-43d3-a703-8c2f73e6e077" (UID: "d021c6b4-8118-43d3-a703-8c2f73e6e077"). InnerVolumeSpecName "kube-api-access-h8ttq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 04:15:54.641881 master-0 kubenswrapper[18592]: I0308 04:15:54.641843 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d021c6b4-8118-43d3-a703-8c2f73e6e077-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d021c6b4-8118-43d3-a703-8c2f73e6e077" (UID: "d021c6b4-8118-43d3-a703-8c2f73e6e077"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 04:15:54.669170 master-0 kubenswrapper[18592]: I0308 04:15:54.664548 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d021c6b4-8118-43d3-a703-8c2f73e6e077-config" (OuterVolumeSpecName: "config") pod "d021c6b4-8118-43d3-a703-8c2f73e6e077" (UID: "d021c6b4-8118-43d3-a703-8c2f73e6e077"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 04:15:54.702315 master-0 kubenswrapper[18592]: I0308 04:15:54.702260 18592 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d021c6b4-8118-43d3-a703-8c2f73e6e077-dns-svc\") on node \"master-0\" DevicePath \"\""
Mar 08 04:15:54.702315 master-0 kubenswrapper[18592]: I0308 04:15:54.702308 18592 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d021c6b4-8118-43d3-a703-8c2f73e6e077-config\") on node \"master-0\" DevicePath \"\""
Mar 08 04:15:54.702519 master-0 kubenswrapper[18592]: I0308 04:15:54.702322 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8ttq\" (UniqueName: \"kubernetes.io/projected/d021c6b4-8118-43d3-a703-8c2f73e6e077-kube-api-access-h8ttq\") on node \"master-0\" DevicePath \"\""
Mar 08 04:15:54.735736 master-0 kubenswrapper[18592]: I0308 04:15:54.735650 18592 generic.go:334] "Generic (PLEG): container finished" podID="d021c6b4-8118-43d3-a703-8c2f73e6e077" containerID="ee3298007e604218ca7104c92674c537834cf4a8e7b6a33c253d1de4de1e8351" exitCode=0
Mar 08 04:15:54.735949 master-0 kubenswrapper[18592]: I0308 04:15:54.735919 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f75dd7cd9-2zs4f"
Mar 08 04:15:54.736131 master-0 kubenswrapper[18592]: I0308 04:15:54.736082 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f75dd7cd9-2zs4f" event={"ID":"d021c6b4-8118-43d3-a703-8c2f73e6e077","Type":"ContainerDied","Data":"ee3298007e604218ca7104c92674c537834cf4a8e7b6a33c253d1de4de1e8351"}
Mar 08 04:15:54.736181 master-0 kubenswrapper[18592]: I0308 04:15:54.736144 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f75dd7cd9-2zs4f" event={"ID":"d021c6b4-8118-43d3-a703-8c2f73e6e077","Type":"ContainerDied","Data":"077c5a7bab12e49ae5a70a01665cb59164ede9d1399cbb5470ea2541e4e0da41"}
Mar 08 04:15:54.736181 master-0 kubenswrapper[18592]: I0308 04:15:54.736167 18592 scope.go:117] "RemoveContainer" containerID="ee3298007e604218ca7104c92674c537834cf4a8e7b6a33c253d1de4de1e8351"
Mar 08 04:15:54.773095 master-0 kubenswrapper[18592]: I0308 04:15:54.772982 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f75dd7cd9-2zs4f"]
Mar 08 04:15:54.778684 master-0 kubenswrapper[18592]: I0308 04:15:54.776038 18592 scope.go:117] "RemoveContainer" containerID="cad1d8c8bc57c2f7e4d71daaacd0e8e77c9fbdeb822bcb372dd43138c4083cd7"
Mar 08 04:15:54.788588 master-0 kubenswrapper[18592]: I0308 04:15:54.788173 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6f75dd7cd9-2zs4f"]
Mar 08 04:15:54.800239 master-0 kubenswrapper[18592]: I0308 04:15:54.800083 18592 scope.go:117] "RemoveContainer" containerID="ee3298007e604218ca7104c92674c537834cf4a8e7b6a33c253d1de4de1e8351"
Mar 08 04:15:54.800770 master-0 kubenswrapper[18592]: E0308 04:15:54.800697 18592 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee3298007e604218ca7104c92674c537834cf4a8e7b6a33c253d1de4de1e8351\": container with ID starting with ee3298007e604218ca7104c92674c537834cf4a8e7b6a33c253d1de4de1e8351 not found: ID does not exist" containerID="ee3298007e604218ca7104c92674c537834cf4a8e7b6a33c253d1de4de1e8351"
Mar 08 04:15:54.800854 master-0 kubenswrapper[18592]: I0308 04:15:54.800763 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee3298007e604218ca7104c92674c537834cf4a8e7b6a33c253d1de4de1e8351"} err="failed to get container status \"ee3298007e604218ca7104c92674c537834cf4a8e7b6a33c253d1de4de1e8351\": rpc error: code = NotFound desc = could not find container \"ee3298007e604218ca7104c92674c537834cf4a8e7b6a33c253d1de4de1e8351\": container with ID starting with ee3298007e604218ca7104c92674c537834cf4a8e7b6a33c253d1de4de1e8351 not found: ID does not exist"
Mar 08 04:15:54.800854 master-0 kubenswrapper[18592]: I0308 04:15:54.800790 18592 scope.go:117] "RemoveContainer" containerID="cad1d8c8bc57c2f7e4d71daaacd0e8e77c9fbdeb822bcb372dd43138c4083cd7"
Mar 08 04:15:54.801337 master-0 kubenswrapper[18592]: E0308 04:15:54.801316 18592 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cad1d8c8bc57c2f7e4d71daaacd0e8e77c9fbdeb822bcb372dd43138c4083cd7\": container with ID starting with cad1d8c8bc57c2f7e4d71daaacd0e8e77c9fbdeb822bcb372dd43138c4083cd7 not found: ID does not exist" containerID="cad1d8c8bc57c2f7e4d71daaacd0e8e77c9fbdeb822bcb372dd43138c4083cd7"
Mar 08 04:15:54.801394 master-0 kubenswrapper[18592]: I0308 04:15:54.801336 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cad1d8c8bc57c2f7e4d71daaacd0e8e77c9fbdeb822bcb372dd43138c4083cd7"} err="failed to get container status \"cad1d8c8bc57c2f7e4d71daaacd0e8e77c9fbdeb822bcb372dd43138c4083cd7\": rpc error: code = NotFound desc = could not find container \"cad1d8c8bc57c2f7e4d71daaacd0e8e77c9fbdeb822bcb372dd43138c4083cd7\": container with ID starting with cad1d8c8bc57c2f7e4d71daaacd0e8e77c9fbdeb822bcb372dd43138c4083cd7 not found: ID does not exist"
Mar 08 04:15:55.266671 master-0 kubenswrapper[18592]: I0308 04:15:55.266619 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-k6c6g"
Mar 08 04:15:55.322935 master-0 kubenswrapper[18592]: I0308 04:15:55.320945 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4j66\" (UniqueName: \"kubernetes.io/projected/66da2c95-0991-4d14-900f-de90241dd987-kube-api-access-x4j66\") pod \"66da2c95-0991-4d14-900f-de90241dd987\" (UID: \"66da2c95-0991-4d14-900f-de90241dd987\") "
Mar 08 04:15:55.322935 master-0 kubenswrapper[18592]: I0308 04:15:55.321029 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66da2c95-0991-4d14-900f-de90241dd987-operator-scripts\") pod \"66da2c95-0991-4d14-900f-de90241dd987\" (UID: \"66da2c95-0991-4d14-900f-de90241dd987\") "
Mar 08 04:15:55.322935 master-0 kubenswrapper[18592]: I0308 04:15:55.321539 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66da2c95-0991-4d14-900f-de90241dd987-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "66da2c95-0991-4d14-900f-de90241dd987" (UID: "66da2c95-0991-4d14-900f-de90241dd987"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 04:15:55.322935 master-0 kubenswrapper[18592]: I0308 04:15:55.322117 18592 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66da2c95-0991-4d14-900f-de90241dd987-operator-scripts\") on node \"master-0\" DevicePath \"\""
Mar 08 04:15:55.324324 master-0 kubenswrapper[18592]: I0308 04:15:55.324285 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66da2c95-0991-4d14-900f-de90241dd987-kube-api-access-x4j66" (OuterVolumeSpecName: "kube-api-access-x4j66") pod "66da2c95-0991-4d14-900f-de90241dd987" (UID: "66da2c95-0991-4d14-900f-de90241dd987"). InnerVolumeSpecName "kube-api-access-x4j66". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 04:15:55.423774 master-0 kubenswrapper[18592]: I0308 04:15:55.423692 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4j66\" (UniqueName: \"kubernetes.io/projected/66da2c95-0991-4d14-900f-de90241dd987-kube-api-access-x4j66\") on node \"master-0\" DevicePath \"\""
Mar 08 04:15:55.775058 master-0 kubenswrapper[18592]: I0308 04:15:55.774932 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-k6c6g" event={"ID":"66da2c95-0991-4d14-900f-de90241dd987","Type":"ContainerDied","Data":"fe59d133e0e34e1cdc5dc50ff9efebe841d5de534df8b41f5a3737687bd8cc0c"}
Mar 08 04:15:55.775058 master-0 kubenswrapper[18592]: I0308 04:15:55.774985 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-k6c6g"
Mar 08 04:15:55.775542 master-0 kubenswrapper[18592]: I0308 04:15:55.774996 18592 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe59d133e0e34e1cdc5dc50ff9efebe841d5de534df8b41f5a3737687bd8cc0c"
Mar 08 04:15:56.017301 master-0 kubenswrapper[18592]: I0308 04:15:56.017220 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-6m2qh"]
Mar 08 04:15:56.019146 master-0 kubenswrapper[18592]: E0308 04:15:56.019097 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d021c6b4-8118-43d3-a703-8c2f73e6e077" containerName="dnsmasq-dns"
Mar 08 04:15:56.019146 master-0 kubenswrapper[18592]: I0308 04:15:56.019131 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="d021c6b4-8118-43d3-a703-8c2f73e6e077" containerName="dnsmasq-dns"
Mar 08 04:15:56.019378 master-0 kubenswrapper[18592]: E0308 04:15:56.019164 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d021c6b4-8118-43d3-a703-8c2f73e6e077" containerName="init"
Mar 08 04:15:56.019378 master-0 kubenswrapper[18592]: I0308 04:15:56.019173 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="d021c6b4-8118-43d3-a703-8c2f73e6e077" containerName="init"
Mar 08 04:15:56.019378 master-0 kubenswrapper[18592]: E0308 04:15:56.019192 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66da2c95-0991-4d14-900f-de90241dd987" containerName="mariadb-account-create-update"
Mar 08 04:15:56.019378 master-0 kubenswrapper[18592]: I0308 04:15:56.019202 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="66da2c95-0991-4d14-900f-de90241dd987" containerName="mariadb-account-create-update"
Mar 08 04:15:56.021181 master-0 kubenswrapper[18592]: I0308 04:15:56.020956 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="d021c6b4-8118-43d3-a703-8c2f73e6e077" containerName="dnsmasq-dns"
Mar 08 04:15:56.021454 master-0 kubenswrapper[18592]: I0308 04:15:56.021189 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="66da2c95-0991-4d14-900f-de90241dd987" containerName="mariadb-account-create-update"
Mar 08 04:15:56.022039 master-0 kubenswrapper[18592]: I0308 04:15:56.022000 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-6m2qh"
Mar 08 04:15:56.035305 master-0 kubenswrapper[18592]: I0308 04:15:56.035160 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-6m2qh"]
Mar 08 04:15:56.038728 master-0 kubenswrapper[18592]: I0308 04:15:56.038663 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxmkj\" (UniqueName: \"kubernetes.io/projected/5d70e50d-f856-4cb6-bebe-58e2584f70dc-kube-api-access-kxmkj\") pod \"keystone-db-create-6m2qh\" (UID: \"5d70e50d-f856-4cb6-bebe-58e2584f70dc\") " pod="openstack/keystone-db-create-6m2qh"
Mar 08 04:15:56.038980 master-0 kubenswrapper[18592]: I0308 04:15:56.038762 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d70e50d-f856-4cb6-bebe-58e2584f70dc-operator-scripts\") pod \"keystone-db-create-6m2qh\" (UID: \"5d70e50d-f856-4cb6-bebe-58e2584f70dc\") " pod="openstack/keystone-db-create-6m2qh"
Mar 08 04:15:56.186077 master-0 kubenswrapper[18592]: I0308 04:15:56.139961 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxmkj\" (UniqueName: \"kubernetes.io/projected/5d70e50d-f856-4cb6-bebe-58e2584f70dc-kube-api-access-kxmkj\") pod \"keystone-db-create-6m2qh\" (UID: \"5d70e50d-f856-4cb6-bebe-58e2584f70dc\") " pod="openstack/keystone-db-create-6m2qh"
Mar 08 04:15:56.186077 master-0 kubenswrapper[18592]: I0308 04:15:56.140038 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d70e50d-f856-4cb6-bebe-58e2584f70dc-operator-scripts\") pod \"keystone-db-create-6m2qh\" (UID: \"5d70e50d-f856-4cb6-bebe-58e2584f70dc\") " pod="openstack/keystone-db-create-6m2qh"
Mar 08 04:15:56.186077 master-0 kubenswrapper[18592]: I0308 04:15:56.140800 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d70e50d-f856-4cb6-bebe-58e2584f70dc-operator-scripts\") pod \"keystone-db-create-6m2qh\" (UID: \"5d70e50d-f856-4cb6-bebe-58e2584f70dc\") " pod="openstack/keystone-db-create-6m2qh"
Mar 08 04:15:56.198892 master-0 kubenswrapper[18592]: I0308 04:15:56.198838 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d021c6b4-8118-43d3-a703-8c2f73e6e077" path="/var/lib/kubelet/pods/d021c6b4-8118-43d3-a703-8c2f73e6e077/volumes"
Mar 08 04:15:56.199617 master-0 kubenswrapper[18592]: I0308 04:15:56.199589 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-f96f-account-create-update-69kzj"]
Mar 08 04:15:56.201381 master-0 kubenswrapper[18592]: I0308 04:15:56.201347 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-f96f-account-create-update-69kzj"]
Mar 08 04:15:56.201475 master-0 kubenswrapper[18592]: I0308 04:15:56.201448 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-f96f-account-create-update-69kzj"
Mar 08 04:15:56.204690 master-0 kubenswrapper[18592]: I0308 04:15:56.204437 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret"
Mar 08 04:15:56.207366 master-0 kubenswrapper[18592]: I0308 04:15:56.207333 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxmkj\" (UniqueName: \"kubernetes.io/projected/5d70e50d-f856-4cb6-bebe-58e2584f70dc-kube-api-access-kxmkj\") pod \"keystone-db-create-6m2qh\" (UID: \"5d70e50d-f856-4cb6-bebe-58e2584f70dc\") " pod="openstack/keystone-db-create-6m2qh"
Mar 08 04:15:56.242026 master-0 kubenswrapper[18592]: I0308 04:15:56.241955 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b7bbafa-ea6f-47ac-b71f-1cddb5b4adf6-operator-scripts\") pod \"keystone-f96f-account-create-update-69kzj\" (UID: \"6b7bbafa-ea6f-47ac-b71f-1cddb5b4adf6\") " pod="openstack/keystone-f96f-account-create-update-69kzj"
Mar 08 04:15:56.242228 master-0 kubenswrapper[18592]: I0308 04:15:56.242106 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdsvc\" (UniqueName: \"kubernetes.io/projected/6b7bbafa-ea6f-47ac-b71f-1cddb5b4adf6-kube-api-access-kdsvc\") pod \"keystone-f96f-account-create-update-69kzj\" (UID: \"6b7bbafa-ea6f-47ac-b71f-1cddb5b4adf6\") " pod="openstack/keystone-f96f-account-create-update-69kzj"
Mar 08 04:15:56.331174 master-0 kubenswrapper[18592]: I0308 04:15:56.330817 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-gc5cl"]
Mar 08 04:15:56.334683 master-0 kubenswrapper[18592]: I0308 04:15:56.332140 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-gc5cl"
Mar 08 04:15:56.342958 master-0 kubenswrapper[18592]: I0308 04:15:56.342813 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72wj7\" (UniqueName: \"kubernetes.io/projected/d7ce4a7b-aee1-4b45-a17c-73289f866b0d-kube-api-access-72wj7\") pod \"placement-db-create-gc5cl\" (UID: \"d7ce4a7b-aee1-4b45-a17c-73289f866b0d\") " pod="openstack/placement-db-create-gc5cl"
Mar 08 04:15:56.342958 master-0 kubenswrapper[18592]: I0308 04:15:56.342922 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdsvc\" (UniqueName: \"kubernetes.io/projected/6b7bbafa-ea6f-47ac-b71f-1cddb5b4adf6-kube-api-access-kdsvc\") pod \"keystone-f96f-account-create-update-69kzj\" (UID: \"6b7bbafa-ea6f-47ac-b71f-1cddb5b4adf6\") " pod="openstack/keystone-f96f-account-create-update-69kzj"
Mar 08 04:15:56.342958 master-0 kubenswrapper[18592]: I0308 04:15:56.342968 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7ce4a7b-aee1-4b45-a17c-73289f866b0d-operator-scripts\") pod \"placement-db-create-gc5cl\" (UID: \"d7ce4a7b-aee1-4b45-a17c-73289f866b0d\") " pod="openstack/placement-db-create-gc5cl"
Mar 08 04:15:56.343286 master-0 kubenswrapper[18592]: I0308 04:15:56.343061 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b7bbafa-ea6f-47ac-b71f-1cddb5b4adf6-operator-scripts\") pod \"keystone-f96f-account-create-update-69kzj\" (UID: \"6b7bbafa-ea6f-47ac-b71f-1cddb5b4adf6\") " pod="openstack/keystone-f96f-account-create-update-69kzj"
Mar 08 04:15:56.343749 master-0 kubenswrapper[18592]: I0308 04:15:56.343700 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b7bbafa-ea6f-47ac-b71f-1cddb5b4adf6-operator-scripts\") pod \"keystone-f96f-account-create-update-69kzj\" (UID: \"6b7bbafa-ea6f-47ac-b71f-1cddb5b4adf6\") " pod="openstack/keystone-f96f-account-create-update-69kzj"
Mar 08 04:15:56.361610 master-0 kubenswrapper[18592]: I0308 04:15:56.357679 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-gc5cl"]
Mar 08 04:15:56.361610 master-0 kubenswrapper[18592]: I0308 04:15:56.358176 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-6m2qh"
Mar 08 04:15:56.368069 master-0 kubenswrapper[18592]: I0308 04:15:56.368013 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdsvc\" (UniqueName: \"kubernetes.io/projected/6b7bbafa-ea6f-47ac-b71f-1cddb5b4adf6-kube-api-access-kdsvc\") pod \"keystone-f96f-account-create-update-69kzj\" (UID: \"6b7bbafa-ea6f-47ac-b71f-1cddb5b4adf6\") " pod="openstack/keystone-f96f-account-create-update-69kzj"
Mar 08 04:15:56.444302 master-0 kubenswrapper[18592]: I0308 04:15:56.444225 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72wj7\" (UniqueName: \"kubernetes.io/projected/d7ce4a7b-aee1-4b45-a17c-73289f866b0d-kube-api-access-72wj7\") pod \"placement-db-create-gc5cl\" (UID: \"d7ce4a7b-aee1-4b45-a17c-73289f866b0d\") " pod="openstack/placement-db-create-gc5cl"
Mar 08 04:15:56.444520 master-0 kubenswrapper[18592]: I0308 04:15:56.444386 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7ce4a7b-aee1-4b45-a17c-73289f866b0d-operator-scripts\") pod \"placement-db-create-gc5cl\" (UID: \"d7ce4a7b-aee1-4b45-a17c-73289f866b0d\") " pod="openstack/placement-db-create-gc5cl"
Mar 08 04:15:56.445149 master-0 kubenswrapper[18592]: I0308 04:15:56.445118 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7ce4a7b-aee1-4b45-a17c-73289f866b0d-operator-scripts\") pod \"placement-db-create-gc5cl\" (UID: \"d7ce4a7b-aee1-4b45-a17c-73289f866b0d\") " pod="openstack/placement-db-create-gc5cl"
Mar 08 04:15:56.448170 master-0 kubenswrapper[18592]: I0308 04:15:56.448125 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-d6d7-account-create-update-gmt7m"]
Mar 08 04:15:56.450070 master-0 kubenswrapper[18592]: I0308 04:15:56.449878 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d6d7-account-create-update-gmt7m"
Mar 08 04:15:56.463985 master-0 kubenswrapper[18592]: I0308 04:15:56.453669 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret"
Mar 08 04:15:56.464583 master-0 kubenswrapper[18592]: I0308 04:15:56.464554 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72wj7\" (UniqueName: \"kubernetes.io/projected/d7ce4a7b-aee1-4b45-a17c-73289f866b0d-kube-api-access-72wj7\") pod \"placement-db-create-gc5cl\" (UID: \"d7ce4a7b-aee1-4b45-a17c-73289f866b0d\") " pod="openstack/placement-db-create-gc5cl"
Mar 08 04:15:56.482576 master-0 kubenswrapper[18592]: I0308 04:15:56.481451 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-d6d7-account-create-update-gmt7m"]
Mar 08 04:15:56.554910 master-0 kubenswrapper[18592]: I0308 04:15:56.553861 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c24fs\" (UniqueName: \"kubernetes.io/projected/83dacd91-8bfe-4f52-a178-b5b51f46a968-kube-api-access-c24fs\") pod \"placement-d6d7-account-create-update-gmt7m\" (UID: \"83dacd91-8bfe-4f52-a178-b5b51f46a968\") " pod="openstack/placement-d6d7-account-create-update-gmt7m"
Mar 08 04:15:56.554910 master-0 kubenswrapper[18592]: I0308 04:15:56.554084 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83dacd91-8bfe-4f52-a178-b5b51f46a968-operator-scripts\") pod \"placement-d6d7-account-create-update-gmt7m\" (UID: \"83dacd91-8bfe-4f52-a178-b5b51f46a968\") " pod="openstack/placement-d6d7-account-create-update-gmt7m"
Mar 08 04:15:56.555165 master-0 kubenswrapper[18592]: I0308 04:15:56.555076 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-f96f-account-create-update-69kzj"
Mar 08 04:15:56.656002 master-0 kubenswrapper[18592]: I0308 04:15:56.655957 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83dacd91-8bfe-4f52-a178-b5b51f46a968-operator-scripts\") pod \"placement-d6d7-account-create-update-gmt7m\" (UID: \"83dacd91-8bfe-4f52-a178-b5b51f46a968\") " pod="openstack/placement-d6d7-account-create-update-gmt7m"
Mar 08 04:15:56.656628 master-0 kubenswrapper[18592]: I0308 04:15:56.656323 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c24fs\" (UniqueName: \"kubernetes.io/projected/83dacd91-8bfe-4f52-a178-b5b51f46a968-kube-api-access-c24fs\") pod \"placement-d6d7-account-create-update-gmt7m\" (UID: \"83dacd91-8bfe-4f52-a178-b5b51f46a968\") " pod="openstack/placement-d6d7-account-create-update-gmt7m"
Mar 08 04:15:56.656957 master-0 kubenswrapper[18592]: I0308 04:15:56.656912 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83dacd91-8bfe-4f52-a178-b5b51f46a968-operator-scripts\") pod \"placement-d6d7-account-create-update-gmt7m\" (UID: \"83dacd91-8bfe-4f52-a178-b5b51f46a968\") " pod="openstack/placement-d6d7-account-create-update-gmt7m"
Mar 08 04:15:56.664896 master-0 kubenswrapper[18592]: I0308 04:15:56.664854 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-gc5cl"
Mar 08 04:15:56.675217 master-0 kubenswrapper[18592]: I0308 04:15:56.675179 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c24fs\" (UniqueName: \"kubernetes.io/projected/83dacd91-8bfe-4f52-a178-b5b51f46a968-kube-api-access-c24fs\") pod \"placement-d6d7-account-create-update-gmt7m\" (UID: \"83dacd91-8bfe-4f52-a178-b5b51f46a968\") " pod="openstack/placement-d6d7-account-create-update-gmt7m"
Mar 08 04:15:56.805663 master-0 kubenswrapper[18592]: I0308 04:15:56.805606 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d6d7-account-create-update-gmt7m"
Mar 08 04:15:56.859382 master-0 kubenswrapper[18592]: I0308 04:15:56.859306 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-6m2qh"]
Mar 08 04:15:57.048113 master-0 kubenswrapper[18592]: I0308 04:15:57.047630 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-f96f-account-create-update-69kzj"]
Mar 08 04:15:57.142290 master-0 kubenswrapper[18592]: I0308 04:15:57.138807 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b9cd4dcf7-ln4zc"]
Mar 08 04:15:57.144938 master-0 kubenswrapper[18592]: I0308 04:15:57.144887 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b9cd4dcf7-ln4zc" Mar 08 04:15:57.224741 master-0 kubenswrapper[18592]: I0308 04:15:57.219014 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b9cd4dcf7-ln4zc"] Mar 08 04:15:57.240588 master-0 kubenswrapper[18592]: W0308 04:15:57.239941 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd7ce4a7b_aee1_4b45_a17c_73289f866b0d.slice/crio-230618bea03f1481f990d40631ae276a866cea0368e8537a72a261383c44beff WatchSource:0}: Error finding container 230618bea03f1481f990d40631ae276a866cea0368e8537a72a261383c44beff: Status 404 returned error can't find the container with id 230618bea03f1481f990d40631ae276a866cea0368e8537a72a261383c44beff Mar 08 04:15:57.262015 master-0 kubenswrapper[18592]: I0308 04:15:57.261978 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-gc5cl"] Mar 08 04:15:57.294872 master-0 kubenswrapper[18592]: I0308 04:15:57.292154 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdhsk\" (UniqueName: \"kubernetes.io/projected/2bdf0b67-8d62-4dfc-ae07-730475c2a471-kube-api-access-gdhsk\") pod \"dnsmasq-dns-6b9cd4dcf7-ln4zc\" (UID: \"2bdf0b67-8d62-4dfc-ae07-730475c2a471\") " pod="openstack/dnsmasq-dns-6b9cd4dcf7-ln4zc" Mar 08 04:15:57.294872 master-0 kubenswrapper[18592]: I0308 04:15:57.292273 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2bdf0b67-8d62-4dfc-ae07-730475c2a471-dns-svc\") pod \"dnsmasq-dns-6b9cd4dcf7-ln4zc\" (UID: \"2bdf0b67-8d62-4dfc-ae07-730475c2a471\") " pod="openstack/dnsmasq-dns-6b9cd4dcf7-ln4zc" Mar 08 04:15:57.294872 master-0 kubenswrapper[18592]: I0308 04:15:57.292308 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/2bdf0b67-8d62-4dfc-ae07-730475c2a471-config\") pod \"dnsmasq-dns-6b9cd4dcf7-ln4zc\" (UID: \"2bdf0b67-8d62-4dfc-ae07-730475c2a471\") " pod="openstack/dnsmasq-dns-6b9cd4dcf7-ln4zc" Mar 08 04:15:57.294872 master-0 kubenswrapper[18592]: I0308 04:15:57.292372 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2bdf0b67-8d62-4dfc-ae07-730475c2a471-ovsdbserver-sb\") pod \"dnsmasq-dns-6b9cd4dcf7-ln4zc\" (UID: \"2bdf0b67-8d62-4dfc-ae07-730475c2a471\") " pod="openstack/dnsmasq-dns-6b9cd4dcf7-ln4zc" Mar 08 04:15:57.294872 master-0 kubenswrapper[18592]: I0308 04:15:57.292419 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2bdf0b67-8d62-4dfc-ae07-730475c2a471-ovsdbserver-nb\") pod \"dnsmasq-dns-6b9cd4dcf7-ln4zc\" (UID: \"2bdf0b67-8d62-4dfc-ae07-730475c2a471\") " pod="openstack/dnsmasq-dns-6b9cd4dcf7-ln4zc" Mar 08 04:15:57.400676 master-0 kubenswrapper[18592]: I0308 04:15:57.394260 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdhsk\" (UniqueName: \"kubernetes.io/projected/2bdf0b67-8d62-4dfc-ae07-730475c2a471-kube-api-access-gdhsk\") pod \"dnsmasq-dns-6b9cd4dcf7-ln4zc\" (UID: \"2bdf0b67-8d62-4dfc-ae07-730475c2a471\") " pod="openstack/dnsmasq-dns-6b9cd4dcf7-ln4zc" Mar 08 04:15:57.400676 master-0 kubenswrapper[18592]: I0308 04:15:57.394403 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2bdf0b67-8d62-4dfc-ae07-730475c2a471-dns-svc\") pod \"dnsmasq-dns-6b9cd4dcf7-ln4zc\" (UID: \"2bdf0b67-8d62-4dfc-ae07-730475c2a471\") " pod="openstack/dnsmasq-dns-6b9cd4dcf7-ln4zc" Mar 08 04:15:57.400676 master-0 kubenswrapper[18592]: I0308 04:15:57.394447 18592 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bdf0b67-8d62-4dfc-ae07-730475c2a471-config\") pod \"dnsmasq-dns-6b9cd4dcf7-ln4zc\" (UID: \"2bdf0b67-8d62-4dfc-ae07-730475c2a471\") " pod="openstack/dnsmasq-dns-6b9cd4dcf7-ln4zc" Mar 08 04:15:57.400676 master-0 kubenswrapper[18592]: I0308 04:15:57.394571 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2bdf0b67-8d62-4dfc-ae07-730475c2a471-ovsdbserver-sb\") pod \"dnsmasq-dns-6b9cd4dcf7-ln4zc\" (UID: \"2bdf0b67-8d62-4dfc-ae07-730475c2a471\") " pod="openstack/dnsmasq-dns-6b9cd4dcf7-ln4zc" Mar 08 04:15:57.400676 master-0 kubenswrapper[18592]: I0308 04:15:57.394683 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2bdf0b67-8d62-4dfc-ae07-730475c2a471-ovsdbserver-nb\") pod \"dnsmasq-dns-6b9cd4dcf7-ln4zc\" (UID: \"2bdf0b67-8d62-4dfc-ae07-730475c2a471\") " pod="openstack/dnsmasq-dns-6b9cd4dcf7-ln4zc" Mar 08 04:15:57.400676 master-0 kubenswrapper[18592]: I0308 04:15:57.395535 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2bdf0b67-8d62-4dfc-ae07-730475c2a471-dns-svc\") pod \"dnsmasq-dns-6b9cd4dcf7-ln4zc\" (UID: \"2bdf0b67-8d62-4dfc-ae07-730475c2a471\") " pod="openstack/dnsmasq-dns-6b9cd4dcf7-ln4zc" Mar 08 04:15:57.400676 master-0 kubenswrapper[18592]: I0308 04:15:57.395716 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2bdf0b67-8d62-4dfc-ae07-730475c2a471-ovsdbserver-nb\") pod \"dnsmasq-dns-6b9cd4dcf7-ln4zc\" (UID: \"2bdf0b67-8d62-4dfc-ae07-730475c2a471\") " pod="openstack/dnsmasq-dns-6b9cd4dcf7-ln4zc" Mar 08 04:15:57.400676 master-0 kubenswrapper[18592]: I0308 04:15:57.396565 18592 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bdf0b67-8d62-4dfc-ae07-730475c2a471-config\") pod \"dnsmasq-dns-6b9cd4dcf7-ln4zc\" (UID: \"2bdf0b67-8d62-4dfc-ae07-730475c2a471\") " pod="openstack/dnsmasq-dns-6b9cd4dcf7-ln4zc" Mar 08 04:15:57.415700 master-0 kubenswrapper[18592]: I0308 04:15:57.415065 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-d6d7-account-create-update-gmt7m"] Mar 08 04:15:57.416140 master-0 kubenswrapper[18592]: I0308 04:15:57.416117 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdhsk\" (UniqueName: \"kubernetes.io/projected/2bdf0b67-8d62-4dfc-ae07-730475c2a471-kube-api-access-gdhsk\") pod \"dnsmasq-dns-6b9cd4dcf7-ln4zc\" (UID: \"2bdf0b67-8d62-4dfc-ae07-730475c2a471\") " pod="openstack/dnsmasq-dns-6b9cd4dcf7-ln4zc" Mar 08 04:15:57.417423 master-0 kubenswrapper[18592]: I0308 04:15:57.417376 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2bdf0b67-8d62-4dfc-ae07-730475c2a471-ovsdbserver-sb\") pod \"dnsmasq-dns-6b9cd4dcf7-ln4zc\" (UID: \"2bdf0b67-8d62-4dfc-ae07-730475c2a471\") " pod="openstack/dnsmasq-dns-6b9cd4dcf7-ln4zc" Mar 08 04:15:57.524617 master-0 kubenswrapper[18592]: I0308 04:15:57.524557 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b9cd4dcf7-ln4zc" Mar 08 04:15:57.829592 master-0 kubenswrapper[18592]: I0308 04:15:57.829479 18592 generic.go:334] "Generic (PLEG): container finished" podID="d7ce4a7b-aee1-4b45-a17c-73289f866b0d" containerID="ec2577ddda46c4d9d1b163c8303b77fd7ba8ba3bfd5a776087bd44130924b8ae" exitCode=0 Mar 08 04:15:57.829592 master-0 kubenswrapper[18592]: I0308 04:15:57.829551 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-gc5cl" event={"ID":"d7ce4a7b-aee1-4b45-a17c-73289f866b0d","Type":"ContainerDied","Data":"ec2577ddda46c4d9d1b163c8303b77fd7ba8ba3bfd5a776087bd44130924b8ae"} Mar 08 04:15:57.829592 master-0 kubenswrapper[18592]: I0308 04:15:57.829578 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-gc5cl" event={"ID":"d7ce4a7b-aee1-4b45-a17c-73289f866b0d","Type":"ContainerStarted","Data":"230618bea03f1481f990d40631ae276a866cea0368e8537a72a261383c44beff"} Mar 08 04:15:57.836516 master-0 kubenswrapper[18592]: I0308 04:15:57.831264 18592 generic.go:334] "Generic (PLEG): container finished" podID="5d70e50d-f856-4cb6-bebe-58e2584f70dc" containerID="7b154f228ef2e8736ddaa958d956365bfb8eab1e1b87238098a4bd93879e4e96" exitCode=0 Mar 08 04:15:57.836516 master-0 kubenswrapper[18592]: I0308 04:15:57.831508 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-6m2qh" event={"ID":"5d70e50d-f856-4cb6-bebe-58e2584f70dc","Type":"ContainerDied","Data":"7b154f228ef2e8736ddaa958d956365bfb8eab1e1b87238098a4bd93879e4e96"} Mar 08 04:15:57.836516 master-0 kubenswrapper[18592]: I0308 04:15:57.831577 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-6m2qh" event={"ID":"5d70e50d-f856-4cb6-bebe-58e2584f70dc","Type":"ContainerStarted","Data":"010002a780df01858136dd9ee63cf45070cf8fef1dbf5d661bcf6d346372615c"} Mar 08 04:15:57.836516 master-0 kubenswrapper[18592]: I0308 04:15:57.834008 18592 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d6d7-account-create-update-gmt7m" event={"ID":"83dacd91-8bfe-4f52-a178-b5b51f46a968","Type":"ContainerStarted","Data":"4b5d687e97584a9c47540ce435bbb44ed95fee498a0368023874569c8ebdc477"} Mar 08 04:15:57.836516 master-0 kubenswrapper[18592]: I0308 04:15:57.834052 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d6d7-account-create-update-gmt7m" event={"ID":"83dacd91-8bfe-4f52-a178-b5b51f46a968","Type":"ContainerStarted","Data":"8489167b1e440a53ab9f0db407c28a85d09b75e939fc7c516c09bbb594f98545"} Mar 08 04:15:57.849359 master-0 kubenswrapper[18592]: I0308 04:15:57.848441 18592 generic.go:334] "Generic (PLEG): container finished" podID="6b7bbafa-ea6f-47ac-b71f-1cddb5b4adf6" containerID="cd921792f536fe4f46f0df1900361cf86d7579deb07378440159dccd53a673ed" exitCode=0 Mar 08 04:15:57.849359 master-0 kubenswrapper[18592]: I0308 04:15:57.848515 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-f96f-account-create-update-69kzj" event={"ID":"6b7bbafa-ea6f-47ac-b71f-1cddb5b4adf6","Type":"ContainerDied","Data":"cd921792f536fe4f46f0df1900361cf86d7579deb07378440159dccd53a673ed"} Mar 08 04:15:57.849359 master-0 kubenswrapper[18592]: I0308 04:15:57.848555 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-f96f-account-create-update-69kzj" event={"ID":"6b7bbafa-ea6f-47ac-b71f-1cddb5b4adf6","Type":"ContainerStarted","Data":"71dad1d918ec87ddbec8529df8ae10d1247fbc1494006ce710d2f00f734eb0d9"} Mar 08 04:15:58.433837 master-0 kubenswrapper[18592]: I0308 04:15:58.433766 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b9cd4dcf7-ln4zc"] Mar 08 04:15:58.520490 master-0 kubenswrapper[18592]: W0308 04:15:58.520392 18592 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2bdf0b67_8d62_4dfc_ae07_730475c2a471.slice/crio-48ee007d399662c19ae25cf2325833aad800b6d6d8cf98341b2fa8805d557b96 WatchSource:0}: Error finding container 48ee007d399662c19ae25cf2325833aad800b6d6d8cf98341b2fa8805d557b96: Status 404 returned error can't find the container with id 48ee007d399662c19ae25cf2325833aad800b6d6d8cf98341b2fa8805d557b96 Mar 08 04:15:58.865095 master-0 kubenswrapper[18592]: I0308 04:15:58.864498 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b9cd4dcf7-ln4zc" event={"ID":"2bdf0b67-8d62-4dfc-ae07-730475c2a471","Type":"ContainerStarted","Data":"48ee007d399662c19ae25cf2325833aad800b6d6d8cf98341b2fa8805d557b96"} Mar 08 04:15:58.866449 master-0 kubenswrapper[18592]: I0308 04:15:58.866373 18592 generic.go:334] "Generic (PLEG): container finished" podID="83dacd91-8bfe-4f52-a178-b5b51f46a968" containerID="4b5d687e97584a9c47540ce435bbb44ed95fee498a0368023874569c8ebdc477" exitCode=0 Mar 08 04:15:58.866547 master-0 kubenswrapper[18592]: I0308 04:15:58.866520 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d6d7-account-create-update-gmt7m" event={"ID":"83dacd91-8bfe-4f52-a178-b5b51f46a968","Type":"ContainerDied","Data":"4b5d687e97584a9c47540ce435bbb44ed95fee498a0368023874569c8ebdc477"} Mar 08 04:15:59.334737 master-0 kubenswrapper[18592]: I0308 04:15:59.334446 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 08 04:15:59.344215 master-0 kubenswrapper[18592]: I0308 04:15:59.344168 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 08 04:15:59.346016 master-0 kubenswrapper[18592]: I0308 04:15:59.345974 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 08 04:15:59.346742 master-0 kubenswrapper[18592]: I0308 04:15:59.346717 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 08 04:15:59.346899 master-0 kubenswrapper[18592]: I0308 04:15:59.346876 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 08 04:15:59.366602 master-0 kubenswrapper[18592]: I0308 04:15:59.363436 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 08 04:15:59.459808 master-0 kubenswrapper[18592]: I0308 04:15:59.459706 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-f96f-account-create-update-69kzj" Mar 08 04:15:59.461192 master-0 kubenswrapper[18592]: I0308 04:15:59.461154 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2166af23-aec1-40ee-9114-2a0ffa1c7f11-etc-swift\") pod \"swift-storage-0\" (UID: \"2166af23-aec1-40ee-9114-2a0ffa1c7f11\") " pod="openstack/swift-storage-0" Mar 08 04:15:59.461275 master-0 kubenswrapper[18592]: I0308 04:15:59.461207 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2166af23-aec1-40ee-9114-2a0ffa1c7f11-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"2166af23-aec1-40ee-9114-2a0ffa1c7f11\") " pod="openstack/swift-storage-0" Mar 08 04:15:59.461343 master-0 kubenswrapper[18592]: I0308 04:15:59.461319 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: 
\"kubernetes.io/empty-dir/2166af23-aec1-40ee-9114-2a0ffa1c7f11-cache\") pod \"swift-storage-0\" (UID: \"2166af23-aec1-40ee-9114-2a0ffa1c7f11\") " pod="openstack/swift-storage-0" Mar 08 04:15:59.461386 master-0 kubenswrapper[18592]: I0308 04:15:59.461345 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-5242282a-c1e6-407f-9a94-df076247e568\" (UniqueName: \"kubernetes.io/csi/topolvm.io^ca7c0a39-d842-4027-b057-143585a04d7a\") pod \"swift-storage-0\" (UID: \"2166af23-aec1-40ee-9114-2a0ffa1c7f11\") " pod="openstack/swift-storage-0" Mar 08 04:15:59.461386 master-0 kubenswrapper[18592]: I0308 04:15:59.461370 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/2166af23-aec1-40ee-9114-2a0ffa1c7f11-lock\") pod \"swift-storage-0\" (UID: \"2166af23-aec1-40ee-9114-2a0ffa1c7f11\") " pod="openstack/swift-storage-0" Mar 08 04:15:59.461498 master-0 kubenswrapper[18592]: I0308 04:15:59.461404 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khz2m\" (UniqueName: \"kubernetes.io/projected/2166af23-aec1-40ee-9114-2a0ffa1c7f11-kube-api-access-khz2m\") pod \"swift-storage-0\" (UID: \"2166af23-aec1-40ee-9114-2a0ffa1c7f11\") " pod="openstack/swift-storage-0" Mar 08 04:15:59.565043 master-0 kubenswrapper[18592]: I0308 04:15:59.564275 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdsvc\" (UniqueName: \"kubernetes.io/projected/6b7bbafa-ea6f-47ac-b71f-1cddb5b4adf6-kube-api-access-kdsvc\") pod \"6b7bbafa-ea6f-47ac-b71f-1cddb5b4adf6\" (UID: \"6b7bbafa-ea6f-47ac-b71f-1cddb5b4adf6\") " Mar 08 04:15:59.565043 master-0 kubenswrapper[18592]: I0308 04:15:59.564922 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/6b7bbafa-ea6f-47ac-b71f-1cddb5b4adf6-operator-scripts\") pod \"6b7bbafa-ea6f-47ac-b71f-1cddb5b4adf6\" (UID: \"6b7bbafa-ea6f-47ac-b71f-1cddb5b4adf6\") " Mar 08 04:15:59.565855 master-0 kubenswrapper[18592]: I0308 04:15:59.565799 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b7bbafa-ea6f-47ac-b71f-1cddb5b4adf6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6b7bbafa-ea6f-47ac-b71f-1cddb5b4adf6" (UID: "6b7bbafa-ea6f-47ac-b71f-1cddb5b4adf6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:15:59.566643 master-0 kubenswrapper[18592]: I0308 04:15:59.566618 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/2166af23-aec1-40ee-9114-2a0ffa1c7f11-cache\") pod \"swift-storage-0\" (UID: \"2166af23-aec1-40ee-9114-2a0ffa1c7f11\") " pod="openstack/swift-storage-0" Mar 08 04:15:59.566712 master-0 kubenswrapper[18592]: I0308 04:15:59.566659 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-5242282a-c1e6-407f-9a94-df076247e568\" (UniqueName: \"kubernetes.io/csi/topolvm.io^ca7c0a39-d842-4027-b057-143585a04d7a\") pod \"swift-storage-0\" (UID: \"2166af23-aec1-40ee-9114-2a0ffa1c7f11\") " pod="openstack/swift-storage-0" Mar 08 04:15:59.566712 master-0 kubenswrapper[18592]: I0308 04:15:59.566691 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/2166af23-aec1-40ee-9114-2a0ffa1c7f11-lock\") pod \"swift-storage-0\" (UID: \"2166af23-aec1-40ee-9114-2a0ffa1c7f11\") " pod="openstack/swift-storage-0" Mar 08 04:15:59.566771 master-0 kubenswrapper[18592]: I0308 04:15:59.566743 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khz2m\" (UniqueName: 
\"kubernetes.io/projected/2166af23-aec1-40ee-9114-2a0ffa1c7f11-kube-api-access-khz2m\") pod \"swift-storage-0\" (UID: \"2166af23-aec1-40ee-9114-2a0ffa1c7f11\") " pod="openstack/swift-storage-0" Mar 08 04:15:59.566847 master-0 kubenswrapper[18592]: I0308 04:15:59.566816 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2166af23-aec1-40ee-9114-2a0ffa1c7f11-etc-swift\") pod \"swift-storage-0\" (UID: \"2166af23-aec1-40ee-9114-2a0ffa1c7f11\") " pod="openstack/swift-storage-0" Mar 08 04:15:59.567312 master-0 kubenswrapper[18592]: E0308 04:15:59.567249 18592 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 08 04:15:59.567312 master-0 kubenswrapper[18592]: E0308 04:15:59.567292 18592 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 08 04:15:59.567580 master-0 kubenswrapper[18592]: E0308 04:15:59.567474 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2166af23-aec1-40ee-9114-2a0ffa1c7f11-etc-swift podName:2166af23-aec1-40ee-9114-2a0ffa1c7f11 nodeName:}" failed. No retries permitted until 2026-03-08 04:16:00.067328011 +0000 UTC m=+1372.166082351 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/2166af23-aec1-40ee-9114-2a0ffa1c7f11-etc-swift") pod "swift-storage-0" (UID: "2166af23-aec1-40ee-9114-2a0ffa1c7f11") : configmap "swift-ring-files" not found Mar 08 04:15:59.568117 master-0 kubenswrapper[18592]: I0308 04:15:59.567990 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/2166af23-aec1-40ee-9114-2a0ffa1c7f11-lock\") pod \"swift-storage-0\" (UID: \"2166af23-aec1-40ee-9114-2a0ffa1c7f11\") " pod="openstack/swift-storage-0" Mar 08 04:15:59.568659 master-0 kubenswrapper[18592]: I0308 04:15:59.568241 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2166af23-aec1-40ee-9114-2a0ffa1c7f11-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"2166af23-aec1-40ee-9114-2a0ffa1c7f11\") " pod="openstack/swift-storage-0" Mar 08 04:15:59.569224 master-0 kubenswrapper[18592]: I0308 04:15:59.569205 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/2166af23-aec1-40ee-9114-2a0ffa1c7f11-cache\") pod \"swift-storage-0\" (UID: \"2166af23-aec1-40ee-9114-2a0ffa1c7f11\") " pod="openstack/swift-storage-0" Mar 08 04:15:59.570347 master-0 kubenswrapper[18592]: I0308 04:15:59.569953 18592 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b7bbafa-ea6f-47ac-b71f-1cddb5b4adf6-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 04:15:59.572038 master-0 kubenswrapper[18592]: I0308 04:15:59.571287 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b7bbafa-ea6f-47ac-b71f-1cddb5b4adf6-kube-api-access-kdsvc" (OuterVolumeSpecName: "kube-api-access-kdsvc") pod "6b7bbafa-ea6f-47ac-b71f-1cddb5b4adf6" (UID: "6b7bbafa-ea6f-47ac-b71f-1cddb5b4adf6"). 
InnerVolumeSpecName "kube-api-access-kdsvc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 04:15:59.578870 master-0 kubenswrapper[18592]: I0308 04:15:59.578811 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2166af23-aec1-40ee-9114-2a0ffa1c7f11-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"2166af23-aec1-40ee-9114-2a0ffa1c7f11\") " pod="openstack/swift-storage-0" Mar 08 04:15:59.579435 master-0 kubenswrapper[18592]: I0308 04:15:59.579387 18592 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 08 04:15:59.579435 master-0 kubenswrapper[18592]: I0308 04:15:59.579427 18592 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-5242282a-c1e6-407f-9a94-df076247e568\" (UniqueName: \"kubernetes.io/csi/topolvm.io^ca7c0a39-d842-4027-b057-143585a04d7a\") pod \"swift-storage-0\" (UID: \"2166af23-aec1-40ee-9114-2a0ffa1c7f11\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/3b23404b1dec12b813530c763a319dfcce12fcecae5b3ec4657ea9277918c73f/globalmount\"" pod="openstack/swift-storage-0" Mar 08 04:15:59.584805 master-0 kubenswrapper[18592]: I0308 04:15:59.584762 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khz2m\" (UniqueName: \"kubernetes.io/projected/2166af23-aec1-40ee-9114-2a0ffa1c7f11-kube-api-access-khz2m\") pod \"swift-storage-0\" (UID: \"2166af23-aec1-40ee-9114-2a0ffa1c7f11\") " pod="openstack/swift-storage-0" Mar 08 04:15:59.676027 master-0 kubenswrapper[18592]: I0308 04:15:59.675980 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdsvc\" (UniqueName: \"kubernetes.io/projected/6b7bbafa-ea6f-47ac-b71f-1cddb5b4adf6-kube-api-access-kdsvc\") on node \"master-0\" DevicePath \"\"" Mar 08 04:15:59.691535 master-0 kubenswrapper[18592]: I0308 04:15:59.691478 18592 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-6m2qh" Mar 08 04:15:59.699882 master-0 kubenswrapper[18592]: I0308 04:15:59.699831 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-gc5cl" Mar 08 04:15:59.776973 master-0 kubenswrapper[18592]: I0308 04:15:59.776915 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d70e50d-f856-4cb6-bebe-58e2584f70dc-operator-scripts\") pod \"5d70e50d-f856-4cb6-bebe-58e2584f70dc\" (UID: \"5d70e50d-f856-4cb6-bebe-58e2584f70dc\") " Mar 08 04:15:59.776973 master-0 kubenswrapper[18592]: I0308 04:15:59.776975 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxmkj\" (UniqueName: \"kubernetes.io/projected/5d70e50d-f856-4cb6-bebe-58e2584f70dc-kube-api-access-kxmkj\") pod \"5d70e50d-f856-4cb6-bebe-58e2584f70dc\" (UID: \"5d70e50d-f856-4cb6-bebe-58e2584f70dc\") " Mar 08 04:15:59.777210 master-0 kubenswrapper[18592]: I0308 04:15:59.777108 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72wj7\" (UniqueName: \"kubernetes.io/projected/d7ce4a7b-aee1-4b45-a17c-73289f866b0d-kube-api-access-72wj7\") pod \"d7ce4a7b-aee1-4b45-a17c-73289f866b0d\" (UID: \"d7ce4a7b-aee1-4b45-a17c-73289f866b0d\") " Mar 08 04:15:59.777210 master-0 kubenswrapper[18592]: I0308 04:15:59.777188 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7ce4a7b-aee1-4b45-a17c-73289f866b0d-operator-scripts\") pod \"d7ce4a7b-aee1-4b45-a17c-73289f866b0d\" (UID: \"d7ce4a7b-aee1-4b45-a17c-73289f866b0d\") " Mar 08 04:15:59.777999 master-0 kubenswrapper[18592]: I0308 04:15:59.777958 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/5d70e50d-f856-4cb6-bebe-58e2584f70dc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5d70e50d-f856-4cb6-bebe-58e2584f70dc" (UID: "5d70e50d-f856-4cb6-bebe-58e2584f70dc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 04:15:59.778269 master-0 kubenswrapper[18592]: I0308 04:15:59.778240 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7ce4a7b-aee1-4b45-a17c-73289f866b0d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d7ce4a7b-aee1-4b45-a17c-73289f866b0d" (UID: "d7ce4a7b-aee1-4b45-a17c-73289f866b0d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 04:15:59.782724 master-0 kubenswrapper[18592]: I0308 04:15:59.782678 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7ce4a7b-aee1-4b45-a17c-73289f866b0d-kube-api-access-72wj7" (OuterVolumeSpecName: "kube-api-access-72wj7") pod "d7ce4a7b-aee1-4b45-a17c-73289f866b0d" (UID: "d7ce4a7b-aee1-4b45-a17c-73289f866b0d"). InnerVolumeSpecName "kube-api-access-72wj7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 04:15:59.794207 master-0 kubenswrapper[18592]: I0308 04:15:59.794116 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d70e50d-f856-4cb6-bebe-58e2584f70dc-kube-api-access-kxmkj" (OuterVolumeSpecName: "kube-api-access-kxmkj") pod "5d70e50d-f856-4cb6-bebe-58e2584f70dc" (UID: "5d70e50d-f856-4cb6-bebe-58e2584f70dc"). InnerVolumeSpecName "kube-api-access-kxmkj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 04:15:59.882414 master-0 kubenswrapper[18592]: I0308 04:15:59.879178 18592 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7ce4a7b-aee1-4b45-a17c-73289f866b0d-operator-scripts\") on node \"master-0\" DevicePath \"\""
Mar 08 04:15:59.882414 master-0 kubenswrapper[18592]: I0308 04:15:59.879226 18592 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d70e50d-f856-4cb6-bebe-58e2584f70dc-operator-scripts\") on node \"master-0\" DevicePath \"\""
Mar 08 04:15:59.882414 master-0 kubenswrapper[18592]: I0308 04:15:59.879239 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxmkj\" (UniqueName: \"kubernetes.io/projected/5d70e50d-f856-4cb6-bebe-58e2584f70dc-kube-api-access-kxmkj\") on node \"master-0\" DevicePath \"\""
Mar 08 04:15:59.882414 master-0 kubenswrapper[18592]: I0308 04:15:59.879253 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72wj7\" (UniqueName: \"kubernetes.io/projected/d7ce4a7b-aee1-4b45-a17c-73289f866b0d-kube-api-access-72wj7\") on node \"master-0\" DevicePath \"\""
Mar 08 04:15:59.928855 master-0 kubenswrapper[18592]: I0308 04:15:59.925161 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-f96f-account-create-update-69kzj" event={"ID":"6b7bbafa-ea6f-47ac-b71f-1cddb5b4adf6","Type":"ContainerDied","Data":"71dad1d918ec87ddbec8529df8ae10d1247fbc1494006ce710d2f00f734eb0d9"}
Mar 08 04:15:59.928855 master-0 kubenswrapper[18592]: I0308 04:15:59.925212 18592 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="71dad1d918ec87ddbec8529df8ae10d1247fbc1494006ce710d2f00f734eb0d9"
Mar 08 04:15:59.928855 master-0 kubenswrapper[18592]: I0308 04:15:59.925281 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-f96f-account-create-update-69kzj"
Mar 08 04:15:59.957881 master-0 kubenswrapper[18592]: I0308 04:15:59.957799 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-gc5cl" event={"ID":"d7ce4a7b-aee1-4b45-a17c-73289f866b0d","Type":"ContainerDied","Data":"230618bea03f1481f990d40631ae276a866cea0368e8537a72a261383c44beff"}
Mar 08 04:15:59.957881 master-0 kubenswrapper[18592]: I0308 04:15:59.957874 18592 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="230618bea03f1481f990d40631ae276a866cea0368e8537a72a261383c44beff"
Mar 08 04:15:59.958072 master-0 kubenswrapper[18592]: I0308 04:15:59.957957 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-gc5cl"
Mar 08 04:15:59.994619 master-0 kubenswrapper[18592]: I0308 04:15:59.994568 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-6m2qh"
Mar 08 04:15:59.996431 master-0 kubenswrapper[18592]: I0308 04:15:59.995887 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-6m2qh" event={"ID":"5d70e50d-f856-4cb6-bebe-58e2584f70dc","Type":"ContainerDied","Data":"010002a780df01858136dd9ee63cf45070cf8fef1dbf5d661bcf6d346372615c"}
Mar 08 04:15:59.996431 master-0 kubenswrapper[18592]: I0308 04:15:59.995941 18592 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="010002a780df01858136dd9ee63cf45070cf8fef1dbf5d661bcf6d346372615c"
Mar 08 04:16:00.014209 master-0 kubenswrapper[18592]: I0308 04:16:00.014148 18592 generic.go:334] "Generic (PLEG): container finished" podID="2bdf0b67-8d62-4dfc-ae07-730475c2a471" containerID="caba106c67fe491783d1663e22951fe25776b9cb8db793eb30d9ddb21dc94d5c" exitCode=0
Mar 08 04:16:00.017848 master-0 kubenswrapper[18592]: I0308 04:16:00.014873 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b9cd4dcf7-ln4zc" event={"ID":"2bdf0b67-8d62-4dfc-ae07-730475c2a471","Type":"ContainerDied","Data":"caba106c67fe491783d1663e22951fe25776b9cb8db793eb30d9ddb21dc94d5c"}
Mar 08 04:16:00.055923 master-0 kubenswrapper[18592]: I0308 04:16:00.051525 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-jqtb7"]
Mar 08 04:16:00.064842 master-0 kubenswrapper[18592]: E0308 04:16:00.059580 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b7bbafa-ea6f-47ac-b71f-1cddb5b4adf6" containerName="mariadb-account-create-update"
Mar 08 04:16:00.064842 master-0 kubenswrapper[18592]: I0308 04:16:00.059628 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b7bbafa-ea6f-47ac-b71f-1cddb5b4adf6" containerName="mariadb-account-create-update"
Mar 08 04:16:00.064842 master-0 kubenswrapper[18592]: E0308 04:16:00.059639 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7ce4a7b-aee1-4b45-a17c-73289f866b0d" containerName="mariadb-database-create"
Mar 08 04:16:00.064842 master-0 kubenswrapper[18592]: I0308 04:16:00.059645 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7ce4a7b-aee1-4b45-a17c-73289f866b0d" containerName="mariadb-database-create"
Mar 08 04:16:00.064842 master-0 kubenswrapper[18592]: E0308 04:16:00.059661 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d70e50d-f856-4cb6-bebe-58e2584f70dc" containerName="mariadb-database-create"
Mar 08 04:16:00.064842 master-0 kubenswrapper[18592]: I0308 04:16:00.059666 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d70e50d-f856-4cb6-bebe-58e2584f70dc" containerName="mariadb-database-create"
Mar 08 04:16:00.064842 master-0 kubenswrapper[18592]: I0308 04:16:00.059875 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7ce4a7b-aee1-4b45-a17c-73289f866b0d" containerName="mariadb-database-create"
Mar 08 04:16:00.064842 master-0 kubenswrapper[18592]: I0308 04:16:00.059910 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d70e50d-f856-4cb6-bebe-58e2584f70dc" containerName="mariadb-database-create"
Mar 08 04:16:00.064842 master-0 kubenswrapper[18592]: I0308 04:16:00.059929 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b7bbafa-ea6f-47ac-b71f-1cddb5b4adf6" containerName="mariadb-account-create-update"
Mar 08 04:16:00.064842 master-0 kubenswrapper[18592]: I0308 04:16:00.060572 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-jqtb7"
Mar 08 04:16:00.097845 master-0 kubenswrapper[18592]: I0308 04:16:00.089676 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2166af23-aec1-40ee-9114-2a0ffa1c7f11-etc-swift\") pod \"swift-storage-0\" (UID: \"2166af23-aec1-40ee-9114-2a0ffa1c7f11\") " pod="openstack/swift-storage-0"
Mar 08 04:16:00.097845 master-0 kubenswrapper[18592]: E0308 04:16:00.090411 18592 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 08 04:16:00.097845 master-0 kubenswrapper[18592]: E0308 04:16:00.090426 18592 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 08 04:16:00.097845 master-0 kubenswrapper[18592]: E0308 04:16:00.090459 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2166af23-aec1-40ee-9114-2a0ffa1c7f11-etc-swift podName:2166af23-aec1-40ee-9114-2a0ffa1c7f11 nodeName:}" failed. No retries permitted until 2026-03-08 04:16:01.090447402 +0000 UTC m=+1373.189201752 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/2166af23-aec1-40ee-9114-2a0ffa1c7f11-etc-swift") pod "swift-storage-0" (UID: "2166af23-aec1-40ee-9114-2a0ffa1c7f11") : configmap "swift-ring-files" not found
Mar 08 04:16:00.097845 master-0 kubenswrapper[18592]: I0308 04:16:00.097232 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-jqtb7"]
Mar 08 04:16:00.201247 master-0 kubenswrapper[18592]: I0308 04:16:00.194245 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5pjp\" (UniqueName: \"kubernetes.io/projected/6e539cde-6478-4bc2-9695-96613a0ef358-kube-api-access-v5pjp\") pod \"glance-db-create-jqtb7\" (UID: \"6e539cde-6478-4bc2-9695-96613a0ef358\") " pod="openstack/glance-db-create-jqtb7"
Mar 08 04:16:00.201247 master-0 kubenswrapper[18592]: I0308 04:16:00.194342 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e539cde-6478-4bc2-9695-96613a0ef358-operator-scripts\") pod \"glance-db-create-jqtb7\" (UID: \"6e539cde-6478-4bc2-9695-96613a0ef358\") " pod="openstack/glance-db-create-jqtb7"
Mar 08 04:16:00.233726 master-0 kubenswrapper[18592]: I0308 04:16:00.233480 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-8cd0-account-create-update-95lhk"]
Mar 08 04:16:00.240652 master-0 kubenswrapper[18592]: I0308 04:16:00.235565 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-8cd0-account-create-update-95lhk"
Mar 08 04:16:00.240652 master-0 kubenswrapper[18592]: I0308 04:16:00.238890 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret"
Mar 08 04:16:00.258035 master-0 kubenswrapper[18592]: I0308 04:16:00.257976 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-8cd0-account-create-update-95lhk"]
Mar 08 04:16:00.296309 master-0 kubenswrapper[18592]: I0308 04:16:00.296252 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5pjp\" (UniqueName: \"kubernetes.io/projected/6e539cde-6478-4bc2-9695-96613a0ef358-kube-api-access-v5pjp\") pod \"glance-db-create-jqtb7\" (UID: \"6e539cde-6478-4bc2-9695-96613a0ef358\") " pod="openstack/glance-db-create-jqtb7"
Mar 08 04:16:00.296409 master-0 kubenswrapper[18592]: I0308 04:16:00.296344 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe552f6f-dbcf-4c72-9829-797c7f222d57-operator-scripts\") pod \"glance-8cd0-account-create-update-95lhk\" (UID: \"fe552f6f-dbcf-4c72-9829-797c7f222d57\") " pod="openstack/glance-8cd0-account-create-update-95lhk"
Mar 08 04:16:00.296409 master-0 kubenswrapper[18592]: I0308 04:16:00.296390 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdvmw\" (UniqueName: \"kubernetes.io/projected/fe552f6f-dbcf-4c72-9829-797c7f222d57-kube-api-access-kdvmw\") pod \"glance-8cd0-account-create-update-95lhk\" (UID: \"fe552f6f-dbcf-4c72-9829-797c7f222d57\") " pod="openstack/glance-8cd0-account-create-update-95lhk"
Mar 08 04:16:00.296507 master-0 kubenswrapper[18592]: I0308 04:16:00.296452 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e539cde-6478-4bc2-9695-96613a0ef358-operator-scripts\") pod \"glance-db-create-jqtb7\" (UID: \"6e539cde-6478-4bc2-9695-96613a0ef358\") " pod="openstack/glance-db-create-jqtb7"
Mar 08 04:16:00.299532 master-0 kubenswrapper[18592]: I0308 04:16:00.299487 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e539cde-6478-4bc2-9695-96613a0ef358-operator-scripts\") pod \"glance-db-create-jqtb7\" (UID: \"6e539cde-6478-4bc2-9695-96613a0ef358\") " pod="openstack/glance-db-create-jqtb7"
Mar 08 04:16:00.313943 master-0 kubenswrapper[18592]: I0308 04:16:00.313851 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5pjp\" (UniqueName: \"kubernetes.io/projected/6e539cde-6478-4bc2-9695-96613a0ef358-kube-api-access-v5pjp\") pod \"glance-db-create-jqtb7\" (UID: \"6e539cde-6478-4bc2-9695-96613a0ef358\") " pod="openstack/glance-db-create-jqtb7"
Mar 08 04:16:00.399200 master-0 kubenswrapper[18592]: I0308 04:16:00.398878 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe552f6f-dbcf-4c72-9829-797c7f222d57-operator-scripts\") pod \"glance-8cd0-account-create-update-95lhk\" (UID: \"fe552f6f-dbcf-4c72-9829-797c7f222d57\") " pod="openstack/glance-8cd0-account-create-update-95lhk"
Mar 08 04:16:00.399200 master-0 kubenswrapper[18592]: I0308 04:16:00.398953 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdvmw\" (UniqueName: \"kubernetes.io/projected/fe552f6f-dbcf-4c72-9829-797c7f222d57-kube-api-access-kdvmw\") pod \"glance-8cd0-account-create-update-95lhk\" (UID: \"fe552f6f-dbcf-4c72-9829-797c7f222d57\") " pod="openstack/glance-8cd0-account-create-update-95lhk"
Mar 08 04:16:00.400753 master-0 kubenswrapper[18592]: I0308 04:16:00.399878 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe552f6f-dbcf-4c72-9829-797c7f222d57-operator-scripts\") pod \"glance-8cd0-account-create-update-95lhk\" (UID: \"fe552f6f-dbcf-4c72-9829-797c7f222d57\") " pod="openstack/glance-8cd0-account-create-update-95lhk"
Mar 08 04:16:00.415856 master-0 kubenswrapper[18592]: I0308 04:16:00.413346 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdvmw\" (UniqueName: \"kubernetes.io/projected/fe552f6f-dbcf-4c72-9829-797c7f222d57-kube-api-access-kdvmw\") pod \"glance-8cd0-account-create-update-95lhk\" (UID: \"fe552f6f-dbcf-4c72-9829-797c7f222d57\") " pod="openstack/glance-8cd0-account-create-update-95lhk"
Mar 08 04:16:00.462764 master-0 kubenswrapper[18592]: I0308 04:16:00.462733 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-jqtb7"
Mar 08 04:16:00.552059 master-0 kubenswrapper[18592]: I0308 04:16:00.551990 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-8cd0-account-create-update-95lhk"
Mar 08 04:16:00.579595 master-0 kubenswrapper[18592]: I0308 04:16:00.579261 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d6d7-account-create-update-gmt7m"
Mar 08 04:16:00.708343 master-0 kubenswrapper[18592]: I0308 04:16:00.707502 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83dacd91-8bfe-4f52-a178-b5b51f46a968-operator-scripts\") pod \"83dacd91-8bfe-4f52-a178-b5b51f46a968\" (UID: \"83dacd91-8bfe-4f52-a178-b5b51f46a968\") "
Mar 08 04:16:00.708343 master-0 kubenswrapper[18592]: I0308 04:16:00.707739 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c24fs\" (UniqueName: \"kubernetes.io/projected/83dacd91-8bfe-4f52-a178-b5b51f46a968-kube-api-access-c24fs\") pod \"83dacd91-8bfe-4f52-a178-b5b51f46a968\" (UID: \"83dacd91-8bfe-4f52-a178-b5b51f46a968\") "
Mar 08 04:16:00.709770 master-0 kubenswrapper[18592]: I0308 04:16:00.709719 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83dacd91-8bfe-4f52-a178-b5b51f46a968-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "83dacd91-8bfe-4f52-a178-b5b51f46a968" (UID: "83dacd91-8bfe-4f52-a178-b5b51f46a968"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 04:16:00.712856 master-0 kubenswrapper[18592]: I0308 04:16:00.712757 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83dacd91-8bfe-4f52-a178-b5b51f46a968-kube-api-access-c24fs" (OuterVolumeSpecName: "kube-api-access-c24fs") pod "83dacd91-8bfe-4f52-a178-b5b51f46a968" (UID: "83dacd91-8bfe-4f52-a178-b5b51f46a968"). InnerVolumeSpecName "kube-api-access-c24fs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 04:16:00.811524 master-0 kubenswrapper[18592]: I0308 04:16:00.811466 18592 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83dacd91-8bfe-4f52-a178-b5b51f46a968-operator-scripts\") on node \"master-0\" DevicePath \"\""
Mar 08 04:16:00.811524 master-0 kubenswrapper[18592]: I0308 04:16:00.811511 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c24fs\" (UniqueName: \"kubernetes.io/projected/83dacd91-8bfe-4f52-a178-b5b51f46a968-kube-api-access-c24fs\") on node \"master-0\" DevicePath \"\""
Mar 08 04:16:00.952431 master-0 kubenswrapper[18592]: I0308 04:16:00.952374 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-5242282a-c1e6-407f-9a94-df076247e568\" (UniqueName: \"kubernetes.io/csi/topolvm.io^ca7c0a39-d842-4027-b057-143585a04d7a\") pod \"swift-storage-0\" (UID: \"2166af23-aec1-40ee-9114-2a0ffa1c7f11\") " pod="openstack/swift-storage-0"
Mar 08 04:16:01.012843 master-0 kubenswrapper[18592]: I0308 04:16:01.012778 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-jqtb7"]
Mar 08 04:16:01.028104 master-0 kubenswrapper[18592]: W0308 04:16:01.026993 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e539cde_6478_4bc2_9695_96613a0ef358.slice/crio-e176e8c417fa9a9518b7292fd6283d07d6576fb9df3356bf271c0c950729e7e3 WatchSource:0}: Error finding container e176e8c417fa9a9518b7292fd6283d07d6576fb9df3356bf271c0c950729e7e3: Status 404 returned error can't find the container with id e176e8c417fa9a9518b7292fd6283d07d6576fb9df3356bf271c0c950729e7e3
Mar 08 04:16:01.079544 master-0 kubenswrapper[18592]: I0308 04:16:01.079478 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d6d7-account-create-update-gmt7m" event={"ID":"83dacd91-8bfe-4f52-a178-b5b51f46a968","Type":"ContainerDied","Data":"8489167b1e440a53ab9f0db407c28a85d09b75e939fc7c516c09bbb594f98545"}
Mar 08 04:16:01.079544 master-0 kubenswrapper[18592]: I0308 04:16:01.079541 18592 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8489167b1e440a53ab9f0db407c28a85d09b75e939fc7c516c09bbb594f98545"
Mar 08 04:16:01.079794 master-0 kubenswrapper[18592]: I0308 04:16:01.079639 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d6d7-account-create-update-gmt7m"
Mar 08 04:16:01.082336 master-0 kubenswrapper[18592]: I0308 04:16:01.082268 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b9cd4dcf7-ln4zc" event={"ID":"2bdf0b67-8d62-4dfc-ae07-730475c2a471","Type":"ContainerStarted","Data":"0d093d9478d9ee3a76fd2b9381328c07c09751f36489ef028cc9d3e5acc5da32"}
Mar 08 04:16:01.088575 master-0 kubenswrapper[18592]: I0308 04:16:01.084094 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b9cd4dcf7-ln4zc"
Mar 08 04:16:01.127425 master-0 kubenswrapper[18592]: I0308 04:16:01.127288 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2166af23-aec1-40ee-9114-2a0ffa1c7f11-etc-swift\") pod \"swift-storage-0\" (UID: \"2166af23-aec1-40ee-9114-2a0ffa1c7f11\") " pod="openstack/swift-storage-0"
Mar 08 04:16:01.127778 master-0 kubenswrapper[18592]: E0308 04:16:01.127745 18592 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 08 04:16:01.127778 master-0 kubenswrapper[18592]: E0308 04:16:01.127775 18592 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 08 04:16:01.127879 master-0 kubenswrapper[18592]: E0308 04:16:01.127852 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2166af23-aec1-40ee-9114-2a0ffa1c7f11-etc-swift podName:2166af23-aec1-40ee-9114-2a0ffa1c7f11 nodeName:}" failed. No retries permitted until 2026-03-08 04:16:03.127814761 +0000 UTC m=+1375.226569111 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/2166af23-aec1-40ee-9114-2a0ffa1c7f11-etc-swift") pod "swift-storage-0" (UID: "2166af23-aec1-40ee-9114-2a0ffa1c7f11") : configmap "swift-ring-files" not found
Mar 08 04:16:01.128096 master-0 kubenswrapper[18592]: I0308 04:16:01.128004 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b9cd4dcf7-ln4zc" podStartSLOduration=4.127896603 podStartE2EDuration="4.127896603s" podCreationTimestamp="2026-03-08 04:15:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 04:16:01.111652686 +0000 UTC m=+1373.210407026" watchObservedRunningTime="2026-03-08 04:16:01.127896603 +0000 UTC m=+1373.226650953"
Mar 08 04:16:01.153313 master-0 kubenswrapper[18592]: I0308 04:16:01.153027 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-8cd0-account-create-update-95lhk"]
Mar 08 04:16:01.157068 master-0 kubenswrapper[18592]: W0308 04:16:01.157002 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe552f6f_dbcf_4c72_9829_797c7f222d57.slice/crio-c29a1002e956173057f1dedf60e9b24c3b57aa6fa3b50768d2cde90682767a2d WatchSource:0}: Error finding container c29a1002e956173057f1dedf60e9b24c3b57aa6fa3b50768d2cde90682767a2d: Status 404 returned error can't find the container with id c29a1002e956173057f1dedf60e9b24c3b57aa6fa3b50768d2cde90682767a2d
Mar 08 04:16:02.097027 master-0 kubenswrapper[18592]: I0308 04:16:02.096955 18592 generic.go:334] "Generic (PLEG): container finished" podID="fe552f6f-dbcf-4c72-9829-797c7f222d57" containerID="e01d199363f07c5fa381899fbf3fe69617c329f6230fb5d4428dd4c45d145b4d" exitCode=0
Mar 08 04:16:02.097648 master-0 kubenswrapper[18592]: I0308 04:16:02.097015 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-8cd0-account-create-update-95lhk" event={"ID":"fe552f6f-dbcf-4c72-9829-797c7f222d57","Type":"ContainerDied","Data":"e01d199363f07c5fa381899fbf3fe69617c329f6230fb5d4428dd4c45d145b4d"}
Mar 08 04:16:02.097648 master-0 kubenswrapper[18592]: I0308 04:16:02.097107 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-8cd0-account-create-update-95lhk" event={"ID":"fe552f6f-dbcf-4c72-9829-797c7f222d57","Type":"ContainerStarted","Data":"c29a1002e956173057f1dedf60e9b24c3b57aa6fa3b50768d2cde90682767a2d"}
Mar 08 04:16:02.099655 master-0 kubenswrapper[18592]: I0308 04:16:02.099517 18592 generic.go:334] "Generic (PLEG): container finished" podID="6e539cde-6478-4bc2-9695-96613a0ef358" containerID="bb11fccfedd8503f4a2d9dc61da670f56dde7f182f3884caef5d346ac9ecdd2d" exitCode=0
Mar 08 04:16:02.099655 master-0 kubenswrapper[18592]: I0308 04:16:02.099578 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-jqtb7" event={"ID":"6e539cde-6478-4bc2-9695-96613a0ef358","Type":"ContainerDied","Data":"bb11fccfedd8503f4a2d9dc61da670f56dde7f182f3884caef5d346ac9ecdd2d"}
Mar 08 04:16:02.099655 master-0 kubenswrapper[18592]: I0308 04:16:02.099651 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-jqtb7" event={"ID":"6e539cde-6478-4bc2-9695-96613a0ef358","Type":"ContainerStarted","Data":"e176e8c417fa9a9518b7292fd6283d07d6576fb9df3356bf271c0c950729e7e3"}
Mar 08 04:16:02.311919 master-0 kubenswrapper[18592]: I0308 04:16:02.311845 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-hmnb8"]
Mar 08 04:16:02.312573 master-0 kubenswrapper[18592]: E0308 04:16:02.312536 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83dacd91-8bfe-4f52-a178-b5b51f46a968" containerName="mariadb-account-create-update"
Mar 08 04:16:02.312650 master-0 kubenswrapper[18592]: I0308 04:16:02.312572 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="83dacd91-8bfe-4f52-a178-b5b51f46a968" containerName="mariadb-account-create-update"
Mar 08 04:16:02.313123 master-0 kubenswrapper[18592]: I0308 04:16:02.313089 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="83dacd91-8bfe-4f52-a178-b5b51f46a968" containerName="mariadb-account-create-update"
Mar 08 04:16:02.314257 master-0 kubenswrapper[18592]: I0308 04:16:02.314215 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-hmnb8"
Mar 08 04:16:02.317998 master-0 kubenswrapper[18592]: I0308 04:16:02.317942 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data"
Mar 08 04:16:02.318323 master-0 kubenswrapper[18592]: I0308 04:16:02.318293 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts"
Mar 08 04:16:02.321947 master-0 kubenswrapper[18592]: I0308 04:16:02.321601 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data"
Mar 08 04:16:02.327166 master-0 kubenswrapper[18592]: I0308 04:16:02.327108 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-hmnb8"]
Mar 08 04:16:02.455006 master-0 kubenswrapper[18592]: I0308 04:16:02.454929 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/284c1d20-bb45-4e62-9ebe-76fdb2e4fd01-swiftconf\") pod \"swift-ring-rebalance-hmnb8\" (UID: \"284c1d20-bb45-4e62-9ebe-76fdb2e4fd01\") " pod="openstack/swift-ring-rebalance-hmnb8"
Mar 08 04:16:02.455256 master-0 kubenswrapper[18592]: I0308 04:16:02.455204 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/284c1d20-bb45-4e62-9ebe-76fdb2e4fd01-combined-ca-bundle\") pod \"swift-ring-rebalance-hmnb8\" (UID: \"284c1d20-bb45-4e62-9ebe-76fdb2e4fd01\") " pod="openstack/swift-ring-rebalance-hmnb8"
Mar 08 04:16:02.456205 master-0 kubenswrapper[18592]: I0308 04:16:02.455429 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/284c1d20-bb45-4e62-9ebe-76fdb2e4fd01-dispersionconf\") pod \"swift-ring-rebalance-hmnb8\" (UID: \"284c1d20-bb45-4e62-9ebe-76fdb2e4fd01\") " pod="openstack/swift-ring-rebalance-hmnb8"
Mar 08 04:16:02.456205 master-0 kubenswrapper[18592]: I0308 04:16:02.455540 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/284c1d20-bb45-4e62-9ebe-76fdb2e4fd01-scripts\") pod \"swift-ring-rebalance-hmnb8\" (UID: \"284c1d20-bb45-4e62-9ebe-76fdb2e4fd01\") " pod="openstack/swift-ring-rebalance-hmnb8"
Mar 08 04:16:02.456205 master-0 kubenswrapper[18592]: I0308 04:16:02.455743 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/284c1d20-bb45-4e62-9ebe-76fdb2e4fd01-ring-data-devices\") pod \"swift-ring-rebalance-hmnb8\" (UID: \"284c1d20-bb45-4e62-9ebe-76fdb2e4fd01\") " pod="openstack/swift-ring-rebalance-hmnb8"
Mar 08 04:16:02.456205 master-0 kubenswrapper[18592]: I0308 04:16:02.455872 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/284c1d20-bb45-4e62-9ebe-76fdb2e4fd01-etc-swift\") pod \"swift-ring-rebalance-hmnb8\" (UID: \"284c1d20-bb45-4e62-9ebe-76fdb2e4fd01\") " pod="openstack/swift-ring-rebalance-hmnb8"
Mar 08 04:16:02.456205 master-0 kubenswrapper[18592]: I0308 04:16:02.455894 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhs52\" (UniqueName: \"kubernetes.io/projected/284c1d20-bb45-4e62-9ebe-76fdb2e4fd01-kube-api-access-nhs52\") pod \"swift-ring-rebalance-hmnb8\" (UID: \"284c1d20-bb45-4e62-9ebe-76fdb2e4fd01\") " pod="openstack/swift-ring-rebalance-hmnb8"
Mar 08 04:16:02.558280 master-0 kubenswrapper[18592]: I0308 04:16:02.558202 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/284c1d20-bb45-4e62-9ebe-76fdb2e4fd01-swiftconf\") pod \"swift-ring-rebalance-hmnb8\" (UID: \"284c1d20-bb45-4e62-9ebe-76fdb2e4fd01\") " pod="openstack/swift-ring-rebalance-hmnb8"
Mar 08 04:16:02.558506 master-0 kubenswrapper[18592]: I0308 04:16:02.558334 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/284c1d20-bb45-4e62-9ebe-76fdb2e4fd01-combined-ca-bundle\") pod \"swift-ring-rebalance-hmnb8\" (UID: \"284c1d20-bb45-4e62-9ebe-76fdb2e4fd01\") " pod="openstack/swift-ring-rebalance-hmnb8"
Mar 08 04:16:02.558506 master-0 kubenswrapper[18592]: I0308 04:16:02.558424 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/284c1d20-bb45-4e62-9ebe-76fdb2e4fd01-dispersionconf\") pod \"swift-ring-rebalance-hmnb8\" (UID: \"284c1d20-bb45-4e62-9ebe-76fdb2e4fd01\") " pod="openstack/swift-ring-rebalance-hmnb8"
Mar 08 04:16:02.558506 master-0 kubenswrapper[18592]: I0308 04:16:02.558488 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/284c1d20-bb45-4e62-9ebe-76fdb2e4fd01-scripts\") pod \"swift-ring-rebalance-hmnb8\" (UID: \"284c1d20-bb45-4e62-9ebe-76fdb2e4fd01\") " pod="openstack/swift-ring-rebalance-hmnb8"
Mar 08 04:16:02.558616 master-0 kubenswrapper[18592]: I0308 04:16:02.558546 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/284c1d20-bb45-4e62-9ebe-76fdb2e4fd01-ring-data-devices\") pod \"swift-ring-rebalance-hmnb8\" (UID: \"284c1d20-bb45-4e62-9ebe-76fdb2e4fd01\") " pod="openstack/swift-ring-rebalance-hmnb8"
Mar 08 04:16:02.558616 master-0 kubenswrapper[18592]: I0308 04:16:02.558587 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/284c1d20-bb45-4e62-9ebe-76fdb2e4fd01-etc-swift\") pod \"swift-ring-rebalance-hmnb8\" (UID: \"284c1d20-bb45-4e62-9ebe-76fdb2e4fd01\") " pod="openstack/swift-ring-rebalance-hmnb8"
Mar 08 04:16:02.558678 master-0 kubenswrapper[18592]: I0308 04:16:02.558625 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhs52\" (UniqueName: \"kubernetes.io/projected/284c1d20-bb45-4e62-9ebe-76fdb2e4fd01-kube-api-access-nhs52\") pod \"swift-ring-rebalance-hmnb8\" (UID: \"284c1d20-bb45-4e62-9ebe-76fdb2e4fd01\") " pod="openstack/swift-ring-rebalance-hmnb8"
Mar 08 04:16:02.559647 master-0 kubenswrapper[18592]: I0308 04:16:02.559583 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/284c1d20-bb45-4e62-9ebe-76fdb2e4fd01-etc-swift\") pod \"swift-ring-rebalance-hmnb8\" (UID: \"284c1d20-bb45-4e62-9ebe-76fdb2e4fd01\") " pod="openstack/swift-ring-rebalance-hmnb8"
Mar 08 04:16:02.560248 master-0 kubenswrapper[18592]: I0308 04:16:02.559703 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/284c1d20-bb45-4e62-9ebe-76fdb2e4fd01-ring-data-devices\") pod \"swift-ring-rebalance-hmnb8\" (UID: \"284c1d20-bb45-4e62-9ebe-76fdb2e4fd01\") " pod="openstack/swift-ring-rebalance-hmnb8"
Mar 08 04:16:02.560248 master-0 kubenswrapper[18592]: I0308 04:16:02.560206 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/284c1d20-bb45-4e62-9ebe-76fdb2e4fd01-scripts\") pod \"swift-ring-rebalance-hmnb8\" (UID: \"284c1d20-bb45-4e62-9ebe-76fdb2e4fd01\") " pod="openstack/swift-ring-rebalance-hmnb8"
Mar 08 04:16:02.563175 master-0 kubenswrapper[18592]: I0308 04:16:02.563119 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/284c1d20-bb45-4e62-9ebe-76fdb2e4fd01-swiftconf\") pod \"swift-ring-rebalance-hmnb8\" (UID: \"284c1d20-bb45-4e62-9ebe-76fdb2e4fd01\") " pod="openstack/swift-ring-rebalance-hmnb8"
Mar 08 04:16:02.572906 master-0 kubenswrapper[18592]: I0308 04:16:02.572790 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/284c1d20-bb45-4e62-9ebe-76fdb2e4fd01-combined-ca-bundle\") pod \"swift-ring-rebalance-hmnb8\" (UID: \"284c1d20-bb45-4e62-9ebe-76fdb2e4fd01\") " pod="openstack/swift-ring-rebalance-hmnb8"
Mar 08 04:16:02.578080 master-0 kubenswrapper[18592]: I0308 04:16:02.578038 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhs52\" (UniqueName: \"kubernetes.io/projected/284c1d20-bb45-4e62-9ebe-76fdb2e4fd01-kube-api-access-nhs52\") pod \"swift-ring-rebalance-hmnb8\" (UID: \"284c1d20-bb45-4e62-9ebe-76fdb2e4fd01\") " pod="openstack/swift-ring-rebalance-hmnb8"
Mar 08 04:16:02.578546 master-0 kubenswrapper[18592]: I0308 04:16:02.578485 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/284c1d20-bb45-4e62-9ebe-76fdb2e4fd01-dispersionconf\") pod \"swift-ring-rebalance-hmnb8\" (UID: \"284c1d20-bb45-4e62-9ebe-76fdb2e4fd01\") " pod="openstack/swift-ring-rebalance-hmnb8"
Mar 08 04:16:02.652658 master-0 kubenswrapper[18592]: I0308 04:16:02.652529 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-hmnb8"
Mar 08 04:16:03.033905 master-0 kubenswrapper[18592]: I0308 04:16:03.032885 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-k6c6g"]
Mar 08 04:16:03.040942 master-0 kubenswrapper[18592]: I0308 04:16:03.038430 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-k6c6g"]
Mar 08 04:16:03.088554 master-0 kubenswrapper[18592]: I0308 04:16:03.088497 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-5qfwd"]
Mar 08 04:16:03.090235 master-0 kubenswrapper[18592]: I0308 04:16:03.090202 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-5qfwd"
Mar 08 04:16:03.092637 master-0 kubenswrapper[18592]: I0308 04:16:03.092575 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret"
Mar 08 04:16:03.097338 master-0 kubenswrapper[18592]: I0308 04:16:03.097307 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-5qfwd"]
Mar 08 04:16:03.187042 master-0 kubenswrapper[18592]: I0308 04:16:03.185344 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2166af23-aec1-40ee-9114-2a0ffa1c7f11-etc-swift\") pod \"swift-storage-0\" (UID: \"2166af23-aec1-40ee-9114-2a0ffa1c7f11\") " pod="openstack/swift-storage-0"
Mar 08 04:16:03.187042 master-0 kubenswrapper[18592]: I0308 04:16:03.185487 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eed04369-6074-4ce6-9455-f47fe669f97d-operator-scripts\") pod \"root-account-create-update-5qfwd\" (UID: \"eed04369-6074-4ce6-9455-f47fe669f97d\") " pod="openstack/root-account-create-update-5qfwd"
Mar 08 04:16:03.187042 master-0 kubenswrapper[18592]: I0308 04:16:03.185526 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkbqc\" (UniqueName: \"kubernetes.io/projected/eed04369-6074-4ce6-9455-f47fe669f97d-kube-api-access-vkbqc\") pod \"root-account-create-update-5qfwd\" (UID: \"eed04369-6074-4ce6-9455-f47fe669f97d\") " pod="openstack/root-account-create-update-5qfwd"
Mar 08 04:16:03.193026 master-0 kubenswrapper[18592]: E0308 04:16:03.192986 18592 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 08 04:16:03.193026 master-0 kubenswrapper[18592]: E0308 04:16:03.193022 18592 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 08 04:16:03.200362 master-0 kubenswrapper[18592]: E0308 04:16:03.193090 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2166af23-aec1-40ee-9114-2a0ffa1c7f11-etc-swift podName:2166af23-aec1-40ee-9114-2a0ffa1c7f11 nodeName:}" failed. No retries permitted until 2026-03-08 04:16:07.193072037 +0000 UTC m=+1379.291826387 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/2166af23-aec1-40ee-9114-2a0ffa1c7f11-etc-swift") pod "swift-storage-0" (UID: "2166af23-aec1-40ee-9114-2a0ffa1c7f11") : configmap "swift-ring-files" not found Mar 08 04:16:03.236489 master-0 kubenswrapper[18592]: I0308 04:16:03.236446 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-hmnb8"] Mar 08 04:16:03.288337 master-0 kubenswrapper[18592]: I0308 04:16:03.288279 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eed04369-6074-4ce6-9455-f47fe669f97d-operator-scripts\") pod \"root-account-create-update-5qfwd\" (UID: \"eed04369-6074-4ce6-9455-f47fe669f97d\") " pod="openstack/root-account-create-update-5qfwd" Mar 08 04:16:03.288549 master-0 kubenswrapper[18592]: I0308 04:16:03.288348 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkbqc\" (UniqueName: \"kubernetes.io/projected/eed04369-6074-4ce6-9455-f47fe669f97d-kube-api-access-vkbqc\") pod \"root-account-create-update-5qfwd\" (UID: \"eed04369-6074-4ce6-9455-f47fe669f97d\") " pod="openstack/root-account-create-update-5qfwd" Mar 08 04:16:03.289318 master-0 kubenswrapper[18592]: I0308 04:16:03.289274 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eed04369-6074-4ce6-9455-f47fe669f97d-operator-scripts\") pod \"root-account-create-update-5qfwd\" (UID: \"eed04369-6074-4ce6-9455-f47fe669f97d\") " pod="openstack/root-account-create-update-5qfwd" Mar 08 04:16:03.305330 master-0 kubenswrapper[18592]: I0308 04:16:03.305274 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkbqc\" (UniqueName: \"kubernetes.io/projected/eed04369-6074-4ce6-9455-f47fe669f97d-kube-api-access-vkbqc\") pod \"root-account-create-update-5qfwd\" (UID: 
\"eed04369-6074-4ce6-9455-f47fe669f97d\") " pod="openstack/root-account-create-update-5qfwd" Mar 08 04:16:03.432311 master-0 kubenswrapper[18592]: I0308 04:16:03.432242 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-5qfwd" Mar 08 04:16:03.627984 master-0 kubenswrapper[18592]: I0308 04:16:03.627951 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-8cd0-account-create-update-95lhk" Mar 08 04:16:03.706894 master-0 kubenswrapper[18592]: I0308 04:16:03.701979 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdvmw\" (UniqueName: \"kubernetes.io/projected/fe552f6f-dbcf-4c72-9829-797c7f222d57-kube-api-access-kdvmw\") pod \"fe552f6f-dbcf-4c72-9829-797c7f222d57\" (UID: \"fe552f6f-dbcf-4c72-9829-797c7f222d57\") " Mar 08 04:16:03.706894 master-0 kubenswrapper[18592]: I0308 04:16:03.702092 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe552f6f-dbcf-4c72-9829-797c7f222d57-operator-scripts\") pod \"fe552f6f-dbcf-4c72-9829-797c7f222d57\" (UID: \"fe552f6f-dbcf-4c72-9829-797c7f222d57\") " Mar 08 04:16:03.706894 master-0 kubenswrapper[18592]: I0308 04:16:03.703378 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe552f6f-dbcf-4c72-9829-797c7f222d57-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fe552f6f-dbcf-4c72-9829-797c7f222d57" (UID: "fe552f6f-dbcf-4c72-9829-797c7f222d57"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:16:03.713344 master-0 kubenswrapper[18592]: I0308 04:16:03.713289 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe552f6f-dbcf-4c72-9829-797c7f222d57-kube-api-access-kdvmw" (OuterVolumeSpecName: "kube-api-access-kdvmw") pod "fe552f6f-dbcf-4c72-9829-797c7f222d57" (UID: "fe552f6f-dbcf-4c72-9829-797c7f222d57"). InnerVolumeSpecName "kube-api-access-kdvmw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 04:16:03.773964 master-0 kubenswrapper[18592]: I0308 04:16:03.773903 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-jqtb7" Mar 08 04:16:03.808088 master-0 kubenswrapper[18592]: I0308 04:16:03.808027 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdvmw\" (UniqueName: \"kubernetes.io/projected/fe552f6f-dbcf-4c72-9829-797c7f222d57-kube-api-access-kdvmw\") on node \"master-0\" DevicePath \"\"" Mar 08 04:16:03.808088 master-0 kubenswrapper[18592]: I0308 04:16:03.808071 18592 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe552f6f-dbcf-4c72-9829-797c7f222d57-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 04:16:03.945339 master-0 kubenswrapper[18592]: I0308 04:16:03.909239 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5pjp\" (UniqueName: \"kubernetes.io/projected/6e539cde-6478-4bc2-9695-96613a0ef358-kube-api-access-v5pjp\") pod \"6e539cde-6478-4bc2-9695-96613a0ef358\" (UID: \"6e539cde-6478-4bc2-9695-96613a0ef358\") " Mar 08 04:16:03.945339 master-0 kubenswrapper[18592]: I0308 04:16:03.909498 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e539cde-6478-4bc2-9695-96613a0ef358-operator-scripts\") pod 
\"6e539cde-6478-4bc2-9695-96613a0ef358\" (UID: \"6e539cde-6478-4bc2-9695-96613a0ef358\") " Mar 08 04:16:03.945339 master-0 kubenswrapper[18592]: I0308 04:16:03.910209 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e539cde-6478-4bc2-9695-96613a0ef358-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6e539cde-6478-4bc2-9695-96613a0ef358" (UID: "6e539cde-6478-4bc2-9695-96613a0ef358"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:16:03.946166 master-0 kubenswrapper[18592]: I0308 04:16:03.945650 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e539cde-6478-4bc2-9695-96613a0ef358-kube-api-access-v5pjp" (OuterVolumeSpecName: "kube-api-access-v5pjp") pod "6e539cde-6478-4bc2-9695-96613a0ef358" (UID: "6e539cde-6478-4bc2-9695-96613a0ef358"). InnerVolumeSpecName "kube-api-access-v5pjp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 04:16:03.996103 master-0 kubenswrapper[18592]: I0308 04:16:03.994471 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-5qfwd"] Mar 08 04:16:04.011758 master-0 kubenswrapper[18592]: I0308 04:16:04.011501 18592 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e539cde-6478-4bc2-9695-96613a0ef358-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 04:16:04.011758 master-0 kubenswrapper[18592]: I0308 04:16:04.011552 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5pjp\" (UniqueName: \"kubernetes.io/projected/6e539cde-6478-4bc2-9695-96613a0ef358-kube-api-access-v5pjp\") on node \"master-0\" DevicePath \"\"" Mar 08 04:16:04.020549 master-0 kubenswrapper[18592]: W0308 04:16:04.020467 18592 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeed04369_6074_4ce6_9455_f47fe669f97d.slice/crio-59c8ffcd88e8a8a140fd5c3a5be8be4bfb625f7be86a7d1a0f7622916b5443c0 WatchSource:0}: Error finding container 59c8ffcd88e8a8a140fd5c3a5be8be4bfb625f7be86a7d1a0f7622916b5443c0: Status 404 returned error can't find the container with id 59c8ffcd88e8a8a140fd5c3a5be8be4bfb625f7be86a7d1a0f7622916b5443c0 Mar 08 04:16:04.133099 master-0 kubenswrapper[18592]: I0308 04:16:04.133017 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-hmnb8" event={"ID":"284c1d20-bb45-4e62-9ebe-76fdb2e4fd01","Type":"ContainerStarted","Data":"f2190d44cdc5ff947757ef8157a735170fb60211ca63ac0e7954a99fbeaee8c1"} Mar 08 04:16:04.154886 master-0 kubenswrapper[18592]: I0308 04:16:04.152020 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-jqtb7" Mar 08 04:16:04.162570 master-0 kubenswrapper[18592]: I0308 04:16:04.162041 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66da2c95-0991-4d14-900f-de90241dd987" path="/var/lib/kubelet/pods/66da2c95-0991-4d14-900f-de90241dd987/volumes" Mar 08 04:16:04.162570 master-0 kubenswrapper[18592]: I0308 04:16:04.162067 18592 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-8cd0-account-create-update-95lhk" Mar 08 04:16:04.163897 master-0 kubenswrapper[18592]: I0308 04:16:04.163027 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-jqtb7" event={"ID":"6e539cde-6478-4bc2-9695-96613a0ef358","Type":"ContainerDied","Data":"e176e8c417fa9a9518b7292fd6283d07d6576fb9df3356bf271c0c950729e7e3"} Mar 08 04:16:04.163897 master-0 kubenswrapper[18592]: I0308 04:16:04.163054 18592 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e176e8c417fa9a9518b7292fd6283d07d6576fb9df3356bf271c0c950729e7e3" Mar 08 04:16:04.163897 master-0 kubenswrapper[18592]: I0308 04:16:04.163063 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-5qfwd" event={"ID":"eed04369-6074-4ce6-9455-f47fe669f97d","Type":"ContainerStarted","Data":"59c8ffcd88e8a8a140fd5c3a5be8be4bfb625f7be86a7d1a0f7622916b5443c0"} Mar 08 04:16:04.163897 master-0 kubenswrapper[18592]: I0308 04:16:04.163075 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-8cd0-account-create-update-95lhk" event={"ID":"fe552f6f-dbcf-4c72-9829-797c7f222d57","Type":"ContainerDied","Data":"c29a1002e956173057f1dedf60e9b24c3b57aa6fa3b50768d2cde90682767a2d"} Mar 08 04:16:04.163897 master-0 kubenswrapper[18592]: I0308 04:16:04.163083 18592 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c29a1002e956173057f1dedf60e9b24c3b57aa6fa3b50768d2cde90682767a2d" Mar 08 04:16:05.178929 master-0 kubenswrapper[18592]: I0308 04:16:05.178848 18592 generic.go:334] "Generic (PLEG): container finished" podID="eed04369-6074-4ce6-9455-f47fe669f97d" containerID="c1f24c7b7642331135e0d3993c516efd5a955c5d4702f432c9fda8e8c1673ae3" exitCode=0 Mar 08 04:16:05.178929 master-0 kubenswrapper[18592]: I0308 04:16:05.178921 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-5qfwd" 
event={"ID":"eed04369-6074-4ce6-9455-f47fe669f97d","Type":"ContainerDied","Data":"c1f24c7b7642331135e0d3993c516efd5a955c5d4702f432c9fda8e8c1673ae3"} Mar 08 04:16:05.276228 master-0 kubenswrapper[18592]: I0308 04:16:05.257183 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-7wxjf"] Mar 08 04:16:05.276228 master-0 kubenswrapper[18592]: E0308 04:16:05.257705 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe552f6f-dbcf-4c72-9829-797c7f222d57" containerName="mariadb-account-create-update" Mar 08 04:16:05.276228 master-0 kubenswrapper[18592]: I0308 04:16:05.257722 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe552f6f-dbcf-4c72-9829-797c7f222d57" containerName="mariadb-account-create-update" Mar 08 04:16:05.276228 master-0 kubenswrapper[18592]: E0308 04:16:05.257754 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e539cde-6478-4bc2-9695-96613a0ef358" containerName="mariadb-database-create" Mar 08 04:16:05.276228 master-0 kubenswrapper[18592]: I0308 04:16:05.257763 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e539cde-6478-4bc2-9695-96613a0ef358" containerName="mariadb-database-create" Mar 08 04:16:05.276228 master-0 kubenswrapper[18592]: I0308 04:16:05.258675 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e539cde-6478-4bc2-9695-96613a0ef358" containerName="mariadb-database-create" Mar 08 04:16:05.276228 master-0 kubenswrapper[18592]: I0308 04:16:05.258714 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe552f6f-dbcf-4c72-9829-797c7f222d57" containerName="mariadb-account-create-update" Mar 08 04:16:05.276228 master-0 kubenswrapper[18592]: I0308 04:16:05.259507 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-7wxjf" Mar 08 04:16:05.276228 master-0 kubenswrapper[18592]: I0308 04:16:05.269238 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-afe2b-config-data" Mar 08 04:16:05.276228 master-0 kubenswrapper[18592]: I0308 04:16:05.271864 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-7wxjf"] Mar 08 04:16:05.342126 master-0 kubenswrapper[18592]: I0308 04:16:05.342076 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce83c2f4-a8a4-4cda-ae8c-3cb1197decca-config-data\") pod \"glance-db-sync-7wxjf\" (UID: \"ce83c2f4-a8a4-4cda-ae8c-3cb1197decca\") " pod="openstack/glance-db-sync-7wxjf" Mar 08 04:16:05.342267 master-0 kubenswrapper[18592]: I0308 04:16:05.342169 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce83c2f4-a8a4-4cda-ae8c-3cb1197decca-combined-ca-bundle\") pod \"glance-db-sync-7wxjf\" (UID: \"ce83c2f4-a8a4-4cda-ae8c-3cb1197decca\") " pod="openstack/glance-db-sync-7wxjf" Mar 08 04:16:05.342343 master-0 kubenswrapper[18592]: I0308 04:16:05.342316 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ce83c2f4-a8a4-4cda-ae8c-3cb1197decca-db-sync-config-data\") pod \"glance-db-sync-7wxjf\" (UID: \"ce83c2f4-a8a4-4cda-ae8c-3cb1197decca\") " pod="openstack/glance-db-sync-7wxjf" Mar 08 04:16:05.342410 master-0 kubenswrapper[18592]: I0308 04:16:05.342383 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brl7d\" (UniqueName: \"kubernetes.io/projected/ce83c2f4-a8a4-4cda-ae8c-3cb1197decca-kube-api-access-brl7d\") pod \"glance-db-sync-7wxjf\" (UID: 
\"ce83c2f4-a8a4-4cda-ae8c-3cb1197decca\") " pod="openstack/glance-db-sync-7wxjf" Mar 08 04:16:05.444558 master-0 kubenswrapper[18592]: I0308 04:16:05.444496 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ce83c2f4-a8a4-4cda-ae8c-3cb1197decca-db-sync-config-data\") pod \"glance-db-sync-7wxjf\" (UID: \"ce83c2f4-a8a4-4cda-ae8c-3cb1197decca\") " pod="openstack/glance-db-sync-7wxjf" Mar 08 04:16:05.444750 master-0 kubenswrapper[18592]: I0308 04:16:05.444576 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brl7d\" (UniqueName: \"kubernetes.io/projected/ce83c2f4-a8a4-4cda-ae8c-3cb1197decca-kube-api-access-brl7d\") pod \"glance-db-sync-7wxjf\" (UID: \"ce83c2f4-a8a4-4cda-ae8c-3cb1197decca\") " pod="openstack/glance-db-sync-7wxjf" Mar 08 04:16:05.444750 master-0 kubenswrapper[18592]: I0308 04:16:05.444653 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce83c2f4-a8a4-4cda-ae8c-3cb1197decca-config-data\") pod \"glance-db-sync-7wxjf\" (UID: \"ce83c2f4-a8a4-4cda-ae8c-3cb1197decca\") " pod="openstack/glance-db-sync-7wxjf" Mar 08 04:16:05.445406 master-0 kubenswrapper[18592]: I0308 04:16:05.445345 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce83c2f4-a8a4-4cda-ae8c-3cb1197decca-combined-ca-bundle\") pod \"glance-db-sync-7wxjf\" (UID: \"ce83c2f4-a8a4-4cda-ae8c-3cb1197decca\") " pod="openstack/glance-db-sync-7wxjf" Mar 08 04:16:05.449723 master-0 kubenswrapper[18592]: I0308 04:16:05.449689 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce83c2f4-a8a4-4cda-ae8c-3cb1197decca-config-data\") pod \"glance-db-sync-7wxjf\" (UID: \"ce83c2f4-a8a4-4cda-ae8c-3cb1197decca\") " 
pod="openstack/glance-db-sync-7wxjf" Mar 08 04:16:05.452849 master-0 kubenswrapper[18592]: I0308 04:16:05.452805 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce83c2f4-a8a4-4cda-ae8c-3cb1197decca-combined-ca-bundle\") pod \"glance-db-sync-7wxjf\" (UID: \"ce83c2f4-a8a4-4cda-ae8c-3cb1197decca\") " pod="openstack/glance-db-sync-7wxjf" Mar 08 04:16:05.457032 master-0 kubenswrapper[18592]: I0308 04:16:05.456999 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ce83c2f4-a8a4-4cda-ae8c-3cb1197decca-db-sync-config-data\") pod \"glance-db-sync-7wxjf\" (UID: \"ce83c2f4-a8a4-4cda-ae8c-3cb1197decca\") " pod="openstack/glance-db-sync-7wxjf" Mar 08 04:16:05.459893 master-0 kubenswrapper[18592]: I0308 04:16:05.459853 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brl7d\" (UniqueName: \"kubernetes.io/projected/ce83c2f4-a8a4-4cda-ae8c-3cb1197decca-kube-api-access-brl7d\") pod \"glance-db-sync-7wxjf\" (UID: \"ce83c2f4-a8a4-4cda-ae8c-3cb1197decca\") " pod="openstack/glance-db-sync-7wxjf" Mar 08 04:16:05.641781 master-0 kubenswrapper[18592]: I0308 04:16:05.641710 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-7wxjf" Mar 08 04:16:06.860548 master-0 kubenswrapper[18592]: I0308 04:16:06.860480 18592 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-5qfwd" Mar 08 04:16:07.024893 master-0 kubenswrapper[18592]: I0308 04:16:07.020853 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eed04369-6074-4ce6-9455-f47fe669f97d-operator-scripts\") pod \"eed04369-6074-4ce6-9455-f47fe669f97d\" (UID: \"eed04369-6074-4ce6-9455-f47fe669f97d\") " Mar 08 04:16:07.024893 master-0 kubenswrapper[18592]: I0308 04:16:07.021069 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vkbqc\" (UniqueName: \"kubernetes.io/projected/eed04369-6074-4ce6-9455-f47fe669f97d-kube-api-access-vkbqc\") pod \"eed04369-6074-4ce6-9455-f47fe669f97d\" (UID: \"eed04369-6074-4ce6-9455-f47fe669f97d\") " Mar 08 04:16:07.028891 master-0 kubenswrapper[18592]: I0308 04:16:07.025037 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eed04369-6074-4ce6-9455-f47fe669f97d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "eed04369-6074-4ce6-9455-f47fe669f97d" (UID: "eed04369-6074-4ce6-9455-f47fe669f97d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:16:07.043948 master-0 kubenswrapper[18592]: I0308 04:16:07.043878 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eed04369-6074-4ce6-9455-f47fe669f97d-kube-api-access-vkbqc" (OuterVolumeSpecName: "kube-api-access-vkbqc") pod "eed04369-6074-4ce6-9455-f47fe669f97d" (UID: "eed04369-6074-4ce6-9455-f47fe669f97d"). InnerVolumeSpecName "kube-api-access-vkbqc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 04:16:07.123268 master-0 kubenswrapper[18592]: I0308 04:16:07.123197 18592 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eed04369-6074-4ce6-9455-f47fe669f97d-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 04:16:07.123268 master-0 kubenswrapper[18592]: I0308 04:16:07.123248 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vkbqc\" (UniqueName: \"kubernetes.io/projected/eed04369-6074-4ce6-9455-f47fe669f97d-kube-api-access-vkbqc\") on node \"master-0\" DevicePath \"\"" Mar 08 04:16:07.213623 master-0 kubenswrapper[18592]: I0308 04:16:07.213574 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-5qfwd" Mar 08 04:16:07.214236 master-0 kubenswrapper[18592]: I0308 04:16:07.213516 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-5qfwd" event={"ID":"eed04369-6074-4ce6-9455-f47fe669f97d","Type":"ContainerDied","Data":"59c8ffcd88e8a8a140fd5c3a5be8be4bfb625f7be86a7d1a0f7622916b5443c0"} Mar 08 04:16:07.214236 master-0 kubenswrapper[18592]: I0308 04:16:07.214233 18592 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="59c8ffcd88e8a8a140fd5c3a5be8be4bfb625f7be86a7d1a0f7622916b5443c0" Mar 08 04:16:07.218189 master-0 kubenswrapper[18592]: I0308 04:16:07.218144 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-hmnb8" event={"ID":"284c1d20-bb45-4e62-9ebe-76fdb2e4fd01","Type":"ContainerStarted","Data":"02841ee2d7211ef38a132b1633d2cdad96a22b9e1af8d72ccb9c10b3f5a4834d"} Mar 08 04:16:07.226584 master-0 kubenswrapper[18592]: I0308 04:16:07.226170 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/2166af23-aec1-40ee-9114-2a0ffa1c7f11-etc-swift\") pod \"swift-storage-0\" (UID: \"2166af23-aec1-40ee-9114-2a0ffa1c7f11\") " pod="openstack/swift-storage-0" Mar 08 04:16:07.226584 master-0 kubenswrapper[18592]: E0308 04:16:07.226393 18592 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 08 04:16:07.226584 master-0 kubenswrapper[18592]: E0308 04:16:07.226411 18592 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 08 04:16:07.226584 master-0 kubenswrapper[18592]: E0308 04:16:07.226456 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2166af23-aec1-40ee-9114-2a0ffa1c7f11-etc-swift podName:2166af23-aec1-40ee-9114-2a0ffa1c7f11 nodeName:}" failed. No retries permitted until 2026-03-08 04:16:15.226440839 +0000 UTC m=+1387.325195199 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/2166af23-aec1-40ee-9114-2a0ffa1c7f11-etc-swift") pod "swift-storage-0" (UID: "2166af23-aec1-40ee-9114-2a0ffa1c7f11") : configmap "swift-ring-files" not found Mar 08 04:16:07.250483 master-0 kubenswrapper[18592]: I0308 04:16:07.250384 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-hmnb8" podStartSLOduration=1.728827785 podStartE2EDuration="5.250363326s" podCreationTimestamp="2026-03-08 04:16:02 +0000 UTC" firstStartedPulling="2026-03-08 04:16:03.234878616 +0000 UTC m=+1375.333632966" lastFinishedPulling="2026-03-08 04:16:06.756414147 +0000 UTC m=+1378.855168507" observedRunningTime="2026-03-08 04:16:07.237071741 +0000 UTC m=+1379.335826091" watchObservedRunningTime="2026-03-08 04:16:07.250363326 +0000 UTC m=+1379.349117676" Mar 08 04:16:07.306229 master-0 kubenswrapper[18592]: I0308 04:16:07.306175 18592 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/glance-db-sync-7wxjf"] Mar 08 04:16:07.526223 master-0 kubenswrapper[18592]: I0308 04:16:07.526065 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b9cd4dcf7-ln4zc" Mar 08 04:16:07.649892 master-0 kubenswrapper[18592]: I0308 04:16:07.636462 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-689ddcfcf7-jh95m"] Mar 08 04:16:07.649892 master-0 kubenswrapper[18592]: I0308 04:16:07.636734 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-689ddcfcf7-jh95m" podUID="ff3c3806-caca-42ef-993a-85bc4e509640" containerName="dnsmasq-dns" containerID="cri-o://9bf4910b9e2bde3ae21cefb3f6bf1431a41853ad07a96b9bec2ca95e4593ff20" gracePeriod=10 Mar 08 04:16:08.263274 master-0 kubenswrapper[18592]: I0308 04:16:08.263194 18592 generic.go:334] "Generic (PLEG): container finished" podID="ff3c3806-caca-42ef-993a-85bc4e509640" containerID="9bf4910b9e2bde3ae21cefb3f6bf1431a41853ad07a96b9bec2ca95e4593ff20" exitCode=0 Mar 08 04:16:08.263274 master-0 kubenswrapper[18592]: I0308 04:16:08.263229 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-689ddcfcf7-jh95m" event={"ID":"ff3c3806-caca-42ef-993a-85bc4e509640","Type":"ContainerDied","Data":"9bf4910b9e2bde3ae21cefb3f6bf1431a41853ad07a96b9bec2ca95e4593ff20"} Mar 08 04:16:08.265881 master-0 kubenswrapper[18592]: I0308 04:16:08.265814 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-7wxjf" event={"ID":"ce83c2f4-a8a4-4cda-ae8c-3cb1197decca","Type":"ContainerStarted","Data":"8954344687b37d4d621e1ec7c090371aa16968cff24bae025b7d45a689e1a7f1"} Mar 08 04:16:08.498112 master-0 kubenswrapper[18592]: I0308 04:16:08.498068 18592 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-689ddcfcf7-jh95m"
Mar 08 04:16:08.572932 master-0 kubenswrapper[18592]: I0308 04:16:08.572803 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfnc6\" (UniqueName: \"kubernetes.io/projected/ff3c3806-caca-42ef-993a-85bc4e509640-kube-api-access-xfnc6\") pod \"ff3c3806-caca-42ef-993a-85bc4e509640\" (UID: \"ff3c3806-caca-42ef-993a-85bc4e509640\") "
Mar 08 04:16:08.573255 master-0 kubenswrapper[18592]: I0308 04:16:08.572899 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff3c3806-caca-42ef-993a-85bc4e509640-dns-svc\") pod \"ff3c3806-caca-42ef-993a-85bc4e509640\" (UID: \"ff3c3806-caca-42ef-993a-85bc4e509640\") "
Mar 08 04:16:08.573416 master-0 kubenswrapper[18592]: I0308 04:16:08.573358 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff3c3806-caca-42ef-993a-85bc4e509640-ovsdbserver-nb\") pod \"ff3c3806-caca-42ef-993a-85bc4e509640\" (UID: \"ff3c3806-caca-42ef-993a-85bc4e509640\") "
Mar 08 04:16:08.573653 master-0 kubenswrapper[18592]: I0308 04:16:08.573608 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff3c3806-caca-42ef-993a-85bc4e509640-config\") pod \"ff3c3806-caca-42ef-993a-85bc4e509640\" (UID: \"ff3c3806-caca-42ef-993a-85bc4e509640\") "
Mar 08 04:16:08.573808 master-0 kubenswrapper[18592]: I0308 04:16:08.573754 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff3c3806-caca-42ef-993a-85bc4e509640-ovsdbserver-sb\") pod \"ff3c3806-caca-42ef-993a-85bc4e509640\" (UID: \"ff3c3806-caca-42ef-993a-85bc4e509640\") "
Mar 08 04:16:08.578073 master-0 kubenswrapper[18592]: I0308 04:16:08.577949 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff3c3806-caca-42ef-993a-85bc4e509640-kube-api-access-xfnc6" (OuterVolumeSpecName: "kube-api-access-xfnc6") pod "ff3c3806-caca-42ef-993a-85bc4e509640" (UID: "ff3c3806-caca-42ef-993a-85bc4e509640"). InnerVolumeSpecName "kube-api-access-xfnc6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 04:16:08.663713 master-0 kubenswrapper[18592]: I0308 04:16:08.663657 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff3c3806-caca-42ef-993a-85bc4e509640-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ff3c3806-caca-42ef-993a-85bc4e509640" (UID: "ff3c3806-caca-42ef-993a-85bc4e509640"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 04:16:08.671275 master-0 kubenswrapper[18592]: I0308 04:16:08.671223 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff3c3806-caca-42ef-993a-85bc4e509640-config" (OuterVolumeSpecName: "config") pod "ff3c3806-caca-42ef-993a-85bc4e509640" (UID: "ff3c3806-caca-42ef-993a-85bc4e509640"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 04:16:08.678391 master-0 kubenswrapper[18592]: I0308 04:16:08.678345 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfnc6\" (UniqueName: \"kubernetes.io/projected/ff3c3806-caca-42ef-993a-85bc4e509640-kube-api-access-xfnc6\") on node \"master-0\" DevicePath \"\""
Mar 08 04:16:08.678391 master-0 kubenswrapper[18592]: I0308 04:16:08.678380 18592 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff3c3806-caca-42ef-993a-85bc4e509640-dns-svc\") on node \"master-0\" DevicePath \"\""
Mar 08 04:16:08.678391 master-0 kubenswrapper[18592]: I0308 04:16:08.678391 18592 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff3c3806-caca-42ef-993a-85bc4e509640-config\") on node \"master-0\" DevicePath \"\""
Mar 08 04:16:08.678635 master-0 kubenswrapper[18592]: I0308 04:16:08.678606 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff3c3806-caca-42ef-993a-85bc4e509640-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ff3c3806-caca-42ef-993a-85bc4e509640" (UID: "ff3c3806-caca-42ef-993a-85bc4e509640"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 04:16:08.692285 master-0 kubenswrapper[18592]: I0308 04:16:08.692225 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff3c3806-caca-42ef-993a-85bc4e509640-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ff3c3806-caca-42ef-993a-85bc4e509640" (UID: "ff3c3806-caca-42ef-993a-85bc4e509640"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 04:16:08.783053 master-0 kubenswrapper[18592]: I0308 04:16:08.783014 18592 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff3c3806-caca-42ef-993a-85bc4e509640-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\""
Mar 08 04:16:08.783053 master-0 kubenswrapper[18592]: I0308 04:16:08.783049 18592 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff3c3806-caca-42ef-993a-85bc4e509640-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\""
Mar 08 04:16:09.197576 master-0 kubenswrapper[18592]: I0308 04:16:09.197542 18592 trace.go:236] Trace[1207180058]: "Calculate volume metrics of mysql-db for pod openstack/openstack-cell1-galera-0" (08-Mar-2026 04:16:08.132) (total time: 1065ms):
Mar 08 04:16:09.197576 master-0 kubenswrapper[18592]: Trace[1207180058]: [1.065105091s] [1.065105091s] END
Mar 08 04:16:09.285404 master-0 kubenswrapper[18592]: I0308 04:16:09.285347 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-689ddcfcf7-jh95m" event={"ID":"ff3c3806-caca-42ef-993a-85bc4e509640","Type":"ContainerDied","Data":"5c89f0b76ab9802411596ba2ae041f5d4c44edb91c44b252892bdfcb11170dce"}
Mar 08 04:16:09.285404 master-0 kubenswrapper[18592]: I0308 04:16:09.285413 18592 scope.go:117] "RemoveContainer" containerID="9bf4910b9e2bde3ae21cefb3f6bf1431a41853ad07a96b9bec2ca95e4593ff20"
Mar 08 04:16:09.285932 master-0 kubenswrapper[18592]: I0308 04:16:09.285446 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-689ddcfcf7-jh95m"
Mar 08 04:16:09.313966 master-0 kubenswrapper[18592]: I0308 04:16:09.313470 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0"
Mar 08 04:16:09.320103 master-0 kubenswrapper[18592]: I0308 04:16:09.318868 18592 scope.go:117] "RemoveContainer" containerID="6fa67719324bab80394201b5cf7e864c9c6f746b3a1ad91fae10fe49168214c1"
Mar 08 04:16:09.523550 master-0 kubenswrapper[18592]: I0308 04:16:09.521835 18592 trace.go:236] Trace[773702944]: "Calculate volume metrics of persistence for pod openstack/rabbitmq-cell1-server-0" (08-Mar-2026 04:16:08.137) (total time: 1384ms):
Mar 08 04:16:09.523550 master-0 kubenswrapper[18592]: Trace[773702944]: [1.384550323s] [1.384550323s] END
Mar 08 04:16:09.554643 master-0 kubenswrapper[18592]: I0308 04:16:09.554571 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-689ddcfcf7-jh95m"]
Mar 08 04:16:09.567922 master-0 kubenswrapper[18592]: I0308 04:16:09.567705 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-689ddcfcf7-jh95m"]
Mar 08 04:16:10.160264 master-0 kubenswrapper[18592]: I0308 04:16:10.160210 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff3c3806-caca-42ef-993a-85bc4e509640" path="/var/lib/kubelet/pods/ff3c3806-caca-42ef-993a-85bc4e509640/volumes"
Mar 08 04:16:10.213294 master-0 kubenswrapper[18592]: I0308 04:16:10.213085 18592 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-hkfg8" podUID="bf9606de-37b2-4dc9-8b6b-adfb1d81f6d1" containerName="ovn-controller" probeResult="failure" output=<
Mar 08 04:16:10.213294 master-0 kubenswrapper[18592]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Mar 08 04:16:10.213294 master-0 kubenswrapper[18592]: >
Mar 08 04:16:13.334355 master-0 kubenswrapper[18592]: I0308 04:16:13.334248 18592 generic.go:334] "Generic (PLEG): container finished" podID="6cc32df9-dcb4-43f3-b78d-f992b0488bf1" containerID="f547e9071c3e4ece9adb6ee184ffdb6d1ee7719b88141eda4fb0b5e8b1fdc8bb" exitCode=0
Mar 08 04:16:13.335086 master-0 kubenswrapper[18592]: I0308 04:16:13.334342 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6cc32df9-dcb4-43f3-b78d-f992b0488bf1","Type":"ContainerDied","Data":"f547e9071c3e4ece9adb6ee184ffdb6d1ee7719b88141eda4fb0b5e8b1fdc8bb"}
Mar 08 04:16:14.361910 master-0 kubenswrapper[18592]: I0308 04:16:14.361809 18592 generic.go:334] "Generic (PLEG): container finished" podID="466c2b13-2b27-4a83-911c-db97d66490a5" containerID="3547d58beca639c1dd05442bb3ec63e90c02312aa9592cb67cc9bcbf9a7f2eaa" exitCode=0
Mar 08 04:16:14.362397 master-0 kubenswrapper[18592]: I0308 04:16:14.361950 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"466c2b13-2b27-4a83-911c-db97d66490a5","Type":"ContainerDied","Data":"3547d58beca639c1dd05442bb3ec63e90c02312aa9592cb67cc9bcbf9a7f2eaa"}
Mar 08 04:16:14.375901 master-0 kubenswrapper[18592]: I0308 04:16:14.368213 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6cc32df9-dcb4-43f3-b78d-f992b0488bf1","Type":"ContainerStarted","Data":"042cc0dbca7a10783af8e1778fff0ed87dde4a3a033a79c73b42d9ecef24b9b6"}
Mar 08 04:16:14.375901 master-0 kubenswrapper[18592]: I0308 04:16:14.369760 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Mar 08 04:16:14.375901 master-0 kubenswrapper[18592]: I0308 04:16:14.372449 18592 generic.go:334] "Generic (PLEG): container finished" podID="284c1d20-bb45-4e62-9ebe-76fdb2e4fd01" containerID="02841ee2d7211ef38a132b1633d2cdad96a22b9e1af8d72ccb9c10b3f5a4834d" exitCode=0
Mar 08 04:16:14.375901 master-0 kubenswrapper[18592]: I0308 04:16:14.372474 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-hmnb8" event={"ID":"284c1d20-bb45-4e62-9ebe-76fdb2e4fd01","Type":"ContainerDied","Data":"02841ee2d7211ef38a132b1633d2cdad96a22b9e1af8d72ccb9c10b3f5a4834d"}
Mar 08 04:16:14.441699 master-0 kubenswrapper[18592]: I0308 04:16:14.441528 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=50.674338524 podStartE2EDuration="1m3.441507009s" podCreationTimestamp="2026-03-08 04:15:11 +0000 UTC" firstStartedPulling="2026-03-08 04:15:27.055122272 +0000 UTC m=+1339.153876622" lastFinishedPulling="2026-03-08 04:15:39.822290757 +0000 UTC m=+1351.921045107" observedRunningTime="2026-03-08 04:16:14.424158633 +0000 UTC m=+1386.522912983" watchObservedRunningTime="2026-03-08 04:16:14.441507009 +0000 UTC m=+1386.540261369"
Mar 08 04:16:15.209246 master-0 kubenswrapper[18592]: I0308 04:16:15.209171 18592 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-hkfg8" podUID="bf9606de-37b2-4dc9-8b6b-adfb1d81f6d1" containerName="ovn-controller" probeResult="failure" output=<
Mar 08 04:16:15.209246 master-0 kubenswrapper[18592]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Mar 08 04:16:15.209246 master-0 kubenswrapper[18592]: >
Mar 08 04:16:15.261878 master-0 kubenswrapper[18592]: I0308 04:16:15.261696 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2166af23-aec1-40ee-9114-2a0ffa1c7f11-etc-swift\") pod \"swift-storage-0\" (UID: \"2166af23-aec1-40ee-9114-2a0ffa1c7f11\") " pod="openstack/swift-storage-0"
Mar 08 04:16:15.267247 master-0 kubenswrapper[18592]: I0308 04:16:15.266898 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2166af23-aec1-40ee-9114-2a0ffa1c7f11-etc-swift\") pod \"swift-storage-0\" (UID: \"2166af23-aec1-40ee-9114-2a0ffa1c7f11\") " pod="openstack/swift-storage-0"
Mar 08 04:16:15.269652 master-0 kubenswrapper[18592]: I0308 04:16:15.269616 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-2lfv5"
Mar 08 04:16:15.279070 master-0 kubenswrapper[18592]: I0308 04:16:15.278577 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-2lfv5"
Mar 08 04:16:15.307170 master-0 kubenswrapper[18592]: I0308 04:16:15.307128 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Mar 08 04:16:15.659220 master-0 kubenswrapper[18592]: I0308 04:16:15.659181 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-hkfg8-config-g5h82"]
Mar 08 04:16:15.660108 master-0 kubenswrapper[18592]: E0308 04:16:15.660092 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff3c3806-caca-42ef-993a-85bc4e509640" containerName="init"
Mar 08 04:16:15.660182 master-0 kubenswrapper[18592]: I0308 04:16:15.660172 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff3c3806-caca-42ef-993a-85bc4e509640" containerName="init"
Mar 08 04:16:15.660317 master-0 kubenswrapper[18592]: E0308 04:16:15.660306 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eed04369-6074-4ce6-9455-f47fe669f97d" containerName="mariadb-account-create-update"
Mar 08 04:16:15.660379 master-0 kubenswrapper[18592]: I0308 04:16:15.660369 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="eed04369-6074-4ce6-9455-f47fe669f97d" containerName="mariadb-account-create-update"
Mar 08 04:16:15.660450 master-0 kubenswrapper[18592]: E0308 04:16:15.660440 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff3c3806-caca-42ef-993a-85bc4e509640" containerName="dnsmasq-dns"
Mar 08 04:16:15.660513 master-0 kubenswrapper[18592]: I0308 04:16:15.660504 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff3c3806-caca-42ef-993a-85bc4e509640" containerName="dnsmasq-dns"
Mar 08 04:16:15.660910 master-0 kubenswrapper[18592]: I0308 04:16:15.660897 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="eed04369-6074-4ce6-9455-f47fe669f97d" containerName="mariadb-account-create-update"
Mar 08 04:16:15.660987 master-0 kubenswrapper[18592]: I0308 04:16:15.660976 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff3c3806-caca-42ef-993a-85bc4e509640" containerName="dnsmasq-dns"
Mar 08 04:16:15.661703 master-0 kubenswrapper[18592]: I0308 04:16:15.661663 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-hkfg8-config-g5h82"
Mar 08 04:16:15.664079 master-0 kubenswrapper[18592]: I0308 04:16:15.663965 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts"
Mar 08 04:16:15.688603 master-0 kubenswrapper[18592]: I0308 04:16:15.688542 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-hkfg8-config-g5h82"]
Mar 08 04:16:15.780126 master-0 kubenswrapper[18592]: I0308 04:16:15.779382 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2bfe305c-4206-4ca8-b813-57690920c7a7-var-run-ovn\") pod \"ovn-controller-hkfg8-config-g5h82\" (UID: \"2bfe305c-4206-4ca8-b813-57690920c7a7\") " pod="openstack/ovn-controller-hkfg8-config-g5h82"
Mar 08 04:16:15.780126 master-0 kubenswrapper[18592]: I0308 04:16:15.779457 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/2bfe305c-4206-4ca8-b813-57690920c7a7-additional-scripts\") pod \"ovn-controller-hkfg8-config-g5h82\" (UID: \"2bfe305c-4206-4ca8-b813-57690920c7a7\") " pod="openstack/ovn-controller-hkfg8-config-g5h82"
Mar 08 04:16:15.780126 master-0 kubenswrapper[18592]: I0308 04:16:15.779500 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2bfe305c-4206-4ca8-b813-57690920c7a7-var-run\") pod \"ovn-controller-hkfg8-config-g5h82\" (UID: \"2bfe305c-4206-4ca8-b813-57690920c7a7\") " pod="openstack/ovn-controller-hkfg8-config-g5h82"
Mar 08 04:16:15.780126 master-0 kubenswrapper[18592]: I0308 04:16:15.779523 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2bfe305c-4206-4ca8-b813-57690920c7a7-var-log-ovn\") pod \"ovn-controller-hkfg8-config-g5h82\" (UID: \"2bfe305c-4206-4ca8-b813-57690920c7a7\") " pod="openstack/ovn-controller-hkfg8-config-g5h82"
Mar 08 04:16:15.780126 master-0 kubenswrapper[18592]: I0308 04:16:15.779559 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8dtl\" (UniqueName: \"kubernetes.io/projected/2bfe305c-4206-4ca8-b813-57690920c7a7-kube-api-access-c8dtl\") pod \"ovn-controller-hkfg8-config-g5h82\" (UID: \"2bfe305c-4206-4ca8-b813-57690920c7a7\") " pod="openstack/ovn-controller-hkfg8-config-g5h82"
Mar 08 04:16:15.780126 master-0 kubenswrapper[18592]: I0308 04:16:15.779629 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2bfe305c-4206-4ca8-b813-57690920c7a7-scripts\") pod \"ovn-controller-hkfg8-config-g5h82\" (UID: \"2bfe305c-4206-4ca8-b813-57690920c7a7\") " pod="openstack/ovn-controller-hkfg8-config-g5h82"
Mar 08 04:16:15.881475 master-0 kubenswrapper[18592]: I0308 04:16:15.881422 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2bfe305c-4206-4ca8-b813-57690920c7a7-scripts\") pod \"ovn-controller-hkfg8-config-g5h82\" (UID: \"2bfe305c-4206-4ca8-b813-57690920c7a7\") " pod="openstack/ovn-controller-hkfg8-config-g5h82"
Mar 08 04:16:15.881786 master-0 kubenswrapper[18592]: I0308 04:16:15.881764 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2bfe305c-4206-4ca8-b813-57690920c7a7-var-run-ovn\") pod \"ovn-controller-hkfg8-config-g5h82\" (UID: \"2bfe305c-4206-4ca8-b813-57690920c7a7\") " pod="openstack/ovn-controller-hkfg8-config-g5h82"
Mar 08 04:16:15.883207 master-0 kubenswrapper[18592]: I0308 04:16:15.883182 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/2bfe305c-4206-4ca8-b813-57690920c7a7-additional-scripts\") pod \"ovn-controller-hkfg8-config-g5h82\" (UID: \"2bfe305c-4206-4ca8-b813-57690920c7a7\") " pod="openstack/ovn-controller-hkfg8-config-g5h82"
Mar 08 04:16:15.883390 master-0 kubenswrapper[18592]: I0308 04:16:15.883367 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2bfe305c-4206-4ca8-b813-57690920c7a7-var-run\") pod \"ovn-controller-hkfg8-config-g5h82\" (UID: \"2bfe305c-4206-4ca8-b813-57690920c7a7\") " pod="openstack/ovn-controller-hkfg8-config-g5h82"
Mar 08 04:16:15.883515 master-0 kubenswrapper[18592]: I0308 04:16:15.883498 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2bfe305c-4206-4ca8-b813-57690920c7a7-var-log-ovn\") pod \"ovn-controller-hkfg8-config-g5h82\" (UID: \"2bfe305c-4206-4ca8-b813-57690920c7a7\") " pod="openstack/ovn-controller-hkfg8-config-g5h82"
Mar 08 04:16:15.883651 master-0 kubenswrapper[18592]: I0308 04:16:15.883632 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8dtl\" (UniqueName: \"kubernetes.io/projected/2bfe305c-4206-4ca8-b813-57690920c7a7-kube-api-access-c8dtl\") pod \"ovn-controller-hkfg8-config-g5h82\" (UID: \"2bfe305c-4206-4ca8-b813-57690920c7a7\") " pod="openstack/ovn-controller-hkfg8-config-g5h82"
Mar 08 04:16:15.883883 master-0 kubenswrapper[18592]: I0308 04:16:15.883845 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/2bfe305c-4206-4ca8-b813-57690920c7a7-additional-scripts\") pod \"ovn-controller-hkfg8-config-g5h82\" (UID: \"2bfe305c-4206-4ca8-b813-57690920c7a7\") " pod="openstack/ovn-controller-hkfg8-config-g5h82"
Mar 08 04:16:15.883964 master-0 kubenswrapper[18592]: I0308 04:16:15.883951 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2bfe305c-4206-4ca8-b813-57690920c7a7-var-run\") pod \"ovn-controller-hkfg8-config-g5h82\" (UID: \"2bfe305c-4206-4ca8-b813-57690920c7a7\") " pod="openstack/ovn-controller-hkfg8-config-g5h82"
Mar 08 04:16:15.884014 master-0 kubenswrapper[18592]: I0308 04:16:15.883208 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2bfe305c-4206-4ca8-b813-57690920c7a7-var-run-ovn\") pod \"ovn-controller-hkfg8-config-g5h82\" (UID: \"2bfe305c-4206-4ca8-b813-57690920c7a7\") " pod="openstack/ovn-controller-hkfg8-config-g5h82"
Mar 08 04:16:15.884014 master-0 kubenswrapper[18592]: I0308 04:16:15.884008 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2bfe305c-4206-4ca8-b813-57690920c7a7-var-log-ovn\") pod \"ovn-controller-hkfg8-config-g5h82\" (UID: \"2bfe305c-4206-4ca8-b813-57690920c7a7\") " pod="openstack/ovn-controller-hkfg8-config-g5h82"
Mar 08 04:16:15.885773 master-0 kubenswrapper[18592]: I0308 04:16:15.885723 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2bfe305c-4206-4ca8-b813-57690920c7a7-scripts\") pod \"ovn-controller-hkfg8-config-g5h82\" (UID: \"2bfe305c-4206-4ca8-b813-57690920c7a7\") " pod="openstack/ovn-controller-hkfg8-config-g5h82"
Mar 08 04:16:15.899552 master-0 kubenswrapper[18592]: I0308 04:16:15.899514 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8dtl\" (UniqueName: \"kubernetes.io/projected/2bfe305c-4206-4ca8-b813-57690920c7a7-kube-api-access-c8dtl\") pod \"ovn-controller-hkfg8-config-g5h82\" (UID: \"2bfe305c-4206-4ca8-b813-57690920c7a7\") " pod="openstack/ovn-controller-hkfg8-config-g5h82"
Mar 08 04:16:16.005451 master-0 kubenswrapper[18592]: I0308 04:16:16.005395 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-hkfg8-config-g5h82"
Mar 08 04:16:20.213292 master-0 kubenswrapper[18592]: I0308 04:16:20.212625 18592 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-hkfg8" podUID="bf9606de-37b2-4dc9-8b6b-adfb1d81f6d1" containerName="ovn-controller" probeResult="failure" output=<
Mar 08 04:16:20.213292 master-0 kubenswrapper[18592]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Mar 08 04:16:20.213292 master-0 kubenswrapper[18592]: >
Mar 08 04:16:22.144863 master-0 kubenswrapper[18592]: I0308 04:16:22.143118 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-hmnb8"
Mar 08 04:16:22.344971 master-0 kubenswrapper[18592]: I0308 04:16:22.342951 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/284c1d20-bb45-4e62-9ebe-76fdb2e4fd01-swiftconf\") pod \"284c1d20-bb45-4e62-9ebe-76fdb2e4fd01\" (UID: \"284c1d20-bb45-4e62-9ebe-76fdb2e4fd01\") "
Mar 08 04:16:22.345186 master-0 kubenswrapper[18592]: I0308 04:16:22.344983 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/284c1d20-bb45-4e62-9ebe-76fdb2e4fd01-etc-swift\") pod \"284c1d20-bb45-4e62-9ebe-76fdb2e4fd01\" (UID: \"284c1d20-bb45-4e62-9ebe-76fdb2e4fd01\") "
Mar 08 04:16:22.345186 master-0 kubenswrapper[18592]: I0308 04:16:22.345042 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhs52\" (UniqueName: \"kubernetes.io/projected/284c1d20-bb45-4e62-9ebe-76fdb2e4fd01-kube-api-access-nhs52\") pod \"284c1d20-bb45-4e62-9ebe-76fdb2e4fd01\" (UID: \"284c1d20-bb45-4e62-9ebe-76fdb2e4fd01\") "
Mar 08 04:16:22.345186 master-0 kubenswrapper[18592]: I0308 04:16:22.345092 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/284c1d20-bb45-4e62-9ebe-76fdb2e4fd01-dispersionconf\") pod \"284c1d20-bb45-4e62-9ebe-76fdb2e4fd01\" (UID: \"284c1d20-bb45-4e62-9ebe-76fdb2e4fd01\") "
Mar 08 04:16:22.345186 master-0 kubenswrapper[18592]: I0308 04:16:22.345112 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/284c1d20-bb45-4e62-9ebe-76fdb2e4fd01-scripts\") pod \"284c1d20-bb45-4e62-9ebe-76fdb2e4fd01\" (UID: \"284c1d20-bb45-4e62-9ebe-76fdb2e4fd01\") "
Mar 08 04:16:22.345186 master-0 kubenswrapper[18592]: I0308 04:16:22.345181 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/284c1d20-bb45-4e62-9ebe-76fdb2e4fd01-ring-data-devices\") pod \"284c1d20-bb45-4e62-9ebe-76fdb2e4fd01\" (UID: \"284c1d20-bb45-4e62-9ebe-76fdb2e4fd01\") "
Mar 08 04:16:22.347926 master-0 kubenswrapper[18592]: I0308 04:16:22.347690 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/284c1d20-bb45-4e62-9ebe-76fdb2e4fd01-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "284c1d20-bb45-4e62-9ebe-76fdb2e4fd01" (UID: "284c1d20-bb45-4e62-9ebe-76fdb2e4fd01"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 04:16:22.348326 master-0 kubenswrapper[18592]: I0308 04:16:22.347869 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/284c1d20-bb45-4e62-9ebe-76fdb2e4fd01-combined-ca-bundle\") pod \"284c1d20-bb45-4e62-9ebe-76fdb2e4fd01\" (UID: \"284c1d20-bb45-4e62-9ebe-76fdb2e4fd01\") "
Mar 08 04:16:22.348326 master-0 kubenswrapper[18592]: I0308 04:16:22.347843 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/284c1d20-bb45-4e62-9ebe-76fdb2e4fd01-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "284c1d20-bb45-4e62-9ebe-76fdb2e4fd01" (UID: "284c1d20-bb45-4e62-9ebe-76fdb2e4fd01"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 04:16:22.350706 master-0 kubenswrapper[18592]: I0308 04:16:22.350182 18592 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/284c1d20-bb45-4e62-9ebe-76fdb2e4fd01-ring-data-devices\") on node \"master-0\" DevicePath \"\""
Mar 08 04:16:22.350706 master-0 kubenswrapper[18592]: I0308 04:16:22.350265 18592 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/284c1d20-bb45-4e62-9ebe-76fdb2e4fd01-etc-swift\") on node \"master-0\" DevicePath \"\""
Mar 08 04:16:22.359292 master-0 kubenswrapper[18592]: I0308 04:16:22.359194 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/284c1d20-bb45-4e62-9ebe-76fdb2e4fd01-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "284c1d20-bb45-4e62-9ebe-76fdb2e4fd01" (UID: "284c1d20-bb45-4e62-9ebe-76fdb2e4fd01"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 04:16:22.359292 master-0 kubenswrapper[18592]: I0308 04:16:22.359185 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/284c1d20-bb45-4e62-9ebe-76fdb2e4fd01-kube-api-access-nhs52" (OuterVolumeSpecName: "kube-api-access-nhs52") pod "284c1d20-bb45-4e62-9ebe-76fdb2e4fd01" (UID: "284c1d20-bb45-4e62-9ebe-76fdb2e4fd01"). InnerVolumeSpecName "kube-api-access-nhs52". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 04:16:22.384616 master-0 kubenswrapper[18592]: I0308 04:16:22.384447 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/284c1d20-bb45-4e62-9ebe-76fdb2e4fd01-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "284c1d20-bb45-4e62-9ebe-76fdb2e4fd01" (UID: "284c1d20-bb45-4e62-9ebe-76fdb2e4fd01"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 04:16:22.391393 master-0 kubenswrapper[18592]: I0308 04:16:22.391350 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/284c1d20-bb45-4e62-9ebe-76fdb2e4fd01-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "284c1d20-bb45-4e62-9ebe-76fdb2e4fd01" (UID: "284c1d20-bb45-4e62-9ebe-76fdb2e4fd01"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 04:16:22.395005 master-0 kubenswrapper[18592]: I0308 04:16:22.394958 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/284c1d20-bb45-4e62-9ebe-76fdb2e4fd01-scripts" (OuterVolumeSpecName: "scripts") pod "284c1d20-bb45-4e62-9ebe-76fdb2e4fd01" (UID: "284c1d20-bb45-4e62-9ebe-76fdb2e4fd01"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 04:16:22.426032 master-0 kubenswrapper[18592]: I0308 04:16:22.425983 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-hkfg8-config-g5h82"]
Mar 08 04:16:22.451420 master-0 kubenswrapper[18592]: I0308 04:16:22.451375 18592 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/284c1d20-bb45-4e62-9ebe-76fdb2e4fd01-dispersionconf\") on node \"master-0\" DevicePath \"\""
Mar 08 04:16:22.451420 master-0 kubenswrapper[18592]: I0308 04:16:22.451410 18592 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/284c1d20-bb45-4e62-9ebe-76fdb2e4fd01-scripts\") on node \"master-0\" DevicePath \"\""
Mar 08 04:16:22.451420 master-0 kubenswrapper[18592]: I0308 04:16:22.451420 18592 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/284c1d20-bb45-4e62-9ebe-76fdb2e4fd01-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 08 04:16:22.451634 master-0 kubenswrapper[18592]: I0308 04:16:22.451429 18592 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/284c1d20-bb45-4e62-9ebe-76fdb2e4fd01-swiftconf\") on node \"master-0\" DevicePath \"\""
Mar 08 04:16:22.451634 master-0 kubenswrapper[18592]: I0308 04:16:22.451441 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhs52\" (UniqueName: \"kubernetes.io/projected/284c1d20-bb45-4e62-9ebe-76fdb2e4fd01-kube-api-access-nhs52\") on node \"master-0\" DevicePath \"\""
Mar 08 04:16:22.495386 master-0 kubenswrapper[18592]: I0308 04:16:22.495276 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-hmnb8" event={"ID":"284c1d20-bb45-4e62-9ebe-76fdb2e4fd01","Type":"ContainerDied","Data":"f2190d44cdc5ff947757ef8157a735170fb60211ca63ac0e7954a99fbeaee8c1"}
Mar 08 04:16:22.495545 master-0 kubenswrapper[18592]: I0308 04:16:22.495526 18592 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2190d44cdc5ff947757ef8157a735170fb60211ca63ac0e7954a99fbeaee8c1"
Mar 08 04:16:22.495636 master-0 kubenswrapper[18592]: I0308 04:16:22.495319 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-hmnb8"
Mar 08 04:16:22.499809 master-0 kubenswrapper[18592]: I0308 04:16:22.499766 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"466c2b13-2b27-4a83-911c-db97d66490a5","Type":"ContainerStarted","Data":"9f6eeba09ef2af029726b9fc80a494b1124cecb07f1647e1475657bef5ef9be7"}
Mar 08 04:16:22.500312 master-0 kubenswrapper[18592]: I0308 04:16:22.500273 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Mar 08 04:16:22.501324 master-0 kubenswrapper[18592]: I0308 04:16:22.501277 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hkfg8-config-g5h82" event={"ID":"2bfe305c-4206-4ca8-b813-57690920c7a7","Type":"ContainerStarted","Data":"894eb5d8c9d44ad92bf5f37bba0220222150c37d680309c6b690566cb47e032a"}
Mar 08 04:16:22.550927 master-0 kubenswrapper[18592]: I0308 04:16:22.550497 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=58.776067881 podStartE2EDuration="1m11.550478375s" podCreationTimestamp="2026-03-08 04:15:11 +0000 UTC" firstStartedPulling="2026-03-08 04:15:27.047882323 +0000 UTC m=+1339.146636673" lastFinishedPulling="2026-03-08 04:15:39.822292797 +0000 UTC m=+1351.921047167" observedRunningTime="2026-03-08 04:16:22.524015157 +0000 UTC m=+1394.622769517" watchObservedRunningTime="2026-03-08 04:16:22.550478375 +0000 UTC m=+1394.649232725"
Mar 08 04:16:22.583902 master-0 kubenswrapper[18592]: I0308 04:16:22.583779 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Mar 08 04:16:23.517793 master-0 kubenswrapper[18592]: I0308 04:16:23.517657 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2166af23-aec1-40ee-9114-2a0ffa1c7f11","Type":"ContainerStarted","Data":"7714cd50f8dfd275293015fa12f24041ffd47c58cf3215a9952dba94ecdd8284"}
Mar 08 04:16:23.520257 master-0 kubenswrapper[18592]: I0308 04:16:23.520223 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-7wxjf" event={"ID":"ce83c2f4-a8a4-4cda-ae8c-3cb1197decca","Type":"ContainerStarted","Data":"82a5ba7a653d1cea9eca76348be272af00817196df0f66112fccb1c20e585194"}
Mar 08 04:16:23.522264 master-0 kubenswrapper[18592]: I0308 04:16:23.521981 18592 generic.go:334] "Generic (PLEG): container finished" podID="2bfe305c-4206-4ca8-b813-57690920c7a7" containerID="714c452b7ac9c59920d86a7a152becd45938f5747a5e5273faeb7e0a69bf031d" exitCode=0
Mar 08 04:16:23.522704 master-0 kubenswrapper[18592]: I0308 04:16:23.522619 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hkfg8-config-g5h82" event={"ID":"2bfe305c-4206-4ca8-b813-57690920c7a7","Type":"ContainerDied","Data":"714c452b7ac9c59920d86a7a152becd45938f5747a5e5273faeb7e0a69bf031d"}
Mar 08 04:16:23.551869 master-0 kubenswrapper[18592]: I0308 04:16:23.551526 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-7wxjf" podStartSLOduration=3.846036426 podStartE2EDuration="18.551494166s" podCreationTimestamp="2026-03-08 04:16:05 +0000 UTC" firstStartedPulling="2026-03-08 04:16:07.302690775 +0000 UTC m=+1379.401445165" lastFinishedPulling="2026-03-08 04:16:22.008148555 +0000 UTC m=+1394.106902905" observedRunningTime="2026-03-08 04:16:23.541942637 +0000 UTC m=+1395.640696987" watchObservedRunningTime="2026-03-08 04:16:23.551494166 +0000 UTC m=+1395.650248526"
Mar 08 04:16:24.536004 master-0 kubenswrapper[18592]: I0308 04:16:24.535938 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2166af23-aec1-40ee-9114-2a0ffa1c7f11","Type":"ContainerStarted","Data":"5f1078acd0e6821008856a967503e5502ea2822b91f0a7ce85ef4fa291159cd6"}
Mar 08 04:16:24.536004 master-0 kubenswrapper[18592]: I0308 04:16:24.535996 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2166af23-aec1-40ee-9114-2a0ffa1c7f11","Type":"ContainerStarted","Data":"d3b306e7515fcb0947c76732c9a154c48b21bac1725fc66b0048dc0dab8e911e"}
Mar 08 04:16:24.922086 master-0 kubenswrapper[18592]: I0308 04:16:24.922049 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-hkfg8-config-g5h82"
Mar 08 04:16:25.119033 master-0 kubenswrapper[18592]: I0308 04:16:25.118908 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/2bfe305c-4206-4ca8-b813-57690920c7a7-additional-scripts\") pod \"2bfe305c-4206-4ca8-b813-57690920c7a7\" (UID: \"2bfe305c-4206-4ca8-b813-57690920c7a7\") "
Mar 08 04:16:25.119033 master-0 kubenswrapper[18592]: I0308 04:16:25.118984 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8dtl\" (UniqueName: \"kubernetes.io/projected/2bfe305c-4206-4ca8-b813-57690920c7a7-kube-api-access-c8dtl\") pod \"2bfe305c-4206-4ca8-b813-57690920c7a7\" (UID: \"2bfe305c-4206-4ca8-b813-57690920c7a7\") "
Mar 08 04:16:25.119033 master-0 kubenswrapper[18592]: I0308 04:16:25.119030 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2bfe305c-4206-4ca8-b813-57690920c7a7-var-log-ovn\") pod \"2bfe305c-4206-4ca8-b813-57690920c7a7\" (UID: \"2bfe305c-4206-4ca8-b813-57690920c7a7\") "
Mar 08 04:16:25.119269 master-0 kubenswrapper[18592]: I0308 04:16:25.119073 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2bfe305c-4206-4ca8-b813-57690920c7a7-var-run\") pod \"2bfe305c-4206-4ca8-b813-57690920c7a7\" (UID: \"2bfe305c-4206-4ca8-b813-57690920c7a7\") "
Mar 08 04:16:25.119269 master-0 kubenswrapper[18592]: I0308 04:16:25.119099 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2bfe305c-4206-4ca8-b813-57690920c7a7-scripts\") pod \"2bfe305c-4206-4ca8-b813-57690920c7a7\" (UID: \"2bfe305c-4206-4ca8-b813-57690920c7a7\") "
Mar 08 04:16:25.119269 master-0 kubenswrapper[18592]: I0308 04:16:25.119126 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2bfe305c-4206-4ca8-b813-57690920c7a7-var-run-ovn\") pod \"2bfe305c-4206-4ca8-b813-57690920c7a7\" (UID: \"2bfe305c-4206-4ca8-b813-57690920c7a7\") "
Mar 08 04:16:25.119539 master-0 kubenswrapper[18592]: I0308 04:16:25.119475 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2bfe305c-4206-4ca8-b813-57690920c7a7-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "2bfe305c-4206-4ca8-b813-57690920c7a7" (UID: "2bfe305c-4206-4ca8-b813-57690920c7a7"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 04:16:25.120731 master-0 kubenswrapper[18592]: I0308 04:16:25.120590 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2bfe305c-4206-4ca8-b813-57690920c7a7-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "2bfe305c-4206-4ca8-b813-57690920c7a7" (UID: "2bfe305c-4206-4ca8-b813-57690920c7a7"). InnerVolumeSpecName "var-log-ovn".
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 04:16:25.120879 master-0 kubenswrapper[18592]: I0308 04:16:25.120669 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2bfe305c-4206-4ca8-b813-57690920c7a7-var-run" (OuterVolumeSpecName: "var-run") pod "2bfe305c-4206-4ca8-b813-57690920c7a7" (UID: "2bfe305c-4206-4ca8-b813-57690920c7a7"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 04:16:25.121513 master-0 kubenswrapper[18592]: I0308 04:16:25.121476 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bfe305c-4206-4ca8-b813-57690920c7a7-scripts" (OuterVolumeSpecName: "scripts") pod "2bfe305c-4206-4ca8-b813-57690920c7a7" (UID: "2bfe305c-4206-4ca8-b813-57690920c7a7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:16:25.121682 master-0 kubenswrapper[18592]: I0308 04:16:25.121648 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bfe305c-4206-4ca8-b813-57690920c7a7-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "2bfe305c-4206-4ca8-b813-57690920c7a7" (UID: "2bfe305c-4206-4ca8-b813-57690920c7a7"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:16:25.147171 master-0 kubenswrapper[18592]: I0308 04:16:25.147085 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bfe305c-4206-4ca8-b813-57690920c7a7-kube-api-access-c8dtl" (OuterVolumeSpecName: "kube-api-access-c8dtl") pod "2bfe305c-4206-4ca8-b813-57690920c7a7" (UID: "2bfe305c-4206-4ca8-b813-57690920c7a7"). InnerVolumeSpecName "kube-api-access-c8dtl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 04:16:25.222061 master-0 kubenswrapper[18592]: I0308 04:16:25.222007 18592 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/2bfe305c-4206-4ca8-b813-57690920c7a7-additional-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 04:16:25.222061 master-0 kubenswrapper[18592]: I0308 04:16:25.222050 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c8dtl\" (UniqueName: \"kubernetes.io/projected/2bfe305c-4206-4ca8-b813-57690920c7a7-kube-api-access-c8dtl\") on node \"master-0\" DevicePath \"\"" Mar 08 04:16:25.222061 master-0 kubenswrapper[18592]: I0308 04:16:25.222061 18592 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2bfe305c-4206-4ca8-b813-57690920c7a7-var-log-ovn\") on node \"master-0\" DevicePath \"\"" Mar 08 04:16:25.222061 master-0 kubenswrapper[18592]: I0308 04:16:25.222072 18592 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2bfe305c-4206-4ca8-b813-57690920c7a7-var-run\") on node \"master-0\" DevicePath \"\"" Mar 08 04:16:25.222387 master-0 kubenswrapper[18592]: I0308 04:16:25.222083 18592 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2bfe305c-4206-4ca8-b813-57690920c7a7-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 04:16:25.222387 master-0 kubenswrapper[18592]: I0308 04:16:25.222092 18592 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2bfe305c-4206-4ca8-b813-57690920c7a7-var-run-ovn\") on node \"master-0\" DevicePath \"\"" Mar 08 04:16:25.225163 master-0 kubenswrapper[18592]: I0308 04:16:25.224690 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-hkfg8" Mar 08 04:16:25.546393 master-0 
kubenswrapper[18592]: I0308 04:16:25.546342 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-hkfg8-config-g5h82" Mar 08 04:16:25.546393 master-0 kubenswrapper[18592]: I0308 04:16:25.546365 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hkfg8-config-g5h82" event={"ID":"2bfe305c-4206-4ca8-b813-57690920c7a7","Type":"ContainerDied","Data":"894eb5d8c9d44ad92bf5f37bba0220222150c37d680309c6b690566cb47e032a"} Mar 08 04:16:25.546969 master-0 kubenswrapper[18592]: I0308 04:16:25.546430 18592 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="894eb5d8c9d44ad92bf5f37bba0220222150c37d680309c6b690566cb47e032a" Mar 08 04:16:25.552478 master-0 kubenswrapper[18592]: I0308 04:16:25.551790 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2166af23-aec1-40ee-9114-2a0ffa1c7f11","Type":"ContainerStarted","Data":"bab446657a756a200e0ed2017e98f2b8ef75c86e586f2af52d84c5f7a21ce624"} Mar 08 04:16:25.552478 master-0 kubenswrapper[18592]: I0308 04:16:25.551845 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2166af23-aec1-40ee-9114-2a0ffa1c7f11","Type":"ContainerStarted","Data":"caba3e88721100ae2c31bfbeda0296b81cee6859b4bf44838d85039d60ea6eb5"} Mar 08 04:16:26.102721 master-0 kubenswrapper[18592]: I0308 04:16:26.100921 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-hkfg8-config-g5h82"] Mar 08 04:16:26.125224 master-0 kubenswrapper[18592]: I0308 04:16:26.125160 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-hkfg8-config-g5h82"] Mar 08 04:16:26.160530 master-0 kubenswrapper[18592]: I0308 04:16:26.160444 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bfe305c-4206-4ca8-b813-57690920c7a7" 
path="/var/lib/kubelet/pods/2bfe305c-4206-4ca8-b813-57690920c7a7/volumes" Mar 08 04:16:26.235483 master-0 kubenswrapper[18592]: I0308 04:16:26.235211 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-hkfg8-config-vnthz"] Mar 08 04:16:26.236270 master-0 kubenswrapper[18592]: E0308 04:16:26.236245 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bfe305c-4206-4ca8-b813-57690920c7a7" containerName="ovn-config" Mar 08 04:16:26.236270 master-0 kubenswrapper[18592]: I0308 04:16:26.236269 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bfe305c-4206-4ca8-b813-57690920c7a7" containerName="ovn-config" Mar 08 04:16:26.236406 master-0 kubenswrapper[18592]: E0308 04:16:26.236320 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="284c1d20-bb45-4e62-9ebe-76fdb2e4fd01" containerName="swift-ring-rebalance" Mar 08 04:16:26.236406 master-0 kubenswrapper[18592]: I0308 04:16:26.236328 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="284c1d20-bb45-4e62-9ebe-76fdb2e4fd01" containerName="swift-ring-rebalance" Mar 08 04:16:26.236588 master-0 kubenswrapper[18592]: I0308 04:16:26.236552 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="284c1d20-bb45-4e62-9ebe-76fdb2e4fd01" containerName="swift-ring-rebalance" Mar 08 04:16:26.236588 master-0 kubenswrapper[18592]: I0308 04:16:26.236589 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bfe305c-4206-4ca8-b813-57690920c7a7" containerName="ovn-config" Mar 08 04:16:26.239355 master-0 kubenswrapper[18592]: I0308 04:16:26.239166 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-hkfg8-config-vnthz" Mar 08 04:16:26.282108 master-0 kubenswrapper[18592]: I0308 04:16:26.246769 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-hkfg8-config-vnthz"] Mar 08 04:16:26.282108 master-0 kubenswrapper[18592]: I0308 04:16:26.248318 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/7eaf38ef-a74d-4385-abbe-c8c8e285bc73-additional-scripts\") pod \"ovn-controller-hkfg8-config-vnthz\" (UID: \"7eaf38ef-a74d-4385-abbe-c8c8e285bc73\") " pod="openstack/ovn-controller-hkfg8-config-vnthz" Mar 08 04:16:26.282108 master-0 kubenswrapper[18592]: I0308 04:16:26.248379 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7eaf38ef-a74d-4385-abbe-c8c8e285bc73-var-log-ovn\") pod \"ovn-controller-hkfg8-config-vnthz\" (UID: \"7eaf38ef-a74d-4385-abbe-c8c8e285bc73\") " pod="openstack/ovn-controller-hkfg8-config-vnthz" Mar 08 04:16:26.282108 master-0 kubenswrapper[18592]: I0308 04:16:26.248452 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7eaf38ef-a74d-4385-abbe-c8c8e285bc73-var-run-ovn\") pod \"ovn-controller-hkfg8-config-vnthz\" (UID: \"7eaf38ef-a74d-4385-abbe-c8c8e285bc73\") " pod="openstack/ovn-controller-hkfg8-config-vnthz" Mar 08 04:16:26.282108 master-0 kubenswrapper[18592]: I0308 04:16:26.248489 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7eaf38ef-a74d-4385-abbe-c8c8e285bc73-var-run\") pod \"ovn-controller-hkfg8-config-vnthz\" (UID: \"7eaf38ef-a74d-4385-abbe-c8c8e285bc73\") " pod="openstack/ovn-controller-hkfg8-config-vnthz" Mar 08 04:16:26.282108 master-0 
kubenswrapper[18592]: I0308 04:16:26.248546 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7eaf38ef-a74d-4385-abbe-c8c8e285bc73-scripts\") pod \"ovn-controller-hkfg8-config-vnthz\" (UID: \"7eaf38ef-a74d-4385-abbe-c8c8e285bc73\") " pod="openstack/ovn-controller-hkfg8-config-vnthz" Mar 08 04:16:26.282108 master-0 kubenswrapper[18592]: I0308 04:16:26.248640 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t29xm\" (UniqueName: \"kubernetes.io/projected/7eaf38ef-a74d-4385-abbe-c8c8e285bc73-kube-api-access-t29xm\") pod \"ovn-controller-hkfg8-config-vnthz\" (UID: \"7eaf38ef-a74d-4385-abbe-c8c8e285bc73\") " pod="openstack/ovn-controller-hkfg8-config-vnthz" Mar 08 04:16:26.283099 master-0 kubenswrapper[18592]: I0308 04:16:26.283056 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 08 04:16:26.352277 master-0 kubenswrapper[18592]: I0308 04:16:26.352224 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7eaf38ef-a74d-4385-abbe-c8c8e285bc73-var-run-ovn\") pod \"ovn-controller-hkfg8-config-vnthz\" (UID: \"7eaf38ef-a74d-4385-abbe-c8c8e285bc73\") " pod="openstack/ovn-controller-hkfg8-config-vnthz" Mar 08 04:16:26.352277 master-0 kubenswrapper[18592]: I0308 04:16:26.352283 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7eaf38ef-a74d-4385-abbe-c8c8e285bc73-var-run\") pod \"ovn-controller-hkfg8-config-vnthz\" (UID: \"7eaf38ef-a74d-4385-abbe-c8c8e285bc73\") " pod="openstack/ovn-controller-hkfg8-config-vnthz" Mar 08 04:16:26.352872 master-0 kubenswrapper[18592]: I0308 04:16:26.352315 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/7eaf38ef-a74d-4385-abbe-c8c8e285bc73-scripts\") pod \"ovn-controller-hkfg8-config-vnthz\" (UID: \"7eaf38ef-a74d-4385-abbe-c8c8e285bc73\") " pod="openstack/ovn-controller-hkfg8-config-vnthz" Mar 08 04:16:26.352872 master-0 kubenswrapper[18592]: I0308 04:16:26.352371 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t29xm\" (UniqueName: \"kubernetes.io/projected/7eaf38ef-a74d-4385-abbe-c8c8e285bc73-kube-api-access-t29xm\") pod \"ovn-controller-hkfg8-config-vnthz\" (UID: \"7eaf38ef-a74d-4385-abbe-c8c8e285bc73\") " pod="openstack/ovn-controller-hkfg8-config-vnthz" Mar 08 04:16:26.352872 master-0 kubenswrapper[18592]: I0308 04:16:26.352441 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/7eaf38ef-a74d-4385-abbe-c8c8e285bc73-additional-scripts\") pod \"ovn-controller-hkfg8-config-vnthz\" (UID: \"7eaf38ef-a74d-4385-abbe-c8c8e285bc73\") " pod="openstack/ovn-controller-hkfg8-config-vnthz" Mar 08 04:16:26.352872 master-0 kubenswrapper[18592]: I0308 04:16:26.352466 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7eaf38ef-a74d-4385-abbe-c8c8e285bc73-var-log-ovn\") pod \"ovn-controller-hkfg8-config-vnthz\" (UID: \"7eaf38ef-a74d-4385-abbe-c8c8e285bc73\") " pod="openstack/ovn-controller-hkfg8-config-vnthz" Mar 08 04:16:26.352872 master-0 kubenswrapper[18592]: I0308 04:16:26.352576 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7eaf38ef-a74d-4385-abbe-c8c8e285bc73-var-log-ovn\") pod \"ovn-controller-hkfg8-config-vnthz\" (UID: \"7eaf38ef-a74d-4385-abbe-c8c8e285bc73\") " pod="openstack/ovn-controller-hkfg8-config-vnthz" Mar 08 04:16:26.352872 master-0 kubenswrapper[18592]: I0308 04:16:26.352622 18592 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7eaf38ef-a74d-4385-abbe-c8c8e285bc73-var-run-ovn\") pod \"ovn-controller-hkfg8-config-vnthz\" (UID: \"7eaf38ef-a74d-4385-abbe-c8c8e285bc73\") " pod="openstack/ovn-controller-hkfg8-config-vnthz" Mar 08 04:16:26.355208 master-0 kubenswrapper[18592]: I0308 04:16:26.353428 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/7eaf38ef-a74d-4385-abbe-c8c8e285bc73-additional-scripts\") pod \"ovn-controller-hkfg8-config-vnthz\" (UID: \"7eaf38ef-a74d-4385-abbe-c8c8e285bc73\") " pod="openstack/ovn-controller-hkfg8-config-vnthz" Mar 08 04:16:26.355208 master-0 kubenswrapper[18592]: I0308 04:16:26.353481 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7eaf38ef-a74d-4385-abbe-c8c8e285bc73-var-run\") pod \"ovn-controller-hkfg8-config-vnthz\" (UID: \"7eaf38ef-a74d-4385-abbe-c8c8e285bc73\") " pod="openstack/ovn-controller-hkfg8-config-vnthz" Mar 08 04:16:26.355208 master-0 kubenswrapper[18592]: I0308 04:16:26.354817 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7eaf38ef-a74d-4385-abbe-c8c8e285bc73-scripts\") pod \"ovn-controller-hkfg8-config-vnthz\" (UID: \"7eaf38ef-a74d-4385-abbe-c8c8e285bc73\") " pod="openstack/ovn-controller-hkfg8-config-vnthz" Mar 08 04:16:26.375609 master-0 kubenswrapper[18592]: I0308 04:16:26.375550 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t29xm\" (UniqueName: \"kubernetes.io/projected/7eaf38ef-a74d-4385-abbe-c8c8e285bc73-kube-api-access-t29xm\") pod \"ovn-controller-hkfg8-config-vnthz\" (UID: \"7eaf38ef-a74d-4385-abbe-c8c8e285bc73\") " pod="openstack/ovn-controller-hkfg8-config-vnthz" Mar 08 04:16:26.614924 master-0 kubenswrapper[18592]: I0308 04:16:26.614310 18592 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-hkfg8-config-vnthz" Mar 08 04:16:27.357934 master-0 kubenswrapper[18592]: W0308 04:16:27.357812 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7eaf38ef_a74d_4385_abbe_c8c8e285bc73.slice/crio-9f3d107c085818f4988b0333066e086cc42c64c609c43ebc29918b641ef0bde1 WatchSource:0}: Error finding container 9f3d107c085818f4988b0333066e086cc42c64c609c43ebc29918b641ef0bde1: Status 404 returned error can't find the container with id 9f3d107c085818f4988b0333066e086cc42c64c609c43ebc29918b641ef0bde1 Mar 08 04:16:27.364725 master-0 kubenswrapper[18592]: I0308 04:16:27.364669 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-hkfg8-config-vnthz"] Mar 08 04:16:27.393224 master-0 kubenswrapper[18592]: I0308 04:16:27.393049 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 08 04:16:27.574888 master-0 kubenswrapper[18592]: I0308 04:16:27.574160 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hkfg8-config-vnthz" event={"ID":"7eaf38ef-a74d-4385-abbe-c8c8e285bc73","Type":"ContainerStarted","Data":"9f3d107c085818f4988b0333066e086cc42c64c609c43ebc29918b641ef0bde1"} Mar 08 04:16:27.583846 master-0 kubenswrapper[18592]: I0308 04:16:27.583499 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2166af23-aec1-40ee-9114-2a0ffa1c7f11","Type":"ContainerStarted","Data":"091df28ff1d1c357b867cb543e7c820e30e31bc87c8ab4f5d1f6755b5937c745"} Mar 08 04:16:27.583846 master-0 kubenswrapper[18592]: I0308 04:16:27.583542 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2166af23-aec1-40ee-9114-2a0ffa1c7f11","Type":"ContainerStarted","Data":"64eddd22bd555d7d3396762f5ab5c2e31019dbf4e33622310a9f90efaef1d0c3"} 
Mar 08 04:16:27.583846 master-0 kubenswrapper[18592]: I0308 04:16:27.583559 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2166af23-aec1-40ee-9114-2a0ffa1c7f11","Type":"ContainerStarted","Data":"ce0a69c00a4841822eeafb816d54e9b621fa2e8f36fb92f6e95cfd93e3e4a80f"} Mar 08 04:16:28.612929 master-0 kubenswrapper[18592]: I0308 04:16:28.612859 18592 generic.go:334] "Generic (PLEG): container finished" podID="7eaf38ef-a74d-4385-abbe-c8c8e285bc73" containerID="3394f4094cde91116b129fc4d22534c98fbc6ce883bd0bd22cae7316a2296746" exitCode=0 Mar 08 04:16:28.612929 master-0 kubenswrapper[18592]: I0308 04:16:28.612942 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hkfg8-config-vnthz" event={"ID":"7eaf38ef-a74d-4385-abbe-c8c8e285bc73","Type":"ContainerDied","Data":"3394f4094cde91116b129fc4d22534c98fbc6ce883bd0bd22cae7316a2296746"} Mar 08 04:16:28.625724 master-0 kubenswrapper[18592]: I0308 04:16:28.625630 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2166af23-aec1-40ee-9114-2a0ffa1c7f11","Type":"ContainerStarted","Data":"96345137959bf0c793c17ba67d29109e37ca885f2841a327d9a9458c5cca2f2f"} Mar 08 04:16:29.429647 master-0 kubenswrapper[18592]: I0308 04:16:29.429588 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-7954b"] Mar 08 04:16:29.431797 master-0 kubenswrapper[18592]: I0308 04:16:29.431764 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-7954b" Mar 08 04:16:29.441394 master-0 kubenswrapper[18592]: I0308 04:16:29.439740 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-7954b"] Mar 08 04:16:29.536510 master-0 kubenswrapper[18592]: I0308 04:16:29.531238 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-a1de-account-create-update-2qztj"] Mar 08 04:16:29.536510 master-0 kubenswrapper[18592]: I0308 04:16:29.532501 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-a1de-account-create-update-2qztj" Mar 08 04:16:29.536510 master-0 kubenswrapper[18592]: I0308 04:16:29.534979 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 08 04:16:29.545606 master-0 kubenswrapper[18592]: I0308 04:16:29.541806 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-a1de-account-create-update-2qztj"] Mar 08 04:16:29.545606 master-0 kubenswrapper[18592]: I0308 04:16:29.545278 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zh4mk\" (UniqueName: \"kubernetes.io/projected/f4af0f8d-1061-4359-ad23-aa58a643206d-kube-api-access-zh4mk\") pod \"cinder-db-create-7954b\" (UID: \"f4af0f8d-1061-4359-ad23-aa58a643206d\") " pod="openstack/cinder-db-create-7954b" Mar 08 04:16:29.545606 master-0 kubenswrapper[18592]: I0308 04:16:29.545593 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4af0f8d-1061-4359-ad23-aa58a643206d-operator-scripts\") pod \"cinder-db-create-7954b\" (UID: \"f4af0f8d-1061-4359-ad23-aa58a643206d\") " pod="openstack/cinder-db-create-7954b" Mar 08 04:16:29.649249 master-0 kubenswrapper[18592]: I0308 04:16:29.648461 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-j5bkn\" (UniqueName: \"kubernetes.io/projected/7e36975b-0c3c-492b-bce9-60a58ceeaf47-kube-api-access-j5bkn\") pod \"cinder-a1de-account-create-update-2qztj\" (UID: \"7e36975b-0c3c-492b-bce9-60a58ceeaf47\") " pod="openstack/cinder-a1de-account-create-update-2qztj" Mar 08 04:16:29.649249 master-0 kubenswrapper[18592]: I0308 04:16:29.648508 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4af0f8d-1061-4359-ad23-aa58a643206d-operator-scripts\") pod \"cinder-db-create-7954b\" (UID: \"f4af0f8d-1061-4359-ad23-aa58a643206d\") " pod="openstack/cinder-db-create-7954b" Mar 08 04:16:29.649249 master-0 kubenswrapper[18592]: I0308 04:16:29.648566 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e36975b-0c3c-492b-bce9-60a58ceeaf47-operator-scripts\") pod \"cinder-a1de-account-create-update-2qztj\" (UID: \"7e36975b-0c3c-492b-bce9-60a58ceeaf47\") " pod="openstack/cinder-a1de-account-create-update-2qztj" Mar 08 04:16:29.649249 master-0 kubenswrapper[18592]: I0308 04:16:29.648646 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zh4mk\" (UniqueName: \"kubernetes.io/projected/f4af0f8d-1061-4359-ad23-aa58a643206d-kube-api-access-zh4mk\") pod \"cinder-db-create-7954b\" (UID: \"f4af0f8d-1061-4359-ad23-aa58a643206d\") " pod="openstack/cinder-db-create-7954b" Mar 08 04:16:29.649846 master-0 kubenswrapper[18592]: I0308 04:16:29.649391 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4af0f8d-1061-4359-ad23-aa58a643206d-operator-scripts\") pod \"cinder-db-create-7954b\" (UID: \"f4af0f8d-1061-4359-ad23-aa58a643206d\") " pod="openstack/cinder-db-create-7954b" Mar 08 04:16:29.674082 master-0 kubenswrapper[18592]: I0308 
04:16:29.671312 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zh4mk\" (UniqueName: \"kubernetes.io/projected/f4af0f8d-1061-4359-ad23-aa58a643206d-kube-api-access-zh4mk\") pod \"cinder-db-create-7954b\" (UID: \"f4af0f8d-1061-4359-ad23-aa58a643206d\") " pod="openstack/cinder-db-create-7954b" Mar 08 04:16:29.758991 master-0 kubenswrapper[18592]: I0308 04:16:29.753521 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e36975b-0c3c-492b-bce9-60a58ceeaf47-operator-scripts\") pod \"cinder-a1de-account-create-update-2qztj\" (UID: \"7e36975b-0c3c-492b-bce9-60a58ceeaf47\") " pod="openstack/cinder-a1de-account-create-update-2qztj" Mar 08 04:16:29.758991 master-0 kubenswrapper[18592]: I0308 04:16:29.753699 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5bkn\" (UniqueName: \"kubernetes.io/projected/7e36975b-0c3c-492b-bce9-60a58ceeaf47-kube-api-access-j5bkn\") pod \"cinder-a1de-account-create-update-2qztj\" (UID: \"7e36975b-0c3c-492b-bce9-60a58ceeaf47\") " pod="openstack/cinder-a1de-account-create-update-2qztj" Mar 08 04:16:29.758991 master-0 kubenswrapper[18592]: I0308 04:16:29.754546 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e36975b-0c3c-492b-bce9-60a58ceeaf47-operator-scripts\") pod \"cinder-a1de-account-create-update-2qztj\" (UID: \"7e36975b-0c3c-492b-bce9-60a58ceeaf47\") " pod="openstack/cinder-a1de-account-create-update-2qztj" Mar 08 04:16:29.765851 master-0 kubenswrapper[18592]: I0308 04:16:29.760368 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-f2htj"] Mar 08 04:16:29.765851 master-0 kubenswrapper[18592]: I0308 04:16:29.761654 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-f2htj"
Mar 08 04:16:29.766089 master-0 kubenswrapper[18592]: I0308 04:16:29.766050 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Mar 08 04:16:29.771764 master-0 kubenswrapper[18592]: I0308 04:16:29.766202 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Mar 08 04:16:29.771764 master-0 kubenswrapper[18592]: I0308 04:16:29.766329 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Mar 08 04:16:29.771764 master-0 kubenswrapper[18592]: I0308 04:16:29.771258 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-7954b"
Mar 08 04:16:29.811845 master-0 kubenswrapper[18592]: I0308 04:16:29.811400 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5bkn\" (UniqueName: \"kubernetes.io/projected/7e36975b-0c3c-492b-bce9-60a58ceeaf47-kube-api-access-j5bkn\") pod \"cinder-a1de-account-create-update-2qztj\" (UID: \"7e36975b-0c3c-492b-bce9-60a58ceeaf47\") " pod="openstack/cinder-a1de-account-create-update-2qztj"
Mar 08 04:16:29.811845 master-0 kubenswrapper[18592]: I0308 04:16:29.811480 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-f2htj"]
Mar 08 04:16:29.831927 master-0 kubenswrapper[18592]: I0308 04:16:29.831007 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-k9b8j"]
Mar 08 04:16:29.836847 master-0 kubenswrapper[18592]: I0308 04:16:29.832569 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-k9b8j"
Mar 08 04:16:29.846849 master-0 kubenswrapper[18592]: I0308 04:16:29.840505 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-k9b8j"]
Mar 08 04:16:29.859870 master-0 kubenswrapper[18592]: I0308 04:16:29.855348 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67dd8\" (UniqueName: \"kubernetes.io/projected/b3d9ef75-99d9-448e-9fc5-e14b49976a0a-kube-api-access-67dd8\") pod \"keystone-db-sync-f2htj\" (UID: \"b3d9ef75-99d9-448e-9fc5-e14b49976a0a\") " pod="openstack/keystone-db-sync-f2htj"
Mar 08 04:16:29.859870 master-0 kubenswrapper[18592]: I0308 04:16:29.855427 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3d9ef75-99d9-448e-9fc5-e14b49976a0a-combined-ca-bundle\") pod \"keystone-db-sync-f2htj\" (UID: \"b3d9ef75-99d9-448e-9fc5-e14b49976a0a\") " pod="openstack/keystone-db-sync-f2htj"
Mar 08 04:16:29.859870 master-0 kubenswrapper[18592]: I0308 04:16:29.855472 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3d9ef75-99d9-448e-9fc5-e14b49976a0a-config-data\") pod \"keystone-db-sync-f2htj\" (UID: \"b3d9ef75-99d9-448e-9fc5-e14b49976a0a\") " pod="openstack/keystone-db-sync-f2htj"
Mar 08 04:16:29.936881 master-0 kubenswrapper[18592]: I0308 04:16:29.932238 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-3190-account-create-update-p2mgv"]
Mar 08 04:16:29.936881 master-0 kubenswrapper[18592]: I0308 04:16:29.933668 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-3190-account-create-update-p2mgv"
Mar 08 04:16:29.936881 master-0 kubenswrapper[18592]: I0308 04:16:29.936344 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret"
Mar 08 04:16:29.958164 master-0 kubenswrapper[18592]: I0308 04:16:29.957548 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67dd8\" (UniqueName: \"kubernetes.io/projected/b3d9ef75-99d9-448e-9fc5-e14b49976a0a-kube-api-access-67dd8\") pod \"keystone-db-sync-f2htj\" (UID: \"b3d9ef75-99d9-448e-9fc5-e14b49976a0a\") " pod="openstack/keystone-db-sync-f2htj"
Mar 08 04:16:29.958164 master-0 kubenswrapper[18592]: I0308 04:16:29.957631 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3d9ef75-99d9-448e-9fc5-e14b49976a0a-combined-ca-bundle\") pod \"keystone-db-sync-f2htj\" (UID: \"b3d9ef75-99d9-448e-9fc5-e14b49976a0a\") " pod="openstack/keystone-db-sync-f2htj"
Mar 08 04:16:29.958164 master-0 kubenswrapper[18592]: I0308 04:16:29.957660 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3d9ef75-99d9-448e-9fc5-e14b49976a0a-config-data\") pod \"keystone-db-sync-f2htj\" (UID: \"b3d9ef75-99d9-448e-9fc5-e14b49976a0a\") " pod="openstack/keystone-db-sync-f2htj"
Mar 08 04:16:29.958164 master-0 kubenswrapper[18592]: I0308 04:16:29.957689 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tptdb\" (UniqueName: \"kubernetes.io/projected/30a31504-5cfd-45af-ae76-d76a8fdb816a-kube-api-access-tptdb\") pod \"neutron-db-create-k9b8j\" (UID: \"30a31504-5cfd-45af-ae76-d76a8fdb816a\") " pod="openstack/neutron-db-create-k9b8j"
Mar 08 04:16:29.958164 master-0 kubenswrapper[18592]: I0308 04:16:29.957716 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30a31504-5cfd-45af-ae76-d76a8fdb816a-operator-scripts\") pod \"neutron-db-create-k9b8j\" (UID: \"30a31504-5cfd-45af-ae76-d76a8fdb816a\") " pod="openstack/neutron-db-create-k9b8j"
Mar 08 04:16:29.984982 master-0 kubenswrapper[18592]: I0308 04:16:29.979585 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67dd8\" (UniqueName: \"kubernetes.io/projected/b3d9ef75-99d9-448e-9fc5-e14b49976a0a-kube-api-access-67dd8\") pod \"keystone-db-sync-f2htj\" (UID: \"b3d9ef75-99d9-448e-9fc5-e14b49976a0a\") " pod="openstack/keystone-db-sync-f2htj"
Mar 08 04:16:29.984982 master-0 kubenswrapper[18592]: I0308 04:16:29.981477 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3d9ef75-99d9-448e-9fc5-e14b49976a0a-config-data\") pod \"keystone-db-sync-f2htj\" (UID: \"b3d9ef75-99d9-448e-9fc5-e14b49976a0a\") " pod="openstack/keystone-db-sync-f2htj"
Mar 08 04:16:30.034888 master-0 kubenswrapper[18592]: I0308 04:16:29.993156 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-3190-account-create-update-p2mgv"]
Mar 08 04:16:30.034888 master-0 kubenswrapper[18592]: I0308 04:16:29.998475 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3d9ef75-99d9-448e-9fc5-e14b49976a0a-combined-ca-bundle\") pod \"keystone-db-sync-f2htj\" (UID: \"b3d9ef75-99d9-448e-9fc5-e14b49976a0a\") " pod="openstack/keystone-db-sync-f2htj"
Mar 08 04:16:30.034888 master-0 kubenswrapper[18592]: I0308 04:16:30.024758 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-a1de-account-create-update-2qztj"
Mar 08 04:16:30.060554 master-0 kubenswrapper[18592]: I0308 04:16:30.060503 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tptdb\" (UniqueName: \"kubernetes.io/projected/30a31504-5cfd-45af-ae76-d76a8fdb816a-kube-api-access-tptdb\") pod \"neutron-db-create-k9b8j\" (UID: \"30a31504-5cfd-45af-ae76-d76a8fdb816a\") " pod="openstack/neutron-db-create-k9b8j"
Mar 08 04:16:30.060710 master-0 kubenswrapper[18592]: I0308 04:16:30.060568 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30a31504-5cfd-45af-ae76-d76a8fdb816a-operator-scripts\") pod \"neutron-db-create-k9b8j\" (UID: \"30a31504-5cfd-45af-ae76-d76a8fdb816a\") " pod="openstack/neutron-db-create-k9b8j"
Mar 08 04:16:30.060710 master-0 kubenswrapper[18592]: I0308 04:16:30.060608 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shn69\" (UniqueName: \"kubernetes.io/projected/e7105244-a23e-43d2-9d49-3cb1b0edf604-kube-api-access-shn69\") pod \"neutron-3190-account-create-update-p2mgv\" (UID: \"e7105244-a23e-43d2-9d49-3cb1b0edf604\") " pod="openstack/neutron-3190-account-create-update-p2mgv"
Mar 08 04:16:30.060710 master-0 kubenswrapper[18592]: I0308 04:16:30.060642 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7105244-a23e-43d2-9d49-3cb1b0edf604-operator-scripts\") pod \"neutron-3190-account-create-update-p2mgv\" (UID: \"e7105244-a23e-43d2-9d49-3cb1b0edf604\") " pod="openstack/neutron-3190-account-create-update-p2mgv"
Mar 08 04:16:30.071844 master-0 kubenswrapper[18592]: I0308 04:16:30.062124 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30a31504-5cfd-45af-ae76-d76a8fdb816a-operator-scripts\") pod \"neutron-db-create-k9b8j\" (UID: \"30a31504-5cfd-45af-ae76-d76a8fdb816a\") " pod="openstack/neutron-db-create-k9b8j"
Mar 08 04:16:30.082333 master-0 kubenswrapper[18592]: I0308 04:16:30.080750 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tptdb\" (UniqueName: \"kubernetes.io/projected/30a31504-5cfd-45af-ae76-d76a8fdb816a-kube-api-access-tptdb\") pod \"neutron-db-create-k9b8j\" (UID: \"30a31504-5cfd-45af-ae76-d76a8fdb816a\") " pod="openstack/neutron-db-create-k9b8j"
Mar 08 04:16:30.094303 master-0 kubenswrapper[18592]: I0308 04:16:30.094264 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-f2htj"
Mar 08 04:16:30.102163 master-0 kubenswrapper[18592]: I0308 04:16:30.102098 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-hkfg8-config-vnthz"
Mar 08 04:16:30.169847 master-0 kubenswrapper[18592]: I0308 04:16:30.162295 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7eaf38ef-a74d-4385-abbe-c8c8e285bc73-var-run\") pod \"7eaf38ef-a74d-4385-abbe-c8c8e285bc73\" (UID: \"7eaf38ef-a74d-4385-abbe-c8c8e285bc73\") "
Mar 08 04:16:30.169847 master-0 kubenswrapper[18592]: I0308 04:16:30.162350 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7eaf38ef-a74d-4385-abbe-c8c8e285bc73-var-run" (OuterVolumeSpecName: "var-run") pod "7eaf38ef-a74d-4385-abbe-c8c8e285bc73" (UID: "7eaf38ef-a74d-4385-abbe-c8c8e285bc73"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 04:16:30.169847 master-0 kubenswrapper[18592]: I0308 04:16:30.162374 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t29xm\" (UniqueName: \"kubernetes.io/projected/7eaf38ef-a74d-4385-abbe-c8c8e285bc73-kube-api-access-t29xm\") pod \"7eaf38ef-a74d-4385-abbe-c8c8e285bc73\" (UID: \"7eaf38ef-a74d-4385-abbe-c8c8e285bc73\") "
Mar 08 04:16:30.169847 master-0 kubenswrapper[18592]: I0308 04:16:30.162428 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/7eaf38ef-a74d-4385-abbe-c8c8e285bc73-additional-scripts\") pod \"7eaf38ef-a74d-4385-abbe-c8c8e285bc73\" (UID: \"7eaf38ef-a74d-4385-abbe-c8c8e285bc73\") "
Mar 08 04:16:30.169847 master-0 kubenswrapper[18592]: I0308 04:16:30.162442 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7eaf38ef-a74d-4385-abbe-c8c8e285bc73-var-run-ovn\") pod \"7eaf38ef-a74d-4385-abbe-c8c8e285bc73\" (UID: \"7eaf38ef-a74d-4385-abbe-c8c8e285bc73\") "
Mar 08 04:16:30.169847 master-0 kubenswrapper[18592]: I0308 04:16:30.162533 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7eaf38ef-a74d-4385-abbe-c8c8e285bc73-var-log-ovn\") pod \"7eaf38ef-a74d-4385-abbe-c8c8e285bc73\" (UID: \"7eaf38ef-a74d-4385-abbe-c8c8e285bc73\") "
Mar 08 04:16:30.169847 master-0 kubenswrapper[18592]: I0308 04:16:30.162682 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7eaf38ef-a74d-4385-abbe-c8c8e285bc73-scripts\") pod \"7eaf38ef-a74d-4385-abbe-c8c8e285bc73\" (UID: \"7eaf38ef-a74d-4385-abbe-c8c8e285bc73\") "
Mar 08 04:16:30.169847 master-0 kubenswrapper[18592]: I0308 04:16:30.163024 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7eaf38ef-a74d-4385-abbe-c8c8e285bc73-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "7eaf38ef-a74d-4385-abbe-c8c8e285bc73" (UID: "7eaf38ef-a74d-4385-abbe-c8c8e285bc73"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 04:16:30.169847 master-0 kubenswrapper[18592]: I0308 04:16:30.163057 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7eaf38ef-a74d-4385-abbe-c8c8e285bc73-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "7eaf38ef-a74d-4385-abbe-c8c8e285bc73" (UID: "7eaf38ef-a74d-4385-abbe-c8c8e285bc73"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 04:16:30.169847 master-0 kubenswrapper[18592]: I0308 04:16:30.163072 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7eaf38ef-a74d-4385-abbe-c8c8e285bc73-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "7eaf38ef-a74d-4385-abbe-c8c8e285bc73" (UID: "7eaf38ef-a74d-4385-abbe-c8c8e285bc73"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 04:16:30.169847 master-0 kubenswrapper[18592]: I0308 04:16:30.163321 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shn69\" (UniqueName: \"kubernetes.io/projected/e7105244-a23e-43d2-9d49-3cb1b0edf604-kube-api-access-shn69\") pod \"neutron-3190-account-create-update-p2mgv\" (UID: \"e7105244-a23e-43d2-9d49-3cb1b0edf604\") " pod="openstack/neutron-3190-account-create-update-p2mgv"
Mar 08 04:16:30.169847 master-0 kubenswrapper[18592]: I0308 04:16:30.163372 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7105244-a23e-43d2-9d49-3cb1b0edf604-operator-scripts\") pod \"neutron-3190-account-create-update-p2mgv\" (UID: \"e7105244-a23e-43d2-9d49-3cb1b0edf604\") " pod="openstack/neutron-3190-account-create-update-p2mgv"
Mar 08 04:16:30.169847 master-0 kubenswrapper[18592]: I0308 04:16:30.163502 18592 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/7eaf38ef-a74d-4385-abbe-c8c8e285bc73-additional-scripts\") on node \"master-0\" DevicePath \"\""
Mar 08 04:16:30.169847 master-0 kubenswrapper[18592]: I0308 04:16:30.163513 18592 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7eaf38ef-a74d-4385-abbe-c8c8e285bc73-var-run-ovn\") on node \"master-0\" DevicePath \"\""
Mar 08 04:16:30.169847 master-0 kubenswrapper[18592]: I0308 04:16:30.163522 18592 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7eaf38ef-a74d-4385-abbe-c8c8e285bc73-var-log-ovn\") on node \"master-0\" DevicePath \"\""
Mar 08 04:16:30.169847 master-0 kubenswrapper[18592]: I0308 04:16:30.163537 18592 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7eaf38ef-a74d-4385-abbe-c8c8e285bc73-var-run\") on node \"master-0\" DevicePath \"\""
Mar 08 04:16:30.169847 master-0 kubenswrapper[18592]: I0308 04:16:30.163782 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7eaf38ef-a74d-4385-abbe-c8c8e285bc73-scripts" (OuterVolumeSpecName: "scripts") pod "7eaf38ef-a74d-4385-abbe-c8c8e285bc73" (UID: "7eaf38ef-a74d-4385-abbe-c8c8e285bc73"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 04:16:30.169847 master-0 kubenswrapper[18592]: I0308 04:16:30.164937 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7105244-a23e-43d2-9d49-3cb1b0edf604-operator-scripts\") pod \"neutron-3190-account-create-update-p2mgv\" (UID: \"e7105244-a23e-43d2-9d49-3cb1b0edf604\") " pod="openstack/neutron-3190-account-create-update-p2mgv"
Mar 08 04:16:30.169847 master-0 kubenswrapper[18592]: I0308 04:16:30.167847 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7eaf38ef-a74d-4385-abbe-c8c8e285bc73-kube-api-access-t29xm" (OuterVolumeSpecName: "kube-api-access-t29xm") pod "7eaf38ef-a74d-4385-abbe-c8c8e285bc73" (UID: "7eaf38ef-a74d-4385-abbe-c8c8e285bc73"). InnerVolumeSpecName "kube-api-access-t29xm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 04:16:30.209728 master-0 kubenswrapper[18592]: I0308 04:16:30.201373 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-k9b8j"
Mar 08 04:16:30.219596 master-0 kubenswrapper[18592]: I0308 04:16:30.210644 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shn69\" (UniqueName: \"kubernetes.io/projected/e7105244-a23e-43d2-9d49-3cb1b0edf604-kube-api-access-shn69\") pod \"neutron-3190-account-create-update-p2mgv\" (UID: \"e7105244-a23e-43d2-9d49-3cb1b0edf604\") " pod="openstack/neutron-3190-account-create-update-p2mgv"
Mar 08 04:16:30.275267 master-0 kubenswrapper[18592]: I0308 04:16:30.265710 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t29xm\" (UniqueName: \"kubernetes.io/projected/7eaf38ef-a74d-4385-abbe-c8c8e285bc73-kube-api-access-t29xm\") on node \"master-0\" DevicePath \"\""
Mar 08 04:16:30.275267 master-0 kubenswrapper[18592]: I0308 04:16:30.265759 18592 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7eaf38ef-a74d-4385-abbe-c8c8e285bc73-scripts\") on node \"master-0\" DevicePath \"\""
Mar 08 04:16:30.305555 master-0 kubenswrapper[18592]: I0308 04:16:30.305519 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-3190-account-create-update-p2mgv"
Mar 08 04:16:30.365335 master-0 kubenswrapper[18592]: I0308 04:16:30.364949 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-7954b"]
Mar 08 04:16:30.563132 master-0 kubenswrapper[18592]: I0308 04:16:30.563085 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-a1de-account-create-update-2qztj"]
Mar 08 04:16:30.696429 master-0 kubenswrapper[18592]: I0308 04:16:30.696270 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a1de-account-create-update-2qztj" event={"ID":"7e36975b-0c3c-492b-bce9-60a58ceeaf47","Type":"ContainerStarted","Data":"281cb72a1a27d7e9a9fa61b88ffcc091b2bb1ea65cf45ecd13318dbfe49dcd0e"}
Mar 08 04:16:30.700546 master-0 kubenswrapper[18592]: I0308 04:16:30.698576 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-7954b" event={"ID":"f4af0f8d-1061-4359-ad23-aa58a643206d","Type":"ContainerStarted","Data":"9f789bd4f6b5c3120378d674cd8d7777595ed8a68da18849338703d979a42656"}
Mar 08 04:16:30.700546 master-0 kubenswrapper[18592]: I0308 04:16:30.698601 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-7954b" event={"ID":"f4af0f8d-1061-4359-ad23-aa58a643206d","Type":"ContainerStarted","Data":"367c18d4643ce69a245ea67a93e0561dad2f751a133d411c85c4636fd4e3619c"}
Mar 08 04:16:30.703751 master-0 kubenswrapper[18592]: I0308 04:16:30.703074 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hkfg8-config-vnthz" event={"ID":"7eaf38ef-a74d-4385-abbe-c8c8e285bc73","Type":"ContainerDied","Data":"9f3d107c085818f4988b0333066e086cc42c64c609c43ebc29918b641ef0bde1"}
Mar 08 04:16:30.703751 master-0 kubenswrapper[18592]: I0308 04:16:30.703111 18592 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f3d107c085818f4988b0333066e086cc42c64c609c43ebc29918b641ef0bde1"
Mar 08 04:16:30.703751 master-0 kubenswrapper[18592]: I0308 04:16:30.703161 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-hkfg8-config-vnthz"
Mar 08 04:16:30.709927 master-0 kubenswrapper[18592]: I0308 04:16:30.709816 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2166af23-aec1-40ee-9114-2a0ffa1c7f11","Type":"ContainerStarted","Data":"d42f048ac987888b62a0a080c53615a07fca22202ade3f172c522cedcee43f0f"}
Mar 08 04:16:30.709927 master-0 kubenswrapper[18592]: I0308 04:16:30.709895 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2166af23-aec1-40ee-9114-2a0ffa1c7f11","Type":"ContainerStarted","Data":"7039fe8efa442c1b4cc65bede1eae203d8ba43bc4ed9ece14d78dbf4b9318420"}
Mar 08 04:16:30.709927 master-0 kubenswrapper[18592]: I0308 04:16:30.709905 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2166af23-aec1-40ee-9114-2a0ffa1c7f11","Type":"ContainerStarted","Data":"53d9c222286ce0f59ce13290ad53bcf4910607a968b1d39ede391de75916fa84"}
Mar 08 04:16:30.775654 master-0 kubenswrapper[18592]: I0308 04:16:30.775621 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-f2htj"]
Mar 08 04:16:30.868073 master-0 kubenswrapper[18592]: I0308 04:16:30.861355 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-7954b" podStartSLOduration=1.861331048 podStartE2EDuration="1.861331048s" podCreationTimestamp="2026-03-08 04:16:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 04:16:30.85294594 +0000 UTC m=+1402.951700290" watchObservedRunningTime="2026-03-08 04:16:30.861331048 +0000 UTC m=+1402.960085398"
Mar 08 04:16:30.887683 master-0 kubenswrapper[18592]: I0308 04:16:30.887443 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-k9b8j"]
Mar 08 04:16:30.954064 master-0 kubenswrapper[18592]: I0308 04:16:30.954000 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-3190-account-create-update-p2mgv"]
Mar 08 04:16:30.958521 master-0 kubenswrapper[18592]: W0308 04:16:30.958475 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode7105244_a23e_43d2_9d49_3cb1b0edf604.slice/crio-f7a3bffae5901f91b3f3126a7869b6557b28904ba182054e0797fc7576944011 WatchSource:0}: Error finding container f7a3bffae5901f91b3f3126a7869b6557b28904ba182054e0797fc7576944011: Status 404 returned error can't find the container with id f7a3bffae5901f91b3f3126a7869b6557b28904ba182054e0797fc7576944011
Mar 08 04:16:31.236596 master-0 kubenswrapper[18592]: I0308 04:16:31.236549 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-hkfg8-config-vnthz"]
Mar 08 04:16:31.254932 master-0 kubenswrapper[18592]: I0308 04:16:31.254897 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-hkfg8-config-vnthz"]
Mar 08 04:16:31.726509 master-0 kubenswrapper[18592]: I0308 04:16:31.725772 18592 generic.go:334] "Generic (PLEG): container finished" podID="7e36975b-0c3c-492b-bce9-60a58ceeaf47" containerID="4dfcd2877205aebea28a238cd509c700166963d1bb10dd5b57731fe5381fb617" exitCode=0
Mar 08 04:16:31.726509 master-0 kubenswrapper[18592]: I0308 04:16:31.725882 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a1de-account-create-update-2qztj" event={"ID":"7e36975b-0c3c-492b-bce9-60a58ceeaf47","Type":"ContainerDied","Data":"4dfcd2877205aebea28a238cd509c700166963d1bb10dd5b57731fe5381fb617"}
Mar 08 04:16:31.728630 master-0 kubenswrapper[18592]: I0308 04:16:31.728494 18592 generic.go:334] "Generic (PLEG): container finished" podID="f4af0f8d-1061-4359-ad23-aa58a643206d" containerID="9f789bd4f6b5c3120378d674cd8d7777595ed8a68da18849338703d979a42656" exitCode=0
Mar 08 04:16:31.728630 master-0 kubenswrapper[18592]: I0308 04:16:31.728598 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-7954b" event={"ID":"f4af0f8d-1061-4359-ad23-aa58a643206d","Type":"ContainerDied","Data":"9f789bd4f6b5c3120378d674cd8d7777595ed8a68da18849338703d979a42656"}
Mar 08 04:16:31.739104 master-0 kubenswrapper[18592]: I0308 04:16:31.738946 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2166af23-aec1-40ee-9114-2a0ffa1c7f11","Type":"ContainerStarted","Data":"1deddbdd1fc2eb5642ff0725c9e0dceb496b2e8fe9bfea8b14a62243b539a126"}
Mar 08 04:16:31.739104 master-0 kubenswrapper[18592]: I0308 04:16:31.739029 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2166af23-aec1-40ee-9114-2a0ffa1c7f11","Type":"ContainerStarted","Data":"1c0e43a00ac9dc1902a83190576121b9292810ea5cd16712b8364a70245cecdb"}
Mar 08 04:16:31.739104 master-0 kubenswrapper[18592]: I0308 04:16:31.739055 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2166af23-aec1-40ee-9114-2a0ffa1c7f11","Type":"ContainerStarted","Data":"de2b6cc0bb1cb9c1d6d25f400daf6265abfdc480d83a6b4e1e088476b48f6af1"}
Mar 08 04:16:31.739104 master-0 kubenswrapper[18592]: I0308 04:16:31.739073 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2166af23-aec1-40ee-9114-2a0ffa1c7f11","Type":"ContainerStarted","Data":"ca2cec96dfbfce3d2f203589672a00b01a8ed80711f717fe35d398c002a91c5b"}
Mar 08 04:16:31.752010 master-0 kubenswrapper[18592]: I0308 04:16:31.751372 18592 generic.go:334] "Generic (PLEG): container finished" podID="e7105244-a23e-43d2-9d49-3cb1b0edf604" containerID="2d05c434f6340493ea5c71599940121bca0203f9dbd9d148e7d7cd6836dd18dd" exitCode=0
Mar 08 04:16:31.752010 master-0 kubenswrapper[18592]: I0308 04:16:31.751488 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-3190-account-create-update-p2mgv" event={"ID":"e7105244-a23e-43d2-9d49-3cb1b0edf604","Type":"ContainerDied","Data":"2d05c434f6340493ea5c71599940121bca0203f9dbd9d148e7d7cd6836dd18dd"}
Mar 08 04:16:31.752010 master-0 kubenswrapper[18592]: I0308 04:16:31.751521 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-3190-account-create-update-p2mgv" event={"ID":"e7105244-a23e-43d2-9d49-3cb1b0edf604","Type":"ContainerStarted","Data":"f7a3bffae5901f91b3f3126a7869b6557b28904ba182054e0797fc7576944011"}
Mar 08 04:16:31.761049 master-0 kubenswrapper[18592]: I0308 04:16:31.760815 18592 generic.go:334] "Generic (PLEG): container finished" podID="30a31504-5cfd-45af-ae76-d76a8fdb816a" containerID="98456ce5c9197901e88dfeb6003bdec1816e2f7af6cb2c9c669b43b5b248a93c" exitCode=0
Mar 08 04:16:31.761049 master-0 kubenswrapper[18592]: I0308 04:16:31.760879 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-k9b8j" event={"ID":"30a31504-5cfd-45af-ae76-d76a8fdb816a","Type":"ContainerDied","Data":"98456ce5c9197901e88dfeb6003bdec1816e2f7af6cb2c9c669b43b5b248a93c"}
Mar 08 04:16:31.761049 master-0 kubenswrapper[18592]: I0308 04:16:31.760947 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-k9b8j" event={"ID":"30a31504-5cfd-45af-ae76-d76a8fdb816a","Type":"ContainerStarted","Data":"7adfdac770b9bde17b38bb0ce2d3b06ed5373b6cddd247a85601300be0ad41a4"}
Mar 08 04:16:31.763025 master-0 kubenswrapper[18592]: I0308 04:16:31.762623 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-f2htj" event={"ID":"b3d9ef75-99d9-448e-9fc5-e14b49976a0a","Type":"ContainerStarted","Data":"ac86bd424a5ec35888de418aff6287152839195f9152f18267e2fee4d1752f3c"}
Mar 08 04:16:31.805844 master-0 kubenswrapper[18592]: I0308 04:16:31.804418 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=27.845322535 podStartE2EDuration="34.804399215s" podCreationTimestamp="2026-03-08 04:15:57 +0000 UTC" firstStartedPulling="2026-03-08 04:16:22.584337386 +0000 UTC m=+1394.683091736" lastFinishedPulling="2026-03-08 04:16:29.543414066 +0000 UTC m=+1401.642168416" observedRunningTime="2026-03-08 04:16:31.803712006 +0000 UTC m=+1403.902466376" watchObservedRunningTime="2026-03-08 04:16:31.804399215 +0000 UTC m=+1403.903153565"
Mar 08 04:16:32.121193 master-0 kubenswrapper[18592]: I0308 04:16:32.121047 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-559976cb87-w8qr8"]
Mar 08 04:16:32.121654 master-0 kubenswrapper[18592]: E0308 04:16:32.121594 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7eaf38ef-a74d-4385-abbe-c8c8e285bc73" containerName="ovn-config"
Mar 08 04:16:32.121654 master-0 kubenswrapper[18592]: I0308 04:16:32.121617 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="7eaf38ef-a74d-4385-abbe-c8c8e285bc73" containerName="ovn-config"
Mar 08 04:16:32.121910 master-0 kubenswrapper[18592]: I0308 04:16:32.121868 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="7eaf38ef-a74d-4385-abbe-c8c8e285bc73" containerName="ovn-config"
Mar 08 04:16:32.122949 master-0 kubenswrapper[18592]: I0308 04:16:32.122916 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-559976cb87-w8qr8"
Mar 08 04:16:32.139019 master-0 kubenswrapper[18592]: I0308 04:16:32.138951 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-559976cb87-w8qr8"]
Mar 08 04:16:32.162420 master-0 kubenswrapper[18592]: I0308 04:16:32.162381 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0"
Mar 08 04:16:32.193129 master-0 kubenswrapper[18592]: I0308 04:16:32.192917 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7eaf38ef-a74d-4385-abbe-c8c8e285bc73" path="/var/lib/kubelet/pods/7eaf38ef-a74d-4385-abbe-c8c8e285bc73/volumes"
Mar 08 04:16:32.229643 master-0 kubenswrapper[18592]: I0308 04:16:32.229597 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e9d032c7-fe18-4154-8f71-72adc7ea82bc-ovsdbserver-sb\") pod \"dnsmasq-dns-559976cb87-w8qr8\" (UID: \"e9d032c7-fe18-4154-8f71-72adc7ea82bc\") " pod="openstack/dnsmasq-dns-559976cb87-w8qr8"
Mar 08 04:16:32.230055 master-0 kubenswrapper[18592]: I0308 04:16:32.230037 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wn5s\" (UniqueName: \"kubernetes.io/projected/e9d032c7-fe18-4154-8f71-72adc7ea82bc-kube-api-access-2wn5s\") pod \"dnsmasq-dns-559976cb87-w8qr8\" (UID: \"e9d032c7-fe18-4154-8f71-72adc7ea82bc\") " pod="openstack/dnsmasq-dns-559976cb87-w8qr8"
Mar 08 04:16:32.230237 master-0 kubenswrapper[18592]: I0308 04:16:32.230217 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e9d032c7-fe18-4154-8f71-72adc7ea82bc-ovsdbserver-nb\") pod \"dnsmasq-dns-559976cb87-w8qr8\" (UID: \"e9d032c7-fe18-4154-8f71-72adc7ea82bc\") " pod="openstack/dnsmasq-dns-559976cb87-w8qr8"
Mar 08 04:16:32.230424 master-0 kubenswrapper[18592]: I0308 04:16:32.230402 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9d032c7-fe18-4154-8f71-72adc7ea82bc-config\") pod \"dnsmasq-dns-559976cb87-w8qr8\" (UID: \"e9d032c7-fe18-4154-8f71-72adc7ea82bc\") " pod="openstack/dnsmasq-dns-559976cb87-w8qr8"
Mar 08 04:16:32.230560 master-0 kubenswrapper[18592]: I0308 04:16:32.230543 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e9d032c7-fe18-4154-8f71-72adc7ea82bc-dns-swift-storage-0\") pod \"dnsmasq-dns-559976cb87-w8qr8\" (UID: \"e9d032c7-fe18-4154-8f71-72adc7ea82bc\") " pod="openstack/dnsmasq-dns-559976cb87-w8qr8"
Mar 08 04:16:32.230728 master-0 kubenswrapper[18592]: I0308 04:16:32.230709 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e9d032c7-fe18-4154-8f71-72adc7ea82bc-dns-svc\") pod \"dnsmasq-dns-559976cb87-w8qr8\" (UID: \"e9d032c7-fe18-4154-8f71-72adc7ea82bc\") " pod="openstack/dnsmasq-dns-559976cb87-w8qr8"
Mar 08 04:16:32.332895 master-0 kubenswrapper[18592]: I0308 04:16:32.332487 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9d032c7-fe18-4154-8f71-72adc7ea82bc-config\") pod \"dnsmasq-dns-559976cb87-w8qr8\" (UID: \"e9d032c7-fe18-4154-8f71-72adc7ea82bc\") " pod="openstack/dnsmasq-dns-559976cb87-w8qr8"
Mar 08 04:16:32.332895 master-0 kubenswrapper[18592]: I0308 04:16:32.332564 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e9d032c7-fe18-4154-8f71-72adc7ea82bc-dns-swift-storage-0\") pod \"dnsmasq-dns-559976cb87-w8qr8\" (UID: \"e9d032c7-fe18-4154-8f71-72adc7ea82bc\") " pod="openstack/dnsmasq-dns-559976cb87-w8qr8"
Mar 08 04:16:32.332895 master-0 kubenswrapper[18592]: I0308 04:16:32.332597 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e9d032c7-fe18-4154-8f71-72adc7ea82bc-dns-svc\") pod \"dnsmasq-dns-559976cb87-w8qr8\" (UID: \"e9d032c7-fe18-4154-8f71-72adc7ea82bc\") " pod="openstack/dnsmasq-dns-559976cb87-w8qr8"
Mar 08 04:16:32.332895 master-0 kubenswrapper[18592]: I0308 04:16:32.332661 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e9d032c7-fe18-4154-8f71-72adc7ea82bc-ovsdbserver-sb\") pod \"dnsmasq-dns-559976cb87-w8qr8\" (UID: \"e9d032c7-fe18-4154-8f71-72adc7ea82bc\") " pod="openstack/dnsmasq-dns-559976cb87-w8qr8"
Mar 08 04:16:32.332895 master-0 kubenswrapper[18592]: I0308 04:16:32.332783 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wn5s\" (UniqueName: \"kubernetes.io/projected/e9d032c7-fe18-4154-8f71-72adc7ea82bc-kube-api-access-2wn5s\") pod \"dnsmasq-dns-559976cb87-w8qr8\" (UID: \"e9d032c7-fe18-4154-8f71-72adc7ea82bc\") " pod="openstack/dnsmasq-dns-559976cb87-w8qr8"
Mar 08 04:16:32.332895 master-0 kubenswrapper[18592]: I0308 04:16:32.332876 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e9d032c7-fe18-4154-8f71-72adc7ea82bc-ovsdbserver-nb\") pod \"dnsmasq-dns-559976cb87-w8qr8\" (UID: \"e9d032c7-fe18-4154-8f71-72adc7ea82bc\") " pod="openstack/dnsmasq-dns-559976cb87-w8qr8"
Mar 08 04:16:32.335089 master-0 kubenswrapper[18592]: I0308 04:16:32.333634 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9d032c7-fe18-4154-8f71-72adc7ea82bc-config\") pod \"dnsmasq-dns-559976cb87-w8qr8\" (UID: \"e9d032c7-fe18-4154-8f71-72adc7ea82bc\") " pod="openstack/dnsmasq-dns-559976cb87-w8qr8"
Mar 08 04:16:32.335089 master-0 kubenswrapper[18592]: I0308 04:16:32.333752 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e9d032c7-fe18-4154-8f71-72adc7ea82bc-dns-swift-storage-0\") pod \"dnsmasq-dns-559976cb87-w8qr8\" (UID: \"e9d032c7-fe18-4154-8f71-72adc7ea82bc\") " pod="openstack/dnsmasq-dns-559976cb87-w8qr8"
Mar 08 04:16:32.335089 master-0 kubenswrapper[18592]: I0308 04:16:32.333893 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e9d032c7-fe18-4154-8f71-72adc7ea82bc-ovsdbserver-nb\") pod \"dnsmasq-dns-559976cb87-w8qr8\" (UID: \"e9d032c7-fe18-4154-8f71-72adc7ea82bc\") " pod="openstack/dnsmasq-dns-559976cb87-w8qr8"
Mar 08 04:16:32.335089 master-0 kubenswrapper[18592]: I0308 04:16:32.334443 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e9d032c7-fe18-4154-8f71-72adc7ea82bc-ovsdbserver-sb\") pod \"dnsmasq-dns-559976cb87-w8qr8\" (UID: \"e9d032c7-fe18-4154-8f71-72adc7ea82bc\") " pod="openstack/dnsmasq-dns-559976cb87-w8qr8"
Mar 08 04:16:32.335246 master-0 kubenswrapper[18592]: I0308 04:16:32.335110 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e9d032c7-fe18-4154-8f71-72adc7ea82bc-dns-svc\") pod \"dnsmasq-dns-559976cb87-w8qr8\" (UID: \"e9d032c7-fe18-4154-8f71-72adc7ea82bc\") " pod="openstack/dnsmasq-dns-559976cb87-w8qr8"
Mar 08 04:16:32.358161 master-0 kubenswrapper[18592]: I0308 04:16:32.358122 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wn5s\" (UniqueName: \"kubernetes.io/projected/e9d032c7-fe18-4154-8f71-72adc7ea82bc-kube-api-access-2wn5s\") pod \"dnsmasq-dns-559976cb87-w8qr8\" (UID: \"e9d032c7-fe18-4154-8f71-72adc7ea82bc\") " pod="openstack/dnsmasq-dns-559976cb87-w8qr8"
Mar 08 04:16:32.488359 master-0 kubenswrapper[18592]: I0308 04:16:32.488290 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-559976cb87-w8qr8"
Mar 08 04:16:33.056008 master-0 kubenswrapper[18592]: I0308 04:16:33.055902 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-559976cb87-w8qr8"]
Mar 08 04:16:35.762101 master-0 kubenswrapper[18592]: W0308 04:16:35.762031 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode9d032c7_fe18_4154_8f71_72adc7ea82bc.slice/crio-843db7c333adb3615e0adf0d2ea71a241d409c6db8b20f9035ab327185ec3631 WatchSource:0}: Error finding container 843db7c333adb3615e0adf0d2ea71a241d409c6db8b20f9035ab327185ec3631: Status 404 returned error can't find the container with id 843db7c333adb3615e0adf0d2ea71a241d409c6db8b20f9035ab327185ec3631
Mar 08 04:16:35.822751 master-0 kubenswrapper[18592]: I0308 04:16:35.822683 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-k9b8j" event={"ID":"30a31504-5cfd-45af-ae76-d76a8fdb816a","Type":"ContainerDied","Data":"7adfdac770b9bde17b38bb0ce2d3b06ed5373b6cddd247a85601300be0ad41a4"}
Mar 08 04:16:35.822751 master-0 kubenswrapper[18592]: I0308 04:16:35.822722 18592 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7adfdac770b9bde17b38bb0ce2d3b06ed5373b6cddd247a85601300be0ad41a4"
Mar 08 04:16:35.825615 master-0 kubenswrapper[18592]: I0308 04:16:35.825545 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a1de-account-create-update-2qztj" event={"ID":"7e36975b-0c3c-492b-bce9-60a58ceeaf47","Type":"ContainerDied","Data":"281cb72a1a27d7e9a9fa61b88ffcc091b2bb1ea65cf45ecd13318dbfe49dcd0e"}
Mar 08 04:16:35.825615 master-0 kubenswrapper[18592]: I0308 04:16:35.825567 18592 pod_container_deletor.go:80] "Container not found
in pod's containers" containerID="281cb72a1a27d7e9a9fa61b88ffcc091b2bb1ea65cf45ecd13318dbfe49dcd0e" Mar 08 04:16:35.829206 master-0 kubenswrapper[18592]: I0308 04:16:35.829136 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-7954b" event={"ID":"f4af0f8d-1061-4359-ad23-aa58a643206d","Type":"ContainerDied","Data":"367c18d4643ce69a245ea67a93e0561dad2f751a133d411c85c4636fd4e3619c"} Mar 08 04:16:35.829206 master-0 kubenswrapper[18592]: I0308 04:16:35.829184 18592 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="367c18d4643ce69a245ea67a93e0561dad2f751a133d411c85c4636fd4e3619c" Mar 08 04:16:35.831009 master-0 kubenswrapper[18592]: I0308 04:16:35.830947 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-559976cb87-w8qr8" event={"ID":"e9d032c7-fe18-4154-8f71-72adc7ea82bc","Type":"ContainerStarted","Data":"843db7c333adb3615e0adf0d2ea71a241d409c6db8b20f9035ab327185ec3631"} Mar 08 04:16:35.834556 master-0 kubenswrapper[18592]: I0308 04:16:35.834500 18592 generic.go:334] "Generic (PLEG): container finished" podID="ce83c2f4-a8a4-4cda-ae8c-3cb1197decca" containerID="82a5ba7a653d1cea9eca76348be272af00817196df0f66112fccb1c20e585194" exitCode=0 Mar 08 04:16:35.834654 master-0 kubenswrapper[18592]: I0308 04:16:35.834615 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-7wxjf" event={"ID":"ce83c2f4-a8a4-4cda-ae8c-3cb1197decca","Type":"ContainerDied","Data":"82a5ba7a653d1cea9eca76348be272af00817196df0f66112fccb1c20e585194"} Mar 08 04:16:35.849994 master-0 kubenswrapper[18592]: I0308 04:16:35.844152 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-3190-account-create-update-p2mgv" event={"ID":"e7105244-a23e-43d2-9d49-3cb1b0edf604","Type":"ContainerDied","Data":"f7a3bffae5901f91b3f3126a7869b6557b28904ba182054e0797fc7576944011"} Mar 08 04:16:35.849994 master-0 kubenswrapper[18592]: I0308 04:16:35.844234 18592 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7a3bffae5901f91b3f3126a7869b6557b28904ba182054e0797fc7576944011" Mar 08 04:16:36.133036 master-0 kubenswrapper[18592]: I0308 04:16:36.132923 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-k9b8j" Mar 08 04:16:36.209677 master-0 kubenswrapper[18592]: I0308 04:16:36.209623 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-3190-account-create-update-p2mgv" Mar 08 04:16:36.218384 master-0 kubenswrapper[18592]: I0308 04:16:36.218104 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-7954b" Mar 08 04:16:36.237762 master-0 kubenswrapper[18592]: I0308 04:16:36.234934 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tptdb\" (UniqueName: \"kubernetes.io/projected/30a31504-5cfd-45af-ae76-d76a8fdb816a-kube-api-access-tptdb\") pod \"30a31504-5cfd-45af-ae76-d76a8fdb816a\" (UID: \"30a31504-5cfd-45af-ae76-d76a8fdb816a\") " Mar 08 04:16:36.237762 master-0 kubenswrapper[18592]: I0308 04:16:36.235015 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4af0f8d-1061-4359-ad23-aa58a643206d-operator-scripts\") pod \"f4af0f8d-1061-4359-ad23-aa58a643206d\" (UID: \"f4af0f8d-1061-4359-ad23-aa58a643206d\") " Mar 08 04:16:36.237762 master-0 kubenswrapper[18592]: I0308 04:16:36.235078 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30a31504-5cfd-45af-ae76-d76a8fdb816a-operator-scripts\") pod \"30a31504-5cfd-45af-ae76-d76a8fdb816a\" (UID: \"30a31504-5cfd-45af-ae76-d76a8fdb816a\") " Mar 08 04:16:36.237762 master-0 kubenswrapper[18592]: I0308 04:16:36.235279 18592 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7105244-a23e-43d2-9d49-3cb1b0edf604-operator-scripts\") pod \"e7105244-a23e-43d2-9d49-3cb1b0edf604\" (UID: \"e7105244-a23e-43d2-9d49-3cb1b0edf604\") " Mar 08 04:16:36.237762 master-0 kubenswrapper[18592]: I0308 04:16:36.235395 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shn69\" (UniqueName: \"kubernetes.io/projected/e7105244-a23e-43d2-9d49-3cb1b0edf604-kube-api-access-shn69\") pod \"e7105244-a23e-43d2-9d49-3cb1b0edf604\" (UID: \"e7105244-a23e-43d2-9d49-3cb1b0edf604\") " Mar 08 04:16:36.237762 master-0 kubenswrapper[18592]: I0308 04:16:36.235618 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30a31504-5cfd-45af-ae76-d76a8fdb816a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "30a31504-5cfd-45af-ae76-d76a8fdb816a" (UID: "30a31504-5cfd-45af-ae76-d76a8fdb816a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:16:36.237762 master-0 kubenswrapper[18592]: I0308 04:16:36.236871 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4af0f8d-1061-4359-ad23-aa58a643206d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f4af0f8d-1061-4359-ad23-aa58a643206d" (UID: "f4af0f8d-1061-4359-ad23-aa58a643206d"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:16:36.259872 master-0 kubenswrapper[18592]: I0308 04:16:36.259809 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zh4mk\" (UniqueName: \"kubernetes.io/projected/f4af0f8d-1061-4359-ad23-aa58a643206d-kube-api-access-zh4mk\") pod \"f4af0f8d-1061-4359-ad23-aa58a643206d\" (UID: \"f4af0f8d-1061-4359-ad23-aa58a643206d\") " Mar 08 04:16:36.263565 master-0 kubenswrapper[18592]: I0308 04:16:36.263517 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7105244-a23e-43d2-9d49-3cb1b0edf604-kube-api-access-shn69" (OuterVolumeSpecName: "kube-api-access-shn69") pod "e7105244-a23e-43d2-9d49-3cb1b0edf604" (UID: "e7105244-a23e-43d2-9d49-3cb1b0edf604"). InnerVolumeSpecName "kube-api-access-shn69". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 04:16:36.266049 master-0 kubenswrapper[18592]: I0308 04:16:36.266016 18592 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-a1de-account-create-update-2qztj" Mar 08 04:16:36.267325 master-0 kubenswrapper[18592]: I0308 04:16:36.267281 18592 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4af0f8d-1061-4359-ad23-aa58a643206d-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 04:16:36.267398 master-0 kubenswrapper[18592]: I0308 04:16:36.267342 18592 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30a31504-5cfd-45af-ae76-d76a8fdb816a-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 04:16:36.267398 master-0 kubenswrapper[18592]: I0308 04:16:36.267359 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shn69\" (UniqueName: \"kubernetes.io/projected/e7105244-a23e-43d2-9d49-3cb1b0edf604-kube-api-access-shn69\") on node \"master-0\" DevicePath \"\"" Mar 08 04:16:36.282877 master-0 kubenswrapper[18592]: I0308 04:16:36.282813 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4af0f8d-1061-4359-ad23-aa58a643206d-kube-api-access-zh4mk" (OuterVolumeSpecName: "kube-api-access-zh4mk") pod "f4af0f8d-1061-4359-ad23-aa58a643206d" (UID: "f4af0f8d-1061-4359-ad23-aa58a643206d"). InnerVolumeSpecName "kube-api-access-zh4mk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 04:16:36.286987 master-0 kubenswrapper[18592]: I0308 04:16:36.286800 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30a31504-5cfd-45af-ae76-d76a8fdb816a-kube-api-access-tptdb" (OuterVolumeSpecName: "kube-api-access-tptdb") pod "30a31504-5cfd-45af-ae76-d76a8fdb816a" (UID: "30a31504-5cfd-45af-ae76-d76a8fdb816a"). InnerVolumeSpecName "kube-api-access-tptdb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 04:16:36.295180 master-0 kubenswrapper[18592]: I0308 04:16:36.295115 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7105244-a23e-43d2-9d49-3cb1b0edf604-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e7105244-a23e-43d2-9d49-3cb1b0edf604" (UID: "e7105244-a23e-43d2-9d49-3cb1b0edf604"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:16:36.370906 master-0 kubenswrapper[18592]: I0308 04:16:36.370249 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e36975b-0c3c-492b-bce9-60a58ceeaf47-operator-scripts\") pod \"7e36975b-0c3c-492b-bce9-60a58ceeaf47\" (UID: \"7e36975b-0c3c-492b-bce9-60a58ceeaf47\") " Mar 08 04:16:36.370906 master-0 kubenswrapper[18592]: I0308 04:16:36.370602 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5bkn\" (UniqueName: \"kubernetes.io/projected/7e36975b-0c3c-492b-bce9-60a58ceeaf47-kube-api-access-j5bkn\") pod \"7e36975b-0c3c-492b-bce9-60a58ceeaf47\" (UID: \"7e36975b-0c3c-492b-bce9-60a58ceeaf47\") " Mar 08 04:16:36.371431 master-0 kubenswrapper[18592]: I0308 04:16:36.371381 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e36975b-0c3c-492b-bce9-60a58ceeaf47-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7e36975b-0c3c-492b-bce9-60a58ceeaf47" (UID: "7e36975b-0c3c-492b-bce9-60a58ceeaf47"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:16:36.372567 master-0 kubenswrapper[18592]: I0308 04:16:36.372533 18592 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e36975b-0c3c-492b-bce9-60a58ceeaf47-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 04:16:36.372567 master-0 kubenswrapper[18592]: I0308 04:16:36.372564 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zh4mk\" (UniqueName: \"kubernetes.io/projected/f4af0f8d-1061-4359-ad23-aa58a643206d-kube-api-access-zh4mk\") on node \"master-0\" DevicePath \"\"" Mar 08 04:16:36.372696 master-0 kubenswrapper[18592]: I0308 04:16:36.372580 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tptdb\" (UniqueName: \"kubernetes.io/projected/30a31504-5cfd-45af-ae76-d76a8fdb816a-kube-api-access-tptdb\") on node \"master-0\" DevicePath \"\"" Mar 08 04:16:36.372696 master-0 kubenswrapper[18592]: I0308 04:16:36.372592 18592 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7105244-a23e-43d2-9d49-3cb1b0edf604-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 04:16:36.375557 master-0 kubenswrapper[18592]: I0308 04:16:36.375517 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e36975b-0c3c-492b-bce9-60a58ceeaf47-kube-api-access-j5bkn" (OuterVolumeSpecName: "kube-api-access-j5bkn") pod "7e36975b-0c3c-492b-bce9-60a58ceeaf47" (UID: "7e36975b-0c3c-492b-bce9-60a58ceeaf47"). InnerVolumeSpecName "kube-api-access-j5bkn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 04:16:36.473343 master-0 kubenswrapper[18592]: I0308 04:16:36.473278 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5bkn\" (UniqueName: \"kubernetes.io/projected/7e36975b-0c3c-492b-bce9-60a58ceeaf47-kube-api-access-j5bkn\") on node \"master-0\" DevicePath \"\"" Mar 08 04:16:36.866783 master-0 kubenswrapper[18592]: I0308 04:16:36.862255 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-f2htj" event={"ID":"b3d9ef75-99d9-448e-9fc5-e14b49976a0a","Type":"ContainerStarted","Data":"6c4cfb4394d4b2d083f3fdd5d6040a7cb6d0ea33dc2f9b2a15af4fadfce19341"} Mar 08 04:16:36.866783 master-0 kubenswrapper[18592]: I0308 04:16:36.865237 18592 generic.go:334] "Generic (PLEG): container finished" podID="e9d032c7-fe18-4154-8f71-72adc7ea82bc" containerID="ebbd83913f92799481112aea154f68c8ef5b16e5df8b389e8f7d1e26123db09e" exitCode=0 Mar 08 04:16:36.866783 master-0 kubenswrapper[18592]: I0308 04:16:36.865411 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-a1de-account-create-update-2qztj" Mar 08 04:16:36.866783 master-0 kubenswrapper[18592]: I0308 04:16:36.865469 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-k9b8j" Mar 08 04:16:36.866783 master-0 kubenswrapper[18592]: I0308 04:16:36.865474 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-559976cb87-w8qr8" event={"ID":"e9d032c7-fe18-4154-8f71-72adc7ea82bc","Type":"ContainerDied","Data":"ebbd83913f92799481112aea154f68c8ef5b16e5df8b389e8f7d1e26123db09e"} Mar 08 04:16:36.866783 master-0 kubenswrapper[18592]: I0308 04:16:36.865573 18592 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-3190-account-create-update-p2mgv" Mar 08 04:16:36.866783 master-0 kubenswrapper[18592]: I0308 04:16:36.865600 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-7954b" Mar 08 04:16:36.914684 master-0 kubenswrapper[18592]: I0308 04:16:36.914533 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-f2htj" podStartSLOduration=2.872742581 podStartE2EDuration="7.914507533s" podCreationTimestamp="2026-03-08 04:16:29 +0000 UTC" firstStartedPulling="2026-03-08 04:16:30.773111691 +0000 UTC m=+1402.871866041" lastFinishedPulling="2026-03-08 04:16:35.814876603 +0000 UTC m=+1407.913630993" observedRunningTime="2026-03-08 04:16:36.901688505 +0000 UTC m=+1409.000442875" watchObservedRunningTime="2026-03-08 04:16:36.914507533 +0000 UTC m=+1409.013261903" Mar 08 04:16:37.413590 master-0 kubenswrapper[18592]: I0308 04:16:37.413542 18592 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-7wxjf" Mar 08 04:16:37.499209 master-0 kubenswrapper[18592]: I0308 04:16:37.498556 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brl7d\" (UniqueName: \"kubernetes.io/projected/ce83c2f4-a8a4-4cda-ae8c-3cb1197decca-kube-api-access-brl7d\") pod \"ce83c2f4-a8a4-4cda-ae8c-3cb1197decca\" (UID: \"ce83c2f4-a8a4-4cda-ae8c-3cb1197decca\") " Mar 08 04:16:37.499209 master-0 kubenswrapper[18592]: I0308 04:16:37.498635 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ce83c2f4-a8a4-4cda-ae8c-3cb1197decca-db-sync-config-data\") pod \"ce83c2f4-a8a4-4cda-ae8c-3cb1197decca\" (UID: \"ce83c2f4-a8a4-4cda-ae8c-3cb1197decca\") " Mar 08 04:16:37.499209 master-0 kubenswrapper[18592]: I0308 04:16:37.498708 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce83c2f4-a8a4-4cda-ae8c-3cb1197decca-config-data\") pod \"ce83c2f4-a8a4-4cda-ae8c-3cb1197decca\" (UID: \"ce83c2f4-a8a4-4cda-ae8c-3cb1197decca\") " Mar 08 04:16:37.499209 master-0 kubenswrapper[18592]: I0308 04:16:37.498814 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce83c2f4-a8a4-4cda-ae8c-3cb1197decca-combined-ca-bundle\") pod \"ce83c2f4-a8a4-4cda-ae8c-3cb1197decca\" (UID: \"ce83c2f4-a8a4-4cda-ae8c-3cb1197decca\") " Mar 08 04:16:37.502955 master-0 kubenswrapper[18592]: I0308 04:16:37.502892 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce83c2f4-a8a4-4cda-ae8c-3cb1197decca-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "ce83c2f4-a8a4-4cda-ae8c-3cb1197decca" (UID: "ce83c2f4-a8a4-4cda-ae8c-3cb1197decca"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:16:37.504654 master-0 kubenswrapper[18592]: I0308 04:16:37.504595 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce83c2f4-a8a4-4cda-ae8c-3cb1197decca-kube-api-access-brl7d" (OuterVolumeSpecName: "kube-api-access-brl7d") pod "ce83c2f4-a8a4-4cda-ae8c-3cb1197decca" (UID: "ce83c2f4-a8a4-4cda-ae8c-3cb1197decca"). InnerVolumeSpecName "kube-api-access-brl7d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 04:16:37.533085 master-0 kubenswrapper[18592]: I0308 04:16:37.533015 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce83c2f4-a8a4-4cda-ae8c-3cb1197decca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ce83c2f4-a8a4-4cda-ae8c-3cb1197decca" (UID: "ce83c2f4-a8a4-4cda-ae8c-3cb1197decca"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:16:37.553308 master-0 kubenswrapper[18592]: I0308 04:16:37.553239 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce83c2f4-a8a4-4cda-ae8c-3cb1197decca-config-data" (OuterVolumeSpecName: "config-data") pod "ce83c2f4-a8a4-4cda-ae8c-3cb1197decca" (UID: "ce83c2f4-a8a4-4cda-ae8c-3cb1197decca"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:16:37.601012 master-0 kubenswrapper[18592]: I0308 04:16:37.600955 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-brl7d\" (UniqueName: \"kubernetes.io/projected/ce83c2f4-a8a4-4cda-ae8c-3cb1197decca-kube-api-access-brl7d\") on node \"master-0\" DevicePath \"\"" Mar 08 04:16:37.601012 master-0 kubenswrapper[18592]: I0308 04:16:37.601003 18592 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ce83c2f4-a8a4-4cda-ae8c-3cb1197decca-db-sync-config-data\") on node \"master-0\" DevicePath \"\"" Mar 08 04:16:37.601012 master-0 kubenswrapper[18592]: I0308 04:16:37.601016 18592 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce83c2f4-a8a4-4cda-ae8c-3cb1197decca-config-data\") on node \"master-0\" DevicePath \"\"" Mar 08 04:16:37.601012 master-0 kubenswrapper[18592]: I0308 04:16:37.601028 18592 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce83c2f4-a8a4-4cda-ae8c-3cb1197decca-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 04:16:37.880037 master-0 kubenswrapper[18592]: I0308 04:16:37.879810 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-559976cb87-w8qr8" event={"ID":"e9d032c7-fe18-4154-8f71-72adc7ea82bc","Type":"ContainerStarted","Data":"799569e034c664d7bc5ca3625bd8165808dfa6ccc445ccad71153b61040b2d46"} Mar 08 04:16:37.880037 master-0 kubenswrapper[18592]: I0308 04:16:37.879986 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-559976cb87-w8qr8" Mar 08 04:16:37.882235 master-0 kubenswrapper[18592]: I0308 04:16:37.882149 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-7wxjf" 
event={"ID":"ce83c2f4-a8a4-4cda-ae8c-3cb1197decca","Type":"ContainerDied","Data":"8954344687b37d4d621e1ec7c090371aa16968cff24bae025b7d45a689e1a7f1"} Mar 08 04:16:37.882304 master-0 kubenswrapper[18592]: I0308 04:16:37.882244 18592 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8954344687b37d4d621e1ec7c090371aa16968cff24bae025b7d45a689e1a7f1" Mar 08 04:16:37.882348 master-0 kubenswrapper[18592]: I0308 04:16:37.882314 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-7wxjf" Mar 08 04:16:37.919993 master-0 kubenswrapper[18592]: I0308 04:16:37.919912 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-559976cb87-w8qr8" podStartSLOduration=5.919889483 podStartE2EDuration="5.919889483s" podCreationTimestamp="2026-03-08 04:16:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 04:16:37.919244355 +0000 UTC m=+1410.017998705" watchObservedRunningTime="2026-03-08 04:16:37.919889483 +0000 UTC m=+1410.018643843" Mar 08 04:16:38.466849 master-0 kubenswrapper[18592]: I0308 04:16:38.456896 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-559976cb87-w8qr8"] Mar 08 04:16:38.488846 master-0 kubenswrapper[18592]: I0308 04:16:38.482335 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f4bc7c77c-swkkv"] Mar 08 04:16:38.488846 master-0 kubenswrapper[18592]: E0308 04:16:38.482799 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7105244-a23e-43d2-9d49-3cb1b0edf604" containerName="mariadb-account-create-update" Mar 08 04:16:38.488846 master-0 kubenswrapper[18592]: I0308 04:16:38.482812 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7105244-a23e-43d2-9d49-3cb1b0edf604" containerName="mariadb-account-create-update" Mar 08 04:16:38.488846 master-0 
kubenswrapper[18592]: E0308 04:16:38.482883 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30a31504-5cfd-45af-ae76-d76a8fdb816a" containerName="mariadb-database-create" Mar 08 04:16:38.488846 master-0 kubenswrapper[18592]: I0308 04:16:38.482893 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="30a31504-5cfd-45af-ae76-d76a8fdb816a" containerName="mariadb-database-create" Mar 08 04:16:38.488846 master-0 kubenswrapper[18592]: E0308 04:16:38.482908 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce83c2f4-a8a4-4cda-ae8c-3cb1197decca" containerName="glance-db-sync" Mar 08 04:16:38.488846 master-0 kubenswrapper[18592]: I0308 04:16:38.482914 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce83c2f4-a8a4-4cda-ae8c-3cb1197decca" containerName="glance-db-sync" Mar 08 04:16:38.488846 master-0 kubenswrapper[18592]: E0308 04:16:38.482927 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e36975b-0c3c-492b-bce9-60a58ceeaf47" containerName="mariadb-account-create-update" Mar 08 04:16:38.488846 master-0 kubenswrapper[18592]: I0308 04:16:38.482932 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e36975b-0c3c-492b-bce9-60a58ceeaf47" containerName="mariadb-account-create-update" Mar 08 04:16:38.488846 master-0 kubenswrapper[18592]: E0308 04:16:38.482963 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4af0f8d-1061-4359-ad23-aa58a643206d" containerName="mariadb-database-create" Mar 08 04:16:38.488846 master-0 kubenswrapper[18592]: I0308 04:16:38.482969 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4af0f8d-1061-4359-ad23-aa58a643206d" containerName="mariadb-database-create" Mar 08 04:16:38.488846 master-0 kubenswrapper[18592]: I0308 04:16:38.483161 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="30a31504-5cfd-45af-ae76-d76a8fdb816a" containerName="mariadb-database-create" Mar 08 04:16:38.488846 master-0 
kubenswrapper[18592]: I0308 04:16:38.483187 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce83c2f4-a8a4-4cda-ae8c-3cb1197decca" containerName="glance-db-sync" Mar 08 04:16:38.488846 master-0 kubenswrapper[18592]: I0308 04:16:38.483214 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4af0f8d-1061-4359-ad23-aa58a643206d" containerName="mariadb-database-create" Mar 08 04:16:38.488846 master-0 kubenswrapper[18592]: I0308 04:16:38.483230 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e36975b-0c3c-492b-bce9-60a58ceeaf47" containerName="mariadb-account-create-update" Mar 08 04:16:38.488846 master-0 kubenswrapper[18592]: I0308 04:16:38.483248 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7105244-a23e-43d2-9d49-3cb1b0edf604" containerName="mariadb-account-create-update" Mar 08 04:16:38.488846 master-0 kubenswrapper[18592]: I0308 04:16:38.484304 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f4bc7c77c-swkkv" Mar 08 04:16:38.545286 master-0 kubenswrapper[18592]: I0308 04:16:38.524981 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 08 04:16:38.545286 master-0 kubenswrapper[18592]: I0308 04:16:38.525555 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f4bc7c77c-swkkv"] Mar 08 04:16:38.582845 master-0 kubenswrapper[18592]: I0308 04:16:38.571030 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/55996053-3eb1-4222-b62f-9fca82e2c6fc-ovsdbserver-sb\") pod \"dnsmasq-dns-5f4bc7c77c-swkkv\" (UID: \"55996053-3eb1-4222-b62f-9fca82e2c6fc\") " pod="openstack/dnsmasq-dns-5f4bc7c77c-swkkv" Mar 08 04:16:38.582845 master-0 kubenswrapper[18592]: I0308 04:16:38.571159 18592 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jrh6\" (UniqueName: \"kubernetes.io/projected/55996053-3eb1-4222-b62f-9fca82e2c6fc-kube-api-access-7jrh6\") pod \"dnsmasq-dns-5f4bc7c77c-swkkv\" (UID: \"55996053-3eb1-4222-b62f-9fca82e2c6fc\") " pod="openstack/dnsmasq-dns-5f4bc7c77c-swkkv" Mar 08 04:16:38.582845 master-0 kubenswrapper[18592]: I0308 04:16:38.571223 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/55996053-3eb1-4222-b62f-9fca82e2c6fc-ovsdbserver-nb\") pod \"dnsmasq-dns-5f4bc7c77c-swkkv\" (UID: \"55996053-3eb1-4222-b62f-9fca82e2c6fc\") " pod="openstack/dnsmasq-dns-5f4bc7c77c-swkkv" Mar 08 04:16:38.582845 master-0 kubenswrapper[18592]: I0308 04:16:38.571259 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/55996053-3eb1-4222-b62f-9fca82e2c6fc-dns-swift-storage-0\") pod \"dnsmasq-dns-5f4bc7c77c-swkkv\" (UID: \"55996053-3eb1-4222-b62f-9fca82e2c6fc\") " pod="openstack/dnsmasq-dns-5f4bc7c77c-swkkv" Mar 08 04:16:38.582845 master-0 kubenswrapper[18592]: I0308 04:16:38.571308 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/55996053-3eb1-4222-b62f-9fca82e2c6fc-dns-svc\") pod \"dnsmasq-dns-5f4bc7c77c-swkkv\" (UID: \"55996053-3eb1-4222-b62f-9fca82e2c6fc\") " pod="openstack/dnsmasq-dns-5f4bc7c77c-swkkv" Mar 08 04:16:38.582845 master-0 kubenswrapper[18592]: I0308 04:16:38.571381 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55996053-3eb1-4222-b62f-9fca82e2c6fc-config\") pod \"dnsmasq-dns-5f4bc7c77c-swkkv\" (UID: \"55996053-3eb1-4222-b62f-9fca82e2c6fc\") " pod="openstack/dnsmasq-dns-5f4bc7c77c-swkkv" Mar 
08 04:16:38.676844 master-0 kubenswrapper[18592]: I0308 04:16:38.675871 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/55996053-3eb1-4222-b62f-9fca82e2c6fc-ovsdbserver-nb\") pod \"dnsmasq-dns-5f4bc7c77c-swkkv\" (UID: \"55996053-3eb1-4222-b62f-9fca82e2c6fc\") " pod="openstack/dnsmasq-dns-5f4bc7c77c-swkkv" Mar 08 04:16:38.676844 master-0 kubenswrapper[18592]: I0308 04:16:38.675962 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/55996053-3eb1-4222-b62f-9fca82e2c6fc-dns-swift-storage-0\") pod \"dnsmasq-dns-5f4bc7c77c-swkkv\" (UID: \"55996053-3eb1-4222-b62f-9fca82e2c6fc\") " pod="openstack/dnsmasq-dns-5f4bc7c77c-swkkv" Mar 08 04:16:38.676844 master-0 kubenswrapper[18592]: I0308 04:16:38.676014 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/55996053-3eb1-4222-b62f-9fca82e2c6fc-dns-svc\") pod \"dnsmasq-dns-5f4bc7c77c-swkkv\" (UID: \"55996053-3eb1-4222-b62f-9fca82e2c6fc\") " pod="openstack/dnsmasq-dns-5f4bc7c77c-swkkv" Mar 08 04:16:38.676844 master-0 kubenswrapper[18592]: I0308 04:16:38.676035 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55996053-3eb1-4222-b62f-9fca82e2c6fc-config\") pod \"dnsmasq-dns-5f4bc7c77c-swkkv\" (UID: \"55996053-3eb1-4222-b62f-9fca82e2c6fc\") " pod="openstack/dnsmasq-dns-5f4bc7c77c-swkkv" Mar 08 04:16:38.676844 master-0 kubenswrapper[18592]: I0308 04:16:38.676060 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/55996053-3eb1-4222-b62f-9fca82e2c6fc-ovsdbserver-sb\") pod \"dnsmasq-dns-5f4bc7c77c-swkkv\" (UID: \"55996053-3eb1-4222-b62f-9fca82e2c6fc\") " pod="openstack/dnsmasq-dns-5f4bc7c77c-swkkv" Mar 08 
04:16:38.676844 master-0 kubenswrapper[18592]: I0308 04:16:38.676129 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jrh6\" (UniqueName: \"kubernetes.io/projected/55996053-3eb1-4222-b62f-9fca82e2c6fc-kube-api-access-7jrh6\") pod \"dnsmasq-dns-5f4bc7c77c-swkkv\" (UID: \"55996053-3eb1-4222-b62f-9fca82e2c6fc\") " pod="openstack/dnsmasq-dns-5f4bc7c77c-swkkv" Mar 08 04:16:38.677298 master-0 kubenswrapper[18592]: I0308 04:16:38.677249 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/55996053-3eb1-4222-b62f-9fca82e2c6fc-dns-svc\") pod \"dnsmasq-dns-5f4bc7c77c-swkkv\" (UID: \"55996053-3eb1-4222-b62f-9fca82e2c6fc\") " pod="openstack/dnsmasq-dns-5f4bc7c77c-swkkv" Mar 08 04:16:38.680840 master-0 kubenswrapper[18592]: I0308 04:16:38.677322 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55996053-3eb1-4222-b62f-9fca82e2c6fc-config\") pod \"dnsmasq-dns-5f4bc7c77c-swkkv\" (UID: \"55996053-3eb1-4222-b62f-9fca82e2c6fc\") " pod="openstack/dnsmasq-dns-5f4bc7c77c-swkkv" Mar 08 04:16:38.680840 master-0 kubenswrapper[18592]: I0308 04:16:38.677870 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/55996053-3eb1-4222-b62f-9fca82e2c6fc-ovsdbserver-sb\") pod \"dnsmasq-dns-5f4bc7c77c-swkkv\" (UID: \"55996053-3eb1-4222-b62f-9fca82e2c6fc\") " pod="openstack/dnsmasq-dns-5f4bc7c77c-swkkv" Mar 08 04:16:38.680840 master-0 kubenswrapper[18592]: I0308 04:16:38.678008 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/55996053-3eb1-4222-b62f-9fca82e2c6fc-dns-swift-storage-0\") pod \"dnsmasq-dns-5f4bc7c77c-swkkv\" (UID: \"55996053-3eb1-4222-b62f-9fca82e2c6fc\") " pod="openstack/dnsmasq-dns-5f4bc7c77c-swkkv" Mar 08 04:16:38.680840 
master-0 kubenswrapper[18592]: I0308 04:16:38.678375 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/55996053-3eb1-4222-b62f-9fca82e2c6fc-ovsdbserver-nb\") pod \"dnsmasq-dns-5f4bc7c77c-swkkv\" (UID: \"55996053-3eb1-4222-b62f-9fca82e2c6fc\") " pod="openstack/dnsmasq-dns-5f4bc7c77c-swkkv" Mar 08 04:16:38.847552 master-0 kubenswrapper[18592]: I0308 04:16:38.847500 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jrh6\" (UniqueName: \"kubernetes.io/projected/55996053-3eb1-4222-b62f-9fca82e2c6fc-kube-api-access-7jrh6\") pod \"dnsmasq-dns-5f4bc7c77c-swkkv\" (UID: \"55996053-3eb1-4222-b62f-9fca82e2c6fc\") " pod="openstack/dnsmasq-dns-5f4bc7c77c-swkkv" Mar 08 04:16:39.146168 master-0 kubenswrapper[18592]: I0308 04:16:39.146053 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f4bc7c77c-swkkv" Mar 08 04:16:39.667078 master-0 kubenswrapper[18592]: I0308 04:16:39.666966 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f4bc7c77c-swkkv"] Mar 08 04:16:39.680179 master-0 kubenswrapper[18592]: W0308 04:16:39.680126 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod55996053_3eb1_4222_b62f_9fca82e2c6fc.slice/crio-60a5655005d07e97ec6f7cbb1a9f1831e66859dedb1196963d849e5b8e5b0d0c WatchSource:0}: Error finding container 60a5655005d07e97ec6f7cbb1a9f1831e66859dedb1196963d849e5b8e5b0d0c: Status 404 returned error can't find the container with id 60a5655005d07e97ec6f7cbb1a9f1831e66859dedb1196963d849e5b8e5b0d0c Mar 08 04:16:39.911087 master-0 kubenswrapper[18592]: I0308 04:16:39.910978 18592 generic.go:334] "Generic (PLEG): container finished" podID="55996053-3eb1-4222-b62f-9fca82e2c6fc" containerID="ca5218db5c8613c2a953030d8ffa068b176f28de0ea17c9ddb0fe6c548f23d33" exitCode=0 Mar 08 
04:16:39.911239 master-0 kubenswrapper[18592]: I0308 04:16:39.911166 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-559976cb87-w8qr8" podUID="e9d032c7-fe18-4154-8f71-72adc7ea82bc" containerName="dnsmasq-dns" containerID="cri-o://799569e034c664d7bc5ca3625bd8165808dfa6ccc445ccad71153b61040b2d46" gracePeriod=10 Mar 08 04:16:39.911891 master-0 kubenswrapper[18592]: I0308 04:16:39.911864 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f4bc7c77c-swkkv" event={"ID":"55996053-3eb1-4222-b62f-9fca82e2c6fc","Type":"ContainerDied","Data":"ca5218db5c8613c2a953030d8ffa068b176f28de0ea17c9ddb0fe6c548f23d33"} Mar 08 04:16:39.911942 master-0 kubenswrapper[18592]: I0308 04:16:39.911894 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f4bc7c77c-swkkv" event={"ID":"55996053-3eb1-4222-b62f-9fca82e2c6fc","Type":"ContainerStarted","Data":"60a5655005d07e97ec6f7cbb1a9f1831e66859dedb1196963d849e5b8e5b0d0c"} Mar 08 04:16:40.477812 master-0 kubenswrapper[18592]: I0308 04:16:40.477776 18592 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-559976cb87-w8qr8" Mar 08 04:16:40.637705 master-0 kubenswrapper[18592]: I0308 04:16:40.637579 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e9d032c7-fe18-4154-8f71-72adc7ea82bc-ovsdbserver-nb\") pod \"e9d032c7-fe18-4154-8f71-72adc7ea82bc\" (UID: \"e9d032c7-fe18-4154-8f71-72adc7ea82bc\") " Mar 08 04:16:40.638085 master-0 kubenswrapper[18592]: I0308 04:16:40.637744 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e9d032c7-fe18-4154-8f71-72adc7ea82bc-ovsdbserver-sb\") pod \"e9d032c7-fe18-4154-8f71-72adc7ea82bc\" (UID: \"e9d032c7-fe18-4154-8f71-72adc7ea82bc\") " Mar 08 04:16:40.638085 master-0 kubenswrapper[18592]: I0308 04:16:40.637773 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wn5s\" (UniqueName: \"kubernetes.io/projected/e9d032c7-fe18-4154-8f71-72adc7ea82bc-kube-api-access-2wn5s\") pod \"e9d032c7-fe18-4154-8f71-72adc7ea82bc\" (UID: \"e9d032c7-fe18-4154-8f71-72adc7ea82bc\") " Mar 08 04:16:40.638085 master-0 kubenswrapper[18592]: I0308 04:16:40.637960 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e9d032c7-fe18-4154-8f71-72adc7ea82bc-dns-svc\") pod \"e9d032c7-fe18-4154-8f71-72adc7ea82bc\" (UID: \"e9d032c7-fe18-4154-8f71-72adc7ea82bc\") " Mar 08 04:16:40.638085 master-0 kubenswrapper[18592]: I0308 04:16:40.638009 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9d032c7-fe18-4154-8f71-72adc7ea82bc-config\") pod \"e9d032c7-fe18-4154-8f71-72adc7ea82bc\" (UID: \"e9d032c7-fe18-4154-8f71-72adc7ea82bc\") " Mar 08 04:16:40.638085 master-0 kubenswrapper[18592]: I0308 04:16:40.638057 18592 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e9d032c7-fe18-4154-8f71-72adc7ea82bc-dns-swift-storage-0\") pod \"e9d032c7-fe18-4154-8f71-72adc7ea82bc\" (UID: \"e9d032c7-fe18-4154-8f71-72adc7ea82bc\") " Mar 08 04:16:40.646431 master-0 kubenswrapper[18592]: I0308 04:16:40.643447 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9d032c7-fe18-4154-8f71-72adc7ea82bc-kube-api-access-2wn5s" (OuterVolumeSpecName: "kube-api-access-2wn5s") pod "e9d032c7-fe18-4154-8f71-72adc7ea82bc" (UID: "e9d032c7-fe18-4154-8f71-72adc7ea82bc"). InnerVolumeSpecName "kube-api-access-2wn5s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 04:16:40.694907 master-0 kubenswrapper[18592]: I0308 04:16:40.694845 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9d032c7-fe18-4154-8f71-72adc7ea82bc-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e9d032c7-fe18-4154-8f71-72adc7ea82bc" (UID: "e9d032c7-fe18-4154-8f71-72adc7ea82bc"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:16:40.709607 master-0 kubenswrapper[18592]: I0308 04:16:40.709567 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9d032c7-fe18-4154-8f71-72adc7ea82bc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e9d032c7-fe18-4154-8f71-72adc7ea82bc" (UID: "e9d032c7-fe18-4154-8f71-72adc7ea82bc"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:16:40.714687 master-0 kubenswrapper[18592]: I0308 04:16:40.714628 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9d032c7-fe18-4154-8f71-72adc7ea82bc-config" (OuterVolumeSpecName: "config") pod "e9d032c7-fe18-4154-8f71-72adc7ea82bc" (UID: "e9d032c7-fe18-4154-8f71-72adc7ea82bc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:16:40.717598 master-0 kubenswrapper[18592]: I0308 04:16:40.717543 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9d032c7-fe18-4154-8f71-72adc7ea82bc-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e9d032c7-fe18-4154-8f71-72adc7ea82bc" (UID: "e9d032c7-fe18-4154-8f71-72adc7ea82bc"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:16:40.728687 master-0 kubenswrapper[18592]: I0308 04:16:40.728513 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9d032c7-fe18-4154-8f71-72adc7ea82bc-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e9d032c7-fe18-4154-8f71-72adc7ea82bc" (UID: "e9d032c7-fe18-4154-8f71-72adc7ea82bc"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:16:40.740218 master-0 kubenswrapper[18592]: I0308 04:16:40.740014 18592 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9d032c7-fe18-4154-8f71-72adc7ea82bc-config\") on node \"master-0\" DevicePath \"\"" Mar 08 04:16:40.740218 master-0 kubenswrapper[18592]: I0308 04:16:40.740077 18592 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e9d032c7-fe18-4154-8f71-72adc7ea82bc-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Mar 08 04:16:40.740218 master-0 kubenswrapper[18592]: I0308 04:16:40.740088 18592 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e9d032c7-fe18-4154-8f71-72adc7ea82bc-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Mar 08 04:16:40.740218 master-0 kubenswrapper[18592]: I0308 04:16:40.740096 18592 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e9d032c7-fe18-4154-8f71-72adc7ea82bc-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Mar 08 04:16:40.740218 master-0 kubenswrapper[18592]: I0308 04:16:40.740107 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wn5s\" (UniqueName: \"kubernetes.io/projected/e9d032c7-fe18-4154-8f71-72adc7ea82bc-kube-api-access-2wn5s\") on node \"master-0\" DevicePath \"\"" Mar 08 04:16:40.740218 master-0 kubenswrapper[18592]: I0308 04:16:40.740115 18592 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e9d032c7-fe18-4154-8f71-72adc7ea82bc-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 08 04:16:40.922001 master-0 kubenswrapper[18592]: I0308 04:16:40.921707 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f4bc7c77c-swkkv" 
event={"ID":"55996053-3eb1-4222-b62f-9fca82e2c6fc","Type":"ContainerStarted","Data":"ae4a7d3f2cefb3b571e417223de83e4fe0b777d0233c8d00286b5211d2239b72"} Mar 08 04:16:40.922001 master-0 kubenswrapper[18592]: I0308 04:16:40.921763 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f4bc7c77c-swkkv" Mar 08 04:16:40.924168 master-0 kubenswrapper[18592]: I0308 04:16:40.924143 18592 generic.go:334] "Generic (PLEG): container finished" podID="e9d032c7-fe18-4154-8f71-72adc7ea82bc" containerID="799569e034c664d7bc5ca3625bd8165808dfa6ccc445ccad71153b61040b2d46" exitCode=0 Mar 08 04:16:40.924251 master-0 kubenswrapper[18592]: I0308 04:16:40.924172 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-559976cb87-w8qr8" event={"ID":"e9d032c7-fe18-4154-8f71-72adc7ea82bc","Type":"ContainerDied","Data":"799569e034c664d7bc5ca3625bd8165808dfa6ccc445ccad71153b61040b2d46"} Mar 08 04:16:40.924251 master-0 kubenswrapper[18592]: I0308 04:16:40.924192 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-559976cb87-w8qr8" event={"ID":"e9d032c7-fe18-4154-8f71-72adc7ea82bc","Type":"ContainerDied","Data":"843db7c333adb3615e0adf0d2ea71a241d409c6db8b20f9035ab327185ec3631"} Mar 08 04:16:40.924251 master-0 kubenswrapper[18592]: I0308 04:16:40.924208 18592 scope.go:117] "RemoveContainer" containerID="799569e034c664d7bc5ca3625bd8165808dfa6ccc445ccad71153b61040b2d46" Mar 08 04:16:40.924392 master-0 kubenswrapper[18592]: I0308 04:16:40.924307 18592 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-559976cb87-w8qr8" Mar 08 04:16:40.948495 master-0 kubenswrapper[18592]: I0308 04:16:40.948221 18592 scope.go:117] "RemoveContainer" containerID="ebbd83913f92799481112aea154f68c8ef5b16e5df8b389e8f7d1e26123db09e" Mar 08 04:16:40.965636 master-0 kubenswrapper[18592]: I0308 04:16:40.965542 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5f4bc7c77c-swkkv" podStartSLOduration=2.965514552 podStartE2EDuration="2.965514552s" podCreationTimestamp="2026-03-08 04:16:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 04:16:40.951877801 +0000 UTC m=+1413.050632161" watchObservedRunningTime="2026-03-08 04:16:40.965514552 +0000 UTC m=+1413.064268922" Mar 08 04:16:40.980083 master-0 kubenswrapper[18592]: I0308 04:16:40.979714 18592 scope.go:117] "RemoveContainer" containerID="799569e034c664d7bc5ca3625bd8165808dfa6ccc445ccad71153b61040b2d46" Mar 08 04:16:40.980322 master-0 kubenswrapper[18592]: E0308 04:16:40.980268 18592 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"799569e034c664d7bc5ca3625bd8165808dfa6ccc445ccad71153b61040b2d46\": container with ID starting with 799569e034c664d7bc5ca3625bd8165808dfa6ccc445ccad71153b61040b2d46 not found: ID does not exist" containerID="799569e034c664d7bc5ca3625bd8165808dfa6ccc445ccad71153b61040b2d46" Mar 08 04:16:40.980440 master-0 kubenswrapper[18592]: I0308 04:16:40.980319 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"799569e034c664d7bc5ca3625bd8165808dfa6ccc445ccad71153b61040b2d46"} err="failed to get container status \"799569e034c664d7bc5ca3625bd8165808dfa6ccc445ccad71153b61040b2d46\": rpc error: code = NotFound desc = could not find container \"799569e034c664d7bc5ca3625bd8165808dfa6ccc445ccad71153b61040b2d46\": 
container with ID starting with 799569e034c664d7bc5ca3625bd8165808dfa6ccc445ccad71153b61040b2d46 not found: ID does not exist" Mar 08 04:16:40.980440 master-0 kubenswrapper[18592]: I0308 04:16:40.980344 18592 scope.go:117] "RemoveContainer" containerID="ebbd83913f92799481112aea154f68c8ef5b16e5df8b389e8f7d1e26123db09e" Mar 08 04:16:40.980860 master-0 kubenswrapper[18592]: E0308 04:16:40.980799 18592 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebbd83913f92799481112aea154f68c8ef5b16e5df8b389e8f7d1e26123db09e\": container with ID starting with ebbd83913f92799481112aea154f68c8ef5b16e5df8b389e8f7d1e26123db09e not found: ID does not exist" containerID="ebbd83913f92799481112aea154f68c8ef5b16e5df8b389e8f7d1e26123db09e" Mar 08 04:16:40.980960 master-0 kubenswrapper[18592]: I0308 04:16:40.980864 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebbd83913f92799481112aea154f68c8ef5b16e5df8b389e8f7d1e26123db09e"} err="failed to get container status \"ebbd83913f92799481112aea154f68c8ef5b16e5df8b389e8f7d1e26123db09e\": rpc error: code = NotFound desc = could not find container \"ebbd83913f92799481112aea154f68c8ef5b16e5df8b389e8f7d1e26123db09e\": container with ID starting with ebbd83913f92799481112aea154f68c8ef5b16e5df8b389e8f7d1e26123db09e not found: ID does not exist" Mar 08 04:16:40.998251 master-0 kubenswrapper[18592]: I0308 04:16:40.997255 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-559976cb87-w8qr8"] Mar 08 04:16:41.010474 master-0 kubenswrapper[18592]: I0308 04:16:41.008211 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-559976cb87-w8qr8"] Mar 08 04:16:41.940550 master-0 kubenswrapper[18592]: I0308 04:16:41.940249 18592 generic.go:334] "Generic (PLEG): container finished" podID="b3d9ef75-99d9-448e-9fc5-e14b49976a0a" 
containerID="6c4cfb4394d4b2d083f3fdd5d6040a7cb6d0ea33dc2f9b2a15af4fadfce19341" exitCode=0 Mar 08 04:16:41.941456 master-0 kubenswrapper[18592]: I0308 04:16:41.940577 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-f2htj" event={"ID":"b3d9ef75-99d9-448e-9fc5-e14b49976a0a","Type":"ContainerDied","Data":"6c4cfb4394d4b2d083f3fdd5d6040a7cb6d0ea33dc2f9b2a15af4fadfce19341"} Mar 08 04:16:42.164776 master-0 kubenswrapper[18592]: I0308 04:16:42.164720 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9d032c7-fe18-4154-8f71-72adc7ea82bc" path="/var/lib/kubelet/pods/e9d032c7-fe18-4154-8f71-72adc7ea82bc/volumes" Mar 08 04:16:43.396484 master-0 kubenswrapper[18592]: I0308 04:16:43.396430 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-f2htj" Mar 08 04:16:43.501795 master-0 kubenswrapper[18592]: I0308 04:16:43.501738 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3d9ef75-99d9-448e-9fc5-e14b49976a0a-config-data\") pod \"b3d9ef75-99d9-448e-9fc5-e14b49976a0a\" (UID: \"b3d9ef75-99d9-448e-9fc5-e14b49976a0a\") " Mar 08 04:16:43.501795 master-0 kubenswrapper[18592]: I0308 04:16:43.501794 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3d9ef75-99d9-448e-9fc5-e14b49976a0a-combined-ca-bundle\") pod \"b3d9ef75-99d9-448e-9fc5-e14b49976a0a\" (UID: \"b3d9ef75-99d9-448e-9fc5-e14b49976a0a\") " Mar 08 04:16:43.502170 master-0 kubenswrapper[18592]: I0308 04:16:43.501864 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67dd8\" (UniqueName: \"kubernetes.io/projected/b3d9ef75-99d9-448e-9fc5-e14b49976a0a-kube-api-access-67dd8\") pod \"b3d9ef75-99d9-448e-9fc5-e14b49976a0a\" (UID: \"b3d9ef75-99d9-448e-9fc5-e14b49976a0a\") " Mar 08 
04:16:43.505020 master-0 kubenswrapper[18592]: I0308 04:16:43.504976 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3d9ef75-99d9-448e-9fc5-e14b49976a0a-kube-api-access-67dd8" (OuterVolumeSpecName: "kube-api-access-67dd8") pod "b3d9ef75-99d9-448e-9fc5-e14b49976a0a" (UID: "b3d9ef75-99d9-448e-9fc5-e14b49976a0a"). InnerVolumeSpecName "kube-api-access-67dd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 04:16:43.530872 master-0 kubenswrapper[18592]: I0308 04:16:43.530774 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3d9ef75-99d9-448e-9fc5-e14b49976a0a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b3d9ef75-99d9-448e-9fc5-e14b49976a0a" (UID: "b3d9ef75-99d9-448e-9fc5-e14b49976a0a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:16:43.544059 master-0 kubenswrapper[18592]: I0308 04:16:43.544007 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3d9ef75-99d9-448e-9fc5-e14b49976a0a-config-data" (OuterVolumeSpecName: "config-data") pod "b3d9ef75-99d9-448e-9fc5-e14b49976a0a" (UID: "b3d9ef75-99d9-448e-9fc5-e14b49976a0a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:16:43.604488 master-0 kubenswrapper[18592]: I0308 04:16:43.604436 18592 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3d9ef75-99d9-448e-9fc5-e14b49976a0a-config-data\") on node \"master-0\" DevicePath \"\"" Mar 08 04:16:43.604488 master-0 kubenswrapper[18592]: I0308 04:16:43.604474 18592 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3d9ef75-99d9-448e-9fc5-e14b49976a0a-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 04:16:43.604488 master-0 kubenswrapper[18592]: I0308 04:16:43.604485 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67dd8\" (UniqueName: \"kubernetes.io/projected/b3d9ef75-99d9-448e-9fc5-e14b49976a0a-kube-api-access-67dd8\") on node \"master-0\" DevicePath \"\"" Mar 08 04:16:43.972026 master-0 kubenswrapper[18592]: I0308 04:16:43.971232 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-f2htj" event={"ID":"b3d9ef75-99d9-448e-9fc5-e14b49976a0a","Type":"ContainerDied","Data":"ac86bd424a5ec35888de418aff6287152839195f9152f18267e2fee4d1752f3c"} Mar 08 04:16:43.972026 master-0 kubenswrapper[18592]: I0308 04:16:43.971341 18592 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac86bd424a5ec35888de418aff6287152839195f9152f18267e2fee4d1752f3c" Mar 08 04:16:43.972026 master-0 kubenswrapper[18592]: I0308 04:16:43.971884 18592 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-f2htj" Mar 08 04:16:44.310777 master-0 kubenswrapper[18592]: I0308 04:16:44.310713 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-z8kn6"] Mar 08 04:16:44.311281 master-0 kubenswrapper[18592]: E0308 04:16:44.311232 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9d032c7-fe18-4154-8f71-72adc7ea82bc" containerName="init" Mar 08 04:16:44.311281 master-0 kubenswrapper[18592]: I0308 04:16:44.311280 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9d032c7-fe18-4154-8f71-72adc7ea82bc" containerName="init" Mar 08 04:16:44.311432 master-0 kubenswrapper[18592]: E0308 04:16:44.311315 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9d032c7-fe18-4154-8f71-72adc7ea82bc" containerName="dnsmasq-dns" Mar 08 04:16:44.311432 master-0 kubenswrapper[18592]: I0308 04:16:44.311327 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9d032c7-fe18-4154-8f71-72adc7ea82bc" containerName="dnsmasq-dns" Mar 08 04:16:44.311432 master-0 kubenswrapper[18592]: E0308 04:16:44.311369 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3d9ef75-99d9-448e-9fc5-e14b49976a0a" containerName="keystone-db-sync" Mar 08 04:16:44.311432 master-0 kubenswrapper[18592]: I0308 04:16:44.311381 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3d9ef75-99d9-448e-9fc5-e14b49976a0a" containerName="keystone-db-sync" Mar 08 04:16:44.311971 master-0 kubenswrapper[18592]: I0308 04:16:44.311675 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9d032c7-fe18-4154-8f71-72adc7ea82bc" containerName="dnsmasq-dns" Mar 08 04:16:44.311971 master-0 kubenswrapper[18592]: I0308 04:16:44.311693 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3d9ef75-99d9-448e-9fc5-e14b49976a0a" containerName="keystone-db-sync" Mar 08 04:16:44.317092 master-0 kubenswrapper[18592]: I0308 04:16:44.312339 18592 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-z8kn6" Mar 08 04:16:44.320970 master-0 kubenswrapper[18592]: I0308 04:16:44.318357 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 08 04:16:44.320970 master-0 kubenswrapper[18592]: I0308 04:16:44.318670 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 08 04:16:44.320970 master-0 kubenswrapper[18592]: I0308 04:16:44.318783 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 08 04:16:44.320970 master-0 kubenswrapper[18592]: I0308 04:16:44.318924 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 08 04:16:44.346913 master-0 kubenswrapper[18592]: I0308 04:16:44.346422 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-z8kn6"] Mar 08 04:16:44.446097 master-0 kubenswrapper[18592]: I0308 04:16:44.445721 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpm5w\" (UniqueName: \"kubernetes.io/projected/319bbb45-9cc5-4110-8aed-94ccbe808779-kube-api-access-tpm5w\") pod \"keystone-bootstrap-z8kn6\" (UID: \"319bbb45-9cc5-4110-8aed-94ccbe808779\") " pod="openstack/keystone-bootstrap-z8kn6" Mar 08 04:16:44.446097 master-0 kubenswrapper[18592]: I0308 04:16:44.445768 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/319bbb45-9cc5-4110-8aed-94ccbe808779-config-data\") pod \"keystone-bootstrap-z8kn6\" (UID: \"319bbb45-9cc5-4110-8aed-94ccbe808779\") " pod="openstack/keystone-bootstrap-z8kn6" Mar 08 04:16:44.446097 master-0 kubenswrapper[18592]: I0308 04:16:44.445805 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/319bbb45-9cc5-4110-8aed-94ccbe808779-combined-ca-bundle\") pod \"keystone-bootstrap-z8kn6\" (UID: \"319bbb45-9cc5-4110-8aed-94ccbe808779\") " pod="openstack/keystone-bootstrap-z8kn6" Mar 08 04:16:44.446097 master-0 kubenswrapper[18592]: I0308 04:16:44.445849 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/319bbb45-9cc5-4110-8aed-94ccbe808779-scripts\") pod \"keystone-bootstrap-z8kn6\" (UID: \"319bbb45-9cc5-4110-8aed-94ccbe808779\") " pod="openstack/keystone-bootstrap-z8kn6" Mar 08 04:16:44.446097 master-0 kubenswrapper[18592]: I0308 04:16:44.445891 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/319bbb45-9cc5-4110-8aed-94ccbe808779-fernet-keys\") pod \"keystone-bootstrap-z8kn6\" (UID: \"319bbb45-9cc5-4110-8aed-94ccbe808779\") " pod="openstack/keystone-bootstrap-z8kn6" Mar 08 04:16:44.446097 master-0 kubenswrapper[18592]: I0308 04:16:44.445939 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/319bbb45-9cc5-4110-8aed-94ccbe808779-credential-keys\") pod \"keystone-bootstrap-z8kn6\" (UID: \"319bbb45-9cc5-4110-8aed-94ccbe808779\") " pod="openstack/keystone-bootstrap-z8kn6" Mar 08 04:16:44.474086 master-0 kubenswrapper[18592]: I0308 04:16:44.469759 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f4bc7c77c-swkkv"] Mar 08 04:16:44.474086 master-0 kubenswrapper[18592]: I0308 04:16:44.470078 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5f4bc7c77c-swkkv" podUID="55996053-3eb1-4222-b62f-9fca82e2c6fc" containerName="dnsmasq-dns" containerID="cri-o://ae4a7d3f2cefb3b571e417223de83e4fe0b777d0233c8d00286b5211d2239b72" 
gracePeriod=10
Mar 08 04:16:44.547933 master-0 kubenswrapper[18592]: I0308 04:16:44.547797 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpm5w\" (UniqueName: \"kubernetes.io/projected/319bbb45-9cc5-4110-8aed-94ccbe808779-kube-api-access-tpm5w\") pod \"keystone-bootstrap-z8kn6\" (UID: \"319bbb45-9cc5-4110-8aed-94ccbe808779\") " pod="openstack/keystone-bootstrap-z8kn6"
Mar 08 04:16:44.547933 master-0 kubenswrapper[18592]: I0308 04:16:44.547866 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/319bbb45-9cc5-4110-8aed-94ccbe808779-config-data\") pod \"keystone-bootstrap-z8kn6\" (UID: \"319bbb45-9cc5-4110-8aed-94ccbe808779\") " pod="openstack/keystone-bootstrap-z8kn6"
Mar 08 04:16:44.547933 master-0 kubenswrapper[18592]: I0308 04:16:44.547900 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/319bbb45-9cc5-4110-8aed-94ccbe808779-combined-ca-bundle\") pod \"keystone-bootstrap-z8kn6\" (UID: \"319bbb45-9cc5-4110-8aed-94ccbe808779\") " pod="openstack/keystone-bootstrap-z8kn6"
Mar 08 04:16:44.548173 master-0 kubenswrapper[18592]: I0308 04:16:44.547944 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/319bbb45-9cc5-4110-8aed-94ccbe808779-scripts\") pod \"keystone-bootstrap-z8kn6\" (UID: \"319bbb45-9cc5-4110-8aed-94ccbe808779\") " pod="openstack/keystone-bootstrap-z8kn6"
Mar 08 04:16:44.548173 master-0 kubenswrapper[18592]: I0308 04:16:44.547986 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/319bbb45-9cc5-4110-8aed-94ccbe808779-fernet-keys\") pod \"keystone-bootstrap-z8kn6\" (UID: \"319bbb45-9cc5-4110-8aed-94ccbe808779\") " pod="openstack/keystone-bootstrap-z8kn6"
Mar 08 04:16:44.548173 master-0 kubenswrapper[18592]: I0308 04:16:44.548033 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/319bbb45-9cc5-4110-8aed-94ccbe808779-credential-keys\") pod \"keystone-bootstrap-z8kn6\" (UID: \"319bbb45-9cc5-4110-8aed-94ccbe808779\") " pod="openstack/keystone-bootstrap-z8kn6"
Mar 08 04:16:44.551371 master-0 kubenswrapper[18592]: I0308 04:16:44.551338 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/319bbb45-9cc5-4110-8aed-94ccbe808779-credential-keys\") pod \"keystone-bootstrap-z8kn6\" (UID: \"319bbb45-9cc5-4110-8aed-94ccbe808779\") " pod="openstack/keystone-bootstrap-z8kn6"
Mar 08 04:16:44.552797 master-0 kubenswrapper[18592]: I0308 04:16:44.552752 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/319bbb45-9cc5-4110-8aed-94ccbe808779-fernet-keys\") pod \"keystone-bootstrap-z8kn6\" (UID: \"319bbb45-9cc5-4110-8aed-94ccbe808779\") " pod="openstack/keystone-bootstrap-z8kn6"
Mar 08 04:16:44.553619 master-0 kubenswrapper[18592]: I0308 04:16:44.553590 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/319bbb45-9cc5-4110-8aed-94ccbe808779-config-data\") pod \"keystone-bootstrap-z8kn6\" (UID: \"319bbb45-9cc5-4110-8aed-94ccbe808779\") " pod="openstack/keystone-bootstrap-z8kn6"
Mar 08 04:16:44.554298 master-0 kubenswrapper[18592]: I0308 04:16:44.554268 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/319bbb45-9cc5-4110-8aed-94ccbe808779-combined-ca-bundle\") pod \"keystone-bootstrap-z8kn6\" (UID: \"319bbb45-9cc5-4110-8aed-94ccbe808779\") " pod="openstack/keystone-bootstrap-z8kn6"
Mar 08 04:16:44.554509 master-0 kubenswrapper[18592]: I0308 04:16:44.554467 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/319bbb45-9cc5-4110-8aed-94ccbe808779-scripts\") pod \"keystone-bootstrap-z8kn6\" (UID: \"319bbb45-9cc5-4110-8aed-94ccbe808779\") " pod="openstack/keystone-bootstrap-z8kn6"
Mar 08 04:16:44.615036 master-0 kubenswrapper[18592]: I0308 04:16:44.614360 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpm5w\" (UniqueName: \"kubernetes.io/projected/319bbb45-9cc5-4110-8aed-94ccbe808779-kube-api-access-tpm5w\") pod \"keystone-bootstrap-z8kn6\" (UID: \"319bbb45-9cc5-4110-8aed-94ccbe808779\") " pod="openstack/keystone-bootstrap-z8kn6"
Mar 08 04:16:44.691279 master-0 kubenswrapper[18592]: I0308 04:16:44.691235 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-z8kn6"
Mar 08 04:16:44.810674 master-0 kubenswrapper[18592]: I0308 04:16:44.806301 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-79c75d759-lh7hc"]
Mar 08 04:16:44.810674 master-0 kubenswrapper[18592]: I0308 04:16:44.809772 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79c75d759-lh7hc"
Mar 08 04:16:44.851451 master-0 kubenswrapper[18592]: I0308 04:16:44.848878 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79c75d759-lh7hc"]
Mar 08 04:16:44.899740 master-0 kubenswrapper[18592]: I0308 04:16:44.897914 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-db-create-rk2j4"]
Mar 08 04:16:44.910318 master-0 kubenswrapper[18592]: I0308 04:16:44.910266 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-db-create-rk2j4"
Mar 08 04:16:44.928049 master-0 kubenswrapper[18592]: I0308 04:16:44.926103 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-db-create-rk2j4"]
Mar 08 04:16:44.965014 master-0 kubenswrapper[18592]: I0308 04:16:44.964960 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-vhcff"]
Mar 08 04:16:44.969184 master-0 kubenswrapper[18592]: I0308 04:16:44.969144 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fc53e341-1423-4a10-b198-5fb7f51dcc52-ovsdbserver-nb\") pod \"dnsmasq-dns-79c75d759-lh7hc\" (UID: \"fc53e341-1423-4a10-b198-5fb7f51dcc52\") " pod="openstack/dnsmasq-dns-79c75d759-lh7hc"
Mar 08 04:16:44.969373 master-0 kubenswrapper[18592]: I0308 04:16:44.969213 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fc53e341-1423-4a10-b198-5fb7f51dcc52-ovsdbserver-sb\") pod \"dnsmasq-dns-79c75d759-lh7hc\" (UID: \"fc53e341-1423-4a10-b198-5fb7f51dcc52\") " pod="openstack/dnsmasq-dns-79c75d759-lh7hc"
Mar 08 04:16:44.969373 master-0 kubenswrapper[18592]: I0308 04:16:44.969255 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4fsp\" (UniqueName: \"kubernetes.io/projected/fc53e341-1423-4a10-b198-5fb7f51dcc52-kube-api-access-h4fsp\") pod \"dnsmasq-dns-79c75d759-lh7hc\" (UID: \"fc53e341-1423-4a10-b198-5fb7f51dcc52\") " pod="openstack/dnsmasq-dns-79c75d759-lh7hc"
Mar 08 04:16:44.969373 master-0 kubenswrapper[18592]: I0308 04:16:44.969299 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc53e341-1423-4a10-b198-5fb7f51dcc52-config\") pod \"dnsmasq-dns-79c75d759-lh7hc\" (UID: \"fc53e341-1423-4a10-b198-5fb7f51dcc52\") " pod="openstack/dnsmasq-dns-79c75d759-lh7hc"
Mar 08 04:16:44.969373 master-0 kubenswrapper[18592]: I0308 04:16:44.969354 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc53e341-1423-4a10-b198-5fb7f51dcc52-dns-svc\") pod \"dnsmasq-dns-79c75d759-lh7hc\" (UID: \"fc53e341-1423-4a10-b198-5fb7f51dcc52\") " pod="openstack/dnsmasq-dns-79c75d759-lh7hc"
Mar 08 04:16:44.969373 master-0 kubenswrapper[18592]: I0308 04:16:44.969370 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fc53e341-1423-4a10-b198-5fb7f51dcc52-dns-swift-storage-0\") pod \"dnsmasq-dns-79c75d759-lh7hc\" (UID: \"fc53e341-1423-4a10-b198-5fb7f51dcc52\") " pod="openstack/dnsmasq-dns-79c75d759-lh7hc"
Mar 08 04:16:44.998506 master-0 kubenswrapper[18592]: I0308 04:16:44.998393 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-vhcff"
Mar 08 04:16:45.014280 master-0 kubenswrapper[18592]: I0308 04:16:45.012790 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Mar 08 04:16:45.014280 master-0 kubenswrapper[18592]: I0308 04:16:45.013425 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Mar 08 04:16:45.049817 master-0 kubenswrapper[18592]: I0308 04:16:45.047401 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-e561-account-create-update-pmf7m"]
Mar 08 04:16:45.049817 master-0 kubenswrapper[18592]: I0308 04:16:45.048772 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-e561-account-create-update-pmf7m"
Mar 08 04:16:45.050600 master-0 kubenswrapper[18592]: I0308 04:16:45.050560 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-db-secret"
Mar 08 04:16:45.067843 master-0 kubenswrapper[18592]: I0308 04:16:45.064926 18592 generic.go:334] "Generic (PLEG): container finished" podID="55996053-3eb1-4222-b62f-9fca82e2c6fc" containerID="ae4a7d3f2cefb3b571e417223de83e4fe0b777d0233c8d00286b5211d2239b72" exitCode=0
Mar 08 04:16:45.067843 master-0 kubenswrapper[18592]: I0308 04:16:45.064974 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f4bc7c77c-swkkv" event={"ID":"55996053-3eb1-4222-b62f-9fca82e2c6fc","Type":"ContainerDied","Data":"ae4a7d3f2cefb3b571e417223de83e4fe0b777d0233c8d00286b5211d2239b72"}
Mar 08 04:16:45.072786 master-0 kubenswrapper[18592]: I0308 04:16:45.072721 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc53e341-1423-4a10-b198-5fb7f51dcc52-dns-svc\") pod \"dnsmasq-dns-79c75d759-lh7hc\" (UID: \"fc53e341-1423-4a10-b198-5fb7f51dcc52\") " pod="openstack/dnsmasq-dns-79c75d759-lh7hc"
Mar 08 04:16:45.072868 master-0 kubenswrapper[18592]: I0308 04:16:45.072791 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fc53e341-1423-4a10-b198-5fb7f51dcc52-dns-swift-storage-0\") pod \"dnsmasq-dns-79c75d759-lh7hc\" (UID: \"fc53e341-1423-4a10-b198-5fb7f51dcc52\") " pod="openstack/dnsmasq-dns-79c75d759-lh7hc"
Mar 08 04:16:45.073675 master-0 kubenswrapper[18592]: I0308 04:16:45.073619 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc53e341-1423-4a10-b198-5fb7f51dcc52-dns-svc\") pod \"dnsmasq-dns-79c75d759-lh7hc\" (UID: \"fc53e341-1423-4a10-b198-5fb7f51dcc52\") " pod="openstack/dnsmasq-dns-79c75d759-lh7hc"
Mar 08 04:16:45.073713 master-0 kubenswrapper[18592]: I0308 04:16:45.073694 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fc53e341-1423-4a10-b198-5fb7f51dcc52-ovsdbserver-nb\") pod \"dnsmasq-dns-79c75d759-lh7hc\" (UID: \"fc53e341-1423-4a10-b198-5fb7f51dcc52\") " pod="openstack/dnsmasq-dns-79c75d759-lh7hc"
Mar 08 04:16:45.073843 master-0 kubenswrapper[18592]: I0308 04:16:45.073809 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fc53e341-1423-4a10-b198-5fb7f51dcc52-ovsdbserver-sb\") pod \"dnsmasq-dns-79c75d759-lh7hc\" (UID: \"fc53e341-1423-4a10-b198-5fb7f51dcc52\") " pod="openstack/dnsmasq-dns-79c75d759-lh7hc"
Mar 08 04:16:45.073929 master-0 kubenswrapper[18592]: I0308 04:16:45.073907 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4fsp\" (UniqueName: \"kubernetes.io/projected/fc53e341-1423-4a10-b198-5fb7f51dcc52-kube-api-access-h4fsp\") pod \"dnsmasq-dns-79c75d759-lh7hc\" (UID: \"fc53e341-1423-4a10-b198-5fb7f51dcc52\") " pod="openstack/dnsmasq-dns-79c75d759-lh7hc"
Mar 08 04:16:45.073981 master-0 kubenswrapper[18592]: I0308 04:16:45.073966 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d0eaefda-7b75-43b9-8d18-8c1476db321d-config\") pod \"neutron-db-sync-vhcff\" (UID: \"d0eaefda-7b75-43b9-8d18-8c1476db321d\") " pod="openstack/neutron-db-sync-vhcff"
Mar 08 04:16:45.074039 master-0 kubenswrapper[18592]: I0308 04:16:45.074023 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc53e341-1423-4a10-b198-5fb7f51dcc52-config\") pod \"dnsmasq-dns-79c75d759-lh7hc\" (UID: \"fc53e341-1423-4a10-b198-5fb7f51dcc52\") " pod="openstack/dnsmasq-dns-79c75d759-lh7hc"
Mar 08 04:16:45.074077 master-0 kubenswrapper[18592]: I0308 04:16:45.074049 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wt6pq\" (UniqueName: \"kubernetes.io/projected/1f03ac81-08d4-4053-9a33-ae4228fb3d3b-kube-api-access-wt6pq\") pod \"ironic-db-create-rk2j4\" (UID: \"1f03ac81-08d4-4053-9a33-ae4228fb3d3b\") " pod="openstack/ironic-db-create-rk2j4"
Mar 08 04:16:45.074112 master-0 kubenswrapper[18592]: I0308 04:16:45.074088 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0eaefda-7b75-43b9-8d18-8c1476db321d-combined-ca-bundle\") pod \"neutron-db-sync-vhcff\" (UID: \"d0eaefda-7b75-43b9-8d18-8c1476db321d\") " pod="openstack/neutron-db-sync-vhcff"
Mar 08 04:16:45.074112 master-0 kubenswrapper[18592]: I0308 04:16:45.074107 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f03ac81-08d4-4053-9a33-ae4228fb3d3b-operator-scripts\") pod \"ironic-db-create-rk2j4\" (UID: \"1f03ac81-08d4-4053-9a33-ae4228fb3d3b\") " pod="openstack/ironic-db-create-rk2j4"
Mar 08 04:16:45.074188 master-0 kubenswrapper[18592]: I0308 04:16:45.074171 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wz6tr\" (UniqueName: \"kubernetes.io/projected/d0eaefda-7b75-43b9-8d18-8c1476db321d-kube-api-access-wz6tr\") pod \"neutron-db-sync-vhcff\" (UID: \"d0eaefda-7b75-43b9-8d18-8c1476db321d\") " pod="openstack/neutron-db-sync-vhcff"
Mar 08 04:16:45.075383 master-0 kubenswrapper[18592]: I0308 04:16:45.074859 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc53e341-1423-4a10-b198-5fb7f51dcc52-config\") pod \"dnsmasq-dns-79c75d759-lh7hc\" (UID: \"fc53e341-1423-4a10-b198-5fb7f51dcc52\") " pod="openstack/dnsmasq-dns-79c75d759-lh7hc"
Mar 08 04:16:45.075383 master-0 kubenswrapper[18592]: I0308 04:16:45.075007 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fc53e341-1423-4a10-b198-5fb7f51dcc52-dns-swift-storage-0\") pod \"dnsmasq-dns-79c75d759-lh7hc\" (UID: \"fc53e341-1423-4a10-b198-5fb7f51dcc52\") " pod="openstack/dnsmasq-dns-79c75d759-lh7hc"
Mar 08 04:16:45.075487 master-0 kubenswrapper[18592]: I0308 04:16:45.075392 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fc53e341-1423-4a10-b198-5fb7f51dcc52-ovsdbserver-nb\") pod \"dnsmasq-dns-79c75d759-lh7hc\" (UID: \"fc53e341-1423-4a10-b198-5fb7f51dcc52\") " pod="openstack/dnsmasq-dns-79c75d759-lh7hc"
Mar 08 04:16:45.082850 master-0 kubenswrapper[18592]: I0308 04:16:45.079059 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-e561-account-create-update-pmf7m"]
Mar 08 04:16:45.095947 master-0 kubenswrapper[18592]: I0308 04:16:45.095894 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-vhcff"]
Mar 08 04:16:45.108593 master-0 kubenswrapper[18592]: I0308 04:16:45.104314 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fc53e341-1423-4a10-b198-5fb7f51dcc52-ovsdbserver-sb\") pod \"dnsmasq-dns-79c75d759-lh7hc\" (UID: \"fc53e341-1423-4a10-b198-5fb7f51dcc52\") " pod="openstack/dnsmasq-dns-79c75d759-lh7hc"
Mar 08 04:16:45.113273 master-0 kubenswrapper[18592]: I0308 04:16:45.113198 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-ff301-db-sync-bgkm7"]
Mar 08 04:16:45.117057 master-0 kubenswrapper[18592]: I0308 04:16:45.117026 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4fsp\" (UniqueName: \"kubernetes.io/projected/fc53e341-1423-4a10-b198-5fb7f51dcc52-kube-api-access-h4fsp\") pod \"dnsmasq-dns-79c75d759-lh7hc\" (UID: \"fc53e341-1423-4a10-b198-5fb7f51dcc52\") " pod="openstack/dnsmasq-dns-79c75d759-lh7hc"
Mar 08 04:16:45.128988 master-0 kubenswrapper[18592]: I0308 04:16:45.128699 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-ff301-db-sync-bgkm7"
Mar 08 04:16:45.135999 master-0 kubenswrapper[18592]: I0308 04:16:45.135951 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-ff301-config-data"
Mar 08 04:16:45.136210 master-0 kubenswrapper[18592]: I0308 04:16:45.136146 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-ff301-scripts"
Mar 08 04:16:45.204617 master-0 kubenswrapper[18592]: I0308 04:16:45.202654 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81d0a12f-c85b-41ef-a076-efa3dd40f9aa-operator-scripts\") pod \"ironic-e561-account-create-update-pmf7m\" (UID: \"81d0a12f-c85b-41ef-a076-efa3dd40f9aa\") " pod="openstack/ironic-e561-account-create-update-pmf7m"
Mar 08 04:16:45.204617 master-0 kubenswrapper[18592]: I0308 04:16:45.202766 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d0eaefda-7b75-43b9-8d18-8c1476db321d-config\") pod \"neutron-db-sync-vhcff\" (UID: \"d0eaefda-7b75-43b9-8d18-8c1476db321d\") " pod="openstack/neutron-db-sync-vhcff"
Mar 08 04:16:45.204617 master-0 kubenswrapper[18592]: I0308 04:16:45.202838 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wt6pq\" (UniqueName: \"kubernetes.io/projected/1f03ac81-08d4-4053-9a33-ae4228fb3d3b-kube-api-access-wt6pq\") pod \"ironic-db-create-rk2j4\" (UID: \"1f03ac81-08d4-4053-9a33-ae4228fb3d3b\") " pod="openstack/ironic-db-create-rk2j4"
Mar 08 04:16:45.204617 master-0 kubenswrapper[18592]: I0308 04:16:45.202882 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0eaefda-7b75-43b9-8d18-8c1476db321d-combined-ca-bundle\") pod \"neutron-db-sync-vhcff\" (UID: \"d0eaefda-7b75-43b9-8d18-8c1476db321d\") " pod="openstack/neutron-db-sync-vhcff"
Mar 08 04:16:45.204617 master-0 kubenswrapper[18592]: I0308 04:16:45.202898 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f03ac81-08d4-4053-9a33-ae4228fb3d3b-operator-scripts\") pod \"ironic-db-create-rk2j4\" (UID: \"1f03ac81-08d4-4053-9a33-ae4228fb3d3b\") " pod="openstack/ironic-db-create-rk2j4"
Mar 08 04:16:45.204617 master-0 kubenswrapper[18592]: I0308 04:16:45.202966 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wz6tr\" (UniqueName: \"kubernetes.io/projected/d0eaefda-7b75-43b9-8d18-8c1476db321d-kube-api-access-wz6tr\") pod \"neutron-db-sync-vhcff\" (UID: \"d0eaefda-7b75-43b9-8d18-8c1476db321d\") " pod="openstack/neutron-db-sync-vhcff"
Mar 08 04:16:45.204617 master-0 kubenswrapper[18592]: I0308 04:16:45.203002 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kk8z\" (UniqueName: \"kubernetes.io/projected/81d0a12f-c85b-41ef-a076-efa3dd40f9aa-kube-api-access-2kk8z\") pod \"ironic-e561-account-create-update-pmf7m\" (UID: \"81d0a12f-c85b-41ef-a076-efa3dd40f9aa\") " pod="openstack/ironic-e561-account-create-update-pmf7m"
Mar 08 04:16:45.208261 master-0 kubenswrapper[18592]: I0308 04:16:45.206492 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f03ac81-08d4-4053-9a33-ae4228fb3d3b-operator-scripts\") pod \"ironic-db-create-rk2j4\" (UID: \"1f03ac81-08d4-4053-9a33-ae4228fb3d3b\") " pod="openstack/ironic-db-create-rk2j4"
Mar 08 04:16:45.217221 master-0 kubenswrapper[18592]: I0308 04:16:45.208882 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d0eaefda-7b75-43b9-8d18-8c1476db321d-config\") pod \"neutron-db-sync-vhcff\" (UID: \"d0eaefda-7b75-43b9-8d18-8c1476db321d\") " pod="openstack/neutron-db-sync-vhcff"
Mar 08 04:16:45.231598 master-0 kubenswrapper[18592]: I0308 04:16:45.224245 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0eaefda-7b75-43b9-8d18-8c1476db321d-combined-ca-bundle\") pod \"neutron-db-sync-vhcff\" (UID: \"d0eaefda-7b75-43b9-8d18-8c1476db321d\") " pod="openstack/neutron-db-sync-vhcff"
Mar 08 04:16:45.231598 master-0 kubenswrapper[18592]: I0308 04:16:45.225159 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79c75d759-lh7hc"
Mar 08 04:16:45.257454 master-0 kubenswrapper[18592]: I0308 04:16:45.257278 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wz6tr\" (UniqueName: \"kubernetes.io/projected/d0eaefda-7b75-43b9-8d18-8c1476db321d-kube-api-access-wz6tr\") pod \"neutron-db-sync-vhcff\" (UID: \"d0eaefda-7b75-43b9-8d18-8c1476db321d\") " pod="openstack/neutron-db-sync-vhcff"
Mar 08 04:16:45.294029 master-0 kubenswrapper[18592]: I0308 04:16:45.286903 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-ff301-db-sync-bgkm7"]
Mar 08 04:16:45.294029 master-0 kubenswrapper[18592]: I0308 04:16:45.289200 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wt6pq\" (UniqueName: \"kubernetes.io/projected/1f03ac81-08d4-4053-9a33-ae4228fb3d3b-kube-api-access-wt6pq\") pod \"ironic-db-create-rk2j4\" (UID: \"1f03ac81-08d4-4053-9a33-ae4228fb3d3b\") " pod="openstack/ironic-db-create-rk2j4"
Mar 08 04:16:45.305884 master-0 kubenswrapper[18592]: I0308 04:16:45.305815 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kk8z\" (UniqueName: \"kubernetes.io/projected/81d0a12f-c85b-41ef-a076-efa3dd40f9aa-kube-api-access-2kk8z\") pod \"ironic-e561-account-create-update-pmf7m\" (UID: \"81d0a12f-c85b-41ef-a076-efa3dd40f9aa\") " pod="openstack/ironic-e561-account-create-update-pmf7m"
Mar 08 04:16:45.306053 master-0 kubenswrapper[18592]: I0308 04:16:45.305933 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/117a9c49-cd48-4a2c-bdee-10bb60588b20-scripts\") pod \"cinder-ff301-db-sync-bgkm7\" (UID: \"117a9c49-cd48-4a2c-bdee-10bb60588b20\") " pod="openstack/cinder-ff301-db-sync-bgkm7"
Mar 08 04:16:45.306053 master-0 kubenswrapper[18592]: I0308 04:16:45.305987 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/117a9c49-cd48-4a2c-bdee-10bb60588b20-db-sync-config-data\") pod \"cinder-ff301-db-sync-bgkm7\" (UID: \"117a9c49-cd48-4a2c-bdee-10bb60588b20\") " pod="openstack/cinder-ff301-db-sync-bgkm7"
Mar 08 04:16:45.306053 master-0 kubenswrapper[18592]: I0308 04:16:45.306049 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/117a9c49-cd48-4a2c-bdee-10bb60588b20-config-data\") pod \"cinder-ff301-db-sync-bgkm7\" (UID: \"117a9c49-cd48-4a2c-bdee-10bb60588b20\") " pod="openstack/cinder-ff301-db-sync-bgkm7"
Mar 08 04:16:45.306147 master-0 kubenswrapper[18592]: I0308 04:16:45.306109 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/117a9c49-cd48-4a2c-bdee-10bb60588b20-combined-ca-bundle\") pod \"cinder-ff301-db-sync-bgkm7\" (UID: \"117a9c49-cd48-4a2c-bdee-10bb60588b20\") " pod="openstack/cinder-ff301-db-sync-bgkm7"
Mar 08 04:16:45.307158 master-0 kubenswrapper[18592]: I0308 04:16:45.306421 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81d0a12f-c85b-41ef-a076-efa3dd40f9aa-operator-scripts\") pod \"ironic-e561-account-create-update-pmf7m\" (UID: \"81d0a12f-c85b-41ef-a076-efa3dd40f9aa\") " pod="openstack/ironic-e561-account-create-update-pmf7m"
Mar 08 04:16:45.307158 master-0 kubenswrapper[18592]: I0308 04:16:45.306562 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/117a9c49-cd48-4a2c-bdee-10bb60588b20-etc-machine-id\") pod \"cinder-ff301-db-sync-bgkm7\" (UID: \"117a9c49-cd48-4a2c-bdee-10bb60588b20\") " pod="openstack/cinder-ff301-db-sync-bgkm7"
Mar 08 04:16:45.307158 master-0 kubenswrapper[18592]: I0308 04:16:45.306851 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdkv6\" (UniqueName: \"kubernetes.io/projected/117a9c49-cd48-4a2c-bdee-10bb60588b20-kube-api-access-wdkv6\") pod \"cinder-ff301-db-sync-bgkm7\" (UID: \"117a9c49-cd48-4a2c-bdee-10bb60588b20\") " pod="openstack/cinder-ff301-db-sync-bgkm7"
Mar 08 04:16:45.308010 master-0 kubenswrapper[18592]: I0308 04:16:45.307966 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81d0a12f-c85b-41ef-a076-efa3dd40f9aa-operator-scripts\") pod \"ironic-e561-account-create-update-pmf7m\" (UID: \"81d0a12f-c85b-41ef-a076-efa3dd40f9aa\") " pod="openstack/ironic-e561-account-create-update-pmf7m"
Mar 08 04:16:45.320032 master-0 kubenswrapper[18592]: I0308 04:16:45.320005 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kk8z\" (UniqueName: \"kubernetes.io/projected/81d0a12f-c85b-41ef-a076-efa3dd40f9aa-kube-api-access-2kk8z\") pod \"ironic-e561-account-create-update-pmf7m\" (UID: \"81d0a12f-c85b-41ef-a076-efa3dd40f9aa\") " pod="openstack/ironic-e561-account-create-update-pmf7m"
Mar 08 04:16:45.320204 master-0 kubenswrapper[18592]: I0308 04:16:45.320056 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-7ttxm"]
Mar 08 04:16:45.321942 master-0 kubenswrapper[18592]: I0308 04:16:45.321912 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-7ttxm"]
Mar 08 04:16:45.322006 master-0 kubenswrapper[18592]: I0308 04:16:45.321999 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-7ttxm"
Mar 08 04:16:45.326415 master-0 kubenswrapper[18592]: I0308 04:16:45.326379 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Mar 08 04:16:45.327166 master-0 kubenswrapper[18592]: I0308 04:16:45.327139 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Mar 08 04:16:45.378797 master-0 kubenswrapper[18592]: I0308 04:16:45.378743 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79c75d759-lh7hc"]
Mar 08 04:16:45.405340 master-0 kubenswrapper[18592]: I0308 04:16:45.405280 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-db-create-rk2j4"
Mar 08 04:16:45.437870 master-0 kubenswrapper[18592]: I0308 04:16:45.419601 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/117a9c49-cd48-4a2c-bdee-10bb60588b20-db-sync-config-data\") pod \"cinder-ff301-db-sync-bgkm7\" (UID: \"117a9c49-cd48-4a2c-bdee-10bb60588b20\") " pod="openstack/cinder-ff301-db-sync-bgkm7"
Mar 08 04:16:45.437870 master-0 kubenswrapper[18592]: I0308 04:16:45.419699 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/117a9c49-cd48-4a2c-bdee-10bb60588b20-config-data\") pod \"cinder-ff301-db-sync-bgkm7\" (UID: \"117a9c49-cd48-4a2c-bdee-10bb60588b20\") " pod="openstack/cinder-ff301-db-sync-bgkm7"
Mar 08 04:16:45.437870 master-0 kubenswrapper[18592]: I0308 04:16:45.419745 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad7d2ffe-d3ac-4a74-ae47-46241d6c4769-logs\") pod \"placement-db-sync-7ttxm\" (UID: \"ad7d2ffe-d3ac-4a74-ae47-46241d6c4769\") " pod="openstack/placement-db-sync-7ttxm"
Mar 08 04:16:45.437870 master-0 kubenswrapper[18592]: I0308 04:16:45.419762 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/117a9c49-cd48-4a2c-bdee-10bb60588b20-combined-ca-bundle\") pod \"cinder-ff301-db-sync-bgkm7\" (UID: \"117a9c49-cd48-4a2c-bdee-10bb60588b20\") " pod="openstack/cinder-ff301-db-sync-bgkm7"
Mar 08 04:16:45.437870 master-0 kubenswrapper[18592]: I0308 04:16:45.419970 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwwgd\" (UniqueName: \"kubernetes.io/projected/ad7d2ffe-d3ac-4a74-ae47-46241d6c4769-kube-api-access-zwwgd\") pod \"placement-db-sync-7ttxm\" (UID: \"ad7d2ffe-d3ac-4a74-ae47-46241d6c4769\") " pod="openstack/placement-db-sync-7ttxm"
Mar 08 04:16:45.437870 master-0 kubenswrapper[18592]: I0308 04:16:45.419998 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad7d2ffe-d3ac-4a74-ae47-46241d6c4769-combined-ca-bundle\") pod \"placement-db-sync-7ttxm\" (UID: \"ad7d2ffe-d3ac-4a74-ae47-46241d6c4769\") " pod="openstack/placement-db-sync-7ttxm"
Mar 08 04:16:45.437870 master-0 kubenswrapper[18592]: I0308 04:16:45.420031 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad7d2ffe-d3ac-4a74-ae47-46241d6c4769-config-data\") pod \"placement-db-sync-7ttxm\" (UID: \"ad7d2ffe-d3ac-4a74-ae47-46241d6c4769\") " pod="openstack/placement-db-sync-7ttxm"
Mar 08 04:16:45.437870 master-0 kubenswrapper[18592]: I0308 04:16:45.423856 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/117a9c49-cd48-4a2c-bdee-10bb60588b20-combined-ca-bundle\") pod \"cinder-ff301-db-sync-bgkm7\" (UID: \"117a9c49-cd48-4a2c-bdee-10bb60588b20\") " pod="openstack/cinder-ff301-db-sync-bgkm7"
Mar 08 04:16:45.437870 master-0 kubenswrapper[18592]: I0308 04:16:45.423891 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad7d2ffe-d3ac-4a74-ae47-46241d6c4769-scripts\") pod \"placement-db-sync-7ttxm\" (UID: \"ad7d2ffe-d3ac-4a74-ae47-46241d6c4769\") " pod="openstack/placement-db-sync-7ttxm"
Mar 08 04:16:45.437870 master-0 kubenswrapper[18592]: I0308 04:16:45.424801 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/117a9c49-cd48-4a2c-bdee-10bb60588b20-etc-machine-id\") pod \"cinder-ff301-db-sync-bgkm7\" (UID: \"117a9c49-cd48-4a2c-bdee-10bb60588b20\") " pod="openstack/cinder-ff301-db-sync-bgkm7"
Mar 08 04:16:45.437870 master-0 kubenswrapper[18592]: I0308 04:16:45.424878 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdkv6\" (UniqueName: \"kubernetes.io/projected/117a9c49-cd48-4a2c-bdee-10bb60588b20-kube-api-access-wdkv6\") pod \"cinder-ff301-db-sync-bgkm7\" (UID: \"117a9c49-cd48-4a2c-bdee-10bb60588b20\") " pod="openstack/cinder-ff301-db-sync-bgkm7"
Mar 08 04:16:45.437870 master-0 kubenswrapper[18592]: I0308 04:16:45.424911 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/117a9c49-cd48-4a2c-bdee-10bb60588b20-etc-machine-id\") pod \"cinder-ff301-db-sync-bgkm7\" (UID: \"117a9c49-cd48-4a2c-bdee-10bb60588b20\") " pod="openstack/cinder-ff301-db-sync-bgkm7"
Mar 08 04:16:45.437870 master-0 kubenswrapper[18592]: I0308 04:16:45.424973 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/117a9c49-cd48-4a2c-bdee-10bb60588b20-db-sync-config-data\") pod \"cinder-ff301-db-sync-bgkm7\" (UID: \"117a9c49-cd48-4a2c-bdee-10bb60588b20\") " pod="openstack/cinder-ff301-db-sync-bgkm7"
Mar 08 04:16:45.437870 master-0 kubenswrapper[18592]: I0308 04:16:45.425401 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/117a9c49-cd48-4a2c-bdee-10bb60588b20-scripts\") pod \"cinder-ff301-db-sync-bgkm7\" (UID: \"117a9c49-cd48-4a2c-bdee-10bb60588b20\") " pod="openstack/cinder-ff301-db-sync-bgkm7"
Mar 08 04:16:45.437870 master-0 kubenswrapper[18592]: I0308 04:16:45.429331 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/117a9c49-cd48-4a2c-bdee-10bb60588b20-config-data\") pod \"cinder-ff301-db-sync-bgkm7\" (UID: \"117a9c49-cd48-4a2c-bdee-10bb60588b20\") " pod="openstack/cinder-ff301-db-sync-bgkm7"
Mar 08 04:16:45.437870 master-0 kubenswrapper[18592]: I0308 04:16:45.432515 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/117a9c49-cd48-4a2c-bdee-10bb60588b20-scripts\") pod \"cinder-ff301-db-sync-bgkm7\" (UID: \"117a9c49-cd48-4a2c-bdee-10bb60588b20\") " pod="openstack/cinder-ff301-db-sync-bgkm7"
Mar 08 04:16:45.468413 master-0 kubenswrapper[18592]: I0308 04:16:45.449675 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdkv6\" (UniqueName: \"kubernetes.io/projected/117a9c49-cd48-4a2c-bdee-10bb60588b20-kube-api-access-wdkv6\") pod \"cinder-ff301-db-sync-bgkm7\" (UID: \"117a9c49-cd48-4a2c-bdee-10bb60588b20\") " pod="openstack/cinder-ff301-db-sync-bgkm7"
Mar 08 04:16:45.475620 master-0 kubenswrapper[18592]: I0308 04:16:45.474709 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5584fcc769-csbz6"]
Mar 08 04:16:45.481122 master-0 kubenswrapper[18592]: W0308 04:16:45.480488 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod319bbb45_9cc5_4110_8aed_94ccbe808779.slice/crio-bc21c00eac515f76c290618c7f8a28f745651b7325e8c3f3a3e7beeeff9378cd WatchSource:0}: Error finding container bc21c00eac515f76c290618c7f8a28f745651b7325e8c3f3a3e7beeeff9378cd: Status 404 returned error can't find the container with id bc21c00eac515f76c290618c7f8a28f745651b7325e8c3f3a3e7beeeff9378cd
Mar 08 04:16:45.498090 master-0 kubenswrapper[18592]: I0308 04:16:45.498008 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5584fcc769-csbz6"]
Mar 08 04:16:45.498698 master-0 kubenswrapper[18592]: I0308 04:16:45.498262 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5584fcc769-csbz6"
Mar 08 04:16:45.509100 master-0 kubenswrapper[18592]: I0308 04:16:45.508683 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-z8kn6"]
Mar 08 04:16:45.521001 master-0 kubenswrapper[18592]: I0308 04:16:45.520928 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-vhcff"
Mar 08 04:16:45.528702 master-0 kubenswrapper[18592]: I0308 04:16:45.527245 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad7d2ffe-d3ac-4a74-ae47-46241d6c4769-logs\") pod \"placement-db-sync-7ttxm\" (UID: \"ad7d2ffe-d3ac-4a74-ae47-46241d6c4769\") " pod="openstack/placement-db-sync-7ttxm"
Mar 08 04:16:45.528702 master-0 kubenswrapper[18592]: I0308 04:16:45.527302 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwwgd\" (UniqueName: \"kubernetes.io/projected/ad7d2ffe-d3ac-4a74-ae47-46241d6c4769-kube-api-access-zwwgd\") pod \"placement-db-sync-7ttxm\" (UID: \"ad7d2ffe-d3ac-4a74-ae47-46241d6c4769\") " pod="openstack/placement-db-sync-7ttxm"
Mar 08 04:16:45.528702 master-0 kubenswrapper[18592]: I0308 04:16:45.527326 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad7d2ffe-d3ac-4a74-ae47-46241d6c4769-combined-ca-bundle\") pod \"placement-db-sync-7ttxm\" (UID: \"ad7d2ffe-d3ac-4a74-ae47-46241d6c4769\") " pod="openstack/placement-db-sync-7ttxm"
Mar 08 04:16:45.528702 master-0 kubenswrapper[18592]: I0308 04:16:45.527359 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad7d2ffe-d3ac-4a74-ae47-46241d6c4769-config-data\") pod \"placement-db-sync-7ttxm\" (UID: \"ad7d2ffe-d3ac-4a74-ae47-46241d6c4769\") " pod="openstack/placement-db-sync-7ttxm"
Mar 08 04:16:45.528702 master-0 kubenswrapper[18592]: I0308 04:16:45.527378 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad7d2ffe-d3ac-4a74-ae47-46241d6c4769-scripts\") pod \"placement-db-sync-7ttxm\" (UID: \"ad7d2ffe-d3ac-4a74-ae47-46241d6c4769\") " pod="openstack/placement-db-sync-7ttxm" Mar 08 04:16:45.528702 master-0 kubenswrapper[18592]: I0308 04:16:45.528689 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad7d2ffe-d3ac-4a74-ae47-46241d6c4769-logs\") pod \"placement-db-sync-7ttxm\" (UID: \"ad7d2ffe-d3ac-4a74-ae47-46241d6c4769\") " pod="openstack/placement-db-sync-7ttxm" Mar 08 04:16:45.531706 master-0 kubenswrapper[18592]: I0308 04:16:45.531643 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad7d2ffe-d3ac-4a74-ae47-46241d6c4769-combined-ca-bundle\") pod \"placement-db-sync-7ttxm\" (UID: \"ad7d2ffe-d3ac-4a74-ae47-46241d6c4769\") " pod="openstack/placement-db-sync-7ttxm" Mar 08 04:16:45.535677 master-0 kubenswrapper[18592]: I0308 04:16:45.534586 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad7d2ffe-d3ac-4a74-ae47-46241d6c4769-scripts\") pod \"placement-db-sync-7ttxm\" (UID: \"ad7d2ffe-d3ac-4a74-ae47-46241d6c4769\") " pod="openstack/placement-db-sync-7ttxm" Mar 08 04:16:45.535677 master-0 kubenswrapper[18592]: I0308 04:16:45.535625 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad7d2ffe-d3ac-4a74-ae47-46241d6c4769-config-data\") pod \"placement-db-sync-7ttxm\" (UID: \"ad7d2ffe-d3ac-4a74-ae47-46241d6c4769\") " pod="openstack/placement-db-sync-7ttxm" Mar 08 04:16:45.548701 master-0 kubenswrapper[18592]: I0308 04:16:45.548660 18592 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-zwwgd\" (UniqueName: \"kubernetes.io/projected/ad7d2ffe-d3ac-4a74-ae47-46241d6c4769-kube-api-access-zwwgd\") pod \"placement-db-sync-7ttxm\" (UID: \"ad7d2ffe-d3ac-4a74-ae47-46241d6c4769\") " pod="openstack/placement-db-sync-7ttxm" Mar 08 04:16:45.582898 master-0 kubenswrapper[18592]: I0308 04:16:45.582744 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-e561-account-create-update-pmf7m" Mar 08 04:16:45.604004 master-0 kubenswrapper[18592]: I0308 04:16:45.603424 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-ff301-db-sync-bgkm7" Mar 08 04:16:45.631480 master-0 kubenswrapper[18592]: I0308 04:16:45.631355 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea394be4-dd2b-4519-b7bc-80ac84a4cd16-config\") pod \"dnsmasq-dns-5584fcc769-csbz6\" (UID: \"ea394be4-dd2b-4519-b7bc-80ac84a4cd16\") " pod="openstack/dnsmasq-dns-5584fcc769-csbz6" Mar 08 04:16:45.631791 master-0 kubenswrapper[18592]: I0308 04:16:45.631741 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ea394be4-dd2b-4519-b7bc-80ac84a4cd16-dns-svc\") pod \"dnsmasq-dns-5584fcc769-csbz6\" (UID: \"ea394be4-dd2b-4519-b7bc-80ac84a4cd16\") " pod="openstack/dnsmasq-dns-5584fcc769-csbz6" Mar 08 04:16:45.632006 master-0 kubenswrapper[18592]: I0308 04:16:45.631899 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ea394be4-dd2b-4519-b7bc-80ac84a4cd16-ovsdbserver-sb\") pod \"dnsmasq-dns-5584fcc769-csbz6\" (UID: \"ea394be4-dd2b-4519-b7bc-80ac84a4cd16\") " pod="openstack/dnsmasq-dns-5584fcc769-csbz6" Mar 08 04:16:45.632006 master-0 kubenswrapper[18592]: I0308 
04:16:45.631965 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95w42\" (UniqueName: \"kubernetes.io/projected/ea394be4-dd2b-4519-b7bc-80ac84a4cd16-kube-api-access-95w42\") pod \"dnsmasq-dns-5584fcc769-csbz6\" (UID: \"ea394be4-dd2b-4519-b7bc-80ac84a4cd16\") " pod="openstack/dnsmasq-dns-5584fcc769-csbz6" Mar 08 04:16:45.632244 master-0 kubenswrapper[18592]: I0308 04:16:45.632167 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ea394be4-dd2b-4519-b7bc-80ac84a4cd16-dns-swift-storage-0\") pod \"dnsmasq-dns-5584fcc769-csbz6\" (UID: \"ea394be4-dd2b-4519-b7bc-80ac84a4cd16\") " pod="openstack/dnsmasq-dns-5584fcc769-csbz6" Mar 08 04:16:45.632400 master-0 kubenswrapper[18592]: I0308 04:16:45.632365 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ea394be4-dd2b-4519-b7bc-80ac84a4cd16-ovsdbserver-nb\") pod \"dnsmasq-dns-5584fcc769-csbz6\" (UID: \"ea394be4-dd2b-4519-b7bc-80ac84a4cd16\") " pod="openstack/dnsmasq-dns-5584fcc769-csbz6" Mar 08 04:16:45.660666 master-0 kubenswrapper[18592]: I0308 04:16:45.659135 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-7ttxm" Mar 08 04:16:45.752060 master-0 kubenswrapper[18592]: I0308 04:16:45.745743 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ea394be4-dd2b-4519-b7bc-80ac84a4cd16-dns-swift-storage-0\") pod \"dnsmasq-dns-5584fcc769-csbz6\" (UID: \"ea394be4-dd2b-4519-b7bc-80ac84a4cd16\") " pod="openstack/dnsmasq-dns-5584fcc769-csbz6" Mar 08 04:16:45.752060 master-0 kubenswrapper[18592]: I0308 04:16:45.745862 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ea394be4-dd2b-4519-b7bc-80ac84a4cd16-ovsdbserver-nb\") pod \"dnsmasq-dns-5584fcc769-csbz6\" (UID: \"ea394be4-dd2b-4519-b7bc-80ac84a4cd16\") " pod="openstack/dnsmasq-dns-5584fcc769-csbz6" Mar 08 04:16:45.752060 master-0 kubenswrapper[18592]: I0308 04:16:45.745905 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea394be4-dd2b-4519-b7bc-80ac84a4cd16-config\") pod \"dnsmasq-dns-5584fcc769-csbz6\" (UID: \"ea394be4-dd2b-4519-b7bc-80ac84a4cd16\") " pod="openstack/dnsmasq-dns-5584fcc769-csbz6" Mar 08 04:16:45.752060 master-0 kubenswrapper[18592]: I0308 04:16:45.745932 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ea394be4-dd2b-4519-b7bc-80ac84a4cd16-dns-svc\") pod \"dnsmasq-dns-5584fcc769-csbz6\" (UID: \"ea394be4-dd2b-4519-b7bc-80ac84a4cd16\") " pod="openstack/dnsmasq-dns-5584fcc769-csbz6" Mar 08 04:16:45.752060 master-0 kubenswrapper[18592]: I0308 04:16:45.745977 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ea394be4-dd2b-4519-b7bc-80ac84a4cd16-ovsdbserver-sb\") pod \"dnsmasq-dns-5584fcc769-csbz6\" (UID: 
\"ea394be4-dd2b-4519-b7bc-80ac84a4cd16\") " pod="openstack/dnsmasq-dns-5584fcc769-csbz6" Mar 08 04:16:45.752060 master-0 kubenswrapper[18592]: I0308 04:16:45.746016 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95w42\" (UniqueName: \"kubernetes.io/projected/ea394be4-dd2b-4519-b7bc-80ac84a4cd16-kube-api-access-95w42\") pod \"dnsmasq-dns-5584fcc769-csbz6\" (UID: \"ea394be4-dd2b-4519-b7bc-80ac84a4cd16\") " pod="openstack/dnsmasq-dns-5584fcc769-csbz6" Mar 08 04:16:45.752060 master-0 kubenswrapper[18592]: I0308 04:16:45.747464 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ea394be4-dd2b-4519-b7bc-80ac84a4cd16-dns-svc\") pod \"dnsmasq-dns-5584fcc769-csbz6\" (UID: \"ea394be4-dd2b-4519-b7bc-80ac84a4cd16\") " pod="openstack/dnsmasq-dns-5584fcc769-csbz6" Mar 08 04:16:45.752060 master-0 kubenswrapper[18592]: I0308 04:16:45.748002 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ea394be4-dd2b-4519-b7bc-80ac84a4cd16-dns-swift-storage-0\") pod \"dnsmasq-dns-5584fcc769-csbz6\" (UID: \"ea394be4-dd2b-4519-b7bc-80ac84a4cd16\") " pod="openstack/dnsmasq-dns-5584fcc769-csbz6" Mar 08 04:16:45.752060 master-0 kubenswrapper[18592]: I0308 04:16:45.748092 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ea394be4-dd2b-4519-b7bc-80ac84a4cd16-ovsdbserver-nb\") pod \"dnsmasq-dns-5584fcc769-csbz6\" (UID: \"ea394be4-dd2b-4519-b7bc-80ac84a4cd16\") " pod="openstack/dnsmasq-dns-5584fcc769-csbz6" Mar 08 04:16:45.752060 master-0 kubenswrapper[18592]: I0308 04:16:45.748690 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea394be4-dd2b-4519-b7bc-80ac84a4cd16-config\") pod \"dnsmasq-dns-5584fcc769-csbz6\" (UID: 
\"ea394be4-dd2b-4519-b7bc-80ac84a4cd16\") " pod="openstack/dnsmasq-dns-5584fcc769-csbz6" Mar 08 04:16:45.752060 master-0 kubenswrapper[18592]: I0308 04:16:45.748748 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ea394be4-dd2b-4519-b7bc-80ac84a4cd16-ovsdbserver-sb\") pod \"dnsmasq-dns-5584fcc769-csbz6\" (UID: \"ea394be4-dd2b-4519-b7bc-80ac84a4cd16\") " pod="openstack/dnsmasq-dns-5584fcc769-csbz6" Mar 08 04:16:45.801481 master-0 kubenswrapper[18592]: I0308 04:16:45.799495 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f4bc7c77c-swkkv" Mar 08 04:16:45.813797 master-0 kubenswrapper[18592]: I0308 04:16:45.813550 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95w42\" (UniqueName: \"kubernetes.io/projected/ea394be4-dd2b-4519-b7bc-80ac84a4cd16-kube-api-access-95w42\") pod \"dnsmasq-dns-5584fcc769-csbz6\" (UID: \"ea394be4-dd2b-4519-b7bc-80ac84a4cd16\") " pod="openstack/dnsmasq-dns-5584fcc769-csbz6" Mar 08 04:16:45.842126 master-0 kubenswrapper[18592]: I0308 04:16:45.841022 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5584fcc769-csbz6" Mar 08 04:16:45.977729 master-0 kubenswrapper[18592]: I0308 04:16:45.968979 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/55996053-3eb1-4222-b62f-9fca82e2c6fc-ovsdbserver-nb\") pod \"55996053-3eb1-4222-b62f-9fca82e2c6fc\" (UID: \"55996053-3eb1-4222-b62f-9fca82e2c6fc\") " Mar 08 04:16:45.977729 master-0 kubenswrapper[18592]: I0308 04:16:45.969049 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jrh6\" (UniqueName: \"kubernetes.io/projected/55996053-3eb1-4222-b62f-9fca82e2c6fc-kube-api-access-7jrh6\") pod \"55996053-3eb1-4222-b62f-9fca82e2c6fc\" (UID: \"55996053-3eb1-4222-b62f-9fca82e2c6fc\") " Mar 08 04:16:45.977729 master-0 kubenswrapper[18592]: I0308 04:16:45.969121 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/55996053-3eb1-4222-b62f-9fca82e2c6fc-ovsdbserver-sb\") pod \"55996053-3eb1-4222-b62f-9fca82e2c6fc\" (UID: \"55996053-3eb1-4222-b62f-9fca82e2c6fc\") " Mar 08 04:16:45.977729 master-0 kubenswrapper[18592]: I0308 04:16:45.969158 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55996053-3eb1-4222-b62f-9fca82e2c6fc-config\") pod \"55996053-3eb1-4222-b62f-9fca82e2c6fc\" (UID: \"55996053-3eb1-4222-b62f-9fca82e2c6fc\") " Mar 08 04:16:45.977729 master-0 kubenswrapper[18592]: I0308 04:16:45.969237 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/55996053-3eb1-4222-b62f-9fca82e2c6fc-dns-svc\") pod \"55996053-3eb1-4222-b62f-9fca82e2c6fc\" (UID: \"55996053-3eb1-4222-b62f-9fca82e2c6fc\") " Mar 08 04:16:45.977729 master-0 kubenswrapper[18592]: I0308 04:16:45.969295 18592 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/55996053-3eb1-4222-b62f-9fca82e2c6fc-dns-swift-storage-0\") pod \"55996053-3eb1-4222-b62f-9fca82e2c6fc\" (UID: \"55996053-3eb1-4222-b62f-9fca82e2c6fc\") " Mar 08 04:16:46.083841 master-0 kubenswrapper[18592]: I0308 04:16:46.075945 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55996053-3eb1-4222-b62f-9fca82e2c6fc-kube-api-access-7jrh6" (OuterVolumeSpecName: "kube-api-access-7jrh6") pod "55996053-3eb1-4222-b62f-9fca82e2c6fc" (UID: "55996053-3eb1-4222-b62f-9fca82e2c6fc"). InnerVolumeSpecName "kube-api-access-7jrh6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 04:16:46.098907 master-0 kubenswrapper[18592]: I0308 04:16:46.093833 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79c75d759-lh7hc"] Mar 08 04:16:46.173553 master-0 kubenswrapper[18592]: I0308 04:16:46.173476 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55996053-3eb1-4222-b62f-9fca82e2c6fc-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "55996053-3eb1-4222-b62f-9fca82e2c6fc" (UID: "55996053-3eb1-4222-b62f-9fca82e2c6fc"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:16:46.188539 master-0 kubenswrapper[18592]: I0308 04:16:46.176438 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jrh6\" (UniqueName: \"kubernetes.io/projected/55996053-3eb1-4222-b62f-9fca82e2c6fc-kube-api-access-7jrh6\") on node \"master-0\" DevicePath \"\"" Mar 08 04:16:46.188539 master-0 kubenswrapper[18592]: I0308 04:16:46.176487 18592 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/55996053-3eb1-4222-b62f-9fca82e2c6fc-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Mar 08 04:16:46.188539 master-0 kubenswrapper[18592]: I0308 04:16:46.186857 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f4bc7c77c-swkkv" Mar 08 04:16:46.278041 master-0 kubenswrapper[18592]: I0308 04:16:46.272812 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55996053-3eb1-4222-b62f-9fca82e2c6fc-config" (OuterVolumeSpecName: "config") pod "55996053-3eb1-4222-b62f-9fca82e2c6fc" (UID: "55996053-3eb1-4222-b62f-9fca82e2c6fc"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:16:46.280756 master-0 kubenswrapper[18592]: I0308 04:16:46.280359 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-z8kn6" podStartSLOduration=2.280338003 podStartE2EDuration="2.280338003s" podCreationTimestamp="2026-03-08 04:16:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 04:16:46.265048268 +0000 UTC m=+1418.363802618" watchObservedRunningTime="2026-03-08 04:16:46.280338003 +0000 UTC m=+1418.379092363" Mar 08 04:16:46.295057 master-0 kubenswrapper[18592]: I0308 04:16:46.294483 18592 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55996053-3eb1-4222-b62f-9fca82e2c6fc-config\") on node \"master-0\" DevicePath \"\"" Mar 08 04:16:46.303446 master-0 kubenswrapper[18592]: I0308 04:16:46.298802 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55996053-3eb1-4222-b62f-9fca82e2c6fc-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "55996053-3eb1-4222-b62f-9fca82e2c6fc" (UID: "55996053-3eb1-4222-b62f-9fca82e2c6fc"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:16:46.317041 master-0 kubenswrapper[18592]: I0308 04:16:46.315912 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55996053-3eb1-4222-b62f-9fca82e2c6fc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "55996053-3eb1-4222-b62f-9fca82e2c6fc" (UID: "55996053-3eb1-4222-b62f-9fca82e2c6fc"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:16:46.352278 master-0 kubenswrapper[18592]: I0308 04:16:46.350632 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-vhcff"] Mar 08 04:16:46.352278 master-0 kubenswrapper[18592]: I0308 04:16:46.350678 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-z8kn6" event={"ID":"319bbb45-9cc5-4110-8aed-94ccbe808779","Type":"ContainerStarted","Data":"815a2ab3203863df3a809e029c429539a9587ad09ba883aa8520d64bec13105d"} Mar 08 04:16:46.352278 master-0 kubenswrapper[18592]: I0308 04:16:46.350698 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-z8kn6" event={"ID":"319bbb45-9cc5-4110-8aed-94ccbe808779","Type":"ContainerStarted","Data":"bc21c00eac515f76c290618c7f8a28f745651b7325e8c3f3a3e7beeeff9378cd"} Mar 08 04:16:46.352278 master-0 kubenswrapper[18592]: I0308 04:16:46.350708 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f4bc7c77c-swkkv" event={"ID":"55996053-3eb1-4222-b62f-9fca82e2c6fc","Type":"ContainerDied","Data":"60a5655005d07e97ec6f7cbb1a9f1831e66859dedb1196963d849e5b8e5b0d0c"} Mar 08 04:16:46.352278 master-0 kubenswrapper[18592]: I0308 04:16:46.350731 18592 scope.go:117] "RemoveContainer" containerID="ae4a7d3f2cefb3b571e417223de83e4fe0b777d0233c8d00286b5211d2239b72" Mar 08 04:16:46.368062 master-0 kubenswrapper[18592]: I0308 04:16:46.360418 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55996053-3eb1-4222-b62f-9fca82e2c6fc-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "55996053-3eb1-4222-b62f-9fca82e2c6fc" (UID: "55996053-3eb1-4222-b62f-9fca82e2c6fc"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:16:46.405654 master-0 kubenswrapper[18592]: I0308 04:16:46.398985 18592 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/55996053-3eb1-4222-b62f-9fca82e2c6fc-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Mar 08 04:16:46.405654 master-0 kubenswrapper[18592]: I0308 04:16:46.399011 18592 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/55996053-3eb1-4222-b62f-9fca82e2c6fc-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Mar 08 04:16:46.405654 master-0 kubenswrapper[18592]: I0308 04:16:46.399021 18592 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/55996053-3eb1-4222-b62f-9fca82e2c6fc-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 08 04:16:46.429970 master-0 kubenswrapper[18592]: I0308 04:16:46.426983 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-db-create-rk2j4"] Mar 08 04:16:46.434070 master-0 kubenswrapper[18592]: I0308 04:16:46.433991 18592 scope.go:117] "RemoveContainer" containerID="ca5218db5c8613c2a953030d8ffa068b176f28de0ea17c9ddb0fe6c548f23d33" Mar 08 04:16:46.456355 master-0 kubenswrapper[18592]: W0308 04:16:46.456284 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f03ac81_08d4_4053_9a33_ae4228fb3d3b.slice/crio-010b7028b350b87b2ea58db37a7bea893edf0e0dbad2644eea198d27baf58f4d WatchSource:0}: Error finding container 010b7028b350b87b2ea58db37a7bea893edf0e0dbad2644eea198d27baf58f4d: Status 404 returned error can't find the container with id 010b7028b350b87b2ea58db37a7bea893edf0e0dbad2644eea198d27baf58f4d Mar 08 04:16:46.586182 master-0 kubenswrapper[18592]: I0308 04:16:46.586105 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-5f4bc7c77c-swkkv"] Mar 08 04:16:46.595908 master-0 kubenswrapper[18592]: I0308 04:16:46.595862 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f4bc7c77c-swkkv"] Mar 08 04:16:46.708896 master-0 kubenswrapper[18592]: I0308 04:16:46.706619 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-e561-account-create-update-pmf7m"] Mar 08 04:16:46.718943 master-0 kubenswrapper[18592]: W0308 04:16:46.717369 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81d0a12f_c85b_41ef_a076_efa3dd40f9aa.slice/crio-d81fda4b7a51ee89a266c3dd1cb518265f1f7785498ef9978b50da312c922602 WatchSource:0}: Error finding container d81fda4b7a51ee89a266c3dd1cb518265f1f7785498ef9978b50da312c922602: Status 404 returned error can't find the container with id d81fda4b7a51ee89a266c3dd1cb518265f1f7785498ef9978b50da312c922602 Mar 08 04:16:46.811213 master-0 kubenswrapper[18592]: I0308 04:16:46.810936 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-afe2b-default-external-api-0"] Mar 08 04:16:46.812333 master-0 kubenswrapper[18592]: E0308 04:16:46.811444 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55996053-3eb1-4222-b62f-9fca82e2c6fc" containerName="dnsmasq-dns" Mar 08 04:16:46.812333 master-0 kubenswrapper[18592]: I0308 04:16:46.811460 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="55996053-3eb1-4222-b62f-9fca82e2c6fc" containerName="dnsmasq-dns" Mar 08 04:16:46.812333 master-0 kubenswrapper[18592]: E0308 04:16:46.811473 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55996053-3eb1-4222-b62f-9fca82e2c6fc" containerName="init" Mar 08 04:16:46.812333 master-0 kubenswrapper[18592]: I0308 04:16:46.811479 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="55996053-3eb1-4222-b62f-9fca82e2c6fc" containerName="init" Mar 08 04:16:46.812333 
master-0 kubenswrapper[18592]: I0308 04:16:46.811672 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="55996053-3eb1-4222-b62f-9fca82e2c6fc" containerName="dnsmasq-dns" Mar 08 04:16:46.813294 master-0 kubenswrapper[18592]: I0308 04:16:46.812682 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-afe2b-default-external-api-0" Mar 08 04:16:46.815872 master-0 kubenswrapper[18592]: I0308 04:16:46.815596 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 08 04:16:46.815872 master-0 kubenswrapper[18592]: I0308 04:16:46.815745 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-afe2b-default-external-config-data" Mar 08 04:16:46.816734 master-0 kubenswrapper[18592]: I0308 04:16:46.816028 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 08 04:16:46.845847 master-0 kubenswrapper[18592]: I0308 04:16:46.845368 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-7ttxm"] Mar 08 04:16:46.854115 master-0 kubenswrapper[18592]: I0308 04:16:46.853441 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-afe2b-default-external-api-0"] Mar 08 04:16:46.871047 master-0 kubenswrapper[18592]: W0308 04:16:46.870510 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podad7d2ffe_d3ac_4a74_ae47_46241d6c4769.slice/crio-4ec4ec720b8c14311ccb5daf4442b2945876cee8fd583600358c5bf9725d0763 WatchSource:0}: Error finding container 4ec4ec720b8c14311ccb5daf4442b2945876cee8fd583600358c5bf9725d0763: Status 404 returned error can't find the container with id 4ec4ec720b8c14311ccb5daf4442b2945876cee8fd583600358c5bf9725d0763 Mar 08 04:16:46.914721 master-0 kubenswrapper[18592]: I0308 04:16:46.912743 18592 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5958986d-a206-420e-91c0-5d55f14f2183-logs\") pod \"glance-afe2b-default-external-api-0\" (UID: \"5958986d-a206-420e-91c0-5d55f14f2183\") " pod="openstack/glance-afe2b-default-external-api-0" Mar 08 04:16:46.914721 master-0 kubenswrapper[18592]: I0308 04:16:46.912869 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7l4q\" (UniqueName: \"kubernetes.io/projected/5958986d-a206-420e-91c0-5d55f14f2183-kube-api-access-x7l4q\") pod \"glance-afe2b-default-external-api-0\" (UID: \"5958986d-a206-420e-91c0-5d55f14f2183\") " pod="openstack/glance-afe2b-default-external-api-0" Mar 08 04:16:46.915745 master-0 kubenswrapper[18592]: I0308 04:16:46.915027 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5958986d-a206-420e-91c0-5d55f14f2183-scripts\") pod \"glance-afe2b-default-external-api-0\" (UID: \"5958986d-a206-420e-91c0-5d55f14f2183\") " pod="openstack/glance-afe2b-default-external-api-0" Mar 08 04:16:46.915745 master-0 kubenswrapper[18592]: I0308 04:16:46.915101 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-acff9521-23da-47da-b539-1fad9dc0c8dd\" (UniqueName: \"kubernetes.io/csi/topolvm.io^4bd1e53e-4aca-4bc2-a5d3-7e2d486d9cfb\") pod \"glance-afe2b-default-external-api-0\" (UID: \"5958986d-a206-420e-91c0-5d55f14f2183\") " pod="openstack/glance-afe2b-default-external-api-0" Mar 08 04:16:46.915745 master-0 kubenswrapper[18592]: I0308 04:16:46.915151 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5958986d-a206-420e-91c0-5d55f14f2183-httpd-run\") pod \"glance-afe2b-default-external-api-0\" (UID: 
\"5958986d-a206-420e-91c0-5d55f14f2183\") " pod="openstack/glance-afe2b-default-external-api-0" Mar 08 04:16:46.915745 master-0 kubenswrapper[18592]: I0308 04:16:46.915284 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5958986d-a206-420e-91c0-5d55f14f2183-config-data\") pod \"glance-afe2b-default-external-api-0\" (UID: \"5958986d-a206-420e-91c0-5d55f14f2183\") " pod="openstack/glance-afe2b-default-external-api-0" Mar 08 04:16:46.915745 master-0 kubenswrapper[18592]: I0308 04:16:46.915359 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5958986d-a206-420e-91c0-5d55f14f2183-combined-ca-bundle\") pod \"glance-afe2b-default-external-api-0\" (UID: \"5958986d-a206-420e-91c0-5d55f14f2183\") " pod="openstack/glance-afe2b-default-external-api-0" Mar 08 04:16:46.915745 master-0 kubenswrapper[18592]: I0308 04:16:46.915558 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5958986d-a206-420e-91c0-5d55f14f2183-public-tls-certs\") pod \"glance-afe2b-default-external-api-0\" (UID: \"5958986d-a206-420e-91c0-5d55f14f2183\") " pod="openstack/glance-afe2b-default-external-api-0" Mar 08 04:16:47.000505 master-0 kubenswrapper[18592]: I0308 04:16:47.000437 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5584fcc769-csbz6"] Mar 08 04:16:47.025092 master-0 kubenswrapper[18592]: I0308 04:16:47.017496 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5958986d-a206-420e-91c0-5d55f14f2183-config-data\") pod \"glance-afe2b-default-external-api-0\" (UID: \"5958986d-a206-420e-91c0-5d55f14f2183\") " pod="openstack/glance-afe2b-default-external-api-0" Mar 08 
04:16:47.025092 master-0 kubenswrapper[18592]: I0308 04:16:47.017571 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5958986d-a206-420e-91c0-5d55f14f2183-combined-ca-bundle\") pod \"glance-afe2b-default-external-api-0\" (UID: \"5958986d-a206-420e-91c0-5d55f14f2183\") " pod="openstack/glance-afe2b-default-external-api-0" Mar 08 04:16:47.025092 master-0 kubenswrapper[18592]: I0308 04:16:47.017852 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5958986d-a206-420e-91c0-5d55f14f2183-public-tls-certs\") pod \"glance-afe2b-default-external-api-0\" (UID: \"5958986d-a206-420e-91c0-5d55f14f2183\") " pod="openstack/glance-afe2b-default-external-api-0" Mar 08 04:16:47.025092 master-0 kubenswrapper[18592]: I0308 04:16:47.018140 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5958986d-a206-420e-91c0-5d55f14f2183-logs\") pod \"glance-afe2b-default-external-api-0\" (UID: \"5958986d-a206-420e-91c0-5d55f14f2183\") " pod="openstack/glance-afe2b-default-external-api-0" Mar 08 04:16:47.025092 master-0 kubenswrapper[18592]: I0308 04:16:47.018285 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7l4q\" (UniqueName: \"kubernetes.io/projected/5958986d-a206-420e-91c0-5d55f14f2183-kube-api-access-x7l4q\") pod \"glance-afe2b-default-external-api-0\" (UID: \"5958986d-a206-420e-91c0-5d55f14f2183\") " pod="openstack/glance-afe2b-default-external-api-0" Mar 08 04:16:47.025092 master-0 kubenswrapper[18592]: I0308 04:16:47.018376 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5958986d-a206-420e-91c0-5d55f14f2183-scripts\") pod \"glance-afe2b-default-external-api-0\" (UID: 
\"5958986d-a206-420e-91c0-5d55f14f2183\") " pod="openstack/glance-afe2b-default-external-api-0" Mar 08 04:16:47.025092 master-0 kubenswrapper[18592]: I0308 04:16:47.018402 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-acff9521-23da-47da-b539-1fad9dc0c8dd\" (UniqueName: \"kubernetes.io/csi/topolvm.io^4bd1e53e-4aca-4bc2-a5d3-7e2d486d9cfb\") pod \"glance-afe2b-default-external-api-0\" (UID: \"5958986d-a206-420e-91c0-5d55f14f2183\") " pod="openstack/glance-afe2b-default-external-api-0" Mar 08 04:16:47.025092 master-0 kubenswrapper[18592]: I0308 04:16:47.018433 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5958986d-a206-420e-91c0-5d55f14f2183-httpd-run\") pod \"glance-afe2b-default-external-api-0\" (UID: \"5958986d-a206-420e-91c0-5d55f14f2183\") " pod="openstack/glance-afe2b-default-external-api-0" Mar 08 04:16:47.025092 master-0 kubenswrapper[18592]: W0308 04:16:47.018981 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podea394be4_dd2b_4519_b7bc_80ac84a4cd16.slice/crio-b55ff8dfc679a86aeb9f480f0251a0568e987ff79fa428cfbf0c61d62ffb8556 WatchSource:0}: Error finding container b55ff8dfc679a86aeb9f480f0251a0568e987ff79fa428cfbf0c61d62ffb8556: Status 404 returned error can't find the container with id b55ff8dfc679a86aeb9f480f0251a0568e987ff79fa428cfbf0c61d62ffb8556 Mar 08 04:16:47.025092 master-0 kubenswrapper[18592]: I0308 04:16:47.020184 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5958986d-a206-420e-91c0-5d55f14f2183-httpd-run\") pod \"glance-afe2b-default-external-api-0\" (UID: \"5958986d-a206-420e-91c0-5d55f14f2183\") " pod="openstack/glance-afe2b-default-external-api-0" Mar 08 04:16:47.025092 master-0 kubenswrapper[18592]: I0308 04:16:47.023341 18592 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5958986d-a206-420e-91c0-5d55f14f2183-logs\") pod \"glance-afe2b-default-external-api-0\" (UID: \"5958986d-a206-420e-91c0-5d55f14f2183\") " pod="openstack/glance-afe2b-default-external-api-0" Mar 08 04:16:47.025092 master-0 kubenswrapper[18592]: I0308 04:16:47.023647 18592 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 08 04:16:47.025092 master-0 kubenswrapper[18592]: I0308 04:16:47.023671 18592 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-acff9521-23da-47da-b539-1fad9dc0c8dd\" (UniqueName: \"kubernetes.io/csi/topolvm.io^4bd1e53e-4aca-4bc2-a5d3-7e2d486d9cfb\") pod \"glance-afe2b-default-external-api-0\" (UID: \"5958986d-a206-420e-91c0-5d55f14f2183\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/4fa84c0fff972147ae10cb80fb512dbc343a3b9b44d5fb87069bb7869e732a9b/globalmount\"" pod="openstack/glance-afe2b-default-external-api-0" Mar 08 04:16:47.025491 master-0 kubenswrapper[18592]: I0308 04:16:47.025330 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5958986d-a206-420e-91c0-5d55f14f2183-public-tls-certs\") pod \"glance-afe2b-default-external-api-0\" (UID: \"5958986d-a206-420e-91c0-5d55f14f2183\") " pod="openstack/glance-afe2b-default-external-api-0" Mar 08 04:16:47.025773 master-0 kubenswrapper[18592]: I0308 04:16:47.025735 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5958986d-a206-420e-91c0-5d55f14f2183-combined-ca-bundle\") pod \"glance-afe2b-default-external-api-0\" (UID: \"5958986d-a206-420e-91c0-5d55f14f2183\") " pod="openstack/glance-afe2b-default-external-api-0" Mar 08 04:16:47.029566 master-0 kubenswrapper[18592]: I0308 04:16:47.026719 18592 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5958986d-a206-420e-91c0-5d55f14f2183-config-data\") pod \"glance-afe2b-default-external-api-0\" (UID: \"5958986d-a206-420e-91c0-5d55f14f2183\") " pod="openstack/glance-afe2b-default-external-api-0" Mar 08 04:16:47.031881 master-0 kubenswrapper[18592]: I0308 04:16:47.030004 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5958986d-a206-420e-91c0-5d55f14f2183-scripts\") pod \"glance-afe2b-default-external-api-0\" (UID: \"5958986d-a206-420e-91c0-5d55f14f2183\") " pod="openstack/glance-afe2b-default-external-api-0" Mar 08 04:16:47.050885 master-0 kubenswrapper[18592]: I0308 04:16:47.050574 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7l4q\" (UniqueName: \"kubernetes.io/projected/5958986d-a206-420e-91c0-5d55f14f2183-kube-api-access-x7l4q\") pod \"glance-afe2b-default-external-api-0\" (UID: \"5958986d-a206-420e-91c0-5d55f14f2183\") " pod="openstack/glance-afe2b-default-external-api-0" Mar 08 04:16:47.057911 master-0 kubenswrapper[18592]: I0308 04:16:47.057779 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-ff301-db-sync-bgkm7"] Mar 08 04:16:47.236269 master-0 kubenswrapper[18592]: I0308 04:16:47.236139 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-vhcff" event={"ID":"d0eaefda-7b75-43b9-8d18-8c1476db321d","Type":"ContainerStarted","Data":"fddf0c276ff3e965c6a7d21cfe20238c47c133412bb2794d15ef17ace13bbc58"} Mar 08 04:16:47.236269 master-0 kubenswrapper[18592]: I0308 04:16:47.236200 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-vhcff" event={"ID":"d0eaefda-7b75-43b9-8d18-8c1476db321d","Type":"ContainerStarted","Data":"3c4c3a7d2636242bfc2a1d3a1e5905ae0bec590b3fcd546cfff36c40db486da7"} Mar 08 04:16:47.238815 master-0 
kubenswrapper[18592]: I0308 04:16:47.238789 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-create-rk2j4" event={"ID":"1f03ac81-08d4-4053-9a33-ae4228fb3d3b","Type":"ContainerStarted","Data":"c721ea8060313fa5fc0c7961cf3fa40bc577a54204648b62c16a81b5d46812eb"} Mar 08 04:16:47.238815 master-0 kubenswrapper[18592]: I0308 04:16:47.238816 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-create-rk2j4" event={"ID":"1f03ac81-08d4-4053-9a33-ae4228fb3d3b","Type":"ContainerStarted","Data":"010b7028b350b87b2ea58db37a7bea893edf0e0dbad2644eea198d27baf58f4d"} Mar 08 04:16:47.240616 master-0 kubenswrapper[18592]: I0308 04:16:47.240477 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ff301-db-sync-bgkm7" event={"ID":"117a9c49-cd48-4a2c-bdee-10bb60588b20","Type":"ContainerStarted","Data":"2a58ccfbb8785c254db329c6d3dbdc21d79ca9f1b47d1e1f3e720136bb4002c9"} Mar 08 04:16:47.242167 master-0 kubenswrapper[18592]: I0308 04:16:47.242109 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-7ttxm" event={"ID":"ad7d2ffe-d3ac-4a74-ae47-46241d6c4769","Type":"ContainerStarted","Data":"4ec4ec720b8c14311ccb5daf4442b2945876cee8fd583600358c5bf9725d0763"} Mar 08 04:16:47.244118 master-0 kubenswrapper[18592]: I0308 04:16:47.244090 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-e561-account-create-update-pmf7m" event={"ID":"81d0a12f-c85b-41ef-a076-efa3dd40f9aa","Type":"ContainerStarted","Data":"d81fda4b7a51ee89a266c3dd1cb518265f1f7785498ef9978b50da312c922602"} Mar 08 04:16:47.247939 master-0 kubenswrapper[18592]: I0308 04:16:47.247880 18592 generic.go:334] "Generic (PLEG): container finished" podID="fc53e341-1423-4a10-b198-5fb7f51dcc52" containerID="19efb7e26a8677fa2c71d89e45940bc1d01493ba48ef6da814dd41c19e27e353" exitCode=0 Mar 08 04:16:47.248026 master-0 kubenswrapper[18592]: I0308 04:16:47.247965 18592 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/dnsmasq-dns-79c75d759-lh7hc" event={"ID":"fc53e341-1423-4a10-b198-5fb7f51dcc52","Type":"ContainerDied","Data":"19efb7e26a8677fa2c71d89e45940bc1d01493ba48ef6da814dd41c19e27e353"} Mar 08 04:16:47.248026 master-0 kubenswrapper[18592]: I0308 04:16:47.247983 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79c75d759-lh7hc" event={"ID":"fc53e341-1423-4a10-b198-5fb7f51dcc52","Type":"ContainerStarted","Data":"e7368d19b930aafba2cae447514c0c5ceb5beb4a054caec6568215559f5738de"} Mar 08 04:16:47.255685 master-0 kubenswrapper[18592]: I0308 04:16:47.255627 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5584fcc769-csbz6" event={"ID":"ea394be4-dd2b-4519-b7bc-80ac84a4cd16","Type":"ContainerStarted","Data":"b55ff8dfc679a86aeb9f480f0251a0568e987ff79fa428cfbf0c61d62ffb8556"} Mar 08 04:16:47.264291 master-0 kubenswrapper[18592]: I0308 04:16:47.264209 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-vhcff" podStartSLOduration=3.264191829 podStartE2EDuration="3.264191829s" podCreationTimestamp="2026-03-08 04:16:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 04:16:47.251195395 +0000 UTC m=+1419.349949745" watchObservedRunningTime="2026-03-08 04:16:47.264191829 +0000 UTC m=+1419.362946179" Mar 08 04:16:47.346403 master-0 kubenswrapper[18592]: I0308 04:16:47.346323 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-db-create-rk2j4" podStartSLOduration=3.34630185 podStartE2EDuration="3.34630185s" podCreationTimestamp="2026-03-08 04:16:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 04:16:47.30768768 +0000 UTC m=+1419.406442030" watchObservedRunningTime="2026-03-08 04:16:47.34630185 +0000 UTC 
m=+1419.445056190" Mar 08 04:16:47.894525 master-0 kubenswrapper[18592]: I0308 04:16:47.894455 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-afe2b-default-external-api-0"] Mar 08 04:16:47.895360 master-0 kubenswrapper[18592]: E0308 04:16:47.895317 18592 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[glance], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/glance-afe2b-default-external-api-0" podUID="5958986d-a206-420e-91c0-5d55f14f2183" Mar 08 04:16:47.940880 master-0 kubenswrapper[18592]: I0308 04:16:47.940736 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79c75d759-lh7hc" Mar 08 04:16:47.952497 master-0 kubenswrapper[18592]: I0308 04:16:47.949768 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-afe2b-default-internal-api-0"] Mar 08 04:16:47.952497 master-0 kubenswrapper[18592]: E0308 04:16:47.950339 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc53e341-1423-4a10-b198-5fb7f51dcc52" containerName="init" Mar 08 04:16:47.952497 master-0 kubenswrapper[18592]: I0308 04:16:47.950356 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc53e341-1423-4a10-b198-5fb7f51dcc52" containerName="init" Mar 08 04:16:47.952497 master-0 kubenswrapper[18592]: I0308 04:16:47.950564 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc53e341-1423-4a10-b198-5fb7f51dcc52" containerName="init" Mar 08 04:16:47.952497 master-0 kubenswrapper[18592]: I0308 04:16:47.951613 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:16:47.959911 master-0 kubenswrapper[18592]: I0308 04:16:47.953879 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 08 04:16:47.959911 master-0 kubenswrapper[18592]: I0308 04:16:47.954092 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-afe2b-default-internal-config-data" Mar 08 04:16:47.966535 master-0 kubenswrapper[18592]: I0308 04:16:47.966485 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-afe2b-default-internal-api-0"] Mar 08 04:16:47.993192 master-0 kubenswrapper[18592]: E0308 04:16:47.993136 18592 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle config-data glance httpd-run internal-tls-certs kube-api-access-p9d6x logs scripts], unattached volumes=[], failed to process volumes=[combined-ca-bundle config-data glance httpd-run internal-tls-certs kube-api-access-p9d6x logs scripts]: context canceled" pod="openstack/glance-afe2b-default-internal-api-0" podUID="3aea55ba-fea5-4a2c-8101-a7a94f8a518a" Mar 08 04:16:47.998730 master-0 kubenswrapper[18592]: I0308 04:16:47.998636 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-afe2b-default-internal-api-0"] Mar 08 04:16:48.042865 master-0 kubenswrapper[18592]: I0308 04:16:48.042801 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4fsp\" (UniqueName: \"kubernetes.io/projected/fc53e341-1423-4a10-b198-5fb7f51dcc52-kube-api-access-h4fsp\") pod \"fc53e341-1423-4a10-b198-5fb7f51dcc52\" (UID: \"fc53e341-1423-4a10-b198-5fb7f51dcc52\") " Mar 08 04:16:48.043117 master-0 kubenswrapper[18592]: I0308 04:16:48.042882 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/fc53e341-1423-4a10-b198-5fb7f51dcc52-ovsdbserver-sb\") pod \"fc53e341-1423-4a10-b198-5fb7f51dcc52\" (UID: \"fc53e341-1423-4a10-b198-5fb7f51dcc52\") " Mar 08 04:16:48.043117 master-0 kubenswrapper[18592]: I0308 04:16:48.042951 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc53e341-1423-4a10-b198-5fb7f51dcc52-dns-svc\") pod \"fc53e341-1423-4a10-b198-5fb7f51dcc52\" (UID: \"fc53e341-1423-4a10-b198-5fb7f51dcc52\") " Mar 08 04:16:48.043117 master-0 kubenswrapper[18592]: I0308 04:16:48.042970 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc53e341-1423-4a10-b198-5fb7f51dcc52-config\") pod \"fc53e341-1423-4a10-b198-5fb7f51dcc52\" (UID: \"fc53e341-1423-4a10-b198-5fb7f51dcc52\") " Mar 08 04:16:48.043117 master-0 kubenswrapper[18592]: I0308 04:16:48.043018 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fc53e341-1423-4a10-b198-5fb7f51dcc52-ovsdbserver-nb\") pod \"fc53e341-1423-4a10-b198-5fb7f51dcc52\" (UID: \"fc53e341-1423-4a10-b198-5fb7f51dcc52\") " Mar 08 04:16:48.043117 master-0 kubenswrapper[18592]: I0308 04:16:48.043102 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fc53e341-1423-4a10-b198-5fb7f51dcc52-dns-swift-storage-0\") pod \"fc53e341-1423-4a10-b198-5fb7f51dcc52\" (UID: \"fc53e341-1423-4a10-b198-5fb7f51dcc52\") " Mar 08 04:16:48.043349 master-0 kubenswrapper[18592]: I0308 04:16:48.043326 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3aea55ba-fea5-4a2c-8101-a7a94f8a518a-logs\") pod \"glance-afe2b-default-internal-api-0\" (UID: \"3aea55ba-fea5-4a2c-8101-a7a94f8a518a\") " 
pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:16:48.043402 master-0 kubenswrapper[18592]: I0308 04:16:48.043393 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3aea55ba-fea5-4a2c-8101-a7a94f8a518a-config-data\") pod \"glance-afe2b-default-internal-api-0\" (UID: \"3aea55ba-fea5-4a2c-8101-a7a94f8a518a\") " pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:16:48.043506 master-0 kubenswrapper[18592]: I0308 04:16:48.043444 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3aea55ba-fea5-4a2c-8101-a7a94f8a518a-combined-ca-bundle\") pod \"glance-afe2b-default-internal-api-0\" (UID: \"3aea55ba-fea5-4a2c-8101-a7a94f8a518a\") " pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:16:48.043506 master-0 kubenswrapper[18592]: I0308 04:16:48.043490 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9d6x\" (UniqueName: \"kubernetes.io/projected/3aea55ba-fea5-4a2c-8101-a7a94f8a518a-kube-api-access-p9d6x\") pod \"glance-afe2b-default-internal-api-0\" (UID: \"3aea55ba-fea5-4a2c-8101-a7a94f8a518a\") " pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:16:48.043573 master-0 kubenswrapper[18592]: I0308 04:16:48.043512 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3aea55ba-fea5-4a2c-8101-a7a94f8a518a-internal-tls-certs\") pod \"glance-afe2b-default-internal-api-0\" (UID: \"3aea55ba-fea5-4a2c-8101-a7a94f8a518a\") " pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:16:48.043573 master-0 kubenswrapper[18592]: I0308 04:16:48.043537 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"pvc-9bbcd16a-3cea-4572-8bc5-480804def335\" (UniqueName: \"kubernetes.io/csi/topolvm.io^613c9bf7-76cc-44ce-8d9f-8cdae5e6db9e\") pod \"glance-afe2b-default-internal-api-0\" (UID: \"3aea55ba-fea5-4a2c-8101-a7a94f8a518a\") " pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:16:48.043726 master-0 kubenswrapper[18592]: I0308 04:16:48.043584 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3aea55ba-fea5-4a2c-8101-a7a94f8a518a-httpd-run\") pod \"glance-afe2b-default-internal-api-0\" (UID: \"3aea55ba-fea5-4a2c-8101-a7a94f8a518a\") " pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:16:48.043726 master-0 kubenswrapper[18592]: I0308 04:16:48.043611 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3aea55ba-fea5-4a2c-8101-a7a94f8a518a-scripts\") pod \"glance-afe2b-default-internal-api-0\" (UID: \"3aea55ba-fea5-4a2c-8101-a7a94f8a518a\") " pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:16:48.048641 master-0 kubenswrapper[18592]: I0308 04:16:48.048587 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc53e341-1423-4a10-b198-5fb7f51dcc52-kube-api-access-h4fsp" (OuterVolumeSpecName: "kube-api-access-h4fsp") pod "fc53e341-1423-4a10-b198-5fb7f51dcc52" (UID: "fc53e341-1423-4a10-b198-5fb7f51dcc52"). InnerVolumeSpecName "kube-api-access-h4fsp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 04:16:48.071083 master-0 kubenswrapper[18592]: I0308 04:16:48.071012 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc53e341-1423-4a10-b198-5fb7f51dcc52-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fc53e341-1423-4a10-b198-5fb7f51dcc52" (UID: "fc53e341-1423-4a10-b198-5fb7f51dcc52"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:16:48.071254 master-0 kubenswrapper[18592]: I0308 04:16:48.071179 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc53e341-1423-4a10-b198-5fb7f51dcc52-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fc53e341-1423-4a10-b198-5fb7f51dcc52" (UID: "fc53e341-1423-4a10-b198-5fb7f51dcc52"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:16:48.073009 master-0 kubenswrapper[18592]: I0308 04:16:48.072965 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc53e341-1423-4a10-b198-5fb7f51dcc52-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fc53e341-1423-4a10-b198-5fb7f51dcc52" (UID: "fc53e341-1423-4a10-b198-5fb7f51dcc52"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:16:48.078438 master-0 kubenswrapper[18592]: I0308 04:16:48.078370 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc53e341-1423-4a10-b198-5fb7f51dcc52-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "fc53e341-1423-4a10-b198-5fb7f51dcc52" (UID: "fc53e341-1423-4a10-b198-5fb7f51dcc52"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:16:48.117506 master-0 kubenswrapper[18592]: I0308 04:16:48.117443 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc53e341-1423-4a10-b198-5fb7f51dcc52-config" (OuterVolumeSpecName: "config") pod "fc53e341-1423-4a10-b198-5fb7f51dcc52" (UID: "fc53e341-1423-4a10-b198-5fb7f51dcc52"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:16:48.149995 master-0 kubenswrapper[18592]: I0308 04:16:48.145454 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9d6x\" (UniqueName: \"kubernetes.io/projected/3aea55ba-fea5-4a2c-8101-a7a94f8a518a-kube-api-access-p9d6x\") pod \"glance-afe2b-default-internal-api-0\" (UID: \"3aea55ba-fea5-4a2c-8101-a7a94f8a518a\") " pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:16:48.149995 master-0 kubenswrapper[18592]: I0308 04:16:48.145545 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3aea55ba-fea5-4a2c-8101-a7a94f8a518a-internal-tls-certs\") pod \"glance-afe2b-default-internal-api-0\" (UID: \"3aea55ba-fea5-4a2c-8101-a7a94f8a518a\") " pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:16:48.149995 master-0 kubenswrapper[18592]: I0308 04:16:48.145642 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-9bbcd16a-3cea-4572-8bc5-480804def335\" (UniqueName: \"kubernetes.io/csi/topolvm.io^613c9bf7-76cc-44ce-8d9f-8cdae5e6db9e\") pod \"glance-afe2b-default-internal-api-0\" (UID: \"3aea55ba-fea5-4a2c-8101-a7a94f8a518a\") " pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:16:48.149995 master-0 kubenswrapper[18592]: I0308 04:16:48.145721 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3aea55ba-fea5-4a2c-8101-a7a94f8a518a-httpd-run\") pod \"glance-afe2b-default-internal-api-0\" (UID: \"3aea55ba-fea5-4a2c-8101-a7a94f8a518a\") " pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:16:48.149995 master-0 kubenswrapper[18592]: I0308 04:16:48.145750 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/3aea55ba-fea5-4a2c-8101-a7a94f8a518a-scripts\") pod \"glance-afe2b-default-internal-api-0\" (UID: \"3aea55ba-fea5-4a2c-8101-a7a94f8a518a\") " pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:16:48.149995 master-0 kubenswrapper[18592]: I0308 04:16:48.145809 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3aea55ba-fea5-4a2c-8101-a7a94f8a518a-logs\") pod \"glance-afe2b-default-internal-api-0\" (UID: \"3aea55ba-fea5-4a2c-8101-a7a94f8a518a\") " pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:16:48.149995 master-0 kubenswrapper[18592]: I0308 04:16:48.145916 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3aea55ba-fea5-4a2c-8101-a7a94f8a518a-config-data\") pod \"glance-afe2b-default-internal-api-0\" (UID: \"3aea55ba-fea5-4a2c-8101-a7a94f8a518a\") " pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:16:48.149995 master-0 kubenswrapper[18592]: I0308 04:16:48.146003 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3aea55ba-fea5-4a2c-8101-a7a94f8a518a-combined-ca-bundle\") pod \"glance-afe2b-default-internal-api-0\" (UID: \"3aea55ba-fea5-4a2c-8101-a7a94f8a518a\") " pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:16:48.149995 master-0 kubenswrapper[18592]: I0308 04:16:48.146124 18592 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc53e341-1423-4a10-b198-5fb7f51dcc52-config\") on node \"master-0\" DevicePath \"\"" Mar 08 04:16:48.149995 master-0 kubenswrapper[18592]: I0308 04:16:48.146144 18592 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fc53e341-1423-4a10-b198-5fb7f51dcc52-ovsdbserver-nb\") on node \"master-0\" DevicePath 
\"\"" Mar 08 04:16:48.149995 master-0 kubenswrapper[18592]: I0308 04:16:48.146160 18592 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fc53e341-1423-4a10-b198-5fb7f51dcc52-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Mar 08 04:16:48.149995 master-0 kubenswrapper[18592]: I0308 04:16:48.146174 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4fsp\" (UniqueName: \"kubernetes.io/projected/fc53e341-1423-4a10-b198-5fb7f51dcc52-kube-api-access-h4fsp\") on node \"master-0\" DevicePath \"\"" Mar 08 04:16:48.149995 master-0 kubenswrapper[18592]: I0308 04:16:48.146187 18592 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fc53e341-1423-4a10-b198-5fb7f51dcc52-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Mar 08 04:16:48.149995 master-0 kubenswrapper[18592]: I0308 04:16:48.146200 18592 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc53e341-1423-4a10-b198-5fb7f51dcc52-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 08 04:16:48.149995 master-0 kubenswrapper[18592]: I0308 04:16:48.146949 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3aea55ba-fea5-4a2c-8101-a7a94f8a518a-httpd-run\") pod \"glance-afe2b-default-internal-api-0\" (UID: \"3aea55ba-fea5-4a2c-8101-a7a94f8a518a\") " pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:16:48.149995 master-0 kubenswrapper[18592]: I0308 04:16:48.147960 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3aea55ba-fea5-4a2c-8101-a7a94f8a518a-logs\") pod \"glance-afe2b-default-internal-api-0\" (UID: \"3aea55ba-fea5-4a2c-8101-a7a94f8a518a\") " pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:16:48.151043 master-0 
kubenswrapper[18592]: I0308 04:16:48.151015 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3aea55ba-fea5-4a2c-8101-a7a94f8a518a-internal-tls-certs\") pod \"glance-afe2b-default-internal-api-0\" (UID: \"3aea55ba-fea5-4a2c-8101-a7a94f8a518a\") " pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:16:48.151501 master-0 kubenswrapper[18592]: I0308 04:16:48.151456 18592 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 08 04:16:48.152280 master-0 kubenswrapper[18592]: I0308 04:16:48.151500 18592 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-9bbcd16a-3cea-4572-8bc5-480804def335\" (UniqueName: \"kubernetes.io/csi/topolvm.io^613c9bf7-76cc-44ce-8d9f-8cdae5e6db9e\") pod \"glance-afe2b-default-internal-api-0\" (UID: \"3aea55ba-fea5-4a2c-8101-a7a94f8a518a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/13a0545ec1feabd7833445e4e7b9c29b3f2190095c44424ae5035fff8a8ec5c4/globalmount\"" pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:16:48.152635 master-0 kubenswrapper[18592]: I0308 04:16:48.152589 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3aea55ba-fea5-4a2c-8101-a7a94f8a518a-combined-ca-bundle\") pod \"glance-afe2b-default-internal-api-0\" (UID: \"3aea55ba-fea5-4a2c-8101-a7a94f8a518a\") " pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:16:48.154912 master-0 kubenswrapper[18592]: I0308 04:16:48.154238 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3aea55ba-fea5-4a2c-8101-a7a94f8a518a-scripts\") pod \"glance-afe2b-default-internal-api-0\" (UID: \"3aea55ba-fea5-4a2c-8101-a7a94f8a518a\") " pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 
04:16:48.156945 master-0 kubenswrapper[18592]: I0308 04:16:48.156890 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3aea55ba-fea5-4a2c-8101-a7a94f8a518a-config-data\") pod \"glance-afe2b-default-internal-api-0\" (UID: \"3aea55ba-fea5-4a2c-8101-a7a94f8a518a\") " pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:16:48.184191 master-0 kubenswrapper[18592]: I0308 04:16:48.184131 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55996053-3eb1-4222-b62f-9fca82e2c6fc" path="/var/lib/kubelet/pods/55996053-3eb1-4222-b62f-9fca82e2c6fc/volumes" Mar 08 04:16:48.184918 master-0 kubenswrapper[18592]: I0308 04:16:48.184874 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9d6x\" (UniqueName: \"kubernetes.io/projected/3aea55ba-fea5-4a2c-8101-a7a94f8a518a-kube-api-access-p9d6x\") pod \"glance-afe2b-default-internal-api-0\" (UID: \"3aea55ba-fea5-4a2c-8101-a7a94f8a518a\") " pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:16:48.267362 master-0 kubenswrapper[18592]: I0308 04:16:48.267258 18592 generic.go:334] "Generic (PLEG): container finished" podID="81d0a12f-c85b-41ef-a076-efa3dd40f9aa" containerID="18b5cd4d1d14d3966303213f9dee13d75278af393efa6352d9cb626117413a9f" exitCode=0 Mar 08 04:16:48.267362 master-0 kubenswrapper[18592]: I0308 04:16:48.267320 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-e561-account-create-update-pmf7m" event={"ID":"81d0a12f-c85b-41ef-a076-efa3dd40f9aa","Type":"ContainerDied","Data":"18b5cd4d1d14d3966303213f9dee13d75278af393efa6352d9cb626117413a9f"} Mar 08 04:16:48.275380 master-0 kubenswrapper[18592]: I0308 04:16:48.275331 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79c75d759-lh7hc" 
event={"ID":"fc53e341-1423-4a10-b198-5fb7f51dcc52","Type":"ContainerDied","Data":"e7368d19b930aafba2cae447514c0c5ceb5beb4a054caec6568215559f5738de"} Mar 08 04:16:48.275452 master-0 kubenswrapper[18592]: I0308 04:16:48.275386 18592 scope.go:117] "RemoveContainer" containerID="19efb7e26a8677fa2c71d89e45940bc1d01493ba48ef6da814dd41c19e27e353" Mar 08 04:16:48.275516 master-0 kubenswrapper[18592]: I0308 04:16:48.275492 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79c75d759-lh7hc" Mar 08 04:16:48.279484 master-0 kubenswrapper[18592]: I0308 04:16:48.279454 18592 generic.go:334] "Generic (PLEG): container finished" podID="ea394be4-dd2b-4519-b7bc-80ac84a4cd16" containerID="a8b5d8309bbaf476dac3e6aafc6d67940f17c8c7d9318f0f2f1137533e2777d0" exitCode=0 Mar 08 04:16:48.279549 master-0 kubenswrapper[18592]: I0308 04:16:48.279503 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5584fcc769-csbz6" event={"ID":"ea394be4-dd2b-4519-b7bc-80ac84a4cd16","Type":"ContainerDied","Data":"a8b5d8309bbaf476dac3e6aafc6d67940f17c8c7d9318f0f2f1137533e2777d0"} Mar 08 04:16:48.289273 master-0 kubenswrapper[18592]: I0308 04:16:48.285571 18592 generic.go:334] "Generic (PLEG): container finished" podID="1f03ac81-08d4-4053-9a33-ae4228fb3d3b" containerID="c721ea8060313fa5fc0c7961cf3fa40bc577a54204648b62c16a81b5d46812eb" exitCode=0 Mar 08 04:16:48.289273 master-0 kubenswrapper[18592]: I0308 04:16:48.285713 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:16:48.289273 master-0 kubenswrapper[18592]: I0308 04:16:48.285717 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-create-rk2j4" event={"ID":"1f03ac81-08d4-4053-9a33-ae4228fb3d3b","Type":"ContainerDied","Data":"c721ea8060313fa5fc0c7961cf3fa40bc577a54204648b62c16a81b5d46812eb"} Mar 08 04:16:48.289273 master-0 kubenswrapper[18592]: I0308 04:16:48.286036 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-afe2b-default-external-api-0" Mar 08 04:16:48.320849 master-0 kubenswrapper[18592]: I0308 04:16:48.316544 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:16:48.341064 master-0 kubenswrapper[18592]: I0308 04:16:48.341008 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-afe2b-default-external-api-0" Mar 08 04:16:48.354925 master-0 kubenswrapper[18592]: I0308 04:16:48.353962 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p9d6x\" (UniqueName: \"kubernetes.io/projected/3aea55ba-fea5-4a2c-8101-a7a94f8a518a-kube-api-access-p9d6x\") pod \"3aea55ba-fea5-4a2c-8101-a7a94f8a518a\" (UID: \"3aea55ba-fea5-4a2c-8101-a7a94f8a518a\") " Mar 08 04:16:48.354925 master-0 kubenswrapper[18592]: I0308 04:16:48.354016 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3aea55ba-fea5-4a2c-8101-a7a94f8a518a-combined-ca-bundle\") pod \"3aea55ba-fea5-4a2c-8101-a7a94f8a518a\" (UID: \"3aea55ba-fea5-4a2c-8101-a7a94f8a518a\") " Mar 08 04:16:48.354925 master-0 kubenswrapper[18592]: I0308 04:16:48.354181 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/3aea55ba-fea5-4a2c-8101-a7a94f8a518a-logs\") pod \"3aea55ba-fea5-4a2c-8101-a7a94f8a518a\" (UID: \"3aea55ba-fea5-4a2c-8101-a7a94f8a518a\") " Mar 08 04:16:48.354925 master-0 kubenswrapper[18592]: I0308 04:16:48.354264 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3aea55ba-fea5-4a2c-8101-a7a94f8a518a-httpd-run\") pod \"3aea55ba-fea5-4a2c-8101-a7a94f8a518a\" (UID: \"3aea55ba-fea5-4a2c-8101-a7a94f8a518a\") " Mar 08 04:16:48.354925 master-0 kubenswrapper[18592]: I0308 04:16:48.354307 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3aea55ba-fea5-4a2c-8101-a7a94f8a518a-scripts\") pod \"3aea55ba-fea5-4a2c-8101-a7a94f8a518a\" (UID: \"3aea55ba-fea5-4a2c-8101-a7a94f8a518a\") " Mar 08 04:16:48.354925 master-0 kubenswrapper[18592]: I0308 04:16:48.354362 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3aea55ba-fea5-4a2c-8101-a7a94f8a518a-internal-tls-certs\") pod \"3aea55ba-fea5-4a2c-8101-a7a94f8a518a\" (UID: \"3aea55ba-fea5-4a2c-8101-a7a94f8a518a\") " Mar 08 04:16:48.354925 master-0 kubenswrapper[18592]: I0308 04:16:48.354380 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3aea55ba-fea5-4a2c-8101-a7a94f8a518a-config-data\") pod \"3aea55ba-fea5-4a2c-8101-a7a94f8a518a\" (UID: \"3aea55ba-fea5-4a2c-8101-a7a94f8a518a\") " Mar 08 04:16:48.355232 master-0 kubenswrapper[18592]: I0308 04:16:48.354956 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3aea55ba-fea5-4a2c-8101-a7a94f8a518a-logs" (OuterVolumeSpecName: "logs") pod "3aea55ba-fea5-4a2c-8101-a7a94f8a518a" (UID: "3aea55ba-fea5-4a2c-8101-a7a94f8a518a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 04:16:48.358851 master-0 kubenswrapper[18592]: I0308 04:16:48.358795 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3aea55ba-fea5-4a2c-8101-a7a94f8a518a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "3aea55ba-fea5-4a2c-8101-a7a94f8a518a" (UID: "3aea55ba-fea5-4a2c-8101-a7a94f8a518a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 04:16:48.383891 master-0 kubenswrapper[18592]: I0308 04:16:48.376616 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3aea55ba-fea5-4a2c-8101-a7a94f8a518a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "3aea55ba-fea5-4a2c-8101-a7a94f8a518a" (UID: "3aea55ba-fea5-4a2c-8101-a7a94f8a518a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:16:48.402447 master-0 kubenswrapper[18592]: I0308 04:16:48.393516 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3aea55ba-fea5-4a2c-8101-a7a94f8a518a-scripts" (OuterVolumeSpecName: "scripts") pod "3aea55ba-fea5-4a2c-8101-a7a94f8a518a" (UID: "3aea55ba-fea5-4a2c-8101-a7a94f8a518a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:16:48.402447 master-0 kubenswrapper[18592]: I0308 04:16:48.393539 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3aea55ba-fea5-4a2c-8101-a7a94f8a518a-kube-api-access-p9d6x" (OuterVolumeSpecName: "kube-api-access-p9d6x") pod "3aea55ba-fea5-4a2c-8101-a7a94f8a518a" (UID: "3aea55ba-fea5-4a2c-8101-a7a94f8a518a"). InnerVolumeSpecName "kube-api-access-p9d6x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 04:16:48.402447 master-0 kubenswrapper[18592]: I0308 04:16:48.394597 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3aea55ba-fea5-4a2c-8101-a7a94f8a518a-config-data" (OuterVolumeSpecName: "config-data") pod "3aea55ba-fea5-4a2c-8101-a7a94f8a518a" (UID: "3aea55ba-fea5-4a2c-8101-a7a94f8a518a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:16:48.402967 master-0 kubenswrapper[18592]: I0308 04:16:48.402907 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3aea55ba-fea5-4a2c-8101-a7a94f8a518a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3aea55ba-fea5-4a2c-8101-a7a94f8a518a" (UID: "3aea55ba-fea5-4a2c-8101-a7a94f8a518a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:16:48.459666 master-0 kubenswrapper[18592]: I0308 04:16:48.458653 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5958986d-a206-420e-91c0-5d55f14f2183-logs\") pod \"5958986d-a206-420e-91c0-5d55f14f2183\" (UID: \"5958986d-a206-420e-91c0-5d55f14f2183\") " Mar 08 04:16:48.459666 master-0 kubenswrapper[18592]: I0308 04:16:48.458754 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5958986d-a206-420e-91c0-5d55f14f2183-combined-ca-bundle\") pod \"5958986d-a206-420e-91c0-5d55f14f2183\" (UID: \"5958986d-a206-420e-91c0-5d55f14f2183\") " Mar 08 04:16:48.459666 master-0 kubenswrapper[18592]: I0308 04:16:48.458950 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5958986d-a206-420e-91c0-5d55f14f2183-public-tls-certs\") pod \"5958986d-a206-420e-91c0-5d55f14f2183\" 
(UID: \"5958986d-a206-420e-91c0-5d55f14f2183\") " Mar 08 04:16:48.459666 master-0 kubenswrapper[18592]: I0308 04:16:48.459134 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5958986d-a206-420e-91c0-5d55f14f2183-httpd-run\") pod \"5958986d-a206-420e-91c0-5d55f14f2183\" (UID: \"5958986d-a206-420e-91c0-5d55f14f2183\") " Mar 08 04:16:48.459666 master-0 kubenswrapper[18592]: I0308 04:16:48.459174 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5958986d-a206-420e-91c0-5d55f14f2183-config-data\") pod \"5958986d-a206-420e-91c0-5d55f14f2183\" (UID: \"5958986d-a206-420e-91c0-5d55f14f2183\") " Mar 08 04:16:48.459666 master-0 kubenswrapper[18592]: I0308 04:16:48.459229 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5958986d-a206-420e-91c0-5d55f14f2183-scripts\") pod \"5958986d-a206-420e-91c0-5d55f14f2183\" (UID: \"5958986d-a206-420e-91c0-5d55f14f2183\") " Mar 08 04:16:48.459666 master-0 kubenswrapper[18592]: I0308 04:16:48.459295 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7l4q\" (UniqueName: \"kubernetes.io/projected/5958986d-a206-420e-91c0-5d55f14f2183-kube-api-access-x7l4q\") pod \"5958986d-a206-420e-91c0-5d55f14f2183\" (UID: \"5958986d-a206-420e-91c0-5d55f14f2183\") " Mar 08 04:16:48.459666 master-0 kubenswrapper[18592]: I0308 04:16:48.459565 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5958986d-a206-420e-91c0-5d55f14f2183-logs" (OuterVolumeSpecName: "logs") pod "5958986d-a206-420e-91c0-5d55f14f2183" (UID: "5958986d-a206-420e-91c0-5d55f14f2183"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 04:16:48.460436 master-0 kubenswrapper[18592]: I0308 04:16:48.460238 18592 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3aea55ba-fea5-4a2c-8101-a7a94f8a518a-logs\") on node \"master-0\" DevicePath \"\"" Mar 08 04:16:48.460436 master-0 kubenswrapper[18592]: I0308 04:16:48.460260 18592 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3aea55ba-fea5-4a2c-8101-a7a94f8a518a-httpd-run\") on node \"master-0\" DevicePath \"\"" Mar 08 04:16:48.460436 master-0 kubenswrapper[18592]: I0308 04:16:48.460271 18592 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3aea55ba-fea5-4a2c-8101-a7a94f8a518a-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 04:16:48.460436 master-0 kubenswrapper[18592]: I0308 04:16:48.460282 18592 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5958986d-a206-420e-91c0-5d55f14f2183-logs\") on node \"master-0\" DevicePath \"\"" Mar 08 04:16:48.460436 master-0 kubenswrapper[18592]: I0308 04:16:48.460291 18592 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3aea55ba-fea5-4a2c-8101-a7a94f8a518a-internal-tls-certs\") on node \"master-0\" DevicePath \"\"" Mar 08 04:16:48.460436 master-0 kubenswrapper[18592]: I0308 04:16:48.460300 18592 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3aea55ba-fea5-4a2c-8101-a7a94f8a518a-config-data\") on node \"master-0\" DevicePath \"\"" Mar 08 04:16:48.460436 master-0 kubenswrapper[18592]: I0308 04:16:48.460308 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p9d6x\" (UniqueName: \"kubernetes.io/projected/3aea55ba-fea5-4a2c-8101-a7a94f8a518a-kube-api-access-p9d6x\") on node \"master-0\" 
DevicePath \"\"" Mar 08 04:16:48.460436 master-0 kubenswrapper[18592]: I0308 04:16:48.460316 18592 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3aea55ba-fea5-4a2c-8101-a7a94f8a518a-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 04:16:48.461215 master-0 kubenswrapper[18592]: I0308 04:16:48.460956 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5958986d-a206-420e-91c0-5d55f14f2183-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "5958986d-a206-420e-91c0-5d55f14f2183" (UID: "5958986d-a206-420e-91c0-5d55f14f2183"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 04:16:48.479226 master-0 kubenswrapper[18592]: I0308 04:16:48.478571 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5958986d-a206-420e-91c0-5d55f14f2183-kube-api-access-x7l4q" (OuterVolumeSpecName: "kube-api-access-x7l4q") pod "5958986d-a206-420e-91c0-5d55f14f2183" (UID: "5958986d-a206-420e-91c0-5d55f14f2183"). InnerVolumeSpecName "kube-api-access-x7l4q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 04:16:48.480135 master-0 kubenswrapper[18592]: I0308 04:16:48.479853 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5958986d-a206-420e-91c0-5d55f14f2183-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5958986d-a206-420e-91c0-5d55f14f2183" (UID: "5958986d-a206-420e-91c0-5d55f14f2183"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:16:48.480135 master-0 kubenswrapper[18592]: I0308 04:16:48.479869 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5958986d-a206-420e-91c0-5d55f14f2183-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "5958986d-a206-420e-91c0-5d55f14f2183" (UID: "5958986d-a206-420e-91c0-5d55f14f2183"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:16:48.480350 master-0 kubenswrapper[18592]: I0308 04:16:48.480321 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5958986d-a206-420e-91c0-5d55f14f2183-config-data" (OuterVolumeSpecName: "config-data") pod "5958986d-a206-420e-91c0-5d55f14f2183" (UID: "5958986d-a206-420e-91c0-5d55f14f2183"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:16:48.483426 master-0 kubenswrapper[18592]: I0308 04:16:48.482835 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5958986d-a206-420e-91c0-5d55f14f2183-scripts" (OuterVolumeSpecName: "scripts") pod "5958986d-a206-420e-91c0-5d55f14f2183" (UID: "5958986d-a206-420e-91c0-5d55f14f2183"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:16:48.555139 master-0 kubenswrapper[18592]: I0308 04:16:48.555085 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79c75d759-lh7hc"] Mar 08 04:16:48.576976 master-0 kubenswrapper[18592]: I0308 04:16:48.572131 18592 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5958986d-a206-420e-91c0-5d55f14f2183-config-data\") on node \"master-0\" DevicePath \"\"" Mar 08 04:16:48.576976 master-0 kubenswrapper[18592]: I0308 04:16:48.572180 18592 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5958986d-a206-420e-91c0-5d55f14f2183-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 04:16:48.576976 master-0 kubenswrapper[18592]: I0308 04:16:48.572191 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7l4q\" (UniqueName: \"kubernetes.io/projected/5958986d-a206-420e-91c0-5d55f14f2183-kube-api-access-x7l4q\") on node \"master-0\" DevicePath \"\"" Mar 08 04:16:48.576976 master-0 kubenswrapper[18592]: I0308 04:16:48.572202 18592 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5958986d-a206-420e-91c0-5d55f14f2183-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 04:16:48.576976 master-0 kubenswrapper[18592]: I0308 04:16:48.572211 18592 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5958986d-a206-420e-91c0-5d55f14f2183-public-tls-certs\") on node \"master-0\" DevicePath \"\"" Mar 08 04:16:48.576976 master-0 kubenswrapper[18592]: I0308 04:16:48.572223 18592 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5958986d-a206-420e-91c0-5d55f14f2183-httpd-run\") on node \"master-0\" DevicePath \"\"" Mar 08 04:16:48.589847 master-0 kubenswrapper[18592]: I0308 
04:16:48.577633 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-79c75d759-lh7hc"] Mar 08 04:16:48.676277 master-0 kubenswrapper[18592]: I0308 04:16:48.676235 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-acff9521-23da-47da-b539-1fad9dc0c8dd\" (UniqueName: \"kubernetes.io/csi/topolvm.io^4bd1e53e-4aca-4bc2-a5d3-7e2d486d9cfb\") pod \"glance-afe2b-default-external-api-0\" (UID: \"5958986d-a206-420e-91c0-5d55f14f2183\") " pod="openstack/glance-afe2b-default-external-api-0" Mar 08 04:16:48.775046 master-0 kubenswrapper[18592]: I0308 04:16:48.775013 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/topolvm.io^4bd1e53e-4aca-4bc2-a5d3-7e2d486d9cfb\") pod \"5958986d-a206-420e-91c0-5d55f14f2183\" (UID: \"5958986d-a206-420e-91c0-5d55f14f2183\") " Mar 08 04:16:49.317742 master-0 kubenswrapper[18592]: I0308 04:16:49.317682 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5584fcc769-csbz6" event={"ID":"ea394be4-dd2b-4519-b7bc-80ac84a4cd16","Type":"ContainerStarted","Data":"05b7d99b429e89d85d450e165aad20f9bbf2cbd354aa2d0ae238fab324d45d41"} Mar 08 04:16:49.318536 master-0 kubenswrapper[18592]: I0308 04:16:49.317874 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-afe2b-default-external-api-0" Mar 08 04:16:49.318536 master-0 kubenswrapper[18592]: I0308 04:16:49.318069 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:16:49.322079 master-0 kubenswrapper[18592]: I0308 04:16:49.322019 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5584fcc769-csbz6" Mar 08 04:16:49.359860 master-0 kubenswrapper[18592]: I0308 04:16:49.345372 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5584fcc769-csbz6" podStartSLOduration=4.34531411 podStartE2EDuration="4.34531411s" podCreationTimestamp="2026-03-08 04:16:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 04:16:49.344098296 +0000 UTC m=+1421.442852646" watchObservedRunningTime="2026-03-08 04:16:49.34531411 +0000 UTC m=+1421.444068460" Mar 08 04:16:49.446855 master-0 kubenswrapper[18592]: I0308 04:16:49.441586 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-afe2b-default-internal-api-0"] Mar 08 04:16:49.463884 master-0 kubenswrapper[18592]: I0308 04:16:49.454866 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-afe2b-default-internal-api-0"] Mar 08 04:16:49.532754 master-0 kubenswrapper[18592]: I0308 04:16:49.490996 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-afe2b-default-internal-api-0"] Mar 08 04:16:49.532754 master-0 kubenswrapper[18592]: I0308 04:16:49.493419 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:16:49.532754 master-0 kubenswrapper[18592]: I0308 04:16:49.502426 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 08 04:16:49.532754 master-0 kubenswrapper[18592]: I0308 04:16:49.502552 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-afe2b-default-internal-config-data" Mar 08 04:16:49.532754 master-0 kubenswrapper[18592]: I0308 04:16:49.527218 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-afe2b-default-internal-api-0"] Mar 08 04:16:49.597926 master-0 kubenswrapper[18592]: I0308 04:16:49.597414 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa83b8bc-f9dc-4376-855b-59ba17c2c0e1-scripts\") pod \"glance-afe2b-default-internal-api-0\" (UID: \"fa83b8bc-f9dc-4376-855b-59ba17c2c0e1\") " pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:16:49.597926 master-0 kubenswrapper[18592]: I0308 04:16:49.597467 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa83b8bc-f9dc-4376-855b-59ba17c2c0e1-combined-ca-bundle\") pod \"glance-afe2b-default-internal-api-0\" (UID: \"fa83b8bc-f9dc-4376-855b-59ba17c2c0e1\") " pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:16:49.597926 master-0 kubenswrapper[18592]: I0308 04:16:49.597546 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5qhv\" (UniqueName: \"kubernetes.io/projected/fa83b8bc-f9dc-4376-855b-59ba17c2c0e1-kube-api-access-t5qhv\") pod \"glance-afe2b-default-internal-api-0\" (UID: \"fa83b8bc-f9dc-4376-855b-59ba17c2c0e1\") " pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:16:49.597926 master-0 
kubenswrapper[18592]: I0308 04:16:49.597600 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fa83b8bc-f9dc-4376-855b-59ba17c2c0e1-httpd-run\") pod \"glance-afe2b-default-internal-api-0\" (UID: \"fa83b8bc-f9dc-4376-855b-59ba17c2c0e1\") " pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:16:49.597926 master-0 kubenswrapper[18592]: I0308 04:16:49.597905 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa83b8bc-f9dc-4376-855b-59ba17c2c0e1-logs\") pod \"glance-afe2b-default-internal-api-0\" (UID: \"fa83b8bc-f9dc-4376-855b-59ba17c2c0e1\") " pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:16:49.598306 master-0 kubenswrapper[18592]: I0308 04:16:49.597974 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa83b8bc-f9dc-4376-855b-59ba17c2c0e1-config-data\") pod \"glance-afe2b-default-internal-api-0\" (UID: \"fa83b8bc-f9dc-4376-855b-59ba17c2c0e1\") " pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:16:49.598306 master-0 kubenswrapper[18592]: I0308 04:16:49.598050 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa83b8bc-f9dc-4376-855b-59ba17c2c0e1-internal-tls-certs\") pod \"glance-afe2b-default-internal-api-0\" (UID: \"fa83b8bc-f9dc-4376-855b-59ba17c2c0e1\") " pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:16:49.700293 master-0 kubenswrapper[18592]: I0308 04:16:49.700227 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa83b8bc-f9dc-4376-855b-59ba17c2c0e1-scripts\") pod \"glance-afe2b-default-internal-api-0\" (UID: 
\"fa83b8bc-f9dc-4376-855b-59ba17c2c0e1\") " pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:16:49.700293 master-0 kubenswrapper[18592]: I0308 04:16:49.700285 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa83b8bc-f9dc-4376-855b-59ba17c2c0e1-combined-ca-bundle\") pod \"glance-afe2b-default-internal-api-0\" (UID: \"fa83b8bc-f9dc-4376-855b-59ba17c2c0e1\") " pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:16:49.700293 master-0 kubenswrapper[18592]: I0308 04:16:49.700308 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5qhv\" (UniqueName: \"kubernetes.io/projected/fa83b8bc-f9dc-4376-855b-59ba17c2c0e1-kube-api-access-t5qhv\") pod \"glance-afe2b-default-internal-api-0\" (UID: \"fa83b8bc-f9dc-4376-855b-59ba17c2c0e1\") " pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:16:49.700671 master-0 kubenswrapper[18592]: I0308 04:16:49.700647 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fa83b8bc-f9dc-4376-855b-59ba17c2c0e1-httpd-run\") pod \"glance-afe2b-default-internal-api-0\" (UID: \"fa83b8bc-f9dc-4376-855b-59ba17c2c0e1\") " pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:16:49.700752 master-0 kubenswrapper[18592]: I0308 04:16:49.700726 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa83b8bc-f9dc-4376-855b-59ba17c2c0e1-logs\") pod \"glance-afe2b-default-internal-api-0\" (UID: \"fa83b8bc-f9dc-4376-855b-59ba17c2c0e1\") " pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:16:49.701083 master-0 kubenswrapper[18592]: I0308 04:16:49.700752 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/fa83b8bc-f9dc-4376-855b-59ba17c2c0e1-config-data\") pod \"glance-afe2b-default-internal-api-0\" (UID: \"fa83b8bc-f9dc-4376-855b-59ba17c2c0e1\") " pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:16:49.701083 master-0 kubenswrapper[18592]: I0308 04:16:49.700782 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa83b8bc-f9dc-4376-855b-59ba17c2c0e1-internal-tls-certs\") pod \"glance-afe2b-default-internal-api-0\" (UID: \"fa83b8bc-f9dc-4376-855b-59ba17c2c0e1\") " pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:16:49.701902 master-0 kubenswrapper[18592]: I0308 04:16:49.701844 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa83b8bc-f9dc-4376-855b-59ba17c2c0e1-logs\") pod \"glance-afe2b-default-internal-api-0\" (UID: \"fa83b8bc-f9dc-4376-855b-59ba17c2c0e1\") " pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:16:49.702159 master-0 kubenswrapper[18592]: I0308 04:16:49.702137 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fa83b8bc-f9dc-4376-855b-59ba17c2c0e1-httpd-run\") pod \"glance-afe2b-default-internal-api-0\" (UID: \"fa83b8bc-f9dc-4376-855b-59ba17c2c0e1\") " pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:16:49.707169 master-0 kubenswrapper[18592]: I0308 04:16:49.705572 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa83b8bc-f9dc-4376-855b-59ba17c2c0e1-scripts\") pod \"glance-afe2b-default-internal-api-0\" (UID: \"fa83b8bc-f9dc-4376-855b-59ba17c2c0e1\") " pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:16:49.716562 master-0 kubenswrapper[18592]: I0308 04:16:49.709327 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/fa83b8bc-f9dc-4376-855b-59ba17c2c0e1-internal-tls-certs\") pod \"glance-afe2b-default-internal-api-0\" (UID: \"fa83b8bc-f9dc-4376-855b-59ba17c2c0e1\") " pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:16:49.723884 master-0 kubenswrapper[18592]: I0308 04:16:49.717704 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5qhv\" (UniqueName: \"kubernetes.io/projected/fa83b8bc-f9dc-4376-855b-59ba17c2c0e1-kube-api-access-t5qhv\") pod \"glance-afe2b-default-internal-api-0\" (UID: \"fa83b8bc-f9dc-4376-855b-59ba17c2c0e1\") " pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:16:49.723884 master-0 kubenswrapper[18592]: I0308 04:16:49.720665 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa83b8bc-f9dc-4376-855b-59ba17c2c0e1-config-data\") pod \"glance-afe2b-default-internal-api-0\" (UID: \"fa83b8bc-f9dc-4376-855b-59ba17c2c0e1\") " pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:16:49.747415 master-0 kubenswrapper[18592]: I0308 04:16:49.745452 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa83b8bc-f9dc-4376-855b-59ba17c2c0e1-combined-ca-bundle\") pod \"glance-afe2b-default-internal-api-0\" (UID: \"fa83b8bc-f9dc-4376-855b-59ba17c2c0e1\") " pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:16:49.928353 master-0 kubenswrapper[18592]: I0308 04:16:49.926916 18592 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-e561-account-create-update-pmf7m" Mar 08 04:16:50.020099 master-0 kubenswrapper[18592]: I0308 04:16:50.017716 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2kk8z\" (UniqueName: \"kubernetes.io/projected/81d0a12f-c85b-41ef-a076-efa3dd40f9aa-kube-api-access-2kk8z\") pod \"81d0a12f-c85b-41ef-a076-efa3dd40f9aa\" (UID: \"81d0a12f-c85b-41ef-a076-efa3dd40f9aa\") " Mar 08 04:16:50.020099 master-0 kubenswrapper[18592]: I0308 04:16:50.017772 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81d0a12f-c85b-41ef-a076-efa3dd40f9aa-operator-scripts\") pod \"81d0a12f-c85b-41ef-a076-efa3dd40f9aa\" (UID: \"81d0a12f-c85b-41ef-a076-efa3dd40f9aa\") " Mar 08 04:16:50.020099 master-0 kubenswrapper[18592]: I0308 04:16:50.018554 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81d0a12f-c85b-41ef-a076-efa3dd40f9aa-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "81d0a12f-c85b-41ef-a076-efa3dd40f9aa" (UID: "81d0a12f-c85b-41ef-a076-efa3dd40f9aa"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:16:50.031874 master-0 kubenswrapper[18592]: I0308 04:16:50.022515 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81d0a12f-c85b-41ef-a076-efa3dd40f9aa-kube-api-access-2kk8z" (OuterVolumeSpecName: "kube-api-access-2kk8z") pod "81d0a12f-c85b-41ef-a076-efa3dd40f9aa" (UID: "81d0a12f-c85b-41ef-a076-efa3dd40f9aa"). InnerVolumeSpecName "kube-api-access-2kk8z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 04:16:50.075545 master-0 kubenswrapper[18592]: I0308 04:16:50.075503 18592 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-db-create-rk2j4" Mar 08 04:16:50.119737 master-0 kubenswrapper[18592]: I0308 04:16:50.119626 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f03ac81-08d4-4053-9a33-ae4228fb3d3b-operator-scripts\") pod \"1f03ac81-08d4-4053-9a33-ae4228fb3d3b\" (UID: \"1f03ac81-08d4-4053-9a33-ae4228fb3d3b\") " Mar 08 04:16:50.120028 master-0 kubenswrapper[18592]: I0308 04:16:50.119966 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wt6pq\" (UniqueName: \"kubernetes.io/projected/1f03ac81-08d4-4053-9a33-ae4228fb3d3b-kube-api-access-wt6pq\") pod \"1f03ac81-08d4-4053-9a33-ae4228fb3d3b\" (UID: \"1f03ac81-08d4-4053-9a33-ae4228fb3d3b\") " Mar 08 04:16:50.122241 master-0 kubenswrapper[18592]: I0308 04:16:50.122192 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f03ac81-08d4-4053-9a33-ae4228fb3d3b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1f03ac81-08d4-4053-9a33-ae4228fb3d3b" (UID: "1f03ac81-08d4-4053-9a33-ae4228fb3d3b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:16:50.127691 master-0 kubenswrapper[18592]: I0308 04:16:50.127652 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f03ac81-08d4-4053-9a33-ae4228fb3d3b-kube-api-access-wt6pq" (OuterVolumeSpecName: "kube-api-access-wt6pq") pod "1f03ac81-08d4-4053-9a33-ae4228fb3d3b" (UID: "1f03ac81-08d4-4053-9a33-ae4228fb3d3b"). InnerVolumeSpecName "kube-api-access-wt6pq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 04:16:50.128010 master-0 kubenswrapper[18592]: I0308 04:16:50.127981 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wt6pq\" (UniqueName: \"kubernetes.io/projected/1f03ac81-08d4-4053-9a33-ae4228fb3d3b-kube-api-access-wt6pq\") on node \"master-0\" DevicePath \"\"" Mar 08 04:16:50.128010 master-0 kubenswrapper[18592]: I0308 04:16:50.128005 18592 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f03ac81-08d4-4053-9a33-ae4228fb3d3b-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 04:16:50.128101 master-0 kubenswrapper[18592]: I0308 04:16:50.128015 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2kk8z\" (UniqueName: \"kubernetes.io/projected/81d0a12f-c85b-41ef-a076-efa3dd40f9aa-kube-api-access-2kk8z\") on node \"master-0\" DevicePath \"\"" Mar 08 04:16:50.128101 master-0 kubenswrapper[18592]: I0308 04:16:50.128025 18592 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81d0a12f-c85b-41ef-a076-efa3dd40f9aa-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 04:16:50.211694 master-0 kubenswrapper[18592]: I0308 04:16:50.211625 18592 kubelet_volumes.go:135] "Cleaned up orphaned volume from pod" podUID="3aea55ba-fea5-4a2c-8101-a7a94f8a518a" path="/var/lib/kubelet/pods/3aea55ba-fea5-4a2c-8101-a7a94f8a518a/volumes/kubernetes.io~csi/pvc-9bbcd16a-3cea-4572-8bc5-480804def335/mount" Mar 08 04:16:50.212786 master-0 kubenswrapper[18592]: I0308 04:16:50.212770 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc53e341-1423-4a10-b198-5fb7f51dcc52" path="/var/lib/kubelet/pods/fc53e341-1423-4a10-b198-5fb7f51dcc52/volumes" Mar 08 04:16:50.215377 master-0 kubenswrapper[18592]: E0308 04:16:50.215359 18592 kubelet_volumes.go:263] "There were many similar errors. 
Turn up verbosity to see them." err="orphaned pod \"3aea55ba-fea5-4a2c-8101-a7a94f8a518a\" found, but error occurred when trying to remove the volumes dir: not a directory" numErrs=1 Mar 08 04:16:50.331714 master-0 kubenswrapper[18592]: I0308 04:16:50.331655 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-e561-account-create-update-pmf7m" Mar 08 04:16:50.332371 master-0 kubenswrapper[18592]: I0308 04:16:50.332214 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-e561-account-create-update-pmf7m" event={"ID":"81d0a12f-c85b-41ef-a076-efa3dd40f9aa","Type":"ContainerDied","Data":"d81fda4b7a51ee89a266c3dd1cb518265f1f7785498ef9978b50da312c922602"} Mar 08 04:16:50.332371 master-0 kubenswrapper[18592]: I0308 04:16:50.332367 18592 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d81fda4b7a51ee89a266c3dd1cb518265f1f7785498ef9978b50da312c922602" Mar 08 04:16:50.336628 master-0 kubenswrapper[18592]: I0308 04:16:50.336560 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-create-rk2j4" event={"ID":"1f03ac81-08d4-4053-9a33-ae4228fb3d3b","Type":"ContainerDied","Data":"010b7028b350b87b2ea58db37a7bea893edf0e0dbad2644eea198d27baf58f4d"} Mar 08 04:16:50.336732 master-0 kubenswrapper[18592]: I0308 04:16:50.336674 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-db-create-rk2j4" Mar 08 04:16:50.336815 master-0 kubenswrapper[18592]: I0308 04:16:50.336675 18592 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="010b7028b350b87b2ea58db37a7bea893edf0e0dbad2644eea198d27baf58f4d" Mar 08 04:16:50.539218 master-0 kubenswrapper[18592]: E0308 04:16:50.539165 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/topolvm.io^613c9bf7-76cc-44ce-8d9f-8cdae5e6db9e podName: nodeName:}" failed. 
No retries permitted until 2026-03-08 04:16:51.03914563 +0000 UTC m=+1423.137899980 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "pvc-9bbcd16a-3cea-4572-8bc5-480804def335" (UniqueName: "kubernetes.io/csi/topolvm.io^613c9bf7-76cc-44ce-8d9f-8cdae5e6db9e") pod "glance-afe2b-default-internal-api-0" (UID: "3aea55ba-fea5-4a2c-8101-a7a94f8a518a") : rpc error: code = Internal desc = mount failed: volume=613c9bf7-76cc-44ce-8d9f-8cdae5e6db9e, error=mount failed: exit status 32 Mar 08 04:16:50.539218 master-0 kubenswrapper[18592]: Mounting command: mount Mar 08 04:16:50.539218 master-0 kubenswrapper[18592]: Mounting arguments: -t xfs -o nouuid,defaults /dev/local-storage/613c9bf7-76cc-44ce-8d9f-8cdae5e6db9e /var/lib/kubelet/pods/3aea55ba-fea5-4a2c-8101-a7a94f8a518a/volumes/kubernetes.io~csi/pvc-9bbcd16a-3cea-4572-8bc5-480804def335/mount Mar 08 04:16:50.539218 master-0 kubenswrapper[18592]: Output: mount: /var/lib/kubelet/pods/3aea55ba-fea5-4a2c-8101-a7a94f8a518a/volumes/kubernetes.io~csi/pvc-9bbcd16a-3cea-4572-8bc5-480804def335/mount: mount point does not exist. Mar 08 04:16:50.576661 master-0 kubenswrapper[18592]: I0308 04:16:50.576595 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/topolvm.io^4bd1e53e-4aca-4bc2-a5d3-7e2d486d9cfb" (OuterVolumeSpecName: "glance") pod "5958986d-a206-420e-91c0-5d55f14f2183" (UID: "5958986d-a206-420e-91c0-5d55f14f2183"). InnerVolumeSpecName "pvc-acff9521-23da-47da-b539-1fad9dc0c8dd". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 08 04:16:50.639407 master-0 kubenswrapper[18592]: I0308 04:16:50.639163 18592 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-acff9521-23da-47da-b539-1fad9dc0c8dd\" (UniqueName: \"kubernetes.io/csi/topolvm.io^4bd1e53e-4aca-4bc2-a5d3-7e2d486d9cfb\") on node \"master-0\" " Mar 08 04:16:50.664690 master-0 kubenswrapper[18592]: I0308 04:16:50.664641 18592 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 08 04:16:50.664920 master-0 kubenswrapper[18592]: I0308 04:16:50.664799 18592 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-acff9521-23da-47da-b539-1fad9dc0c8dd" (UniqueName: "kubernetes.io/csi/topolvm.io^4bd1e53e-4aca-4bc2-a5d3-7e2d486d9cfb") on node "master-0" Mar 08 04:16:50.742119 master-0 kubenswrapper[18592]: I0308 04:16:50.742064 18592 reconciler_common.go:293] "Volume detached for volume \"pvc-acff9521-23da-47da-b539-1fad9dc0c8dd\" (UniqueName: \"kubernetes.io/csi/topolvm.io^4bd1e53e-4aca-4bc2-a5d3-7e2d486d9cfb\") on node \"master-0\" DevicePath \"\"" Mar 08 04:16:50.952788 master-0 kubenswrapper[18592]: I0308 04:16:50.949917 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-afe2b-default-external-api-0"] Mar 08 04:16:50.990744 master-0 kubenswrapper[18592]: I0308 04:16:50.983686 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-afe2b-default-external-api-0"] Mar 08 04:16:51.007707 master-0 kubenswrapper[18592]: I0308 04:16:51.007654 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-afe2b-default-external-api-0"] Mar 08 04:16:51.008205 master-0 kubenswrapper[18592]: E0308 04:16:51.008181 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81d0a12f-c85b-41ef-a076-efa3dd40f9aa" containerName="mariadb-account-create-update" Mar 08 04:16:51.008205 master-0 
kubenswrapper[18592]: I0308 04:16:51.008203 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="81d0a12f-c85b-41ef-a076-efa3dd40f9aa" containerName="mariadb-account-create-update" Mar 08 04:16:51.008312 master-0 kubenswrapper[18592]: E0308 04:16:51.008232 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f03ac81-08d4-4053-9a33-ae4228fb3d3b" containerName="mariadb-database-create" Mar 08 04:16:51.008312 master-0 kubenswrapper[18592]: I0308 04:16:51.008240 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f03ac81-08d4-4053-9a33-ae4228fb3d3b" containerName="mariadb-database-create" Mar 08 04:16:51.008492 master-0 kubenswrapper[18592]: I0308 04:16:51.008471 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="81d0a12f-c85b-41ef-a076-efa3dd40f9aa" containerName="mariadb-account-create-update" Mar 08 04:16:51.008562 master-0 kubenswrapper[18592]: I0308 04:16:51.008520 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f03ac81-08d4-4053-9a33-ae4228fb3d3b" containerName="mariadb-database-create" Mar 08 04:16:51.009895 master-0 kubenswrapper[18592]: I0308 04:16:51.009860 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-afe2b-default-external-api-0" Mar 08 04:16:51.021418 master-0 kubenswrapper[18592]: I0308 04:16:51.020999 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 08 04:16:51.021418 master-0 kubenswrapper[18592]: I0308 04:16:51.021332 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-afe2b-default-external-config-data" Mar 08 04:16:51.045696 master-0 kubenswrapper[18592]: I0308 04:16:51.045625 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-afe2b-default-external-api-0"] Mar 08 04:16:51.048814 master-0 kubenswrapper[18592]: I0308 04:16:51.048767 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-9bbcd16a-3cea-4572-8bc5-480804def335\" (UniqueName: \"kubernetes.io/csi/topolvm.io^613c9bf7-76cc-44ce-8d9f-8cdae5e6db9e\") pod \"glance-afe2b-default-internal-api-0\" (UID: \"fa83b8bc-f9dc-4376-855b-59ba17c2c0e1\") " pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:16:51.151105 master-0 kubenswrapper[18592]: I0308 04:16:51.150976 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0eff6f29-794c-4597-b53f-c030263b2080-httpd-run\") pod \"glance-afe2b-default-external-api-0\" (UID: \"0eff6f29-794c-4597-b53f-c030263b2080\") " pod="openstack/glance-afe2b-default-external-api-0" Mar 08 04:16:51.151105 master-0 kubenswrapper[18592]: I0308 04:16:51.151066 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0eff6f29-794c-4597-b53f-c030263b2080-config-data\") pod \"glance-afe2b-default-external-api-0\" (UID: \"0eff6f29-794c-4597-b53f-c030263b2080\") " pod="openstack/glance-afe2b-default-external-api-0" Mar 08 04:16:51.151311 master-0 kubenswrapper[18592]: I0308 
04:16:51.151134 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dktq\" (UniqueName: \"kubernetes.io/projected/0eff6f29-794c-4597-b53f-c030263b2080-kube-api-access-5dktq\") pod \"glance-afe2b-default-external-api-0\" (UID: \"0eff6f29-794c-4597-b53f-c030263b2080\") " pod="openstack/glance-afe2b-default-external-api-0" Mar 08 04:16:51.151311 master-0 kubenswrapper[18592]: I0308 04:16:51.151161 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0eff6f29-794c-4597-b53f-c030263b2080-scripts\") pod \"glance-afe2b-default-external-api-0\" (UID: \"0eff6f29-794c-4597-b53f-c030263b2080\") " pod="openstack/glance-afe2b-default-external-api-0" Mar 08 04:16:51.151377 master-0 kubenswrapper[18592]: I0308 04:16:51.151330 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0eff6f29-794c-4597-b53f-c030263b2080-public-tls-certs\") pod \"glance-afe2b-default-external-api-0\" (UID: \"0eff6f29-794c-4597-b53f-c030263b2080\") " pod="openstack/glance-afe2b-default-external-api-0" Mar 08 04:16:51.151433 master-0 kubenswrapper[18592]: I0308 04:16:51.151412 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0eff6f29-794c-4597-b53f-c030263b2080-logs\") pod \"glance-afe2b-default-external-api-0\" (UID: \"0eff6f29-794c-4597-b53f-c030263b2080\") " pod="openstack/glance-afe2b-default-external-api-0" Mar 08 04:16:51.151557 master-0 kubenswrapper[18592]: I0308 04:16:51.151539 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0eff6f29-794c-4597-b53f-c030263b2080-combined-ca-bundle\") pod 
\"glance-afe2b-default-external-api-0\" (UID: \"0eff6f29-794c-4597-b53f-c030263b2080\") " pod="openstack/glance-afe2b-default-external-api-0" Mar 08 04:16:51.152025 master-0 kubenswrapper[18592]: I0308 04:16:51.152005 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-acff9521-23da-47da-b539-1fad9dc0c8dd\" (UniqueName: \"kubernetes.io/csi/topolvm.io^4bd1e53e-4aca-4bc2-a5d3-7e2d486d9cfb\") pod \"glance-afe2b-default-external-api-0\" (UID: \"0eff6f29-794c-4597-b53f-c030263b2080\") " pod="openstack/glance-afe2b-default-external-api-0" Mar 08 04:16:51.254248 master-0 kubenswrapper[18592]: I0308 04:16:51.254144 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-acff9521-23da-47da-b539-1fad9dc0c8dd\" (UniqueName: \"kubernetes.io/csi/topolvm.io^4bd1e53e-4aca-4bc2-a5d3-7e2d486d9cfb\") pod \"glance-afe2b-default-external-api-0\" (UID: \"0eff6f29-794c-4597-b53f-c030263b2080\") " pod="openstack/glance-afe2b-default-external-api-0" Mar 08 04:16:51.254928 master-0 kubenswrapper[18592]: I0308 04:16:51.254696 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0eff6f29-794c-4597-b53f-c030263b2080-httpd-run\") pod \"glance-afe2b-default-external-api-0\" (UID: \"0eff6f29-794c-4597-b53f-c030263b2080\") " pod="openstack/glance-afe2b-default-external-api-0" Mar 08 04:16:51.254928 master-0 kubenswrapper[18592]: I0308 04:16:51.254790 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0eff6f29-794c-4597-b53f-c030263b2080-config-data\") pod \"glance-afe2b-default-external-api-0\" (UID: \"0eff6f29-794c-4597-b53f-c030263b2080\") " pod="openstack/glance-afe2b-default-external-api-0" Mar 08 04:16:51.255041 master-0 kubenswrapper[18592]: I0308 04:16:51.254967 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-5dktq\" (UniqueName: \"kubernetes.io/projected/0eff6f29-794c-4597-b53f-c030263b2080-kube-api-access-5dktq\") pod \"glance-afe2b-default-external-api-0\" (UID: \"0eff6f29-794c-4597-b53f-c030263b2080\") " pod="openstack/glance-afe2b-default-external-api-0" Mar 08 04:16:51.255087 master-0 kubenswrapper[18592]: I0308 04:16:51.255060 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0eff6f29-794c-4597-b53f-c030263b2080-scripts\") pod \"glance-afe2b-default-external-api-0\" (UID: \"0eff6f29-794c-4597-b53f-c030263b2080\") " pod="openstack/glance-afe2b-default-external-api-0" Mar 08 04:16:51.255233 master-0 kubenswrapper[18592]: I0308 04:16:51.255215 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0eff6f29-794c-4597-b53f-c030263b2080-public-tls-certs\") pod \"glance-afe2b-default-external-api-0\" (UID: \"0eff6f29-794c-4597-b53f-c030263b2080\") " pod="openstack/glance-afe2b-default-external-api-0" Mar 08 04:16:51.255307 master-0 kubenswrapper[18592]: I0308 04:16:51.255290 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0eff6f29-794c-4597-b53f-c030263b2080-logs\") pod \"glance-afe2b-default-external-api-0\" (UID: \"0eff6f29-794c-4597-b53f-c030263b2080\") " pod="openstack/glance-afe2b-default-external-api-0" Mar 08 04:16:51.255363 master-0 kubenswrapper[18592]: I0308 04:16:51.255347 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0eff6f29-794c-4597-b53f-c030263b2080-combined-ca-bundle\") pod \"glance-afe2b-default-external-api-0\" (UID: \"0eff6f29-794c-4597-b53f-c030263b2080\") " pod="openstack/glance-afe2b-default-external-api-0" Mar 08 04:16:51.255887 master-0 kubenswrapper[18592]: I0308 04:16:51.255860 18592 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0eff6f29-794c-4597-b53f-c030263b2080-httpd-run\") pod \"glance-afe2b-default-external-api-0\" (UID: \"0eff6f29-794c-4597-b53f-c030263b2080\") " pod="openstack/glance-afe2b-default-external-api-0" Mar 08 04:16:51.257109 master-0 kubenswrapper[18592]: I0308 04:16:51.256796 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0eff6f29-794c-4597-b53f-c030263b2080-logs\") pod \"glance-afe2b-default-external-api-0\" (UID: \"0eff6f29-794c-4597-b53f-c030263b2080\") " pod="openstack/glance-afe2b-default-external-api-0" Mar 08 04:16:51.258720 master-0 kubenswrapper[18592]: I0308 04:16:51.257846 18592 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 08 04:16:51.258720 master-0 kubenswrapper[18592]: I0308 04:16:51.257893 18592 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-acff9521-23da-47da-b539-1fad9dc0c8dd\" (UniqueName: \"kubernetes.io/csi/topolvm.io^4bd1e53e-4aca-4bc2-a5d3-7e2d486d9cfb\") pod \"glance-afe2b-default-external-api-0\" (UID: \"0eff6f29-794c-4597-b53f-c030263b2080\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/4fa84c0fff972147ae10cb80fb512dbc343a3b9b44d5fb87069bb7869e732a9b/globalmount\"" pod="openstack/glance-afe2b-default-external-api-0" Mar 08 04:16:51.260737 master-0 kubenswrapper[18592]: I0308 04:16:51.260714 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0eff6f29-794c-4597-b53f-c030263b2080-scripts\") pod \"glance-afe2b-default-external-api-0\" (UID: \"0eff6f29-794c-4597-b53f-c030263b2080\") " pod="openstack/glance-afe2b-default-external-api-0" Mar 08 04:16:51.262535 master-0 kubenswrapper[18592]: I0308 04:16:51.262471 18592 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0eff6f29-794c-4597-b53f-c030263b2080-public-tls-certs\") pod \"glance-afe2b-default-external-api-0\" (UID: \"0eff6f29-794c-4597-b53f-c030263b2080\") " pod="openstack/glance-afe2b-default-external-api-0" Mar 08 04:16:51.264813 master-0 kubenswrapper[18592]: I0308 04:16:51.264759 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0eff6f29-794c-4597-b53f-c030263b2080-config-data\") pod \"glance-afe2b-default-external-api-0\" (UID: \"0eff6f29-794c-4597-b53f-c030263b2080\") " pod="openstack/glance-afe2b-default-external-api-0" Mar 08 04:16:51.281723 master-0 kubenswrapper[18592]: I0308 04:16:51.281670 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0eff6f29-794c-4597-b53f-c030263b2080-combined-ca-bundle\") pod \"glance-afe2b-default-external-api-0\" (UID: \"0eff6f29-794c-4597-b53f-c030263b2080\") " pod="openstack/glance-afe2b-default-external-api-0" Mar 08 04:16:51.282097 master-0 kubenswrapper[18592]: I0308 04:16:51.282067 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dktq\" (UniqueName: \"kubernetes.io/projected/0eff6f29-794c-4597-b53f-c030263b2080-kube-api-access-5dktq\") pod \"glance-afe2b-default-external-api-0\" (UID: \"0eff6f29-794c-4597-b53f-c030263b2080\") " pod="openstack/glance-afe2b-default-external-api-0" Mar 08 04:16:52.158444 master-0 kubenswrapper[18592]: I0308 04:16:52.158384 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3aea55ba-fea5-4a2c-8101-a7a94f8a518a" path="/var/lib/kubelet/pods/3aea55ba-fea5-4a2c-8101-a7a94f8a518a/volumes" Mar 08 04:16:52.159529 master-0 kubenswrapper[18592]: I0308 04:16:52.159512 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5958986d-a206-420e-91c0-5d55f14f2183" 
path="/var/lib/kubelet/pods/5958986d-a206-420e-91c0-5d55f14f2183/volumes" Mar 08 04:16:52.389095 master-0 kubenswrapper[18592]: I0308 04:16:52.389020 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-9bbcd16a-3cea-4572-8bc5-480804def335\" (UniqueName: \"kubernetes.io/csi/topolvm.io^613c9bf7-76cc-44ce-8d9f-8cdae5e6db9e\") pod \"glance-afe2b-default-internal-api-0\" (UID: \"fa83b8bc-f9dc-4376-855b-59ba17c2c0e1\") " pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:16:52.595931 master-0 kubenswrapper[18592]: I0308 04:16:52.595779 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:16:53.168027 master-0 kubenswrapper[18592]: I0308 04:16:53.167959 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-afe2b-default-internal-api-0"] Mar 08 04:16:53.385049 master-0 kubenswrapper[18592]: I0308 04:16:53.384974 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-afe2b-default-internal-api-0" event={"ID":"fa83b8bc-f9dc-4376-855b-59ba17c2c0e1","Type":"ContainerStarted","Data":"a3cda40a372fe08964e712fd6c2673cdcdb0a13bd49c0058145d1da01eec478c"} Mar 08 04:16:53.393636 master-0 kubenswrapper[18592]: I0308 04:16:53.393574 18592 generic.go:334] "Generic (PLEG): container finished" podID="319bbb45-9cc5-4110-8aed-94ccbe808779" containerID="815a2ab3203863df3a809e029c429539a9587ad09ba883aa8520d64bec13105d" exitCode=0 Mar 08 04:16:53.393736 master-0 kubenswrapper[18592]: I0308 04:16:53.393639 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-z8kn6" event={"ID":"319bbb45-9cc5-4110-8aed-94ccbe808779","Type":"ContainerDied","Data":"815a2ab3203863df3a809e029c429539a9587ad09ba883aa8520d64bec13105d"} Mar 08 04:16:53.399376 master-0 kubenswrapper[18592]: I0308 04:16:53.399282 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-7ttxm" 
event={"ID":"ad7d2ffe-d3ac-4a74-ae47-46241d6c4769","Type":"ContainerStarted","Data":"95cd58650c1743a863844b11c731a9d0decec809efde463936ffaea0589654db"} Mar 08 04:16:53.451065 master-0 kubenswrapper[18592]: I0308 04:16:53.450957 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-7ttxm" podStartSLOduration=2.80700648 podStartE2EDuration="8.450929513s" podCreationTimestamp="2026-03-08 04:16:45 +0000 UTC" firstStartedPulling="2026-03-08 04:16:46.872234038 +0000 UTC m=+1418.970988388" lastFinishedPulling="2026-03-08 04:16:52.516157061 +0000 UTC m=+1424.614911421" observedRunningTime="2026-03-08 04:16:53.437569739 +0000 UTC m=+1425.536324089" watchObservedRunningTime="2026-03-08 04:16:53.450929513 +0000 UTC m=+1425.549683893" Mar 08 04:16:53.790979 master-0 kubenswrapper[18592]: I0308 04:16:53.790941 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-acff9521-23da-47da-b539-1fad9dc0c8dd\" (UniqueName: \"kubernetes.io/csi/topolvm.io^4bd1e53e-4aca-4bc2-a5d3-7e2d486d9cfb\") pod \"glance-afe2b-default-external-api-0\" (UID: \"0eff6f29-794c-4597-b53f-c030263b2080\") " pod="openstack/glance-afe2b-default-external-api-0" Mar 08 04:16:54.053112 master-0 kubenswrapper[18592]: I0308 04:16:54.052957 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-afe2b-default-external-api-0" Mar 08 04:16:54.449399 master-0 kubenswrapper[18592]: I0308 04:16:54.449340 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-afe2b-default-internal-api-0" event={"ID":"fa83b8bc-f9dc-4376-855b-59ba17c2c0e1","Type":"ContainerStarted","Data":"5754359ee075d07402e8f61461cb8a36550cb96bb16b4338c76ba1df26aca2ca"} Mar 08 04:16:54.562094 master-0 kubenswrapper[18592]: I0308 04:16:54.561774 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-afe2b-default-external-api-0"] Mar 08 04:16:54.966505 master-0 kubenswrapper[18592]: I0308 04:16:54.966330 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-z8kn6" Mar 08 04:16:55.082917 master-0 kubenswrapper[18592]: I0308 04:16:55.082505 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/319bbb45-9cc5-4110-8aed-94ccbe808779-config-data\") pod \"319bbb45-9cc5-4110-8aed-94ccbe808779\" (UID: \"319bbb45-9cc5-4110-8aed-94ccbe808779\") " Mar 08 04:16:55.082917 master-0 kubenswrapper[18592]: I0308 04:16:55.082692 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/319bbb45-9cc5-4110-8aed-94ccbe808779-fernet-keys\") pod \"319bbb45-9cc5-4110-8aed-94ccbe808779\" (UID: \"319bbb45-9cc5-4110-8aed-94ccbe808779\") " Mar 08 04:16:55.082917 master-0 kubenswrapper[18592]: I0308 04:16:55.082723 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpm5w\" (UniqueName: \"kubernetes.io/projected/319bbb45-9cc5-4110-8aed-94ccbe808779-kube-api-access-tpm5w\") pod \"319bbb45-9cc5-4110-8aed-94ccbe808779\" (UID: \"319bbb45-9cc5-4110-8aed-94ccbe808779\") " Mar 08 04:16:55.082917 master-0 kubenswrapper[18592]: I0308 04:16:55.082860 18592 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/319bbb45-9cc5-4110-8aed-94ccbe808779-credential-keys\") pod \"319bbb45-9cc5-4110-8aed-94ccbe808779\" (UID: \"319bbb45-9cc5-4110-8aed-94ccbe808779\") " Mar 08 04:16:55.083260 master-0 kubenswrapper[18592]: I0308 04:16:55.082979 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/319bbb45-9cc5-4110-8aed-94ccbe808779-combined-ca-bundle\") pod \"319bbb45-9cc5-4110-8aed-94ccbe808779\" (UID: \"319bbb45-9cc5-4110-8aed-94ccbe808779\") " Mar 08 04:16:55.083260 master-0 kubenswrapper[18592]: I0308 04:16:55.083136 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/319bbb45-9cc5-4110-8aed-94ccbe808779-scripts\") pod \"319bbb45-9cc5-4110-8aed-94ccbe808779\" (UID: \"319bbb45-9cc5-4110-8aed-94ccbe808779\") " Mar 08 04:16:55.092045 master-0 kubenswrapper[18592]: I0308 04:16:55.091993 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/319bbb45-9cc5-4110-8aed-94ccbe808779-kube-api-access-tpm5w" (OuterVolumeSpecName: "kube-api-access-tpm5w") pod "319bbb45-9cc5-4110-8aed-94ccbe808779" (UID: "319bbb45-9cc5-4110-8aed-94ccbe808779"). InnerVolumeSpecName "kube-api-access-tpm5w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 04:16:55.092238 master-0 kubenswrapper[18592]: I0308 04:16:55.092182 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/319bbb45-9cc5-4110-8aed-94ccbe808779-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "319bbb45-9cc5-4110-8aed-94ccbe808779" (UID: "319bbb45-9cc5-4110-8aed-94ccbe808779"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:16:55.092330 master-0 kubenswrapper[18592]: I0308 04:16:55.092299 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/319bbb45-9cc5-4110-8aed-94ccbe808779-scripts" (OuterVolumeSpecName: "scripts") pod "319bbb45-9cc5-4110-8aed-94ccbe808779" (UID: "319bbb45-9cc5-4110-8aed-94ccbe808779"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:16:55.092369 master-0 kubenswrapper[18592]: I0308 04:16:55.092283 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/319bbb45-9cc5-4110-8aed-94ccbe808779-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "319bbb45-9cc5-4110-8aed-94ccbe808779" (UID: "319bbb45-9cc5-4110-8aed-94ccbe808779"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:16:55.124166 master-0 kubenswrapper[18592]: I0308 04:16:55.124096 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/319bbb45-9cc5-4110-8aed-94ccbe808779-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "319bbb45-9cc5-4110-8aed-94ccbe808779" (UID: "319bbb45-9cc5-4110-8aed-94ccbe808779"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:16:55.128836 master-0 kubenswrapper[18592]: I0308 04:16:55.128503 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/319bbb45-9cc5-4110-8aed-94ccbe808779-config-data" (OuterVolumeSpecName: "config-data") pod "319bbb45-9cc5-4110-8aed-94ccbe808779" (UID: "319bbb45-9cc5-4110-8aed-94ccbe808779"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:16:55.187235 master-0 kubenswrapper[18592]: I0308 04:16:55.187084 18592 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/319bbb45-9cc5-4110-8aed-94ccbe808779-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 04:16:55.187235 master-0 kubenswrapper[18592]: I0308 04:16:55.187194 18592 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/319bbb45-9cc5-4110-8aed-94ccbe808779-config-data\") on node \"master-0\" DevicePath \"\"" Mar 08 04:16:55.187235 master-0 kubenswrapper[18592]: I0308 04:16:55.187206 18592 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/319bbb45-9cc5-4110-8aed-94ccbe808779-fernet-keys\") on node \"master-0\" DevicePath \"\"" Mar 08 04:16:55.187235 master-0 kubenswrapper[18592]: I0308 04:16:55.187216 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpm5w\" (UniqueName: \"kubernetes.io/projected/319bbb45-9cc5-4110-8aed-94ccbe808779-kube-api-access-tpm5w\") on node \"master-0\" DevicePath \"\"" Mar 08 04:16:55.187235 master-0 kubenswrapper[18592]: I0308 04:16:55.187226 18592 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/319bbb45-9cc5-4110-8aed-94ccbe808779-credential-keys\") on node \"master-0\" DevicePath \"\"" Mar 08 04:16:55.187235 master-0 kubenswrapper[18592]: I0308 04:16:55.187235 18592 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/319bbb45-9cc5-4110-8aed-94ccbe808779-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 04:16:55.322464 master-0 kubenswrapper[18592]: I0308 04:16:55.322415 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-db-sync-78kcs"] Mar 08 04:16:55.323075 master-0 kubenswrapper[18592]: E0308 
04:16:55.323033 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="319bbb45-9cc5-4110-8aed-94ccbe808779" containerName="keystone-bootstrap" Mar 08 04:16:55.323163 master-0 kubenswrapper[18592]: I0308 04:16:55.323078 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="319bbb45-9cc5-4110-8aed-94ccbe808779" containerName="keystone-bootstrap" Mar 08 04:16:55.323338 master-0 kubenswrapper[18592]: I0308 04:16:55.323311 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="319bbb45-9cc5-4110-8aed-94ccbe808779" containerName="keystone-bootstrap" Mar 08 04:16:55.324611 master-0 kubenswrapper[18592]: I0308 04:16:55.324518 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-db-sync-78kcs" Mar 08 04:16:55.329109 master-0 kubenswrapper[18592]: I0308 04:16:55.327880 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-config-data" Mar 08 04:16:55.329109 master-0 kubenswrapper[18592]: I0308 04:16:55.328179 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-scripts" Mar 08 04:16:55.341427 master-0 kubenswrapper[18592]: I0308 04:16:55.337280 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-db-sync-78kcs"] Mar 08 04:16:55.469924 master-0 kubenswrapper[18592]: I0308 04:16:55.469861 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-afe2b-default-external-api-0" event={"ID":"0eff6f29-794c-4597-b53f-c030263b2080","Type":"ContainerStarted","Data":"9bdd4546b9740edeb6f0f584dc8c3feaebf706adb02f7a37561eaa759eeb812f"} Mar 08 04:16:55.469924 master-0 kubenswrapper[18592]: I0308 04:16:55.469921 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-afe2b-default-external-api-0" event={"ID":"0eff6f29-794c-4597-b53f-c030263b2080","Type":"ContainerStarted","Data":"dfdbb9253f240f054c23481b82fe627e31b5016d69e891ea9e38d0591252690d"} Mar 08 04:16:55.472500 
master-0 kubenswrapper[18592]: I0308 04:16:55.472446 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-afe2b-default-internal-api-0" event={"ID":"fa83b8bc-f9dc-4376-855b-59ba17c2c0e1","Type":"ContainerStarted","Data":"9f477e1a96bfab138a86ff6cf71109073bb51bdcf5d50327fc3da0149cb0c5c1"} Mar 08 04:16:55.479765 master-0 kubenswrapper[18592]: I0308 04:16:55.479712 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-z8kn6" event={"ID":"319bbb45-9cc5-4110-8aed-94ccbe808779","Type":"ContainerDied","Data":"bc21c00eac515f76c290618c7f8a28f745651b7325e8c3f3a3e7beeeff9378cd"} Mar 08 04:16:55.479765 master-0 kubenswrapper[18592]: I0308 04:16:55.479752 18592 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc21c00eac515f76c290618c7f8a28f745651b7325e8c3f3a3e7beeeff9378cd" Mar 08 04:16:55.480102 master-0 kubenswrapper[18592]: I0308 04:16:55.479803 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-z8kn6" Mar 08 04:16:55.506406 master-0 kubenswrapper[18592]: I0308 04:16:55.502035 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rcph\" (UniqueName: \"kubernetes.io/projected/c11a4533-a895-42a7-8c17-d6f421276ae0-kube-api-access-6rcph\") pod \"ironic-db-sync-78kcs\" (UID: \"c11a4533-a895-42a7-8c17-d6f421276ae0\") " pod="openstack/ironic-db-sync-78kcs" Mar 08 04:16:55.506406 master-0 kubenswrapper[18592]: I0308 04:16:55.502144 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/c11a4533-a895-42a7-8c17-d6f421276ae0-config-data-merged\") pod \"ironic-db-sync-78kcs\" (UID: \"c11a4533-a895-42a7-8c17-d6f421276ae0\") " pod="openstack/ironic-db-sync-78kcs" Mar 08 04:16:55.506406 master-0 kubenswrapper[18592]: I0308 04:16:55.502229 18592 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c11a4533-a895-42a7-8c17-d6f421276ae0-combined-ca-bundle\") pod \"ironic-db-sync-78kcs\" (UID: \"c11a4533-a895-42a7-8c17-d6f421276ae0\") " pod="openstack/ironic-db-sync-78kcs" Mar 08 04:16:55.506406 master-0 kubenswrapper[18592]: I0308 04:16:55.502325 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c11a4533-a895-42a7-8c17-d6f421276ae0-config-data\") pod \"ironic-db-sync-78kcs\" (UID: \"c11a4533-a895-42a7-8c17-d6f421276ae0\") " pod="openstack/ironic-db-sync-78kcs" Mar 08 04:16:55.506406 master-0 kubenswrapper[18592]: I0308 04:16:55.502366 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/c11a4533-a895-42a7-8c17-d6f421276ae0-etc-podinfo\") pod \"ironic-db-sync-78kcs\" (UID: \"c11a4533-a895-42a7-8c17-d6f421276ae0\") " pod="openstack/ironic-db-sync-78kcs" Mar 08 04:16:55.506406 master-0 kubenswrapper[18592]: I0308 04:16:55.502478 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c11a4533-a895-42a7-8c17-d6f421276ae0-scripts\") pod \"ironic-db-sync-78kcs\" (UID: \"c11a4533-a895-42a7-8c17-d6f421276ae0\") " pod="openstack/ironic-db-sync-78kcs" Mar 08 04:16:55.530575 master-0 kubenswrapper[18592]: I0308 04:16:55.530471 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-afe2b-default-internal-api-0" podStartSLOduration=6.53044873 podStartE2EDuration="6.53044873s" podCreationTimestamp="2026-03-08 04:16:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 04:16:55.500354353 +0000 
UTC m=+1427.599108723" watchObservedRunningTime="2026-03-08 04:16:55.53044873 +0000 UTC m=+1427.629203090" Mar 08 04:16:55.560769 master-0 kubenswrapper[18592]: I0308 04:16:55.560704 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-z8kn6"] Mar 08 04:16:55.572889 master-0 kubenswrapper[18592]: I0308 04:16:55.572817 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-z8kn6"] Mar 08 04:16:55.608903 master-0 kubenswrapper[18592]: I0308 04:16:55.607667 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c11a4533-a895-42a7-8c17-d6f421276ae0-scripts\") pod \"ironic-db-sync-78kcs\" (UID: \"c11a4533-a895-42a7-8c17-d6f421276ae0\") " pod="openstack/ironic-db-sync-78kcs" Mar 08 04:16:55.608903 master-0 kubenswrapper[18592]: I0308 04:16:55.607852 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rcph\" (UniqueName: \"kubernetes.io/projected/c11a4533-a895-42a7-8c17-d6f421276ae0-kube-api-access-6rcph\") pod \"ironic-db-sync-78kcs\" (UID: \"c11a4533-a895-42a7-8c17-d6f421276ae0\") " pod="openstack/ironic-db-sync-78kcs" Mar 08 04:16:55.608903 master-0 kubenswrapper[18592]: I0308 04:16:55.607958 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/c11a4533-a895-42a7-8c17-d6f421276ae0-config-data-merged\") pod \"ironic-db-sync-78kcs\" (UID: \"c11a4533-a895-42a7-8c17-d6f421276ae0\") " pod="openstack/ironic-db-sync-78kcs" Mar 08 04:16:55.608903 master-0 kubenswrapper[18592]: I0308 04:16:55.608029 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c11a4533-a895-42a7-8c17-d6f421276ae0-combined-ca-bundle\") pod \"ironic-db-sync-78kcs\" (UID: \"c11a4533-a895-42a7-8c17-d6f421276ae0\") " 
pod="openstack/ironic-db-sync-78kcs" Mar 08 04:16:55.608903 master-0 kubenswrapper[18592]: I0308 04:16:55.608080 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c11a4533-a895-42a7-8c17-d6f421276ae0-config-data\") pod \"ironic-db-sync-78kcs\" (UID: \"c11a4533-a895-42a7-8c17-d6f421276ae0\") " pod="openstack/ironic-db-sync-78kcs" Mar 08 04:16:55.608903 master-0 kubenswrapper[18592]: I0308 04:16:55.608103 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/c11a4533-a895-42a7-8c17-d6f421276ae0-etc-podinfo\") pod \"ironic-db-sync-78kcs\" (UID: \"c11a4533-a895-42a7-8c17-d6f421276ae0\") " pod="openstack/ironic-db-sync-78kcs" Mar 08 04:16:55.609271 master-0 kubenswrapper[18592]: I0308 04:16:55.609204 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/c11a4533-a895-42a7-8c17-d6f421276ae0-config-data-merged\") pod \"ironic-db-sync-78kcs\" (UID: \"c11a4533-a895-42a7-8c17-d6f421276ae0\") " pod="openstack/ironic-db-sync-78kcs" Mar 08 04:16:55.612890 master-0 kubenswrapper[18592]: I0308 04:16:55.612084 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c11a4533-a895-42a7-8c17-d6f421276ae0-config-data\") pod \"ironic-db-sync-78kcs\" (UID: \"c11a4533-a895-42a7-8c17-d6f421276ae0\") " pod="openstack/ironic-db-sync-78kcs" Mar 08 04:16:55.612890 master-0 kubenswrapper[18592]: I0308 04:16:55.612813 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c11a4533-a895-42a7-8c17-d6f421276ae0-combined-ca-bundle\") pod \"ironic-db-sync-78kcs\" (UID: \"c11a4533-a895-42a7-8c17-d6f421276ae0\") " pod="openstack/ironic-db-sync-78kcs" Mar 08 04:16:55.614647 master-0 kubenswrapper[18592]: I0308 
04:16:55.614616 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/c11a4533-a895-42a7-8c17-d6f421276ae0-etc-podinfo\") pod \"ironic-db-sync-78kcs\" (UID: \"c11a4533-a895-42a7-8c17-d6f421276ae0\") " pod="openstack/ironic-db-sync-78kcs" Mar 08 04:16:55.622654 master-0 kubenswrapper[18592]: I0308 04:16:55.622610 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c11a4533-a895-42a7-8c17-d6f421276ae0-scripts\") pod \"ironic-db-sync-78kcs\" (UID: \"c11a4533-a895-42a7-8c17-d6f421276ae0\") " pod="openstack/ironic-db-sync-78kcs" Mar 08 04:16:55.624300 master-0 kubenswrapper[18592]: I0308 04:16:55.624268 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rcph\" (UniqueName: \"kubernetes.io/projected/c11a4533-a895-42a7-8c17-d6f421276ae0-kube-api-access-6rcph\") pod \"ironic-db-sync-78kcs\" (UID: \"c11a4533-a895-42a7-8c17-d6f421276ae0\") " pod="openstack/ironic-db-sync-78kcs" Mar 08 04:16:55.669845 master-0 kubenswrapper[18592]: I0308 04:16:55.669604 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-xr25c"] Mar 08 04:16:55.673836 master-0 kubenswrapper[18592]: I0308 04:16:55.671222 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-xr25c" Mar 08 04:16:55.677690 master-0 kubenswrapper[18592]: I0308 04:16:55.674775 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 08 04:16:55.677690 master-0 kubenswrapper[18592]: I0308 04:16:55.675006 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 08 04:16:55.677690 master-0 kubenswrapper[18592]: I0308 04:16:55.675108 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 08 04:16:55.677690 master-0 kubenswrapper[18592]: I0308 04:16:55.675588 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-db-sync-78kcs" Mar 08 04:16:55.686441 master-0 kubenswrapper[18592]: I0308 04:16:55.686389 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-xr25c"] Mar 08 04:16:55.831850 master-0 kubenswrapper[18592]: I0308 04:16:55.817400 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf-combined-ca-bundle\") pod \"keystone-bootstrap-xr25c\" (UID: \"3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf\") " pod="openstack/keystone-bootstrap-xr25c" Mar 08 04:16:55.831850 master-0 kubenswrapper[18592]: I0308 04:16:55.817462 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf-fernet-keys\") pod \"keystone-bootstrap-xr25c\" (UID: \"3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf\") " pod="openstack/keystone-bootstrap-xr25c" Mar 08 04:16:55.831850 master-0 kubenswrapper[18592]: I0308 04:16:55.817567 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf-config-data\") pod \"keystone-bootstrap-xr25c\" (UID: \"3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf\") " pod="openstack/keystone-bootstrap-xr25c" Mar 08 04:16:55.831850 master-0 kubenswrapper[18592]: I0308 04:16:55.817903 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nl74c\" (UniqueName: \"kubernetes.io/projected/3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf-kube-api-access-nl74c\") pod \"keystone-bootstrap-xr25c\" (UID: \"3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf\") " pod="openstack/keystone-bootstrap-xr25c" Mar 08 04:16:55.831850 master-0 kubenswrapper[18592]: I0308 04:16:55.818021 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf-scripts\") pod \"keystone-bootstrap-xr25c\" (UID: \"3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf\") " pod="openstack/keystone-bootstrap-xr25c" Mar 08 04:16:55.831850 master-0 kubenswrapper[18592]: I0308 04:16:55.818089 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf-credential-keys\") pod \"keystone-bootstrap-xr25c\" (UID: \"3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf\") " pod="openstack/keystone-bootstrap-xr25c" Mar 08 04:16:55.844044 master-0 kubenswrapper[18592]: I0308 04:16:55.843943 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5584fcc769-csbz6" Mar 08 04:16:55.928362 master-0 kubenswrapper[18592]: I0308 04:16:55.920676 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf-credential-keys\") pod \"keystone-bootstrap-xr25c\" (UID: \"3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf\") " 
pod="openstack/keystone-bootstrap-xr25c" Mar 08 04:16:55.928362 master-0 kubenswrapper[18592]: I0308 04:16:55.920747 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf-combined-ca-bundle\") pod \"keystone-bootstrap-xr25c\" (UID: \"3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf\") " pod="openstack/keystone-bootstrap-xr25c" Mar 08 04:16:55.928362 master-0 kubenswrapper[18592]: I0308 04:16:55.920776 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf-fernet-keys\") pod \"keystone-bootstrap-xr25c\" (UID: \"3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf\") " pod="openstack/keystone-bootstrap-xr25c" Mar 08 04:16:55.928362 master-0 kubenswrapper[18592]: I0308 04:16:55.920806 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf-config-data\") pod \"keystone-bootstrap-xr25c\" (UID: \"3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf\") " pod="openstack/keystone-bootstrap-xr25c" Mar 08 04:16:55.928362 master-0 kubenswrapper[18592]: I0308 04:16:55.920955 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nl74c\" (UniqueName: \"kubernetes.io/projected/3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf-kube-api-access-nl74c\") pod \"keystone-bootstrap-xr25c\" (UID: \"3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf\") " pod="openstack/keystone-bootstrap-xr25c" Mar 08 04:16:55.928362 master-0 kubenswrapper[18592]: I0308 04:16:55.920999 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf-scripts\") pod \"keystone-bootstrap-xr25c\" (UID: \"3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf\") " pod="openstack/keystone-bootstrap-xr25c" 
Mar 08 04:16:55.928362 master-0 kubenswrapper[18592]: I0308 04:16:55.924127 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf-scripts\") pod \"keystone-bootstrap-xr25c\" (UID: \"3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf\") " pod="openstack/keystone-bootstrap-xr25c" Mar 08 04:16:55.940368 master-0 kubenswrapper[18592]: I0308 04:16:55.940168 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf-combined-ca-bundle\") pod \"keystone-bootstrap-xr25c\" (UID: \"3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf\") " pod="openstack/keystone-bootstrap-xr25c" Mar 08 04:16:55.940368 master-0 kubenswrapper[18592]: I0308 04:16:55.940192 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf-config-data\") pod \"keystone-bootstrap-xr25c\" (UID: \"3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf\") " pod="openstack/keystone-bootstrap-xr25c" Mar 08 04:16:55.940368 master-0 kubenswrapper[18592]: I0308 04:16:55.940252 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf-fernet-keys\") pod \"keystone-bootstrap-xr25c\" (UID: \"3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf\") " pod="openstack/keystone-bootstrap-xr25c" Mar 08 04:16:55.944840 master-0 kubenswrapper[18592]: I0308 04:16:55.943262 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf-credential-keys\") pod \"keystone-bootstrap-xr25c\" (UID: \"3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf\") " pod="openstack/keystone-bootstrap-xr25c" Mar 08 04:16:55.958902 master-0 kubenswrapper[18592]: I0308 04:16:55.956272 18592 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b9cd4dcf7-ln4zc"] Mar 08 04:16:55.958902 master-0 kubenswrapper[18592]: I0308 04:16:55.956619 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b9cd4dcf7-ln4zc" podUID="2bdf0b67-8d62-4dfc-ae07-730475c2a471" containerName="dnsmasq-dns" containerID="cri-o://0d093d9478d9ee3a76fd2b9381328c07c09751f36489ef028cc9d3e5acc5da32" gracePeriod=10 Mar 08 04:16:55.959268 master-0 kubenswrapper[18592]: I0308 04:16:55.959069 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nl74c\" (UniqueName: \"kubernetes.io/projected/3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf-kube-api-access-nl74c\") pod \"keystone-bootstrap-xr25c\" (UID: \"3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf\") " pod="openstack/keystone-bootstrap-xr25c" Mar 08 04:16:56.109411 master-0 kubenswrapper[18592]: I0308 04:16:56.109352 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xr25c" Mar 08 04:16:56.162361 master-0 kubenswrapper[18592]: I0308 04:16:56.162309 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="319bbb45-9cc5-4110-8aed-94ccbe808779" path="/var/lib/kubelet/pods/319bbb45-9cc5-4110-8aed-94ccbe808779/volumes" Mar 08 04:16:56.291173 master-0 kubenswrapper[18592]: I0308 04:16:56.287774 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-db-sync-78kcs"] Mar 08 04:16:56.511061 master-0 kubenswrapper[18592]: I0308 04:16:56.511012 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-sync-78kcs" event={"ID":"c11a4533-a895-42a7-8c17-d6f421276ae0","Type":"ContainerStarted","Data":"9676ec124b0a6ead43b93ddca3b487f24b8d04d3023c9889fddbba49b66f557d"} Mar 08 04:16:56.523648 master-0 kubenswrapper[18592]: I0308 04:16:56.520249 18592 generic.go:334] "Generic (PLEG): container finished" podID="2bdf0b67-8d62-4dfc-ae07-730475c2a471" 
containerID="0d093d9478d9ee3a76fd2b9381328c07c09751f36489ef028cc9d3e5acc5da32" exitCode=0 Mar 08 04:16:56.523648 master-0 kubenswrapper[18592]: I0308 04:16:56.520321 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b9cd4dcf7-ln4zc" event={"ID":"2bdf0b67-8d62-4dfc-ae07-730475c2a471","Type":"ContainerDied","Data":"0d093d9478d9ee3a76fd2b9381328c07c09751f36489ef028cc9d3e5acc5da32"} Mar 08 04:16:56.528093 master-0 kubenswrapper[18592]: I0308 04:16:56.527543 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-afe2b-default-external-api-0" event={"ID":"0eff6f29-794c-4597-b53f-c030263b2080","Type":"ContainerStarted","Data":"acb051cdf721850d97db6528a56f259be53e65432bf2f5a93ccebdde9aac4e11"} Mar 08 04:16:56.568378 master-0 kubenswrapper[18592]: I0308 04:16:56.568283 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-afe2b-default-external-api-0" podStartSLOduration=6.568263541 podStartE2EDuration="6.568263541s" podCreationTimestamp="2026-03-08 04:16:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 04:16:56.556407649 +0000 UTC m=+1428.655161989" watchObservedRunningTime="2026-03-08 04:16:56.568263541 +0000 UTC m=+1428.667017881" Mar 08 04:16:56.855444 master-0 kubenswrapper[18592]: I0308 04:16:56.855378 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-xr25c"] Mar 08 04:16:56.860776 master-0 kubenswrapper[18592]: I0308 04:16:56.860744 18592 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b9cd4dcf7-ln4zc" Mar 08 04:16:56.956434 master-0 kubenswrapper[18592]: I0308 04:16:56.956377 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bdf0b67-8d62-4dfc-ae07-730475c2a471-config\") pod \"2bdf0b67-8d62-4dfc-ae07-730475c2a471\" (UID: \"2bdf0b67-8d62-4dfc-ae07-730475c2a471\") " Mar 08 04:16:56.956622 master-0 kubenswrapper[18592]: I0308 04:16:56.956496 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2bdf0b67-8d62-4dfc-ae07-730475c2a471-dns-svc\") pod \"2bdf0b67-8d62-4dfc-ae07-730475c2a471\" (UID: \"2bdf0b67-8d62-4dfc-ae07-730475c2a471\") " Mar 08 04:16:56.956622 master-0 kubenswrapper[18592]: I0308 04:16:56.956538 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2bdf0b67-8d62-4dfc-ae07-730475c2a471-ovsdbserver-sb\") pod \"2bdf0b67-8d62-4dfc-ae07-730475c2a471\" (UID: \"2bdf0b67-8d62-4dfc-ae07-730475c2a471\") " Mar 08 04:16:56.956622 master-0 kubenswrapper[18592]: I0308 04:16:56.956619 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2bdf0b67-8d62-4dfc-ae07-730475c2a471-ovsdbserver-nb\") pod \"2bdf0b67-8d62-4dfc-ae07-730475c2a471\" (UID: \"2bdf0b67-8d62-4dfc-ae07-730475c2a471\") " Mar 08 04:16:56.956744 master-0 kubenswrapper[18592]: I0308 04:16:56.956715 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gdhsk\" (UniqueName: \"kubernetes.io/projected/2bdf0b67-8d62-4dfc-ae07-730475c2a471-kube-api-access-gdhsk\") pod \"2bdf0b67-8d62-4dfc-ae07-730475c2a471\" (UID: \"2bdf0b67-8d62-4dfc-ae07-730475c2a471\") " Mar 08 04:16:56.974166 master-0 kubenswrapper[18592]: I0308 04:16:56.960518 18592 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bdf0b67-8d62-4dfc-ae07-730475c2a471-kube-api-access-gdhsk" (OuterVolumeSpecName: "kube-api-access-gdhsk") pod "2bdf0b67-8d62-4dfc-ae07-730475c2a471" (UID: "2bdf0b67-8d62-4dfc-ae07-730475c2a471"). InnerVolumeSpecName "kube-api-access-gdhsk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 04:16:57.007431 master-0 kubenswrapper[18592]: I0308 04:16:57.007366 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bdf0b67-8d62-4dfc-ae07-730475c2a471-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2bdf0b67-8d62-4dfc-ae07-730475c2a471" (UID: "2bdf0b67-8d62-4dfc-ae07-730475c2a471"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:16:57.037518 master-0 kubenswrapper[18592]: I0308 04:16:57.034761 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bdf0b67-8d62-4dfc-ae07-730475c2a471-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2bdf0b67-8d62-4dfc-ae07-730475c2a471" (UID: "2bdf0b67-8d62-4dfc-ae07-730475c2a471"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:16:57.040885 master-0 kubenswrapper[18592]: I0308 04:16:57.040814 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bdf0b67-8d62-4dfc-ae07-730475c2a471-config" (OuterVolumeSpecName: "config") pod "2bdf0b67-8d62-4dfc-ae07-730475c2a471" (UID: "2bdf0b67-8d62-4dfc-ae07-730475c2a471"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:16:57.060099 master-0 kubenswrapper[18592]: I0308 04:16:57.060022 18592 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bdf0b67-8d62-4dfc-ae07-730475c2a471-config\") on node \"master-0\" DevicePath \"\"" Mar 08 04:16:57.060099 master-0 kubenswrapper[18592]: I0308 04:16:57.060072 18592 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2bdf0b67-8d62-4dfc-ae07-730475c2a471-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Mar 08 04:16:57.060099 master-0 kubenswrapper[18592]: I0308 04:16:57.060084 18592 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2bdf0b67-8d62-4dfc-ae07-730475c2a471-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Mar 08 04:16:57.060099 master-0 kubenswrapper[18592]: I0308 04:16:57.060092 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gdhsk\" (UniqueName: \"kubernetes.io/projected/2bdf0b67-8d62-4dfc-ae07-730475c2a471-kube-api-access-gdhsk\") on node \"master-0\" DevicePath \"\"" Mar 08 04:16:57.076567 master-0 kubenswrapper[18592]: I0308 04:16:57.076511 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bdf0b67-8d62-4dfc-ae07-730475c2a471-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2bdf0b67-8d62-4dfc-ae07-730475c2a471" (UID: "2bdf0b67-8d62-4dfc-ae07-730475c2a471"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:16:57.166024 master-0 kubenswrapper[18592]: I0308 04:16:57.165971 18592 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2bdf0b67-8d62-4dfc-ae07-730475c2a471-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 08 04:16:57.542898 master-0 kubenswrapper[18592]: I0308 04:16:57.542751 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b9cd4dcf7-ln4zc" event={"ID":"2bdf0b67-8d62-4dfc-ae07-730475c2a471","Type":"ContainerDied","Data":"48ee007d399662c19ae25cf2325833aad800b6d6d8cf98341b2fa8805d557b96"} Mar 08 04:16:57.542898 master-0 kubenswrapper[18592]: I0308 04:16:57.542810 18592 scope.go:117] "RemoveContainer" containerID="0d093d9478d9ee3a76fd2b9381328c07c09751f36489ef028cc9d3e5acc5da32" Mar 08 04:16:57.543551 master-0 kubenswrapper[18592]: I0308 04:16:57.542955 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b9cd4dcf7-ln4zc" Mar 08 04:16:57.551549 master-0 kubenswrapper[18592]: I0308 04:16:57.551497 18592 generic.go:334] "Generic (PLEG): container finished" podID="ad7d2ffe-d3ac-4a74-ae47-46241d6c4769" containerID="95cd58650c1743a863844b11c731a9d0decec809efde463936ffaea0589654db" exitCode=0 Mar 08 04:16:57.551639 master-0 kubenswrapper[18592]: I0308 04:16:57.551581 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-7ttxm" event={"ID":"ad7d2ffe-d3ac-4a74-ae47-46241d6c4769","Type":"ContainerDied","Data":"95cd58650c1743a863844b11c731a9d0decec809efde463936ffaea0589654db"} Mar 08 04:16:57.786507 master-0 kubenswrapper[18592]: I0308 04:16:57.786426 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b9cd4dcf7-ln4zc"] Mar 08 04:16:57.791933 master-0 kubenswrapper[18592]: I0308 04:16:57.791893 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b9cd4dcf7-ln4zc"] 
Mar 08 04:16:58.161940 master-0 kubenswrapper[18592]: I0308 04:16:58.161878 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bdf0b67-8d62-4dfc-ae07-730475c2a471" path="/var/lib/kubelet/pods/2bdf0b67-8d62-4dfc-ae07-730475c2a471/volumes"
Mar 08 04:17:02.596245 master-0 kubenswrapper[18592]: I0308 04:17:02.596161 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-afe2b-default-internal-api-0"
Mar 08 04:17:02.596245 master-0 kubenswrapper[18592]: I0308 04:17:02.596257 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-afe2b-default-internal-api-0"
Mar 08 04:17:02.642113 master-0 kubenswrapper[18592]: I0308 04:17:02.642026 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-afe2b-default-internal-api-0"
Mar 08 04:17:02.642630 master-0 kubenswrapper[18592]: I0308 04:17:02.642540 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-afe2b-default-internal-api-0"
Mar 08 04:17:02.671583 master-0 kubenswrapper[18592]: I0308 04:17:02.671496 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-afe2b-default-internal-api-0"
Mar 08 04:17:03.653655 master-0 kubenswrapper[18592]: I0308 04:17:03.653570 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-afe2b-default-internal-api-0"
Mar 08 04:17:04.053944 master-0 kubenswrapper[18592]: I0308 04:17:04.053872 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-afe2b-default-external-api-0"
Mar 08 04:17:04.053944 master-0 kubenswrapper[18592]: I0308 04:17:04.053949 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-afe2b-default-external-api-0"
Mar 08 04:17:04.086273 master-0 kubenswrapper[18592]: I0308 04:17:04.086203 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-afe2b-default-external-api-0"
Mar 08 04:17:04.121714 master-0 kubenswrapper[18592]: I0308 04:17:04.121176 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-afe2b-default-external-api-0"
Mar 08 04:17:04.659514 master-0 kubenswrapper[18592]: I0308 04:17:04.659442 18592 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 08 04:17:04.660184 master-0 kubenswrapper[18592]: I0308 04:17:04.660092 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-afe2b-default-external-api-0"
Mar 08 04:17:04.660184 master-0 kubenswrapper[18592]: I0308 04:17:04.660140 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-afe2b-default-external-api-0"
Mar 08 04:17:05.345142 master-0 kubenswrapper[18592]: I0308 04:17:05.344215 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-afe2b-default-internal-api-0"
Mar 08 04:17:05.634488 master-0 kubenswrapper[18592]: I0308 04:17:05.634347 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-afe2b-default-internal-api-0"
Mar 08 04:17:06.629007 master-0 kubenswrapper[18592]: I0308 04:17:06.628678 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-afe2b-default-external-api-0"
Mar 08 04:17:06.649590 master-0 kubenswrapper[18592]: I0308 04:17:06.649524 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-afe2b-default-external-api-0"
Mar 08 04:17:09.553541 master-0 kubenswrapper[18592]: I0308 04:17:09.553485 18592 scope.go:117] "RemoveContainer" containerID="caba106c67fe491783d1663e22951fe25776b9cb8db793eb30d9ddb21dc94d5c"
Mar 08 04:17:09.637564 master-0 kubenswrapper[18592]: I0308 04:17:09.637163 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-7ttxm"
Mar 08 04:17:09.671409 master-0 kubenswrapper[18592]: I0308 04:17:09.671346 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad7d2ffe-d3ac-4a74-ae47-46241d6c4769-logs\") pod \"ad7d2ffe-d3ac-4a74-ae47-46241d6c4769\" (UID: \"ad7d2ffe-d3ac-4a74-ae47-46241d6c4769\") "
Mar 08 04:17:09.671619 master-0 kubenswrapper[18592]: I0308 04:17:09.671519 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwwgd\" (UniqueName: \"kubernetes.io/projected/ad7d2ffe-d3ac-4a74-ae47-46241d6c4769-kube-api-access-zwwgd\") pod \"ad7d2ffe-d3ac-4a74-ae47-46241d6c4769\" (UID: \"ad7d2ffe-d3ac-4a74-ae47-46241d6c4769\") "
Mar 08 04:17:09.671736 master-0 kubenswrapper[18592]: I0308 04:17:09.671699 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad7d2ffe-d3ac-4a74-ae47-46241d6c4769-logs" (OuterVolumeSpecName: "logs") pod "ad7d2ffe-d3ac-4a74-ae47-46241d6c4769" (UID: "ad7d2ffe-d3ac-4a74-ae47-46241d6c4769"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 04:17:09.671782 master-0 kubenswrapper[18592]: I0308 04:17:09.671718 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad7d2ffe-d3ac-4a74-ae47-46241d6c4769-scripts\") pod \"ad7d2ffe-d3ac-4a74-ae47-46241d6c4769\" (UID: \"ad7d2ffe-d3ac-4a74-ae47-46241d6c4769\") "
Mar 08 04:17:09.671815 master-0 kubenswrapper[18592]: I0308 04:17:09.671796 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad7d2ffe-d3ac-4a74-ae47-46241d6c4769-combined-ca-bundle\") pod \"ad7d2ffe-d3ac-4a74-ae47-46241d6c4769\" (UID: \"ad7d2ffe-d3ac-4a74-ae47-46241d6c4769\") "
Mar 08 04:17:09.671934 master-0 kubenswrapper[18592]: I0308 04:17:09.671913 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad7d2ffe-d3ac-4a74-ae47-46241d6c4769-config-data\") pod \"ad7d2ffe-d3ac-4a74-ae47-46241d6c4769\" (UID: \"ad7d2ffe-d3ac-4a74-ae47-46241d6c4769\") "
Mar 08 04:17:09.672517 master-0 kubenswrapper[18592]: I0308 04:17:09.672489 18592 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad7d2ffe-d3ac-4a74-ae47-46241d6c4769-logs\") on node \"master-0\" DevicePath \"\""
Mar 08 04:17:09.675336 master-0 kubenswrapper[18592]: I0308 04:17:09.675300 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad7d2ffe-d3ac-4a74-ae47-46241d6c4769-scripts" (OuterVolumeSpecName: "scripts") pod "ad7d2ffe-d3ac-4a74-ae47-46241d6c4769" (UID: "ad7d2ffe-d3ac-4a74-ae47-46241d6c4769"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 04:17:09.675521 master-0 kubenswrapper[18592]: I0308 04:17:09.675436 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad7d2ffe-d3ac-4a74-ae47-46241d6c4769-kube-api-access-zwwgd" (OuterVolumeSpecName: "kube-api-access-zwwgd") pod "ad7d2ffe-d3ac-4a74-ae47-46241d6c4769" (UID: "ad7d2ffe-d3ac-4a74-ae47-46241d6c4769"). InnerVolumeSpecName "kube-api-access-zwwgd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 04:17:09.696050 master-0 kubenswrapper[18592]: I0308 04:17:09.695979 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad7d2ffe-d3ac-4a74-ae47-46241d6c4769-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ad7d2ffe-d3ac-4a74-ae47-46241d6c4769" (UID: "ad7d2ffe-d3ac-4a74-ae47-46241d6c4769"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 04:17:09.718753 master-0 kubenswrapper[18592]: I0308 04:17:09.718653 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-7ttxm" event={"ID":"ad7d2ffe-d3ac-4a74-ae47-46241d6c4769","Type":"ContainerDied","Data":"4ec4ec720b8c14311ccb5daf4442b2945876cee8fd583600358c5bf9725d0763"}
Mar 08 04:17:09.718753 master-0 kubenswrapper[18592]: I0308 04:17:09.718665 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad7d2ffe-d3ac-4a74-ae47-46241d6c4769-config-data" (OuterVolumeSpecName: "config-data") pod "ad7d2ffe-d3ac-4a74-ae47-46241d6c4769" (UID: "ad7d2ffe-d3ac-4a74-ae47-46241d6c4769"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 04:17:09.718753 master-0 kubenswrapper[18592]: I0308 04:17:09.718695 18592 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ec4ec720b8c14311ccb5daf4442b2945876cee8fd583600358c5bf9725d0763"
Mar 08 04:17:09.719154 master-0 kubenswrapper[18592]: I0308 04:17:09.718743 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-7ttxm"
Mar 08 04:17:09.720954 master-0 kubenswrapper[18592]: I0308 04:17:09.720783 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xr25c" event={"ID":"3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf","Type":"ContainerStarted","Data":"8dd092c3801d74fc10d9ff2d4452b9536bbd19d4edebb5ba2b47319862defcac"}
Mar 08 04:17:09.775733 master-0 kubenswrapper[18592]: I0308 04:17:09.775653 18592 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad7d2ffe-d3ac-4a74-ae47-46241d6c4769-scripts\") on node \"master-0\" DevicePath \"\""
Mar 08 04:17:09.775954 master-0 kubenswrapper[18592]: I0308 04:17:09.775738 18592 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad7d2ffe-d3ac-4a74-ae47-46241d6c4769-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 08 04:17:09.775954 master-0 kubenswrapper[18592]: I0308 04:17:09.775762 18592 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad7d2ffe-d3ac-4a74-ae47-46241d6c4769-config-data\") on node \"master-0\" DevicePath \"\""
Mar 08 04:17:09.775954 master-0 kubenswrapper[18592]: I0308 04:17:09.775781 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zwwgd\" (UniqueName: \"kubernetes.io/projected/ad7d2ffe-d3ac-4a74-ae47-46241d6c4769-kube-api-access-zwwgd\") on node \"master-0\" DevicePath \"\""
Mar 08 04:17:11.001979 master-0 kubenswrapper[18592]: I0308 04:17:11.000364 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-57f9f57fc6-pfwg4"]
Mar 08 04:17:11.001979 master-0 kubenswrapper[18592]: E0308 04:17:11.001206 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad7d2ffe-d3ac-4a74-ae47-46241d6c4769" containerName="placement-db-sync"
Mar 08 04:17:11.001979 master-0 kubenswrapper[18592]: I0308 04:17:11.001228 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad7d2ffe-d3ac-4a74-ae47-46241d6c4769" containerName="placement-db-sync"
Mar 08 04:17:11.001979 master-0 kubenswrapper[18592]: E0308 04:17:11.001289 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bdf0b67-8d62-4dfc-ae07-730475c2a471" containerName="init"
Mar 08 04:17:11.001979 master-0 kubenswrapper[18592]: I0308 04:17:11.001300 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bdf0b67-8d62-4dfc-ae07-730475c2a471" containerName="init"
Mar 08 04:17:11.001979 master-0 kubenswrapper[18592]: E0308 04:17:11.001335 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bdf0b67-8d62-4dfc-ae07-730475c2a471" containerName="dnsmasq-dns"
Mar 08 04:17:11.001979 master-0 kubenswrapper[18592]: I0308 04:17:11.001370 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bdf0b67-8d62-4dfc-ae07-730475c2a471" containerName="dnsmasq-dns"
Mar 08 04:17:11.001979 master-0 kubenswrapper[18592]: I0308 04:17:11.001768 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad7d2ffe-d3ac-4a74-ae47-46241d6c4769" containerName="placement-db-sync"
Mar 08 04:17:11.001979 master-0 kubenswrapper[18592]: I0308 04:17:11.001813 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bdf0b67-8d62-4dfc-ae07-730475c2a471" containerName="dnsmasq-dns"
Mar 08 04:17:11.005320 master-0 kubenswrapper[18592]: I0308 04:17:11.003671 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-57f9f57fc6-pfwg4"
Mar 08 04:17:11.010333 master-0 kubenswrapper[18592]: I0308 04:17:11.008415 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Mar 08 04:17:11.010333 master-0 kubenswrapper[18592]: I0308 04:17:11.008633 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Mar 08 04:17:11.010333 master-0 kubenswrapper[18592]: I0308 04:17:11.008734 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc"
Mar 08 04:17:11.010333 master-0 kubenswrapper[18592]: I0308 04:17:11.008847 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc"
Mar 08 04:17:11.031923 master-0 kubenswrapper[18592]: I0308 04:17:11.029055 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-57f9f57fc6-pfwg4"]
Mar 08 04:17:11.105815 master-0 kubenswrapper[18592]: I0308 04:17:11.105760 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c77d97d-417a-48b7-a871-6800b12fbcb7-logs\") pod \"placement-57f9f57fc6-pfwg4\" (UID: \"9c77d97d-417a-48b7-a871-6800b12fbcb7\") " pod="openstack/placement-57f9f57fc6-pfwg4"
Mar 08 04:17:11.106031 master-0 kubenswrapper[18592]: I0308 04:17:11.105838 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c77d97d-417a-48b7-a871-6800b12fbcb7-scripts\") pod \"placement-57f9f57fc6-pfwg4\" (UID: \"9c77d97d-417a-48b7-a871-6800b12fbcb7\") " pod="openstack/placement-57f9f57fc6-pfwg4"
Mar 08 04:17:11.106113 master-0 kubenswrapper[18592]: I0308 04:17:11.106062 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c77d97d-417a-48b7-a871-6800b12fbcb7-public-tls-certs\") pod \"placement-57f9f57fc6-pfwg4\" (UID: \"9c77d97d-417a-48b7-a871-6800b12fbcb7\") " pod="openstack/placement-57f9f57fc6-pfwg4"
Mar 08 04:17:11.106190 master-0 kubenswrapper[18592]: I0308 04:17:11.106168 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c77d97d-417a-48b7-a871-6800b12fbcb7-internal-tls-certs\") pod \"placement-57f9f57fc6-pfwg4\" (UID: \"9c77d97d-417a-48b7-a871-6800b12fbcb7\") " pod="openstack/placement-57f9f57fc6-pfwg4"
Mar 08 04:17:11.106238 master-0 kubenswrapper[18592]: I0308 04:17:11.106203 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c77d97d-417a-48b7-a871-6800b12fbcb7-combined-ca-bundle\") pod \"placement-57f9f57fc6-pfwg4\" (UID: \"9c77d97d-417a-48b7-a871-6800b12fbcb7\") " pod="openstack/placement-57f9f57fc6-pfwg4"
Mar 08 04:17:11.106274 master-0 kubenswrapper[18592]: I0308 04:17:11.106235 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlbnw\" (UniqueName: \"kubernetes.io/projected/9c77d97d-417a-48b7-a871-6800b12fbcb7-kube-api-access-rlbnw\") pod \"placement-57f9f57fc6-pfwg4\" (UID: \"9c77d97d-417a-48b7-a871-6800b12fbcb7\") " pod="openstack/placement-57f9f57fc6-pfwg4"
Mar 08 04:17:11.106307 master-0 kubenswrapper[18592]: I0308 04:17:11.106282 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c77d97d-417a-48b7-a871-6800b12fbcb7-config-data\") pod \"placement-57f9f57fc6-pfwg4\" (UID: \"9c77d97d-417a-48b7-a871-6800b12fbcb7\") " pod="openstack/placement-57f9f57fc6-pfwg4"
Mar 08 04:17:11.215402 master-0 kubenswrapper[18592]: I0308 04:17:11.215355 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c77d97d-417a-48b7-a871-6800b12fbcb7-logs\") pod \"placement-57f9f57fc6-pfwg4\" (UID: \"9c77d97d-417a-48b7-a871-6800b12fbcb7\") " pod="openstack/placement-57f9f57fc6-pfwg4"
Mar 08 04:17:11.215402 master-0 kubenswrapper[18592]: I0308 04:17:11.215410 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c77d97d-417a-48b7-a871-6800b12fbcb7-scripts\") pod \"placement-57f9f57fc6-pfwg4\" (UID: \"9c77d97d-417a-48b7-a871-6800b12fbcb7\") " pod="openstack/placement-57f9f57fc6-pfwg4"
Mar 08 04:17:11.215772 master-0 kubenswrapper[18592]: I0308 04:17:11.215524 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c77d97d-417a-48b7-a871-6800b12fbcb7-public-tls-certs\") pod \"placement-57f9f57fc6-pfwg4\" (UID: \"9c77d97d-417a-48b7-a871-6800b12fbcb7\") " pod="openstack/placement-57f9f57fc6-pfwg4"
Mar 08 04:17:11.215772 master-0 kubenswrapper[18592]: I0308 04:17:11.215755 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c77d97d-417a-48b7-a871-6800b12fbcb7-logs\") pod \"placement-57f9f57fc6-pfwg4\" (UID: \"9c77d97d-417a-48b7-a871-6800b12fbcb7\") " pod="openstack/placement-57f9f57fc6-pfwg4"
Mar 08 04:17:11.215911 master-0 kubenswrapper[18592]: I0308 04:17:11.215800 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c77d97d-417a-48b7-a871-6800b12fbcb7-internal-tls-certs\") pod \"placement-57f9f57fc6-pfwg4\" (UID: \"9c77d97d-417a-48b7-a871-6800b12fbcb7\") " pod="openstack/placement-57f9f57fc6-pfwg4"
Mar 08 04:17:11.215967 master-0 kubenswrapper[18592]: I0308 04:17:11.215943 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c77d97d-417a-48b7-a871-6800b12fbcb7-combined-ca-bundle\") pod \"placement-57f9f57fc6-pfwg4\" (UID: \"9c77d97d-417a-48b7-a871-6800b12fbcb7\") " pod="openstack/placement-57f9f57fc6-pfwg4"
Mar 08 04:17:11.216016 master-0 kubenswrapper[18592]: I0308 04:17:11.215991 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlbnw\" (UniqueName: \"kubernetes.io/projected/9c77d97d-417a-48b7-a871-6800b12fbcb7-kube-api-access-rlbnw\") pod \"placement-57f9f57fc6-pfwg4\" (UID: \"9c77d97d-417a-48b7-a871-6800b12fbcb7\") " pod="openstack/placement-57f9f57fc6-pfwg4"
Mar 08 04:17:11.216099 master-0 kubenswrapper[18592]: I0308 04:17:11.216074 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c77d97d-417a-48b7-a871-6800b12fbcb7-config-data\") pod \"placement-57f9f57fc6-pfwg4\" (UID: \"9c77d97d-417a-48b7-a871-6800b12fbcb7\") " pod="openstack/placement-57f9f57fc6-pfwg4"
Mar 08 04:17:11.220587 master-0 kubenswrapper[18592]: I0308 04:17:11.220424 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c77d97d-417a-48b7-a871-6800b12fbcb7-public-tls-certs\") pod \"placement-57f9f57fc6-pfwg4\" (UID: \"9c77d97d-417a-48b7-a871-6800b12fbcb7\") " pod="openstack/placement-57f9f57fc6-pfwg4"
Mar 08 04:17:11.221034 master-0 kubenswrapper[18592]: I0308 04:17:11.220958 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c77d97d-417a-48b7-a871-6800b12fbcb7-scripts\") pod \"placement-57f9f57fc6-pfwg4\" (UID: \"9c77d97d-417a-48b7-a871-6800b12fbcb7\") " pod="openstack/placement-57f9f57fc6-pfwg4"
Mar 08 04:17:11.221219 master-0 kubenswrapper[18592]: I0308 04:17:11.221184 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c77d97d-417a-48b7-a871-6800b12fbcb7-config-data\") pod \"placement-57f9f57fc6-pfwg4\" (UID: \"9c77d97d-417a-48b7-a871-6800b12fbcb7\") " pod="openstack/placement-57f9f57fc6-pfwg4"
Mar 08 04:17:11.222078 master-0 kubenswrapper[18592]: I0308 04:17:11.222042 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c77d97d-417a-48b7-a871-6800b12fbcb7-combined-ca-bundle\") pod \"placement-57f9f57fc6-pfwg4\" (UID: \"9c77d97d-417a-48b7-a871-6800b12fbcb7\") " pod="openstack/placement-57f9f57fc6-pfwg4"
Mar 08 04:17:11.223287 master-0 kubenswrapper[18592]: I0308 04:17:11.223262 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c77d97d-417a-48b7-a871-6800b12fbcb7-internal-tls-certs\") pod \"placement-57f9f57fc6-pfwg4\" (UID: \"9c77d97d-417a-48b7-a871-6800b12fbcb7\") " pod="openstack/placement-57f9f57fc6-pfwg4"
Mar 08 04:17:11.239552 master-0 kubenswrapper[18592]: I0308 04:17:11.239507 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlbnw\" (UniqueName: \"kubernetes.io/projected/9c77d97d-417a-48b7-a871-6800b12fbcb7-kube-api-access-rlbnw\") pod \"placement-57f9f57fc6-pfwg4\" (UID: \"9c77d97d-417a-48b7-a871-6800b12fbcb7\") " pod="openstack/placement-57f9f57fc6-pfwg4"
Mar 08 04:17:11.330168 master-0 kubenswrapper[18592]: I0308 04:17:11.330048 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-57f9f57fc6-pfwg4"
Mar 08 04:17:12.101506 master-0 kubenswrapper[18592]: I0308 04:17:12.101441 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-57f9f57fc6-pfwg4"]
Mar 08 04:17:12.770547 master-0 kubenswrapper[18592]: I0308 04:17:12.770458 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ff301-db-sync-bgkm7" event={"ID":"117a9c49-cd48-4a2c-bdee-10bb60588b20","Type":"ContainerStarted","Data":"99a98feadbf2914137693643285f4a1f8a079354fbc62898f0077f7d7a80a784"}
Mar 08 04:17:12.772702 master-0 kubenswrapper[18592]: I0308 04:17:12.772644 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xr25c" event={"ID":"3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf","Type":"ContainerStarted","Data":"c4b1d930d8c3a49e276f6417985c5bdea4b518aeb0512687ad85f96adb6eb5ba"}
Mar 08 04:17:12.777261 master-0 kubenswrapper[18592]: I0308 04:17:12.777192 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-57f9f57fc6-pfwg4" event={"ID":"9c77d97d-417a-48b7-a871-6800b12fbcb7","Type":"ContainerStarted","Data":"0f947b349b8be56666ad7c86e4264fb8308fb6f43e75ff44640009b2c1c98fbf"}
Mar 08 04:17:12.777453 master-0 kubenswrapper[18592]: I0308 04:17:12.777293 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-57f9f57fc6-pfwg4" event={"ID":"9c77d97d-417a-48b7-a871-6800b12fbcb7","Type":"ContainerStarted","Data":"921d18396cd77485dd40de28e9920f77cdd014dc6f721b25d815f0d811d8d65b"}
Mar 08 04:17:12.777453 master-0 kubenswrapper[18592]: I0308 04:17:12.777317 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-57f9f57fc6-pfwg4" event={"ID":"9c77d97d-417a-48b7-a871-6800b12fbcb7","Type":"ContainerStarted","Data":"e60e5c23a54b956e63f7848ad5385e7127ffdf922977750f5b2a5d49ffe3bc43"}
Mar 08 04:17:12.777453 master-0 kubenswrapper[18592]: I0308 04:17:12.777386 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-57f9f57fc6-pfwg4"
Mar 08 04:17:12.779422 master-0 kubenswrapper[18592]: I0308 04:17:12.779330 18592 generic.go:334] "Generic (PLEG): container finished" podID="c11a4533-a895-42a7-8c17-d6f421276ae0" containerID="681a96c101abae53751c5a962ecc82bf1bc3fb04bf52b8267a5914f994ae00ad" exitCode=0
Mar 08 04:17:12.779514 master-0 kubenswrapper[18592]: I0308 04:17:12.779376 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-sync-78kcs" event={"ID":"c11a4533-a895-42a7-8c17-d6f421276ae0","Type":"ContainerDied","Data":"681a96c101abae53751c5a962ecc82bf1bc3fb04bf52b8267a5914f994ae00ad"}
Mar 08 04:17:12.802558 master-0 kubenswrapper[18592]: I0308 04:17:12.802446 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-ff301-db-sync-bgkm7" podStartSLOduration=3.207622455 podStartE2EDuration="27.802420487s" podCreationTimestamp="2026-03-08 04:16:45 +0000 UTC" firstStartedPulling="2026-03-08 04:16:47.015276995 +0000 UTC m=+1419.114031345" lastFinishedPulling="2026-03-08 04:17:11.610075027 +0000 UTC m=+1443.708829377" observedRunningTime="2026-03-08 04:17:12.795954771 +0000 UTC m=+1444.894709171" watchObservedRunningTime="2026-03-08 04:17:12.802420487 +0000 UTC m=+1444.901174867"
Mar 08 04:17:12.841907 master-0 kubenswrapper[18592]: I0308 04:17:12.838784 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-57f9f57fc6-pfwg4" podStartSLOduration=2.8387611440000002 podStartE2EDuration="2.838761144s" podCreationTimestamp="2026-03-08 04:17:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 04:17:12.824652721 +0000 UTC m=+1444.923407071" watchObservedRunningTime="2026-03-08 04:17:12.838761144 +0000 UTC m=+1444.937515504"
Mar 08 04:17:13.802505 master-0 kubenswrapper[18592]: I0308 04:17:13.802443 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-sync-78kcs" event={"ID":"c11a4533-a895-42a7-8c17-d6f421276ae0","Type":"ContainerStarted","Data":"490068217b99e1b9eb5597b0547cfb888c0fa4b0491eeaa5a5cf7358d93d627e"}
Mar 08 04:17:13.805553 master-0 kubenswrapper[18592]: I0308 04:17:13.805508 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-57f9f57fc6-pfwg4"
Mar 08 04:17:13.840194 master-0 kubenswrapper[18592]: I0308 04:17:13.840096 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-xr25c" podStartSLOduration=18.840075163 podStartE2EDuration="18.840075163s" podCreationTimestamp="2026-03-08 04:16:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 04:17:12.896542935 +0000 UTC m=+1444.995297285" watchObservedRunningTime="2026-03-08 04:17:13.840075163 +0000 UTC m=+1445.938829523"
Mar 08 04:17:13.850338 master-0 kubenswrapper[18592]: I0308 04:17:13.850256 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-db-sync-78kcs" podStartSLOduration=3.55770713 podStartE2EDuration="18.850022093s" podCreationTimestamp="2026-03-08 04:16:55 +0000 UTC" firstStartedPulling="2026-03-08 04:16:56.31615024 +0000 UTC m=+1428.414904590" lastFinishedPulling="2026-03-08 04:17:11.608465203 +0000 UTC m=+1443.707219553" observedRunningTime="2026-03-08 04:17:13.838365497 +0000 UTC m=+1445.937119857" watchObservedRunningTime="2026-03-08 04:17:13.850022093 +0000 UTC m=+1445.948776453"
Mar 08 04:17:14.812788 master-0 kubenswrapper[18592]: I0308 04:17:14.812684 18592 generic.go:334] "Generic (PLEG): container finished" podID="3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf" containerID="c4b1d930d8c3a49e276f6417985c5bdea4b518aeb0512687ad85f96adb6eb5ba" exitCode=0
Mar 08 04:17:14.813776 master-0 kubenswrapper[18592]: I0308 04:17:14.813748 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xr25c" event={"ID":"3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf","Type":"ContainerDied","Data":"c4b1d930d8c3a49e276f6417985c5bdea4b518aeb0512687ad85f96adb6eb5ba"}
Mar 08 04:17:15.831912 master-0 kubenswrapper[18592]: I0308 04:17:15.831840 18592 generic.go:334] "Generic (PLEG): container finished" podID="d0eaefda-7b75-43b9-8d18-8c1476db321d" containerID="fddf0c276ff3e965c6a7d21cfe20238c47c133412bb2794d15ef17ace13bbc58" exitCode=0
Mar 08 04:17:15.832708 master-0 kubenswrapper[18592]: I0308 04:17:15.831917 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-vhcff" event={"ID":"d0eaefda-7b75-43b9-8d18-8c1476db321d","Type":"ContainerDied","Data":"fddf0c276ff3e965c6a7d21cfe20238c47c133412bb2794d15ef17ace13bbc58"}
Mar 08 04:17:16.295082 master-0 kubenswrapper[18592]: I0308 04:17:16.295014 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xr25c"
Mar 08 04:17:16.341610 master-0 kubenswrapper[18592]: I0308 04:17:16.341549 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nl74c\" (UniqueName: \"kubernetes.io/projected/3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf-kube-api-access-nl74c\") pod \"3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf\" (UID: \"3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf\") "
Mar 08 04:17:16.341885 master-0 kubenswrapper[18592]: I0308 04:17:16.341628 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf-credential-keys\") pod \"3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf\" (UID: \"3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf\") "
Mar 08 04:17:16.341885 master-0 kubenswrapper[18592]: I0308 04:17:16.341676 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf-combined-ca-bundle\") pod \"3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf\" (UID: \"3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf\") "
Mar 08 04:17:16.341885 master-0 kubenswrapper[18592]: I0308 04:17:16.341749 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf-scripts\") pod \"3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf\" (UID: \"3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf\") "
Mar 08 04:17:16.342045 master-0 kubenswrapper[18592]: I0308 04:17:16.341958 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf-config-data\") pod \"3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf\" (UID: \"3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf\") "
Mar 08 04:17:16.342045 master-0 kubenswrapper[18592]: I0308 04:17:16.342005 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf-fernet-keys\") pod \"3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf\" (UID: \"3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf\") "
Mar 08 04:17:16.357745 master-0 kubenswrapper[18592]: I0308 04:17:16.356696 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf" (UID: "3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 04:17:16.360212 master-0 kubenswrapper[18592]: I0308 04:17:16.360156 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf-kube-api-access-nl74c" (OuterVolumeSpecName: "kube-api-access-nl74c") pod "3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf" (UID: "3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf"). InnerVolumeSpecName "kube-api-access-nl74c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 04:17:16.364508 master-0 kubenswrapper[18592]: I0308 04:17:16.364425 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf" (UID: "3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 04:17:16.367364 master-0 kubenswrapper[18592]: I0308 04:17:16.367074 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf-scripts" (OuterVolumeSpecName: "scripts") pod "3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf" (UID: "3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 04:17:16.393299 master-0 kubenswrapper[18592]: I0308 04:17:16.393231 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf" (UID: "3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 04:17:16.394079 master-0 kubenswrapper[18592]: I0308 04:17:16.393756 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf-config-data" (OuterVolumeSpecName: "config-data") pod "3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf" (UID: "3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 04:17:16.445123 master-0 kubenswrapper[18592]: I0308 04:17:16.445058 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nl74c\" (UniqueName: \"kubernetes.io/projected/3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf-kube-api-access-nl74c\") on node \"master-0\" DevicePath \"\""
Mar 08 04:17:16.445123 master-0 kubenswrapper[18592]: I0308 04:17:16.445114 18592 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf-credential-keys\") on node \"master-0\" DevicePath \"\""
Mar 08 04:17:16.445123 master-0 kubenswrapper[18592]: I0308 04:17:16.445128 18592 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 08 04:17:16.445420 master-0 kubenswrapper[18592]: I0308 04:17:16.445141 18592 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf-scripts\") on node \"master-0\" DevicePath \"\""
Mar 08 04:17:16.445420 master-0 kubenswrapper[18592]: I0308 04:17:16.445154 18592 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf-config-data\") on node \"master-0\" DevicePath \"\""
Mar 08 04:17:16.445420 master-0 kubenswrapper[18592]: I0308 04:17:16.445166 18592 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf-fernet-keys\") on node \"master-0\" DevicePath \"\""
Mar 08 04:17:16.855877 master-0 kubenswrapper[18592]: I0308 04:17:16.855745 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xr25c"
Mar 08 04:17:16.856713 master-0 kubenswrapper[18592]: I0308 04:17:16.856199 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xr25c" event={"ID":"3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf","Type":"ContainerDied","Data":"8dd092c3801d74fc10d9ff2d4452b9536bbd19d4edebb5ba2b47319862defcac"}
Mar 08 04:17:16.856713 master-0 kubenswrapper[18592]: I0308 04:17:16.856232 18592 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8dd092c3801d74fc10d9ff2d4452b9536bbd19d4edebb5ba2b47319862defcac"
Mar 08 04:17:17.312684 master-0 kubenswrapper[18592]: I0308 04:17:17.312637 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-vhcff"
Mar 08 04:17:17.368331 master-0 kubenswrapper[18592]: I0308 04:17:17.367696 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0eaefda-7b75-43b9-8d18-8c1476db321d-combined-ca-bundle\") pod \"d0eaefda-7b75-43b9-8d18-8c1476db321d\" (UID: \"d0eaefda-7b75-43b9-8d18-8c1476db321d\") "
Mar 08 04:17:17.368331 master-0 kubenswrapper[18592]: I0308 04:17:17.367964 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wz6tr\" (UniqueName: \"kubernetes.io/projected/d0eaefda-7b75-43b9-8d18-8c1476db321d-kube-api-access-wz6tr\") pod \"d0eaefda-7b75-43b9-8d18-8c1476db321d\" (UID: \"d0eaefda-7b75-43b9-8d18-8c1476db321d\") "
Mar 08 04:17:17.368331 master-0 kubenswrapper[18592]: I0308 04:17:17.368176 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d0eaefda-7b75-43b9-8d18-8c1476db321d-config\") pod \"d0eaefda-7b75-43b9-8d18-8c1476db321d\" (UID: \"d0eaefda-7b75-43b9-8d18-8c1476db321d\") "
Mar 08 04:17:17.372614 master-0 kubenswrapper[18592]: I0308 04:17:17.372555 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0eaefda-7b75-43b9-8d18-8c1476db321d-kube-api-access-wz6tr" (OuterVolumeSpecName: "kube-api-access-wz6tr") pod "d0eaefda-7b75-43b9-8d18-8c1476db321d" (UID: "d0eaefda-7b75-43b9-8d18-8c1476db321d"). InnerVolumeSpecName "kube-api-access-wz6tr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 04:17:17.389093 master-0 kubenswrapper[18592]: E0308 04:17:17.389022 18592 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0eaefda-7b75-43b9-8d18-8c1476db321d-config podName:d0eaefda-7b75-43b9-8d18-8c1476db321d nodeName:}" failed. No retries permitted until 2026-03-08 04:17:17.888994079 +0000 UTC m=+1449.987748429 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "config" (UniqueName: "kubernetes.io/secret/d0eaefda-7b75-43b9-8d18-8c1476db321d-config") pod "d0eaefda-7b75-43b9-8d18-8c1476db321d" (UID: "d0eaefda-7b75-43b9-8d18-8c1476db321d") : error deleting /var/lib/kubelet/pods/d0eaefda-7b75-43b9-8d18-8c1476db321d/volume-subpaths: remove /var/lib/kubelet/pods/d0eaefda-7b75-43b9-8d18-8c1476db321d/volume-subpaths: no such file or directory
Mar 08 04:17:17.392003 master-0 kubenswrapper[18592]: I0308 04:17:17.391946 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0eaefda-7b75-43b9-8d18-8c1476db321d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d0eaefda-7b75-43b9-8d18-8c1476db321d" (UID: "d0eaefda-7b75-43b9-8d18-8c1476db321d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 04:17:17.472475 master-0 kubenswrapper[18592]: I0308 04:17:17.472358 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wz6tr\" (UniqueName: \"kubernetes.io/projected/d0eaefda-7b75-43b9-8d18-8c1476db321d-kube-api-access-wz6tr\") on node \"master-0\" DevicePath \"\""
Mar 08 04:17:17.472475 master-0 kubenswrapper[18592]: I0308 04:17:17.472406 18592 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0eaefda-7b75-43b9-8d18-8c1476db321d-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 08 04:17:17.512891 master-0 kubenswrapper[18592]: I0308 04:17:17.512807 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-56d7944fd-t2xf9"]
Mar 08 04:17:17.513309 master-0 kubenswrapper[18592]: E0308 04:17:17.513277 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf" containerName="keystone-bootstrap" Mar 08 
04:17:17.513309 master-0 kubenswrapper[18592]: I0308 04:17:17.513300 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf" containerName="keystone-bootstrap" Mar 08 04:17:17.513400 master-0 kubenswrapper[18592]: E0308 04:17:17.513333 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0eaefda-7b75-43b9-8d18-8c1476db321d" containerName="neutron-db-sync" Mar 08 04:17:17.513400 master-0 kubenswrapper[18592]: I0308 04:17:17.513340 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0eaefda-7b75-43b9-8d18-8c1476db321d" containerName="neutron-db-sync" Mar 08 04:17:17.513619 master-0 kubenswrapper[18592]: I0308 04:17:17.513598 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0eaefda-7b75-43b9-8d18-8c1476db321d" containerName="neutron-db-sync" Mar 08 04:17:17.513669 master-0 kubenswrapper[18592]: I0308 04:17:17.513630 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf" containerName="keystone-bootstrap" Mar 08 04:17:17.514394 master-0 kubenswrapper[18592]: I0308 04:17:17.514354 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-56d7944fd-t2xf9" Mar 08 04:17:17.516180 master-0 kubenswrapper[18592]: I0308 04:17:17.516143 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 08 04:17:17.516280 master-0 kubenswrapper[18592]: I0308 04:17:17.516253 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 08 04:17:17.516324 master-0 kubenswrapper[18592]: I0308 04:17:17.516262 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 08 04:17:17.516363 master-0 kubenswrapper[18592]: I0308 04:17:17.516349 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 08 04:17:17.521208 master-0 kubenswrapper[18592]: I0308 04:17:17.521167 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 08 04:17:17.534557 master-0 kubenswrapper[18592]: I0308 04:17:17.534505 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-56d7944fd-t2xf9"] Mar 08 04:17:17.585516 master-0 kubenswrapper[18592]: I0308 04:17:17.585448 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/081bf4e2-02c9-4cca-b699-6148f8aaa219-internal-tls-certs\") pod \"keystone-56d7944fd-t2xf9\" (UID: \"081bf4e2-02c9-4cca-b699-6148f8aaa219\") " pod="openstack/keystone-56d7944fd-t2xf9" Mar 08 04:17:17.585516 master-0 kubenswrapper[18592]: I0308 04:17:17.585517 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/081bf4e2-02c9-4cca-b699-6148f8aaa219-combined-ca-bundle\") pod \"keystone-56d7944fd-t2xf9\" (UID: \"081bf4e2-02c9-4cca-b699-6148f8aaa219\") " pod="openstack/keystone-56d7944fd-t2xf9" Mar 08 04:17:17.585790 master-0 
kubenswrapper[18592]: I0308 04:17:17.585553 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/081bf4e2-02c9-4cca-b699-6148f8aaa219-scripts\") pod \"keystone-56d7944fd-t2xf9\" (UID: \"081bf4e2-02c9-4cca-b699-6148f8aaa219\") " pod="openstack/keystone-56d7944fd-t2xf9" Mar 08 04:17:17.585790 master-0 kubenswrapper[18592]: I0308 04:17:17.585586 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/081bf4e2-02c9-4cca-b699-6148f8aaa219-public-tls-certs\") pod \"keystone-56d7944fd-t2xf9\" (UID: \"081bf4e2-02c9-4cca-b699-6148f8aaa219\") " pod="openstack/keystone-56d7944fd-t2xf9" Mar 08 04:17:17.585790 master-0 kubenswrapper[18592]: I0308 04:17:17.585619 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/081bf4e2-02c9-4cca-b699-6148f8aaa219-fernet-keys\") pod \"keystone-56d7944fd-t2xf9\" (UID: \"081bf4e2-02c9-4cca-b699-6148f8aaa219\") " pod="openstack/keystone-56d7944fd-t2xf9" Mar 08 04:17:17.585790 master-0 kubenswrapper[18592]: I0308 04:17:17.585666 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/081bf4e2-02c9-4cca-b699-6148f8aaa219-config-data\") pod \"keystone-56d7944fd-t2xf9\" (UID: \"081bf4e2-02c9-4cca-b699-6148f8aaa219\") " pod="openstack/keystone-56d7944fd-t2xf9" Mar 08 04:17:17.585790 master-0 kubenswrapper[18592]: I0308 04:17:17.585723 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/081bf4e2-02c9-4cca-b699-6148f8aaa219-credential-keys\") pod \"keystone-56d7944fd-t2xf9\" (UID: \"081bf4e2-02c9-4cca-b699-6148f8aaa219\") " 
pod="openstack/keystone-56d7944fd-t2xf9" Mar 08 04:17:17.585790 master-0 kubenswrapper[18592]: I0308 04:17:17.585791 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzmvs\" (UniqueName: \"kubernetes.io/projected/081bf4e2-02c9-4cca-b699-6148f8aaa219-kube-api-access-tzmvs\") pod \"keystone-56d7944fd-t2xf9\" (UID: \"081bf4e2-02c9-4cca-b699-6148f8aaa219\") " pod="openstack/keystone-56d7944fd-t2xf9" Mar 08 04:17:17.688053 master-0 kubenswrapper[18592]: I0308 04:17:17.687981 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/081bf4e2-02c9-4cca-b699-6148f8aaa219-credential-keys\") pod \"keystone-56d7944fd-t2xf9\" (UID: \"081bf4e2-02c9-4cca-b699-6148f8aaa219\") " pod="openstack/keystone-56d7944fd-t2xf9" Mar 08 04:17:17.688309 master-0 kubenswrapper[18592]: I0308 04:17:17.688108 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzmvs\" (UniqueName: \"kubernetes.io/projected/081bf4e2-02c9-4cca-b699-6148f8aaa219-kube-api-access-tzmvs\") pod \"keystone-56d7944fd-t2xf9\" (UID: \"081bf4e2-02c9-4cca-b699-6148f8aaa219\") " pod="openstack/keystone-56d7944fd-t2xf9" Mar 08 04:17:17.688309 master-0 kubenswrapper[18592]: I0308 04:17:17.688168 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/081bf4e2-02c9-4cca-b699-6148f8aaa219-internal-tls-certs\") pod \"keystone-56d7944fd-t2xf9\" (UID: \"081bf4e2-02c9-4cca-b699-6148f8aaa219\") " pod="openstack/keystone-56d7944fd-t2xf9" Mar 08 04:17:17.688309 master-0 kubenswrapper[18592]: I0308 04:17:17.688206 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/081bf4e2-02c9-4cca-b699-6148f8aaa219-combined-ca-bundle\") pod \"keystone-56d7944fd-t2xf9\" (UID: 
\"081bf4e2-02c9-4cca-b699-6148f8aaa219\") " pod="openstack/keystone-56d7944fd-t2xf9" Mar 08 04:17:17.688309 master-0 kubenswrapper[18592]: I0308 04:17:17.688244 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/081bf4e2-02c9-4cca-b699-6148f8aaa219-scripts\") pod \"keystone-56d7944fd-t2xf9\" (UID: \"081bf4e2-02c9-4cca-b699-6148f8aaa219\") " pod="openstack/keystone-56d7944fd-t2xf9" Mar 08 04:17:17.688309 master-0 kubenswrapper[18592]: I0308 04:17:17.688280 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/081bf4e2-02c9-4cca-b699-6148f8aaa219-public-tls-certs\") pod \"keystone-56d7944fd-t2xf9\" (UID: \"081bf4e2-02c9-4cca-b699-6148f8aaa219\") " pod="openstack/keystone-56d7944fd-t2xf9" Mar 08 04:17:17.688526 master-0 kubenswrapper[18592]: I0308 04:17:17.688330 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/081bf4e2-02c9-4cca-b699-6148f8aaa219-fernet-keys\") pod \"keystone-56d7944fd-t2xf9\" (UID: \"081bf4e2-02c9-4cca-b699-6148f8aaa219\") " pod="openstack/keystone-56d7944fd-t2xf9" Mar 08 04:17:17.688526 master-0 kubenswrapper[18592]: I0308 04:17:17.688384 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/081bf4e2-02c9-4cca-b699-6148f8aaa219-config-data\") pod \"keystone-56d7944fd-t2xf9\" (UID: \"081bf4e2-02c9-4cca-b699-6148f8aaa219\") " pod="openstack/keystone-56d7944fd-t2xf9" Mar 08 04:17:17.692498 master-0 kubenswrapper[18592]: I0308 04:17:17.692454 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/081bf4e2-02c9-4cca-b699-6148f8aaa219-public-tls-certs\") pod \"keystone-56d7944fd-t2xf9\" (UID: \"081bf4e2-02c9-4cca-b699-6148f8aaa219\") " 
pod="openstack/keystone-56d7944fd-t2xf9" Mar 08 04:17:17.694653 master-0 kubenswrapper[18592]: I0308 04:17:17.693761 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/081bf4e2-02c9-4cca-b699-6148f8aaa219-combined-ca-bundle\") pod \"keystone-56d7944fd-t2xf9\" (UID: \"081bf4e2-02c9-4cca-b699-6148f8aaa219\") " pod="openstack/keystone-56d7944fd-t2xf9" Mar 08 04:17:17.695120 master-0 kubenswrapper[18592]: I0308 04:17:17.695056 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/081bf4e2-02c9-4cca-b699-6148f8aaa219-scripts\") pod \"keystone-56d7944fd-t2xf9\" (UID: \"081bf4e2-02c9-4cca-b699-6148f8aaa219\") " pod="openstack/keystone-56d7944fd-t2xf9" Mar 08 04:17:17.695615 master-0 kubenswrapper[18592]: I0308 04:17:17.695576 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/081bf4e2-02c9-4cca-b699-6148f8aaa219-fernet-keys\") pod \"keystone-56d7944fd-t2xf9\" (UID: \"081bf4e2-02c9-4cca-b699-6148f8aaa219\") " pod="openstack/keystone-56d7944fd-t2xf9" Mar 08 04:17:17.696360 master-0 kubenswrapper[18592]: I0308 04:17:17.696324 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/081bf4e2-02c9-4cca-b699-6148f8aaa219-config-data\") pod \"keystone-56d7944fd-t2xf9\" (UID: \"081bf4e2-02c9-4cca-b699-6148f8aaa219\") " pod="openstack/keystone-56d7944fd-t2xf9" Mar 08 04:17:17.697309 master-0 kubenswrapper[18592]: I0308 04:17:17.697250 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/081bf4e2-02c9-4cca-b699-6148f8aaa219-credential-keys\") pod \"keystone-56d7944fd-t2xf9\" (UID: \"081bf4e2-02c9-4cca-b699-6148f8aaa219\") " pod="openstack/keystone-56d7944fd-t2xf9" Mar 08 04:17:17.699176 master-0 kubenswrapper[18592]: 
I0308 04:17:17.699151 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/081bf4e2-02c9-4cca-b699-6148f8aaa219-internal-tls-certs\") pod \"keystone-56d7944fd-t2xf9\" (UID: \"081bf4e2-02c9-4cca-b699-6148f8aaa219\") " pod="openstack/keystone-56d7944fd-t2xf9" Mar 08 04:17:17.707905 master-0 kubenswrapper[18592]: I0308 04:17:17.707321 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzmvs\" (UniqueName: \"kubernetes.io/projected/081bf4e2-02c9-4cca-b699-6148f8aaa219-kube-api-access-tzmvs\") pod \"keystone-56d7944fd-t2xf9\" (UID: \"081bf4e2-02c9-4cca-b699-6148f8aaa219\") " pod="openstack/keystone-56d7944fd-t2xf9" Mar 08 04:17:17.866401 master-0 kubenswrapper[18592]: I0308 04:17:17.866283 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-vhcff" event={"ID":"d0eaefda-7b75-43b9-8d18-8c1476db321d","Type":"ContainerDied","Data":"3c4c3a7d2636242bfc2a1d3a1e5905ae0bec590b3fcd546cfff36c40db486da7"} Mar 08 04:17:17.866401 master-0 kubenswrapper[18592]: I0308 04:17:17.866333 18592 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c4c3a7d2636242bfc2a1d3a1e5905ae0bec590b3fcd546cfff36c40db486da7" Mar 08 04:17:17.866401 master-0 kubenswrapper[18592]: I0308 04:17:17.866333 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-vhcff" Mar 08 04:17:17.871734 master-0 kubenswrapper[18592]: I0308 04:17:17.871682 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-56d7944fd-t2xf9" Mar 08 04:17:17.891565 master-0 kubenswrapper[18592]: I0308 04:17:17.891502 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d0eaefda-7b75-43b9-8d18-8c1476db321d-config\") pod \"d0eaefda-7b75-43b9-8d18-8c1476db321d\" (UID: \"d0eaefda-7b75-43b9-8d18-8c1476db321d\") " Mar 08 04:17:17.894949 master-0 kubenswrapper[18592]: I0308 04:17:17.894886 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0eaefda-7b75-43b9-8d18-8c1476db321d-config" (OuterVolumeSpecName: "config") pod "d0eaefda-7b75-43b9-8d18-8c1476db321d" (UID: "d0eaefda-7b75-43b9-8d18-8c1476db321d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:17:17.997835 master-0 kubenswrapper[18592]: I0308 04:17:17.997739 18592 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/d0eaefda-7b75-43b9-8d18-8c1476db321d-config\") on node \"master-0\" DevicePath \"\"" Mar 08 04:17:18.234690 master-0 kubenswrapper[18592]: I0308 04:17:18.234647 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7d65fdd857-mz6wd"] Mar 08 04:17:18.237058 master-0 kubenswrapper[18592]: I0308 04:17:18.237034 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d65fdd857-mz6wd" Mar 08 04:17:18.252566 master-0 kubenswrapper[18592]: I0308 04:17:18.249145 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d65fdd857-mz6wd"] Mar 08 04:17:18.328311 master-0 kubenswrapper[18592]: I0308 04:17:18.328267 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/065d1362-4389-473a-9b59-4d3b25153deb-config\") pod \"dnsmasq-dns-7d65fdd857-mz6wd\" (UID: \"065d1362-4389-473a-9b59-4d3b25153deb\") " pod="openstack/dnsmasq-dns-7d65fdd857-mz6wd" Mar 08 04:17:18.328580 master-0 kubenswrapper[18592]: I0308 04:17:18.328563 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/065d1362-4389-473a-9b59-4d3b25153deb-dns-swift-storage-0\") pod \"dnsmasq-dns-7d65fdd857-mz6wd\" (UID: \"065d1362-4389-473a-9b59-4d3b25153deb\") " pod="openstack/dnsmasq-dns-7d65fdd857-mz6wd" Mar 08 04:17:18.328773 master-0 kubenswrapper[18592]: I0308 04:17:18.328759 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/065d1362-4389-473a-9b59-4d3b25153deb-dns-svc\") pod \"dnsmasq-dns-7d65fdd857-mz6wd\" (UID: \"065d1362-4389-473a-9b59-4d3b25153deb\") " pod="openstack/dnsmasq-dns-7d65fdd857-mz6wd" Mar 08 04:17:18.328885 master-0 kubenswrapper[18592]: I0308 04:17:18.328872 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/065d1362-4389-473a-9b59-4d3b25153deb-ovsdbserver-sb\") pod \"dnsmasq-dns-7d65fdd857-mz6wd\" (UID: \"065d1362-4389-473a-9b59-4d3b25153deb\") " pod="openstack/dnsmasq-dns-7d65fdd857-mz6wd" Mar 08 04:17:18.329028 master-0 kubenswrapper[18592]: I0308 04:17:18.329015 18592 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/065d1362-4389-473a-9b59-4d3b25153deb-ovsdbserver-nb\") pod \"dnsmasq-dns-7d65fdd857-mz6wd\" (UID: \"065d1362-4389-473a-9b59-4d3b25153deb\") " pod="openstack/dnsmasq-dns-7d65fdd857-mz6wd" Mar 08 04:17:18.329108 master-0 kubenswrapper[18592]: I0308 04:17:18.329096 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26xxg\" (UniqueName: \"kubernetes.io/projected/065d1362-4389-473a-9b59-4d3b25153deb-kube-api-access-26xxg\") pod \"dnsmasq-dns-7d65fdd857-mz6wd\" (UID: \"065d1362-4389-473a-9b59-4d3b25153deb\") " pod="openstack/dnsmasq-dns-7d65fdd857-mz6wd" Mar 08 04:17:18.424354 master-0 kubenswrapper[18592]: I0308 04:17:18.424260 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-56d7944fd-t2xf9"] Mar 08 04:17:18.440050 master-0 kubenswrapper[18592]: I0308 04:17:18.431671 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/065d1362-4389-473a-9b59-4d3b25153deb-dns-svc\") pod \"dnsmasq-dns-7d65fdd857-mz6wd\" (UID: \"065d1362-4389-473a-9b59-4d3b25153deb\") " pod="openstack/dnsmasq-dns-7d65fdd857-mz6wd" Mar 08 04:17:18.440050 master-0 kubenswrapper[18592]: I0308 04:17:18.431806 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/065d1362-4389-473a-9b59-4d3b25153deb-ovsdbserver-sb\") pod \"dnsmasq-dns-7d65fdd857-mz6wd\" (UID: \"065d1362-4389-473a-9b59-4d3b25153deb\") " pod="openstack/dnsmasq-dns-7d65fdd857-mz6wd" Mar 08 04:17:18.440050 master-0 kubenswrapper[18592]: I0308 04:17:18.431896 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/065d1362-4389-473a-9b59-4d3b25153deb-ovsdbserver-nb\") pod \"dnsmasq-dns-7d65fdd857-mz6wd\" (UID: \"065d1362-4389-473a-9b59-4d3b25153deb\") " pod="openstack/dnsmasq-dns-7d65fdd857-mz6wd" Mar 08 04:17:18.440050 master-0 kubenswrapper[18592]: I0308 04:17:18.431942 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26xxg\" (UniqueName: \"kubernetes.io/projected/065d1362-4389-473a-9b59-4d3b25153deb-kube-api-access-26xxg\") pod \"dnsmasq-dns-7d65fdd857-mz6wd\" (UID: \"065d1362-4389-473a-9b59-4d3b25153deb\") " pod="openstack/dnsmasq-dns-7d65fdd857-mz6wd" Mar 08 04:17:18.440050 master-0 kubenswrapper[18592]: I0308 04:17:18.432019 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/065d1362-4389-473a-9b59-4d3b25153deb-config\") pod \"dnsmasq-dns-7d65fdd857-mz6wd\" (UID: \"065d1362-4389-473a-9b59-4d3b25153deb\") " pod="openstack/dnsmasq-dns-7d65fdd857-mz6wd" Mar 08 04:17:18.440050 master-0 kubenswrapper[18592]: I0308 04:17:18.432075 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/065d1362-4389-473a-9b59-4d3b25153deb-dns-swift-storage-0\") pod \"dnsmasq-dns-7d65fdd857-mz6wd\" (UID: \"065d1362-4389-473a-9b59-4d3b25153deb\") " pod="openstack/dnsmasq-dns-7d65fdd857-mz6wd" Mar 08 04:17:18.440050 master-0 kubenswrapper[18592]: I0308 04:17:18.432961 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/065d1362-4389-473a-9b59-4d3b25153deb-dns-svc\") pod \"dnsmasq-dns-7d65fdd857-mz6wd\" (UID: \"065d1362-4389-473a-9b59-4d3b25153deb\") " pod="openstack/dnsmasq-dns-7d65fdd857-mz6wd" Mar 08 04:17:18.440050 master-0 kubenswrapper[18592]: I0308 04:17:18.433231 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/065d1362-4389-473a-9b59-4d3b25153deb-ovsdbserver-nb\") pod \"dnsmasq-dns-7d65fdd857-mz6wd\" (UID: \"065d1362-4389-473a-9b59-4d3b25153deb\") " pod="openstack/dnsmasq-dns-7d65fdd857-mz6wd" Mar 08 04:17:18.440050 master-0 kubenswrapper[18592]: I0308 04:17:18.433711 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/065d1362-4389-473a-9b59-4d3b25153deb-ovsdbserver-sb\") pod \"dnsmasq-dns-7d65fdd857-mz6wd\" (UID: \"065d1362-4389-473a-9b59-4d3b25153deb\") " pod="openstack/dnsmasq-dns-7d65fdd857-mz6wd" Mar 08 04:17:18.440050 master-0 kubenswrapper[18592]: I0308 04:17:18.436307 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/065d1362-4389-473a-9b59-4d3b25153deb-config\") pod \"dnsmasq-dns-7d65fdd857-mz6wd\" (UID: \"065d1362-4389-473a-9b59-4d3b25153deb\") " pod="openstack/dnsmasq-dns-7d65fdd857-mz6wd" Mar 08 04:17:18.440050 master-0 kubenswrapper[18592]: I0308 04:17:18.436536 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/065d1362-4389-473a-9b59-4d3b25153deb-dns-swift-storage-0\") pod \"dnsmasq-dns-7d65fdd857-mz6wd\" (UID: \"065d1362-4389-473a-9b59-4d3b25153deb\") " pod="openstack/dnsmasq-dns-7d65fdd857-mz6wd" Mar 08 04:17:18.445786 master-0 kubenswrapper[18592]: W0308 04:17:18.445676 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod081bf4e2_02c9_4cca_b699_6148f8aaa219.slice/crio-bc678861c6205968669078fe14b94c137ac2399149a4cf06fe412eea1ee0094e WatchSource:0}: Error finding container bc678861c6205968669078fe14b94c137ac2399149a4cf06fe412eea1ee0094e: Status 404 returned error can't find the container with id bc678861c6205968669078fe14b94c137ac2399149a4cf06fe412eea1ee0094e Mar 08 04:17:18.450639 master-0 
kubenswrapper[18592]: I0308 04:17:18.449706 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26xxg\" (UniqueName: \"kubernetes.io/projected/065d1362-4389-473a-9b59-4d3b25153deb-kube-api-access-26xxg\") pod \"dnsmasq-dns-7d65fdd857-mz6wd\" (UID: \"065d1362-4389-473a-9b59-4d3b25153deb\") " pod="openstack/dnsmasq-dns-7d65fdd857-mz6wd" Mar 08 04:17:18.540117 master-0 kubenswrapper[18592]: I0308 04:17:18.538874 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-c664f854-r9dzf"] Mar 08 04:17:18.540543 master-0 kubenswrapper[18592]: I0308 04:17:18.540520 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c664f854-r9dzf" Mar 08 04:17:18.551439 master-0 kubenswrapper[18592]: I0308 04:17:18.551388 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 08 04:17:18.551638 master-0 kubenswrapper[18592]: I0308 04:17:18.551612 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 08 04:17:18.551805 master-0 kubenswrapper[18592]: I0308 04:17:18.551780 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 08 04:17:18.566652 master-0 kubenswrapper[18592]: I0308 04:17:18.566598 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-c664f854-r9dzf"] Mar 08 04:17:18.605015 master-0 kubenswrapper[18592]: I0308 04:17:18.604951 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d65fdd857-mz6wd" Mar 08 04:17:18.636747 master-0 kubenswrapper[18592]: I0308 04:17:18.635934 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5a3e60db-4326-462e-bdad-0d06970f36c3-httpd-config\") pod \"neutron-c664f854-r9dzf\" (UID: \"5a3e60db-4326-462e-bdad-0d06970f36c3\") " pod="openstack/neutron-c664f854-r9dzf" Mar 08 04:17:18.636747 master-0 kubenswrapper[18592]: I0308 04:17:18.636006 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-954w4\" (UniqueName: \"kubernetes.io/projected/5a3e60db-4326-462e-bdad-0d06970f36c3-kube-api-access-954w4\") pod \"neutron-c664f854-r9dzf\" (UID: \"5a3e60db-4326-462e-bdad-0d06970f36c3\") " pod="openstack/neutron-c664f854-r9dzf" Mar 08 04:17:18.636747 master-0 kubenswrapper[18592]: I0308 04:17:18.636048 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5a3e60db-4326-462e-bdad-0d06970f36c3-config\") pod \"neutron-c664f854-r9dzf\" (UID: \"5a3e60db-4326-462e-bdad-0d06970f36c3\") " pod="openstack/neutron-c664f854-r9dzf" Mar 08 04:17:18.636747 master-0 kubenswrapper[18592]: I0308 04:17:18.636128 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a3e60db-4326-462e-bdad-0d06970f36c3-combined-ca-bundle\") pod \"neutron-c664f854-r9dzf\" (UID: \"5a3e60db-4326-462e-bdad-0d06970f36c3\") " pod="openstack/neutron-c664f854-r9dzf" Mar 08 04:17:18.636747 master-0 kubenswrapper[18592]: I0308 04:17:18.636452 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a3e60db-4326-462e-bdad-0d06970f36c3-ovndb-tls-certs\") pod 
\"neutron-c664f854-r9dzf\" (UID: \"5a3e60db-4326-462e-bdad-0d06970f36c3\") " pod="openstack/neutron-c664f854-r9dzf" Mar 08 04:17:18.738430 master-0 kubenswrapper[18592]: I0308 04:17:18.738249 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a3e60db-4326-462e-bdad-0d06970f36c3-ovndb-tls-certs\") pod \"neutron-c664f854-r9dzf\" (UID: \"5a3e60db-4326-462e-bdad-0d06970f36c3\") " pod="openstack/neutron-c664f854-r9dzf" Mar 08 04:17:18.738430 master-0 kubenswrapper[18592]: I0308 04:17:18.738320 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5a3e60db-4326-462e-bdad-0d06970f36c3-httpd-config\") pod \"neutron-c664f854-r9dzf\" (UID: \"5a3e60db-4326-462e-bdad-0d06970f36c3\") " pod="openstack/neutron-c664f854-r9dzf" Mar 08 04:17:18.738430 master-0 kubenswrapper[18592]: I0308 04:17:18.738360 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-954w4\" (UniqueName: \"kubernetes.io/projected/5a3e60db-4326-462e-bdad-0d06970f36c3-kube-api-access-954w4\") pod \"neutron-c664f854-r9dzf\" (UID: \"5a3e60db-4326-462e-bdad-0d06970f36c3\") " pod="openstack/neutron-c664f854-r9dzf" Mar 08 04:17:18.738430 master-0 kubenswrapper[18592]: I0308 04:17:18.738397 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5a3e60db-4326-462e-bdad-0d06970f36c3-config\") pod \"neutron-c664f854-r9dzf\" (UID: \"5a3e60db-4326-462e-bdad-0d06970f36c3\") " pod="openstack/neutron-c664f854-r9dzf" Mar 08 04:17:18.738788 master-0 kubenswrapper[18592]: I0308 04:17:18.738472 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a3e60db-4326-462e-bdad-0d06970f36c3-combined-ca-bundle\") pod \"neutron-c664f854-r9dzf\" (UID: 
\"5a3e60db-4326-462e-bdad-0d06970f36c3\") " pod="openstack/neutron-c664f854-r9dzf" Mar 08 04:17:18.742910 master-0 kubenswrapper[18592]: I0308 04:17:18.742701 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/5a3e60db-4326-462e-bdad-0d06970f36c3-config\") pod \"neutron-c664f854-r9dzf\" (UID: \"5a3e60db-4326-462e-bdad-0d06970f36c3\") " pod="openstack/neutron-c664f854-r9dzf" Mar 08 04:17:18.748211 master-0 kubenswrapper[18592]: I0308 04:17:18.748077 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a3e60db-4326-462e-bdad-0d06970f36c3-combined-ca-bundle\") pod \"neutron-c664f854-r9dzf\" (UID: \"5a3e60db-4326-462e-bdad-0d06970f36c3\") " pod="openstack/neutron-c664f854-r9dzf" Mar 08 04:17:18.751730 master-0 kubenswrapper[18592]: I0308 04:17:18.751629 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5a3e60db-4326-462e-bdad-0d06970f36c3-httpd-config\") pod \"neutron-c664f854-r9dzf\" (UID: \"5a3e60db-4326-462e-bdad-0d06970f36c3\") " pod="openstack/neutron-c664f854-r9dzf" Mar 08 04:17:18.752344 master-0 kubenswrapper[18592]: I0308 04:17:18.752306 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a3e60db-4326-462e-bdad-0d06970f36c3-ovndb-tls-certs\") pod \"neutron-c664f854-r9dzf\" (UID: \"5a3e60db-4326-462e-bdad-0d06970f36c3\") " pod="openstack/neutron-c664f854-r9dzf" Mar 08 04:17:18.756494 master-0 kubenswrapper[18592]: I0308 04:17:18.756377 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-954w4\" (UniqueName: \"kubernetes.io/projected/5a3e60db-4326-462e-bdad-0d06970f36c3-kube-api-access-954w4\") pod \"neutron-c664f854-r9dzf\" (UID: \"5a3e60db-4326-462e-bdad-0d06970f36c3\") " pod="openstack/neutron-c664f854-r9dzf" Mar 08 
04:17:18.905027 master-0 kubenswrapper[18592]: I0308 04:17:18.904979 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-56d7944fd-t2xf9" event={"ID":"081bf4e2-02c9-4cca-b699-6148f8aaa219","Type":"ContainerStarted","Data":"c7e0d3efeed481959fd39249793445d23630e25ba7eecfdd4caf0c6210bc4e98"} Mar 08 04:17:18.905027 master-0 kubenswrapper[18592]: I0308 04:17:18.905029 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-56d7944fd-t2xf9" event={"ID":"081bf4e2-02c9-4cca-b699-6148f8aaa219","Type":"ContainerStarted","Data":"bc678861c6205968669078fe14b94c137ac2399149a4cf06fe412eea1ee0094e"} Mar 08 04:17:18.906377 master-0 kubenswrapper[18592]: I0308 04:17:18.905359 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-56d7944fd-t2xf9" Mar 08 04:17:18.921300 master-0 kubenswrapper[18592]: I0308 04:17:18.920553 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c664f854-r9dzf" Mar 08 04:17:18.926436 master-0 kubenswrapper[18592]: I0308 04:17:18.925317 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-56d7944fd-t2xf9" podStartSLOduration=1.925299856 podStartE2EDuration="1.925299856s" podCreationTimestamp="2026-03-08 04:17:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 04:17:18.922717355 +0000 UTC m=+1451.021471705" watchObservedRunningTime="2026-03-08 04:17:18.925299856 +0000 UTC m=+1451.024054206" Mar 08 04:17:19.150852 master-0 kubenswrapper[18592]: I0308 04:17:19.147332 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d65fdd857-mz6wd"] Mar 08 04:17:19.558585 master-0 kubenswrapper[18592]: I0308 04:17:19.553265 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-c664f854-r9dzf"] Mar 08 04:17:19.560580 master-0 
kubenswrapper[18592]: W0308 04:17:19.560507 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a3e60db_4326_462e_bdad_0d06970f36c3.slice/crio-7d909f92de517bb888eea5a2a344264eefb388485fc834689514c22b10e67b0b WatchSource:0}: Error finding container 7d909f92de517bb888eea5a2a344264eefb388485fc834689514c22b10e67b0b: Status 404 returned error can't find the container with id 7d909f92de517bb888eea5a2a344264eefb388485fc834689514c22b10e67b0b Mar 08 04:17:19.921849 master-0 kubenswrapper[18592]: I0308 04:17:19.918501 18592 generic.go:334] "Generic (PLEG): container finished" podID="065d1362-4389-473a-9b59-4d3b25153deb" containerID="26bb6bf3b613140928de4297065d88e9ca8ca8bc54b20972fe0bbf1c366abc85" exitCode=0 Mar 08 04:17:19.921849 master-0 kubenswrapper[18592]: I0308 04:17:19.918618 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d65fdd857-mz6wd" event={"ID":"065d1362-4389-473a-9b59-4d3b25153deb","Type":"ContainerDied","Data":"26bb6bf3b613140928de4297065d88e9ca8ca8bc54b20972fe0bbf1c366abc85"} Mar 08 04:17:19.921849 master-0 kubenswrapper[18592]: I0308 04:17:19.918653 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d65fdd857-mz6wd" event={"ID":"065d1362-4389-473a-9b59-4d3b25153deb","Type":"ContainerStarted","Data":"37ffe400256714d2ed75f61632704eb63923874e1911f9d746a455a7c18a7cb2"} Mar 08 04:17:19.921849 master-0 kubenswrapper[18592]: I0308 04:17:19.920678 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c664f854-r9dzf" event={"ID":"5a3e60db-4326-462e-bdad-0d06970f36c3","Type":"ContainerStarted","Data":"38f96f580b3c6adb13d63b37ceb373efcd3ca83c0e0d2bb9dd61b0854d1f737f"} Mar 08 04:17:19.921849 master-0 kubenswrapper[18592]: I0308 04:17:19.920725 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c664f854-r9dzf" 
event={"ID":"5a3e60db-4326-462e-bdad-0d06970f36c3","Type":"ContainerStarted","Data":"7d909f92de517bb888eea5a2a344264eefb388485fc834689514c22b10e67b0b"} Mar 08 04:17:20.946254 master-0 kubenswrapper[18592]: I0308 04:17:20.946176 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c664f854-r9dzf" event={"ID":"5a3e60db-4326-462e-bdad-0d06970f36c3","Type":"ContainerStarted","Data":"ce39c5441b3afe06932ac615a56726bf3b59a0535525e8c100fd43967b33b085"} Mar 08 04:17:20.946890 master-0 kubenswrapper[18592]: I0308 04:17:20.946724 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-c664f854-r9dzf" Mar 08 04:17:20.950076 master-0 kubenswrapper[18592]: I0308 04:17:20.949955 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d65fdd857-mz6wd" event={"ID":"065d1362-4389-473a-9b59-4d3b25153deb","Type":"ContainerStarted","Data":"787f0cb3a3f0b8aa215b026c1fb9e972156a7c6a16f7b3daf1ed6837b295d474"} Mar 08 04:17:20.951330 master-0 kubenswrapper[18592]: I0308 04:17:20.951300 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7d65fdd857-mz6wd" Mar 08 04:17:20.982902 master-0 kubenswrapper[18592]: I0308 04:17:20.981712 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-c664f854-r9dzf" podStartSLOduration=2.981691755 podStartE2EDuration="2.981691755s" podCreationTimestamp="2026-03-08 04:17:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 04:17:20.972125174 +0000 UTC m=+1453.070879524" watchObservedRunningTime="2026-03-08 04:17:20.981691755 +0000 UTC m=+1453.080446115" Mar 08 04:17:21.009196 master-0 kubenswrapper[18592]: I0308 04:17:21.009128 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7d65fdd857-mz6wd" podStartSLOduration=3.009110029 
podStartE2EDuration="3.009110029s" podCreationTimestamp="2026-03-08 04:17:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 04:17:21.007333071 +0000 UTC m=+1453.106087421" watchObservedRunningTime="2026-03-08 04:17:21.009110029 +0000 UTC m=+1453.107864379" Mar 08 04:17:21.259935 master-0 kubenswrapper[18592]: I0308 04:17:21.257992 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-67869b4f85-qhzqs"] Mar 08 04:17:21.262125 master-0 kubenswrapper[18592]: I0308 04:17:21.260397 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-67869b4f85-qhzqs" Mar 08 04:17:21.264394 master-0 kubenswrapper[18592]: I0308 04:17:21.264346 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 08 04:17:21.264589 master-0 kubenswrapper[18592]: I0308 04:17:21.264537 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 08 04:17:21.308575 master-0 kubenswrapper[18592]: I0308 04:17:21.307928 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-67869b4f85-qhzqs"] Mar 08 04:17:21.309914 master-0 kubenswrapper[18592]: I0308 04:17:21.309330 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b300a420-49d9-40f0-9c31-7a2ff20dcbfc-public-tls-certs\") pod \"neutron-67869b4f85-qhzqs\" (UID: \"b300a420-49d9-40f0-9c31-7a2ff20dcbfc\") " pod="openstack/neutron-67869b4f85-qhzqs" Mar 08 04:17:21.309914 master-0 kubenswrapper[18592]: I0308 04:17:21.309365 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b300a420-49d9-40f0-9c31-7a2ff20dcbfc-ovndb-tls-certs\") pod \"neutron-67869b4f85-qhzqs\" 
(UID: \"b300a420-49d9-40f0-9c31-7a2ff20dcbfc\") " pod="openstack/neutron-67869b4f85-qhzqs" Mar 08 04:17:21.309914 master-0 kubenswrapper[18592]: I0308 04:17:21.309384 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b300a420-49d9-40f0-9c31-7a2ff20dcbfc-httpd-config\") pod \"neutron-67869b4f85-qhzqs\" (UID: \"b300a420-49d9-40f0-9c31-7a2ff20dcbfc\") " pod="openstack/neutron-67869b4f85-qhzqs" Mar 08 04:17:21.309914 master-0 kubenswrapper[18592]: I0308 04:17:21.309502 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfl4j\" (UniqueName: \"kubernetes.io/projected/b300a420-49d9-40f0-9c31-7a2ff20dcbfc-kube-api-access-vfl4j\") pod \"neutron-67869b4f85-qhzqs\" (UID: \"b300a420-49d9-40f0-9c31-7a2ff20dcbfc\") " pod="openstack/neutron-67869b4f85-qhzqs" Mar 08 04:17:21.309914 master-0 kubenswrapper[18592]: I0308 04:17:21.309759 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b300a420-49d9-40f0-9c31-7a2ff20dcbfc-combined-ca-bundle\") pod \"neutron-67869b4f85-qhzqs\" (UID: \"b300a420-49d9-40f0-9c31-7a2ff20dcbfc\") " pod="openstack/neutron-67869b4f85-qhzqs" Mar 08 04:17:21.310223 master-0 kubenswrapper[18592]: I0308 04:17:21.309954 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b300a420-49d9-40f0-9c31-7a2ff20dcbfc-config\") pod \"neutron-67869b4f85-qhzqs\" (UID: \"b300a420-49d9-40f0-9c31-7a2ff20dcbfc\") " pod="openstack/neutron-67869b4f85-qhzqs" Mar 08 04:17:21.310223 master-0 kubenswrapper[18592]: I0308 04:17:21.310057 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/b300a420-49d9-40f0-9c31-7a2ff20dcbfc-internal-tls-certs\") pod \"neutron-67869b4f85-qhzqs\" (UID: \"b300a420-49d9-40f0-9c31-7a2ff20dcbfc\") " pod="openstack/neutron-67869b4f85-qhzqs" Mar 08 04:17:21.413406 master-0 kubenswrapper[18592]: I0308 04:17:21.413326 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b300a420-49d9-40f0-9c31-7a2ff20dcbfc-config\") pod \"neutron-67869b4f85-qhzqs\" (UID: \"b300a420-49d9-40f0-9c31-7a2ff20dcbfc\") " pod="openstack/neutron-67869b4f85-qhzqs" Mar 08 04:17:21.413631 master-0 kubenswrapper[18592]: I0308 04:17:21.413429 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b300a420-49d9-40f0-9c31-7a2ff20dcbfc-internal-tls-certs\") pod \"neutron-67869b4f85-qhzqs\" (UID: \"b300a420-49d9-40f0-9c31-7a2ff20dcbfc\") " pod="openstack/neutron-67869b4f85-qhzqs" Mar 08 04:17:21.413631 master-0 kubenswrapper[18592]: I0308 04:17:21.413541 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b300a420-49d9-40f0-9c31-7a2ff20dcbfc-public-tls-certs\") pod \"neutron-67869b4f85-qhzqs\" (UID: \"b300a420-49d9-40f0-9c31-7a2ff20dcbfc\") " pod="openstack/neutron-67869b4f85-qhzqs" Mar 08 04:17:21.413631 master-0 kubenswrapper[18592]: I0308 04:17:21.413570 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b300a420-49d9-40f0-9c31-7a2ff20dcbfc-ovndb-tls-certs\") pod \"neutron-67869b4f85-qhzqs\" (UID: \"b300a420-49d9-40f0-9c31-7a2ff20dcbfc\") " pod="openstack/neutron-67869b4f85-qhzqs" Mar 08 04:17:21.413631 master-0 kubenswrapper[18592]: I0308 04:17:21.413591 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/b300a420-49d9-40f0-9c31-7a2ff20dcbfc-httpd-config\") pod \"neutron-67869b4f85-qhzqs\" (UID: \"b300a420-49d9-40f0-9c31-7a2ff20dcbfc\") " pod="openstack/neutron-67869b4f85-qhzqs" Mar 08 04:17:21.413837 master-0 kubenswrapper[18592]: I0308 04:17:21.413636 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfl4j\" (UniqueName: \"kubernetes.io/projected/b300a420-49d9-40f0-9c31-7a2ff20dcbfc-kube-api-access-vfl4j\") pod \"neutron-67869b4f85-qhzqs\" (UID: \"b300a420-49d9-40f0-9c31-7a2ff20dcbfc\") " pod="openstack/neutron-67869b4f85-qhzqs" Mar 08 04:17:21.413837 master-0 kubenswrapper[18592]: I0308 04:17:21.413707 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b300a420-49d9-40f0-9c31-7a2ff20dcbfc-combined-ca-bundle\") pod \"neutron-67869b4f85-qhzqs\" (UID: \"b300a420-49d9-40f0-9c31-7a2ff20dcbfc\") " pod="openstack/neutron-67869b4f85-qhzqs" Mar 08 04:17:21.418231 master-0 kubenswrapper[18592]: I0308 04:17:21.418006 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b300a420-49d9-40f0-9c31-7a2ff20dcbfc-combined-ca-bundle\") pod \"neutron-67869b4f85-qhzqs\" (UID: \"b300a420-49d9-40f0-9c31-7a2ff20dcbfc\") " pod="openstack/neutron-67869b4f85-qhzqs" Mar 08 04:17:21.418231 master-0 kubenswrapper[18592]: I0308 04:17:21.418217 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b300a420-49d9-40f0-9c31-7a2ff20dcbfc-config\") pod \"neutron-67869b4f85-qhzqs\" (UID: \"b300a420-49d9-40f0-9c31-7a2ff20dcbfc\") " pod="openstack/neutron-67869b4f85-qhzqs" Mar 08 04:17:21.418718 master-0 kubenswrapper[18592]: I0308 04:17:21.418668 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/b300a420-49d9-40f0-9c31-7a2ff20dcbfc-ovndb-tls-certs\") pod \"neutron-67869b4f85-qhzqs\" (UID: \"b300a420-49d9-40f0-9c31-7a2ff20dcbfc\") " pod="openstack/neutron-67869b4f85-qhzqs" Mar 08 04:17:21.419263 master-0 kubenswrapper[18592]: I0308 04:17:21.419226 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b300a420-49d9-40f0-9c31-7a2ff20dcbfc-httpd-config\") pod \"neutron-67869b4f85-qhzqs\" (UID: \"b300a420-49d9-40f0-9c31-7a2ff20dcbfc\") " pod="openstack/neutron-67869b4f85-qhzqs" Mar 08 04:17:21.419996 master-0 kubenswrapper[18592]: I0308 04:17:21.419963 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b300a420-49d9-40f0-9c31-7a2ff20dcbfc-internal-tls-certs\") pod \"neutron-67869b4f85-qhzqs\" (UID: \"b300a420-49d9-40f0-9c31-7a2ff20dcbfc\") " pod="openstack/neutron-67869b4f85-qhzqs" Mar 08 04:17:21.425514 master-0 kubenswrapper[18592]: I0308 04:17:21.425479 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b300a420-49d9-40f0-9c31-7a2ff20dcbfc-public-tls-certs\") pod \"neutron-67869b4f85-qhzqs\" (UID: \"b300a420-49d9-40f0-9c31-7a2ff20dcbfc\") " pod="openstack/neutron-67869b4f85-qhzqs" Mar 08 04:17:21.443402 master-0 kubenswrapper[18592]: I0308 04:17:21.443360 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfl4j\" (UniqueName: \"kubernetes.io/projected/b300a420-49d9-40f0-9c31-7a2ff20dcbfc-kube-api-access-vfl4j\") pod \"neutron-67869b4f85-qhzqs\" (UID: \"b300a420-49d9-40f0-9c31-7a2ff20dcbfc\") " pod="openstack/neutron-67869b4f85-qhzqs" Mar 08 04:17:21.595767 master-0 kubenswrapper[18592]: I0308 04:17:21.595073 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-67869b4f85-qhzqs" Mar 08 04:17:22.231953 master-0 kubenswrapper[18592]: I0308 04:17:22.225096 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-67869b4f85-qhzqs"] Mar 08 04:17:22.992542 master-0 kubenswrapper[18592]: I0308 04:17:22.992481 18592 generic.go:334] "Generic (PLEG): container finished" podID="117a9c49-cd48-4a2c-bdee-10bb60588b20" containerID="99a98feadbf2914137693643285f4a1f8a079354fbc62898f0077f7d7a80a784" exitCode=0 Mar 08 04:17:22.992767 master-0 kubenswrapper[18592]: I0308 04:17:22.992595 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ff301-db-sync-bgkm7" event={"ID":"117a9c49-cd48-4a2c-bdee-10bb60588b20","Type":"ContainerDied","Data":"99a98feadbf2914137693643285f4a1f8a079354fbc62898f0077f7d7a80a784"} Mar 08 04:17:22.997692 master-0 kubenswrapper[18592]: I0308 04:17:22.997634 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67869b4f85-qhzqs" event={"ID":"b300a420-49d9-40f0-9c31-7a2ff20dcbfc","Type":"ContainerStarted","Data":"d4fc9d630138393fd9ae4af9a4e4852e8a5868aa2a7a776c4a8dfa9dfa921db7"} Mar 08 04:17:22.997692 master-0 kubenswrapper[18592]: I0308 04:17:22.997688 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-67869b4f85-qhzqs" Mar 08 04:17:22.997692 master-0 kubenswrapper[18592]: I0308 04:17:22.997704 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67869b4f85-qhzqs" event={"ID":"b300a420-49d9-40f0-9c31-7a2ff20dcbfc","Type":"ContainerStarted","Data":"256ccbae5eda5a80a8438932bd22a9196c02c07b3f03f86ce93be17444b401be"} Mar 08 04:17:22.998004 master-0 kubenswrapper[18592]: I0308 04:17:22.997716 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67869b4f85-qhzqs" event={"ID":"b300a420-49d9-40f0-9c31-7a2ff20dcbfc","Type":"ContainerStarted","Data":"5256a39d0af84d337d10cca8ab77c9a965d5fb861f86a42b8cd72ad6a9978374"} Mar 
08 04:17:23.038223 master-0 kubenswrapper[18592]: I0308 04:17:23.038148 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-67869b4f85-qhzqs" podStartSLOduration=2.038131995 podStartE2EDuration="2.038131995s" podCreationTimestamp="2026-03-08 04:17:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 04:17:23.0357334 +0000 UTC m=+1455.134487770" watchObservedRunningTime="2026-03-08 04:17:23.038131995 +0000 UTC m=+1455.136886345" Mar 08 04:17:24.440317 master-0 kubenswrapper[18592]: I0308 04:17:24.440257 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-ff301-db-sync-bgkm7" Mar 08 04:17:24.603091 master-0 kubenswrapper[18592]: I0308 04:17:24.594787 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/117a9c49-cd48-4a2c-bdee-10bb60588b20-db-sync-config-data\") pod \"117a9c49-cd48-4a2c-bdee-10bb60588b20\" (UID: \"117a9c49-cd48-4a2c-bdee-10bb60588b20\") " Mar 08 04:17:24.603091 master-0 kubenswrapper[18592]: I0308 04:17:24.595025 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdkv6\" (UniqueName: \"kubernetes.io/projected/117a9c49-cd48-4a2c-bdee-10bb60588b20-kube-api-access-wdkv6\") pod \"117a9c49-cd48-4a2c-bdee-10bb60588b20\" (UID: \"117a9c49-cd48-4a2c-bdee-10bb60588b20\") " Mar 08 04:17:24.603091 master-0 kubenswrapper[18592]: I0308 04:17:24.595067 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/117a9c49-cd48-4a2c-bdee-10bb60588b20-config-data\") pod \"117a9c49-cd48-4a2c-bdee-10bb60588b20\" (UID: \"117a9c49-cd48-4a2c-bdee-10bb60588b20\") " Mar 08 04:17:24.603091 master-0 kubenswrapper[18592]: I0308 04:17:24.595159 18592 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/117a9c49-cd48-4a2c-bdee-10bb60588b20-etc-machine-id\") pod \"117a9c49-cd48-4a2c-bdee-10bb60588b20\" (UID: \"117a9c49-cd48-4a2c-bdee-10bb60588b20\") " Mar 08 04:17:24.603091 master-0 kubenswrapper[18592]: I0308 04:17:24.595363 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/117a9c49-cd48-4a2c-bdee-10bb60588b20-combined-ca-bundle\") pod \"117a9c49-cd48-4a2c-bdee-10bb60588b20\" (UID: \"117a9c49-cd48-4a2c-bdee-10bb60588b20\") " Mar 08 04:17:24.603091 master-0 kubenswrapper[18592]: I0308 04:17:24.595413 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/117a9c49-cd48-4a2c-bdee-10bb60588b20-scripts\") pod \"117a9c49-cd48-4a2c-bdee-10bb60588b20\" (UID: \"117a9c49-cd48-4a2c-bdee-10bb60588b20\") " Mar 08 04:17:24.603091 master-0 kubenswrapper[18592]: I0308 04:17:24.595411 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/117a9c49-cd48-4a2c-bdee-10bb60588b20-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "117a9c49-cd48-4a2c-bdee-10bb60588b20" (UID: "117a9c49-cd48-4a2c-bdee-10bb60588b20"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 04:17:24.603091 master-0 kubenswrapper[18592]: I0308 04:17:24.596184 18592 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/117a9c49-cd48-4a2c-bdee-10bb60588b20-etc-machine-id\") on node \"master-0\" DevicePath \"\"" Mar 08 04:17:24.603091 master-0 kubenswrapper[18592]: I0308 04:17:24.599937 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/117a9c49-cd48-4a2c-bdee-10bb60588b20-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "117a9c49-cd48-4a2c-bdee-10bb60588b20" (UID: "117a9c49-cd48-4a2c-bdee-10bb60588b20"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:17:24.603091 master-0 kubenswrapper[18592]: I0308 04:17:24.600614 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/117a9c49-cd48-4a2c-bdee-10bb60588b20-scripts" (OuterVolumeSpecName: "scripts") pod "117a9c49-cd48-4a2c-bdee-10bb60588b20" (UID: "117a9c49-cd48-4a2c-bdee-10bb60588b20"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:17:24.612049 master-0 kubenswrapper[18592]: I0308 04:17:24.605269 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/117a9c49-cd48-4a2c-bdee-10bb60588b20-kube-api-access-wdkv6" (OuterVolumeSpecName: "kube-api-access-wdkv6") pod "117a9c49-cd48-4a2c-bdee-10bb60588b20" (UID: "117a9c49-cd48-4a2c-bdee-10bb60588b20"). InnerVolumeSpecName "kube-api-access-wdkv6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 04:17:24.653246 master-0 kubenswrapper[18592]: I0308 04:17:24.653181 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/117a9c49-cd48-4a2c-bdee-10bb60588b20-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "117a9c49-cd48-4a2c-bdee-10bb60588b20" (UID: "117a9c49-cd48-4a2c-bdee-10bb60588b20"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:17:24.678556 master-0 kubenswrapper[18592]: I0308 04:17:24.678509 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/117a9c49-cd48-4a2c-bdee-10bb60588b20-config-data" (OuterVolumeSpecName: "config-data") pod "117a9c49-cd48-4a2c-bdee-10bb60588b20" (UID: "117a9c49-cd48-4a2c-bdee-10bb60588b20"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:17:24.697922 master-0 kubenswrapper[18592]: I0308 04:17:24.697875 18592 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/117a9c49-cd48-4a2c-bdee-10bb60588b20-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 04:17:24.697922 master-0 kubenswrapper[18592]: I0308 04:17:24.697914 18592 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/117a9c49-cd48-4a2c-bdee-10bb60588b20-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 04:17:24.697922 master-0 kubenswrapper[18592]: I0308 04:17:24.697923 18592 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/117a9c49-cd48-4a2c-bdee-10bb60588b20-db-sync-config-data\") on node \"master-0\" DevicePath \"\"" Mar 08 04:17:24.698479 master-0 kubenswrapper[18592]: I0308 04:17:24.697935 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdkv6\" (UniqueName: 
\"kubernetes.io/projected/117a9c49-cd48-4a2c-bdee-10bb60588b20-kube-api-access-wdkv6\") on node \"master-0\" DevicePath \"\"" Mar 08 04:17:24.698479 master-0 kubenswrapper[18592]: I0308 04:17:24.697944 18592 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/117a9c49-cd48-4a2c-bdee-10bb60588b20-config-data\") on node \"master-0\" DevicePath \"\"" Mar 08 04:17:25.029040 master-0 kubenswrapper[18592]: I0308 04:17:25.028927 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ff301-db-sync-bgkm7" event={"ID":"117a9c49-cd48-4a2c-bdee-10bb60588b20","Type":"ContainerDied","Data":"2a58ccfbb8785c254db329c6d3dbdc21d79ca9f1b47d1e1f3e720136bb4002c9"} Mar 08 04:17:25.029040 master-0 kubenswrapper[18592]: I0308 04:17:25.028996 18592 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a58ccfbb8785c254db329c6d3dbdc21d79ca9f1b47d1e1f3e720136bb4002c9" Mar 08 04:17:25.029040 master-0 kubenswrapper[18592]: I0308 04:17:25.029020 18592 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-ff301-db-sync-bgkm7" Mar 08 04:17:25.594173 master-0 kubenswrapper[18592]: I0308 04:17:25.590032 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-ff301-scheduler-0"] Mar 08 04:17:25.594173 master-0 kubenswrapper[18592]: E0308 04:17:25.590519 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="117a9c49-cd48-4a2c-bdee-10bb60588b20" containerName="cinder-ff301-db-sync" Mar 08 04:17:25.594173 master-0 kubenswrapper[18592]: I0308 04:17:25.590533 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="117a9c49-cd48-4a2c-bdee-10bb60588b20" containerName="cinder-ff301-db-sync" Mar 08 04:17:25.594173 master-0 kubenswrapper[18592]: I0308 04:17:25.590782 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="117a9c49-cd48-4a2c-bdee-10bb60588b20" containerName="cinder-ff301-db-sync" Mar 08 04:17:25.594173 master-0 kubenswrapper[18592]: I0308 04:17:25.592298 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-ff301-scheduler-0" Mar 08 04:17:25.603844 master-0 kubenswrapper[18592]: I0308 04:17:25.602280 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-ff301-config-data" Mar 08 04:17:25.603844 master-0 kubenswrapper[18592]: I0308 04:17:25.602488 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-ff301-scheduler-config-data" Mar 08 04:17:25.603844 master-0 kubenswrapper[18592]: I0308 04:17:25.602593 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-ff301-scripts" Mar 08 04:17:25.671497 master-0 kubenswrapper[18592]: I0308 04:17:25.667723 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-ff301-scheduler-0"] Mar 08 04:17:25.674253 master-0 kubenswrapper[18592]: I0308 04:17:25.674214 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/808f8fcb-c2b2-4df7-9c7a-9838aaf744af-etc-machine-id\") pod \"cinder-ff301-scheduler-0\" (UID: \"808f8fcb-c2b2-4df7-9c7a-9838aaf744af\") " pod="openstack/cinder-ff301-scheduler-0" Mar 08 04:17:25.674399 master-0 kubenswrapper[18592]: I0308 04:17:25.674283 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/808f8fcb-c2b2-4df7-9c7a-9838aaf744af-config-data\") pod \"cinder-ff301-scheduler-0\" (UID: \"808f8fcb-c2b2-4df7-9c7a-9838aaf744af\") " pod="openstack/cinder-ff301-scheduler-0" Mar 08 04:17:25.674399 master-0 kubenswrapper[18592]: I0308 04:17:25.674376 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/808f8fcb-c2b2-4df7-9c7a-9838aaf744af-config-data-custom\") pod \"cinder-ff301-scheduler-0\" (UID: \"808f8fcb-c2b2-4df7-9c7a-9838aaf744af\") " 
pod="openstack/cinder-ff301-scheduler-0" Mar 08 04:17:25.674467 master-0 kubenswrapper[18592]: I0308 04:17:25.674400 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/808f8fcb-c2b2-4df7-9c7a-9838aaf744af-combined-ca-bundle\") pod \"cinder-ff301-scheduler-0\" (UID: \"808f8fcb-c2b2-4df7-9c7a-9838aaf744af\") " pod="openstack/cinder-ff301-scheduler-0" Mar 08 04:17:25.674467 master-0 kubenswrapper[18592]: I0308 04:17:25.674429 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvb9v\" (UniqueName: \"kubernetes.io/projected/808f8fcb-c2b2-4df7-9c7a-9838aaf744af-kube-api-access-rvb9v\") pod \"cinder-ff301-scheduler-0\" (UID: \"808f8fcb-c2b2-4df7-9c7a-9838aaf744af\") " pod="openstack/cinder-ff301-scheduler-0" Mar 08 04:17:25.674532 master-0 kubenswrapper[18592]: I0308 04:17:25.674486 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/808f8fcb-c2b2-4df7-9c7a-9838aaf744af-scripts\") pod \"cinder-ff301-scheduler-0\" (UID: \"808f8fcb-c2b2-4df7-9c7a-9838aaf744af\") " pod="openstack/cinder-ff301-scheduler-0" Mar 08 04:17:25.727227 master-0 kubenswrapper[18592]: I0308 04:17:25.723850 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d65fdd857-mz6wd"] Mar 08 04:17:25.727227 master-0 kubenswrapper[18592]: I0308 04:17:25.724219 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7d65fdd857-mz6wd" podUID="065d1362-4389-473a-9b59-4d3b25153deb" containerName="dnsmasq-dns" containerID="cri-o://787f0cb3a3f0b8aa215b026c1fb9e972156a7c6a16f7b3daf1ed6837b295d474" gracePeriod=10 Mar 08 04:17:25.727227 master-0 kubenswrapper[18592]: I0308 04:17:25.726982 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/dnsmasq-dns-7d65fdd857-mz6wd" Mar 08 04:17:25.788963 master-0 kubenswrapper[18592]: I0308 04:17:25.788852 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/808f8fcb-c2b2-4df7-9c7a-9838aaf744af-config-data\") pod \"cinder-ff301-scheduler-0\" (UID: \"808f8fcb-c2b2-4df7-9c7a-9838aaf744af\") " pod="openstack/cinder-ff301-scheduler-0" Mar 08 04:17:25.792730 master-0 kubenswrapper[18592]: I0308 04:17:25.792148 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/808f8fcb-c2b2-4df7-9c7a-9838aaf744af-config-data-custom\") pod \"cinder-ff301-scheduler-0\" (UID: \"808f8fcb-c2b2-4df7-9c7a-9838aaf744af\") " pod="openstack/cinder-ff301-scheduler-0" Mar 08 04:17:25.792730 master-0 kubenswrapper[18592]: I0308 04:17:25.792221 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/808f8fcb-c2b2-4df7-9c7a-9838aaf744af-combined-ca-bundle\") pod \"cinder-ff301-scheduler-0\" (UID: \"808f8fcb-c2b2-4df7-9c7a-9838aaf744af\") " pod="openstack/cinder-ff301-scheduler-0" Mar 08 04:17:25.792730 master-0 kubenswrapper[18592]: I0308 04:17:25.792286 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvb9v\" (UniqueName: \"kubernetes.io/projected/808f8fcb-c2b2-4df7-9c7a-9838aaf744af-kube-api-access-rvb9v\") pod \"cinder-ff301-scheduler-0\" (UID: \"808f8fcb-c2b2-4df7-9c7a-9838aaf744af\") " pod="openstack/cinder-ff301-scheduler-0" Mar 08 04:17:25.792730 master-0 kubenswrapper[18592]: I0308 04:17:25.792485 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/808f8fcb-c2b2-4df7-9c7a-9838aaf744af-scripts\") pod \"cinder-ff301-scheduler-0\" (UID: \"808f8fcb-c2b2-4df7-9c7a-9838aaf744af\") " 
pod="openstack/cinder-ff301-scheduler-0" Mar 08 04:17:25.792730 master-0 kubenswrapper[18592]: I0308 04:17:25.792571 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/808f8fcb-c2b2-4df7-9c7a-9838aaf744af-etc-machine-id\") pod \"cinder-ff301-scheduler-0\" (UID: \"808f8fcb-c2b2-4df7-9c7a-9838aaf744af\") " pod="openstack/cinder-ff301-scheduler-0" Mar 08 04:17:25.793313 master-0 kubenswrapper[18592]: I0308 04:17:25.793235 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/808f8fcb-c2b2-4df7-9c7a-9838aaf744af-etc-machine-id\") pod \"cinder-ff301-scheduler-0\" (UID: \"808f8fcb-c2b2-4df7-9c7a-9838aaf744af\") " pod="openstack/cinder-ff301-scheduler-0" Mar 08 04:17:25.797214 master-0 kubenswrapper[18592]: I0308 04:17:25.793882 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/808f8fcb-c2b2-4df7-9c7a-9838aaf744af-config-data\") pod \"cinder-ff301-scheduler-0\" (UID: \"808f8fcb-c2b2-4df7-9c7a-9838aaf744af\") " pod="openstack/cinder-ff301-scheduler-0" Mar 08 04:17:25.797214 master-0 kubenswrapper[18592]: I0308 04:17:25.796952 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/808f8fcb-c2b2-4df7-9c7a-9838aaf744af-config-data-custom\") pod \"cinder-ff301-scheduler-0\" (UID: \"808f8fcb-c2b2-4df7-9c7a-9838aaf744af\") " pod="openstack/cinder-ff301-scheduler-0" Mar 08 04:17:25.800303 master-0 kubenswrapper[18592]: I0308 04:17:25.800238 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/808f8fcb-c2b2-4df7-9c7a-9838aaf744af-combined-ca-bundle\") pod \"cinder-ff301-scheduler-0\" (UID: \"808f8fcb-c2b2-4df7-9c7a-9838aaf744af\") " pod="openstack/cinder-ff301-scheduler-0" Mar 08 
04:17:25.801763 master-0 kubenswrapper[18592]: I0308 04:17:25.801061 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/808f8fcb-c2b2-4df7-9c7a-9838aaf744af-scripts\") pod \"cinder-ff301-scheduler-0\" (UID: \"808f8fcb-c2b2-4df7-9c7a-9838aaf744af\") " pod="openstack/cinder-ff301-scheduler-0" Mar 08 04:17:25.806437 master-0 kubenswrapper[18592]: I0308 04:17:25.803444 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-ff301-volume-lvm-iscsi-0"] Mar 08 04:17:25.820095 master-0 kubenswrapper[18592]: I0308 04:17:25.819767 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:25.829905 master-0 kubenswrapper[18592]: I0308 04:17:25.825228 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6977d9bcc9-rzdgq"] Mar 08 04:17:25.829905 master-0 kubenswrapper[18592]: I0308 04:17:25.827185 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6977d9bcc9-rzdgq" Mar 08 04:17:25.838766 master-0 kubenswrapper[18592]: I0308 04:17:25.838146 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvb9v\" (UniqueName: \"kubernetes.io/projected/808f8fcb-c2b2-4df7-9c7a-9838aaf744af-kube-api-access-rvb9v\") pod \"cinder-ff301-scheduler-0\" (UID: \"808f8fcb-c2b2-4df7-9c7a-9838aaf744af\") " pod="openstack/cinder-ff301-scheduler-0" Mar 08 04:17:25.838766 master-0 kubenswrapper[18592]: I0308 04:17:25.838297 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-ff301-volume-lvm-iscsi-config-data" Mar 08 04:17:25.852909 master-0 kubenswrapper[18592]: I0308 04:17:25.852537 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-ff301-volume-lvm-iscsi-0"] Mar 08 04:17:25.873549 master-0 kubenswrapper[18592]: I0308 04:17:25.873489 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6977d9bcc9-rzdgq"] Mar 08 04:17:25.885836 master-0 kubenswrapper[18592]: I0308 04:17:25.885766 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-ff301-backup-0"] Mar 08 04:17:25.887624 master-0 kubenswrapper[18592]: I0308 04:17:25.887581 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:25.893672 master-0 kubenswrapper[18592]: I0308 04:17:25.891256 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-ff301-backup-config-data" Mar 08 04:17:25.903726 master-0 kubenswrapper[18592]: I0308 04:17:25.903639 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-config-data\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:25.903726 master-0 kubenswrapper[18592]: I0308 04:17:25.903713 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-etc-nvme\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:25.903992 master-0 kubenswrapper[18592]: I0308 04:17:25.903739 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/374c1597-22de-4f77-a61a-6e72a503dfd0-config\") pod \"dnsmasq-dns-6977d9bcc9-rzdgq\" (UID: \"374c1597-22de-4f77-a61a-6e72a503dfd0\") " pod="openstack/dnsmasq-dns-6977d9bcc9-rzdgq" Mar 08 04:17:25.903992 master-0 kubenswrapper[18592]: I0308 04:17:25.903784 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-var-locks-cinder\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:25.903992 master-0 kubenswrapper[18592]: I0308 
04:17:25.903806 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/374c1597-22de-4f77-a61a-6e72a503dfd0-ovsdbserver-sb\") pod \"dnsmasq-dns-6977d9bcc9-rzdgq\" (UID: \"374c1597-22de-4f77-a61a-6e72a503dfd0\") " pod="openstack/dnsmasq-dns-6977d9bcc9-rzdgq" Mar 08 04:17:25.903992 master-0 kubenswrapper[18592]: I0308 04:17:25.903866 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-dev\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:25.903992 master-0 kubenswrapper[18592]: I0308 04:17:25.903927 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwvdn\" (UniqueName: \"kubernetes.io/projected/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-kube-api-access-gwvdn\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:25.904154 master-0 kubenswrapper[18592]: I0308 04:17:25.904033 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-run\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:25.904154 master-0 kubenswrapper[18592]: I0308 04:17:25.904058 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-lib-modules\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01\") " 
pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:25.904154 master-0 kubenswrapper[18592]: I0308 04:17:25.904075 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-config-data-custom\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:25.904154 master-0 kubenswrapper[18592]: I0308 04:17:25.904105 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-scripts\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:25.904877 master-0 kubenswrapper[18592]: I0308 04:17:25.904297 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-sys\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:25.904877 master-0 kubenswrapper[18592]: I0308 04:17:25.904383 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/374c1597-22de-4f77-a61a-6e72a503dfd0-ovsdbserver-nb\") pod \"dnsmasq-dns-6977d9bcc9-rzdgq\" (UID: \"374c1597-22de-4f77-a61a-6e72a503dfd0\") " pod="openstack/dnsmasq-dns-6977d9bcc9-rzdgq" Mar 08 04:17:25.904877 master-0 kubenswrapper[18592]: I0308 04:17:25.904447 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqz4j\" (UniqueName: 
\"kubernetes.io/projected/374c1597-22de-4f77-a61a-6e72a503dfd0-kube-api-access-xqz4j\") pod \"dnsmasq-dns-6977d9bcc9-rzdgq\" (UID: \"374c1597-22de-4f77-a61a-6e72a503dfd0\") " pod="openstack/dnsmasq-dns-6977d9bcc9-rzdgq" Mar 08 04:17:25.904877 master-0 kubenswrapper[18592]: I0308 04:17:25.904476 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-var-locks-brick\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:25.904877 master-0 kubenswrapper[18592]: I0308 04:17:25.904549 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-combined-ca-bundle\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:25.904877 master-0 kubenswrapper[18592]: I0308 04:17:25.904587 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-etc-machine-id\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:25.904877 master-0 kubenswrapper[18592]: I0308 04:17:25.904627 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-etc-iscsi\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:25.904877 master-0 kubenswrapper[18592]: I0308 
04:17:25.904684 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/374c1597-22de-4f77-a61a-6e72a503dfd0-dns-swift-storage-0\") pod \"dnsmasq-dns-6977d9bcc9-rzdgq\" (UID: \"374c1597-22de-4f77-a61a-6e72a503dfd0\") " pod="openstack/dnsmasq-dns-6977d9bcc9-rzdgq" Mar 08 04:17:25.904877 master-0 kubenswrapper[18592]: I0308 04:17:25.904713 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/374c1597-22de-4f77-a61a-6e72a503dfd0-dns-svc\") pod \"dnsmasq-dns-6977d9bcc9-rzdgq\" (UID: \"374c1597-22de-4f77-a61a-6e72a503dfd0\") " pod="openstack/dnsmasq-dns-6977d9bcc9-rzdgq" Mar 08 04:17:25.904877 master-0 kubenswrapper[18592]: I0308 04:17:25.904772 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-var-lib-cinder\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:25.943846 master-0 kubenswrapper[18592]: I0308 04:17:25.933415 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-ff301-backup-0"] Mar 08 04:17:25.978417 master-0 kubenswrapper[18592]: I0308 04:17:25.977006 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-ff301-api-0"] Mar 08 04:17:25.981463 master-0 kubenswrapper[18592]: I0308 04:17:25.978887 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-ff301-api-0" Mar 08 04:17:25.986873 master-0 kubenswrapper[18592]: I0308 04:17:25.981893 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-ff301-api-config-data" Mar 08 04:17:25.989446 master-0 kubenswrapper[18592]: I0308 04:17:25.989391 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-ff301-api-0"] Mar 08 04:17:26.007568 master-0 kubenswrapper[18592]: I0308 04:17:26.007524 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/0f0421ee-53ce-4160-abfc-a5968415005b-etc-iscsi\") pod \"cinder-ff301-backup-0\" (UID: \"0f0421ee-53ce-4160-abfc-a5968415005b\") " pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:26.007568 master-0 kubenswrapper[18592]: I0308 04:17:26.007568 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0f0421ee-53ce-4160-abfc-a5968415005b-sys\") pod \"cinder-ff301-backup-0\" (UID: \"0f0421ee-53ce-4160-abfc-a5968415005b\") " pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:26.007732 master-0 kubenswrapper[18592]: I0308 04:17:26.007600 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-sys\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:26.007732 master-0 kubenswrapper[18592]: I0308 04:17:26.007620 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/0f0421ee-53ce-4160-abfc-a5968415005b-dev\") pod \"cinder-ff301-backup-0\" (UID: \"0f0421ee-53ce-4160-abfc-a5968415005b\") " pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:26.007732 
master-0 kubenswrapper[18592]: I0308 04:17:26.007651 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/374c1597-22de-4f77-a61a-6e72a503dfd0-ovsdbserver-nb\") pod \"dnsmasq-dns-6977d9bcc9-rzdgq\" (UID: \"374c1597-22de-4f77-a61a-6e72a503dfd0\") " pod="openstack/dnsmasq-dns-6977d9bcc9-rzdgq" Mar 08 04:17:26.007732 master-0 kubenswrapper[18592]: I0308 04:17:26.007696 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f0421ee-53ce-4160-abfc-a5968415005b-config-data\") pod \"cinder-ff301-backup-0\" (UID: \"0f0421ee-53ce-4160-abfc-a5968415005b\") " pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:26.007732 master-0 kubenswrapper[18592]: I0308 04:17:26.007725 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqz4j\" (UniqueName: \"kubernetes.io/projected/374c1597-22de-4f77-a61a-6e72a503dfd0-kube-api-access-xqz4j\") pod \"dnsmasq-dns-6977d9bcc9-rzdgq\" (UID: \"374c1597-22de-4f77-a61a-6e72a503dfd0\") " pod="openstack/dnsmasq-dns-6977d9bcc9-rzdgq" Mar 08 04:17:26.007903 master-0 kubenswrapper[18592]: I0308 04:17:26.007741 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f0421ee-53ce-4160-abfc-a5968415005b-scripts\") pod \"cinder-ff301-backup-0\" (UID: \"0f0421ee-53ce-4160-abfc-a5968415005b\") " pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:26.007903 master-0 kubenswrapper[18592]: I0308 04:17:26.007755 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0f0421ee-53ce-4160-abfc-a5968415005b-lib-modules\") pod \"cinder-ff301-backup-0\" (UID: \"0f0421ee-53ce-4160-abfc-a5968415005b\") " pod="openstack/cinder-ff301-backup-0" Mar 08 
04:17:26.007903 master-0 kubenswrapper[18592]: I0308 04:17:26.007773 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-var-locks-brick\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:26.007903 master-0 kubenswrapper[18592]: I0308 04:17:26.007790 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-etc-machine-id\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:26.007903 master-0 kubenswrapper[18592]: I0308 04:17:26.007808 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-combined-ca-bundle\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:26.007903 master-0 kubenswrapper[18592]: I0308 04:17:26.007842 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-etc-iscsi\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:26.007903 master-0 kubenswrapper[18592]: I0308 04:17:26.007860 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/0f0421ee-53ce-4160-abfc-a5968415005b-var-lib-cinder\") pod \"cinder-ff301-backup-0\" (UID: \"0f0421ee-53ce-4160-abfc-a5968415005b\") " 
pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:26.008236 master-0 kubenswrapper[18592]: I0308 04:17:26.008206 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-etc-iscsi\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:26.008274 master-0 kubenswrapper[18592]: I0308 04:17:26.008244 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/374c1597-22de-4f77-a61a-6e72a503dfd0-dns-swift-storage-0\") pod \"dnsmasq-dns-6977d9bcc9-rzdgq\" (UID: \"374c1597-22de-4f77-a61a-6e72a503dfd0\") " pod="openstack/dnsmasq-dns-6977d9bcc9-rzdgq" Mar 08 04:17:26.008274 master-0 kubenswrapper[18592]: I0308 04:17:26.008267 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/374c1597-22de-4f77-a61a-6e72a503dfd0-dns-svc\") pod \"dnsmasq-dns-6977d9bcc9-rzdgq\" (UID: \"374c1597-22de-4f77-a61a-6e72a503dfd0\") " pod="openstack/dnsmasq-dns-6977d9bcc9-rzdgq" Mar 08 04:17:26.008343 master-0 kubenswrapper[18592]: I0308 04:17:26.008295 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-var-lib-cinder\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:26.008343 master-0 kubenswrapper[18592]: I0308 04:17:26.008311 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-config-data\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01\") " 
pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:26.008343 master-0 kubenswrapper[18592]: I0308 04:17:26.008327 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bf822\" (UniqueName: \"kubernetes.io/projected/0f0421ee-53ce-4160-abfc-a5968415005b-kube-api-access-bf822\") pod \"cinder-ff301-backup-0\" (UID: \"0f0421ee-53ce-4160-abfc-a5968415005b\") " pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:26.008432 master-0 kubenswrapper[18592]: I0308 04:17:26.008357 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-etc-nvme\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:26.008432 master-0 kubenswrapper[18592]: I0308 04:17:26.008375 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f0421ee-53ce-4160-abfc-a5968415005b-combined-ca-bundle\") pod \"cinder-ff301-backup-0\" (UID: \"0f0421ee-53ce-4160-abfc-a5968415005b\") " pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:26.008432 master-0 kubenswrapper[18592]: I0308 04:17:26.008393 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/374c1597-22de-4f77-a61a-6e72a503dfd0-config\") pod \"dnsmasq-dns-6977d9bcc9-rzdgq\" (UID: \"374c1597-22de-4f77-a61a-6e72a503dfd0\") " pod="openstack/dnsmasq-dns-6977d9bcc9-rzdgq" Mar 08 04:17:26.008432 master-0 kubenswrapper[18592]: I0308 04:17:26.008428 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/0f0421ee-53ce-4160-abfc-a5968415005b-etc-nvme\") pod \"cinder-ff301-backup-0\" (UID: 
\"0f0421ee-53ce-4160-abfc-a5968415005b\") " pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:26.008552 master-0 kubenswrapper[18592]: I0308 04:17:26.008512 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-etc-nvme\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:26.008552 master-0 kubenswrapper[18592]: I0308 04:17:26.008536 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-sys\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:26.008720 master-0 kubenswrapper[18592]: I0308 04:17:26.008690 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-var-locks-brick\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:26.008769 master-0 kubenswrapper[18592]: I0308 04:17:26.008725 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-etc-machine-id\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:26.009600 master-0 kubenswrapper[18592]: I0308 04:17:26.009566 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/374c1597-22de-4f77-a61a-6e72a503dfd0-ovsdbserver-nb\") pod \"dnsmasq-dns-6977d9bcc9-rzdgq\" (UID: 
\"374c1597-22de-4f77-a61a-6e72a503dfd0\") " pod="openstack/dnsmasq-dns-6977d9bcc9-rzdgq" Mar 08 04:17:26.009667 master-0 kubenswrapper[18592]: I0308 04:17:26.009649 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-var-locks-cinder\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:26.009735 master-0 kubenswrapper[18592]: I0308 04:17:26.009690 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/374c1597-22de-4f77-a61a-6e72a503dfd0-ovsdbserver-sb\") pod \"dnsmasq-dns-6977d9bcc9-rzdgq\" (UID: \"374c1597-22de-4f77-a61a-6e72a503dfd0\") " pod="openstack/dnsmasq-dns-6977d9bcc9-rzdgq" Mar 08 04:17:26.009856 master-0 kubenswrapper[18592]: I0308 04:17:26.009815 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-dev\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:26.009905 master-0 kubenswrapper[18592]: I0308 04:17:26.009867 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwvdn\" (UniqueName: \"kubernetes.io/projected/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-kube-api-access-gwvdn\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:26.010041 master-0 kubenswrapper[18592]: I0308 04:17:26.009973 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-run\") pod 
\"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:26.010217 master-0 kubenswrapper[18592]: I0308 04:17:26.010195 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-scripts\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:26.010269 master-0 kubenswrapper[18592]: I0308 04:17:26.010220 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-lib-modules\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:26.010307 master-0 kubenswrapper[18592]: I0308 04:17:26.010275 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-config-data-custom\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:26.010307 master-0 kubenswrapper[18592]: I0308 04:17:26.010299 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/0f0421ee-53ce-4160-abfc-a5968415005b-var-locks-brick\") pod \"cinder-ff301-backup-0\" (UID: \"0f0421ee-53ce-4160-abfc-a5968415005b\") " pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:26.010407 master-0 kubenswrapper[18592]: I0308 04:17:26.010385 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/0f0421ee-53ce-4160-abfc-a5968415005b-run\") pod \"cinder-ff301-backup-0\" (UID: \"0f0421ee-53ce-4160-abfc-a5968415005b\") " pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:26.010456 master-0 kubenswrapper[18592]: I0308 04:17:26.010432 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/0f0421ee-53ce-4160-abfc-a5968415005b-var-locks-cinder\") pod \"cinder-ff301-backup-0\" (UID: \"0f0421ee-53ce-4160-abfc-a5968415005b\") " pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:26.010490 master-0 kubenswrapper[18592]: I0308 04:17:26.010465 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0f0421ee-53ce-4160-abfc-a5968415005b-config-data-custom\") pod \"cinder-ff301-backup-0\" (UID: \"0f0421ee-53ce-4160-abfc-a5968415005b\") " pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:26.010526 master-0 kubenswrapper[18592]: I0308 04:17:26.010515 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0f0421ee-53ce-4160-abfc-a5968415005b-etc-machine-id\") pod \"cinder-ff301-backup-0\" (UID: \"0f0421ee-53ce-4160-abfc-a5968415005b\") " pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:26.012494 master-0 kubenswrapper[18592]: I0308 04:17:26.012447 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/374c1597-22de-4f77-a61a-6e72a503dfd0-dns-swift-storage-0\") pod \"dnsmasq-dns-6977d9bcc9-rzdgq\" (UID: \"374c1597-22de-4f77-a61a-6e72a503dfd0\") " pod="openstack/dnsmasq-dns-6977d9bcc9-rzdgq" Mar 08 04:17:26.013895 master-0 kubenswrapper[18592]: I0308 04:17:26.013269 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/374c1597-22de-4f77-a61a-6e72a503dfd0-dns-svc\") pod \"dnsmasq-dns-6977d9bcc9-rzdgq\" (UID: \"374c1597-22de-4f77-a61a-6e72a503dfd0\") " pod="openstack/dnsmasq-dns-6977d9bcc9-rzdgq" Mar 08 04:17:26.013895 master-0 kubenswrapper[18592]: I0308 04:17:26.013407 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-var-lib-cinder\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:26.013895 master-0 kubenswrapper[18592]: I0308 04:17:26.013580 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-combined-ca-bundle\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:26.014349 master-0 kubenswrapper[18592]: I0308 04:17:26.014323 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/374c1597-22de-4f77-a61a-6e72a503dfd0-config\") pod \"dnsmasq-dns-6977d9bcc9-rzdgq\" (UID: \"374c1597-22de-4f77-a61a-6e72a503dfd0\") " pod="openstack/dnsmasq-dns-6977d9bcc9-rzdgq" Mar 08 04:17:26.014930 master-0 kubenswrapper[18592]: I0308 04:17:26.014854 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-dev\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:26.014997 master-0 kubenswrapper[18592]: I0308 04:17:26.014926 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-lib-modules\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:26.015096 master-0 kubenswrapper[18592]: I0308 04:17:26.015071 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-var-locks-cinder\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:26.015702 master-0 kubenswrapper[18592]: I0308 04:17:26.015678 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/374c1597-22de-4f77-a61a-6e72a503dfd0-ovsdbserver-sb\") pod \"dnsmasq-dns-6977d9bcc9-rzdgq\" (UID: \"374c1597-22de-4f77-a61a-6e72a503dfd0\") " pod="openstack/dnsmasq-dns-6977d9bcc9-rzdgq" Mar 08 04:17:26.015948 master-0 kubenswrapper[18592]: I0308 04:17:26.015918 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-run\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:26.016419 master-0 kubenswrapper[18592]: I0308 04:17:26.016392 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-ff301-scheduler-0" Mar 08 04:17:26.040953 master-0 kubenswrapper[18592]: I0308 04:17:26.040907 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-config-data\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:26.041801 master-0 kubenswrapper[18592]: I0308 04:17:26.041731 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-config-data-custom\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:26.043464 master-0 kubenswrapper[18592]: I0308 04:17:26.043446 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-scripts\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:26.050658 master-0 kubenswrapper[18592]: I0308 04:17:26.050618 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwvdn\" (UniqueName: \"kubernetes.io/projected/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-kube-api-access-gwvdn\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:26.051160 master-0 kubenswrapper[18592]: I0308 04:17:26.051130 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqz4j\" (UniqueName: \"kubernetes.io/projected/374c1597-22de-4f77-a61a-6e72a503dfd0-kube-api-access-xqz4j\") pod \"dnsmasq-dns-6977d9bcc9-rzdgq\" (UID: 
\"374c1597-22de-4f77-a61a-6e72a503dfd0\") " pod="openstack/dnsmasq-dns-6977d9bcc9-rzdgq" Mar 08 04:17:26.080202 master-0 kubenswrapper[18592]: I0308 04:17:26.080146 18592 generic.go:334] "Generic (PLEG): container finished" podID="065d1362-4389-473a-9b59-4d3b25153deb" containerID="787f0cb3a3f0b8aa215b026c1fb9e972156a7c6a16f7b3daf1ed6837b295d474" exitCode=0 Mar 08 04:17:26.080202 master-0 kubenswrapper[18592]: I0308 04:17:26.080197 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d65fdd857-mz6wd" event={"ID":"065d1362-4389-473a-9b59-4d3b25153deb","Type":"ContainerDied","Data":"787f0cb3a3f0b8aa215b026c1fb9e972156a7c6a16f7b3daf1ed6837b295d474"} Mar 08 04:17:26.113544 master-0 kubenswrapper[18592]: I0308 04:17:26.113480 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f0421ee-53ce-4160-abfc-a5968415005b-config-data\") pod \"cinder-ff301-backup-0\" (UID: \"0f0421ee-53ce-4160-abfc-a5968415005b\") " pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:26.113544 master-0 kubenswrapper[18592]: I0308 04:17:26.113547 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4dde472-61ed-49eb-aa34-0addbba05d94-logs\") pod \"cinder-ff301-api-0\" (UID: \"c4dde472-61ed-49eb-aa34-0addbba05d94\") " pod="openstack/cinder-ff301-api-0" Mar 08 04:17:26.113760 master-0 kubenswrapper[18592]: I0308 04:17:26.113582 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f0421ee-53ce-4160-abfc-a5968415005b-scripts\") pod \"cinder-ff301-backup-0\" (UID: \"0f0421ee-53ce-4160-abfc-a5968415005b\") " pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:26.113760 master-0 kubenswrapper[18592]: I0308 04:17:26.113602 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" 
(UniqueName: \"kubernetes.io/host-path/0f0421ee-53ce-4160-abfc-a5968415005b-lib-modules\") pod \"cinder-ff301-backup-0\" (UID: \"0f0421ee-53ce-4160-abfc-a5968415005b\") " pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:26.113760 master-0 kubenswrapper[18592]: I0308 04:17:26.113619 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c4dde472-61ed-49eb-aa34-0addbba05d94-etc-machine-id\") pod \"cinder-ff301-api-0\" (UID: \"c4dde472-61ed-49eb-aa34-0addbba05d94\") " pod="openstack/cinder-ff301-api-0" Mar 08 04:17:26.113760 master-0 kubenswrapper[18592]: I0308 04:17:26.113651 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c4dde472-61ed-49eb-aa34-0addbba05d94-config-data-custom\") pod \"cinder-ff301-api-0\" (UID: \"c4dde472-61ed-49eb-aa34-0addbba05d94\") " pod="openstack/cinder-ff301-api-0" Mar 08 04:17:26.113760 master-0 kubenswrapper[18592]: I0308 04:17:26.113672 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/0f0421ee-53ce-4160-abfc-a5968415005b-var-lib-cinder\") pod \"cinder-ff301-backup-0\" (UID: \"0f0421ee-53ce-4160-abfc-a5968415005b\") " pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:26.113760 master-0 kubenswrapper[18592]: I0308 04:17:26.113751 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bf822\" (UniqueName: \"kubernetes.io/projected/0f0421ee-53ce-4160-abfc-a5968415005b-kube-api-access-bf822\") pod \"cinder-ff301-backup-0\" (UID: \"0f0421ee-53ce-4160-abfc-a5968415005b\") " pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:26.113993 master-0 kubenswrapper[18592]: I0308 04:17:26.113786 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/0f0421ee-53ce-4160-abfc-a5968415005b-combined-ca-bundle\") pod \"cinder-ff301-backup-0\" (UID: \"0f0421ee-53ce-4160-abfc-a5968415005b\") " pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:26.113993 master-0 kubenswrapper[18592]: I0308 04:17:26.113864 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/0f0421ee-53ce-4160-abfc-a5968415005b-etc-nvme\") pod \"cinder-ff301-backup-0\" (UID: \"0f0421ee-53ce-4160-abfc-a5968415005b\") " pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:26.113993 master-0 kubenswrapper[18592]: I0308 04:17:26.113889 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4dde472-61ed-49eb-aa34-0addbba05d94-scripts\") pod \"cinder-ff301-api-0\" (UID: \"c4dde472-61ed-49eb-aa34-0addbba05d94\") " pod="openstack/cinder-ff301-api-0" Mar 08 04:17:26.113993 master-0 kubenswrapper[18592]: I0308 04:17:26.113975 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/0f0421ee-53ce-4160-abfc-a5968415005b-var-locks-brick\") pod \"cinder-ff301-backup-0\" (UID: \"0f0421ee-53ce-4160-abfc-a5968415005b\") " pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:26.114115 master-0 kubenswrapper[18592]: I0308 04:17:26.114006 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0f0421ee-53ce-4160-abfc-a5968415005b-run\") pod \"cinder-ff301-backup-0\" (UID: \"0f0421ee-53ce-4160-abfc-a5968415005b\") " pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:26.114115 master-0 kubenswrapper[18592]: I0308 04:17:26.114024 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: 
\"kubernetes.io/host-path/0f0421ee-53ce-4160-abfc-a5968415005b-var-locks-cinder\") pod \"cinder-ff301-backup-0\" (UID: \"0f0421ee-53ce-4160-abfc-a5968415005b\") " pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:26.114115 master-0 kubenswrapper[18592]: I0308 04:17:26.114045 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0f0421ee-53ce-4160-abfc-a5968415005b-config-data-custom\") pod \"cinder-ff301-backup-0\" (UID: \"0f0421ee-53ce-4160-abfc-a5968415005b\") " pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:26.114115 master-0 kubenswrapper[18592]: I0308 04:17:26.114059 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4dde472-61ed-49eb-aa34-0addbba05d94-config-data\") pod \"cinder-ff301-api-0\" (UID: \"c4dde472-61ed-49eb-aa34-0addbba05d94\") " pod="openstack/cinder-ff301-api-0" Mar 08 04:17:26.114115 master-0 kubenswrapper[18592]: I0308 04:17:26.114078 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0f0421ee-53ce-4160-abfc-a5968415005b-etc-machine-id\") pod \"cinder-ff301-backup-0\" (UID: \"0f0421ee-53ce-4160-abfc-a5968415005b\") " pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:26.114115 master-0 kubenswrapper[18592]: I0308 04:17:26.114109 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/0f0421ee-53ce-4160-abfc-a5968415005b-etc-iscsi\") pod \"cinder-ff301-backup-0\" (UID: \"0f0421ee-53ce-4160-abfc-a5968415005b\") " pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:26.114277 master-0 kubenswrapper[18592]: I0308 04:17:26.114127 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/0f0421ee-53ce-4160-abfc-a5968415005b-sys\") pod \"cinder-ff301-backup-0\" (UID: \"0f0421ee-53ce-4160-abfc-a5968415005b\") " pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:26.114277 master-0 kubenswrapper[18592]: I0308 04:17:26.114153 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/0f0421ee-53ce-4160-abfc-a5968415005b-dev\") pod \"cinder-ff301-backup-0\" (UID: \"0f0421ee-53ce-4160-abfc-a5968415005b\") " pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:26.114277 master-0 kubenswrapper[18592]: I0308 04:17:26.114179 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8s8kw\" (UniqueName: \"kubernetes.io/projected/c4dde472-61ed-49eb-aa34-0addbba05d94-kube-api-access-8s8kw\") pod \"cinder-ff301-api-0\" (UID: \"c4dde472-61ed-49eb-aa34-0addbba05d94\") " pod="openstack/cinder-ff301-api-0" Mar 08 04:17:26.114277 master-0 kubenswrapper[18592]: I0308 04:17:26.114201 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4dde472-61ed-49eb-aa34-0addbba05d94-combined-ca-bundle\") pod \"cinder-ff301-api-0\" (UID: \"c4dde472-61ed-49eb-aa34-0addbba05d94\") " pod="openstack/cinder-ff301-api-0" Mar 08 04:17:26.115268 master-0 kubenswrapper[18592]: I0308 04:17:26.115228 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/0f0421ee-53ce-4160-abfc-a5968415005b-etc-nvme\") pod \"cinder-ff301-backup-0\" (UID: \"0f0421ee-53ce-4160-abfc-a5968415005b\") " pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:26.115435 master-0 kubenswrapper[18592]: I0308 04:17:26.115388 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: 
\"kubernetes.io/host-path/0f0421ee-53ce-4160-abfc-a5968415005b-var-locks-cinder\") pod \"cinder-ff301-backup-0\" (UID: \"0f0421ee-53ce-4160-abfc-a5968415005b\") " pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:26.116289 master-0 kubenswrapper[18592]: I0308 04:17:26.116254 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/0f0421ee-53ce-4160-abfc-a5968415005b-var-locks-brick\") pod \"cinder-ff301-backup-0\" (UID: \"0f0421ee-53ce-4160-abfc-a5968415005b\") " pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:26.116344 master-0 kubenswrapper[18592]: I0308 04:17:26.116288 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0f0421ee-53ce-4160-abfc-a5968415005b-run\") pod \"cinder-ff301-backup-0\" (UID: \"0f0421ee-53ce-4160-abfc-a5968415005b\") " pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:26.116344 master-0 kubenswrapper[18592]: I0308 04:17:26.116313 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/0f0421ee-53ce-4160-abfc-a5968415005b-etc-iscsi\") pod \"cinder-ff301-backup-0\" (UID: \"0f0421ee-53ce-4160-abfc-a5968415005b\") " pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:26.122992 master-0 kubenswrapper[18592]: I0308 04:17:26.119073 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0f0421ee-53ce-4160-abfc-a5968415005b-sys\") pod \"cinder-ff301-backup-0\" (UID: \"0f0421ee-53ce-4160-abfc-a5968415005b\") " pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:26.122992 master-0 kubenswrapper[18592]: I0308 04:17:26.119128 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/0f0421ee-53ce-4160-abfc-a5968415005b-dev\") pod \"cinder-ff301-backup-0\" (UID: \"0f0421ee-53ce-4160-abfc-a5968415005b\") " 
pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:26.122992 master-0 kubenswrapper[18592]: I0308 04:17:26.122552 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f0421ee-53ce-4160-abfc-a5968415005b-config-data\") pod \"cinder-ff301-backup-0\" (UID: \"0f0421ee-53ce-4160-abfc-a5968415005b\") " pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:26.122992 master-0 kubenswrapper[18592]: I0308 04:17:26.122726 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0f0421ee-53ce-4160-abfc-a5968415005b-etc-machine-id\") pod \"cinder-ff301-backup-0\" (UID: \"0f0421ee-53ce-4160-abfc-a5968415005b\") " pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:26.123343 master-0 kubenswrapper[18592]: I0308 04:17:26.123007 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/0f0421ee-53ce-4160-abfc-a5968415005b-var-lib-cinder\") pod \"cinder-ff301-backup-0\" (UID: \"0f0421ee-53ce-4160-abfc-a5968415005b\") " pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:26.125533 master-0 kubenswrapper[18592]: I0308 04:17:26.125501 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0f0421ee-53ce-4160-abfc-a5968415005b-lib-modules\") pod \"cinder-ff301-backup-0\" (UID: \"0f0421ee-53ce-4160-abfc-a5968415005b\") " pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:26.130345 master-0 kubenswrapper[18592]: I0308 04:17:26.130300 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0f0421ee-53ce-4160-abfc-a5968415005b-config-data-custom\") pod \"cinder-ff301-backup-0\" (UID: \"0f0421ee-53ce-4160-abfc-a5968415005b\") " pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:26.135703 master-0 kubenswrapper[18592]: I0308 
04:17:26.135569 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f0421ee-53ce-4160-abfc-a5968415005b-combined-ca-bundle\") pod \"cinder-ff301-backup-0\" (UID: \"0f0421ee-53ce-4160-abfc-a5968415005b\") " pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:26.153005 master-0 kubenswrapper[18592]: I0308 04:17:26.152793 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bf822\" (UniqueName: \"kubernetes.io/projected/0f0421ee-53ce-4160-abfc-a5968415005b-kube-api-access-bf822\") pod \"cinder-ff301-backup-0\" (UID: \"0f0421ee-53ce-4160-abfc-a5968415005b\") " pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:26.153632 master-0 kubenswrapper[18592]: I0308 04:17:26.153411 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f0421ee-53ce-4160-abfc-a5968415005b-scripts\") pod \"cinder-ff301-backup-0\" (UID: \"0f0421ee-53ce-4160-abfc-a5968415005b\") " pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:26.223001 master-0 kubenswrapper[18592]: I0308 04:17:26.216025 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4dde472-61ed-49eb-aa34-0addbba05d94-scripts\") pod \"cinder-ff301-api-0\" (UID: \"c4dde472-61ed-49eb-aa34-0addbba05d94\") " pod="openstack/cinder-ff301-api-0" Mar 08 04:17:26.223001 master-0 kubenswrapper[18592]: I0308 04:17:26.220721 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4dde472-61ed-49eb-aa34-0addbba05d94-config-data\") pod \"cinder-ff301-api-0\" (UID: \"c4dde472-61ed-49eb-aa34-0addbba05d94\") " pod="openstack/cinder-ff301-api-0" Mar 08 04:17:26.223001 master-0 kubenswrapper[18592]: I0308 04:17:26.220858 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-8s8kw\" (UniqueName: \"kubernetes.io/projected/c4dde472-61ed-49eb-aa34-0addbba05d94-kube-api-access-8s8kw\") pod \"cinder-ff301-api-0\" (UID: \"c4dde472-61ed-49eb-aa34-0addbba05d94\") " pod="openstack/cinder-ff301-api-0" Mar 08 04:17:26.223001 master-0 kubenswrapper[18592]: I0308 04:17:26.220890 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4dde472-61ed-49eb-aa34-0addbba05d94-combined-ca-bundle\") pod \"cinder-ff301-api-0\" (UID: \"c4dde472-61ed-49eb-aa34-0addbba05d94\") " pod="openstack/cinder-ff301-api-0" Mar 08 04:17:26.224428 master-0 kubenswrapper[18592]: I0308 04:17:26.224327 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4dde472-61ed-49eb-aa34-0addbba05d94-logs\") pod \"cinder-ff301-api-0\" (UID: \"c4dde472-61ed-49eb-aa34-0addbba05d94\") " pod="openstack/cinder-ff301-api-0" Mar 08 04:17:26.224493 master-0 kubenswrapper[18592]: I0308 04:17:26.224438 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c4dde472-61ed-49eb-aa34-0addbba05d94-etc-machine-id\") pod \"cinder-ff301-api-0\" (UID: \"c4dde472-61ed-49eb-aa34-0addbba05d94\") " pod="openstack/cinder-ff301-api-0" Mar 08 04:17:26.224493 master-0 kubenswrapper[18592]: I0308 04:17:26.224476 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c4dde472-61ed-49eb-aa34-0addbba05d94-config-data-custom\") pod \"cinder-ff301-api-0\" (UID: \"c4dde472-61ed-49eb-aa34-0addbba05d94\") " pod="openstack/cinder-ff301-api-0" Mar 08 04:17:26.240628 master-0 kubenswrapper[18592]: I0308 04:17:26.225227 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/c4dde472-61ed-49eb-aa34-0addbba05d94-etc-machine-id\") pod \"cinder-ff301-api-0\" (UID: \"c4dde472-61ed-49eb-aa34-0addbba05d94\") " pod="openstack/cinder-ff301-api-0" Mar 08 04:17:26.240628 master-0 kubenswrapper[18592]: I0308 04:17:26.226101 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4dde472-61ed-49eb-aa34-0addbba05d94-scripts\") pod \"cinder-ff301-api-0\" (UID: \"c4dde472-61ed-49eb-aa34-0addbba05d94\") " pod="openstack/cinder-ff301-api-0" Mar 08 04:17:26.240628 master-0 kubenswrapper[18592]: I0308 04:17:26.226186 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4dde472-61ed-49eb-aa34-0addbba05d94-logs\") pod \"cinder-ff301-api-0\" (UID: \"c4dde472-61ed-49eb-aa34-0addbba05d94\") " pod="openstack/cinder-ff301-api-0" Mar 08 04:17:26.240628 master-0 kubenswrapper[18592]: I0308 04:17:26.227498 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4dde472-61ed-49eb-aa34-0addbba05d94-combined-ca-bundle\") pod \"cinder-ff301-api-0\" (UID: \"c4dde472-61ed-49eb-aa34-0addbba05d94\") " pod="openstack/cinder-ff301-api-0" Mar 08 04:17:26.240628 master-0 kubenswrapper[18592]: I0308 04:17:26.229162 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4dde472-61ed-49eb-aa34-0addbba05d94-config-data\") pod \"cinder-ff301-api-0\" (UID: \"c4dde472-61ed-49eb-aa34-0addbba05d94\") " pod="openstack/cinder-ff301-api-0" Mar 08 04:17:26.240628 master-0 kubenswrapper[18592]: I0308 04:17:26.231772 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c4dde472-61ed-49eb-aa34-0addbba05d94-config-data-custom\") pod \"cinder-ff301-api-0\" (UID: \"c4dde472-61ed-49eb-aa34-0addbba05d94\") " 
pod="openstack/cinder-ff301-api-0" Mar 08 04:17:26.259848 master-0 kubenswrapper[18592]: I0308 04:17:26.249314 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8s8kw\" (UniqueName: \"kubernetes.io/projected/c4dde472-61ed-49eb-aa34-0addbba05d94-kube-api-access-8s8kw\") pod \"cinder-ff301-api-0\" (UID: \"c4dde472-61ed-49eb-aa34-0addbba05d94\") " pod="openstack/cinder-ff301-api-0" Mar 08 04:17:26.293409 master-0 kubenswrapper[18592]: I0308 04:17:26.293352 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:26.307315 master-0 kubenswrapper[18592]: I0308 04:17:26.307253 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6977d9bcc9-rzdgq" Mar 08 04:17:26.357613 master-0 kubenswrapper[18592]: I0308 04:17:26.357574 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d65fdd857-mz6wd" Mar 08 04:17:26.431843 master-0 kubenswrapper[18592]: I0308 04:17:26.429726 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:26.444868 master-0 kubenswrapper[18592]: I0308 04:17:26.443588 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/065d1362-4389-473a-9b59-4d3b25153deb-ovsdbserver-nb\") pod \"065d1362-4389-473a-9b59-4d3b25153deb\" (UID: \"065d1362-4389-473a-9b59-4d3b25153deb\") " Mar 08 04:17:26.444868 master-0 kubenswrapper[18592]: I0308 04:17:26.443676 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/065d1362-4389-473a-9b59-4d3b25153deb-dns-svc\") pod \"065d1362-4389-473a-9b59-4d3b25153deb\" (UID: \"065d1362-4389-473a-9b59-4d3b25153deb\") " Mar 08 04:17:26.444868 master-0 kubenswrapper[18592]: I0308 04:17:26.443744 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/065d1362-4389-473a-9b59-4d3b25153deb-ovsdbserver-sb\") pod \"065d1362-4389-473a-9b59-4d3b25153deb\" (UID: \"065d1362-4389-473a-9b59-4d3b25153deb\") " Mar 08 04:17:26.444868 master-0 kubenswrapper[18592]: I0308 04:17:26.443764 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26xxg\" (UniqueName: \"kubernetes.io/projected/065d1362-4389-473a-9b59-4d3b25153deb-kube-api-access-26xxg\") pod \"065d1362-4389-473a-9b59-4d3b25153deb\" (UID: \"065d1362-4389-473a-9b59-4d3b25153deb\") " Mar 08 04:17:26.444868 master-0 kubenswrapper[18592]: I0308 04:17:26.443800 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/065d1362-4389-473a-9b59-4d3b25153deb-config\") pod \"065d1362-4389-473a-9b59-4d3b25153deb\" (UID: \"065d1362-4389-473a-9b59-4d3b25153deb\") " Mar 08 04:17:26.444868 master-0 kubenswrapper[18592]: I0308 04:17:26.443846 18592 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/065d1362-4389-473a-9b59-4d3b25153deb-dns-swift-storage-0\") pod \"065d1362-4389-473a-9b59-4d3b25153deb\" (UID: \"065d1362-4389-473a-9b59-4d3b25153deb\") " Mar 08 04:17:26.465855 master-0 kubenswrapper[18592]: I0308 04:17:26.463399 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-ff301-api-0" Mar 08 04:17:26.500284 master-0 kubenswrapper[18592]: I0308 04:17:26.499306 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/065d1362-4389-473a-9b59-4d3b25153deb-kube-api-access-26xxg" (OuterVolumeSpecName: "kube-api-access-26xxg") pod "065d1362-4389-473a-9b59-4d3b25153deb" (UID: "065d1362-4389-473a-9b59-4d3b25153deb"). InnerVolumeSpecName "kube-api-access-26xxg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 04:17:26.561091 master-0 kubenswrapper[18592]: I0308 04:17:26.546281 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26xxg\" (UniqueName: \"kubernetes.io/projected/065d1362-4389-473a-9b59-4d3b25153deb-kube-api-access-26xxg\") on node \"master-0\" DevicePath \"\"" Mar 08 04:17:26.635838 master-0 kubenswrapper[18592]: I0308 04:17:26.633101 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/065d1362-4389-473a-9b59-4d3b25153deb-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "065d1362-4389-473a-9b59-4d3b25153deb" (UID: "065d1362-4389-473a-9b59-4d3b25153deb"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:17:26.668885 master-0 kubenswrapper[18592]: I0308 04:17:26.657443 18592 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/065d1362-4389-473a-9b59-4d3b25153deb-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Mar 08 04:17:26.724664 master-0 kubenswrapper[18592]: I0308 04:17:26.720768 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-ff301-scheduler-0"] Mar 08 04:17:26.767897 master-0 kubenswrapper[18592]: I0308 04:17:26.767090 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/065d1362-4389-473a-9b59-4d3b25153deb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "065d1362-4389-473a-9b59-4d3b25153deb" (UID: "065d1362-4389-473a-9b59-4d3b25153deb"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:17:26.778273 master-0 kubenswrapper[18592]: I0308 04:17:26.774815 18592 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 04:17:26.784002 master-0 kubenswrapper[18592]: I0308 04:17:26.783964 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/065d1362-4389-473a-9b59-4d3b25153deb-config" (OuterVolumeSpecName: "config") pod "065d1362-4389-473a-9b59-4d3b25153deb" (UID: "065d1362-4389-473a-9b59-4d3b25153deb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:17:26.805232 master-0 kubenswrapper[18592]: I0308 04:17:26.802363 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/065d1362-4389-473a-9b59-4d3b25153deb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "065d1362-4389-473a-9b59-4d3b25153deb" (UID: "065d1362-4389-473a-9b59-4d3b25153deb"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 04:17:26.840381 master-0 kubenswrapper[18592]: I0308 04:17:26.840319 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/065d1362-4389-473a-9b59-4d3b25153deb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "065d1362-4389-473a-9b59-4d3b25153deb" (UID: "065d1362-4389-473a-9b59-4d3b25153deb"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 04:17:26.868596 master-0 kubenswrapper[18592]: I0308 04:17:26.868223 18592 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/065d1362-4389-473a-9b59-4d3b25153deb-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\""
Mar 08 04:17:26.868596 master-0 kubenswrapper[18592]: I0308 04:17:26.868266 18592 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/065d1362-4389-473a-9b59-4d3b25153deb-dns-svc\") on node \"master-0\" DevicePath \"\""
Mar 08 04:17:26.868596 master-0 kubenswrapper[18592]: I0308 04:17:26.868275 18592 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/065d1362-4389-473a-9b59-4d3b25153deb-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\""
Mar 08 04:17:26.868596 master-0 kubenswrapper[18592]: I0308 04:17:26.868283 18592 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/065d1362-4389-473a-9b59-4d3b25153deb-config\") on node \"master-0\" DevicePath \"\""
Mar 08 04:17:27.049970 master-0 kubenswrapper[18592]: I0308 04:17:27.032684 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-ff301-volume-lvm-iscsi-0"]
Mar 08 04:17:27.117099 master-0 kubenswrapper[18592]: I0308 04:17:27.115007 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ff301-volume-lvm-iscsi-0" event={"ID":"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01","Type":"ContainerStarted","Data":"f743e8645bdb4996ccedcac0cbe2af7c6001c7fafc0468d8e2082f085338ee8a"}
Mar 08 04:17:27.117099 master-0 kubenswrapper[18592]: I0308 04:17:27.116345 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d65fdd857-mz6wd" event={"ID":"065d1362-4389-473a-9b59-4d3b25153deb","Type":"ContainerDied","Data":"37ffe400256714d2ed75f61632704eb63923874e1911f9d746a455a7c18a7cb2"}
Mar 08 04:17:27.117099 master-0 kubenswrapper[18592]: I0308 04:17:27.116372 18592 scope.go:117] "RemoveContainer" containerID="787f0cb3a3f0b8aa215b026c1fb9e972156a7c6a16f7b3daf1ed6837b295d474"
Mar 08 04:17:27.117099 master-0 kubenswrapper[18592]: I0308 04:17:27.116480 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d65fdd857-mz6wd"
Mar 08 04:17:27.134891 master-0 kubenswrapper[18592]: I0308 04:17:27.131797 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ff301-scheduler-0" event={"ID":"808f8fcb-c2b2-4df7-9c7a-9838aaf744af","Type":"ContainerStarted","Data":"18f7e2063e3936084eaca1b17f02965ef79317b24466e0686f4498c2cc7dbacc"}
Mar 08 04:17:27.180693 master-0 kubenswrapper[18592]: I0308 04:17:27.180605 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6977d9bcc9-rzdgq"]
Mar 08 04:17:27.181782 master-0 kubenswrapper[18592]: I0308 04:17:27.181579 18592 scope.go:117] "RemoveContainer" containerID="26bb6bf3b613140928de4297065d88e9ca8ca8bc54b20972fe0bbf1c366abc85"
Mar 08 04:17:27.199872 master-0 kubenswrapper[18592]: I0308 04:17:27.199594 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d65fdd857-mz6wd"]
Mar 08 04:17:27.210137 master-0 kubenswrapper[18592]: I0308 04:17:27.210097 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7d65fdd857-mz6wd"]
Mar 08 04:17:27.268727 master-0 kubenswrapper[18592]: I0308 04:17:27.268677 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-ff301-api-0"]
Mar 08 04:17:27.502621 master-0 kubenswrapper[18592]: I0308 04:17:27.502404 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-ff301-backup-0"]
Mar 08 04:17:27.527172 master-0 kubenswrapper[18592]: W0308 04:17:27.526442 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f0421ee_53ce_4160_abfc_a5968415005b.slice/crio-740342c0d64c7138e970900d47e4d32f154103cedb519501c4826b27b35dc15e WatchSource:0}: Error finding container 740342c0d64c7138e970900d47e4d32f154103cedb519501c4826b27b35dc15e: Status 404 returned error can't find the container with id 740342c0d64c7138e970900d47e4d32f154103cedb519501c4826b27b35dc15e
Mar 08 04:17:28.160616 master-0 kubenswrapper[18592]: I0308 04:17:28.160235 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="065d1362-4389-473a-9b59-4d3b25153deb" path="/var/lib/kubelet/pods/065d1362-4389-473a-9b59-4d3b25153deb/volumes"
Mar 08 04:17:28.172798 master-0 kubenswrapper[18592]: I0308 04:17:28.166613 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ff301-api-0" event={"ID":"c4dde472-61ed-49eb-aa34-0addbba05d94","Type":"ContainerStarted","Data":"a139db48489a58eb26371d9b815d217b5366fbff77d3c77ff9b8e2feabd3a84d"}
Mar 08 04:17:28.172798 master-0 kubenswrapper[18592]: I0308 04:17:28.166653 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ff301-api-0" event={"ID":"c4dde472-61ed-49eb-aa34-0addbba05d94","Type":"ContainerStarted","Data":"4de883371e6530231daf13ab4069477c6bd9a0c8628952b090f46d31a4cadd84"}
Mar 08 04:17:28.174929 master-0 kubenswrapper[18592]: I0308 04:17:28.174415 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ff301-volume-lvm-iscsi-0" event={"ID":"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01","Type":"ContainerStarted","Data":"026138a76b3039bbe26925baaeac4b9e1f51172480fae9c8f1c644a11823150b"}
Mar 08 04:17:28.177059 master-0 kubenswrapper[18592]: I0308 04:17:28.177017 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ff301-backup-0" event={"ID":"0f0421ee-53ce-4160-abfc-a5968415005b","Type":"ContainerStarted","Data":"740342c0d64c7138e970900d47e4d32f154103cedb519501c4826b27b35dc15e"}
Mar 08 04:17:28.197168 master-0 kubenswrapper[18592]: I0308 04:17:28.192439 18592 generic.go:334] "Generic (PLEG): container finished" podID="374c1597-22de-4f77-a61a-6e72a503dfd0" containerID="2bd972438fceb0a22512a8429c86a34d80d2769a29d2da113c93266de3f8281c" exitCode=0
Mar 08 04:17:28.197168 master-0 kubenswrapper[18592]: I0308 04:17:28.192482 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6977d9bcc9-rzdgq" event={"ID":"374c1597-22de-4f77-a61a-6e72a503dfd0","Type":"ContainerDied","Data":"2bd972438fceb0a22512a8429c86a34d80d2769a29d2da113c93266de3f8281c"}
Mar 08 04:17:28.197168 master-0 kubenswrapper[18592]: I0308 04:17:28.192509 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6977d9bcc9-rzdgq" event={"ID":"374c1597-22de-4f77-a61a-6e72a503dfd0","Type":"ContainerStarted","Data":"ccd84b97dcf8a5ecb9bf9079360e84cf46df9db2a68d8c74f85c4e2a2e93c1f2"}
Mar 08 04:17:28.825846 master-0 kubenswrapper[18592]: I0308 04:17:28.818806 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-ff301-api-0"]
Mar 08 04:17:29.229194 master-0 kubenswrapper[18592]: I0308 04:17:29.229142 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ff301-backup-0" event={"ID":"0f0421ee-53ce-4160-abfc-a5968415005b","Type":"ContainerStarted","Data":"12199ca2ce66a98f608e1da86a561392f75037c0b84ea80ae27c5022a0692d96"}
Mar 08 04:17:29.229194 master-0 kubenswrapper[18592]: I0308 04:17:29.229195 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ff301-backup-0" event={"ID":"0f0421ee-53ce-4160-abfc-a5968415005b","Type":"ContainerStarted","Data":"0fcdce70899a7ba893c9e7446288886f7e98ce6c222494e6a125caa9b136ecaf"}
Mar 08 04:17:29.250995 master-0 kubenswrapper[18592]: I0308 04:17:29.250819 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ff301-scheduler-0" event={"ID":"808f8fcb-c2b2-4df7-9c7a-9838aaf744af","Type":"ContainerStarted","Data":"3eb0f5404cad7df753b609a49341dcfedcd0f8ba7eba964052e8a5ff5d83c94d"}
Mar 08 04:17:29.253746 master-0 kubenswrapper[18592]: I0308 04:17:29.252784 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6977d9bcc9-rzdgq" event={"ID":"374c1597-22de-4f77-a61a-6e72a503dfd0","Type":"ContainerStarted","Data":"ffeb908bf05f96742f97bac99a4cde74e1a36304c0e36a7812b860bcd15de7b6"}
Mar 08 04:17:29.253746 master-0 kubenswrapper[18592]: I0308 04:17:29.253705 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6977d9bcc9-rzdgq"
Mar 08 04:17:29.265595 master-0 kubenswrapper[18592]: I0308 04:17:29.265546 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-ff301-backup-0" podStartSLOduration=3.354401265 podStartE2EDuration="4.265525684s" podCreationTimestamp="2026-03-08 04:17:25 +0000 UTC" firstStartedPulling="2026-03-08 04:17:27.531937686 +0000 UTC m=+1459.630692036" lastFinishedPulling="2026-03-08 04:17:28.443062105 +0000 UTC m=+1460.541816455" observedRunningTime="2026-03-08 04:17:29.257580118 +0000 UTC m=+1461.356334468" watchObservedRunningTime="2026-03-08 04:17:29.265525684 +0000 UTC m=+1461.364280034"
Mar 08 04:17:29.276100 master-0 kubenswrapper[18592]: I0308 04:17:29.276053 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ff301-api-0" event={"ID":"c4dde472-61ed-49eb-aa34-0addbba05d94","Type":"ContainerStarted","Data":"0d4de652b4f62c68bbd94344ea7a6f0f182d6c5f80ef08f93bb5fdf5b7df6533"}
Mar 08 04:17:29.276350 master-0 kubenswrapper[18592]: I0308 04:17:29.276243 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-ff301-api-0" podUID="c4dde472-61ed-49eb-aa34-0addbba05d94" containerName="cinder-ff301-api-log" containerID="cri-o://a139db48489a58eb26371d9b815d217b5366fbff77d3c77ff9b8e2feabd3a84d" gracePeriod=30
Mar 08 04:17:29.276501 master-0 kubenswrapper[18592]: I0308 04:17:29.276486 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-ff301-api-0"
Mar 08 04:17:29.276546 master-0 kubenswrapper[18592]: I0308 04:17:29.276523 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-ff301-api-0" podUID="c4dde472-61ed-49eb-aa34-0addbba05d94" containerName="cinder-api" containerID="cri-o://0d4de652b4f62c68bbd94344ea7a6f0f182d6c5f80ef08f93bb5fdf5b7df6533" gracePeriod=30
Mar 08 04:17:29.298415 master-0 kubenswrapper[18592]: I0308 04:17:29.298040 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ff301-volume-lvm-iscsi-0" event={"ID":"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01","Type":"ContainerStarted","Data":"98ef436f086a681790db38099cf9de7225c61f5dcbfdb3938dd8fddf0c0c5b68"}
Mar 08 04:17:29.300678 master-0 kubenswrapper[18592]: I0308 04:17:29.300611 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6977d9bcc9-rzdgq" podStartSLOduration=4.300593246 podStartE2EDuration="4.300593246s" podCreationTimestamp="2026-03-08 04:17:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 04:17:29.285436035 +0000 UTC m=+1461.384190385" watchObservedRunningTime="2026-03-08 04:17:29.300593246 +0000 UTC m=+1461.399347606"
Mar 08 04:17:29.323910 master-0 kubenswrapper[18592]: I0308 04:17:29.321136 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-ff301-api-0" podStartSLOduration=4.321117855 podStartE2EDuration="4.321117855s" podCreationTimestamp="2026-03-08 04:17:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 04:17:29.311214915 +0000 UTC m=+1461.409969265" watchObservedRunningTime="2026-03-08 04:17:29.321117855 +0000 UTC m=+1461.419872205"
Mar 08 04:17:29.360991 master-0 kubenswrapper[18592]: I0308 04:17:29.359046 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-ff301-volume-lvm-iscsi-0" podStartSLOduration=3.589989938 podStartE2EDuration="4.359012824s" podCreationTimestamp="2026-03-08 04:17:25 +0000 UTC" firstStartedPulling="2026-03-08 04:17:27.072119302 +0000 UTC m=+1459.170873652" lastFinishedPulling="2026-03-08 04:17:27.841142188 +0000 UTC m=+1459.939896538" observedRunningTime="2026-03-08 04:17:29.351624543 +0000 UTC m=+1461.450378893" watchObservedRunningTime="2026-03-08 04:17:29.359012824 +0000 UTC m=+1461.457767174"
Mar 08 04:17:30.309911 master-0 kubenswrapper[18592]: I0308 04:17:30.309672 18592 generic.go:334] "Generic (PLEG): container finished" podID="c4dde472-61ed-49eb-aa34-0addbba05d94" containerID="a139db48489a58eb26371d9b815d217b5366fbff77d3c77ff9b8e2feabd3a84d" exitCode=143
Mar 08 04:17:30.309911 master-0 kubenswrapper[18592]: I0308 04:17:30.309748 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ff301-api-0" event={"ID":"c4dde472-61ed-49eb-aa34-0addbba05d94","Type":"ContainerDied","Data":"a139db48489a58eb26371d9b815d217b5366fbff77d3c77ff9b8e2feabd3a84d"}
Mar 08 04:17:30.312779 master-0 kubenswrapper[18592]: I0308 04:17:30.311627 18592 generic.go:334] "Generic (PLEG): container finished" podID="c11a4533-a895-42a7-8c17-d6f421276ae0" containerID="490068217b99e1b9eb5597b0547cfb888c0fa4b0491eeaa5a5cf7358d93d627e" exitCode=0
Mar 08 04:17:30.312779 master-0 kubenswrapper[18592]: I0308 04:17:30.311694 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-sync-78kcs" event={"ID":"c11a4533-a895-42a7-8c17-d6f421276ae0","Type":"ContainerDied","Data":"490068217b99e1b9eb5597b0547cfb888c0fa4b0491eeaa5a5cf7358d93d627e"}
Mar 08 04:17:30.322852 master-0 kubenswrapper[18592]: I0308 04:17:30.322769 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ff301-scheduler-0" event={"ID":"808f8fcb-c2b2-4df7-9c7a-9838aaf744af","Type":"ContainerStarted","Data":"18b944393cbbd8944c868595b59690bf821e5a692d25a4b47311ab6a5400e2d5"}
Mar 08 04:17:30.359462 master-0 kubenswrapper[18592]: I0308 04:17:30.355552 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-ff301-scheduler-0" podStartSLOduration=4.293617908 podStartE2EDuration="5.355522113s" podCreationTimestamp="2026-03-08 04:17:25 +0000 UTC" firstStartedPulling="2026-03-08 04:17:26.774752862 +0000 UTC m=+1458.873507212" lastFinishedPulling="2026-03-08 04:17:27.836657067 +0000 UTC m=+1459.935411417" observedRunningTime="2026-03-08 04:17:30.354768652 +0000 UTC m=+1462.453523002" watchObservedRunningTime="2026-03-08 04:17:30.355522113 +0000 UTC m=+1462.454276503"
Mar 08 04:17:31.044578 master-0 kubenswrapper[18592]: I0308 04:17:31.044482 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-ff301-scheduler-0"
Mar 08 04:17:31.294559 master-0 kubenswrapper[18592]: I0308 04:17:31.294496 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-ff301-volume-lvm-iscsi-0"
Mar 08 04:17:31.430862 master-0 kubenswrapper[18592]: I0308 04:17:31.430387 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-ff301-backup-0"
Mar 08 04:17:31.800250 master-0 kubenswrapper[18592]: I0308 04:17:31.800207 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-db-sync-78kcs"
Mar 08 04:17:31.926299 master-0 kubenswrapper[18592]: I0308 04:17:31.926187 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c11a4533-a895-42a7-8c17-d6f421276ae0-combined-ca-bundle\") pod \"c11a4533-a895-42a7-8c17-d6f421276ae0\" (UID: \"c11a4533-a895-42a7-8c17-d6f421276ae0\") "
Mar 08 04:17:31.926299 master-0 kubenswrapper[18592]: I0308 04:17:31.926305 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c11a4533-a895-42a7-8c17-d6f421276ae0-scripts\") pod \"c11a4533-a895-42a7-8c17-d6f421276ae0\" (UID: \"c11a4533-a895-42a7-8c17-d6f421276ae0\") "
Mar 08 04:17:31.926643 master-0 kubenswrapper[18592]: I0308 04:17:31.926603 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c11a4533-a895-42a7-8c17-d6f421276ae0-config-data\") pod \"c11a4533-a895-42a7-8c17-d6f421276ae0\" (UID: \"c11a4533-a895-42a7-8c17-d6f421276ae0\") "
Mar 08 04:17:31.926691 master-0 kubenswrapper[18592]: I0308 04:17:31.926679 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rcph\" (UniqueName: \"kubernetes.io/projected/c11a4533-a895-42a7-8c17-d6f421276ae0-kube-api-access-6rcph\") pod \"c11a4533-a895-42a7-8c17-d6f421276ae0\" (UID: \"c11a4533-a895-42a7-8c17-d6f421276ae0\") "
Mar 08 04:17:31.926951 master-0 kubenswrapper[18592]: I0308 04:17:31.926930 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/c11a4533-a895-42a7-8c17-d6f421276ae0-etc-podinfo\") pod \"c11a4533-a895-42a7-8c17-d6f421276ae0\" (UID: \"c11a4533-a895-42a7-8c17-d6f421276ae0\") "
Mar 08 04:17:31.927016 master-0 kubenswrapper[18592]: I0308 04:17:31.926960 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/c11a4533-a895-42a7-8c17-d6f421276ae0-config-data-merged\") pod \"c11a4533-a895-42a7-8c17-d6f421276ae0\" (UID: \"c11a4533-a895-42a7-8c17-d6f421276ae0\") "
Mar 08 04:17:31.930074 master-0 kubenswrapper[18592]: I0308 04:17:31.928289 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c11a4533-a895-42a7-8c17-d6f421276ae0-config-data-merged" (OuterVolumeSpecName: "config-data-merged") pod "c11a4533-a895-42a7-8c17-d6f421276ae0" (UID: "c11a4533-a895-42a7-8c17-d6f421276ae0"). InnerVolumeSpecName "config-data-merged". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 04:17:31.931908 master-0 kubenswrapper[18592]: I0308 04:17:31.931865 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c11a4533-a895-42a7-8c17-d6f421276ae0-kube-api-access-6rcph" (OuterVolumeSpecName: "kube-api-access-6rcph") pod "c11a4533-a895-42a7-8c17-d6f421276ae0" (UID: "c11a4533-a895-42a7-8c17-d6f421276ae0"). InnerVolumeSpecName "kube-api-access-6rcph". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 04:17:31.942983 master-0 kubenswrapper[18592]: I0308 04:17:31.942938 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/c11a4533-a895-42a7-8c17-d6f421276ae0-etc-podinfo" (OuterVolumeSpecName: "etc-podinfo") pod "c11a4533-a895-42a7-8c17-d6f421276ae0" (UID: "c11a4533-a895-42a7-8c17-d6f421276ae0"). InnerVolumeSpecName "etc-podinfo". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Mar 08 04:17:31.945442 master-0 kubenswrapper[18592]: I0308 04:17:31.945380 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c11a4533-a895-42a7-8c17-d6f421276ae0-scripts" (OuterVolumeSpecName: "scripts") pod "c11a4533-a895-42a7-8c17-d6f421276ae0" (UID: "c11a4533-a895-42a7-8c17-d6f421276ae0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 04:17:31.961089 master-0 kubenswrapper[18592]: I0308 04:17:31.961027 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c11a4533-a895-42a7-8c17-d6f421276ae0-config-data" (OuterVolumeSpecName: "config-data") pod "c11a4533-a895-42a7-8c17-d6f421276ae0" (UID: "c11a4533-a895-42a7-8c17-d6f421276ae0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 04:17:31.994312 master-0 kubenswrapper[18592]: I0308 04:17:31.994151 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c11a4533-a895-42a7-8c17-d6f421276ae0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c11a4533-a895-42a7-8c17-d6f421276ae0" (UID: "c11a4533-a895-42a7-8c17-d6f421276ae0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 04:17:32.032416 master-0 kubenswrapper[18592]: I0308 04:17:32.030341 18592 reconciler_common.go:293] "Volume detached for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/c11a4533-a895-42a7-8c17-d6f421276ae0-etc-podinfo\") on node \"master-0\" DevicePath \"\""
Mar 08 04:17:32.032416 master-0 kubenswrapper[18592]: I0308 04:17:32.030389 18592 reconciler_common.go:293] "Volume detached for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/c11a4533-a895-42a7-8c17-d6f421276ae0-config-data-merged\") on node \"master-0\" DevicePath \"\""
Mar 08 04:17:32.032416 master-0 kubenswrapper[18592]: I0308 04:17:32.030404 18592 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c11a4533-a895-42a7-8c17-d6f421276ae0-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 08 04:17:32.032416 master-0 kubenswrapper[18592]: I0308 04:17:32.030413 18592 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c11a4533-a895-42a7-8c17-d6f421276ae0-scripts\") on node \"master-0\" DevicePath \"\""
Mar 08 04:17:32.032416 master-0 kubenswrapper[18592]: I0308 04:17:32.030421 18592 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c11a4533-a895-42a7-8c17-d6f421276ae0-config-data\") on node \"master-0\" DevicePath \"\""
Mar 08 04:17:32.032416 master-0 kubenswrapper[18592]: I0308 04:17:32.030430 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rcph\" (UniqueName: \"kubernetes.io/projected/c11a4533-a895-42a7-8c17-d6f421276ae0-kube-api-access-6rcph\") on node \"master-0\" DevicePath \"\""
Mar 08 04:17:32.371464 master-0 kubenswrapper[18592]: I0308 04:17:32.370087 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-db-sync-78kcs"
Mar 08 04:17:32.371464 master-0 kubenswrapper[18592]: I0308 04:17:32.370674 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-sync-78kcs" event={"ID":"c11a4533-a895-42a7-8c17-d6f421276ae0","Type":"ContainerDied","Data":"9676ec124b0a6ead43b93ddca3b487f24b8d04d3023c9889fddbba49b66f557d"}
Mar 08 04:17:32.371464 master-0 kubenswrapper[18592]: I0308 04:17:32.370717 18592 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9676ec124b0a6ead43b93ddca3b487f24b8d04d3023c9889fddbba49b66f557d"
Mar 08 04:17:32.940936 master-0 kubenswrapper[18592]: I0308 04:17:32.940864 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-inspector-db-create-kgmn4"]
Mar 08 04:17:32.942218 master-0 kubenswrapper[18592]: E0308 04:17:32.941414 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="065d1362-4389-473a-9b59-4d3b25153deb" containerName="dnsmasq-dns"
Mar 08 04:17:32.942218 master-0 kubenswrapper[18592]: I0308 04:17:32.941429 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="065d1362-4389-473a-9b59-4d3b25153deb" containerName="dnsmasq-dns"
Mar 08 04:17:32.942218 master-0 kubenswrapper[18592]: E0308 04:17:32.941446 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="065d1362-4389-473a-9b59-4d3b25153deb" containerName="init"
Mar 08 04:17:32.942218 master-0 kubenswrapper[18592]: I0308 04:17:32.941452 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="065d1362-4389-473a-9b59-4d3b25153deb" containerName="init"
Mar 08 04:17:32.942218 master-0 kubenswrapper[18592]: E0308 04:17:32.941476 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c11a4533-a895-42a7-8c17-d6f421276ae0" containerName="ironic-db-sync"
Mar 08 04:17:32.942218 master-0 kubenswrapper[18592]: I0308 04:17:32.941483 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="c11a4533-a895-42a7-8c17-d6f421276ae0" containerName="ironic-db-sync"
Mar 08 04:17:32.942218 master-0 kubenswrapper[18592]: E0308 04:17:32.941496 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c11a4533-a895-42a7-8c17-d6f421276ae0" containerName="init"
Mar 08 04:17:32.942218 master-0 kubenswrapper[18592]: I0308 04:17:32.941503 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="c11a4533-a895-42a7-8c17-d6f421276ae0" containerName="init"
Mar 08 04:17:32.942218 master-0 kubenswrapper[18592]: I0308 04:17:32.941754 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="065d1362-4389-473a-9b59-4d3b25153deb" containerName="dnsmasq-dns"
Mar 08 04:17:32.942218 master-0 kubenswrapper[18592]: I0308 04:17:32.941803 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="c11a4533-a895-42a7-8c17-d6f421276ae0" containerName="ironic-db-sync"
Mar 08 04:17:32.954887 master-0 kubenswrapper[18592]: I0308 04:17:32.954704 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-db-create-kgmn4"
Mar 08 04:17:32.981848 master-0 kubenswrapper[18592]: I0308 04:17:32.968446 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-db-create-kgmn4"]
Mar 08 04:17:33.056609 master-0 kubenswrapper[18592]: I0308 04:17:33.056549 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-inspector-ff09-account-create-update-79kwt"]
Mar 08 04:17:33.064184 master-0 kubenswrapper[18592]: I0308 04:17:33.058499 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-ff09-account-create-update-79kwt"
Mar 08 04:17:33.072989 master-0 kubenswrapper[18592]: I0308 04:17:33.072752 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b74sg\" (UniqueName: \"kubernetes.io/projected/70814fbc-2d62-4cc7-b815-5b791cdc5e0a-kube-api-access-b74sg\") pod \"ironic-inspector-db-create-kgmn4\" (UID: \"70814fbc-2d62-4cc7-b815-5b791cdc5e0a\") " pod="openstack/ironic-inspector-db-create-kgmn4"
Mar 08 04:17:33.072989 master-0 kubenswrapper[18592]: I0308 04:17:33.072863 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/70814fbc-2d62-4cc7-b815-5b791cdc5e0a-operator-scripts\") pod \"ironic-inspector-db-create-kgmn4\" (UID: \"70814fbc-2d62-4cc7-b815-5b791cdc5e0a\") " pod="openstack/ironic-inspector-db-create-kgmn4"
Mar 08 04:17:33.074482 master-0 kubenswrapper[18592]: I0308 04:17:33.073516 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-db-secret"
Mar 08 04:17:33.084494 master-0 kubenswrapper[18592]: I0308 04:17:33.084444 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-ff09-account-create-update-79kwt"]
Mar 08 04:17:33.139250 master-0 kubenswrapper[18592]: I0308 04:17:33.139183 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-neutron-agent-859d47fc89-z2wvz"]
Mar 08 04:17:33.152243 master-0 kubenswrapper[18592]: I0308 04:17:33.140628 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-neutron-agent-859d47fc89-z2wvz"
Mar 08 04:17:33.157189 master-0 kubenswrapper[18592]: I0308 04:17:33.157148 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-ironic-neutron-agent-config-data"
Mar 08 04:17:33.174467 master-0 kubenswrapper[18592]: I0308 04:17:33.174392 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qc2z\" (UniqueName: \"kubernetes.io/projected/9a3f947a-748d-4cfd-a500-d759b58d22f4-kube-api-access-8qc2z\") pod \"ironic-inspector-ff09-account-create-update-79kwt\" (UID: \"9a3f947a-748d-4cfd-a500-d759b58d22f4\") " pod="openstack/ironic-inspector-ff09-account-create-update-79kwt"
Mar 08 04:17:33.174863 master-0 kubenswrapper[18592]: I0308 04:17:33.174525 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b74sg\" (UniqueName: \"kubernetes.io/projected/70814fbc-2d62-4cc7-b815-5b791cdc5e0a-kube-api-access-b74sg\") pod \"ironic-inspector-db-create-kgmn4\" (UID: \"70814fbc-2d62-4cc7-b815-5b791cdc5e0a\") " pod="openstack/ironic-inspector-db-create-kgmn4"
Mar 08 04:17:33.174863 master-0 kubenswrapper[18592]: I0308 04:17:33.174614 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a3f947a-748d-4cfd-a500-d759b58d22f4-operator-scripts\") pod \"ironic-inspector-ff09-account-create-update-79kwt\" (UID: \"9a3f947a-748d-4cfd-a500-d759b58d22f4\") " pod="openstack/ironic-inspector-ff09-account-create-update-79kwt"
Mar 08 04:17:33.174863 master-0 kubenswrapper[18592]: I0308 04:17:33.174635 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/70814fbc-2d62-4cc7-b815-5b791cdc5e0a-operator-scripts\") pod \"ironic-inspector-db-create-kgmn4\" (UID: \"70814fbc-2d62-4cc7-b815-5b791cdc5e0a\") " pod="openstack/ironic-inspector-db-create-kgmn4"
Mar 08 04:17:33.175340 master-0 kubenswrapper[18592]: I0308 04:17:33.175294 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/70814fbc-2d62-4cc7-b815-5b791cdc5e0a-operator-scripts\") pod \"ironic-inspector-db-create-kgmn4\" (UID: \"70814fbc-2d62-4cc7-b815-5b791cdc5e0a\") " pod="openstack/ironic-inspector-db-create-kgmn4"
Mar 08 04:17:33.175438 master-0 kubenswrapper[18592]: I0308 04:17:33.175339 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-neutron-agent-859d47fc89-z2wvz"]
Mar 08 04:17:33.194466 master-0 kubenswrapper[18592]: I0308 04:17:33.192795 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6977d9bcc9-rzdgq"]
Mar 08 04:17:33.194466 master-0 kubenswrapper[18592]: I0308 04:17:33.193092 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6977d9bcc9-rzdgq" podUID="374c1597-22de-4f77-a61a-6e72a503dfd0" containerName="dnsmasq-dns" containerID="cri-o://ffeb908bf05f96742f97bac99a4cde74e1a36304c0e36a7812b860bcd15de7b6" gracePeriod=10
Mar 08 04:17:33.204509 master-0 kubenswrapper[18592]: I0308 04:17:33.204379 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6977d9bcc9-rzdgq"
Mar 08 04:17:33.252944 master-0 kubenswrapper[18592]: I0308 04:17:33.252888 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-76b6c769dc-vwrbs"]
Mar 08 04:17:33.257176 master-0 kubenswrapper[18592]: I0308 04:17:33.257114 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76b6c769dc-vwrbs"
Mar 08 04:17:33.275536 master-0 kubenswrapper[18592]: I0308 04:17:33.275445 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b74sg\" (UniqueName: \"kubernetes.io/projected/70814fbc-2d62-4cc7-b815-5b791cdc5e0a-kube-api-access-b74sg\") pod \"ironic-inspector-db-create-kgmn4\" (UID: \"70814fbc-2d62-4cc7-b815-5b791cdc5e0a\") " pod="openstack/ironic-inspector-db-create-kgmn4"
Mar 08 04:17:33.277966 master-0 kubenswrapper[18592]: I0308 04:17:33.277924 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/05ba9b98-7d2f-4a9b-80ad-60793d8279e8-config\") pod \"ironic-neutron-agent-859d47fc89-z2wvz\" (UID: \"05ba9b98-7d2f-4a9b-80ad-60793d8279e8\") " pod="openstack/ironic-neutron-agent-859d47fc89-z2wvz"
Mar 08 04:17:33.278052 master-0 kubenswrapper[18592]: I0308 04:17:33.278029 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a3f947a-748d-4cfd-a500-d759b58d22f4-operator-scripts\") pod \"ironic-inspector-ff09-account-create-update-79kwt\" (UID: \"9a3f947a-748d-4cfd-a500-d759b58d22f4\") " pod="openstack/ironic-inspector-ff09-account-create-update-79kwt"
Mar 08 04:17:33.278166 master-0 kubenswrapper[18592]: I0308 04:17:33.278147 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85jjl\" (UniqueName: \"kubernetes.io/projected/05ba9b98-7d2f-4a9b-80ad-60793d8279e8-kube-api-access-85jjl\") pod \"ironic-neutron-agent-859d47fc89-z2wvz\" (UID: \"05ba9b98-7d2f-4a9b-80ad-60793d8279e8\") " pod="openstack/ironic-neutron-agent-859d47fc89-z2wvz"
Mar 08 04:17:33.278215 master-0 kubenswrapper[18592]: I0308 04:17:33.278182 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qc2z\" (UniqueName: \"kubernetes.io/projected/9a3f947a-748d-4cfd-a500-d759b58d22f4-kube-api-access-8qc2z\") pod \"ironic-inspector-ff09-account-create-update-79kwt\" (UID: \"9a3f947a-748d-4cfd-a500-d759b58d22f4\") " pod="openstack/ironic-inspector-ff09-account-create-update-79kwt"
Mar 08 04:17:33.278215 master-0 kubenswrapper[18592]: I0308 04:17:33.278200 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05ba9b98-7d2f-4a9b-80ad-60793d8279e8-combined-ca-bundle\") pod \"ironic-neutron-agent-859d47fc89-z2wvz\" (UID: \"05ba9b98-7d2f-4a9b-80ad-60793d8279e8\") " pod="openstack/ironic-neutron-agent-859d47fc89-z2wvz"
Mar 08 04:17:33.281472 master-0 kubenswrapper[18592]: I0308 04:17:33.281316 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76b6c769dc-vwrbs"]
Mar 08 04:17:33.292421 master-0 kubenswrapper[18592]: I0308 04:17:33.292342 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a3f947a-748d-4cfd-a500-d759b58d22f4-operator-scripts\") pod \"ironic-inspector-ff09-account-create-update-79kwt\" (UID: \"9a3f947a-748d-4cfd-a500-d759b58d22f4\") " pod="openstack/ironic-inspector-ff09-account-create-update-79kwt"
Mar 08 04:17:33.313258 master-0 kubenswrapper[18592]: I0308 04:17:33.313197 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-db-create-kgmn4"
Mar 08 04:17:33.328750 master-0 kubenswrapper[18592]: I0308 04:17:33.328683 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qc2z\" (UniqueName: \"kubernetes.io/projected/9a3f947a-748d-4cfd-a500-d759b58d22f4-kube-api-access-8qc2z\") pod \"ironic-inspector-ff09-account-create-update-79kwt\" (UID: \"9a3f947a-748d-4cfd-a500-d759b58d22f4\") " pod="openstack/ironic-inspector-ff09-account-create-update-79kwt"
Mar 08 04:17:33.387047 master-0 kubenswrapper[18592]: I0308 04:17:33.386991 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85jjl\" (UniqueName: \"kubernetes.io/projected/05ba9b98-7d2f-4a9b-80ad-60793d8279e8-kube-api-access-85jjl\") pod \"ironic-neutron-agent-859d47fc89-z2wvz\" (UID: \"05ba9b98-7d2f-4a9b-80ad-60793d8279e8\") " pod="openstack/ironic-neutron-agent-859d47fc89-z2wvz"
Mar 08 04:17:33.387047 master-0 kubenswrapper[18592]: I0308 04:17:33.387049 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05ba9b98-7d2f-4a9b-80ad-60793d8279e8-combined-ca-bundle\") pod \"ironic-neutron-agent-859d47fc89-z2wvz\" (UID: \"05ba9b98-7d2f-4a9b-80ad-60793d8279e8\") " pod="openstack/ironic-neutron-agent-859d47fc89-z2wvz"
Mar 08 04:17:33.387285 master-0 kubenswrapper[18592]: I0308 04:17:33.387081 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnm7x\" (UniqueName: \"kubernetes.io/projected/9d16547d-20e7-4839-b923-a3a01531ae0d-kube-api-access-vnm7x\") pod \"dnsmasq-dns-76b6c769dc-vwrbs\" (UID: \"9d16547d-20e7-4839-b923-a3a01531ae0d\") " pod="openstack/dnsmasq-dns-76b6c769dc-vwrbs"
Mar 08 04:17:33.387285 master-0 kubenswrapper[18592]: I0308 04:17:33.387148 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9d16547d-20e7-4839-b923-a3a01531ae0d-ovsdbserver-nb\") pod \"dnsmasq-dns-76b6c769dc-vwrbs\" (UID: \"9d16547d-20e7-4839-b923-a3a01531ae0d\") " pod="openstack/dnsmasq-dns-76b6c769dc-vwrbs"
Mar 08 04:17:33.387285 master-0 kubenswrapper[18592]: I0308 04:17:33.387186 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/05ba9b98-7d2f-4a9b-80ad-60793d8279e8-config\") pod \"ironic-neutron-agent-859d47fc89-z2wvz\" (UID: \"05ba9b98-7d2f-4a9b-80ad-60793d8279e8\") " pod="openstack/ironic-neutron-agent-859d47fc89-z2wvz"
Mar 08 04:17:33.387285 master-0 kubenswrapper[18592]: I0308 04:17:33.387235 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d16547d-20e7-4839-b923-a3a01531ae0d-dns-svc\") pod \"dnsmasq-dns-76b6c769dc-vwrbs\" (UID: \"9d16547d-20e7-4839-b923-a3a01531ae0d\") " pod="openstack/dnsmasq-dns-76b6c769dc-vwrbs"
Mar 08 04:17:33.387285 master-0 kubenswrapper[18592]: I0308 04:17:33.387267 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d16547d-20e7-4839-b923-a3a01531ae0d-config\") pod \"dnsmasq-dns-76b6c769dc-vwrbs\" (UID: \"9d16547d-20e7-4839-b923-a3a01531ae0d\") " pod="openstack/dnsmasq-dns-76b6c769dc-vwrbs"
Mar 08 04:17:33.387432 master-0 kubenswrapper[18592]: I0308 04:17:33.387295 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9d16547d-20e7-4839-b923-a3a01531ae0d-ovsdbserver-sb\") pod \"dnsmasq-dns-76b6c769dc-vwrbs\" (UID: \"9d16547d-20e7-4839-b923-a3a01531ae0d\") " pod="openstack/dnsmasq-dns-76b6c769dc-vwrbs"
Mar 08 04:17:33.387432 master-0 kubenswrapper[18592]: I0308 04:17:33.387319 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9d16547d-20e7-4839-b923-a3a01531ae0d-dns-swift-storage-0\") pod \"dnsmasq-dns-76b6c769dc-vwrbs\" (UID: \"9d16547d-20e7-4839-b923-a3a01531ae0d\") " pod="openstack/dnsmasq-dns-76b6c769dc-vwrbs"
Mar 08 04:17:33.398317 master-0 kubenswrapper[18592]: I0308 04:17:33.397843 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05ba9b98-7d2f-4a9b-80ad-60793d8279e8-combined-ca-bundle\") pod \"ironic-neutron-agent-859d47fc89-z2wvz\" (UID: \"05ba9b98-7d2f-4a9b-80ad-60793d8279e8\") " pod="openstack/ironic-neutron-agent-859d47fc89-z2wvz"
Mar 08 04:17:33.398317 master-0 kubenswrapper[18592]: I0308 04:17:33.397859 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/05ba9b98-7d2f-4a9b-80ad-60793d8279e8-config\") pod \"ironic-neutron-agent-859d47fc89-z2wvz\" (UID: \"05ba9b98-7d2f-4a9b-80ad-60793d8279e8\") " pod="openstack/ironic-neutron-agent-859d47fc89-z2wvz"
Mar 08 04:17:33.414707 master-0 kubenswrapper[18592]: I0308 04:17:33.414664 18592 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/ironic-inspector-ff09-account-create-update-79kwt" Mar 08 04:17:33.422648 master-0 kubenswrapper[18592]: I0308 04:17:33.422597 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85jjl\" (UniqueName: \"kubernetes.io/projected/05ba9b98-7d2f-4a9b-80ad-60793d8279e8-kube-api-access-85jjl\") pod \"ironic-neutron-agent-859d47fc89-z2wvz\" (UID: \"05ba9b98-7d2f-4a9b-80ad-60793d8279e8\") " pod="openstack/ironic-neutron-agent-859d47fc89-z2wvz" Mar 08 04:17:33.441055 master-0 kubenswrapper[18592]: I0308 04:17:33.440960 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-568f56df46-pmcb4"] Mar 08 04:17:33.453971 master-0 kubenswrapper[18592]: I0308 04:17:33.453912 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-568f56df46-pmcb4" Mar 08 04:17:33.463405 master-0 kubenswrapper[18592]: I0308 04:17:33.463294 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 08 04:17:33.463695 master-0 kubenswrapper[18592]: I0308 04:17:33.463647 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-transport-url-ironic-transport" Mar 08 04:17:33.463849 master-0 kubenswrapper[18592]: I0308 04:17:33.463810 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-config-data" Mar 08 04:17:33.464029 master-0 kubenswrapper[18592]: I0308 04:17:33.464000 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-api-config-data" Mar 08 04:17:33.465814 master-0 kubenswrapper[18592]: I0308 04:17:33.465678 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-568f56df46-pmcb4"] Mar 08 04:17:33.480211 master-0 kubenswrapper[18592]: I0308 04:17:33.480164 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-api-scripts" Mar 08 04:17:33.489000 master-0 
kubenswrapper[18592]: I0308 04:17:33.488770 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9d16547d-20e7-4839-b923-a3a01531ae0d-ovsdbserver-nb\") pod \"dnsmasq-dns-76b6c769dc-vwrbs\" (UID: \"9d16547d-20e7-4839-b923-a3a01531ae0d\") " pod="openstack/dnsmasq-dns-76b6c769dc-vwrbs" Mar 08 04:17:33.489099 master-0 kubenswrapper[18592]: I0308 04:17:33.489034 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d16547d-20e7-4839-b923-a3a01531ae0d-dns-svc\") pod \"dnsmasq-dns-76b6c769dc-vwrbs\" (UID: \"9d16547d-20e7-4839-b923-a3a01531ae0d\") " pod="openstack/dnsmasq-dns-76b6c769dc-vwrbs" Mar 08 04:17:33.489099 master-0 kubenswrapper[18592]: I0308 04:17:33.489080 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d16547d-20e7-4839-b923-a3a01531ae0d-config\") pod \"dnsmasq-dns-76b6c769dc-vwrbs\" (UID: \"9d16547d-20e7-4839-b923-a3a01531ae0d\") " pod="openstack/dnsmasq-dns-76b6c769dc-vwrbs" Mar 08 04:17:33.489196 master-0 kubenswrapper[18592]: I0308 04:17:33.489111 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9d16547d-20e7-4839-b923-a3a01531ae0d-ovsdbserver-sb\") pod \"dnsmasq-dns-76b6c769dc-vwrbs\" (UID: \"9d16547d-20e7-4839-b923-a3a01531ae0d\") " pod="openstack/dnsmasq-dns-76b6c769dc-vwrbs" Mar 08 04:17:33.489196 master-0 kubenswrapper[18592]: I0308 04:17:33.489142 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9d16547d-20e7-4839-b923-a3a01531ae0d-dns-swift-storage-0\") pod \"dnsmasq-dns-76b6c769dc-vwrbs\" (UID: \"9d16547d-20e7-4839-b923-a3a01531ae0d\") " pod="openstack/dnsmasq-dns-76b6c769dc-vwrbs" Mar 08 04:17:33.489284 master-0 
kubenswrapper[18592]: I0308 04:17:33.489205 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnm7x\" (UniqueName: \"kubernetes.io/projected/9d16547d-20e7-4839-b923-a3a01531ae0d-kube-api-access-vnm7x\") pod \"dnsmasq-dns-76b6c769dc-vwrbs\" (UID: \"9d16547d-20e7-4839-b923-a3a01531ae0d\") " pod="openstack/dnsmasq-dns-76b6c769dc-vwrbs" Mar 08 04:17:33.490197 master-0 kubenswrapper[18592]: I0308 04:17:33.490152 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9d16547d-20e7-4839-b923-a3a01531ae0d-ovsdbserver-nb\") pod \"dnsmasq-dns-76b6c769dc-vwrbs\" (UID: \"9d16547d-20e7-4839-b923-a3a01531ae0d\") " pod="openstack/dnsmasq-dns-76b6c769dc-vwrbs" Mar 08 04:17:33.490335 master-0 kubenswrapper[18592]: I0308 04:17:33.490313 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d16547d-20e7-4839-b923-a3a01531ae0d-dns-svc\") pod \"dnsmasq-dns-76b6c769dc-vwrbs\" (UID: \"9d16547d-20e7-4839-b923-a3a01531ae0d\") " pod="openstack/dnsmasq-dns-76b6c769dc-vwrbs" Mar 08 04:17:33.490920 master-0 kubenswrapper[18592]: I0308 04:17:33.490896 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9d16547d-20e7-4839-b923-a3a01531ae0d-dns-swift-storage-0\") pod \"dnsmasq-dns-76b6c769dc-vwrbs\" (UID: \"9d16547d-20e7-4839-b923-a3a01531ae0d\") " pod="openstack/dnsmasq-dns-76b6c769dc-vwrbs" Mar 08 04:17:33.492142 master-0 kubenswrapper[18592]: I0308 04:17:33.491371 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9d16547d-20e7-4839-b923-a3a01531ae0d-ovsdbserver-sb\") pod \"dnsmasq-dns-76b6c769dc-vwrbs\" (UID: \"9d16547d-20e7-4839-b923-a3a01531ae0d\") " pod="openstack/dnsmasq-dns-76b6c769dc-vwrbs" Mar 08 04:17:33.492142 master-0 
kubenswrapper[18592]: I0308 04:17:33.491527 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d16547d-20e7-4839-b923-a3a01531ae0d-config\") pod \"dnsmasq-dns-76b6c769dc-vwrbs\" (UID: \"9d16547d-20e7-4839-b923-a3a01531ae0d\") " pod="openstack/dnsmasq-dns-76b6c769dc-vwrbs" Mar 08 04:17:33.526589 master-0 kubenswrapper[18592]: I0308 04:17:33.517019 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnm7x\" (UniqueName: \"kubernetes.io/projected/9d16547d-20e7-4839-b923-a3a01531ae0d-kube-api-access-vnm7x\") pod \"dnsmasq-dns-76b6c769dc-vwrbs\" (UID: \"9d16547d-20e7-4839-b923-a3a01531ae0d\") " pod="openstack/dnsmasq-dns-76b6c769dc-vwrbs" Mar 08 04:17:33.526589 master-0 kubenswrapper[18592]: I0308 04:17:33.519313 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-neutron-agent-859d47fc89-z2wvz" Mar 08 04:17:33.552923 master-0 kubenswrapper[18592]: I0308 04:17:33.548322 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76b6c769dc-vwrbs" Mar 08 04:17:33.609845 master-0 kubenswrapper[18592]: I0308 04:17:33.597983 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a793c425-75a3-4c0d-95a0-43f6bdf96bf5-config-data-custom\") pod \"ironic-568f56df46-pmcb4\" (UID: \"a793c425-75a3-4c0d-95a0-43f6bdf96bf5\") " pod="openstack/ironic-568f56df46-pmcb4" Mar 08 04:17:33.609845 master-0 kubenswrapper[18592]: I0308 04:17:33.598039 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a793c425-75a3-4c0d-95a0-43f6bdf96bf5-config-data\") pod \"ironic-568f56df46-pmcb4\" (UID: \"a793c425-75a3-4c0d-95a0-43f6bdf96bf5\") " pod="openstack/ironic-568f56df46-pmcb4" Mar 08 04:17:33.609845 master-0 kubenswrapper[18592]: I0308 04:17:33.598058 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a793c425-75a3-4c0d-95a0-43f6bdf96bf5-logs\") pod \"ironic-568f56df46-pmcb4\" (UID: \"a793c425-75a3-4c0d-95a0-43f6bdf96bf5\") " pod="openstack/ironic-568f56df46-pmcb4" Mar 08 04:17:33.609845 master-0 kubenswrapper[18592]: I0308 04:17:33.598094 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a793c425-75a3-4c0d-95a0-43f6bdf96bf5-scripts\") pod \"ironic-568f56df46-pmcb4\" (UID: \"a793c425-75a3-4c0d-95a0-43f6bdf96bf5\") " pod="openstack/ironic-568f56df46-pmcb4" Mar 08 04:17:33.609845 master-0 kubenswrapper[18592]: I0308 04:17:33.598121 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a793c425-75a3-4c0d-95a0-43f6bdf96bf5-combined-ca-bundle\") pod 
\"ironic-568f56df46-pmcb4\" (UID: \"a793c425-75a3-4c0d-95a0-43f6bdf96bf5\") " pod="openstack/ironic-568f56df46-pmcb4" Mar 08 04:17:33.609845 master-0 kubenswrapper[18592]: I0308 04:17:33.598190 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6k2b7\" (UniqueName: \"kubernetes.io/projected/a793c425-75a3-4c0d-95a0-43f6bdf96bf5-kube-api-access-6k2b7\") pod \"ironic-568f56df46-pmcb4\" (UID: \"a793c425-75a3-4c0d-95a0-43f6bdf96bf5\") " pod="openstack/ironic-568f56df46-pmcb4" Mar 08 04:17:33.609845 master-0 kubenswrapper[18592]: I0308 04:17:33.598236 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/a793c425-75a3-4c0d-95a0-43f6bdf96bf5-etc-podinfo\") pod \"ironic-568f56df46-pmcb4\" (UID: \"a793c425-75a3-4c0d-95a0-43f6bdf96bf5\") " pod="openstack/ironic-568f56df46-pmcb4" Mar 08 04:17:33.609845 master-0 kubenswrapper[18592]: I0308 04:17:33.598326 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/a793c425-75a3-4c0d-95a0-43f6bdf96bf5-config-data-merged\") pod \"ironic-568f56df46-pmcb4\" (UID: \"a793c425-75a3-4c0d-95a0-43f6bdf96bf5\") " pod="openstack/ironic-568f56df46-pmcb4" Mar 08 04:17:33.700236 master-0 kubenswrapper[18592]: I0308 04:17:33.699668 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6k2b7\" (UniqueName: \"kubernetes.io/projected/a793c425-75a3-4c0d-95a0-43f6bdf96bf5-kube-api-access-6k2b7\") pod \"ironic-568f56df46-pmcb4\" (UID: \"a793c425-75a3-4c0d-95a0-43f6bdf96bf5\") " pod="openstack/ironic-568f56df46-pmcb4" Mar 08 04:17:33.710732 master-0 kubenswrapper[18592]: I0308 04:17:33.700472 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: 
\"kubernetes.io/downward-api/a793c425-75a3-4c0d-95a0-43f6bdf96bf5-etc-podinfo\") pod \"ironic-568f56df46-pmcb4\" (UID: \"a793c425-75a3-4c0d-95a0-43f6bdf96bf5\") " pod="openstack/ironic-568f56df46-pmcb4" Mar 08 04:17:33.710732 master-0 kubenswrapper[18592]: I0308 04:17:33.700791 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/a793c425-75a3-4c0d-95a0-43f6bdf96bf5-config-data-merged\") pod \"ironic-568f56df46-pmcb4\" (UID: \"a793c425-75a3-4c0d-95a0-43f6bdf96bf5\") " pod="openstack/ironic-568f56df46-pmcb4" Mar 08 04:17:33.710732 master-0 kubenswrapper[18592]: I0308 04:17:33.701382 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a793c425-75a3-4c0d-95a0-43f6bdf96bf5-config-data-custom\") pod \"ironic-568f56df46-pmcb4\" (UID: \"a793c425-75a3-4c0d-95a0-43f6bdf96bf5\") " pod="openstack/ironic-568f56df46-pmcb4" Mar 08 04:17:33.710732 master-0 kubenswrapper[18592]: I0308 04:17:33.701495 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a793c425-75a3-4c0d-95a0-43f6bdf96bf5-config-data\") pod \"ironic-568f56df46-pmcb4\" (UID: \"a793c425-75a3-4c0d-95a0-43f6bdf96bf5\") " pod="openstack/ironic-568f56df46-pmcb4" Mar 08 04:17:33.710732 master-0 kubenswrapper[18592]: I0308 04:17:33.701528 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a793c425-75a3-4c0d-95a0-43f6bdf96bf5-logs\") pod \"ironic-568f56df46-pmcb4\" (UID: \"a793c425-75a3-4c0d-95a0-43f6bdf96bf5\") " pod="openstack/ironic-568f56df46-pmcb4" Mar 08 04:17:33.710732 master-0 kubenswrapper[18592]: I0308 04:17:33.701637 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/a793c425-75a3-4c0d-95a0-43f6bdf96bf5-scripts\") pod \"ironic-568f56df46-pmcb4\" (UID: \"a793c425-75a3-4c0d-95a0-43f6bdf96bf5\") " pod="openstack/ironic-568f56df46-pmcb4" Mar 08 04:17:33.710732 master-0 kubenswrapper[18592]: I0308 04:17:33.701691 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a793c425-75a3-4c0d-95a0-43f6bdf96bf5-combined-ca-bundle\") pod \"ironic-568f56df46-pmcb4\" (UID: \"a793c425-75a3-4c0d-95a0-43f6bdf96bf5\") " pod="openstack/ironic-568f56df46-pmcb4" Mar 08 04:17:33.710732 master-0 kubenswrapper[18592]: I0308 04:17:33.701798 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/a793c425-75a3-4c0d-95a0-43f6bdf96bf5-config-data-merged\") pod \"ironic-568f56df46-pmcb4\" (UID: \"a793c425-75a3-4c0d-95a0-43f6bdf96bf5\") " pod="openstack/ironic-568f56df46-pmcb4" Mar 08 04:17:33.710732 master-0 kubenswrapper[18592]: I0308 04:17:33.702941 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a793c425-75a3-4c0d-95a0-43f6bdf96bf5-logs\") pod \"ironic-568f56df46-pmcb4\" (UID: \"a793c425-75a3-4c0d-95a0-43f6bdf96bf5\") " pod="openstack/ironic-568f56df46-pmcb4" Mar 08 04:17:33.771351 master-0 kubenswrapper[18592]: I0308 04:17:33.749478 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/a793c425-75a3-4c0d-95a0-43f6bdf96bf5-etc-podinfo\") pod \"ironic-568f56df46-pmcb4\" (UID: \"a793c425-75a3-4c0d-95a0-43f6bdf96bf5\") " pod="openstack/ironic-568f56df46-pmcb4" Mar 08 04:17:33.771351 master-0 kubenswrapper[18592]: I0308 04:17:33.765108 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6k2b7\" (UniqueName: 
\"kubernetes.io/projected/a793c425-75a3-4c0d-95a0-43f6bdf96bf5-kube-api-access-6k2b7\") pod \"ironic-568f56df46-pmcb4\" (UID: \"a793c425-75a3-4c0d-95a0-43f6bdf96bf5\") " pod="openstack/ironic-568f56df46-pmcb4" Mar 08 04:17:33.823843 master-0 kubenswrapper[18592]: I0308 04:17:33.812020 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a793c425-75a3-4c0d-95a0-43f6bdf96bf5-config-data-custom\") pod \"ironic-568f56df46-pmcb4\" (UID: \"a793c425-75a3-4c0d-95a0-43f6bdf96bf5\") " pod="openstack/ironic-568f56df46-pmcb4" Mar 08 04:17:33.823843 master-0 kubenswrapper[18592]: I0308 04:17:33.812071 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a793c425-75a3-4c0d-95a0-43f6bdf96bf5-scripts\") pod \"ironic-568f56df46-pmcb4\" (UID: \"a793c425-75a3-4c0d-95a0-43f6bdf96bf5\") " pod="openstack/ironic-568f56df46-pmcb4" Mar 08 04:17:33.823843 master-0 kubenswrapper[18592]: I0308 04:17:33.814973 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a793c425-75a3-4c0d-95a0-43f6bdf96bf5-combined-ca-bundle\") pod \"ironic-568f56df46-pmcb4\" (UID: \"a793c425-75a3-4c0d-95a0-43f6bdf96bf5\") " pod="openstack/ironic-568f56df46-pmcb4" Mar 08 04:17:33.823843 master-0 kubenswrapper[18592]: I0308 04:17:33.815245 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a793c425-75a3-4c0d-95a0-43f6bdf96bf5-config-data\") pod \"ironic-568f56df46-pmcb4\" (UID: \"a793c425-75a3-4c0d-95a0-43f6bdf96bf5\") " pod="openstack/ironic-568f56df46-pmcb4" Mar 08 04:17:33.863631 master-0 kubenswrapper[18592]: I0308 04:17:33.863530 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-568f56df46-pmcb4" Mar 08 04:17:33.993355 master-0 kubenswrapper[18592]: I0308 04:17:33.992686 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6977d9bcc9-rzdgq" Mar 08 04:17:34.138318 master-0 kubenswrapper[18592]: I0308 04:17:34.135967 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-db-create-kgmn4"] Mar 08 04:17:34.139098 master-0 kubenswrapper[18592]: I0308 04:17:34.138988 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/374c1597-22de-4f77-a61a-6e72a503dfd0-config\") pod \"374c1597-22de-4f77-a61a-6e72a503dfd0\" (UID: \"374c1597-22de-4f77-a61a-6e72a503dfd0\") " Mar 08 04:17:34.139178 master-0 kubenswrapper[18592]: I0308 04:17:34.139111 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/374c1597-22de-4f77-a61a-6e72a503dfd0-ovsdbserver-sb\") pod \"374c1597-22de-4f77-a61a-6e72a503dfd0\" (UID: \"374c1597-22de-4f77-a61a-6e72a503dfd0\") " Mar 08 04:17:34.139178 master-0 kubenswrapper[18592]: I0308 04:17:34.139156 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/374c1597-22de-4f77-a61a-6e72a503dfd0-dns-svc\") pod \"374c1597-22de-4f77-a61a-6e72a503dfd0\" (UID: \"374c1597-22de-4f77-a61a-6e72a503dfd0\") " Mar 08 04:17:34.139248 master-0 kubenswrapper[18592]: I0308 04:17:34.139182 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqz4j\" (UniqueName: \"kubernetes.io/projected/374c1597-22de-4f77-a61a-6e72a503dfd0-kube-api-access-xqz4j\") pod \"374c1597-22de-4f77-a61a-6e72a503dfd0\" (UID: \"374c1597-22de-4f77-a61a-6e72a503dfd0\") " Mar 08 04:17:34.139301 master-0 kubenswrapper[18592]: I0308 04:17:34.139264 18592 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/374c1597-22de-4f77-a61a-6e72a503dfd0-dns-swift-storage-0\") pod \"374c1597-22de-4f77-a61a-6e72a503dfd0\" (UID: \"374c1597-22de-4f77-a61a-6e72a503dfd0\") " Mar 08 04:17:34.139700 master-0 kubenswrapper[18592]: I0308 04:17:34.139354 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/374c1597-22de-4f77-a61a-6e72a503dfd0-ovsdbserver-nb\") pod \"374c1597-22de-4f77-a61a-6e72a503dfd0\" (UID: \"374c1597-22de-4f77-a61a-6e72a503dfd0\") " Mar 08 04:17:34.148403 master-0 kubenswrapper[18592]: I0308 04:17:34.148306 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/374c1597-22de-4f77-a61a-6e72a503dfd0-kube-api-access-xqz4j" (OuterVolumeSpecName: "kube-api-access-xqz4j") pod "374c1597-22de-4f77-a61a-6e72a503dfd0" (UID: "374c1597-22de-4f77-a61a-6e72a503dfd0"). InnerVolumeSpecName "kube-api-access-xqz4j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 04:17:34.219743 master-0 kubenswrapper[18592]: I0308 04:17:34.212979 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/374c1597-22de-4f77-a61a-6e72a503dfd0-config" (OuterVolumeSpecName: "config") pod "374c1597-22de-4f77-a61a-6e72a503dfd0" (UID: "374c1597-22de-4f77-a61a-6e72a503dfd0"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:17:34.250605 master-0 kubenswrapper[18592]: I0308 04:17:34.249304 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xqz4j\" (UniqueName: \"kubernetes.io/projected/374c1597-22de-4f77-a61a-6e72a503dfd0-kube-api-access-xqz4j\") on node \"master-0\" DevicePath \"\"" Mar 08 04:17:34.250605 master-0 kubenswrapper[18592]: I0308 04:17:34.249348 18592 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/374c1597-22de-4f77-a61a-6e72a503dfd0-config\") on node \"master-0\" DevicePath \"\"" Mar 08 04:17:34.260355 master-0 kubenswrapper[18592]: I0308 04:17:34.260310 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/374c1597-22de-4f77-a61a-6e72a503dfd0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "374c1597-22de-4f77-a61a-6e72a503dfd0" (UID: "374c1597-22de-4f77-a61a-6e72a503dfd0"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:17:34.286036 master-0 kubenswrapper[18592]: W0308 04:17:34.285960 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod05ba9b98_7d2f_4a9b_80ad_60793d8279e8.slice/crio-77d624f247dca2ac7c5fac1a7d60310e2c7a3d69494a80271ba936f70aab0475 WatchSource:0}: Error finding container 77d624f247dca2ac7c5fac1a7d60310e2c7a3d69494a80271ba936f70aab0475: Status 404 returned error can't find the container with id 77d624f247dca2ac7c5fac1a7d60310e2c7a3d69494a80271ba936f70aab0475 Mar 08 04:17:34.286243 master-0 kubenswrapper[18592]: I0308 04:17:34.286083 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-neutron-agent-859d47fc89-z2wvz"] Mar 08 04:17:34.298954 master-0 kubenswrapper[18592]: W0308 04:17:34.298659 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9a3f947a_748d_4cfd_a500_d759b58d22f4.slice/crio-1049c91a352825463169314201825add1392d083cd00b2ada8beb7b35ebcf11d WatchSource:0}: Error finding container 1049c91a352825463169314201825add1392d083cd00b2ada8beb7b35ebcf11d: Status 404 returned error can't find the container with id 1049c91a352825463169314201825add1392d083cd00b2ada8beb7b35ebcf11d Mar 08 04:17:34.299456 master-0 kubenswrapper[18592]: I0308 04:17:34.299225 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/374c1597-22de-4f77-a61a-6e72a503dfd0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "374c1597-22de-4f77-a61a-6e72a503dfd0" (UID: "374c1597-22de-4f77-a61a-6e72a503dfd0"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:17:34.301291 master-0 kubenswrapper[18592]: I0308 04:17:34.301266 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-ff09-account-create-update-79kwt"] Mar 08 04:17:34.307450 master-0 kubenswrapper[18592]: I0308 04:17:34.307388 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/374c1597-22de-4f77-a61a-6e72a503dfd0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "374c1597-22de-4f77-a61a-6e72a503dfd0" (UID: "374c1597-22de-4f77-a61a-6e72a503dfd0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:17:34.332370 master-0 kubenswrapper[18592]: I0308 04:17:34.332318 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/374c1597-22de-4f77-a61a-6e72a503dfd0-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "374c1597-22de-4f77-a61a-6e72a503dfd0" (UID: "374c1597-22de-4f77-a61a-6e72a503dfd0"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:17:34.367999 master-0 kubenswrapper[18592]: I0308 04:17:34.366794 18592 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/374c1597-22de-4f77-a61a-6e72a503dfd0-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Mar 08 04:17:34.367999 master-0 kubenswrapper[18592]: I0308 04:17:34.366943 18592 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/374c1597-22de-4f77-a61a-6e72a503dfd0-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 08 04:17:34.367999 master-0 kubenswrapper[18592]: I0308 04:17:34.366957 18592 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/374c1597-22de-4f77-a61a-6e72a503dfd0-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Mar 08 04:17:34.367999 master-0 kubenswrapper[18592]: I0308 04:17:34.366967 18592 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/374c1597-22de-4f77-a61a-6e72a503dfd0-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Mar 08 04:17:34.423915 master-0 kubenswrapper[18592]: I0308 04:17:34.419051 18592 generic.go:334] "Generic (PLEG): container finished" podID="374c1597-22de-4f77-a61a-6e72a503dfd0" containerID="ffeb908bf05f96742f97bac99a4cde74e1a36304c0e36a7812b860bcd15de7b6" exitCode=0 Mar 08 04:17:34.423915 master-0 kubenswrapper[18592]: I0308 04:17:34.419095 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6977d9bcc9-rzdgq" event={"ID":"374c1597-22de-4f77-a61a-6e72a503dfd0","Type":"ContainerDied","Data":"ffeb908bf05f96742f97bac99a4cde74e1a36304c0e36a7812b860bcd15de7b6"} Mar 08 04:17:34.423915 master-0 kubenswrapper[18592]: I0308 04:17:34.419157 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6977d9bcc9-rzdgq" 
event={"ID":"374c1597-22de-4f77-a61a-6e72a503dfd0","Type":"ContainerDied","Data":"ccd84b97dcf8a5ecb9bf9079360e84cf46df9db2a68d8c74f85c4e2a2e93c1f2"} Mar 08 04:17:34.423915 master-0 kubenswrapper[18592]: I0308 04:17:34.419178 18592 scope.go:117] "RemoveContainer" containerID="ffeb908bf05f96742f97bac99a4cde74e1a36304c0e36a7812b860bcd15de7b6" Mar 08 04:17:34.423915 master-0 kubenswrapper[18592]: I0308 04:17:34.419197 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6977d9bcc9-rzdgq" Mar 08 04:17:34.423915 master-0 kubenswrapper[18592]: I0308 04:17:34.421574 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-create-kgmn4" event={"ID":"70814fbc-2d62-4cc7-b815-5b791cdc5e0a","Type":"ContainerStarted","Data":"e3095eaef0084f46e80f6961f7c5bcece0f7c7ca9ba6b49ec98e406a8f49d996"} Mar 08 04:17:34.423915 master-0 kubenswrapper[18592]: I0308 04:17:34.423817 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-ff09-account-create-update-79kwt" event={"ID":"9a3f947a-748d-4cfd-a500-d759b58d22f4","Type":"ContainerStarted","Data":"1049c91a352825463169314201825add1392d083cd00b2ada8beb7b35ebcf11d"} Mar 08 04:17:34.426749 master-0 kubenswrapper[18592]: I0308 04:17:34.426673 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-859d47fc89-z2wvz" event={"ID":"05ba9b98-7d2f-4a9b-80ad-60793d8279e8","Type":"ContainerStarted","Data":"77d624f247dca2ac7c5fac1a7d60310e2c7a3d69494a80271ba936f70aab0475"} Mar 08 04:17:34.450371 master-0 kubenswrapper[18592]: I0308 04:17:34.450149 18592 scope.go:117] "RemoveContainer" containerID="2bd972438fceb0a22512a8429c86a34d80d2769a29d2da113c93266de3f8281c" Mar 08 04:17:34.497863 master-0 kubenswrapper[18592]: I0308 04:17:34.490469 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6977d9bcc9-rzdgq"] Mar 08 04:17:34.509960 master-0 kubenswrapper[18592]: I0308 
04:17:34.508641 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6977d9bcc9-rzdgq"] Mar 08 04:17:34.542094 master-0 kubenswrapper[18592]: I0308 04:17:34.541988 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76b6c769dc-vwrbs"] Mar 08 04:17:34.542445 master-0 kubenswrapper[18592]: I0308 04:17:34.542401 18592 scope.go:117] "RemoveContainer" containerID="ffeb908bf05f96742f97bac99a4cde74e1a36304c0e36a7812b860bcd15de7b6" Mar 08 04:17:34.542959 master-0 kubenswrapper[18592]: E0308 04:17:34.542922 18592 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffeb908bf05f96742f97bac99a4cde74e1a36304c0e36a7812b860bcd15de7b6\": container with ID starting with ffeb908bf05f96742f97bac99a4cde74e1a36304c0e36a7812b860bcd15de7b6 not found: ID does not exist" containerID="ffeb908bf05f96742f97bac99a4cde74e1a36304c0e36a7812b860bcd15de7b6" Mar 08 04:17:34.543021 master-0 kubenswrapper[18592]: I0308 04:17:34.542965 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffeb908bf05f96742f97bac99a4cde74e1a36304c0e36a7812b860bcd15de7b6"} err="failed to get container status \"ffeb908bf05f96742f97bac99a4cde74e1a36304c0e36a7812b860bcd15de7b6\": rpc error: code = NotFound desc = could not find container \"ffeb908bf05f96742f97bac99a4cde74e1a36304c0e36a7812b860bcd15de7b6\": container with ID starting with ffeb908bf05f96742f97bac99a4cde74e1a36304c0e36a7812b860bcd15de7b6 not found: ID does not exist" Mar 08 04:17:34.543021 master-0 kubenswrapper[18592]: I0308 04:17:34.542987 18592 scope.go:117] "RemoveContainer" containerID="2bd972438fceb0a22512a8429c86a34d80d2769a29d2da113c93266de3f8281c" Mar 08 04:17:34.546074 master-0 kubenswrapper[18592]: E0308 04:17:34.546036 18592 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"2bd972438fceb0a22512a8429c86a34d80d2769a29d2da113c93266de3f8281c\": container with ID starting with 2bd972438fceb0a22512a8429c86a34d80d2769a29d2da113c93266de3f8281c not found: ID does not exist" containerID="2bd972438fceb0a22512a8429c86a34d80d2769a29d2da113c93266de3f8281c" Mar 08 04:17:34.546316 master-0 kubenswrapper[18592]: I0308 04:17:34.546291 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bd972438fceb0a22512a8429c86a34d80d2769a29d2da113c93266de3f8281c"} err="failed to get container status \"2bd972438fceb0a22512a8429c86a34d80d2769a29d2da113c93266de3f8281c\": rpc error: code = NotFound desc = could not find container \"2bd972438fceb0a22512a8429c86a34d80d2769a29d2da113c93266de3f8281c\": container with ID starting with 2bd972438fceb0a22512a8429c86a34d80d2769a29d2da113c93266de3f8281c not found: ID does not exist" Mar 08 04:17:34.651984 master-0 kubenswrapper[18592]: I0308 04:17:34.651927 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-568f56df46-pmcb4"] Mar 08 04:17:35.084431 master-0 kubenswrapper[18592]: I0308 04:17:35.084282 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-conductor-0"] Mar 08 04:17:35.084919 master-0 kubenswrapper[18592]: E0308 04:17:35.084897 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="374c1597-22de-4f77-a61a-6e72a503dfd0" containerName="init" Mar 08 04:17:35.084919 master-0 kubenswrapper[18592]: I0308 04:17:35.084918 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="374c1597-22de-4f77-a61a-6e72a503dfd0" containerName="init" Mar 08 04:17:35.085037 master-0 kubenswrapper[18592]: E0308 04:17:35.084966 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="374c1597-22de-4f77-a61a-6e72a503dfd0" containerName="dnsmasq-dns" Mar 08 04:17:35.085037 master-0 kubenswrapper[18592]: I0308 04:17:35.084973 18592 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="374c1597-22de-4f77-a61a-6e72a503dfd0" containerName="dnsmasq-dns" Mar 08 04:17:35.085415 master-0 kubenswrapper[18592]: I0308 04:17:35.085371 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="374c1597-22de-4f77-a61a-6e72a503dfd0" containerName="dnsmasq-dns" Mar 08 04:17:35.088943 master-0 kubenswrapper[18592]: I0308 04:17:35.088867 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-conductor-0" Mar 08 04:17:35.091875 master-0 kubenswrapper[18592]: I0308 04:17:35.091809 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-conductor-scripts" Mar 08 04:17:35.092073 master-0 kubenswrapper[18592]: I0308 04:17:35.092025 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-conductor-config-data" Mar 08 04:17:35.128917 master-0 kubenswrapper[18592]: I0308 04:17:35.128865 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-conductor-0"] Mar 08 04:17:35.174914 master-0 kubenswrapper[18592]: I0308 04:17:35.174860 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abe912ba-4a33-4634-a3fb-b6fb09b38d8e-config-data\") pod \"ironic-conductor-0\" (UID: \"abe912ba-4a33-4634-a3fb-b6fb09b38d8e\") " pod="openstack/ironic-conductor-0" Mar 08 04:17:35.175125 master-0 kubenswrapper[18592]: I0308 04:17:35.174926 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abe912ba-4a33-4634-a3fb-b6fb09b38d8e-combined-ca-bundle\") pod \"ironic-conductor-0\" (UID: \"abe912ba-4a33-4634-a3fb-b6fb09b38d8e\") " pod="openstack/ironic-conductor-0" Mar 08 04:17:35.175125 master-0 kubenswrapper[18592]: I0308 04:17:35.174972 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/abe912ba-4a33-4634-a3fb-b6fb09b38d8e-etc-podinfo\") pod \"ironic-conductor-0\" (UID: \"abe912ba-4a33-4634-a3fb-b6fb09b38d8e\") " pod="openstack/ironic-conductor-0" Mar 08 04:17:35.175125 master-0 kubenswrapper[18592]: I0308 04:17:35.175028 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abe912ba-4a33-4634-a3fb-b6fb09b38d8e-scripts\") pod \"ironic-conductor-0\" (UID: \"abe912ba-4a33-4634-a3fb-b6fb09b38d8e\") " pod="openstack/ironic-conductor-0" Mar 08 04:17:35.175125 master-0 kubenswrapper[18592]: I0308 04:17:35.175057 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-7bef596f-5edc-4f76-9cd7-3859c72fe428\" (UniqueName: \"kubernetes.io/csi/topolvm.io^74b3bf21-8cc8-497e-ae2b-1bbaa1f923c6\") pod \"ironic-conductor-0\" (UID: \"abe912ba-4a33-4634-a3fb-b6fb09b38d8e\") " pod="openstack/ironic-conductor-0" Mar 08 04:17:35.175125 master-0 kubenswrapper[18592]: I0308 04:17:35.175106 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/abe912ba-4a33-4634-a3fb-b6fb09b38d8e-config-data-merged\") pod \"ironic-conductor-0\" (UID: \"abe912ba-4a33-4634-a3fb-b6fb09b38d8e\") " pod="openstack/ironic-conductor-0" Mar 08 04:17:35.175360 master-0 kubenswrapper[18592]: I0308 04:17:35.175169 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nmbp\" (UniqueName: \"kubernetes.io/projected/abe912ba-4a33-4634-a3fb-b6fb09b38d8e-kube-api-access-2nmbp\") pod \"ironic-conductor-0\" (UID: \"abe912ba-4a33-4634-a3fb-b6fb09b38d8e\") " pod="openstack/ironic-conductor-0" Mar 08 04:17:35.175360 master-0 kubenswrapper[18592]: I0308 04:17:35.175227 18592 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/abe912ba-4a33-4634-a3fb-b6fb09b38d8e-config-data-custom\") pod \"ironic-conductor-0\" (UID: \"abe912ba-4a33-4634-a3fb-b6fb09b38d8e\") " pod="openstack/ironic-conductor-0" Mar 08 04:17:35.278297 master-0 kubenswrapper[18592]: I0308 04:17:35.277644 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/abe912ba-4a33-4634-a3fb-b6fb09b38d8e-config-data-merged\") pod \"ironic-conductor-0\" (UID: \"abe912ba-4a33-4634-a3fb-b6fb09b38d8e\") " pod="openstack/ironic-conductor-0" Mar 08 04:17:35.278297 master-0 kubenswrapper[18592]: I0308 04:17:35.277755 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nmbp\" (UniqueName: \"kubernetes.io/projected/abe912ba-4a33-4634-a3fb-b6fb09b38d8e-kube-api-access-2nmbp\") pod \"ironic-conductor-0\" (UID: \"abe912ba-4a33-4634-a3fb-b6fb09b38d8e\") " pod="openstack/ironic-conductor-0" Mar 08 04:17:35.278297 master-0 kubenswrapper[18592]: I0308 04:17:35.277803 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/abe912ba-4a33-4634-a3fb-b6fb09b38d8e-config-data-custom\") pod \"ironic-conductor-0\" (UID: \"abe912ba-4a33-4634-a3fb-b6fb09b38d8e\") " pod="openstack/ironic-conductor-0" Mar 08 04:17:35.278581 master-0 kubenswrapper[18592]: I0308 04:17:35.278348 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/abe912ba-4a33-4634-a3fb-b6fb09b38d8e-config-data-merged\") pod \"ironic-conductor-0\" (UID: \"abe912ba-4a33-4634-a3fb-b6fb09b38d8e\") " pod="openstack/ironic-conductor-0" Mar 08 04:17:35.279807 master-0 kubenswrapper[18592]: I0308 04:17:35.279599 18592 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abe912ba-4a33-4634-a3fb-b6fb09b38d8e-config-data\") pod \"ironic-conductor-0\" (UID: \"abe912ba-4a33-4634-a3fb-b6fb09b38d8e\") " pod="openstack/ironic-conductor-0" Mar 08 04:17:35.279807 master-0 kubenswrapper[18592]: I0308 04:17:35.279634 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abe912ba-4a33-4634-a3fb-b6fb09b38d8e-combined-ca-bundle\") pod \"ironic-conductor-0\" (UID: \"abe912ba-4a33-4634-a3fb-b6fb09b38d8e\") " pod="openstack/ironic-conductor-0" Mar 08 04:17:35.279807 master-0 kubenswrapper[18592]: I0308 04:17:35.279662 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/abe912ba-4a33-4634-a3fb-b6fb09b38d8e-etc-podinfo\") pod \"ironic-conductor-0\" (UID: \"abe912ba-4a33-4634-a3fb-b6fb09b38d8e\") " pod="openstack/ironic-conductor-0" Mar 08 04:17:35.279807 master-0 kubenswrapper[18592]: I0308 04:17:35.279712 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abe912ba-4a33-4634-a3fb-b6fb09b38d8e-scripts\") pod \"ironic-conductor-0\" (UID: \"abe912ba-4a33-4634-a3fb-b6fb09b38d8e\") " pod="openstack/ironic-conductor-0" Mar 08 04:17:35.279807 master-0 kubenswrapper[18592]: I0308 04:17:35.279740 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-7bef596f-5edc-4f76-9cd7-3859c72fe428\" (UniqueName: \"kubernetes.io/csi/topolvm.io^74b3bf21-8cc8-497e-ae2b-1bbaa1f923c6\") pod \"ironic-conductor-0\" (UID: \"abe912ba-4a33-4634-a3fb-b6fb09b38d8e\") " pod="openstack/ironic-conductor-0" Mar 08 04:17:35.282683 master-0 kubenswrapper[18592]: I0308 04:17:35.281603 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/abe912ba-4a33-4634-a3fb-b6fb09b38d8e-config-data-custom\") pod \"ironic-conductor-0\" (UID: \"abe912ba-4a33-4634-a3fb-b6fb09b38d8e\") " pod="openstack/ironic-conductor-0" Mar 08 04:17:35.285016 master-0 kubenswrapper[18592]: I0308 04:17:35.284978 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abe912ba-4a33-4634-a3fb-b6fb09b38d8e-combined-ca-bundle\") pod \"ironic-conductor-0\" (UID: \"abe912ba-4a33-4634-a3fb-b6fb09b38d8e\") " pod="openstack/ironic-conductor-0" Mar 08 04:17:35.292035 master-0 kubenswrapper[18592]: I0308 04:17:35.290214 18592 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 08 04:17:35.292035 master-0 kubenswrapper[18592]: I0308 04:17:35.290261 18592 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-7bef596f-5edc-4f76-9cd7-3859c72fe428\" (UniqueName: \"kubernetes.io/csi/topolvm.io^74b3bf21-8cc8-497e-ae2b-1bbaa1f923c6\") pod \"ironic-conductor-0\" (UID: \"abe912ba-4a33-4634-a3fb-b6fb09b38d8e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/5af38179e4e2817556da0dd8b028fe3cc8a208bdb264614ab5af9f371ecba4d8/globalmount\"" pod="openstack/ironic-conductor-0" Mar 08 04:17:35.292035 master-0 kubenswrapper[18592]: I0308 04:17:35.290578 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/abe912ba-4a33-4634-a3fb-b6fb09b38d8e-etc-podinfo\") pod \"ironic-conductor-0\" (UID: \"abe912ba-4a33-4634-a3fb-b6fb09b38d8e\") " pod="openstack/ironic-conductor-0" Mar 08 04:17:35.323772 master-0 kubenswrapper[18592]: I0308 04:17:35.322775 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abe912ba-4a33-4634-a3fb-b6fb09b38d8e-scripts\") pod \"ironic-conductor-0\" (UID: 
\"abe912ba-4a33-4634-a3fb-b6fb09b38d8e\") " pod="openstack/ironic-conductor-0" Mar 08 04:17:35.323772 master-0 kubenswrapper[18592]: I0308 04:17:35.323599 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abe912ba-4a33-4634-a3fb-b6fb09b38d8e-config-data\") pod \"ironic-conductor-0\" (UID: \"abe912ba-4a33-4634-a3fb-b6fb09b38d8e\") " pod="openstack/ironic-conductor-0" Mar 08 04:17:35.343630 master-0 kubenswrapper[18592]: I0308 04:17:35.343529 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nmbp\" (UniqueName: \"kubernetes.io/projected/abe912ba-4a33-4634-a3fb-b6fb09b38d8e-kube-api-access-2nmbp\") pod \"ironic-conductor-0\" (UID: \"abe912ba-4a33-4634-a3fb-b6fb09b38d8e\") " pod="openstack/ironic-conductor-0" Mar 08 04:17:35.506194 master-0 kubenswrapper[18592]: I0308 04:17:35.505453 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-568f56df46-pmcb4" event={"ID":"a793c425-75a3-4c0d-95a0-43f6bdf96bf5","Type":"ContainerStarted","Data":"5f2b62eb8cc3095bd73c2cb8536c2a050457a4f21c7a40e305e241e12fc92ea7"} Mar 08 04:17:35.577558 master-0 kubenswrapper[18592]: I0308 04:17:35.576754 18592 generic.go:334] "Generic (PLEG): container finished" podID="70814fbc-2d62-4cc7-b815-5b791cdc5e0a" containerID="8d506f5f75e3ed6df7f7d128e438fa040648c2bb9e229937c4c8eb73ef731bd2" exitCode=0 Mar 08 04:17:35.577558 master-0 kubenswrapper[18592]: I0308 04:17:35.576860 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-create-kgmn4" event={"ID":"70814fbc-2d62-4cc7-b815-5b791cdc5e0a","Type":"ContainerDied","Data":"8d506f5f75e3ed6df7f7d128e438fa040648c2bb9e229937c4c8eb73ef731bd2"} Mar 08 04:17:35.585101 master-0 kubenswrapper[18592]: I0308 04:17:35.584368 18592 generic.go:334] "Generic (PLEG): container finished" podID="9a3f947a-748d-4cfd-a500-d759b58d22f4" 
containerID="da944f53c1cacc5ea7072e22c65947bc2fd94150ad9087fd44c3193c4dffd65c" exitCode=0 Mar 08 04:17:35.585101 master-0 kubenswrapper[18592]: I0308 04:17:35.584440 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-ff09-account-create-update-79kwt" event={"ID":"9a3f947a-748d-4cfd-a500-d759b58d22f4","Type":"ContainerDied","Data":"da944f53c1cacc5ea7072e22c65947bc2fd94150ad9087fd44c3193c4dffd65c"} Mar 08 04:17:35.595005 master-0 kubenswrapper[18592]: I0308 04:17:35.594586 18592 generic.go:334] "Generic (PLEG): container finished" podID="9d16547d-20e7-4839-b923-a3a01531ae0d" containerID="75860010ea3ac6d113be7b206770088f41b8ca19b7b7d09993bfc651f1bf0c0f" exitCode=0 Mar 08 04:17:35.595005 master-0 kubenswrapper[18592]: I0308 04:17:35.594633 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76b6c769dc-vwrbs" event={"ID":"9d16547d-20e7-4839-b923-a3a01531ae0d","Type":"ContainerDied","Data":"75860010ea3ac6d113be7b206770088f41b8ca19b7b7d09993bfc651f1bf0c0f"} Mar 08 04:17:35.595005 master-0 kubenswrapper[18592]: I0308 04:17:35.594657 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76b6c769dc-vwrbs" event={"ID":"9d16547d-20e7-4839-b923-a3a01531ae0d","Type":"ContainerStarted","Data":"b7195fc3433b5d8f4b052236ef5a60bebcffbf9ade104bdad8c75baad2aa0c13"} Mar 08 04:17:36.190973 master-0 kubenswrapper[18592]: I0308 04:17:36.190911 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="374c1597-22de-4f77-a61a-6e72a503dfd0" path="/var/lib/kubelet/pods/374c1597-22de-4f77-a61a-6e72a503dfd0/volumes" Mar 08 04:17:36.244842 master-0 kubenswrapper[18592]: I0308 04:17:36.242727 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-5b8d48c5b6-nqv28"] Mar 08 04:17:36.245792 master-0 kubenswrapper[18592]: I0308 04:17:36.245131 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-5b8d48c5b6-nqv28" Mar 08 04:17:36.253622 master-0 kubenswrapper[18592]: I0308 04:17:36.250527 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ironic-public-svc" Mar 08 04:17:36.253622 master-0 kubenswrapper[18592]: I0308 04:17:36.250700 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ironic-internal-svc" Mar 08 04:17:36.316998 master-0 kubenswrapper[18592]: I0308 04:17:36.308786 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-5b8d48c5b6-nqv28"] Mar 08 04:17:36.332288 master-0 kubenswrapper[18592]: I0308 04:17:36.330494 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-ff301-scheduler-0" Mar 08 04:17:36.346335 master-0 kubenswrapper[18592]: I0308 04:17:36.344249 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e39f8cef-9071-463a-af1a-4728e182c6c2-scripts\") pod \"ironic-5b8d48c5b6-nqv28\" (UID: \"e39f8cef-9071-463a-af1a-4728e182c6c2\") " pod="openstack/ironic-5b8d48c5b6-nqv28" Mar 08 04:17:36.346335 master-0 kubenswrapper[18592]: I0308 04:17:36.344326 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/e39f8cef-9071-463a-af1a-4728e182c6c2-config-data-merged\") pod \"ironic-5b8d48c5b6-nqv28\" (UID: \"e39f8cef-9071-463a-af1a-4728e182c6c2\") " pod="openstack/ironic-5b8d48c5b6-nqv28" Mar 08 04:17:36.346335 master-0 kubenswrapper[18592]: I0308 04:17:36.344361 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e39f8cef-9071-463a-af1a-4728e182c6c2-public-tls-certs\") pod \"ironic-5b8d48c5b6-nqv28\" (UID: \"e39f8cef-9071-463a-af1a-4728e182c6c2\") " 
pod="openstack/ironic-5b8d48c5b6-nqv28" Mar 08 04:17:36.346335 master-0 kubenswrapper[18592]: I0308 04:17:36.344497 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e39f8cef-9071-463a-af1a-4728e182c6c2-combined-ca-bundle\") pod \"ironic-5b8d48c5b6-nqv28\" (UID: \"e39f8cef-9071-463a-af1a-4728e182c6c2\") " pod="openstack/ironic-5b8d48c5b6-nqv28" Mar 08 04:17:36.346335 master-0 kubenswrapper[18592]: I0308 04:17:36.344528 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wqqt\" (UniqueName: \"kubernetes.io/projected/e39f8cef-9071-463a-af1a-4728e182c6c2-kube-api-access-8wqqt\") pod \"ironic-5b8d48c5b6-nqv28\" (UID: \"e39f8cef-9071-463a-af1a-4728e182c6c2\") " pod="openstack/ironic-5b8d48c5b6-nqv28" Mar 08 04:17:36.346335 master-0 kubenswrapper[18592]: I0308 04:17:36.344607 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e39f8cef-9071-463a-af1a-4728e182c6c2-logs\") pod \"ironic-5b8d48c5b6-nqv28\" (UID: \"e39f8cef-9071-463a-af1a-4728e182c6c2\") " pod="openstack/ironic-5b8d48c5b6-nqv28" Mar 08 04:17:36.346335 master-0 kubenswrapper[18592]: I0308 04:17:36.344701 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e39f8cef-9071-463a-af1a-4728e182c6c2-config-data-custom\") pod \"ironic-5b8d48c5b6-nqv28\" (UID: \"e39f8cef-9071-463a-af1a-4728e182c6c2\") " pod="openstack/ironic-5b8d48c5b6-nqv28" Mar 08 04:17:36.346335 master-0 kubenswrapper[18592]: I0308 04:17:36.344758 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e39f8cef-9071-463a-af1a-4728e182c6c2-internal-tls-certs\") pod 
\"ironic-5b8d48c5b6-nqv28\" (UID: \"e39f8cef-9071-463a-af1a-4728e182c6c2\") " pod="openstack/ironic-5b8d48c5b6-nqv28" Mar 08 04:17:36.346335 master-0 kubenswrapper[18592]: I0308 04:17:36.344852 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e39f8cef-9071-463a-af1a-4728e182c6c2-config-data\") pod \"ironic-5b8d48c5b6-nqv28\" (UID: \"e39f8cef-9071-463a-af1a-4728e182c6c2\") " pod="openstack/ironic-5b8d48c5b6-nqv28" Mar 08 04:17:36.346335 master-0 kubenswrapper[18592]: I0308 04:17:36.344888 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/e39f8cef-9071-463a-af1a-4728e182c6c2-etc-podinfo\") pod \"ironic-5b8d48c5b6-nqv28\" (UID: \"e39f8cef-9071-463a-af1a-4728e182c6c2\") " pod="openstack/ironic-5b8d48c5b6-nqv28" Mar 08 04:17:36.438974 master-0 kubenswrapper[18592]: I0308 04:17:36.438295 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-ff301-scheduler-0"] Mar 08 04:17:36.448620 master-0 kubenswrapper[18592]: I0308 04:17:36.448533 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e39f8cef-9071-463a-af1a-4728e182c6c2-logs\") pod \"ironic-5b8d48c5b6-nqv28\" (UID: \"e39f8cef-9071-463a-af1a-4728e182c6c2\") " pod="openstack/ironic-5b8d48c5b6-nqv28" Mar 08 04:17:36.449588 master-0 kubenswrapper[18592]: I0308 04:17:36.449032 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e39f8cef-9071-463a-af1a-4728e182c6c2-config-data-custom\") pod \"ironic-5b8d48c5b6-nqv28\" (UID: \"e39f8cef-9071-463a-af1a-4728e182c6c2\") " pod="openstack/ironic-5b8d48c5b6-nqv28" Mar 08 04:17:36.450586 master-0 kubenswrapper[18592]: I0308 04:17:36.450558 18592 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e39f8cef-9071-463a-af1a-4728e182c6c2-internal-tls-certs\") pod \"ironic-5b8d48c5b6-nqv28\" (UID: \"e39f8cef-9071-463a-af1a-4728e182c6c2\") " pod="openstack/ironic-5b8d48c5b6-nqv28" Mar 08 04:17:36.450780 master-0 kubenswrapper[18592]: I0308 04:17:36.450761 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e39f8cef-9071-463a-af1a-4728e182c6c2-config-data\") pod \"ironic-5b8d48c5b6-nqv28\" (UID: \"e39f8cef-9071-463a-af1a-4728e182c6c2\") " pod="openstack/ironic-5b8d48c5b6-nqv28" Mar 08 04:17:36.450984 master-0 kubenswrapper[18592]: I0308 04:17:36.450964 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/e39f8cef-9071-463a-af1a-4728e182c6c2-etc-podinfo\") pod \"ironic-5b8d48c5b6-nqv28\" (UID: \"e39f8cef-9071-463a-af1a-4728e182c6c2\") " pod="openstack/ironic-5b8d48c5b6-nqv28" Mar 08 04:17:36.451104 master-0 kubenswrapper[18592]: I0308 04:17:36.451087 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e39f8cef-9071-463a-af1a-4728e182c6c2-scripts\") pod \"ironic-5b8d48c5b6-nqv28\" (UID: \"e39f8cef-9071-463a-af1a-4728e182c6c2\") " pod="openstack/ironic-5b8d48c5b6-nqv28" Mar 08 04:17:36.451223 master-0 kubenswrapper[18592]: I0308 04:17:36.451206 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/e39f8cef-9071-463a-af1a-4728e182c6c2-config-data-merged\") pod \"ironic-5b8d48c5b6-nqv28\" (UID: \"e39f8cef-9071-463a-af1a-4728e182c6c2\") " pod="openstack/ironic-5b8d48c5b6-nqv28" Mar 08 04:17:36.451334 master-0 kubenswrapper[18592]: I0308 04:17:36.451317 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e39f8cef-9071-463a-af1a-4728e182c6c2-public-tls-certs\") pod \"ironic-5b8d48c5b6-nqv28\" (UID: \"e39f8cef-9071-463a-af1a-4728e182c6c2\") " pod="openstack/ironic-5b8d48c5b6-nqv28" Mar 08 04:17:36.454513 master-0 kubenswrapper[18592]: I0308 04:17:36.454470 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e39f8cef-9071-463a-af1a-4728e182c6c2-internal-tls-certs\") pod \"ironic-5b8d48c5b6-nqv28\" (UID: \"e39f8cef-9071-463a-af1a-4728e182c6c2\") " pod="openstack/ironic-5b8d48c5b6-nqv28" Mar 08 04:17:36.454660 master-0 kubenswrapper[18592]: I0308 04:17:36.454629 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e39f8cef-9071-463a-af1a-4728e182c6c2-config-data\") pod \"ironic-5b8d48c5b6-nqv28\" (UID: \"e39f8cef-9071-463a-af1a-4728e182c6c2\") " pod="openstack/ironic-5b8d48c5b6-nqv28" Mar 08 04:17:36.454751 master-0 kubenswrapper[18592]: I0308 04:17:36.454725 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e39f8cef-9071-463a-af1a-4728e182c6c2-logs\") pod \"ironic-5b8d48c5b6-nqv28\" (UID: \"e39f8cef-9071-463a-af1a-4728e182c6c2\") " pod="openstack/ironic-5b8d48c5b6-nqv28" Mar 08 04:17:36.454899 master-0 kubenswrapper[18592]: I0308 04:17:36.454871 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/e39f8cef-9071-463a-af1a-4728e182c6c2-config-data-merged\") pod \"ironic-5b8d48c5b6-nqv28\" (UID: \"e39f8cef-9071-463a-af1a-4728e182c6c2\") " pod="openstack/ironic-5b8d48c5b6-nqv28" Mar 08 04:17:36.455078 master-0 kubenswrapper[18592]: I0308 04:17:36.455024 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e39f8cef-9071-463a-af1a-4728e182c6c2-combined-ca-bundle\") pod \"ironic-5b8d48c5b6-nqv28\" (UID: \"e39f8cef-9071-463a-af1a-4728e182c6c2\") " pod="openstack/ironic-5b8d48c5b6-nqv28" Mar 08 04:17:36.455132 master-0 kubenswrapper[18592]: I0308 04:17:36.455110 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wqqt\" (UniqueName: \"kubernetes.io/projected/e39f8cef-9071-463a-af1a-4728e182c6c2-kube-api-access-8wqqt\") pod \"ironic-5b8d48c5b6-nqv28\" (UID: \"e39f8cef-9071-463a-af1a-4728e182c6c2\") " pod="openstack/ironic-5b8d48c5b6-nqv28" Mar 08 04:17:36.458197 master-0 kubenswrapper[18592]: I0308 04:17:36.456467 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/e39f8cef-9071-463a-af1a-4728e182c6c2-etc-podinfo\") pod \"ironic-5b8d48c5b6-nqv28\" (UID: \"e39f8cef-9071-463a-af1a-4728e182c6c2\") " pod="openstack/ironic-5b8d48c5b6-nqv28" Mar 08 04:17:36.458197 master-0 kubenswrapper[18592]: I0308 04:17:36.457706 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e39f8cef-9071-463a-af1a-4728e182c6c2-combined-ca-bundle\") pod \"ironic-5b8d48c5b6-nqv28\" (UID: \"e39f8cef-9071-463a-af1a-4728e182c6c2\") " pod="openstack/ironic-5b8d48c5b6-nqv28" Mar 08 04:17:36.459045 master-0 kubenswrapper[18592]: I0308 04:17:36.459018 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e39f8cef-9071-463a-af1a-4728e182c6c2-public-tls-certs\") pod \"ironic-5b8d48c5b6-nqv28\" (UID: \"e39f8cef-9071-463a-af1a-4728e182c6c2\") " pod="openstack/ironic-5b8d48c5b6-nqv28" Mar 08 04:17:36.462590 master-0 kubenswrapper[18592]: I0308 04:17:36.462556 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/e39f8cef-9071-463a-af1a-4728e182c6c2-scripts\") pod \"ironic-5b8d48c5b6-nqv28\" (UID: \"e39f8cef-9071-463a-af1a-4728e182c6c2\") " pod="openstack/ironic-5b8d48c5b6-nqv28" Mar 08 04:17:36.469728 master-0 kubenswrapper[18592]: I0308 04:17:36.469675 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e39f8cef-9071-463a-af1a-4728e182c6c2-config-data-custom\") pod \"ironic-5b8d48c5b6-nqv28\" (UID: \"e39f8cef-9071-463a-af1a-4728e182c6c2\") " pod="openstack/ironic-5b8d48c5b6-nqv28" Mar 08 04:17:36.474843 master-0 kubenswrapper[18592]: I0308 04:17:36.474787 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wqqt\" (UniqueName: \"kubernetes.io/projected/e39f8cef-9071-463a-af1a-4728e182c6c2-kube-api-access-8wqqt\") pod \"ironic-5b8d48c5b6-nqv28\" (UID: \"e39f8cef-9071-463a-af1a-4728e182c6c2\") " pod="openstack/ironic-5b8d48c5b6-nqv28" Mar 08 04:17:36.561731 master-0 kubenswrapper[18592]: I0308 04:17:36.561673 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:36.585455 master-0 kubenswrapper[18592]: I0308 04:17:36.585303 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-5b8d48c5b6-nqv28" Mar 08 04:17:36.615064 master-0 kubenswrapper[18592]: I0308 04:17:36.615013 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76b6c769dc-vwrbs" event={"ID":"9d16547d-20e7-4839-b923-a3a01531ae0d","Type":"ContainerStarted","Data":"c384a30be75094a05190f4bcfd0ae9c347bb91927d000c9e7d35d9d82e6139e3"} Mar 08 04:17:36.615212 master-0 kubenswrapper[18592]: I0308 04:17:36.615089 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-76b6c769dc-vwrbs" Mar 08 04:17:36.615212 master-0 kubenswrapper[18592]: I0308 04:17:36.615097 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-ff301-scheduler-0" podUID="808f8fcb-c2b2-4df7-9c7a-9838aaf744af" containerName="cinder-scheduler" containerID="cri-o://3eb0f5404cad7df753b609a49341dcfedcd0f8ba7eba964052e8a5ff5d83c94d" gracePeriod=30 Mar 08 04:17:36.615531 master-0 kubenswrapper[18592]: I0308 04:17:36.615223 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-ff301-scheduler-0" podUID="808f8fcb-c2b2-4df7-9c7a-9838aaf744af" containerName="probe" containerID="cri-o://18b944393cbbd8944c868595b59690bf821e5a692d25a4b47311ab6a5400e2d5" gracePeriod=30 Mar 08 04:17:36.656667 master-0 kubenswrapper[18592]: I0308 04:17:36.653158 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-76b6c769dc-vwrbs" podStartSLOduration=3.6531396000000003 podStartE2EDuration="3.6531396s" podCreationTimestamp="2026-03-08 04:17:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 04:17:36.646257162 +0000 UTC m=+1468.745011532" watchObservedRunningTime="2026-03-08 04:17:36.6531396 +0000 UTC m=+1468.751893950" Mar 08 04:17:36.656667 master-0 kubenswrapper[18592]: I0308 04:17:36.654459 18592 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-ff301-volume-lvm-iscsi-0"] Mar 08 04:17:36.656667 master-0 kubenswrapper[18592]: I0308 04:17:36.654684 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-ff301-volume-lvm-iscsi-0" podUID="ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01" containerName="cinder-volume" containerID="cri-o://026138a76b3039bbe26925baaeac4b9e1f51172480fae9c8f1c644a11823150b" gracePeriod=30 Mar 08 04:17:36.656667 master-0 kubenswrapper[18592]: I0308 04:17:36.655164 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-ff301-volume-lvm-iscsi-0" podUID="ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01" containerName="probe" containerID="cri-o://98ef436f086a681790db38099cf9de7225c61f5dcbfdb3938dd8fddf0c0c5b68" gracePeriod=30 Mar 08 04:17:36.769351 master-0 kubenswrapper[18592]: I0308 04:17:36.769235 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:36.871992 master-0 kubenswrapper[18592]: I0308 04:17:36.871812 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-ff301-backup-0"] Mar 08 04:17:36.876256 master-0 kubenswrapper[18592]: I0308 04:17:36.876179 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-7bef596f-5edc-4f76-9cd7-3859c72fe428\" (UniqueName: \"kubernetes.io/csi/topolvm.io^74b3bf21-8cc8-497e-ae2b-1bbaa1f923c6\") pod \"ironic-conductor-0\" (UID: \"abe912ba-4a33-4634-a3fb-b6fb09b38d8e\") " pod="openstack/ironic-conductor-0" Mar 08 04:17:36.921071 master-0 kubenswrapper[18592]: I0308 04:17:36.920995 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-conductor-0" Mar 08 04:17:37.625555 master-0 kubenswrapper[18592]: I0308 04:17:37.625355 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-ff301-backup-0" podUID="0f0421ee-53ce-4160-abfc-a5968415005b" containerName="cinder-backup" containerID="cri-o://0fcdce70899a7ba893c9e7446288886f7e98ce6c222494e6a125caa9b136ecaf" gracePeriod=30 Mar 08 04:17:37.625555 master-0 kubenswrapper[18592]: I0308 04:17:37.625488 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-ff301-backup-0" podUID="0f0421ee-53ce-4160-abfc-a5968415005b" containerName="probe" containerID="cri-o://12199ca2ce66a98f608e1da86a561392f75037c0b84ea80ae27c5022a0692d96" gracePeriod=30 Mar 08 04:17:37.986520 master-0 kubenswrapper[18592]: I0308 04:17:37.986150 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-ff09-account-create-update-79kwt" Mar 08 04:17:38.012537 master-0 kubenswrapper[18592]: I0308 04:17:38.011190 18592 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-db-create-kgmn4" Mar 08 04:17:38.137982 master-0 kubenswrapper[18592]: I0308 04:17:38.131931 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/70814fbc-2d62-4cc7-b815-5b791cdc5e0a-operator-scripts\") pod \"70814fbc-2d62-4cc7-b815-5b791cdc5e0a\" (UID: \"70814fbc-2d62-4cc7-b815-5b791cdc5e0a\") " Mar 08 04:17:38.137982 master-0 kubenswrapper[18592]: I0308 04:17:38.136684 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a3f947a-748d-4cfd-a500-d759b58d22f4-operator-scripts\") pod \"9a3f947a-748d-4cfd-a500-d759b58d22f4\" (UID: \"9a3f947a-748d-4cfd-a500-d759b58d22f4\") " Mar 08 04:17:38.137982 master-0 kubenswrapper[18592]: I0308 04:17:38.136731 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b74sg\" (UniqueName: \"kubernetes.io/projected/70814fbc-2d62-4cc7-b815-5b791cdc5e0a-kube-api-access-b74sg\") pod \"70814fbc-2d62-4cc7-b815-5b791cdc5e0a\" (UID: \"70814fbc-2d62-4cc7-b815-5b791cdc5e0a\") " Mar 08 04:17:38.137982 master-0 kubenswrapper[18592]: I0308 04:17:38.136850 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qc2z\" (UniqueName: \"kubernetes.io/projected/9a3f947a-748d-4cfd-a500-d759b58d22f4-kube-api-access-8qc2z\") pod \"9a3f947a-748d-4cfd-a500-d759b58d22f4\" (UID: \"9a3f947a-748d-4cfd-a500-d759b58d22f4\") " Mar 08 04:17:38.137982 master-0 kubenswrapper[18592]: I0308 04:17:38.137086 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70814fbc-2d62-4cc7-b815-5b791cdc5e0a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "70814fbc-2d62-4cc7-b815-5b791cdc5e0a" (UID: "70814fbc-2d62-4cc7-b815-5b791cdc5e0a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:17:38.137982 master-0 kubenswrapper[18592]: I0308 04:17:38.137576 18592 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/70814fbc-2d62-4cc7-b815-5b791cdc5e0a-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 04:17:38.137982 master-0 kubenswrapper[18592]: I0308 04:17:38.137735 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a3f947a-748d-4cfd-a500-d759b58d22f4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9a3f947a-748d-4cfd-a500-d759b58d22f4" (UID: "9a3f947a-748d-4cfd-a500-d759b58d22f4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:17:38.148748 master-0 kubenswrapper[18592]: I0308 04:17:38.147878 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70814fbc-2d62-4cc7-b815-5b791cdc5e0a-kube-api-access-b74sg" (OuterVolumeSpecName: "kube-api-access-b74sg") pod "70814fbc-2d62-4cc7-b815-5b791cdc5e0a" (UID: "70814fbc-2d62-4cc7-b815-5b791cdc5e0a"). InnerVolumeSpecName "kube-api-access-b74sg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 04:17:38.178847 master-0 kubenswrapper[18592]: I0308 04:17:38.174520 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a3f947a-748d-4cfd-a500-d759b58d22f4-kube-api-access-8qc2z" (OuterVolumeSpecName: "kube-api-access-8qc2z") pod "9a3f947a-748d-4cfd-a500-d759b58d22f4" (UID: "9a3f947a-748d-4cfd-a500-d759b58d22f4"). InnerVolumeSpecName "kube-api-access-8qc2z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 04:17:38.240633 master-0 kubenswrapper[18592]: I0308 04:17:38.240031 18592 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a3f947a-748d-4cfd-a500-d759b58d22f4-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 04:17:38.240633 master-0 kubenswrapper[18592]: I0308 04:17:38.240094 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b74sg\" (UniqueName: \"kubernetes.io/projected/70814fbc-2d62-4cc7-b815-5b791cdc5e0a-kube-api-access-b74sg\") on node \"master-0\" DevicePath \"\"" Mar 08 04:17:38.240633 master-0 kubenswrapper[18592]: I0308 04:17:38.240267 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qc2z\" (UniqueName: \"kubernetes.io/projected/9a3f947a-748d-4cfd-a500-d759b58d22f4-kube-api-access-8qc2z\") on node \"master-0\" DevicePath \"\"" Mar 08 04:17:38.305947 master-0 kubenswrapper[18592]: I0308 04:17:38.305463 18592 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:38.438922 master-0 kubenswrapper[18592]: I0308 04:17:38.431273 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-5b8d48c5b6-nqv28"] Mar 08 04:17:38.449858 master-0 kubenswrapper[18592]: I0308 04:17:38.443780 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-etc-nvme\") pod \"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01\" (UID: \"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01\") " Mar 08 04:17:38.450215 master-0 kubenswrapper[18592]: I0308 04:17:38.450182 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-combined-ca-bundle\") pod \"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01\" (UID: \"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01\") " Mar 08 04:17:38.450333 master-0 kubenswrapper[18592]: I0308 04:17:38.450318 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-sys\") pod \"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01\" (UID: \"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01\") " Mar 08 04:17:38.456848 master-0 kubenswrapper[18592]: I0308 04:17:38.450451 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-etc-machine-id\") pod \"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01\" (UID: \"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01\") " Mar 08 04:17:38.457102 master-0 kubenswrapper[18592]: I0308 04:17:38.457080 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-lib-modules\") pod 
\"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01\" (UID: \"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01\") " Mar 08 04:17:38.457290 master-0 kubenswrapper[18592]: I0308 04:17:38.457275 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-etc-iscsi\") pod \"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01\" (UID: \"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01\") " Mar 08 04:17:38.457382 master-0 kubenswrapper[18592]: I0308 04:17:38.457370 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-var-lib-cinder\") pod \"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01\" (UID: \"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01\") " Mar 08 04:17:38.457481 master-0 kubenswrapper[18592]: I0308 04:17:38.457469 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-config-data\") pod \"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01\" (UID: \"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01\") " Mar 08 04:17:38.459889 master-0 kubenswrapper[18592]: I0308 04:17:38.457555 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-var-locks-brick\") pod \"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01\" (UID: \"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01\") " Mar 08 04:17:38.460141 master-0 kubenswrapper[18592]: I0308 04:17:38.460118 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-var-locks-cinder\") pod \"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01\" (UID: \"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01\") " Mar 08 04:17:38.460274 master-0 kubenswrapper[18592]: I0308 04:17:38.460260 18592 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-dev\") pod \"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01\" (UID: \"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01\") " Mar 08 04:17:38.460369 master-0 kubenswrapper[18592]: I0308 04:17:38.460356 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-scripts\") pod \"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01\" (UID: \"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01\") " Mar 08 04:17:38.460466 master-0 kubenswrapper[18592]: I0308 04:17:38.460451 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-config-data-custom\") pod \"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01\" (UID: \"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01\") " Mar 08 04:17:38.460603 master-0 kubenswrapper[18592]: I0308 04:17:38.460590 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwvdn\" (UniqueName: \"kubernetes.io/projected/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-kube-api-access-gwvdn\") pod \"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01\" (UID: \"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01\") " Mar 08 04:17:38.460685 master-0 kubenswrapper[18592]: I0308 04:17:38.460670 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-run\") pod \"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01\" (UID: \"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01\") " Mar 08 04:17:38.463155 master-0 kubenswrapper[18592]: I0308 04:17:38.444673 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod 
"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01" (UID: "ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 04:17:38.463155 master-0 kubenswrapper[18592]: I0308 04:17:38.452220 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-sys" (OuterVolumeSpecName: "sys") pod "ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01" (UID: "ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 04:17:38.463155 master-0 kubenswrapper[18592]: I0308 04:17:38.452287 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01" (UID: "ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 04:17:38.463155 master-0 kubenswrapper[18592]: I0308 04:17:38.457991 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01" (UID: "ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 04:17:38.463155 master-0 kubenswrapper[18592]: I0308 04:17:38.457990 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01" (UID: "ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01"). InnerVolumeSpecName "etc-iscsi". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 04:17:38.463155 master-0 kubenswrapper[18592]: I0308 04:17:38.458118 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-var-lib-cinder" (OuterVolumeSpecName: "var-lib-cinder") pod "ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01" (UID: "ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01"). InnerVolumeSpecName "var-lib-cinder". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 04:17:38.463155 master-0 kubenswrapper[18592]: I0308 04:17:38.458143 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01" (UID: "ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 04:17:38.463155 master-0 kubenswrapper[18592]: I0308 04:17:38.460679 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-dev" (OuterVolumeSpecName: "dev") pod "ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01" (UID: "ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 04:17:38.463155 master-0 kubenswrapper[18592]: I0308 04:17:38.460703 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-var-locks-cinder" (OuterVolumeSpecName: "var-locks-cinder") pod "ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01" (UID: "ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01"). InnerVolumeSpecName "var-locks-cinder". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 04:17:38.463155 master-0 kubenswrapper[18592]: W0308 04:17:38.460948 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode39f8cef_9071_463a_af1a_4728e182c6c2.slice/crio-6a12810ba7c08d74a600630af6ae9555bdd50a494e93dc29a75d7ebfa5e60d11 WatchSource:0}: Error finding container 6a12810ba7c08d74a600630af6ae9555bdd50a494e93dc29a75d7ebfa5e60d11: Status 404 returned error can't find the container with id 6a12810ba7c08d74a600630af6ae9555bdd50a494e93dc29a75d7ebfa5e60d11 Mar 08 04:17:38.463155 master-0 kubenswrapper[18592]: I0308 04:17:38.461022 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-run" (OuterVolumeSpecName: "run") pod "ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01" (UID: "ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 04:17:38.466865 master-0 kubenswrapper[18592]: I0308 04:17:38.466841 18592 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-etc-iscsi\") on node \"master-0\" DevicePath \"\"" Mar 08 04:17:38.466981 master-0 kubenswrapper[18592]: I0308 04:17:38.466969 18592 reconciler_common.go:293] "Volume detached for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-var-lib-cinder\") on node \"master-0\" DevicePath \"\"" Mar 08 04:17:38.467052 master-0 kubenswrapper[18592]: I0308 04:17:38.467041 18592 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-var-locks-brick\") on node \"master-0\" DevicePath \"\"" Mar 08 04:17:38.467124 master-0 kubenswrapper[18592]: I0308 04:17:38.467113 18592 reconciler_common.go:293] 
"Volume detached for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-var-locks-cinder\") on node \"master-0\" DevicePath \"\"" Mar 08 04:17:38.467441 master-0 kubenswrapper[18592]: I0308 04:17:38.467427 18592 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-dev\") on node \"master-0\" DevicePath \"\"" Mar 08 04:17:38.467515 master-0 kubenswrapper[18592]: I0308 04:17:38.467504 18592 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-run\") on node \"master-0\" DevicePath \"\"" Mar 08 04:17:38.467646 master-0 kubenswrapper[18592]: I0308 04:17:38.467635 18592 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-etc-nvme\") on node \"master-0\" DevicePath \"\"" Mar 08 04:17:38.467722 master-0 kubenswrapper[18592]: I0308 04:17:38.467712 18592 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-sys\") on node \"master-0\" DevicePath \"\"" Mar 08 04:17:38.468071 master-0 kubenswrapper[18592]: I0308 04:17:38.467801 18592 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-etc-machine-id\") on node \"master-0\" DevicePath \"\"" Mar 08 04:17:38.468282 master-0 kubenswrapper[18592]: I0308 04:17:38.468269 18592 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-lib-modules\") on node \"master-0\" DevicePath \"\"" Mar 08 04:17:38.468382 master-0 kubenswrapper[18592]: I0308 04:17:38.466908 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-scripts" (OuterVolumeSpecName: "scripts") pod "ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01" (UID: "ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:17:38.468453 master-0 kubenswrapper[18592]: I0308 04:17:38.468035 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-kube-api-access-gwvdn" (OuterVolumeSpecName: "kube-api-access-gwvdn") pod "ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01" (UID: "ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01"). InnerVolumeSpecName "kube-api-access-gwvdn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 04:17:38.468646 master-0 kubenswrapper[18592]: I0308 04:17:38.468625 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01" (UID: "ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:17:38.524191 master-0 kubenswrapper[18592]: W0308 04:17:38.524138 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podabe912ba_4a33_4634_a3fb_b6fb09b38d8e.slice/crio-daa084427eac6118ad26fa44f8f99d16d393f4b268a1a8cc332a2e0e0ba0b6a4 WatchSource:0}: Error finding container daa084427eac6118ad26fa44f8f99d16d393f4b268a1a8cc332a2e0e0ba0b6a4: Status 404 returned error can't find the container with id daa084427eac6118ad26fa44f8f99d16d393f4b268a1a8cc332a2e0e0ba0b6a4 Mar 08 04:17:38.524324 master-0 kubenswrapper[18592]: I0308 04:17:38.524273 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-conductor-0"] Mar 08 04:17:38.545104 master-0 kubenswrapper[18592]: I0308 04:17:38.544561 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01" (UID: "ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:17:38.569867 master-0 kubenswrapper[18592]: I0308 04:17:38.569804 18592 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 04:17:38.569867 master-0 kubenswrapper[18592]: I0308 04:17:38.569856 18592 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 04:17:38.569867 master-0 kubenswrapper[18592]: I0308 04:17:38.569867 18592 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-config-data-custom\") on node \"master-0\" DevicePath \"\"" Mar 08 04:17:38.569867 master-0 kubenswrapper[18592]: I0308 04:17:38.569878 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwvdn\" (UniqueName: \"kubernetes.io/projected/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-kube-api-access-gwvdn\") on node \"master-0\" DevicePath \"\"" Mar 08 04:17:38.610971 master-0 kubenswrapper[18592]: I0308 04:17:38.610922 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-config-data" (OuterVolumeSpecName: "config-data") pod "ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01" (UID: "ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:17:38.649657 master-0 kubenswrapper[18592]: I0308 04:17:38.649590 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"abe912ba-4a33-4634-a3fb-b6fb09b38d8e","Type":"ContainerStarted","Data":"daa084427eac6118ad26fa44f8f99d16d393f4b268a1a8cc332a2e0e0ba0b6a4"} Mar 08 04:17:38.652355 master-0 kubenswrapper[18592]: I0308 04:17:38.652279 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-5b8d48c5b6-nqv28" event={"ID":"e39f8cef-9071-463a-af1a-4728e182c6c2","Type":"ContainerStarted","Data":"6a12810ba7c08d74a600630af6ae9555bdd50a494e93dc29a75d7ebfa5e60d11"} Mar 08 04:17:38.655027 master-0 kubenswrapper[18592]: I0308 04:17:38.654088 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-ff09-account-create-update-79kwt" Mar 08 04:17:38.655027 master-0 kubenswrapper[18592]: I0308 04:17:38.654986 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-ff09-account-create-update-79kwt" event={"ID":"9a3f947a-748d-4cfd-a500-d759b58d22f4","Type":"ContainerDied","Data":"1049c91a352825463169314201825add1392d083cd00b2ada8beb7b35ebcf11d"} Mar 08 04:17:38.655027 master-0 kubenswrapper[18592]: I0308 04:17:38.655004 18592 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1049c91a352825463169314201825add1392d083cd00b2ada8beb7b35ebcf11d" Mar 08 04:17:38.656845 master-0 kubenswrapper[18592]: I0308 04:17:38.656791 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-859d47fc89-z2wvz" event={"ID":"05ba9b98-7d2f-4a9b-80ad-60793d8279e8","Type":"ContainerStarted","Data":"78a26f8923875ef6ca34dad35ae3996e9a46df95c44c8b7f37f4d7b9154f9694"} Mar 08 04:17:38.657114 master-0 kubenswrapper[18592]: I0308 04:17:38.657085 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/ironic-neutron-agent-859d47fc89-z2wvz" Mar 08 04:17:38.661605 master-0 kubenswrapper[18592]: I0308 04:17:38.661522 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-568f56df46-pmcb4" event={"ID":"a793c425-75a3-4c0d-95a0-43f6bdf96bf5","Type":"ContainerStarted","Data":"d5391456248391ef7db24714017806821c3d4bcb5663d38293aef340995bb5f7"} Mar 08 04:17:38.668548 master-0 kubenswrapper[18592]: I0308 04:17:38.668256 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-db-create-kgmn4" Mar 08 04:17:38.668548 master-0 kubenswrapper[18592]: I0308 04:17:38.668292 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-create-kgmn4" event={"ID":"70814fbc-2d62-4cc7-b815-5b791cdc5e0a","Type":"ContainerDied","Data":"e3095eaef0084f46e80f6961f7c5bcece0f7c7ca9ba6b49ec98e406a8f49d996"} Mar 08 04:17:38.668548 master-0 kubenswrapper[18592]: I0308 04:17:38.668371 18592 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3095eaef0084f46e80f6961f7c5bcece0f7c7ca9ba6b49ec98e406a8f49d996" Mar 08 04:17:38.679233 master-0 kubenswrapper[18592]: I0308 04:17:38.679140 18592 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01-config-data\") on node \"master-0\" DevicePath \"\"" Mar 08 04:17:38.682627 master-0 kubenswrapper[18592]: I0308 04:17:38.682588 18592 generic.go:334] "Generic (PLEG): container finished" podID="ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01" containerID="98ef436f086a681790db38099cf9de7225c61f5dcbfdb3938dd8fddf0c0c5b68" exitCode=0 Mar 08 04:17:38.682729 master-0 kubenswrapper[18592]: I0308 04:17:38.682714 18592 generic.go:334] "Generic (PLEG): container finished" podID="ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01" containerID="026138a76b3039bbe26925baaeac4b9e1f51172480fae9c8f1c644a11823150b" exitCode=0 Mar 08 04:17:38.682999 master-0 
kubenswrapper[18592]: I0308 04:17:38.682983 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:38.683696 master-0 kubenswrapper[18592]: I0308 04:17:38.682979 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ff301-volume-lvm-iscsi-0" event={"ID":"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01","Type":"ContainerDied","Data":"98ef436f086a681790db38099cf9de7225c61f5dcbfdb3938dd8fddf0c0c5b68"} Mar 08 04:17:38.683851 master-0 kubenswrapper[18592]: I0308 04:17:38.683709 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ff301-volume-lvm-iscsi-0" event={"ID":"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01","Type":"ContainerDied","Data":"026138a76b3039bbe26925baaeac4b9e1f51172480fae9c8f1c644a11823150b"} Mar 08 04:17:38.683851 master-0 kubenswrapper[18592]: I0308 04:17:38.683725 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ff301-volume-lvm-iscsi-0" event={"ID":"ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01","Type":"ContainerDied","Data":"f743e8645bdb4996ccedcac0cbe2af7c6001c7fafc0468d8e2082f085338ee8a"} Mar 08 04:17:38.683851 master-0 kubenswrapper[18592]: I0308 04:17:38.683838 18592 scope.go:117] "RemoveContainer" containerID="98ef436f086a681790db38099cf9de7225c61f5dcbfdb3938dd8fddf0c0c5b68" Mar 08 04:17:38.693681 master-0 kubenswrapper[18592]: I0308 04:17:38.693600 18592 generic.go:334] "Generic (PLEG): container finished" podID="0f0421ee-53ce-4160-abfc-a5968415005b" containerID="12199ca2ce66a98f608e1da86a561392f75037c0b84ea80ae27c5022a0692d96" exitCode=0 Mar 08 04:17:38.693882 master-0 kubenswrapper[18592]: I0308 04:17:38.693815 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ff301-backup-0" event={"ID":"0f0421ee-53ce-4160-abfc-a5968415005b","Type":"ContainerDied","Data":"12199ca2ce66a98f608e1da86a561392f75037c0b84ea80ae27c5022a0692d96"} Mar 08 04:17:38.706918 master-0 kubenswrapper[18592]: 
I0308 04:17:38.694266 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-neutron-agent-859d47fc89-z2wvz" podStartSLOduration=2.06522482 podStartE2EDuration="5.694245503s" podCreationTimestamp="2026-03-08 04:17:33 +0000 UTC" firstStartedPulling="2026-03-08 04:17:34.290154449 +0000 UTC m=+1466.388908799" lastFinishedPulling="2026-03-08 04:17:37.919175142 +0000 UTC m=+1470.017929482" observedRunningTime="2026-03-08 04:17:38.671346801 +0000 UTC m=+1470.770101151" watchObservedRunningTime="2026-03-08 04:17:38.694245503 +0000 UTC m=+1470.792999853" Mar 08 04:17:38.707227 master-0 kubenswrapper[18592]: I0308 04:17:38.704774 18592 generic.go:334] "Generic (PLEG): container finished" podID="808f8fcb-c2b2-4df7-9c7a-9838aaf744af" containerID="18b944393cbbd8944c868595b59690bf821e5a692d25a4b47311ab6a5400e2d5" exitCode=0 Mar 08 04:17:38.707264 master-0 kubenswrapper[18592]: I0308 04:17:38.704804 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ff301-scheduler-0" event={"ID":"808f8fcb-c2b2-4df7-9c7a-9838aaf744af","Type":"ContainerDied","Data":"18b944393cbbd8944c868595b59690bf821e5a692d25a4b47311ab6a5400e2d5"} Mar 08 04:17:38.744392 master-0 kubenswrapper[18592]: I0308 04:17:38.744342 18592 scope.go:117] "RemoveContainer" containerID="026138a76b3039bbe26925baaeac4b9e1f51172480fae9c8f1c644a11823150b" Mar 08 04:17:38.883810 master-0 kubenswrapper[18592]: I0308 04:17:38.883757 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-ff301-api-0" Mar 08 04:17:38.928175 master-0 kubenswrapper[18592]: I0308 04:17:38.927703 18592 scope.go:117] "RemoveContainer" containerID="98ef436f086a681790db38099cf9de7225c61f5dcbfdb3938dd8fddf0c0c5b68" Mar 08 04:17:38.928932 master-0 kubenswrapper[18592]: E0308 04:17:38.928565 18592 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"98ef436f086a681790db38099cf9de7225c61f5dcbfdb3938dd8fddf0c0c5b68\": container with ID starting with 98ef436f086a681790db38099cf9de7225c61f5dcbfdb3938dd8fddf0c0c5b68 not found: ID does not exist" containerID="98ef436f086a681790db38099cf9de7225c61f5dcbfdb3938dd8fddf0c0c5b68" Mar 08 04:17:38.928932 master-0 kubenswrapper[18592]: I0308 04:17:38.928618 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98ef436f086a681790db38099cf9de7225c61f5dcbfdb3938dd8fddf0c0c5b68"} err="failed to get container status \"98ef436f086a681790db38099cf9de7225c61f5dcbfdb3938dd8fddf0c0c5b68\": rpc error: code = NotFound desc = could not find container \"98ef436f086a681790db38099cf9de7225c61f5dcbfdb3938dd8fddf0c0c5b68\": container with ID starting with 98ef436f086a681790db38099cf9de7225c61f5dcbfdb3938dd8fddf0c0c5b68 not found: ID does not exist" Mar 08 04:17:38.928932 master-0 kubenswrapper[18592]: I0308 04:17:38.928643 18592 scope.go:117] "RemoveContainer" containerID="026138a76b3039bbe26925baaeac4b9e1f51172480fae9c8f1c644a11823150b" Mar 08 04:17:38.929326 master-0 kubenswrapper[18592]: E0308 04:17:38.929076 18592 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"026138a76b3039bbe26925baaeac4b9e1f51172480fae9c8f1c644a11823150b\": container with ID starting with 026138a76b3039bbe26925baaeac4b9e1f51172480fae9c8f1c644a11823150b not found: ID does not exist" containerID="026138a76b3039bbe26925baaeac4b9e1f51172480fae9c8f1c644a11823150b" Mar 08 04:17:38.929326 master-0 kubenswrapper[18592]: I0308 04:17:38.929110 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"026138a76b3039bbe26925baaeac4b9e1f51172480fae9c8f1c644a11823150b"} err="failed to get container status \"026138a76b3039bbe26925baaeac4b9e1f51172480fae9c8f1c644a11823150b\": rpc error: code = NotFound desc = could not find container 
\"026138a76b3039bbe26925baaeac4b9e1f51172480fae9c8f1c644a11823150b\": container with ID starting with 026138a76b3039bbe26925baaeac4b9e1f51172480fae9c8f1c644a11823150b not found: ID does not exist" Mar 08 04:17:38.929326 master-0 kubenswrapper[18592]: I0308 04:17:38.929130 18592 scope.go:117] "RemoveContainer" containerID="98ef436f086a681790db38099cf9de7225c61f5dcbfdb3938dd8fddf0c0c5b68" Mar 08 04:17:38.930599 master-0 kubenswrapper[18592]: I0308 04:17:38.929441 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98ef436f086a681790db38099cf9de7225c61f5dcbfdb3938dd8fddf0c0c5b68"} err="failed to get container status \"98ef436f086a681790db38099cf9de7225c61f5dcbfdb3938dd8fddf0c0c5b68\": rpc error: code = NotFound desc = could not find container \"98ef436f086a681790db38099cf9de7225c61f5dcbfdb3938dd8fddf0c0c5b68\": container with ID starting with 98ef436f086a681790db38099cf9de7225c61f5dcbfdb3938dd8fddf0c0c5b68 not found: ID does not exist" Mar 08 04:17:38.930599 master-0 kubenswrapper[18592]: I0308 04:17:38.929491 18592 scope.go:117] "RemoveContainer" containerID="026138a76b3039bbe26925baaeac4b9e1f51172480fae9c8f1c644a11823150b" Mar 08 04:17:38.930599 master-0 kubenswrapper[18592]: I0308 04:17:38.930006 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"026138a76b3039bbe26925baaeac4b9e1f51172480fae9c8f1c644a11823150b"} err="failed to get container status \"026138a76b3039bbe26925baaeac4b9e1f51172480fae9c8f1c644a11823150b\": rpc error: code = NotFound desc = could not find container \"026138a76b3039bbe26925baaeac4b9e1f51172480fae9c8f1c644a11823150b\": container with ID starting with 026138a76b3039bbe26925baaeac4b9e1f51172480fae9c8f1c644a11823150b not found: ID does not exist" Mar 08 04:17:38.992851 master-0 kubenswrapper[18592]: I0308 04:17:38.981552 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-ff301-volume-lvm-iscsi-0"] Mar 08 
04:17:38.999591 master-0 kubenswrapper[18592]: I0308 04:17:38.999542 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-ff301-volume-lvm-iscsi-0"] Mar 08 04:17:39.012932 master-0 kubenswrapper[18592]: I0308 04:17:39.011809 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-ff301-volume-lvm-iscsi-0"] Mar 08 04:17:39.013419 master-0 kubenswrapper[18592]: E0308 04:17:39.013358 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70814fbc-2d62-4cc7-b815-5b791cdc5e0a" containerName="mariadb-database-create" Mar 08 04:17:39.013419 master-0 kubenswrapper[18592]: I0308 04:17:39.013382 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="70814fbc-2d62-4cc7-b815-5b791cdc5e0a" containerName="mariadb-database-create" Mar 08 04:17:39.013507 master-0 kubenswrapper[18592]: E0308 04:17:39.013430 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01" containerName="probe" Mar 08 04:17:39.013507 master-0 kubenswrapper[18592]: I0308 04:17:39.013438 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01" containerName="probe" Mar 08 04:17:39.013507 master-0 kubenswrapper[18592]: E0308 04:17:39.013497 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a3f947a-748d-4cfd-a500-d759b58d22f4" containerName="mariadb-account-create-update" Mar 08 04:17:39.013507 master-0 kubenswrapper[18592]: I0308 04:17:39.013505 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a3f947a-748d-4cfd-a500-d759b58d22f4" containerName="mariadb-account-create-update" Mar 08 04:17:39.013642 master-0 kubenswrapper[18592]: E0308 04:17:39.013534 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01" containerName="cinder-volume" Mar 08 04:17:39.013642 master-0 kubenswrapper[18592]: I0308 04:17:39.013540 18592 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01" containerName="cinder-volume" Mar 08 04:17:39.014509 master-0 kubenswrapper[18592]: I0308 04:17:39.014321 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="70814fbc-2d62-4cc7-b815-5b791cdc5e0a" containerName="mariadb-database-create" Mar 08 04:17:39.014509 master-0 kubenswrapper[18592]: I0308 04:17:39.014358 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01" containerName="cinder-volume" Mar 08 04:17:39.014509 master-0 kubenswrapper[18592]: I0308 04:17:39.014413 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01" containerName="probe" Mar 08 04:17:39.014509 master-0 kubenswrapper[18592]: I0308 04:17:39.014421 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a3f947a-748d-4cfd-a500-d759b58d22f4" containerName="mariadb-account-create-update" Mar 08 04:17:39.018096 master-0 kubenswrapper[18592]: I0308 04:17:39.016004 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:39.021463 master-0 kubenswrapper[18592]: I0308 04:17:39.021429 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-ff301-volume-lvm-iscsi-config-data" Mar 08 04:17:39.032700 master-0 kubenswrapper[18592]: I0308 04:17:39.032566 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-ff301-volume-lvm-iscsi-0"] Mar 08 04:17:39.200541 master-0 kubenswrapper[18592]: I0308 04:17:39.200462 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f6c6845e-2f27-4421-b2ee-5d5892a8f5c9-config-data-custom\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"f6c6845e-2f27-4421-b2ee-5d5892a8f5c9\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:39.200855 master-0 kubenswrapper[18592]: I0308 04:17:39.200543 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f6c6845e-2f27-4421-b2ee-5d5892a8f5c9-run\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"f6c6845e-2f27-4421-b2ee-5d5892a8f5c9\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:39.200855 master-0 kubenswrapper[18592]: I0308 04:17:39.200600 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdprg\" (UniqueName: \"kubernetes.io/projected/f6c6845e-2f27-4421-b2ee-5d5892a8f5c9-kube-api-access-wdprg\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"f6c6845e-2f27-4421-b2ee-5d5892a8f5c9\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:39.200855 master-0 kubenswrapper[18592]: I0308 04:17:39.200628 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f6c6845e-2f27-4421-b2ee-5d5892a8f5c9-config-data\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"f6c6845e-2f27-4421-b2ee-5d5892a8f5c9\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:39.200855 master-0 kubenswrapper[18592]: I0308 04:17:39.200677 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f6c6845e-2f27-4421-b2ee-5d5892a8f5c9-etc-nvme\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"f6c6845e-2f27-4421-b2ee-5d5892a8f5c9\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:39.200855 master-0 kubenswrapper[18592]: I0308 04:17:39.200841 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f6c6845e-2f27-4421-b2ee-5d5892a8f5c9-etc-iscsi\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"f6c6845e-2f27-4421-b2ee-5d5892a8f5c9\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:39.201052 master-0 kubenswrapper[18592]: I0308 04:17:39.200875 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f6c6845e-2f27-4421-b2ee-5d5892a8f5c9-lib-modules\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"f6c6845e-2f27-4421-b2ee-5d5892a8f5c9\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:39.201052 master-0 kubenswrapper[18592]: I0308 04:17:39.200898 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f6c6845e-2f27-4421-b2ee-5d5892a8f5c9-sys\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"f6c6845e-2f27-4421-b2ee-5d5892a8f5c9\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:39.201052 master-0 kubenswrapper[18592]: I0308 04:17:39.200931 18592 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/f6c6845e-2f27-4421-b2ee-5d5892a8f5c9-var-lib-cinder\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"f6c6845e-2f27-4421-b2ee-5d5892a8f5c9\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:39.201147 master-0 kubenswrapper[18592]: I0308 04:17:39.201052 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/f6c6845e-2f27-4421-b2ee-5d5892a8f5c9-var-locks-cinder\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"f6c6845e-2f27-4421-b2ee-5d5892a8f5c9\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:39.201147 master-0 kubenswrapper[18592]: I0308 04:17:39.201136 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6c6845e-2f27-4421-b2ee-5d5892a8f5c9-combined-ca-bundle\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"f6c6845e-2f27-4421-b2ee-5d5892a8f5c9\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:39.201259 master-0 kubenswrapper[18592]: I0308 04:17:39.201228 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6c6845e-2f27-4421-b2ee-5d5892a8f5c9-scripts\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"f6c6845e-2f27-4421-b2ee-5d5892a8f5c9\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:39.201302 master-0 kubenswrapper[18592]: I0308 04:17:39.201258 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f6c6845e-2f27-4421-b2ee-5d5892a8f5c9-var-locks-brick\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"f6c6845e-2f27-4421-b2ee-5d5892a8f5c9\") " 
pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:39.201388 master-0 kubenswrapper[18592]: I0308 04:17:39.201355 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f6c6845e-2f27-4421-b2ee-5d5892a8f5c9-dev\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"f6c6845e-2f27-4421-b2ee-5d5892a8f5c9\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:39.201497 master-0 kubenswrapper[18592]: I0308 04:17:39.201462 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f6c6845e-2f27-4421-b2ee-5d5892a8f5c9-etc-machine-id\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"f6c6845e-2f27-4421-b2ee-5d5892a8f5c9\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:39.304132 master-0 kubenswrapper[18592]: I0308 04:17:39.304079 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6c6845e-2f27-4421-b2ee-5d5892a8f5c9-combined-ca-bundle\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"f6c6845e-2f27-4421-b2ee-5d5892a8f5c9\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:39.304605 master-0 kubenswrapper[18592]: I0308 04:17:39.304541 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6c6845e-2f27-4421-b2ee-5d5892a8f5c9-scripts\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"f6c6845e-2f27-4421-b2ee-5d5892a8f5c9\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:39.304605 master-0 kubenswrapper[18592]: I0308 04:17:39.304600 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f6c6845e-2f27-4421-b2ee-5d5892a8f5c9-var-locks-brick\") pod 
\"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"f6c6845e-2f27-4421-b2ee-5d5892a8f5c9\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:39.305032 master-0 kubenswrapper[18592]: I0308 04:17:39.304995 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f6c6845e-2f27-4421-b2ee-5d5892a8f5c9-var-locks-brick\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"f6c6845e-2f27-4421-b2ee-5d5892a8f5c9\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:39.305108 master-0 kubenswrapper[18592]: I0308 04:17:39.305036 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f6c6845e-2f27-4421-b2ee-5d5892a8f5c9-dev\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"f6c6845e-2f27-4421-b2ee-5d5892a8f5c9\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:39.305158 master-0 kubenswrapper[18592]: I0308 04:17:39.305137 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f6c6845e-2f27-4421-b2ee-5d5892a8f5c9-etc-machine-id\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"f6c6845e-2f27-4421-b2ee-5d5892a8f5c9\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:39.305302 master-0 kubenswrapper[18592]: I0308 04:17:39.305238 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f6c6845e-2f27-4421-b2ee-5d5892a8f5c9-config-data-custom\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"f6c6845e-2f27-4421-b2ee-5d5892a8f5c9\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:39.305377 master-0 kubenswrapper[18592]: I0308 04:17:39.305354 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f6c6845e-2f27-4421-b2ee-5d5892a8f5c9-run\") pod 
\"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"f6c6845e-2f27-4421-b2ee-5d5892a8f5c9\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:39.305537 master-0 kubenswrapper[18592]: I0308 04:17:39.305467 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f6c6845e-2f27-4421-b2ee-5d5892a8f5c9-dev\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"f6c6845e-2f27-4421-b2ee-5d5892a8f5c9\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:39.305537 master-0 kubenswrapper[18592]: I0308 04:17:39.305526 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdprg\" (UniqueName: \"kubernetes.io/projected/f6c6845e-2f27-4421-b2ee-5d5892a8f5c9-kube-api-access-wdprg\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"f6c6845e-2f27-4421-b2ee-5d5892a8f5c9\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:39.305635 master-0 kubenswrapper[18592]: I0308 04:17:39.305583 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6c6845e-2f27-4421-b2ee-5d5892a8f5c9-config-data\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"f6c6845e-2f27-4421-b2ee-5d5892a8f5c9\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:39.305635 master-0 kubenswrapper[18592]: I0308 04:17:39.305611 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f6c6845e-2f27-4421-b2ee-5d5892a8f5c9-etc-nvme\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"f6c6845e-2f27-4421-b2ee-5d5892a8f5c9\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:39.305731 master-0 kubenswrapper[18592]: I0308 04:17:39.305693 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f6c6845e-2f27-4421-b2ee-5d5892a8f5c9-etc-iscsi\") pod 
\"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"f6c6845e-2f27-4421-b2ee-5d5892a8f5c9\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:39.305777 master-0 kubenswrapper[18592]: I0308 04:17:39.305728 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f6c6845e-2f27-4421-b2ee-5d5892a8f5c9-lib-modules\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"f6c6845e-2f27-4421-b2ee-5d5892a8f5c9\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:39.306243 master-0 kubenswrapper[18592]: I0308 04:17:39.306197 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f6c6845e-2f27-4421-b2ee-5d5892a8f5c9-etc-machine-id\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"f6c6845e-2f27-4421-b2ee-5d5892a8f5c9\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:39.306316 master-0 kubenswrapper[18592]: I0308 04:17:39.306263 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f6c6845e-2f27-4421-b2ee-5d5892a8f5c9-lib-modules\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"f6c6845e-2f27-4421-b2ee-5d5892a8f5c9\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:39.306316 master-0 kubenswrapper[18592]: I0308 04:17:39.306285 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f6c6845e-2f27-4421-b2ee-5d5892a8f5c9-sys\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"f6c6845e-2f27-4421-b2ee-5d5892a8f5c9\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:39.306411 master-0 kubenswrapper[18592]: I0308 04:17:39.306330 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/f6c6845e-2f27-4421-b2ee-5d5892a8f5c9-var-lib-cinder\") pod 
\"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"f6c6845e-2f27-4421-b2ee-5d5892a8f5c9\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:39.306411 master-0 kubenswrapper[18592]: I0308 04:17:39.306380 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f6c6845e-2f27-4421-b2ee-5d5892a8f5c9-sys\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"f6c6845e-2f27-4421-b2ee-5d5892a8f5c9\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:39.306581 master-0 kubenswrapper[18592]: I0308 04:17:39.306491 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/f6c6845e-2f27-4421-b2ee-5d5892a8f5c9-var-lib-cinder\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"f6c6845e-2f27-4421-b2ee-5d5892a8f5c9\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:39.306711 master-0 kubenswrapper[18592]: I0308 04:17:39.306685 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f6c6845e-2f27-4421-b2ee-5d5892a8f5c9-run\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"f6c6845e-2f27-4421-b2ee-5d5892a8f5c9\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:39.306786 master-0 kubenswrapper[18592]: I0308 04:17:39.306731 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f6c6845e-2f27-4421-b2ee-5d5892a8f5c9-etc-nvme\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"f6c6845e-2f27-4421-b2ee-5d5892a8f5c9\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:39.306786 master-0 kubenswrapper[18592]: I0308 04:17:39.306752 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f6c6845e-2f27-4421-b2ee-5d5892a8f5c9-etc-iscsi\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: 
\"f6c6845e-2f27-4421-b2ee-5d5892a8f5c9\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:39.307892 master-0 kubenswrapper[18592]: I0308 04:17:39.307853 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6c6845e-2f27-4421-b2ee-5d5892a8f5c9-scripts\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"f6c6845e-2f27-4421-b2ee-5d5892a8f5c9\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:39.307892 master-0 kubenswrapper[18592]: I0308 04:17:39.307880 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6c6845e-2f27-4421-b2ee-5d5892a8f5c9-combined-ca-bundle\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"f6c6845e-2f27-4421-b2ee-5d5892a8f5c9\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:39.308575 master-0 kubenswrapper[18592]: I0308 04:17:39.308523 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/f6c6845e-2f27-4421-b2ee-5d5892a8f5c9-var-locks-cinder\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"f6c6845e-2f27-4421-b2ee-5d5892a8f5c9\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:39.309646 master-0 kubenswrapper[18592]: I0308 04:17:39.309604 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/f6c6845e-2f27-4421-b2ee-5d5892a8f5c9-var-locks-cinder\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"f6c6845e-2f27-4421-b2ee-5d5892a8f5c9\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:39.313216 master-0 kubenswrapper[18592]: I0308 04:17:39.313184 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6c6845e-2f27-4421-b2ee-5d5892a8f5c9-config-data\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: 
\"f6c6845e-2f27-4421-b2ee-5d5892a8f5c9\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:39.313989 master-0 kubenswrapper[18592]: I0308 04:17:39.313953 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f6c6845e-2f27-4421-b2ee-5d5892a8f5c9-config-data-custom\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"f6c6845e-2f27-4421-b2ee-5d5892a8f5c9\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:39.327336 master-0 kubenswrapper[18592]: I0308 04:17:39.327258 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdprg\" (UniqueName: \"kubernetes.io/projected/f6c6845e-2f27-4421-b2ee-5d5892a8f5c9-kube-api-access-wdprg\") pod \"cinder-ff301-volume-lvm-iscsi-0\" (UID: \"f6c6845e-2f27-4421-b2ee-5d5892a8f5c9\") " pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:39.395607 master-0 kubenswrapper[18592]: I0308 04:17:39.395538 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:39.720983 master-0 kubenswrapper[18592]: I0308 04:17:39.720930 18592 generic.go:334] "Generic (PLEG): container finished" podID="a793c425-75a3-4c0d-95a0-43f6bdf96bf5" containerID="d5391456248391ef7db24714017806821c3d4bcb5663d38293aef340995bb5f7" exitCode=0 Mar 08 04:17:39.721438 master-0 kubenswrapper[18592]: I0308 04:17:39.721031 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-568f56df46-pmcb4" event={"ID":"a793c425-75a3-4c0d-95a0-43f6bdf96bf5","Type":"ContainerDied","Data":"d5391456248391ef7db24714017806821c3d4bcb5663d38293aef340995bb5f7"} Mar 08 04:17:39.727249 master-0 kubenswrapper[18592]: I0308 04:17:39.727179 18592 generic.go:334] "Generic (PLEG): container finished" podID="e39f8cef-9071-463a-af1a-4728e182c6c2" containerID="e669d99e8ea4d44ceba36dd6ba0ccf1239891c0a70567639de9fdef33aa02539" exitCode=0 Mar 08 04:17:39.727312 master-0 kubenswrapper[18592]: I0308 04:17:39.727267 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-5b8d48c5b6-nqv28" event={"ID":"e39f8cef-9071-463a-af1a-4728e182c6c2","Type":"ContainerDied","Data":"e669d99e8ea4d44ceba36dd6ba0ccf1239891c0a70567639de9fdef33aa02539"} Mar 08 04:17:39.796674 master-0 kubenswrapper[18592]: I0308 04:17:39.796623 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"abe912ba-4a33-4634-a3fb-b6fb09b38d8e","Type":"ContainerStarted","Data":"7412fef920c13b9ed12be8da1a35b12e4fe6ef0a0a5ae459634bcd60e1c0636f"} Mar 08 04:17:39.953021 master-0 kubenswrapper[18592]: W0308 04:17:39.952969 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf6c6845e_2f27_4421_b2ee_5d5892a8f5c9.slice/crio-52493b8856e29c019aa498db5a9fe2e3e71c8685ee788576da1169ecc27a0257 WatchSource:0}: Error finding container 
52493b8856e29c019aa498db5a9fe2e3e71c8685ee788576da1169ecc27a0257: Status 404 returned error can't find the container with id 52493b8856e29c019aa498db5a9fe2e3e71c8685ee788576da1169ecc27a0257 Mar 08 04:17:39.955338 master-0 kubenswrapper[18592]: I0308 04:17:39.955301 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-ff301-volume-lvm-iscsi-0"] Mar 08 04:17:40.177539 master-0 kubenswrapper[18592]: I0308 04:17:40.177325 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01" path="/var/lib/kubelet/pods/ec4f9339-16d7-4ffb-af5d-e4a7d4f69e01/volumes" Mar 08 04:17:40.870276 master-0 kubenswrapper[18592]: I0308 04:17:40.869835 18592 generic.go:334] "Generic (PLEG): container finished" podID="808f8fcb-c2b2-4df7-9c7a-9838aaf744af" containerID="3eb0f5404cad7df753b609a49341dcfedcd0f8ba7eba964052e8a5ff5d83c94d" exitCode=0 Mar 08 04:17:40.870276 master-0 kubenswrapper[18592]: I0308 04:17:40.869906 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ff301-scheduler-0" event={"ID":"808f8fcb-c2b2-4df7-9c7a-9838aaf744af","Type":"ContainerDied","Data":"3eb0f5404cad7df753b609a49341dcfedcd0f8ba7eba964052e8a5ff5d83c94d"} Mar 08 04:17:40.872614 master-0 kubenswrapper[18592]: I0308 04:17:40.872575 18592 generic.go:334] "Generic (PLEG): container finished" podID="a793c425-75a3-4c0d-95a0-43f6bdf96bf5" containerID="eede0fd52126bd1934ba16ebaa1eb739915adaebb008cabb1f0d1cdc912ef9c4" exitCode=1 Mar 08 04:17:40.872703 master-0 kubenswrapper[18592]: I0308 04:17:40.872617 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-568f56df46-pmcb4" event={"ID":"a793c425-75a3-4c0d-95a0-43f6bdf96bf5","Type":"ContainerDied","Data":"eede0fd52126bd1934ba16ebaa1eb739915adaebb008cabb1f0d1cdc912ef9c4"} Mar 08 04:17:40.872703 master-0 kubenswrapper[18592]: I0308 04:17:40.872634 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-568f56df46-pmcb4" 
event={"ID":"a793c425-75a3-4c0d-95a0-43f6bdf96bf5","Type":"ContainerStarted","Data":"4b3492860ca9dc480436b650c75257796e5192f624bc94b6ebbedddef4bda44c"} Mar 08 04:17:40.877206 master-0 kubenswrapper[18592]: I0308 04:17:40.873353 18592 scope.go:117] "RemoveContainer" containerID="eede0fd52126bd1934ba16ebaa1eb739915adaebb008cabb1f0d1cdc912ef9c4" Mar 08 04:17:40.877355 master-0 kubenswrapper[18592]: I0308 04:17:40.877234 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-5b8d48c5b6-nqv28" event={"ID":"e39f8cef-9071-463a-af1a-4728e182c6c2","Type":"ContainerStarted","Data":"5f9f119428deb2fab4390a38b755cca47169d84e4a614f13f65655e7d4266748"} Mar 08 04:17:40.877355 master-0 kubenswrapper[18592]: I0308 04:17:40.877269 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-5b8d48c5b6-nqv28" event={"ID":"e39f8cef-9071-463a-af1a-4728e182c6c2","Type":"ContainerStarted","Data":"2b9d5cbb33e0f3d0dfe6c0bebee2433f5eb3ae824fde316df9f91b0daabe009e"} Mar 08 04:17:40.877577 master-0 kubenswrapper[18592]: I0308 04:17:40.877549 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-5b8d48c5b6-nqv28" Mar 08 04:17:40.881901 master-0 kubenswrapper[18592]: I0308 04:17:40.881381 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ff301-volume-lvm-iscsi-0" event={"ID":"f6c6845e-2f27-4421-b2ee-5d5892a8f5c9","Type":"ContainerStarted","Data":"f75bc4d520b65073308a013a1c84d6fc235b4ff673bab168184b199c94c5164c"} Mar 08 04:17:40.881901 master-0 kubenswrapper[18592]: I0308 04:17:40.881458 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ff301-volume-lvm-iscsi-0" event={"ID":"f6c6845e-2f27-4421-b2ee-5d5892a8f5c9","Type":"ContainerStarted","Data":"1ec2cbf914b3125162449c17f47c35301f716c3072ea4d89d14429ed637cd9cb"} Mar 08 04:17:40.881901 master-0 kubenswrapper[18592]: I0308 04:17:40.881477 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-ff301-volume-lvm-iscsi-0" event={"ID":"f6c6845e-2f27-4421-b2ee-5d5892a8f5c9","Type":"ContainerStarted","Data":"52493b8856e29c019aa498db5a9fe2e3e71c8685ee788576da1169ecc27a0257"}
Mar 08 04:17:40.890697 master-0 kubenswrapper[18592]: I0308 04:17:40.884067 18592 generic.go:334] "Generic (PLEG): container finished" podID="0f0421ee-53ce-4160-abfc-a5968415005b" containerID="0fcdce70899a7ba893c9e7446288886f7e98ce6c222494e6a125caa9b136ecaf" exitCode=0
Mar 08 04:17:40.890697 master-0 kubenswrapper[18592]: I0308 04:17:40.884391 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ff301-backup-0" event={"ID":"0f0421ee-53ce-4160-abfc-a5968415005b","Type":"ContainerDied","Data":"0fcdce70899a7ba893c9e7446288886f7e98ce6c222494e6a125caa9b136ecaf"}
Mar 08 04:17:40.962927 master-0 kubenswrapper[18592]: I0308 04:17:40.961434 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-5b8d48c5b6-nqv28" podStartSLOduration=4.96140658 podStartE2EDuration="4.96140658s" podCreationTimestamp="2026-03-08 04:17:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 04:17:40.923780727 +0000 UTC m=+1473.022535077" watchObservedRunningTime="2026-03-08 04:17:40.96140658 +0000 UTC m=+1473.060160930"
Mar 08 04:17:40.991244 master-0 kubenswrapper[18592]: I0308 04:17:40.984293 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-ff301-volume-lvm-iscsi-0" podStartSLOduration=2.984273381 podStartE2EDuration="2.984273381s" podCreationTimestamp="2026-03-08 04:17:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 04:17:40.965997404 +0000 UTC m=+1473.064751754" watchObservedRunningTime="2026-03-08 04:17:40.984273381 +0000 UTC m=+1473.083027731"
Mar 08 04:17:41.486585 master-0 kubenswrapper[18592]: I0308 04:17:41.486543 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-ff301-scheduler-0"
Mar 08 04:17:41.494512 master-0 kubenswrapper[18592]: I0308 04:17:41.494383 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-ff301-backup-0"
Mar 08 04:17:41.680881 master-0 kubenswrapper[18592]: I0308 04:17:41.680803 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0f0421ee-53ce-4160-abfc-a5968415005b-sys\") pod \"0f0421ee-53ce-4160-abfc-a5968415005b\" (UID: \"0f0421ee-53ce-4160-abfc-a5968415005b\") "
Mar 08 04:17:41.681628 master-0 kubenswrapper[18592]: I0308 04:17:41.680897 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f0421ee-53ce-4160-abfc-a5968415005b-config-data\") pod \"0f0421ee-53ce-4160-abfc-a5968415005b\" (UID: \"0f0421ee-53ce-4160-abfc-a5968415005b\") "
Mar 08 04:17:41.681628 master-0 kubenswrapper[18592]: I0308 04:17:41.681607 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/0f0421ee-53ce-4160-abfc-a5968415005b-etc-nvme\") pod \"0f0421ee-53ce-4160-abfc-a5968415005b\" (UID: \"0f0421ee-53ce-4160-abfc-a5968415005b\") "
Mar 08 04:17:41.681727 master-0 kubenswrapper[18592]: I0308 04:17:41.681657 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/808f8fcb-c2b2-4df7-9c7a-9838aaf744af-etc-machine-id\") pod \"808f8fcb-c2b2-4df7-9c7a-9838aaf744af\" (UID: \"808f8fcb-c2b2-4df7-9c7a-9838aaf744af\") "
Mar 08 04:17:41.681727 master-0 kubenswrapper[18592]: I0308 04:17:41.681685 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/0f0421ee-53ce-4160-abfc-a5968415005b-var-lib-cinder\") pod \"0f0421ee-53ce-4160-abfc-a5968415005b\" (UID: \"0f0421ee-53ce-4160-abfc-a5968415005b\") "
Mar 08 04:17:41.681908 master-0 kubenswrapper[18592]: I0308 04:17:41.681868 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf822\" (UniqueName: \"kubernetes.io/projected/0f0421ee-53ce-4160-abfc-a5968415005b-kube-api-access-bf822\") pod \"0f0421ee-53ce-4160-abfc-a5968415005b\" (UID: \"0f0421ee-53ce-4160-abfc-a5968415005b\") "
Mar 08 04:17:41.682026 master-0 kubenswrapper[18592]: I0308 04:17:41.681997 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvb9v\" (UniqueName: \"kubernetes.io/projected/808f8fcb-c2b2-4df7-9c7a-9838aaf744af-kube-api-access-rvb9v\") pod \"808f8fcb-c2b2-4df7-9c7a-9838aaf744af\" (UID: \"808f8fcb-c2b2-4df7-9c7a-9838aaf744af\") "
Mar 08 04:17:41.682127 master-0 kubenswrapper[18592]: I0308 04:17:41.682074 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/0f0421ee-53ce-4160-abfc-a5968415005b-var-locks-brick\") pod \"0f0421ee-53ce-4160-abfc-a5968415005b\" (UID: \"0f0421ee-53ce-4160-abfc-a5968415005b\") "
Mar 08 04:17:41.682232 master-0 kubenswrapper[18592]: I0308 04:17:41.682182 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0f0421ee-53ce-4160-abfc-a5968415005b-etc-machine-id\") pod \"0f0421ee-53ce-4160-abfc-a5968415005b\" (UID: \"0f0421ee-53ce-4160-abfc-a5968415005b\") "
Mar 08 04:17:41.682308 master-0 kubenswrapper[18592]: I0308 04:17:41.682276 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/0f0421ee-53ce-4160-abfc-a5968415005b-var-locks-cinder\") pod \"0f0421ee-53ce-4160-abfc-a5968415005b\" (UID: \"0f0421ee-53ce-4160-abfc-a5968415005b\") "
Mar 08 04:17:41.682308 master-0 kubenswrapper[18592]: I0308 04:17:41.682300 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f0421ee-53ce-4160-abfc-a5968415005b-combined-ca-bundle\") pod \"0f0421ee-53ce-4160-abfc-a5968415005b\" (UID: \"0f0421ee-53ce-4160-abfc-a5968415005b\") "
Mar 08 04:17:41.682371 master-0 kubenswrapper[18592]: I0308 04:17:41.682325 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f0421ee-53ce-4160-abfc-a5968415005b-scripts\") pod \"0f0421ee-53ce-4160-abfc-a5968415005b\" (UID: \"0f0421ee-53ce-4160-abfc-a5968415005b\") "
Mar 08 04:17:41.682371 master-0 kubenswrapper[18592]: I0308 04:17:41.682345 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/808f8fcb-c2b2-4df7-9c7a-9838aaf744af-combined-ca-bundle\") pod \"808f8fcb-c2b2-4df7-9c7a-9838aaf744af\" (UID: \"808f8fcb-c2b2-4df7-9c7a-9838aaf744af\") "
Mar 08 04:17:41.682448 master-0 kubenswrapper[18592]: I0308 04:17:41.682388 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0f0421ee-53ce-4160-abfc-a5968415005b-lib-modules\") pod \"0f0421ee-53ce-4160-abfc-a5968415005b\" (UID: \"0f0421ee-53ce-4160-abfc-a5968415005b\") "
Mar 08 04:17:41.682448 master-0 kubenswrapper[18592]: I0308 04:17:41.682411 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/808f8fcb-c2b2-4df7-9c7a-9838aaf744af-config-data\") pod \"808f8fcb-c2b2-4df7-9c7a-9838aaf744af\" (UID: \"808f8fcb-c2b2-4df7-9c7a-9838aaf744af\") "
Mar 08 04:17:41.682448 master-0 kubenswrapper[18592]: I0308 04:17:41.682430 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/808f8fcb-c2b2-4df7-9c7a-9838aaf744af-scripts\") pod \"808f8fcb-c2b2-4df7-9c7a-9838aaf744af\" (UID: \"808f8fcb-c2b2-4df7-9c7a-9838aaf744af\") "
Mar 08 04:17:41.682534 master-0 kubenswrapper[18592]: I0308 04:17:41.682450 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0f0421ee-53ce-4160-abfc-a5968415005b-config-data-custom\") pod \"0f0421ee-53ce-4160-abfc-a5968415005b\" (UID: \"0f0421ee-53ce-4160-abfc-a5968415005b\") "
Mar 08 04:17:41.682534 master-0 kubenswrapper[18592]: I0308 04:17:41.682482 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/0f0421ee-53ce-4160-abfc-a5968415005b-etc-iscsi\") pod \"0f0421ee-53ce-4160-abfc-a5968415005b\" (UID: \"0f0421ee-53ce-4160-abfc-a5968415005b\") "
Mar 08 04:17:41.682534 master-0 kubenswrapper[18592]: I0308 04:17:41.682521 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0f0421ee-53ce-4160-abfc-a5968415005b-run\") pod \"0f0421ee-53ce-4160-abfc-a5968415005b\" (UID: \"0f0421ee-53ce-4160-abfc-a5968415005b\") "
Mar 08 04:17:41.682622 master-0 kubenswrapper[18592]: I0308 04:17:41.682600 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/0f0421ee-53ce-4160-abfc-a5968415005b-dev\") pod \"0f0421ee-53ce-4160-abfc-a5968415005b\" (UID: \"0f0421ee-53ce-4160-abfc-a5968415005b\") "
Mar 08 04:17:41.682622 master-0 kubenswrapper[18592]: I0308 04:17:41.682619 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/808f8fcb-c2b2-4df7-9c7a-9838aaf744af-config-data-custom\") pod \"808f8fcb-c2b2-4df7-9c7a-9838aaf744af\" (UID: \"808f8fcb-c2b2-4df7-9c7a-9838aaf744af\") "
Mar 08 04:17:41.686029 master-0 kubenswrapper[18592]: I0308 04:17:41.681146 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0f0421ee-53ce-4160-abfc-a5968415005b-sys" (OuterVolumeSpecName: "sys") pod "0f0421ee-53ce-4160-abfc-a5968415005b" (UID: "0f0421ee-53ce-4160-abfc-a5968415005b"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 04:17:41.686135 master-0 kubenswrapper[18592]: I0308 04:17:41.686074 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0f0421ee-53ce-4160-abfc-a5968415005b-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "0f0421ee-53ce-4160-abfc-a5968415005b" (UID: "0f0421ee-53ce-4160-abfc-a5968415005b"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 04:17:41.686184 master-0 kubenswrapper[18592]: I0308 04:17:41.686133 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/808f8fcb-c2b2-4df7-9c7a-9838aaf744af-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "808f8fcb-c2b2-4df7-9c7a-9838aaf744af" (UID: "808f8fcb-c2b2-4df7-9c7a-9838aaf744af"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 04:17:41.686184 master-0 kubenswrapper[18592]: I0308 04:17:41.686169 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0f0421ee-53ce-4160-abfc-a5968415005b-var-lib-cinder" (OuterVolumeSpecName: "var-lib-cinder") pod "0f0421ee-53ce-4160-abfc-a5968415005b" (UID: "0f0421ee-53ce-4160-abfc-a5968415005b"). InnerVolumeSpecName "var-lib-cinder". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 04:17:41.686789 master-0 kubenswrapper[18592]: I0308 04:17:41.686764 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0f0421ee-53ce-4160-abfc-a5968415005b-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "0f0421ee-53ce-4160-abfc-a5968415005b" (UID: "0f0421ee-53ce-4160-abfc-a5968415005b"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 04:17:41.688407 master-0 kubenswrapper[18592]: I0308 04:17:41.687582 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/808f8fcb-c2b2-4df7-9c7a-9838aaf744af-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "808f8fcb-c2b2-4df7-9c7a-9838aaf744af" (UID: "808f8fcb-c2b2-4df7-9c7a-9838aaf744af"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 04:17:41.688407 master-0 kubenswrapper[18592]: I0308 04:17:41.687665 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0f0421ee-53ce-4160-abfc-a5968415005b-run" (OuterVolumeSpecName: "run") pod "0f0421ee-53ce-4160-abfc-a5968415005b" (UID: "0f0421ee-53ce-4160-abfc-a5968415005b"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 04:17:41.688407 master-0 kubenswrapper[18592]: I0308 04:17:41.687687 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0f0421ee-53ce-4160-abfc-a5968415005b-dev" (OuterVolumeSpecName: "dev") pod "0f0421ee-53ce-4160-abfc-a5968415005b" (UID: "0f0421ee-53ce-4160-abfc-a5968415005b"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 04:17:41.688407 master-0 kubenswrapper[18592]: I0308 04:17:41.688078 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0f0421ee-53ce-4160-abfc-a5968415005b-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "0f0421ee-53ce-4160-abfc-a5968415005b" (UID: "0f0421ee-53ce-4160-abfc-a5968415005b"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 04:17:41.688983 master-0 kubenswrapper[18592]: I0308 04:17:41.688933 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0f0421ee-53ce-4160-abfc-a5968415005b-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "0f0421ee-53ce-4160-abfc-a5968415005b" (UID: "0f0421ee-53ce-4160-abfc-a5968415005b"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 04:17:41.688983 master-0 kubenswrapper[18592]: I0308 04:17:41.688975 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0f0421ee-53ce-4160-abfc-a5968415005b-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "0f0421ee-53ce-4160-abfc-a5968415005b" (UID: "0f0421ee-53ce-4160-abfc-a5968415005b"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 04:17:41.689068 master-0 kubenswrapper[18592]: I0308 04:17:41.688999 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0f0421ee-53ce-4160-abfc-a5968415005b-var-locks-cinder" (OuterVolumeSpecName: "var-locks-cinder") pod "0f0421ee-53ce-4160-abfc-a5968415005b" (UID: "0f0421ee-53ce-4160-abfc-a5968415005b"). InnerVolumeSpecName "var-locks-cinder". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 04:17:41.701390 master-0 kubenswrapper[18592]: I0308 04:17:41.701343 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f0421ee-53ce-4160-abfc-a5968415005b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "0f0421ee-53ce-4160-abfc-a5968415005b" (UID: "0f0421ee-53ce-4160-abfc-a5968415005b"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 04:17:41.710145 master-0 kubenswrapper[18592]: I0308 04:17:41.710062 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/808f8fcb-c2b2-4df7-9c7a-9838aaf744af-kube-api-access-rvb9v" (OuterVolumeSpecName: "kube-api-access-rvb9v") pod "808f8fcb-c2b2-4df7-9c7a-9838aaf744af" (UID: "808f8fcb-c2b2-4df7-9c7a-9838aaf744af"). InnerVolumeSpecName "kube-api-access-rvb9v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 04:17:41.711052 master-0 kubenswrapper[18592]: I0308 04:17:41.710994 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/808f8fcb-c2b2-4df7-9c7a-9838aaf744af-scripts" (OuterVolumeSpecName: "scripts") pod "808f8fcb-c2b2-4df7-9c7a-9838aaf744af" (UID: "808f8fcb-c2b2-4df7-9c7a-9838aaf744af"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 04:17:41.711052 master-0 kubenswrapper[18592]: I0308 04:17:41.711021 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f0421ee-53ce-4160-abfc-a5968415005b-scripts" (OuterVolumeSpecName: "scripts") pod "0f0421ee-53ce-4160-abfc-a5968415005b" (UID: "0f0421ee-53ce-4160-abfc-a5968415005b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 04:17:41.714058 master-0 kubenswrapper[18592]: I0308 04:17:41.714014 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f0421ee-53ce-4160-abfc-a5968415005b-kube-api-access-bf822" (OuterVolumeSpecName: "kube-api-access-bf822") pod "0f0421ee-53ce-4160-abfc-a5968415005b" (UID: "0f0421ee-53ce-4160-abfc-a5968415005b"). InnerVolumeSpecName "kube-api-access-bf822". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 04:17:41.789726 master-0 kubenswrapper[18592]: I0308 04:17:41.784877 18592 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/0f0421ee-53ce-4160-abfc-a5968415005b-etc-nvme\") on node \"master-0\" DevicePath \"\""
Mar 08 04:17:41.789726 master-0 kubenswrapper[18592]: I0308 04:17:41.784914 18592 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/808f8fcb-c2b2-4df7-9c7a-9838aaf744af-etc-machine-id\") on node \"master-0\" DevicePath \"\""
Mar 08 04:17:41.789726 master-0 kubenswrapper[18592]: I0308 04:17:41.784923 18592 reconciler_common.go:293] "Volume detached for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/0f0421ee-53ce-4160-abfc-a5968415005b-var-lib-cinder\") on node \"master-0\" DevicePath \"\""
Mar 08 04:17:41.789726 master-0 kubenswrapper[18592]: I0308 04:17:41.784931 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf822\" (UniqueName: \"kubernetes.io/projected/0f0421ee-53ce-4160-abfc-a5968415005b-kube-api-access-bf822\") on node \"master-0\" DevicePath \"\""
Mar 08 04:17:41.789726 master-0 kubenswrapper[18592]: I0308 04:17:41.784941 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvb9v\" (UniqueName: \"kubernetes.io/projected/808f8fcb-c2b2-4df7-9c7a-9838aaf744af-kube-api-access-rvb9v\") on node \"master-0\" DevicePath \"\""
Mar 08 04:17:41.789726 master-0 kubenswrapper[18592]: I0308 04:17:41.784951 18592 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/0f0421ee-53ce-4160-abfc-a5968415005b-var-locks-brick\") on node \"master-0\" DevicePath \"\""
Mar 08 04:17:41.789726 master-0 kubenswrapper[18592]: I0308 04:17:41.784961 18592 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0f0421ee-53ce-4160-abfc-a5968415005b-etc-machine-id\") on node \"master-0\" DevicePath \"\""
Mar 08 04:17:41.789726 master-0 kubenswrapper[18592]: I0308 04:17:41.784970 18592 reconciler_common.go:293] "Volume detached for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/0f0421ee-53ce-4160-abfc-a5968415005b-var-locks-cinder\") on node \"master-0\" DevicePath \"\""
Mar 08 04:17:41.789726 master-0 kubenswrapper[18592]: I0308 04:17:41.784978 18592 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f0421ee-53ce-4160-abfc-a5968415005b-scripts\") on node \"master-0\" DevicePath \"\""
Mar 08 04:17:41.789726 master-0 kubenswrapper[18592]: I0308 04:17:41.784986 18592 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0f0421ee-53ce-4160-abfc-a5968415005b-lib-modules\") on node \"master-0\" DevicePath \"\""
Mar 08 04:17:41.789726 master-0 kubenswrapper[18592]: I0308 04:17:41.784993 18592 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/808f8fcb-c2b2-4df7-9c7a-9838aaf744af-scripts\") on node \"master-0\" DevicePath \"\""
Mar 08 04:17:41.789726 master-0 kubenswrapper[18592]: I0308 04:17:41.785001 18592 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0f0421ee-53ce-4160-abfc-a5968415005b-config-data-custom\") on node \"master-0\" DevicePath \"\""
Mar 08 04:17:41.789726 master-0 kubenswrapper[18592]: I0308 04:17:41.785009 18592 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/0f0421ee-53ce-4160-abfc-a5968415005b-etc-iscsi\") on node \"master-0\" DevicePath \"\""
Mar 08 04:17:41.789726 master-0 kubenswrapper[18592]: I0308 04:17:41.785017 18592 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0f0421ee-53ce-4160-abfc-a5968415005b-run\") on node \"master-0\" DevicePath \"\""
Mar 08 04:17:41.789726 master-0 kubenswrapper[18592]: I0308 04:17:41.785025 18592 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/0f0421ee-53ce-4160-abfc-a5968415005b-dev\") on node \"master-0\" DevicePath \"\""
Mar 08 04:17:41.789726 master-0 kubenswrapper[18592]: I0308 04:17:41.785034 18592 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/808f8fcb-c2b2-4df7-9c7a-9838aaf744af-config-data-custom\") on node \"master-0\" DevicePath \"\""
Mar 08 04:17:41.789726 master-0 kubenswrapper[18592]: I0308 04:17:41.785042 18592 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0f0421ee-53ce-4160-abfc-a5968415005b-sys\") on node \"master-0\" DevicePath \"\""
Mar 08 04:17:41.789726 master-0 kubenswrapper[18592]: I0308 04:17:41.786013 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f0421ee-53ce-4160-abfc-a5968415005b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0f0421ee-53ce-4160-abfc-a5968415005b" (UID: "0f0421ee-53ce-4160-abfc-a5968415005b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 04:17:41.801174 master-0 kubenswrapper[18592]: I0308 04:17:41.801076 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/808f8fcb-c2b2-4df7-9c7a-9838aaf744af-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "808f8fcb-c2b2-4df7-9c7a-9838aaf744af" (UID: "808f8fcb-c2b2-4df7-9c7a-9838aaf744af"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 04:17:41.887889 master-0 kubenswrapper[18592]: I0308 04:17:41.887800 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f0421ee-53ce-4160-abfc-a5968415005b-config-data" (OuterVolumeSpecName: "config-data") pod "0f0421ee-53ce-4160-abfc-a5968415005b" (UID: "0f0421ee-53ce-4160-abfc-a5968415005b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 04:17:41.887889 master-0 kubenswrapper[18592]: I0308 04:17:41.887844 18592 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f0421ee-53ce-4160-abfc-a5968415005b-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 08 04:17:41.888360 master-0 kubenswrapper[18592]: I0308 04:17:41.887912 18592 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/808f8fcb-c2b2-4df7-9c7a-9838aaf744af-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 08 04:17:41.900038 master-0 kubenswrapper[18592]: I0308 04:17:41.899979 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/808f8fcb-c2b2-4df7-9c7a-9838aaf744af-config-data" (OuterVolumeSpecName: "config-data") pod "808f8fcb-c2b2-4df7-9c7a-9838aaf744af" (UID: "808f8fcb-c2b2-4df7-9c7a-9838aaf744af"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 04:17:41.916398 master-0 kubenswrapper[18592]: I0308 04:17:41.916287 18592 generic.go:334] "Generic (PLEG): container finished" podID="a793c425-75a3-4c0d-95a0-43f6bdf96bf5" containerID="462e81e5ce25b176c9b0817827c6ba8697290e0fcb4c8129909f17d350faa51b" exitCode=1
Mar 08 04:17:41.916398 master-0 kubenswrapper[18592]: I0308 04:17:41.916354 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-568f56df46-pmcb4" event={"ID":"a793c425-75a3-4c0d-95a0-43f6bdf96bf5","Type":"ContainerDied","Data":"462e81e5ce25b176c9b0817827c6ba8697290e0fcb4c8129909f17d350faa51b"}
Mar 08 04:17:41.916398 master-0 kubenswrapper[18592]: I0308 04:17:41.916385 18592 scope.go:117] "RemoveContainer" containerID="eede0fd52126bd1934ba16ebaa1eb739915adaebb008cabb1f0d1cdc912ef9c4"
Mar 08 04:17:41.920197 master-0 kubenswrapper[18592]: I0308 04:17:41.919740 18592 scope.go:117] "RemoveContainer" containerID="462e81e5ce25b176c9b0817827c6ba8697290e0fcb4c8129909f17d350faa51b"
Mar 08 04:17:41.920197 master-0 kubenswrapper[18592]: E0308 04:17:41.920041 18592 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ironic-api pod=ironic-568f56df46-pmcb4_openstack(a793c425-75a3-4c0d-95a0-43f6bdf96bf5)\"" pod="openstack/ironic-568f56df46-pmcb4" podUID="a793c425-75a3-4c0d-95a0-43f6bdf96bf5"
Mar 08 04:17:41.972060 master-0 kubenswrapper[18592]: I0308 04:17:41.963236 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ff301-backup-0" event={"ID":"0f0421ee-53ce-4160-abfc-a5968415005b","Type":"ContainerDied","Data":"740342c0d64c7138e970900d47e4d32f154103cedb519501c4826b27b35dc15e"}
Mar 08 04:17:41.972060 master-0 kubenswrapper[18592]: I0308 04:17:41.963349 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-ff301-backup-0"
Mar 08 04:17:41.972060 master-0 kubenswrapper[18592]: I0308 04:17:41.971808 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ff301-scheduler-0" event={"ID":"808f8fcb-c2b2-4df7-9c7a-9838aaf744af","Type":"ContainerDied","Data":"18f7e2063e3936084eaca1b17f02965ef79317b24466e0686f4498c2cc7dbacc"}
Mar 08 04:17:41.974578 master-0 kubenswrapper[18592]: I0308 04:17:41.974145 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-ff301-scheduler-0"
Mar 08 04:17:42.000926 master-0 kubenswrapper[18592]: I0308 04:17:41.991602 18592 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f0421ee-53ce-4160-abfc-a5968415005b-config-data\") on node \"master-0\" DevicePath \"\""
Mar 08 04:17:42.000926 master-0 kubenswrapper[18592]: I0308 04:17:41.991656 18592 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/808f8fcb-c2b2-4df7-9c7a-9838aaf744af-config-data\") on node \"master-0\" DevicePath \"\""
Mar 08 04:17:42.024985 master-0 kubenswrapper[18592]: I0308 04:17:42.024785 18592 scope.go:117] "RemoveContainer" containerID="12199ca2ce66a98f608e1da86a561392f75037c0b84ea80ae27c5022a0692d96"
Mar 08 04:17:42.104424 master-0 kubenswrapper[18592]: I0308 04:17:42.103772 18592 scope.go:117] "RemoveContainer" containerID="0fcdce70899a7ba893c9e7446288886f7e98ce6c222494e6a125caa9b136ecaf"
Mar 08 04:17:42.126773 master-0 kubenswrapper[18592]: I0308 04:17:42.126636 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-ff301-scheduler-0"]
Mar 08 04:17:42.163599 master-0 kubenswrapper[18592]: I0308 04:17:42.162176 18592 scope.go:117] "RemoveContainer" containerID="18b944393cbbd8944c868595b59690bf821e5a692d25a4b47311ab6a5400e2d5"
Mar 08 04:17:42.192569 master-0 kubenswrapper[18592]: I0308 04:17:42.187271 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-ff301-scheduler-0"]
Mar 08 04:17:42.197007 master-0 kubenswrapper[18592]: I0308 04:17:42.196583 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-ff301-backup-0"]
Mar 08 04:17:42.224773 master-0 kubenswrapper[18592]: I0308 04:17:42.223728 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-ff301-backup-0"]
Mar 08 04:17:42.239960 master-0 kubenswrapper[18592]: I0308 04:17:42.239891 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-ff301-scheduler-0"]
Mar 08 04:17:42.240454 master-0 kubenswrapper[18592]: E0308 04:17:42.240426 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="808f8fcb-c2b2-4df7-9c7a-9838aaf744af" containerName="cinder-scheduler"
Mar 08 04:17:42.240454 master-0 kubenswrapper[18592]: I0308 04:17:42.240444 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="808f8fcb-c2b2-4df7-9c7a-9838aaf744af" containerName="cinder-scheduler"
Mar 08 04:17:42.240581 master-0 kubenswrapper[18592]: E0308 04:17:42.240459 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f0421ee-53ce-4160-abfc-a5968415005b" containerName="probe"
Mar 08 04:17:42.240581 master-0 kubenswrapper[18592]: I0308 04:17:42.240466 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f0421ee-53ce-4160-abfc-a5968415005b" containerName="probe"
Mar 08 04:17:42.240581 master-0 kubenswrapper[18592]: E0308 04:17:42.240496 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f0421ee-53ce-4160-abfc-a5968415005b" containerName="cinder-backup"
Mar 08 04:17:42.240581 master-0 kubenswrapper[18592]: I0308 04:17:42.240502 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f0421ee-53ce-4160-abfc-a5968415005b" containerName="cinder-backup"
Mar 08 04:17:42.240581 master-0 kubenswrapper[18592]: E0308 04:17:42.240524 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="808f8fcb-c2b2-4df7-9c7a-9838aaf744af" containerName="probe"
Mar 08 04:17:42.240581 master-0 kubenswrapper[18592]: I0308 04:17:42.240532 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="808f8fcb-c2b2-4df7-9c7a-9838aaf744af" containerName="probe"
Mar 08 04:17:42.240788 master-0 kubenswrapper[18592]: I0308 04:17:42.240771 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f0421ee-53ce-4160-abfc-a5968415005b" containerName="probe"
Mar 08 04:17:42.240788 master-0 kubenswrapper[18592]: I0308 04:17:42.240788 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="808f8fcb-c2b2-4df7-9c7a-9838aaf744af" containerName="cinder-scheduler"
Mar 08 04:17:42.241057 master-0 kubenswrapper[18592]: I0308 04:17:42.240832 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f0421ee-53ce-4160-abfc-a5968415005b" containerName="cinder-backup"
Mar 08 04:17:42.241057 master-0 kubenswrapper[18592]: I0308 04:17:42.240841 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="808f8fcb-c2b2-4df7-9c7a-9838aaf744af" containerName="probe"
Mar 08 04:17:42.242241 master-0 kubenswrapper[18592]: I0308 04:17:42.241908 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-ff301-scheduler-0"
Mar 08 04:17:42.251658 master-0 kubenswrapper[18592]: I0308 04:17:42.251476 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-ff301-scheduler-config-data"
Mar 08 04:17:42.251658 master-0 kubenswrapper[18592]: I0308 04:17:42.251522 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-ff301-scheduler-0"]
Mar 08 04:17:42.274430 master-0 kubenswrapper[18592]: I0308 04:17:42.274376 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-ff301-backup-0"]
Mar 08 04:17:42.276323 master-0 kubenswrapper[18592]: I0308 04:17:42.276142 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-ff301-backup-0"
Mar 08 04:17:42.278213 master-0 kubenswrapper[18592]: I0308 04:17:42.278182 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-ff301-backup-config-data"
Mar 08 04:17:42.299230 master-0 kubenswrapper[18592]: I0308 04:17:42.299179 18592 scope.go:117] "RemoveContainer" containerID="3eb0f5404cad7df753b609a49341dcfedcd0f8ba7eba964052e8a5ff5d83c94d"
Mar 08 04:17:42.299325 master-0 kubenswrapper[18592]: I0308 04:17:42.299314 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-ff301-backup-0"]
Mar 08 04:17:42.408988 master-0 kubenswrapper[18592]: I0308 04:17:42.408808 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/6ad32be0-4591-419d-9d36-83abd556234b-var-locks-brick\") pod \"cinder-ff301-backup-0\" (UID: \"6ad32be0-4591-419d-9d36-83abd556234b\") " pod="openstack/cinder-ff301-backup-0"
Mar 08 04:17:42.408988 master-0 kubenswrapper[18592]: I0308 04:17:42.408906 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6ad32be0-4591-419d-9d36-83abd556234b-run\") pod \"cinder-ff301-backup-0\" (UID: \"6ad32be0-4591-419d-9d36-83abd556234b\") " pod="openstack/cinder-ff301-backup-0"
Mar 08 04:17:42.408988 master-0 kubenswrapper[18592]: I0308 04:17:42.408935 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/08bdf9f0-9d7a-436b-9f65-a313c8d71f69-etc-machine-id\") pod \"cinder-ff301-scheduler-0\" (UID: \"08bdf9f0-9d7a-436b-9f65-a313c8d71f69\") " pod="openstack/cinder-ff301-scheduler-0"
Mar 08 04:17:42.408988 master-0 kubenswrapper[18592]: I0308 04:17:42.408966 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6ad32be0-4591-419d-9d36-83abd556234b-etc-machine-id\") pod \"cinder-ff301-backup-0\" (UID: \"6ad32be0-4591-419d-9d36-83abd556234b\") " pod="openstack/cinder-ff301-backup-0"
Mar 08 04:17:42.408988 master-0 kubenswrapper[18592]: I0308 04:17:42.408983 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ad32be0-4591-419d-9d36-83abd556234b-combined-ca-bundle\") pod \"cinder-ff301-backup-0\" (UID: \"6ad32be0-4591-419d-9d36-83abd556234b\") " pod="openstack/cinder-ff301-backup-0"
Mar 08 04:17:42.408988 master-0 kubenswrapper[18592]: I0308 04:17:42.408999 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smnfz\" (UniqueName: \"kubernetes.io/projected/08bdf9f0-9d7a-436b-9f65-a313c8d71f69-kube-api-access-smnfz\") pod \"cinder-ff301-scheduler-0\" (UID: \"08bdf9f0-9d7a-436b-9f65-a313c8d71f69\") " pod="openstack/cinder-ff301-scheduler-0"
Mar 08 04:17:42.409387 master-0 kubenswrapper[18592]: I0308 04:17:42.409045 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ad32be0-4591-419d-9d36-83abd556234b-scripts\") pod \"cinder-ff301-backup-0\" (UID: \"6ad32be0-4591-419d-9d36-83abd556234b\") " pod="openstack/cinder-ff301-backup-0"
Mar 08 04:17:42.409387 master-0 kubenswrapper[18592]: I0308 04:17:42.409079 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vltmk\" (UniqueName: \"kubernetes.io/projected/6ad32be0-4591-419d-9d36-83abd556234b-kube-api-access-vltmk\") pod \"cinder-ff301-backup-0\" (UID: \"6ad32be0-4591-419d-9d36-83abd556234b\") " pod="openstack/cinder-ff301-backup-0"
Mar 08 04:17:42.409387 master-0 kubenswrapper[18592]: I0308 04:17:42.409096 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6ad32be0-4591-419d-9d36-83abd556234b-lib-modules\") pod \"cinder-ff301-backup-0\" (UID: \"6ad32be0-4591-419d-9d36-83abd556234b\") " pod="openstack/cinder-ff301-backup-0"
Mar 08 04:17:42.409387 master-0 kubenswrapper[18592]: I0308 04:17:42.409154 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/6ad32be0-4591-419d-9d36-83abd556234b-var-lib-cinder\") pod \"cinder-ff301-backup-0\" (UID: \"6ad32be0-4591-419d-9d36-83abd556234b\") " pod="openstack/cinder-ff301-backup-0"
Mar 08 04:17:42.409387 master-0 kubenswrapper[18592]: I0308 04:17:42.409267 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/6ad32be0-4591-419d-9d36-83abd556234b-etc-nvme\") pod \"cinder-ff301-backup-0\" (UID: \"6ad32be0-4591-419d-9d36-83abd556234b\") " pod="openstack/cinder-ff301-backup-0"
Mar 08 04:17:42.409387 master-0 kubenswrapper[18592]: I0308 04:17:42.409296 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/6ad32be0-4591-419d-9d36-83abd556234b-etc-iscsi\") pod \"cinder-ff301-backup-0\" (UID: \"6ad32be0-4591-419d-9d36-83abd556234b\") " pod="openstack/cinder-ff301-backup-0"
Mar 08 04:17:42.409387 master-0 kubenswrapper[18592]: I0308 04:17:42.409363 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ad32be0-4591-419d-9d36-83abd556234b-config-data\") pod \"cinder-ff301-backup-0\" (UID: \"6ad32be0-4591-419d-9d36-83abd556234b\") " pod="openstack/cinder-ff301-backup-0"
Mar 08 04:17:42.409580 master-0 kubenswrapper[18592]: I0308 04:17:42.409393 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/6ad32be0-4591-419d-9d36-83abd556234b-var-locks-cinder\") pod \"cinder-ff301-backup-0\" (UID: \"6ad32be0-4591-419d-9d36-83abd556234b\") " pod="openstack/cinder-ff301-backup-0"
Mar 08 04:17:42.409580 master-0 kubenswrapper[18592]: I0308 04:17:42.409422 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/6ad32be0-4591-419d-9d36-83abd556234b-dev\") pod \"cinder-ff301-backup-0\" (UID: \"6ad32be0-4591-419d-9d36-83abd556234b\") " pod="openstack/cinder-ff301-backup-0"
Mar 08 04:17:42.409580 master-0 kubenswrapper[18592]: I0308 04:17:42.409438 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08bdf9f0-9d7a-436b-9f65-a313c8d71f69-combined-ca-bundle\") pod \"cinder-ff301-scheduler-0\" (UID: \"08bdf9f0-9d7a-436b-9f65-a313c8d71f69\") " pod="openstack/cinder-ff301-scheduler-0"
Mar 08 04:17:42.409580 master-0 kubenswrapper[18592]: I0308 04:17:42.409477 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6ad32be0-4591-419d-9d36-83abd556234b-sys\") pod \"cinder-ff301-backup-0\" (UID: \"6ad32be0-4591-419d-9d36-83abd556234b\") " pod="openstack/cinder-ff301-backup-0"
Mar 08 04:17:42.409580 master-0 kubenswrapper[18592]: I0308 04:17:42.409525 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08bdf9f0-9d7a-436b-9f65-a313c8d71f69-config-data-custom\") pod \"cinder-ff301-scheduler-0\" (UID: \"08bdf9f0-9d7a-436b-9f65-a313c8d71f69\") " pod="openstack/cinder-ff301-scheduler-0"
Mar 08 04:17:42.409580 master-0 kubenswrapper[18592]: I0308
04:17:42.409568 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08bdf9f0-9d7a-436b-9f65-a313c8d71f69-scripts\") pod \"cinder-ff301-scheduler-0\" (UID: \"08bdf9f0-9d7a-436b-9f65-a313c8d71f69\") " pod="openstack/cinder-ff301-scheduler-0" Mar 08 04:17:42.409846 master-0 kubenswrapper[18592]: I0308 04:17:42.409591 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6ad32be0-4591-419d-9d36-83abd556234b-config-data-custom\") pod \"cinder-ff301-backup-0\" (UID: \"6ad32be0-4591-419d-9d36-83abd556234b\") " pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:42.409846 master-0 kubenswrapper[18592]: I0308 04:17:42.409612 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08bdf9f0-9d7a-436b-9f65-a313c8d71f69-config-data\") pod \"cinder-ff301-scheduler-0\" (UID: \"08bdf9f0-9d7a-436b-9f65-a313c8d71f69\") " pod="openstack/cinder-ff301-scheduler-0" Mar 08 04:17:42.512089 master-0 kubenswrapper[18592]: I0308 04:17:42.511949 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/6ad32be0-4591-419d-9d36-83abd556234b-etc-nvme\") pod \"cinder-ff301-backup-0\" (UID: \"6ad32be0-4591-419d-9d36-83abd556234b\") " pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:42.512402 master-0 kubenswrapper[18592]: I0308 04:17:42.512378 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/6ad32be0-4591-419d-9d36-83abd556234b-etc-iscsi\") pod \"cinder-ff301-backup-0\" (UID: \"6ad32be0-4591-419d-9d36-83abd556234b\") " pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:42.519052 master-0 kubenswrapper[18592]: I0308 04:17:42.518966 18592 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ad32be0-4591-419d-9d36-83abd556234b-config-data\") pod \"cinder-ff301-backup-0\" (UID: \"6ad32be0-4591-419d-9d36-83abd556234b\") " pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:42.519349 master-0 kubenswrapper[18592]: I0308 04:17:42.519327 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/6ad32be0-4591-419d-9d36-83abd556234b-var-locks-cinder\") pod \"cinder-ff301-backup-0\" (UID: \"6ad32be0-4591-419d-9d36-83abd556234b\") " pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:42.519515 master-0 kubenswrapper[18592]: I0308 04:17:42.519495 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/6ad32be0-4591-419d-9d36-83abd556234b-dev\") pod \"cinder-ff301-backup-0\" (UID: \"6ad32be0-4591-419d-9d36-83abd556234b\") " pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:42.519620 master-0 kubenswrapper[18592]: I0308 04:17:42.519579 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-57f9f57fc6-pfwg4" Mar 08 04:17:42.519720 master-0 kubenswrapper[18592]: I0308 04:17:42.519699 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08bdf9f0-9d7a-436b-9f65-a313c8d71f69-combined-ca-bundle\") pod \"cinder-ff301-scheduler-0\" (UID: \"08bdf9f0-9d7a-436b-9f65-a313c8d71f69\") " pod="openstack/cinder-ff301-scheduler-0" Mar 08 04:17:42.519948 master-0 kubenswrapper[18592]: I0308 04:17:42.519927 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6ad32be0-4591-419d-9d36-83abd556234b-sys\") pod \"cinder-ff301-backup-0\" (UID: \"6ad32be0-4591-419d-9d36-83abd556234b\") " 
pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:42.520087 master-0 kubenswrapper[18592]: I0308 04:17:42.520067 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08bdf9f0-9d7a-436b-9f65-a313c8d71f69-config-data-custom\") pod \"cinder-ff301-scheduler-0\" (UID: \"08bdf9f0-9d7a-436b-9f65-a313c8d71f69\") " pod="openstack/cinder-ff301-scheduler-0" Mar 08 04:17:42.520228 master-0 kubenswrapper[18592]: I0308 04:17:42.520207 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08bdf9f0-9d7a-436b-9f65-a313c8d71f69-scripts\") pod \"cinder-ff301-scheduler-0\" (UID: \"08bdf9f0-9d7a-436b-9f65-a313c8d71f69\") " pod="openstack/cinder-ff301-scheduler-0" Mar 08 04:17:42.520347 master-0 kubenswrapper[18592]: I0308 04:17:42.520326 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6ad32be0-4591-419d-9d36-83abd556234b-config-data-custom\") pod \"cinder-ff301-backup-0\" (UID: \"6ad32be0-4591-419d-9d36-83abd556234b\") " pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:42.520468 master-0 kubenswrapper[18592]: I0308 04:17:42.520449 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08bdf9f0-9d7a-436b-9f65-a313c8d71f69-config-data\") pod \"cinder-ff301-scheduler-0\" (UID: \"08bdf9f0-9d7a-436b-9f65-a313c8d71f69\") " pod="openstack/cinder-ff301-scheduler-0" Mar 08 04:17:42.520750 master-0 kubenswrapper[18592]: I0308 04:17:42.520727 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/6ad32be0-4591-419d-9d36-83abd556234b-var-locks-brick\") pod \"cinder-ff301-backup-0\" (UID: \"6ad32be0-4591-419d-9d36-83abd556234b\") " pod="openstack/cinder-ff301-backup-0" Mar 08 
04:17:42.520942 master-0 kubenswrapper[18592]: I0308 04:17:42.520921 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6ad32be0-4591-419d-9d36-83abd556234b-run\") pod \"cinder-ff301-backup-0\" (UID: \"6ad32be0-4591-419d-9d36-83abd556234b\") " pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:42.521082 master-0 kubenswrapper[18592]: I0308 04:17:42.521061 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/08bdf9f0-9d7a-436b-9f65-a313c8d71f69-etc-machine-id\") pod \"cinder-ff301-scheduler-0\" (UID: \"08bdf9f0-9d7a-436b-9f65-a313c8d71f69\") " pod="openstack/cinder-ff301-scheduler-0" Mar 08 04:17:42.521210 master-0 kubenswrapper[18592]: I0308 04:17:42.521182 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6ad32be0-4591-419d-9d36-83abd556234b-sys\") pod \"cinder-ff301-backup-0\" (UID: \"6ad32be0-4591-419d-9d36-83abd556234b\") " pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:42.521279 master-0 kubenswrapper[18592]: I0308 04:17:42.514737 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/6ad32be0-4591-419d-9d36-83abd556234b-etc-nvme\") pod \"cinder-ff301-backup-0\" (UID: \"6ad32be0-4591-419d-9d36-83abd556234b\") " pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:42.521375 master-0 kubenswrapper[18592]: I0308 04:17:42.521355 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6ad32be0-4591-419d-9d36-83abd556234b-etc-machine-id\") pod \"cinder-ff301-backup-0\" (UID: \"6ad32be0-4591-419d-9d36-83abd556234b\") " pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:42.521486 master-0 kubenswrapper[18592]: I0308 04:17:42.521468 18592 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ad32be0-4591-419d-9d36-83abd556234b-combined-ca-bundle\") pod \"cinder-ff301-backup-0\" (UID: \"6ad32be0-4591-419d-9d36-83abd556234b\") " pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:42.521592 master-0 kubenswrapper[18592]: I0308 04:17:42.521573 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smnfz\" (UniqueName: \"kubernetes.io/projected/08bdf9f0-9d7a-436b-9f65-a313c8d71f69-kube-api-access-smnfz\") pod \"cinder-ff301-scheduler-0\" (UID: \"08bdf9f0-9d7a-436b-9f65-a313c8d71f69\") " pod="openstack/cinder-ff301-scheduler-0" Mar 08 04:17:42.521863 master-0 kubenswrapper[18592]: I0308 04:17:42.521796 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ad32be0-4591-419d-9d36-83abd556234b-scripts\") pod \"cinder-ff301-backup-0\" (UID: \"6ad32be0-4591-419d-9d36-83abd556234b\") " pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:42.522041 master-0 kubenswrapper[18592]: I0308 04:17:42.522019 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vltmk\" (UniqueName: \"kubernetes.io/projected/6ad32be0-4591-419d-9d36-83abd556234b-kube-api-access-vltmk\") pod \"cinder-ff301-backup-0\" (UID: \"6ad32be0-4591-419d-9d36-83abd556234b\") " pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:42.522157 master-0 kubenswrapper[18592]: I0308 04:17:42.522138 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6ad32be0-4591-419d-9d36-83abd556234b-lib-modules\") pod \"cinder-ff301-backup-0\" (UID: \"6ad32be0-4591-419d-9d36-83abd556234b\") " pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:42.522327 master-0 kubenswrapper[18592]: I0308 04:17:42.522307 18592 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/6ad32be0-4591-419d-9d36-83abd556234b-var-lib-cinder\") pod \"cinder-ff301-backup-0\" (UID: \"6ad32be0-4591-419d-9d36-83abd556234b\") " pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:42.522596 master-0 kubenswrapper[18592]: I0308 04:17:42.522575 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/6ad32be0-4591-419d-9d36-83abd556234b-var-lib-cinder\") pod \"cinder-ff301-backup-0\" (UID: \"6ad32be0-4591-419d-9d36-83abd556234b\") " pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:42.524461 master-0 kubenswrapper[18592]: I0308 04:17:42.523712 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/6ad32be0-4591-419d-9d36-83abd556234b-var-locks-cinder\") pod \"cinder-ff301-backup-0\" (UID: \"6ad32be0-4591-419d-9d36-83abd556234b\") " pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:42.524461 master-0 kubenswrapper[18592]: I0308 04:17:42.514774 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/6ad32be0-4591-419d-9d36-83abd556234b-etc-iscsi\") pod \"cinder-ff301-backup-0\" (UID: \"6ad32be0-4591-419d-9d36-83abd556234b\") " pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:42.524461 master-0 kubenswrapper[18592]: I0308 04:17:42.523861 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6ad32be0-4591-419d-9d36-83abd556234b-lib-modules\") pod \"cinder-ff301-backup-0\" (UID: \"6ad32be0-4591-419d-9d36-83abd556234b\") " pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:42.527860 master-0 kubenswrapper[18592]: I0308 04:17:42.524784 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6ad32be0-4591-419d-9d36-83abd556234b-config-data\") pod \"cinder-ff301-backup-0\" (UID: \"6ad32be0-4591-419d-9d36-83abd556234b\") " pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:42.527860 master-0 kubenswrapper[18592]: I0308 04:17:42.525113 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/6ad32be0-4591-419d-9d36-83abd556234b-dev\") pod \"cinder-ff301-backup-0\" (UID: \"6ad32be0-4591-419d-9d36-83abd556234b\") " pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:42.527860 master-0 kubenswrapper[18592]: I0308 04:17:42.525739 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6ad32be0-4591-419d-9d36-83abd556234b-etc-machine-id\") pod \"cinder-ff301-backup-0\" (UID: \"6ad32be0-4591-419d-9d36-83abd556234b\") " pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:42.527860 master-0 kubenswrapper[18592]: I0308 04:17:42.526470 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ad32be0-4591-419d-9d36-83abd556234b-scripts\") pod \"cinder-ff301-backup-0\" (UID: \"6ad32be0-4591-419d-9d36-83abd556234b\") " pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:42.527860 master-0 kubenswrapper[18592]: I0308 04:17:42.526681 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/08bdf9f0-9d7a-436b-9f65-a313c8d71f69-etc-machine-id\") pod \"cinder-ff301-scheduler-0\" (UID: \"08bdf9f0-9d7a-436b-9f65-a313c8d71f69\") " pod="openstack/cinder-ff301-scheduler-0" Mar 08 04:17:42.527860 master-0 kubenswrapper[18592]: I0308 04:17:42.526731 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/6ad32be0-4591-419d-9d36-83abd556234b-var-locks-brick\") pod \"cinder-ff301-backup-0\" (UID: 
\"6ad32be0-4591-419d-9d36-83abd556234b\") " pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:42.527860 master-0 kubenswrapper[18592]: I0308 04:17:42.526755 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6ad32be0-4591-419d-9d36-83abd556234b-run\") pod \"cinder-ff301-backup-0\" (UID: \"6ad32be0-4591-419d-9d36-83abd556234b\") " pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:42.530095 master-0 kubenswrapper[18592]: I0308 04:17:42.528969 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ad32be0-4591-419d-9d36-83abd556234b-combined-ca-bundle\") pod \"cinder-ff301-backup-0\" (UID: \"6ad32be0-4591-419d-9d36-83abd556234b\") " pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:42.530095 master-0 kubenswrapper[18592]: I0308 04:17:42.529379 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08bdf9f0-9d7a-436b-9f65-a313c8d71f69-scripts\") pod \"cinder-ff301-scheduler-0\" (UID: \"08bdf9f0-9d7a-436b-9f65-a313c8d71f69\") " pod="openstack/cinder-ff301-scheduler-0" Mar 08 04:17:42.530095 master-0 kubenswrapper[18592]: I0308 04:17:42.530021 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6ad32be0-4591-419d-9d36-83abd556234b-config-data-custom\") pod \"cinder-ff301-backup-0\" (UID: \"6ad32be0-4591-419d-9d36-83abd556234b\") " pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:42.530423 master-0 kubenswrapper[18592]: I0308 04:17:42.530386 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08bdf9f0-9d7a-436b-9f65-a313c8d71f69-config-data\") pod \"cinder-ff301-scheduler-0\" (UID: \"08bdf9f0-9d7a-436b-9f65-a313c8d71f69\") " pod="openstack/cinder-ff301-scheduler-0" Mar 08 04:17:42.530850 master-0 
kubenswrapper[18592]: I0308 04:17:42.530815 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08bdf9f0-9d7a-436b-9f65-a313c8d71f69-config-data-custom\") pod \"cinder-ff301-scheduler-0\" (UID: \"08bdf9f0-9d7a-436b-9f65-a313c8d71f69\") " pod="openstack/cinder-ff301-scheduler-0" Mar 08 04:17:42.536879 master-0 kubenswrapper[18592]: I0308 04:17:42.534958 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08bdf9f0-9d7a-436b-9f65-a313c8d71f69-combined-ca-bundle\") pod \"cinder-ff301-scheduler-0\" (UID: \"08bdf9f0-9d7a-436b-9f65-a313c8d71f69\") " pod="openstack/cinder-ff301-scheduler-0" Mar 08 04:17:42.559070 master-0 kubenswrapper[18592]: I0308 04:17:42.559003 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-57f9f57fc6-pfwg4" Mar 08 04:17:42.661578 master-0 kubenswrapper[18592]: I0308 04:17:42.661126 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vltmk\" (UniqueName: \"kubernetes.io/projected/6ad32be0-4591-419d-9d36-83abd556234b-kube-api-access-vltmk\") pod \"cinder-ff301-backup-0\" (UID: \"6ad32be0-4591-419d-9d36-83abd556234b\") " pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:42.671619 master-0 kubenswrapper[18592]: I0308 04:17:42.671565 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smnfz\" (UniqueName: \"kubernetes.io/projected/08bdf9f0-9d7a-436b-9f65-a313c8d71f69-kube-api-access-smnfz\") pod \"cinder-ff301-scheduler-0\" (UID: \"08bdf9f0-9d7a-436b-9f65-a313c8d71f69\") " pod="openstack/cinder-ff301-scheduler-0" Mar 08 04:17:42.708665 master-0 kubenswrapper[18592]: I0308 04:17:42.708599 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-ff301-scheduler-0" Mar 08 04:17:42.733340 master-0 kubenswrapper[18592]: I0308 04:17:42.733173 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:43.032626 master-0 kubenswrapper[18592]: I0308 04:17:43.017223 18592 scope.go:117] "RemoveContainer" containerID="462e81e5ce25b176c9b0817827c6ba8697290e0fcb4c8129909f17d350faa51b" Mar 08 04:17:43.032626 master-0 kubenswrapper[18592]: I0308 04:17:43.023868 18592 generic.go:334] "Generic (PLEG): container finished" podID="abe912ba-4a33-4634-a3fb-b6fb09b38d8e" containerID="7412fef920c13b9ed12be8da1a35b12e4fe6ef0a0a5ae459634bcd60e1c0636f" exitCode=0 Mar 08 04:17:43.032626 master-0 kubenswrapper[18592]: I0308 04:17:43.023954 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"abe912ba-4a33-4634-a3fb-b6fb09b38d8e","Type":"ContainerDied","Data":"7412fef920c13b9ed12be8da1a35b12e4fe6ef0a0a5ae459634bcd60e1c0636f"} Mar 08 04:17:43.032626 master-0 kubenswrapper[18592]: E0308 04:17:43.028466 18592 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ironic-api pod=ironic-568f56df46-pmcb4_openstack(a793c425-75a3-4c0d-95a0-43f6bdf96bf5)\"" pod="openstack/ironic-568f56df46-pmcb4" podUID="a793c425-75a3-4c0d-95a0-43f6bdf96bf5" Mar 08 04:17:43.179923 master-0 kubenswrapper[18592]: I0308 04:17:43.179837 18592 generic.go:334] "Generic (PLEG): container finished" podID="05ba9b98-7d2f-4a9b-80ad-60793d8279e8" containerID="78a26f8923875ef6ca34dad35ae3996e9a46df95c44c8b7f37f4d7b9154f9694" exitCode=1 Mar 08 04:17:43.180195 master-0 kubenswrapper[18592]: I0308 04:17:43.180142 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-859d47fc89-z2wvz" 
event={"ID":"05ba9b98-7d2f-4a9b-80ad-60793d8279e8","Type":"ContainerDied","Data":"78a26f8923875ef6ca34dad35ae3996e9a46df95c44c8b7f37f4d7b9154f9694"} Mar 08 04:17:43.181266 master-0 kubenswrapper[18592]: I0308 04:17:43.181040 18592 scope.go:117] "RemoveContainer" containerID="78a26f8923875ef6ca34dad35ae3996e9a46df95c44c8b7f37f4d7b9154f9694" Mar 08 04:17:43.193250 master-0 kubenswrapper[18592]: I0308 04:17:43.185109 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-59ffff478d-tll4r"] Mar 08 04:17:43.193250 master-0 kubenswrapper[18592]: I0308 04:17:43.187056 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-59ffff478d-tll4r" Mar 08 04:17:43.243109 master-0 kubenswrapper[18592]: I0308 04:17:43.243032 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-59ffff478d-tll4r"] Mar 08 04:17:43.357860 master-0 kubenswrapper[18592]: I0308 04:17:43.357721 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4q6jg\" (UniqueName: \"kubernetes.io/projected/29953134-4064-46e8-9791-cf9c07d7f106-kube-api-access-4q6jg\") pod \"placement-59ffff478d-tll4r\" (UID: \"29953134-4064-46e8-9791-cf9c07d7f106\") " pod="openstack/placement-59ffff478d-tll4r" Mar 08 04:17:43.357860 master-0 kubenswrapper[18592]: I0308 04:17:43.357771 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29953134-4064-46e8-9791-cf9c07d7f106-config-data\") pod \"placement-59ffff478d-tll4r\" (UID: \"29953134-4064-46e8-9791-cf9c07d7f106\") " pod="openstack/placement-59ffff478d-tll4r" Mar 08 04:17:43.357979 master-0 kubenswrapper[18592]: I0308 04:17:43.357864 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/29953134-4064-46e8-9791-cf9c07d7f106-scripts\") pod \"placement-59ffff478d-tll4r\" (UID: \"29953134-4064-46e8-9791-cf9c07d7f106\") " pod="openstack/placement-59ffff478d-tll4r" Mar 08 04:17:43.357979 master-0 kubenswrapper[18592]: I0308 04:17:43.357923 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/29953134-4064-46e8-9791-cf9c07d7f106-public-tls-certs\") pod \"placement-59ffff478d-tll4r\" (UID: \"29953134-4064-46e8-9791-cf9c07d7f106\") " pod="openstack/placement-59ffff478d-tll4r" Mar 08 04:17:43.357979 master-0 kubenswrapper[18592]: I0308 04:17:43.357949 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29953134-4064-46e8-9791-cf9c07d7f106-combined-ca-bundle\") pod \"placement-59ffff478d-tll4r\" (UID: \"29953134-4064-46e8-9791-cf9c07d7f106\") " pod="openstack/placement-59ffff478d-tll4r" Mar 08 04:17:43.358079 master-0 kubenswrapper[18592]: I0308 04:17:43.358000 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29953134-4064-46e8-9791-cf9c07d7f106-logs\") pod \"placement-59ffff478d-tll4r\" (UID: \"29953134-4064-46e8-9791-cf9c07d7f106\") " pod="openstack/placement-59ffff478d-tll4r" Mar 08 04:17:43.358079 master-0 kubenswrapper[18592]: I0308 04:17:43.358031 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/29953134-4064-46e8-9791-cf9c07d7f106-internal-tls-certs\") pod \"placement-59ffff478d-tll4r\" (UID: \"29953134-4064-46e8-9791-cf9c07d7f106\") " pod="openstack/placement-59ffff478d-tll4r" Mar 08 04:17:43.409984 master-0 kubenswrapper[18592]: I0308 04:17:43.409925 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/cinder-ff301-backup-0"] Mar 08 04:17:43.459154 master-0 kubenswrapper[18592]: I0308 04:17:43.459078 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-ff301-scheduler-0"] Mar 08 04:17:43.460016 master-0 kubenswrapper[18592]: I0308 04:17:43.459969 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/29953134-4064-46e8-9791-cf9c07d7f106-public-tls-certs\") pod \"placement-59ffff478d-tll4r\" (UID: \"29953134-4064-46e8-9791-cf9c07d7f106\") " pod="openstack/placement-59ffff478d-tll4r" Mar 08 04:17:43.460070 master-0 kubenswrapper[18592]: I0308 04:17:43.460026 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29953134-4064-46e8-9791-cf9c07d7f106-combined-ca-bundle\") pod \"placement-59ffff478d-tll4r\" (UID: \"29953134-4064-46e8-9791-cf9c07d7f106\") " pod="openstack/placement-59ffff478d-tll4r" Mar 08 04:17:43.460149 master-0 kubenswrapper[18592]: I0308 04:17:43.460091 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29953134-4064-46e8-9791-cf9c07d7f106-logs\") pod \"placement-59ffff478d-tll4r\" (UID: \"29953134-4064-46e8-9791-cf9c07d7f106\") " pod="openstack/placement-59ffff478d-tll4r" Mar 08 04:17:43.460149 master-0 kubenswrapper[18592]: I0308 04:17:43.460127 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/29953134-4064-46e8-9791-cf9c07d7f106-internal-tls-certs\") pod \"placement-59ffff478d-tll4r\" (UID: \"29953134-4064-46e8-9791-cf9c07d7f106\") " pod="openstack/placement-59ffff478d-tll4r" Mar 08 04:17:43.460228 master-0 kubenswrapper[18592]: I0308 04:17:43.460195 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4q6jg\" (UniqueName: 
\"kubernetes.io/projected/29953134-4064-46e8-9791-cf9c07d7f106-kube-api-access-4q6jg\") pod \"placement-59ffff478d-tll4r\" (UID: \"29953134-4064-46e8-9791-cf9c07d7f106\") " pod="openstack/placement-59ffff478d-tll4r"
Mar 08 04:17:43.460228 master-0 kubenswrapper[18592]: I0308 04:17:43.460213 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29953134-4064-46e8-9791-cf9c07d7f106-config-data\") pod \"placement-59ffff478d-tll4r\" (UID: \"29953134-4064-46e8-9791-cf9c07d7f106\") " pod="openstack/placement-59ffff478d-tll4r"
Mar 08 04:17:43.460672 master-0 kubenswrapper[18592]: I0308 04:17:43.460631 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29953134-4064-46e8-9791-cf9c07d7f106-scripts\") pod \"placement-59ffff478d-tll4r\" (UID: \"29953134-4064-46e8-9791-cf9c07d7f106\") " pod="openstack/placement-59ffff478d-tll4r"
Mar 08 04:17:43.462358 master-0 kubenswrapper[18592]: I0308 04:17:43.462317 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29953134-4064-46e8-9791-cf9c07d7f106-logs\") pod \"placement-59ffff478d-tll4r\" (UID: \"29953134-4064-46e8-9791-cf9c07d7f106\") " pod="openstack/placement-59ffff478d-tll4r"
Mar 08 04:17:43.465461 master-0 kubenswrapper[18592]: I0308 04:17:43.465409 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/29953134-4064-46e8-9791-cf9c07d7f106-public-tls-certs\") pod \"placement-59ffff478d-tll4r\" (UID: \"29953134-4064-46e8-9791-cf9c07d7f106\") " pod="openstack/placement-59ffff478d-tll4r"
Mar 08 04:17:43.465774 master-0 kubenswrapper[18592]: I0308 04:17:43.465733 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29953134-4064-46e8-9791-cf9c07d7f106-scripts\") pod \"placement-59ffff478d-tll4r\" (UID: \"29953134-4064-46e8-9791-cf9c07d7f106\") " pod="openstack/placement-59ffff478d-tll4r"
Mar 08 04:17:43.468651 master-0 kubenswrapper[18592]: I0308 04:17:43.468617 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/29953134-4064-46e8-9791-cf9c07d7f106-internal-tls-certs\") pod \"placement-59ffff478d-tll4r\" (UID: \"29953134-4064-46e8-9791-cf9c07d7f106\") " pod="openstack/placement-59ffff478d-tll4r"
Mar 08 04:17:43.469427 master-0 kubenswrapper[18592]: I0308 04:17:43.469363 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29953134-4064-46e8-9791-cf9c07d7f106-combined-ca-bundle\") pod \"placement-59ffff478d-tll4r\" (UID: \"29953134-4064-46e8-9791-cf9c07d7f106\") " pod="openstack/placement-59ffff478d-tll4r"
Mar 08 04:17:43.478602 master-0 kubenswrapper[18592]: I0308 04:17:43.478570 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29953134-4064-46e8-9791-cf9c07d7f106-config-data\") pod \"placement-59ffff478d-tll4r\" (UID: \"29953134-4064-46e8-9791-cf9c07d7f106\") " pod="openstack/placement-59ffff478d-tll4r"
Mar 08 04:17:43.490135 master-0 kubenswrapper[18592]: I0308 04:17:43.490101 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4q6jg\" (UniqueName: \"kubernetes.io/projected/29953134-4064-46e8-9791-cf9c07d7f106-kube-api-access-4q6jg\") pod \"placement-59ffff478d-tll4r\" (UID: \"29953134-4064-46e8-9791-cf9c07d7f106\") " pod="openstack/placement-59ffff478d-tll4r"
Mar 08 04:17:43.520304 master-0 kubenswrapper[18592]: I0308 04:17:43.520257 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-59ffff478d-tll4r"
Mar 08 04:17:43.520484 master-0 kubenswrapper[18592]: I0308 04:17:43.520313 18592 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/ironic-neutron-agent-859d47fc89-z2wvz"
Mar 08 04:17:43.550695 master-0 kubenswrapper[18592]: I0308 04:17:43.550647 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-76b6c769dc-vwrbs"
Mar 08 04:17:43.645868 master-0 kubenswrapper[18592]: I0308 04:17:43.644965 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5584fcc769-csbz6"]
Mar 08 04:17:43.645868 master-0 kubenswrapper[18592]: I0308 04:17:43.645205 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5584fcc769-csbz6" podUID="ea394be4-dd2b-4519-b7bc-80ac84a4cd16" containerName="dnsmasq-dns" containerID="cri-o://05b7d99b429e89d85d450e165aad20f9bbf2cbd354aa2d0ae238fab324d45d41" gracePeriod=10
Mar 08 04:17:43.869944 master-0 kubenswrapper[18592]: I0308 04:17:43.865165 18592 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/ironic-568f56df46-pmcb4"
Mar 08 04:17:43.869944 master-0 kubenswrapper[18592]: I0308 04:17:43.865222 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-568f56df46-pmcb4"
Mar 08 04:17:44.303846 master-0 kubenswrapper[18592]: I0308 04:17:44.301056 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f0421ee-53ce-4160-abfc-a5968415005b" path="/var/lib/kubelet/pods/0f0421ee-53ce-4160-abfc-a5968415005b/volumes"
Mar 08 04:17:44.335845 master-0 kubenswrapper[18592]: I0308 04:17:44.332982 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="808f8fcb-c2b2-4df7-9c7a-9838aaf744af" path="/var/lib/kubelet/pods/808f8fcb-c2b2-4df7-9c7a-9838aaf744af/volumes"
Mar 08 04:17:44.403079 master-0 kubenswrapper[18592]: I0308 04:17:44.401689 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-ff301-volume-lvm-iscsi-0"
Mar 08 04:17:44.463256 master-0 kubenswrapper[18592]: I0308 04:17:44.463087 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-859d47fc89-z2wvz" event={"ID":"05ba9b98-7d2f-4a9b-80ad-60793d8279e8","Type":"ContainerStarted","Data":"ecb55f322362dccbbff792a642555df63b70ceae0b4b089367e3d027710563b8"}
Mar 08 04:17:44.463838 master-0 kubenswrapper[18592]: I0308 04:17:44.463763 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-neutron-agent-859d47fc89-z2wvz"
Mar 08 04:17:44.548452 master-0 kubenswrapper[18592]: I0308 04:17:44.540743 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ff301-backup-0" event={"ID":"6ad32be0-4591-419d-9d36-83abd556234b","Type":"ContainerStarted","Data":"2fc37dede99b5f9a9fcea5d7c8b177b1cc9ada704a4c6de0ab22e6d7e5d6acfd"}
Mar 08 04:17:44.548452 master-0 kubenswrapper[18592]: I0308 04:17:44.540795 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ff301-backup-0" event={"ID":"6ad32be0-4591-419d-9d36-83abd556234b","Type":"ContainerStarted","Data":"b7b76f89c055c37eab7d744979088c250907877d2a532d0fde35d920233f716f"}
Mar 08 04:17:44.632198 master-0 kubenswrapper[18592]: I0308 04:17:44.628112 18592 generic.go:334] "Generic (PLEG): container finished" podID="ea394be4-dd2b-4519-b7bc-80ac84a4cd16" containerID="05b7d99b429e89d85d450e165aad20f9bbf2cbd354aa2d0ae238fab324d45d41" exitCode=0
Mar 08 04:17:44.632198 master-0 kubenswrapper[18592]: I0308 04:17:44.628245 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5584fcc769-csbz6" event={"ID":"ea394be4-dd2b-4519-b7bc-80ac84a4cd16","Type":"ContainerDied","Data":"05b7d99b429e89d85d450e165aad20f9bbf2cbd354aa2d0ae238fab324d45d41"}
Mar 08 04:17:44.632198 master-0 kubenswrapper[18592]: I0308 04:17:44.628884 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5584fcc769-csbz6"
Mar 08 04:17:44.639882 master-0 kubenswrapper[18592]: I0308 04:17:44.639648 18592 scope.go:117] "RemoveContainer" containerID="462e81e5ce25b176c9b0817827c6ba8697290e0fcb4c8129909f17d350faa51b"
Mar 08 04:17:44.640043 master-0 kubenswrapper[18592]: E0308 04:17:44.639921 18592 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ironic-api pod=ironic-568f56df46-pmcb4_openstack(a793c425-75a3-4c0d-95a0-43f6bdf96bf5)\"" pod="openstack/ironic-568f56df46-pmcb4" podUID="a793c425-75a3-4c0d-95a0-43f6bdf96bf5"
Mar 08 04:17:44.643149 master-0 kubenswrapper[18592]: I0308 04:17:44.640111 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ff301-scheduler-0" event={"ID":"08bdf9f0-9d7a-436b-9f65-a313c8d71f69","Type":"ContainerStarted","Data":"6dc5082b3bfa5c8874e40b8097e82e279ecf4921731ebfe2be3f23bb9ddd90b0"}
Mar 08 04:17:44.662002 master-0 kubenswrapper[18592]: I0308 04:17:44.661963 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-59ffff478d-tll4r"]
Mar 08 04:17:44.683432 master-0 kubenswrapper[18592]: I0308 04:17:44.683376 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-ff301-backup-0" podStartSLOduration=2.683353027 podStartE2EDuration="2.683353027s" podCreationTimestamp="2026-03-08 04:17:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 04:17:44.615394591 +0000 UTC m=+1476.714148941" watchObservedRunningTime="2026-03-08 04:17:44.683353027 +0000 UTC m=+1476.782107377"
Mar 08 04:17:44.727843 master-0 kubenswrapper[18592]: I0308 04:17:44.718895 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ea394be4-dd2b-4519-b7bc-80ac84a4cd16-dns-swift-storage-0\") pod \"ea394be4-dd2b-4519-b7bc-80ac84a4cd16\" (UID: \"ea394be4-dd2b-4519-b7bc-80ac84a4cd16\") "
Mar 08 04:17:44.727843 master-0 kubenswrapper[18592]: I0308 04:17:44.719082 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ea394be4-dd2b-4519-b7bc-80ac84a4cd16-dns-svc\") pod \"ea394be4-dd2b-4519-b7bc-80ac84a4cd16\" (UID: \"ea394be4-dd2b-4519-b7bc-80ac84a4cd16\") "
Mar 08 04:17:44.727843 master-0 kubenswrapper[18592]: I0308 04:17:44.719105 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ea394be4-dd2b-4519-b7bc-80ac84a4cd16-ovsdbserver-sb\") pod \"ea394be4-dd2b-4519-b7bc-80ac84a4cd16\" (UID: \"ea394be4-dd2b-4519-b7bc-80ac84a4cd16\") "
Mar 08 04:17:44.727843 master-0 kubenswrapper[18592]: I0308 04:17:44.719137 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea394be4-dd2b-4519-b7bc-80ac84a4cd16-config\") pod \"ea394be4-dd2b-4519-b7bc-80ac84a4cd16\" (UID: \"ea394be4-dd2b-4519-b7bc-80ac84a4cd16\") "
Mar 08 04:17:44.727843 master-0 kubenswrapper[18592]: I0308 04:17:44.719208 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95w42\" (UniqueName: \"kubernetes.io/projected/ea394be4-dd2b-4519-b7bc-80ac84a4cd16-kube-api-access-95w42\") pod \"ea394be4-dd2b-4519-b7bc-80ac84a4cd16\" (UID: \"ea394be4-dd2b-4519-b7bc-80ac84a4cd16\") "
Mar 08 04:17:44.727843 master-0 kubenswrapper[18592]: I0308 04:17:44.719297 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ea394be4-dd2b-4519-b7bc-80ac84a4cd16-ovsdbserver-nb\") pod \"ea394be4-dd2b-4519-b7bc-80ac84a4cd16\" (UID: \"ea394be4-dd2b-4519-b7bc-80ac84a4cd16\") "
Mar 08 04:17:44.790648 master-0 kubenswrapper[18592]: I0308 04:17:44.790386 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea394be4-dd2b-4519-b7bc-80ac84a4cd16-kube-api-access-95w42" (OuterVolumeSpecName: "kube-api-access-95w42") pod "ea394be4-dd2b-4519-b7bc-80ac84a4cd16" (UID: "ea394be4-dd2b-4519-b7bc-80ac84a4cd16"). InnerVolumeSpecName "kube-api-access-95w42". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 04:17:44.822607 master-0 kubenswrapper[18592]: I0308 04:17:44.822222 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95w42\" (UniqueName: \"kubernetes.io/projected/ea394be4-dd2b-4519-b7bc-80ac84a4cd16-kube-api-access-95w42\") on node \"master-0\" DevicePath \"\""
Mar 08 04:17:44.985501 master-0 kubenswrapper[18592]: I0308 04:17:44.985415 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea394be4-dd2b-4519-b7bc-80ac84a4cd16-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ea394be4-dd2b-4519-b7bc-80ac84a4cd16" (UID: "ea394be4-dd2b-4519-b7bc-80ac84a4cd16"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 04:17:45.024314 master-0 kubenswrapper[18592]: I0308 04:17:45.024023 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea394be4-dd2b-4519-b7bc-80ac84a4cd16-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ea394be4-dd2b-4519-b7bc-80ac84a4cd16" (UID: "ea394be4-dd2b-4519-b7bc-80ac84a4cd16"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 04:17:45.027046 master-0 kubenswrapper[18592]: I0308 04:17:45.026903 18592 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ea394be4-dd2b-4519-b7bc-80ac84a4cd16-dns-svc\") on node \"master-0\" DevicePath \"\""
Mar 08 04:17:45.027046 master-0 kubenswrapper[18592]: I0308 04:17:45.027033 18592 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ea394be4-dd2b-4519-b7bc-80ac84a4cd16-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\""
Mar 08 04:17:45.034348 master-0 kubenswrapper[18592]: I0308 04:17:45.032920 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea394be4-dd2b-4519-b7bc-80ac84a4cd16-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ea394be4-dd2b-4519-b7bc-80ac84a4cd16" (UID: "ea394be4-dd2b-4519-b7bc-80ac84a4cd16"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 04:17:45.048496 master-0 kubenswrapper[18592]: I0308 04:17:45.048452 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea394be4-dd2b-4519-b7bc-80ac84a4cd16-config" (OuterVolumeSpecName: "config") pod "ea394be4-dd2b-4519-b7bc-80ac84a4cd16" (UID: "ea394be4-dd2b-4519-b7bc-80ac84a4cd16"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 04:17:45.065183 master-0 kubenswrapper[18592]: I0308 04:17:45.065129 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea394be4-dd2b-4519-b7bc-80ac84a4cd16-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ea394be4-dd2b-4519-b7bc-80ac84a4cd16" (UID: "ea394be4-dd2b-4519-b7bc-80ac84a4cd16"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 04:17:45.131297 master-0 kubenswrapper[18592]: I0308 04:17:45.128520 18592 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea394be4-dd2b-4519-b7bc-80ac84a4cd16-config\") on node \"master-0\" DevicePath \"\""
Mar 08 04:17:45.131297 master-0 kubenswrapper[18592]: I0308 04:17:45.128566 18592 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ea394be4-dd2b-4519-b7bc-80ac84a4cd16-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\""
Mar 08 04:17:45.131297 master-0 kubenswrapper[18592]: I0308 04:17:45.128581 18592 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ea394be4-dd2b-4519-b7bc-80ac84a4cd16-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\""
Mar 08 04:17:45.655296 master-0 kubenswrapper[18592]: I0308 04:17:45.655250 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ff301-backup-0" event={"ID":"6ad32be0-4591-419d-9d36-83abd556234b","Type":"ContainerStarted","Data":"ad893e7eb99699d555982074e0d6349ce0fe5d10336d158c490ab9aaf074eba3"}
Mar 08 04:17:45.658370 master-0 kubenswrapper[18592]: I0308 04:17:45.658075 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5584fcc769-csbz6" event={"ID":"ea394be4-dd2b-4519-b7bc-80ac84a4cd16","Type":"ContainerDied","Data":"b55ff8dfc679a86aeb9f480f0251a0568e987ff79fa428cfbf0c61d62ffb8556"}
Mar 08 04:17:45.658370 master-0 kubenswrapper[18592]: I0308 04:17:45.658122 18592 scope.go:117] "RemoveContainer" containerID="05b7d99b429e89d85d450e165aad20f9bbf2cbd354aa2d0ae238fab324d45d41"
Mar 08 04:17:45.658370 master-0 kubenswrapper[18592]: I0308 04:17:45.658226 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5584fcc769-csbz6"
Mar 08 04:17:45.670778 master-0 kubenswrapper[18592]: I0308 04:17:45.670737 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ff301-scheduler-0" event={"ID":"08bdf9f0-9d7a-436b-9f65-a313c8d71f69","Type":"ContainerStarted","Data":"8f6010fd9e6437e22687dd4327e88be461e5b716d7d5b74bc295db4a703a4077"}
Mar 08 04:17:45.674383 master-0 kubenswrapper[18592]: I0308 04:17:45.674298 18592 scope.go:117] "RemoveContainer" containerID="462e81e5ce25b176c9b0817827c6ba8697290e0fcb4c8129909f17d350faa51b"
Mar 08 04:17:45.674646 master-0 kubenswrapper[18592]: E0308 04:17:45.674619 18592 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ironic-api pod=ironic-568f56df46-pmcb4_openstack(a793c425-75a3-4c0d-95a0-43f6bdf96bf5)\"" pod="openstack/ironic-568f56df46-pmcb4" podUID="a793c425-75a3-4c0d-95a0-43f6bdf96bf5"
Mar 08 04:17:45.676060 master-0 kubenswrapper[18592]: I0308 04:17:45.674979 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-59ffff478d-tll4r" event={"ID":"29953134-4064-46e8-9791-cf9c07d7f106","Type":"ContainerStarted","Data":"ade56d313efccf9b90a3b80e8c9facab7a03a061810cf2613516691f1040cbd0"}
Mar 08 04:17:45.676060 master-0 kubenswrapper[18592]: I0308 04:17:45.675006 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-59ffff478d-tll4r" event={"ID":"29953134-4064-46e8-9791-cf9c07d7f106","Type":"ContainerStarted","Data":"81b852f2ab10446cb2da09180b0964013a1648dd740467f157eb5fd4bfab4487"}
Mar 08 04:17:45.701140 master-0 kubenswrapper[18592]: I0308 04:17:45.701024 18592 scope.go:117] "RemoveContainer" containerID="a8b5d8309bbaf476dac3e6aafc6d67940f17c8c7d9318f0f2f1137533e2777d0"
Mar 08 04:17:45.748427 master-0 kubenswrapper[18592]: I0308 04:17:45.748329 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5584fcc769-csbz6"]
Mar 08 04:17:45.776591 master-0 kubenswrapper[18592]: I0308 04:17:45.776548 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5584fcc769-csbz6"]
Mar 08 04:17:46.186587 master-0 kubenswrapper[18592]: I0308 04:17:46.186528 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea394be4-dd2b-4519-b7bc-80ac84a4cd16" path="/var/lib/kubelet/pods/ea394be4-dd2b-4519-b7bc-80ac84a4cd16/volumes"
Mar 08 04:17:46.701780 master-0 kubenswrapper[18592]: I0308 04:17:46.701720 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ff301-scheduler-0" event={"ID":"08bdf9f0-9d7a-436b-9f65-a313c8d71f69","Type":"ContainerStarted","Data":"80b41d3d3c49bd721c91fdfffd43b2b35116e020a12467572f3f09e885bcea93"}
Mar 08 04:17:46.705475 master-0 kubenswrapper[18592]: I0308 04:17:46.705431 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-59ffff478d-tll4r" event={"ID":"29953134-4064-46e8-9791-cf9c07d7f106","Type":"ContainerStarted","Data":"00dc589c59a5db23672385faa9d976773151a9d88bc292b8b46884b8d6c5aea5"}
Mar 08 04:17:46.841281 master-0 kubenswrapper[18592]: I0308 04:17:46.840134 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-ff301-scheduler-0" podStartSLOduration=4.840110153 podStartE2EDuration="4.840110153s" podCreationTimestamp="2026-03-08 04:17:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 04:17:46.803053267 +0000 UTC m=+1478.901807627" watchObservedRunningTime="2026-03-08 04:17:46.840110153 +0000 UTC m=+1478.938864503"
Mar 08 04:17:47.709107 master-0 kubenswrapper[18592]: I0308 04:17:47.709056 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-ff301-scheduler-0"
Mar 08 04:17:47.733147 master-0 kubenswrapper[18592]: I0308 04:17:47.733102 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-59ffff478d-tll4r"
Mar 08 04:17:47.733308 master-0 kubenswrapper[18592]: I0308 04:17:47.733160 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-ff301-backup-0"
Mar 08 04:17:47.733975 master-0 kubenswrapper[18592]: I0308 04:17:47.733925 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-59ffff478d-tll4r"
Mar 08 04:17:47.958920 master-0 kubenswrapper[18592]: I0308 04:17:47.954814 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-59ffff478d-tll4r" podStartSLOduration=4.954795653 podStartE2EDuration="4.954795653s" podCreationTimestamp="2026-03-08 04:17:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 04:17:46.850362542 +0000 UTC m=+1478.949116902" watchObservedRunningTime="2026-03-08 04:17:47.954795653 +0000 UTC m=+1480.053550003"
Mar 08 04:17:47.966506 master-0 kubenswrapper[18592]: I0308 04:17:47.965546 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-inspector-db-sync-fswtl"]
Mar 08 04:17:47.967433 master-0 kubenswrapper[18592]: E0308 04:17:47.967406 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea394be4-dd2b-4519-b7bc-80ac84a4cd16" containerName="dnsmasq-dns"
Mar 08 04:17:47.967498 master-0 kubenswrapper[18592]: I0308 04:17:47.967435 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea394be4-dd2b-4519-b7bc-80ac84a4cd16" containerName="dnsmasq-dns"
Mar 08 04:17:47.967498 master-0 kubenswrapper[18592]: E0308 04:17:47.967492 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea394be4-dd2b-4519-b7bc-80ac84a4cd16" containerName="init"
Mar 08 04:17:47.967498 master-0 kubenswrapper[18592]: I0308 04:17:47.967499 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea394be4-dd2b-4519-b7bc-80ac84a4cd16" containerName="init"
Mar 08 04:17:47.967765 master-0 kubenswrapper[18592]: I0308 04:17:47.967739 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea394be4-dd2b-4519-b7bc-80ac84a4cd16" containerName="dnsmasq-dns"
Mar 08 04:17:47.972810 master-0 kubenswrapper[18592]: I0308 04:17:47.968548 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-db-sync-fswtl"
Mar 08 04:17:47.974419 master-0 kubenswrapper[18592]: I0308 04:17:47.974383 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-config-data"
Mar 08 04:17:47.974680 master-0 kubenswrapper[18592]: I0308 04:17:47.974614 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-scripts"
Mar 08 04:17:47.981692 master-0 kubenswrapper[18592]: I0308 04:17:47.981646 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-db-sync-fswtl"]
Mar 08 04:17:48.012960 master-0 kubenswrapper[18592]: I0308 04:17:48.012900 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-5b8d48c5b6-nqv28"
Mar 08 04:17:48.109319 master-0 kubenswrapper[18592]: I0308 04:17:48.109255 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/67779b28-c7a4-43b1-bc15-a8401880ff2e-var-lib-ironic\") pod \"ironic-inspector-db-sync-fswtl\" (UID: \"67779b28-c7a4-43b1-bc15-a8401880ff2e\") " pod="openstack/ironic-inspector-db-sync-fswtl"
Mar 08 04:17:48.109503 master-0 kubenswrapper[18592]: I0308 04:17:48.109401 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/67779b28-c7a4-43b1-bc15-a8401880ff2e-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-db-sync-fswtl\" (UID: \"67779b28-c7a4-43b1-bc15-a8401880ff2e\") " pod="openstack/ironic-inspector-db-sync-fswtl"
Mar 08 04:17:48.109503 master-0 kubenswrapper[18592]: I0308 04:17:48.109458 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgfb2\" (UniqueName: \"kubernetes.io/projected/67779b28-c7a4-43b1-bc15-a8401880ff2e-kube-api-access-pgfb2\") pod \"ironic-inspector-db-sync-fswtl\" (UID: \"67779b28-c7a4-43b1-bc15-a8401880ff2e\") " pod="openstack/ironic-inspector-db-sync-fswtl"
Mar 08 04:17:48.109669 master-0 kubenswrapper[18592]: I0308 04:17:48.109582 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/67779b28-c7a4-43b1-bc15-a8401880ff2e-config\") pod \"ironic-inspector-db-sync-fswtl\" (UID: \"67779b28-c7a4-43b1-bc15-a8401880ff2e\") " pod="openstack/ironic-inspector-db-sync-fswtl"
Mar 08 04:17:48.109669 master-0 kubenswrapper[18592]: I0308 04:17:48.109631 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67779b28-c7a4-43b1-bc15-a8401880ff2e-scripts\") pod \"ironic-inspector-db-sync-fswtl\" (UID: \"67779b28-c7a4-43b1-bc15-a8401880ff2e\") " pod="openstack/ironic-inspector-db-sync-fswtl"
Mar 08 04:17:48.109745 master-0 kubenswrapper[18592]: I0308 04:17:48.109673 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/67779b28-c7a4-43b1-bc15-a8401880ff2e-etc-podinfo\") pod \"ironic-inspector-db-sync-fswtl\" (UID: \"67779b28-c7a4-43b1-bc15-a8401880ff2e\") " pod="openstack/ironic-inspector-db-sync-fswtl"
Mar 08 04:17:48.109780 master-0 kubenswrapper[18592]: I0308 04:17:48.109742 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67779b28-c7a4-43b1-bc15-a8401880ff2e-combined-ca-bundle\") pod \"ironic-inspector-db-sync-fswtl\" (UID: \"67779b28-c7a4-43b1-bc15-a8401880ff2e\") " pod="openstack/ironic-inspector-db-sync-fswtl"
Mar 08 04:17:48.123770 master-0 kubenswrapper[18592]: I0308 04:17:48.122589 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-568f56df46-pmcb4"]
Mar 08 04:17:48.123770 master-0 kubenswrapper[18592]: I0308 04:17:48.122807 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ironic-568f56df46-pmcb4" podUID="a793c425-75a3-4c0d-95a0-43f6bdf96bf5" containerName="ironic-api-log" containerID="cri-o://4b3492860ca9dc480436b650c75257796e5192f624bc94b6ebbedddef4bda44c" gracePeriod=60
Mar 08 04:17:48.224857 master-0 kubenswrapper[18592]: I0308 04:17:48.224731 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/67779b28-c7a4-43b1-bc15-a8401880ff2e-var-lib-ironic\") pod \"ironic-inspector-db-sync-fswtl\" (UID: \"67779b28-c7a4-43b1-bc15-a8401880ff2e\") " pod="openstack/ironic-inspector-db-sync-fswtl"
Mar 08 04:17:48.225025 master-0 kubenswrapper[18592]: I0308 04:17:48.224858 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/67779b28-c7a4-43b1-bc15-a8401880ff2e-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-db-sync-fswtl\" (UID: \"67779b28-c7a4-43b1-bc15-a8401880ff2e\") " pod="openstack/ironic-inspector-db-sync-fswtl"
Mar 08 04:17:48.225025 master-0 kubenswrapper[18592]: I0308 04:17:48.224946 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgfb2\" (UniqueName: \"kubernetes.io/projected/67779b28-c7a4-43b1-bc15-a8401880ff2e-kube-api-access-pgfb2\") pod \"ironic-inspector-db-sync-fswtl\" (UID: \"67779b28-c7a4-43b1-bc15-a8401880ff2e\") " pod="openstack/ironic-inspector-db-sync-fswtl"
Mar 08 04:17:48.225105 master-0 kubenswrapper[18592]: I0308 04:17:48.225088 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/67779b28-c7a4-43b1-bc15-a8401880ff2e-config\") pod \"ironic-inspector-db-sync-fswtl\" (UID: \"67779b28-c7a4-43b1-bc15-a8401880ff2e\") " pod="openstack/ironic-inspector-db-sync-fswtl"
Mar 08 04:17:48.228315 master-0 kubenswrapper[18592]: I0308 04:17:48.225140 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67779b28-c7a4-43b1-bc15-a8401880ff2e-scripts\") pod \"ironic-inspector-db-sync-fswtl\" (UID: \"67779b28-c7a4-43b1-bc15-a8401880ff2e\") " pod="openstack/ironic-inspector-db-sync-fswtl"
Mar 08 04:17:48.228315 master-0 kubenswrapper[18592]: I0308 04:17:48.225184 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/67779b28-c7a4-43b1-bc15-a8401880ff2e-etc-podinfo\") pod \"ironic-inspector-db-sync-fswtl\" (UID: \"67779b28-c7a4-43b1-bc15-a8401880ff2e\") " pod="openstack/ironic-inspector-db-sync-fswtl"
Mar 08 04:17:48.228315 master-0 kubenswrapper[18592]: I0308 04:17:48.225445 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67779b28-c7a4-43b1-bc15-a8401880ff2e-combined-ca-bundle\") pod \"ironic-inspector-db-sync-fswtl\" (UID: \"67779b28-c7a4-43b1-bc15-a8401880ff2e\") " pod="openstack/ironic-inspector-db-sync-fswtl"
Mar 08 04:17:48.239253 master-0 kubenswrapper[18592]: I0308 04:17:48.231249 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/67779b28-c7a4-43b1-bc15-a8401880ff2e-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-db-sync-fswtl\" (UID: \"67779b28-c7a4-43b1-bc15-a8401880ff2e\") " pod="openstack/ironic-inspector-db-sync-fswtl"
Mar 08 04:17:48.239253 master-0 kubenswrapper[18592]: I0308 04:17:48.231968 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/67779b28-c7a4-43b1-bc15-a8401880ff2e-var-lib-ironic\") pod \"ironic-inspector-db-sync-fswtl\" (UID: \"67779b28-c7a4-43b1-bc15-a8401880ff2e\") " pod="openstack/ironic-inspector-db-sync-fswtl"
Mar 08 04:17:48.239253 master-0 kubenswrapper[18592]: I0308 04:17:48.234785 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/67779b28-c7a4-43b1-bc15-a8401880ff2e-config\") pod \"ironic-inspector-db-sync-fswtl\" (UID: \"67779b28-c7a4-43b1-bc15-a8401880ff2e\") " pod="openstack/ironic-inspector-db-sync-fswtl"
Mar 08 04:17:48.239253 master-0 kubenswrapper[18592]: I0308 04:17:48.236326 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/67779b28-c7a4-43b1-bc15-a8401880ff2e-etc-podinfo\") pod \"ironic-inspector-db-sync-fswtl\" (UID: \"67779b28-c7a4-43b1-bc15-a8401880ff2e\") " pod="openstack/ironic-inspector-db-sync-fswtl"
Mar 08 04:17:48.247309 master-0 kubenswrapper[18592]: I0308 04:17:48.246808 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67779b28-c7a4-43b1-bc15-a8401880ff2e-scripts\") pod \"ironic-inspector-db-sync-fswtl\" (UID: \"67779b28-c7a4-43b1-bc15-a8401880ff2e\") " pod="openstack/ironic-inspector-db-sync-fswtl"
Mar 08 04:17:48.247309 master-0 kubenswrapper[18592]: I0308 04:17:48.247147 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67779b28-c7a4-43b1-bc15-a8401880ff2e-combined-ca-bundle\") pod \"ironic-inspector-db-sync-fswtl\" (UID: \"67779b28-c7a4-43b1-bc15-a8401880ff2e\") " pod="openstack/ironic-inspector-db-sync-fswtl"
Mar 08 04:17:48.256745 master-0 kubenswrapper[18592]: I0308 04:17:48.256511 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgfb2\" (UniqueName: \"kubernetes.io/projected/67779b28-c7a4-43b1-bc15-a8401880ff2e-kube-api-access-pgfb2\") pod \"ironic-inspector-db-sync-fswtl\" (UID: \"67779b28-c7a4-43b1-bc15-a8401880ff2e\") " pod="openstack/ironic-inspector-db-sync-fswtl"
Mar 08 04:17:48.318953 master-0 kubenswrapper[18592]: I0308 04:17:48.316554 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-db-sync-fswtl"
Mar 08 04:17:48.520956 master-0 kubenswrapper[18592]: E0308 04:17:48.520663 18592 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ecb55f322362dccbbff792a642555df63b70ceae0b4b089367e3d027710563b8 is running failed: container process not found" containerID="ecb55f322362dccbbff792a642555df63b70ceae0b4b089367e3d027710563b8" cmd=["/bin/true"]
Mar 08 04:17:48.520956 master-0 kubenswrapper[18592]: E0308 04:17:48.520790 18592 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ecb55f322362dccbbff792a642555df63b70ceae0b4b089367e3d027710563b8 is running failed: container process not found" containerID="ecb55f322362dccbbff792a642555df63b70ceae0b4b089367e3d027710563b8" cmd=["/bin/true"]
Mar 08 04:17:48.521631 master-0 kubenswrapper[18592]: E0308 04:17:48.521558 18592 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ecb55f322362dccbbff792a642555df63b70ceae0b4b089367e3d027710563b8 is running failed: container process not found" containerID="ecb55f322362dccbbff792a642555df63b70ceae0b4b089367e3d027710563b8" cmd=["/bin/true"]
Mar 08 04:17:48.522112 master-0 kubenswrapper[18592]: E0308 04:17:48.521765 18592 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ecb55f322362dccbbff792a642555df63b70ceae0b4b089367e3d027710563b8 is running failed: container process not found" containerID="ecb55f322362dccbbff792a642555df63b70ceae0b4b089367e3d027710563b8" cmd=["/bin/true"]
Mar 08 04:17:48.522112 master-0 kubenswrapper[18592]: E0308 04:17:48.521791 18592 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ecb55f322362dccbbff792a642555df63b70ceae0b4b089367e3d027710563b8 is running failed: container process not found" probeType="Liveness" pod="openstack/ironic-neutron-agent-859d47fc89-z2wvz" podUID="05ba9b98-7d2f-4a9b-80ad-60793d8279e8" containerName="ironic-neutron-agent"
Mar 08 04:17:48.522112 master-0 kubenswrapper[18592]: E0308 04:17:48.522048 18592 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ecb55f322362dccbbff792a642555df63b70ceae0b4b089367e3d027710563b8 is running failed: container process not found" containerID="ecb55f322362dccbbff792a642555df63b70ceae0b4b089367e3d027710563b8" cmd=["/bin/true"]
Mar 08 04:17:48.522375 master-0 kubenswrapper[18592]: E0308 04:17:48.522310 18592 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ecb55f322362dccbbff792a642555df63b70ceae0b4b089367e3d027710563b8 is running failed: container process not found" containerID="ecb55f322362dccbbff792a642555df63b70ceae0b4b089367e3d027710563b8" cmd=["/bin/true"]
Mar 08 04:17:48.522375 master-0 kubenswrapper[18592]: E0308 04:17:48.522340 18592 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ecb55f322362dccbbff792a642555df63b70ceae0b4b089367e3d027710563b8 is running failed: container process not found" probeType="Readiness" pod="openstack/ironic-neutron-agent-859d47fc89-z2wvz" podUID="05ba9b98-7d2f-4a9b-80ad-60793d8279e8" containerName="ironic-neutron-agent"
Mar 08 04:17:48.937923 master-0 kubenswrapper[18592]: I0308 04:17:48.937842 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-c664f854-r9dzf"
Mar 08 04:17:49.374305 master-0 kubenswrapper[18592]: I0308 04:17:49.374045 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-568f56df46-pmcb4"
Mar 08 04:17:49.477525 master-0 kubenswrapper[18592]: I0308 04:17:49.477469 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-db-sync-fswtl"]
Mar 08 04:17:49.480940 master-0 kubenswrapper[18592]: I0308 04:17:49.480885 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a793c425-75a3-4c0d-95a0-43f6bdf96bf5-config-data\") pod \"a793c425-75a3-4c0d-95a0-43f6bdf96bf5\" (UID: \"a793c425-75a3-4c0d-95a0-43f6bdf96bf5\") "
Mar 08 04:17:49.481034 master-0 kubenswrapper[18592]: I0308 04:17:49.481009 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6k2b7\" (UniqueName: \"kubernetes.io/projected/a793c425-75a3-4c0d-95a0-43f6bdf96bf5-kube-api-access-6k2b7\") pod \"a793c425-75a3-4c0d-95a0-43f6bdf96bf5\" (UID: \"a793c425-75a3-4c0d-95a0-43f6bdf96bf5\") "
Mar 08 04:17:49.481126 master-0 kubenswrapper[18592]: I0308 04:17:49.481074 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a793c425-75a3-4c0d-95a0-43f6bdf96bf5-config-data-custom\") pod \"a793c425-75a3-4c0d-95a0-43f6bdf96bf5\" (UID: \"a793c425-75a3-4c0d-95a0-43f6bdf96bf5\") "
Mar 08 04:17:49.481171 master-0 kubenswrapper[18592]: I0308 04:17:49.481153 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/a793c425-75a3-4c0d-95a0-43f6bdf96bf5-etc-podinfo\") pod \"a793c425-75a3-4c0d-95a0-43f6bdf96bf5\" (UID: \"a793c425-75a3-4c0d-95a0-43f6bdf96bf5\") "
Mar 08 04:17:49.481226 master-0 kubenswrapper[18592]: I0308 04:17:49.481206 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a793c425-75a3-4c0d-95a0-43f6bdf96bf5-logs\") pod \"a793c425-75a3-4c0d-95a0-43f6bdf96bf5\" (UID: \"a793c425-75a3-4c0d-95a0-43f6bdf96bf5\") "
Mar 08 04:17:49.481267 master-0 kubenswrapper[18592]: I0308 04:17:49.481231 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a793c425-75a3-4c0d-95a0-43f6bdf96bf5-combined-ca-bundle\") pod \"a793c425-75a3-4c0d-95a0-43f6bdf96bf5\" (UID: \"a793c425-75a3-4c0d-95a0-43f6bdf96bf5\") "
Mar 08 04:17:49.481306 master-0 kubenswrapper[18592]: I0308 04:17:49.481284 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a793c425-75a3-4c0d-95a0-43f6bdf96bf5-scripts\") pod \"a793c425-75a3-4c0d-95a0-43f6bdf96bf5\" (UID: \"a793c425-75a3-4c0d-95a0-43f6bdf96bf5\") "
Mar 08 04:17:49.481372 master-0 kubenswrapper[18592]: I0308 04:17:49.481354 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/a793c425-75a3-4c0d-95a0-43f6bdf96bf5-config-data-merged\") pod \"a793c425-75a3-4c0d-95a0-43f6bdf96bf5\" (UID: \"a793c425-75a3-4c0d-95a0-43f6bdf96bf5\") "
Mar 08 04:17:49.482330 master-0 kubenswrapper[18592]: I0308 04:17:49.482278 18592 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a793c425-75a3-4c0d-95a0-43f6bdf96bf5-config-data-merged" (OuterVolumeSpecName: "config-data-merged") pod "a793c425-75a3-4c0d-95a0-43f6bdf96bf5" (UID: "a793c425-75a3-4c0d-95a0-43f6bdf96bf5"). InnerVolumeSpecName "config-data-merged". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 04:17:49.483632 master-0 kubenswrapper[18592]: I0308 04:17:49.483561 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a793c425-75a3-4c0d-95a0-43f6bdf96bf5-logs" (OuterVolumeSpecName: "logs") pod "a793c425-75a3-4c0d-95a0-43f6bdf96bf5" (UID: "a793c425-75a3-4c0d-95a0-43f6bdf96bf5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 04:17:49.488492 master-0 kubenswrapper[18592]: I0308 04:17:49.488437 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a793c425-75a3-4c0d-95a0-43f6bdf96bf5-kube-api-access-6k2b7" (OuterVolumeSpecName: "kube-api-access-6k2b7") pod "a793c425-75a3-4c0d-95a0-43f6bdf96bf5" (UID: "a793c425-75a3-4c0d-95a0-43f6bdf96bf5"). InnerVolumeSpecName "kube-api-access-6k2b7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 04:17:49.491896 master-0 kubenswrapper[18592]: I0308 04:17:49.491856 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/a793c425-75a3-4c0d-95a0-43f6bdf96bf5-etc-podinfo" (OuterVolumeSpecName: "etc-podinfo") pod "a793c425-75a3-4c0d-95a0-43f6bdf96bf5" (UID: "a793c425-75a3-4c0d-95a0-43f6bdf96bf5"). InnerVolumeSpecName "etc-podinfo". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 08 04:17:49.491961 master-0 kubenswrapper[18592]: I0308 04:17:49.491938 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a793c425-75a3-4c0d-95a0-43f6bdf96bf5-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a793c425-75a3-4c0d-95a0-43f6bdf96bf5" (UID: "a793c425-75a3-4c0d-95a0-43f6bdf96bf5"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:17:49.503141 master-0 kubenswrapper[18592]: I0308 04:17:49.503092 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a793c425-75a3-4c0d-95a0-43f6bdf96bf5-scripts" (OuterVolumeSpecName: "scripts") pod "a793c425-75a3-4c0d-95a0-43f6bdf96bf5" (UID: "a793c425-75a3-4c0d-95a0-43f6bdf96bf5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:17:49.522299 master-0 kubenswrapper[18592]: I0308 04:17:49.522252 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a793c425-75a3-4c0d-95a0-43f6bdf96bf5-config-data" (OuterVolumeSpecName: "config-data") pod "a793c425-75a3-4c0d-95a0-43f6bdf96bf5" (UID: "a793c425-75a3-4c0d-95a0-43f6bdf96bf5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:17:49.544998 master-0 kubenswrapper[18592]: I0308 04:17:49.544946 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a793c425-75a3-4c0d-95a0-43f6bdf96bf5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a793c425-75a3-4c0d-95a0-43f6bdf96bf5" (UID: "a793c425-75a3-4c0d-95a0-43f6bdf96bf5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:17:49.583857 master-0 kubenswrapper[18592]: I0308 04:17:49.583782 18592 reconciler_common.go:293] "Volume detached for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/a793c425-75a3-4c0d-95a0-43f6bdf96bf5-etc-podinfo\") on node \"master-0\" DevicePath \"\"" Mar 08 04:17:49.583857 master-0 kubenswrapper[18592]: I0308 04:17:49.583849 18592 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a793c425-75a3-4c0d-95a0-43f6bdf96bf5-logs\") on node \"master-0\" DevicePath \"\"" Mar 08 04:17:49.583857 master-0 kubenswrapper[18592]: I0308 04:17:49.583861 18592 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a793c425-75a3-4c0d-95a0-43f6bdf96bf5-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 04:17:49.584028 master-0 kubenswrapper[18592]: I0308 04:17:49.583873 18592 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a793c425-75a3-4c0d-95a0-43f6bdf96bf5-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 04:17:49.584028 master-0 kubenswrapper[18592]: I0308 04:17:49.583883 18592 reconciler_common.go:293] "Volume detached for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/a793c425-75a3-4c0d-95a0-43f6bdf96bf5-config-data-merged\") on node \"master-0\" DevicePath \"\"" Mar 08 04:17:49.584089 master-0 kubenswrapper[18592]: I0308 04:17:49.584053 18592 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a793c425-75a3-4c0d-95a0-43f6bdf96bf5-config-data\") on node \"master-0\" DevicePath \"\"" Mar 08 04:17:49.584089 master-0 kubenswrapper[18592]: I0308 04:17:49.584067 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6k2b7\" (UniqueName: 
\"kubernetes.io/projected/a793c425-75a3-4c0d-95a0-43f6bdf96bf5-kube-api-access-6k2b7\") on node \"master-0\" DevicePath \"\"" Mar 08 04:17:49.584186 master-0 kubenswrapper[18592]: I0308 04:17:49.584076 18592 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a793c425-75a3-4c0d-95a0-43f6bdf96bf5-config-data-custom\") on node \"master-0\" DevicePath \"\"" Mar 08 04:17:49.598814 master-0 kubenswrapper[18592]: I0308 04:17:49.598741 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-ff301-volume-lvm-iscsi-0" Mar 08 04:17:49.758296 master-0 kubenswrapper[18592]: I0308 04:17:49.758237 18592 generic.go:334] "Generic (PLEG): container finished" podID="05ba9b98-7d2f-4a9b-80ad-60793d8279e8" containerID="ecb55f322362dccbbff792a642555df63b70ceae0b4b089367e3d027710563b8" exitCode=1 Mar 08 04:17:49.758504 master-0 kubenswrapper[18592]: I0308 04:17:49.758346 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-859d47fc89-z2wvz" event={"ID":"05ba9b98-7d2f-4a9b-80ad-60793d8279e8","Type":"ContainerDied","Data":"ecb55f322362dccbbff792a642555df63b70ceae0b4b089367e3d027710563b8"} Mar 08 04:17:49.758504 master-0 kubenswrapper[18592]: I0308 04:17:49.758495 18592 scope.go:117] "RemoveContainer" containerID="78a26f8923875ef6ca34dad35ae3996e9a46df95c44c8b7f37f4d7b9154f9694" Mar 08 04:17:49.759235 master-0 kubenswrapper[18592]: I0308 04:17:49.759198 18592 scope.go:117] "RemoveContainer" containerID="ecb55f322362dccbbff792a642555df63b70ceae0b4b089367e3d027710563b8" Mar 08 04:17:49.759625 master-0 kubenswrapper[18592]: E0308 04:17:49.759571 18592 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-neutron-agent\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ironic-neutron-agent pod=ironic-neutron-agent-859d47fc89-z2wvz_openstack(05ba9b98-7d2f-4a9b-80ad-60793d8279e8)\"" 
pod="openstack/ironic-neutron-agent-859d47fc89-z2wvz" podUID="05ba9b98-7d2f-4a9b-80ad-60793d8279e8" Mar 08 04:17:49.763373 master-0 kubenswrapper[18592]: I0308 04:17:49.763289 18592 generic.go:334] "Generic (PLEG): container finished" podID="a793c425-75a3-4c0d-95a0-43f6bdf96bf5" containerID="4b3492860ca9dc480436b650c75257796e5192f624bc94b6ebbedddef4bda44c" exitCode=143 Mar 08 04:17:49.763438 master-0 kubenswrapper[18592]: I0308 04:17:49.763346 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-568f56df46-pmcb4" event={"ID":"a793c425-75a3-4c0d-95a0-43f6bdf96bf5","Type":"ContainerDied","Data":"4b3492860ca9dc480436b650c75257796e5192f624bc94b6ebbedddef4bda44c"} Mar 08 04:17:49.763471 master-0 kubenswrapper[18592]: I0308 04:17:49.763439 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-568f56df46-pmcb4" event={"ID":"a793c425-75a3-4c0d-95a0-43f6bdf96bf5","Type":"ContainerDied","Data":"5f2b62eb8cc3095bd73c2cb8536c2a050457a4f21c7a40e305e241e12fc92ea7"} Mar 08 04:17:49.763471 master-0 kubenswrapper[18592]: I0308 04:17:49.763438 18592 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-568f56df46-pmcb4" Mar 08 04:17:49.765161 master-0 kubenswrapper[18592]: I0308 04:17:49.765119 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-sync-fswtl" event={"ID":"67779b28-c7a4-43b1-bc15-a8401880ff2e","Type":"ContainerStarted","Data":"15d3783e334db022cbe71dd51c7633e3e862d050d6762c2086e91bfbddf86290"} Mar 08 04:17:49.801416 master-0 kubenswrapper[18592]: I0308 04:17:49.801345 18592 scope.go:117] "RemoveContainer" containerID="462e81e5ce25b176c9b0817827c6ba8697290e0fcb4c8129909f17d350faa51b" Mar 08 04:17:49.830211 master-0 kubenswrapper[18592]: I0308 04:17:49.830148 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-568f56df46-pmcb4"] Mar 08 04:17:49.835203 master-0 kubenswrapper[18592]: I0308 04:17:49.835167 18592 scope.go:117] "RemoveContainer" containerID="4b3492860ca9dc480436b650c75257796e5192f624bc94b6ebbedddef4bda44c" Mar 08 04:17:49.839803 master-0 kubenswrapper[18592]: I0308 04:17:49.839749 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-568f56df46-pmcb4"] Mar 08 04:17:49.862701 master-0 kubenswrapper[18592]: I0308 04:17:49.862250 18592 scope.go:117] "RemoveContainer" containerID="d5391456248391ef7db24714017806821c3d4bcb5663d38293aef340995bb5f7" Mar 08 04:17:49.886192 master-0 kubenswrapper[18592]: I0308 04:17:49.886145 18592 scope.go:117] "RemoveContainer" containerID="462e81e5ce25b176c9b0817827c6ba8697290e0fcb4c8129909f17d350faa51b" Mar 08 04:17:49.886586 master-0 kubenswrapper[18592]: E0308 04:17:49.886532 18592 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"462e81e5ce25b176c9b0817827c6ba8697290e0fcb4c8129909f17d350faa51b\": container with ID starting with 462e81e5ce25b176c9b0817827c6ba8697290e0fcb4c8129909f17d350faa51b not found: ID does not exist" containerID="462e81e5ce25b176c9b0817827c6ba8697290e0fcb4c8129909f17d350faa51b" Mar 08 
04:17:49.886633 master-0 kubenswrapper[18592]: I0308 04:17:49.886574 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"462e81e5ce25b176c9b0817827c6ba8697290e0fcb4c8129909f17d350faa51b"} err="failed to get container status \"462e81e5ce25b176c9b0817827c6ba8697290e0fcb4c8129909f17d350faa51b\": rpc error: code = NotFound desc = could not find container \"462e81e5ce25b176c9b0817827c6ba8697290e0fcb4c8129909f17d350faa51b\": container with ID starting with 462e81e5ce25b176c9b0817827c6ba8697290e0fcb4c8129909f17d350faa51b not found: ID does not exist" Mar 08 04:17:49.886633 master-0 kubenswrapper[18592]: I0308 04:17:49.886601 18592 scope.go:117] "RemoveContainer" containerID="4b3492860ca9dc480436b650c75257796e5192f624bc94b6ebbedddef4bda44c" Mar 08 04:17:49.886937 master-0 kubenswrapper[18592]: E0308 04:17:49.886884 18592 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b3492860ca9dc480436b650c75257796e5192f624bc94b6ebbedddef4bda44c\": container with ID starting with 4b3492860ca9dc480436b650c75257796e5192f624bc94b6ebbedddef4bda44c not found: ID does not exist" containerID="4b3492860ca9dc480436b650c75257796e5192f624bc94b6ebbedddef4bda44c" Mar 08 04:17:49.886937 master-0 kubenswrapper[18592]: I0308 04:17:49.886916 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b3492860ca9dc480436b650c75257796e5192f624bc94b6ebbedddef4bda44c"} err="failed to get container status \"4b3492860ca9dc480436b650c75257796e5192f624bc94b6ebbedddef4bda44c\": rpc error: code = NotFound desc = could not find container \"4b3492860ca9dc480436b650c75257796e5192f624bc94b6ebbedddef4bda44c\": container with ID starting with 4b3492860ca9dc480436b650c75257796e5192f624bc94b6ebbedddef4bda44c not found: ID does not exist" Mar 08 04:17:49.886937 master-0 kubenswrapper[18592]: I0308 04:17:49.886933 18592 scope.go:117] "RemoveContainer" 
containerID="d5391456248391ef7db24714017806821c3d4bcb5663d38293aef340995bb5f7" Mar 08 04:17:49.887309 master-0 kubenswrapper[18592]: E0308 04:17:49.887267 18592 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5391456248391ef7db24714017806821c3d4bcb5663d38293aef340995bb5f7\": container with ID starting with d5391456248391ef7db24714017806821c3d4bcb5663d38293aef340995bb5f7 not found: ID does not exist" containerID="d5391456248391ef7db24714017806821c3d4bcb5663d38293aef340995bb5f7" Mar 08 04:17:49.887309 master-0 kubenswrapper[18592]: I0308 04:17:49.887288 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5391456248391ef7db24714017806821c3d4bcb5663d38293aef340995bb5f7"} err="failed to get container status \"d5391456248391ef7db24714017806821c3d4bcb5663d38293aef340995bb5f7\": rpc error: code = NotFound desc = could not find container \"d5391456248391ef7db24714017806821c3d4bcb5663d38293aef340995bb5f7\": container with ID starting with d5391456248391ef7db24714017806821c3d4bcb5663d38293aef340995bb5f7 not found: ID does not exist" Mar 08 04:17:49.999772 master-0 kubenswrapper[18592]: I0308 04:17:49.999696 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-56d7944fd-t2xf9" Mar 08 04:17:50.163977 master-0 kubenswrapper[18592]: I0308 04:17:50.162983 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a793c425-75a3-4c0d-95a0-43f6bdf96bf5" path="/var/lib/kubelet/pods/a793c425-75a3-4c0d-95a0-43f6bdf96bf5/volumes" Mar 08 04:17:51.625044 master-0 kubenswrapper[18592]: I0308 04:17:51.624873 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-67869b4f85-qhzqs" Mar 08 04:17:51.709783 master-0 kubenswrapper[18592]: I0308 04:17:51.709723 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-c664f854-r9dzf"] Mar 08 
04:17:51.710004 master-0 kubenswrapper[18592]: I0308 04:17:51.709977 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-c664f854-r9dzf" podUID="5a3e60db-4326-462e-bdad-0d06970f36c3" containerName="neutron-api" containerID="cri-o://38f96f580b3c6adb13d63b37ceb373efcd3ca83c0e0d2bb9dd61b0854d1f737f" gracePeriod=30 Mar 08 04:17:51.710136 master-0 kubenswrapper[18592]: I0308 04:17:51.710108 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-c664f854-r9dzf" podUID="5a3e60db-4326-462e-bdad-0d06970f36c3" containerName="neutron-httpd" containerID="cri-o://ce39c5441b3afe06932ac615a56726bf3b59a0535525e8c100fd43967b33b085" gracePeriod=30 Mar 08 04:17:51.786955 master-0 kubenswrapper[18592]: I0308 04:17:51.785622 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 08 04:17:51.791315 master-0 kubenswrapper[18592]: E0308 04:17:51.791165 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a793c425-75a3-4c0d-95a0-43f6bdf96bf5" containerName="ironic-api" Mar 08 04:17:51.791315 master-0 kubenswrapper[18592]: I0308 04:17:51.791206 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="a793c425-75a3-4c0d-95a0-43f6bdf96bf5" containerName="ironic-api" Mar 08 04:17:51.791315 master-0 kubenswrapper[18592]: E0308 04:17:51.791226 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a793c425-75a3-4c0d-95a0-43f6bdf96bf5" containerName="ironic-api-log" Mar 08 04:17:51.796335 master-0 kubenswrapper[18592]: I0308 04:17:51.791233 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="a793c425-75a3-4c0d-95a0-43f6bdf96bf5" containerName="ironic-api-log" Mar 08 04:17:51.796335 master-0 kubenswrapper[18592]: E0308 04:17:51.791638 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a793c425-75a3-4c0d-95a0-43f6bdf96bf5" containerName="init" Mar 08 04:17:51.796335 master-0 kubenswrapper[18592]: I0308 
04:17:51.791650 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="a793c425-75a3-4c0d-95a0-43f6bdf96bf5" containerName="init" Mar 08 04:17:51.796335 master-0 kubenswrapper[18592]: E0308 04:17:51.791661 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a793c425-75a3-4c0d-95a0-43f6bdf96bf5" containerName="ironic-api" Mar 08 04:17:51.796335 master-0 kubenswrapper[18592]: I0308 04:17:51.791668 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="a793c425-75a3-4c0d-95a0-43f6bdf96bf5" containerName="ironic-api" Mar 08 04:17:51.796335 master-0 kubenswrapper[18592]: I0308 04:17:51.795390 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="a793c425-75a3-4c0d-95a0-43f6bdf96bf5" containerName="ironic-api" Mar 08 04:17:51.796335 master-0 kubenswrapper[18592]: I0308 04:17:51.795456 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="a793c425-75a3-4c0d-95a0-43f6bdf96bf5" containerName="ironic-api" Mar 08 04:17:51.796335 master-0 kubenswrapper[18592]: I0308 04:17:51.795479 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="a793c425-75a3-4c0d-95a0-43f6bdf96bf5" containerName="ironic-api-log" Mar 08 04:17:51.798169 master-0 kubenswrapper[18592]: I0308 04:17:51.798124 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 08 04:17:51.802276 master-0 kubenswrapper[18592]: I0308 04:17:51.800976 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 08 04:17:51.802276 master-0 kubenswrapper[18592]: I0308 04:17:51.801090 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 08 04:17:51.842924 master-0 kubenswrapper[18592]: I0308 04:17:51.839958 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 08 04:17:51.950400 master-0 kubenswrapper[18592]: I0308 04:17:51.948786 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qszvt\" (UniqueName: \"kubernetes.io/projected/ee2a865f-ef65-4965-abfc-1425b7f0bf84-kube-api-access-qszvt\") pod \"openstackclient\" (UID: \"ee2a865f-ef65-4965-abfc-1425b7f0bf84\") " pod="openstack/openstackclient" Mar 08 04:17:51.950400 master-0 kubenswrapper[18592]: I0308 04:17:51.948939 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ee2a865f-ef65-4965-abfc-1425b7f0bf84-openstack-config-secret\") pod \"openstackclient\" (UID: \"ee2a865f-ef65-4965-abfc-1425b7f0bf84\") " pod="openstack/openstackclient" Mar 08 04:17:51.950400 master-0 kubenswrapper[18592]: I0308 04:17:51.949000 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ee2a865f-ef65-4965-abfc-1425b7f0bf84-openstack-config\") pod \"openstackclient\" (UID: \"ee2a865f-ef65-4965-abfc-1425b7f0bf84\") " pod="openstack/openstackclient" Mar 08 04:17:51.950400 master-0 kubenswrapper[18592]: I0308 04:17:51.949051 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee2a865f-ef65-4965-abfc-1425b7f0bf84-combined-ca-bundle\") pod \"openstackclient\" (UID: \"ee2a865f-ef65-4965-abfc-1425b7f0bf84\") " pod="openstack/openstackclient" Mar 08 04:17:52.050703 master-0 kubenswrapper[18592]: I0308 04:17:52.050662 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ee2a865f-ef65-4965-abfc-1425b7f0bf84-openstack-config-secret\") pod \"openstackclient\" (UID: \"ee2a865f-ef65-4965-abfc-1425b7f0bf84\") " pod="openstack/openstackclient" Mar 08 04:17:52.051050 master-0 kubenswrapper[18592]: I0308 04:17:52.051030 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ee2a865f-ef65-4965-abfc-1425b7f0bf84-openstack-config\") pod \"openstackclient\" (UID: \"ee2a865f-ef65-4965-abfc-1425b7f0bf84\") " pod="openstack/openstackclient" Mar 08 04:17:52.051186 master-0 kubenswrapper[18592]: I0308 04:17:52.051172 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee2a865f-ef65-4965-abfc-1425b7f0bf84-combined-ca-bundle\") pod \"openstackclient\" (UID: \"ee2a865f-ef65-4965-abfc-1425b7f0bf84\") " pod="openstack/openstackclient" Mar 08 04:17:52.051410 master-0 kubenswrapper[18592]: I0308 04:17:52.051395 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qszvt\" (UniqueName: \"kubernetes.io/projected/ee2a865f-ef65-4965-abfc-1425b7f0bf84-kube-api-access-qszvt\") pod \"openstackclient\" (UID: \"ee2a865f-ef65-4965-abfc-1425b7f0bf84\") " pod="openstack/openstackclient" Mar 08 04:17:52.052065 master-0 kubenswrapper[18592]: I0308 04:17:52.052026 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/ee2a865f-ef65-4965-abfc-1425b7f0bf84-openstack-config\") pod \"openstackclient\" (UID: \"ee2a865f-ef65-4965-abfc-1425b7f0bf84\") " pod="openstack/openstackclient" Mar 08 04:17:52.061805 master-0 kubenswrapper[18592]: I0308 04:17:52.053731 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ee2a865f-ef65-4965-abfc-1425b7f0bf84-openstack-config-secret\") pod \"openstackclient\" (UID: \"ee2a865f-ef65-4965-abfc-1425b7f0bf84\") " pod="openstack/openstackclient" Mar 08 04:17:52.061805 master-0 kubenswrapper[18592]: I0308 04:17:52.055145 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee2a865f-ef65-4965-abfc-1425b7f0bf84-combined-ca-bundle\") pod \"openstackclient\" (UID: \"ee2a865f-ef65-4965-abfc-1425b7f0bf84\") " pod="openstack/openstackclient" Mar 08 04:17:52.069313 master-0 kubenswrapper[18592]: I0308 04:17:52.066283 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qszvt\" (UniqueName: \"kubernetes.io/projected/ee2a865f-ef65-4965-abfc-1425b7f0bf84-kube-api-access-qszvt\") pod \"openstackclient\" (UID: \"ee2a865f-ef65-4965-abfc-1425b7f0bf84\") " pod="openstack/openstackclient" Mar 08 04:17:52.127424 master-0 kubenswrapper[18592]: I0308 04:17:52.127385 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 08 04:17:52.677150 master-0 kubenswrapper[18592]: I0308 04:17:52.676239 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 08 04:17:52.703390 master-0 kubenswrapper[18592]: W0308 04:17:52.703342 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podee2a865f_ef65_4965_abfc_1425b7f0bf84.slice/crio-c78d06ce1752e790505edfdfafdc8a9d943d401853ed61fd86f08319a9c4f7cf WatchSource:0}: Error finding container c78d06ce1752e790505edfdfafdc8a9d943d401853ed61fd86f08319a9c4f7cf: Status 404 returned error can't find the container with id c78d06ce1752e790505edfdfafdc8a9d943d401853ed61fd86f08319a9c4f7cf Mar 08 04:17:52.834344 master-0 kubenswrapper[18592]: I0308 04:17:52.834193 18592 generic.go:334] "Generic (PLEG): container finished" podID="5a3e60db-4326-462e-bdad-0d06970f36c3" containerID="ce39c5441b3afe06932ac615a56726bf3b59a0535525e8c100fd43967b33b085" exitCode=0 Mar 08 04:17:52.834616 master-0 kubenswrapper[18592]: I0308 04:17:52.834273 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c664f854-r9dzf" event={"ID":"5a3e60db-4326-462e-bdad-0d06970f36c3","Type":"ContainerDied","Data":"ce39c5441b3afe06932ac615a56726bf3b59a0535525e8c100fd43967b33b085"} Mar 08 04:17:52.856490 master-0 kubenswrapper[18592]: I0308 04:17:52.856416 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"ee2a865f-ef65-4965-abfc-1425b7f0bf84","Type":"ContainerStarted","Data":"c78d06ce1752e790505edfdfafdc8a9d943d401853ed61fd86f08319a9c4f7cf"} Mar 08 04:17:52.859698 master-0 kubenswrapper[18592]: I0308 04:17:52.859369 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-sync-fswtl" 
event={"ID":"67779b28-c7a4-43b1-bc15-a8401880ff2e","Type":"ContainerStarted","Data":"7028d5939f93d9c1447410c6d8f2d3738a8f4d02a3d683a2410839cec945d10f"} Mar 08 04:17:52.921629 master-0 kubenswrapper[18592]: I0308 04:17:52.921551 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-inspector-db-sync-fswtl" podStartSLOduration=3.880723001 podStartE2EDuration="5.921528556s" podCreationTimestamp="2026-03-08 04:17:47 +0000 UTC" firstStartedPulling="2026-03-08 04:17:49.494912523 +0000 UTC m=+1481.593666873" lastFinishedPulling="2026-03-08 04:17:51.535718078 +0000 UTC m=+1483.634472428" observedRunningTime="2026-03-08 04:17:52.894677207 +0000 UTC m=+1484.993431557" watchObservedRunningTime="2026-03-08 04:17:52.921528556 +0000 UTC m=+1485.020282906" Mar 08 04:17:52.992390 master-0 kubenswrapper[18592]: I0308 04:17:52.992107 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-ff301-scheduler-0" Mar 08 04:17:53.100708 master-0 kubenswrapper[18592]: I0308 04:17:53.100597 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-ff301-backup-0" Mar 08 04:17:53.524967 master-0 kubenswrapper[18592]: I0308 04:17:53.524500 18592 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/ironic-neutron-agent-859d47fc89-z2wvz" Mar 08 04:17:53.530845 master-0 kubenswrapper[18592]: I0308 04:17:53.525640 18592 scope.go:117] "RemoveContainer" containerID="ecb55f322362dccbbff792a642555df63b70ceae0b4b089367e3d027710563b8" Mar 08 04:17:53.530845 master-0 kubenswrapper[18592]: E0308 04:17:53.530221 18592 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-neutron-agent\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ironic-neutron-agent pod=ironic-neutron-agent-859d47fc89-z2wvz_openstack(05ba9b98-7d2f-4a9b-80ad-60793d8279e8)\"" 
pod="openstack/ironic-neutron-agent-859d47fc89-z2wvz" podUID="05ba9b98-7d2f-4a9b-80ad-60793d8279e8" Mar 08 04:17:53.882863 master-0 kubenswrapper[18592]: I0308 04:17:53.882794 18592 generic.go:334] "Generic (PLEG): container finished" podID="5a3e60db-4326-462e-bdad-0d06970f36c3" containerID="38f96f580b3c6adb13d63b37ceb373efcd3ca83c0e0d2bb9dd61b0854d1f737f" exitCode=0 Mar 08 04:17:53.883783 master-0 kubenswrapper[18592]: I0308 04:17:53.883757 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c664f854-r9dzf" event={"ID":"5a3e60db-4326-462e-bdad-0d06970f36c3","Type":"ContainerDied","Data":"38f96f580b3c6adb13d63b37ceb373efcd3ca83c0e0d2bb9dd61b0854d1f737f"} Mar 08 04:17:54.399319 master-0 kubenswrapper[18592]: I0308 04:17:54.398718 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c664f854-r9dzf" Mar 08 04:17:54.466796 master-0 kubenswrapper[18592]: I0308 04:17:54.466696 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5a3e60db-4326-462e-bdad-0d06970f36c3-config\") pod \"5a3e60db-4326-462e-bdad-0d06970f36c3\" (UID: \"5a3e60db-4326-462e-bdad-0d06970f36c3\") " Mar 08 04:17:54.467050 master-0 kubenswrapper[18592]: I0308 04:17:54.466924 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5a3e60db-4326-462e-bdad-0d06970f36c3-httpd-config\") pod \"5a3e60db-4326-462e-bdad-0d06970f36c3\" (UID: \"5a3e60db-4326-462e-bdad-0d06970f36c3\") " Mar 08 04:17:54.467050 master-0 kubenswrapper[18592]: I0308 04:17:54.466960 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a3e60db-4326-462e-bdad-0d06970f36c3-combined-ca-bundle\") pod \"5a3e60db-4326-462e-bdad-0d06970f36c3\" (UID: \"5a3e60db-4326-462e-bdad-0d06970f36c3\") " Mar 08 04:17:54.467050 
master-0 kubenswrapper[18592]: I0308 04:17:54.466980 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a3e60db-4326-462e-bdad-0d06970f36c3-ovndb-tls-certs\") pod \"5a3e60db-4326-462e-bdad-0d06970f36c3\" (UID: \"5a3e60db-4326-462e-bdad-0d06970f36c3\") " Mar 08 04:17:54.467159 master-0 kubenswrapper[18592]: I0308 04:17:54.467105 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-954w4\" (UniqueName: \"kubernetes.io/projected/5a3e60db-4326-462e-bdad-0d06970f36c3-kube-api-access-954w4\") pod \"5a3e60db-4326-462e-bdad-0d06970f36c3\" (UID: \"5a3e60db-4326-462e-bdad-0d06970f36c3\") " Mar 08 04:17:54.470771 master-0 kubenswrapper[18592]: I0308 04:17:54.470739 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a3e60db-4326-462e-bdad-0d06970f36c3-kube-api-access-954w4" (OuterVolumeSpecName: "kube-api-access-954w4") pod "5a3e60db-4326-462e-bdad-0d06970f36c3" (UID: "5a3e60db-4326-462e-bdad-0d06970f36c3"). InnerVolumeSpecName "kube-api-access-954w4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 04:17:54.471613 master-0 kubenswrapper[18592]: I0308 04:17:54.471566 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a3e60db-4326-462e-bdad-0d06970f36c3-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "5a3e60db-4326-462e-bdad-0d06970f36c3" (UID: "5a3e60db-4326-462e-bdad-0d06970f36c3"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:17:54.521209 master-0 kubenswrapper[18592]: I0308 04:17:54.521153 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a3e60db-4326-462e-bdad-0d06970f36c3-config" (OuterVolumeSpecName: "config") pod "5a3e60db-4326-462e-bdad-0d06970f36c3" (UID: "5a3e60db-4326-462e-bdad-0d06970f36c3"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:17:54.571088 master-0 kubenswrapper[18592]: I0308 04:17:54.570706 18592 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/5a3e60db-4326-462e-bdad-0d06970f36c3-config\") on node \"master-0\" DevicePath \"\"" Mar 08 04:17:54.571088 master-0 kubenswrapper[18592]: I0308 04:17:54.570746 18592 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5a3e60db-4326-462e-bdad-0d06970f36c3-httpd-config\") on node \"master-0\" DevicePath \"\"" Mar 08 04:17:54.571088 master-0 kubenswrapper[18592]: I0308 04:17:54.570757 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-954w4\" (UniqueName: \"kubernetes.io/projected/5a3e60db-4326-462e-bdad-0d06970f36c3-kube-api-access-954w4\") on node \"master-0\" DevicePath \"\"" Mar 08 04:17:54.575108 master-0 kubenswrapper[18592]: I0308 04:17:54.575053 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a3e60db-4326-462e-bdad-0d06970f36c3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5a3e60db-4326-462e-bdad-0d06970f36c3" (UID: "5a3e60db-4326-462e-bdad-0d06970f36c3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:17:54.581616 master-0 kubenswrapper[18592]: I0308 04:17:54.581584 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a3e60db-4326-462e-bdad-0d06970f36c3-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "5a3e60db-4326-462e-bdad-0d06970f36c3" (UID: "5a3e60db-4326-462e-bdad-0d06970f36c3"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:17:54.673575 master-0 kubenswrapper[18592]: I0308 04:17:54.673450 18592 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a3e60db-4326-462e-bdad-0d06970f36c3-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 04:17:54.673575 master-0 kubenswrapper[18592]: I0308 04:17:54.673497 18592 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a3e60db-4326-462e-bdad-0d06970f36c3-ovndb-tls-certs\") on node \"master-0\" DevicePath \"\"" Mar 08 04:17:54.906393 master-0 kubenswrapper[18592]: I0308 04:17:54.906344 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c664f854-r9dzf" event={"ID":"5a3e60db-4326-462e-bdad-0d06970f36c3","Type":"ContainerDied","Data":"7d909f92de517bb888eea5a2a344264eefb388485fc834689514c22b10e67b0b"} Mar 08 04:17:54.906915 master-0 kubenswrapper[18592]: I0308 04:17:54.906401 18592 scope.go:117] "RemoveContainer" containerID="ce39c5441b3afe06932ac615a56726bf3b59a0535525e8c100fd43967b33b085" Mar 08 04:17:54.906915 master-0 kubenswrapper[18592]: I0308 04:17:54.906517 18592 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-c664f854-r9dzf" Mar 08 04:17:54.910561 master-0 kubenswrapper[18592]: I0308 04:17:54.910533 18592 generic.go:334] "Generic (PLEG): container finished" podID="67779b28-c7a4-43b1-bc15-a8401880ff2e" containerID="7028d5939f93d9c1447410c6d8f2d3738a8f4d02a3d683a2410839cec945d10f" exitCode=0 Mar 08 04:17:54.910614 master-0 kubenswrapper[18592]: I0308 04:17:54.910574 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-sync-fswtl" event={"ID":"67779b28-c7a4-43b1-bc15-a8401880ff2e","Type":"ContainerDied","Data":"7028d5939f93d9c1447410c6d8f2d3738a8f4d02a3d683a2410839cec945d10f"} Mar 08 04:17:54.972151 master-0 kubenswrapper[18592]: I0308 04:17:54.971715 18592 scope.go:117] "RemoveContainer" containerID="38f96f580b3c6adb13d63b37ceb373efcd3ca83c0e0d2bb9dd61b0854d1f737f" Mar 08 04:17:54.985987 master-0 kubenswrapper[18592]: I0308 04:17:54.985935 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-c664f854-r9dzf"] Mar 08 04:17:54.997858 master-0 kubenswrapper[18592]: I0308 04:17:54.996423 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-c664f854-r9dzf"] Mar 08 04:17:55.417966 master-0 kubenswrapper[18592]: I0308 04:17:55.417749 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-59f645d994-2wg6m"] Mar 08 04:17:55.419292 master-0 kubenswrapper[18592]: E0308 04:17:55.418237 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a3e60db-4326-462e-bdad-0d06970f36c3" containerName="neutron-httpd" Mar 08 04:17:55.419292 master-0 kubenswrapper[18592]: I0308 04:17:55.418257 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a3e60db-4326-462e-bdad-0d06970f36c3" containerName="neutron-httpd" Mar 08 04:17:55.419292 master-0 kubenswrapper[18592]: E0308 04:17:55.418317 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a3e60db-4326-462e-bdad-0d06970f36c3" 
containerName="neutron-api" Mar 08 04:17:55.419292 master-0 kubenswrapper[18592]: I0308 04:17:55.418323 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a3e60db-4326-462e-bdad-0d06970f36c3" containerName="neutron-api" Mar 08 04:17:55.419292 master-0 kubenswrapper[18592]: I0308 04:17:55.418528 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a3e60db-4326-462e-bdad-0d06970f36c3" containerName="neutron-api" Mar 08 04:17:55.419292 master-0 kubenswrapper[18592]: I0308 04:17:55.418562 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a3e60db-4326-462e-bdad-0d06970f36c3" containerName="neutron-httpd" Mar 08 04:17:55.419678 master-0 kubenswrapper[18592]: I0308 04:17:55.419655 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-59f645d994-2wg6m" Mar 08 04:17:55.423899 master-0 kubenswrapper[18592]: I0308 04:17:55.423331 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 08 04:17:55.423899 master-0 kubenswrapper[18592]: I0308 04:17:55.423495 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 08 04:17:55.424580 master-0 kubenswrapper[18592]: I0308 04:17:55.424557 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 08 04:17:55.465849 master-0 kubenswrapper[18592]: I0308 04:17:55.465184 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-59f645d994-2wg6m"] Mar 08 04:17:55.497864 master-0 kubenswrapper[18592]: I0308 04:17:55.497299 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/987b967c-b06a-4079-ac6a-46f89e7c1b49-public-tls-certs\") pod \"swift-proxy-59f645d994-2wg6m\" (UID: \"987b967c-b06a-4079-ac6a-46f89e7c1b49\") " 
pod="openstack/swift-proxy-59f645d994-2wg6m" Mar 08 04:17:55.497864 master-0 kubenswrapper[18592]: I0308 04:17:55.497356 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/987b967c-b06a-4079-ac6a-46f89e7c1b49-internal-tls-certs\") pod \"swift-proxy-59f645d994-2wg6m\" (UID: \"987b967c-b06a-4079-ac6a-46f89e7c1b49\") " pod="openstack/swift-proxy-59f645d994-2wg6m" Mar 08 04:17:55.497864 master-0 kubenswrapper[18592]: I0308 04:17:55.497386 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/987b967c-b06a-4079-ac6a-46f89e7c1b49-run-httpd\") pod \"swift-proxy-59f645d994-2wg6m\" (UID: \"987b967c-b06a-4079-ac6a-46f89e7c1b49\") " pod="openstack/swift-proxy-59f645d994-2wg6m" Mar 08 04:17:55.497864 master-0 kubenswrapper[18592]: I0308 04:17:55.497417 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/987b967c-b06a-4079-ac6a-46f89e7c1b49-config-data\") pod \"swift-proxy-59f645d994-2wg6m\" (UID: \"987b967c-b06a-4079-ac6a-46f89e7c1b49\") " pod="openstack/swift-proxy-59f645d994-2wg6m" Mar 08 04:17:55.497864 master-0 kubenswrapper[18592]: I0308 04:17:55.497443 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/987b967c-b06a-4079-ac6a-46f89e7c1b49-log-httpd\") pod \"swift-proxy-59f645d994-2wg6m\" (UID: \"987b967c-b06a-4079-ac6a-46f89e7c1b49\") " pod="openstack/swift-proxy-59f645d994-2wg6m" Mar 08 04:17:55.497864 master-0 kubenswrapper[18592]: I0308 04:17:55.497468 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/987b967c-b06a-4079-ac6a-46f89e7c1b49-combined-ca-bundle\") pod \"swift-proxy-59f645d994-2wg6m\" (UID: \"987b967c-b06a-4079-ac6a-46f89e7c1b49\") " pod="openstack/swift-proxy-59f645d994-2wg6m" Mar 08 04:17:55.497864 master-0 kubenswrapper[18592]: I0308 04:17:55.497574 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/987b967c-b06a-4079-ac6a-46f89e7c1b49-etc-swift\") pod \"swift-proxy-59f645d994-2wg6m\" (UID: \"987b967c-b06a-4079-ac6a-46f89e7c1b49\") " pod="openstack/swift-proxy-59f645d994-2wg6m" Mar 08 04:17:55.497864 master-0 kubenswrapper[18592]: I0308 04:17:55.497601 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjmnk\" (UniqueName: \"kubernetes.io/projected/987b967c-b06a-4079-ac6a-46f89e7c1b49-kube-api-access-bjmnk\") pod \"swift-proxy-59f645d994-2wg6m\" (UID: \"987b967c-b06a-4079-ac6a-46f89e7c1b49\") " pod="openstack/swift-proxy-59f645d994-2wg6m" Mar 08 04:17:55.619854 master-0 kubenswrapper[18592]: I0308 04:17:55.617261 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/987b967c-b06a-4079-ac6a-46f89e7c1b49-combined-ca-bundle\") pod \"swift-proxy-59f645d994-2wg6m\" (UID: \"987b967c-b06a-4079-ac6a-46f89e7c1b49\") " pod="openstack/swift-proxy-59f645d994-2wg6m" Mar 08 04:17:55.619854 master-0 kubenswrapper[18592]: I0308 04:17:55.617452 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/987b967c-b06a-4079-ac6a-46f89e7c1b49-etc-swift\") pod \"swift-proxy-59f645d994-2wg6m\" (UID: \"987b967c-b06a-4079-ac6a-46f89e7c1b49\") " pod="openstack/swift-proxy-59f645d994-2wg6m" Mar 08 04:17:55.619854 master-0 kubenswrapper[18592]: I0308 04:17:55.617481 18592 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-bjmnk\" (UniqueName: \"kubernetes.io/projected/987b967c-b06a-4079-ac6a-46f89e7c1b49-kube-api-access-bjmnk\") pod \"swift-proxy-59f645d994-2wg6m\" (UID: \"987b967c-b06a-4079-ac6a-46f89e7c1b49\") " pod="openstack/swift-proxy-59f645d994-2wg6m" Mar 08 04:17:55.619854 master-0 kubenswrapper[18592]: I0308 04:17:55.617558 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/987b967c-b06a-4079-ac6a-46f89e7c1b49-public-tls-certs\") pod \"swift-proxy-59f645d994-2wg6m\" (UID: \"987b967c-b06a-4079-ac6a-46f89e7c1b49\") " pod="openstack/swift-proxy-59f645d994-2wg6m" Mar 08 04:17:55.619854 master-0 kubenswrapper[18592]: I0308 04:17:55.617584 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/987b967c-b06a-4079-ac6a-46f89e7c1b49-internal-tls-certs\") pod \"swift-proxy-59f645d994-2wg6m\" (UID: \"987b967c-b06a-4079-ac6a-46f89e7c1b49\") " pod="openstack/swift-proxy-59f645d994-2wg6m" Mar 08 04:17:55.619854 master-0 kubenswrapper[18592]: I0308 04:17:55.617767 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/987b967c-b06a-4079-ac6a-46f89e7c1b49-run-httpd\") pod \"swift-proxy-59f645d994-2wg6m\" (UID: \"987b967c-b06a-4079-ac6a-46f89e7c1b49\") " pod="openstack/swift-proxy-59f645d994-2wg6m" Mar 08 04:17:55.619854 master-0 kubenswrapper[18592]: I0308 04:17:55.617794 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/987b967c-b06a-4079-ac6a-46f89e7c1b49-config-data\") pod \"swift-proxy-59f645d994-2wg6m\" (UID: \"987b967c-b06a-4079-ac6a-46f89e7c1b49\") " pod="openstack/swift-proxy-59f645d994-2wg6m" Mar 08 04:17:55.619854 master-0 kubenswrapper[18592]: I0308 04:17:55.617816 18592 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/987b967c-b06a-4079-ac6a-46f89e7c1b49-log-httpd\") pod \"swift-proxy-59f645d994-2wg6m\" (UID: \"987b967c-b06a-4079-ac6a-46f89e7c1b49\") " pod="openstack/swift-proxy-59f645d994-2wg6m" Mar 08 04:17:55.619854 master-0 kubenswrapper[18592]: I0308 04:17:55.618368 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/987b967c-b06a-4079-ac6a-46f89e7c1b49-log-httpd\") pod \"swift-proxy-59f645d994-2wg6m\" (UID: \"987b967c-b06a-4079-ac6a-46f89e7c1b49\") " pod="openstack/swift-proxy-59f645d994-2wg6m" Mar 08 04:17:55.631867 master-0 kubenswrapper[18592]: I0308 04:17:55.630369 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/987b967c-b06a-4079-ac6a-46f89e7c1b49-run-httpd\") pod \"swift-proxy-59f645d994-2wg6m\" (UID: \"987b967c-b06a-4079-ac6a-46f89e7c1b49\") " pod="openstack/swift-proxy-59f645d994-2wg6m" Mar 08 04:17:55.637857 master-0 kubenswrapper[18592]: I0308 04:17:55.632670 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/987b967c-b06a-4079-ac6a-46f89e7c1b49-public-tls-certs\") pod \"swift-proxy-59f645d994-2wg6m\" (UID: \"987b967c-b06a-4079-ac6a-46f89e7c1b49\") " pod="openstack/swift-proxy-59f645d994-2wg6m" Mar 08 04:17:55.637857 master-0 kubenswrapper[18592]: I0308 04:17:55.633195 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/987b967c-b06a-4079-ac6a-46f89e7c1b49-etc-swift\") pod \"swift-proxy-59f645d994-2wg6m\" (UID: \"987b967c-b06a-4079-ac6a-46f89e7c1b49\") " pod="openstack/swift-proxy-59f645d994-2wg6m" Mar 08 04:17:55.637857 master-0 kubenswrapper[18592]: I0308 04:17:55.634888 18592 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/987b967c-b06a-4079-ac6a-46f89e7c1b49-internal-tls-certs\") pod \"swift-proxy-59f645d994-2wg6m\" (UID: \"987b967c-b06a-4079-ac6a-46f89e7c1b49\") " pod="openstack/swift-proxy-59f645d994-2wg6m" Mar 08 04:17:55.637857 master-0 kubenswrapper[18592]: I0308 04:17:55.635843 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/987b967c-b06a-4079-ac6a-46f89e7c1b49-config-data\") pod \"swift-proxy-59f645d994-2wg6m\" (UID: \"987b967c-b06a-4079-ac6a-46f89e7c1b49\") " pod="openstack/swift-proxy-59f645d994-2wg6m" Mar 08 04:17:55.651868 master-0 kubenswrapper[18592]: I0308 04:17:55.651810 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/987b967c-b06a-4079-ac6a-46f89e7c1b49-combined-ca-bundle\") pod \"swift-proxy-59f645d994-2wg6m\" (UID: \"987b967c-b06a-4079-ac6a-46f89e7c1b49\") " pod="openstack/swift-proxy-59f645d994-2wg6m" Mar 08 04:17:55.667077 master-0 kubenswrapper[18592]: I0308 04:17:55.667028 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjmnk\" (UniqueName: \"kubernetes.io/projected/987b967c-b06a-4079-ac6a-46f89e7c1b49-kube-api-access-bjmnk\") pod \"swift-proxy-59f645d994-2wg6m\" (UID: \"987b967c-b06a-4079-ac6a-46f89e7c1b49\") " pod="openstack/swift-proxy-59f645d994-2wg6m" Mar 08 04:17:55.790569 master-0 kubenswrapper[18592]: I0308 04:17:55.790261 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-59f645d994-2wg6m" Mar 08 04:17:56.158001 master-0 kubenswrapper[18592]: I0308 04:17:56.157751 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a3e60db-4326-462e-bdad-0d06970f36c3" path="/var/lib/kubelet/pods/5a3e60db-4326-462e-bdad-0d06970f36c3/volumes" Mar 08 04:17:56.537616 master-0 kubenswrapper[18592]: I0308 04:17:56.537578 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-db-sync-fswtl" Mar 08 04:17:56.562016 master-0 kubenswrapper[18592]: I0308 04:17:56.561640 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-59f645d994-2wg6m"] Mar 08 04:17:56.661173 master-0 kubenswrapper[18592]: I0308 04:17:56.660521 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67779b28-c7a4-43b1-bc15-a8401880ff2e-scripts\") pod \"67779b28-c7a4-43b1-bc15-a8401880ff2e\" (UID: \"67779b28-c7a4-43b1-bc15-a8401880ff2e\") " Mar 08 04:17:56.661173 master-0 kubenswrapper[18592]: I0308 04:17:56.660603 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/67779b28-c7a4-43b1-bc15-a8401880ff2e-var-lib-ironic\") pod \"67779b28-c7a4-43b1-bc15-a8401880ff2e\" (UID: \"67779b28-c7a4-43b1-bc15-a8401880ff2e\") " Mar 08 04:17:56.661173 master-0 kubenswrapper[18592]: I0308 04:17:56.660781 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pgfb2\" (UniqueName: \"kubernetes.io/projected/67779b28-c7a4-43b1-bc15-a8401880ff2e-kube-api-access-pgfb2\") pod \"67779b28-c7a4-43b1-bc15-a8401880ff2e\" (UID: \"67779b28-c7a4-43b1-bc15-a8401880ff2e\") " Mar 08 04:17:56.661173 master-0 kubenswrapper[18592]: I0308 04:17:56.660888 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/secret/67779b28-c7a4-43b1-bc15-a8401880ff2e-config\") pod \"67779b28-c7a4-43b1-bc15-a8401880ff2e\" (UID: \"67779b28-c7a4-43b1-bc15-a8401880ff2e\") " Mar 08 04:17:56.661173 master-0 kubenswrapper[18592]: I0308 04:17:56.661004 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/67779b28-c7a4-43b1-bc15-a8401880ff2e-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"67779b28-c7a4-43b1-bc15-a8401880ff2e\" (UID: \"67779b28-c7a4-43b1-bc15-a8401880ff2e\") " Mar 08 04:17:56.661173 master-0 kubenswrapper[18592]: I0308 04:17:56.661071 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67779b28-c7a4-43b1-bc15-a8401880ff2e-combined-ca-bundle\") pod \"67779b28-c7a4-43b1-bc15-a8401880ff2e\" (UID: \"67779b28-c7a4-43b1-bc15-a8401880ff2e\") " Mar 08 04:17:56.661173 master-0 kubenswrapper[18592]: I0308 04:17:56.661099 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/67779b28-c7a4-43b1-bc15-a8401880ff2e-etc-podinfo\") pod \"67779b28-c7a4-43b1-bc15-a8401880ff2e\" (UID: \"67779b28-c7a4-43b1-bc15-a8401880ff2e\") " Mar 08 04:17:56.661173 master-0 kubenswrapper[18592]: I0308 04:17:56.661133 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67779b28-c7a4-43b1-bc15-a8401880ff2e-var-lib-ironic" (OuterVolumeSpecName: "var-lib-ironic") pod "67779b28-c7a4-43b1-bc15-a8401880ff2e" (UID: "67779b28-c7a4-43b1-bc15-a8401880ff2e"). InnerVolumeSpecName "var-lib-ironic". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 04:17:56.663893 master-0 kubenswrapper[18592]: I0308 04:17:56.661639 18592 reconciler_common.go:293] "Volume detached for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/67779b28-c7a4-43b1-bc15-a8401880ff2e-var-lib-ironic\") on node \"master-0\" DevicePath \"\"" Mar 08 04:17:56.668025 master-0 kubenswrapper[18592]: I0308 04:17:56.667825 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67779b28-c7a4-43b1-bc15-a8401880ff2e-scripts" (OuterVolumeSpecName: "scripts") pod "67779b28-c7a4-43b1-bc15-a8401880ff2e" (UID: "67779b28-c7a4-43b1-bc15-a8401880ff2e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:17:56.671532 master-0 kubenswrapper[18592]: I0308 04:17:56.671490 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67779b28-c7a4-43b1-bc15-a8401880ff2e-var-lib-ironic-inspector-dhcp-hostsdir" (OuterVolumeSpecName: "var-lib-ironic-inspector-dhcp-hostsdir") pod "67779b28-c7a4-43b1-bc15-a8401880ff2e" (UID: "67779b28-c7a4-43b1-bc15-a8401880ff2e"). InnerVolumeSpecName "var-lib-ironic-inspector-dhcp-hostsdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 04:17:56.681328 master-0 kubenswrapper[18592]: I0308 04:17:56.681258 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67779b28-c7a4-43b1-bc15-a8401880ff2e-kube-api-access-pgfb2" (OuterVolumeSpecName: "kube-api-access-pgfb2") pod "67779b28-c7a4-43b1-bc15-a8401880ff2e" (UID: "67779b28-c7a4-43b1-bc15-a8401880ff2e"). InnerVolumeSpecName "kube-api-access-pgfb2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 04:17:56.687020 master-0 kubenswrapper[18592]: I0308 04:17:56.686970 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/67779b28-c7a4-43b1-bc15-a8401880ff2e-etc-podinfo" (OuterVolumeSpecName: "etc-podinfo") pod "67779b28-c7a4-43b1-bc15-a8401880ff2e" (UID: "67779b28-c7a4-43b1-bc15-a8401880ff2e"). InnerVolumeSpecName "etc-podinfo". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 08 04:17:56.696692 master-0 kubenswrapper[18592]: I0308 04:17:56.696585 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67779b28-c7a4-43b1-bc15-a8401880ff2e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "67779b28-c7a4-43b1-bc15-a8401880ff2e" (UID: "67779b28-c7a4-43b1-bc15-a8401880ff2e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:17:56.709018 master-0 kubenswrapper[18592]: I0308 04:17:56.708966 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67779b28-c7a4-43b1-bc15-a8401880ff2e-config" (OuterVolumeSpecName: "config") pod "67779b28-c7a4-43b1-bc15-a8401880ff2e" (UID: "67779b28-c7a4-43b1-bc15-a8401880ff2e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:17:56.763263 master-0 kubenswrapper[18592]: I0308 04:17:56.763185 18592 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67779b28-c7a4-43b1-bc15-a8401880ff2e-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 04:17:56.763263 master-0 kubenswrapper[18592]: I0308 04:17:56.763225 18592 reconciler_common.go:293] "Volume detached for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/67779b28-c7a4-43b1-bc15-a8401880ff2e-etc-podinfo\") on node \"master-0\" DevicePath \"\"" Mar 08 04:17:56.763263 master-0 kubenswrapper[18592]: I0308 04:17:56.763236 18592 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67779b28-c7a4-43b1-bc15-a8401880ff2e-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 04:17:56.763263 master-0 kubenswrapper[18592]: I0308 04:17:56.763245 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pgfb2\" (UniqueName: \"kubernetes.io/projected/67779b28-c7a4-43b1-bc15-a8401880ff2e-kube-api-access-pgfb2\") on node \"master-0\" DevicePath \"\"" Mar 08 04:17:56.763263 master-0 kubenswrapper[18592]: I0308 04:17:56.763255 18592 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/67779b28-c7a4-43b1-bc15-a8401880ff2e-config\") on node \"master-0\" DevicePath \"\"" Mar 08 04:17:56.763540 master-0 kubenswrapper[18592]: I0308 04:17:56.763265 18592 reconciler_common.go:293] "Volume detached for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/67779b28-c7a4-43b1-bc15-a8401880ff2e-var-lib-ironic-inspector-dhcp-hostsdir\") on node \"master-0\" DevicePath \"\"" Mar 08 04:17:56.945872 master-0 kubenswrapper[18592]: I0308 04:17:56.945789 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-sync-fswtl" 
event={"ID":"67779b28-c7a4-43b1-bc15-a8401880ff2e","Type":"ContainerDied","Data":"15d3783e334db022cbe71dd51c7633e3e862d050d6762c2086e91bfbddf86290"} Mar 08 04:17:56.945872 master-0 kubenswrapper[18592]: I0308 04:17:56.945851 18592 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="15d3783e334db022cbe71dd51c7633e3e862d050d6762c2086e91bfbddf86290" Mar 08 04:17:56.946130 master-0 kubenswrapper[18592]: I0308 04:17:56.945905 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-db-sync-fswtl" Mar 08 04:17:56.950282 master-0 kubenswrapper[18592]: I0308 04:17:56.950208 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-59f645d994-2wg6m" event={"ID":"987b967c-b06a-4079-ac6a-46f89e7c1b49","Type":"ContainerStarted","Data":"d5ea480313f27f9bcfd809e42d904821af9228caefa696db1ac0936488da7d67"} Mar 08 04:17:56.950282 master-0 kubenswrapper[18592]: I0308 04:17:56.950275 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-59f645d994-2wg6m" event={"ID":"987b967c-b06a-4079-ac6a-46f89e7c1b49","Type":"ContainerStarted","Data":"a902b21774bdf1225e2da96dfaea6b4795c6cc20c8679bce7f3beeadea5dbba5"} Mar 08 04:17:57.228916 master-0 kubenswrapper[18592]: I0308 04:17:57.227855 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-afe2b-default-external-api-0"] Mar 08 04:17:57.228916 master-0 kubenswrapper[18592]: I0308 04:17:57.228102 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-afe2b-default-external-api-0" podUID="0eff6f29-794c-4597-b53f-c030263b2080" containerName="glance-log" containerID="cri-o://9bdd4546b9740edeb6f0f584dc8c3feaebf706adb02f7a37561eaa759eeb812f" gracePeriod=30 Mar 08 04:17:57.228916 master-0 kubenswrapper[18592]: I0308 04:17:57.228224 18592 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/glance-afe2b-default-external-api-0" podUID="0eff6f29-794c-4597-b53f-c030263b2080" containerName="glance-httpd" containerID="cri-o://acb051cdf721850d97db6528a56f259be53e65432bf2f5a93ccebdde9aac4e11" gracePeriod=30 Mar 08 04:17:57.961652 master-0 kubenswrapper[18592]: I0308 04:17:57.961590 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-59f645d994-2wg6m" event={"ID":"987b967c-b06a-4079-ac6a-46f89e7c1b49","Type":"ContainerStarted","Data":"78b3f7964e6bf9e35b1029a798a5e1ce85800f444912404094e6f1e86f65aa95"} Mar 08 04:17:57.961923 master-0 kubenswrapper[18592]: I0308 04:17:57.961744 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-59f645d994-2wg6m" Mar 08 04:17:57.964267 master-0 kubenswrapper[18592]: I0308 04:17:57.964219 18592 generic.go:334] "Generic (PLEG): container finished" podID="0eff6f29-794c-4597-b53f-c030263b2080" containerID="9bdd4546b9740edeb6f0f584dc8c3feaebf706adb02f7a37561eaa759eeb812f" exitCode=143 Mar 08 04:17:57.964367 master-0 kubenswrapper[18592]: I0308 04:17:57.964282 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-afe2b-default-external-api-0" event={"ID":"0eff6f29-794c-4597-b53f-c030263b2080","Type":"ContainerDied","Data":"9bdd4546b9740edeb6f0f584dc8c3feaebf706adb02f7a37561eaa759eeb812f"} Mar 08 04:17:57.997347 master-0 kubenswrapper[18592]: I0308 04:17:57.997266 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-59f645d994-2wg6m" podStartSLOduration=2.99724679 podStartE2EDuration="2.99724679s" podCreationTimestamp="2026-03-08 04:17:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 04:17:57.979138899 +0000 UTC m=+1490.077893249" watchObservedRunningTime="2026-03-08 04:17:57.99724679 +0000 UTC m=+1490.096001140" Mar 08 04:17:58.283076 master-0 kubenswrapper[18592]: I0308 
04:17:58.279126 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78b99bbb9f-4bv2m"]
Mar 08 04:17:58.283076 master-0 kubenswrapper[18592]: E0308 04:17:58.279628 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67779b28-c7a4-43b1-bc15-a8401880ff2e" containerName="ironic-inspector-db-sync"
Mar 08 04:17:58.283076 master-0 kubenswrapper[18592]: I0308 04:17:58.279641 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="67779b28-c7a4-43b1-bc15-a8401880ff2e" containerName="ironic-inspector-db-sync"
Mar 08 04:17:58.283076 master-0 kubenswrapper[18592]: I0308 04:17:58.280173 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="67779b28-c7a4-43b1-bc15-a8401880ff2e" containerName="ironic-inspector-db-sync"
Mar 08 04:17:58.295881 master-0 kubenswrapper[18592]: I0308 04:17:58.285737 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78b99bbb9f-4bv2m"
Mar 08 04:17:58.306142 master-0 kubenswrapper[18592]: I0308 04:17:58.306075 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78b99bbb9f-4bv2m"]
Mar 08 04:17:58.426941 master-0 kubenswrapper[18592]: I0308 04:17:58.426580 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/09bdbbb9-e374-47e8-8ed6-48c2ffd8e252-dns-svc\") pod \"dnsmasq-dns-78b99bbb9f-4bv2m\" (UID: \"09bdbbb9-e374-47e8-8ed6-48c2ffd8e252\") " pod="openstack/dnsmasq-dns-78b99bbb9f-4bv2m"
Mar 08 04:17:58.426941 master-0 kubenswrapper[18592]: I0308 04:17:58.426636 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/09bdbbb9-e374-47e8-8ed6-48c2ffd8e252-dns-swift-storage-0\") pod \"dnsmasq-dns-78b99bbb9f-4bv2m\" (UID: \"09bdbbb9-e374-47e8-8ed6-48c2ffd8e252\") " pod="openstack/dnsmasq-dns-78b99bbb9f-4bv2m"
Mar 08 04:17:58.426941 master-0 kubenswrapper[18592]: I0308 04:17:58.426660 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/09bdbbb9-e374-47e8-8ed6-48c2ffd8e252-ovsdbserver-sb\") pod \"dnsmasq-dns-78b99bbb9f-4bv2m\" (UID: \"09bdbbb9-e374-47e8-8ed6-48c2ffd8e252\") " pod="openstack/dnsmasq-dns-78b99bbb9f-4bv2m"
Mar 08 04:17:58.426941 master-0 kubenswrapper[18592]: I0308 04:17:58.426716 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nw7st\" (UniqueName: \"kubernetes.io/projected/09bdbbb9-e374-47e8-8ed6-48c2ffd8e252-kube-api-access-nw7st\") pod \"dnsmasq-dns-78b99bbb9f-4bv2m\" (UID: \"09bdbbb9-e374-47e8-8ed6-48c2ffd8e252\") " pod="openstack/dnsmasq-dns-78b99bbb9f-4bv2m"
Mar 08 04:17:58.426941 master-0 kubenswrapper[18592]: I0308 04:17:58.426738 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/09bdbbb9-e374-47e8-8ed6-48c2ffd8e252-ovsdbserver-nb\") pod \"dnsmasq-dns-78b99bbb9f-4bv2m\" (UID: \"09bdbbb9-e374-47e8-8ed6-48c2ffd8e252\") " pod="openstack/dnsmasq-dns-78b99bbb9f-4bv2m"
Mar 08 04:17:58.426941 master-0 kubenswrapper[18592]: I0308 04:17:58.426837 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09bdbbb9-e374-47e8-8ed6-48c2ffd8e252-config\") pod \"dnsmasq-dns-78b99bbb9f-4bv2m\" (UID: \"09bdbbb9-e374-47e8-8ed6-48c2ffd8e252\") " pod="openstack/dnsmasq-dns-78b99bbb9f-4bv2m"
Mar 08 04:17:58.446593 master-0 kubenswrapper[18592]: I0308 04:17:58.446004 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-inspector-0"]
Mar 08 04:17:58.450189 master-0 kubenswrapper[18592]: I0308 04:17:58.450147 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-0"
Mar 08 04:17:58.452434 master-0 kubenswrapper[18592]: I0308 04:17:58.452387 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-scripts"
Mar 08 04:17:58.452491 master-0 kubenswrapper[18592]: I0308 04:17:58.452455 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-transport-url-ironic-inspector-transport"
Mar 08 04:17:58.452846 master-0 kubenswrapper[18592]: I0308 04:17:58.452805 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-config-data"
Mar 08 04:17:58.502903 master-0 kubenswrapper[18592]: I0308 04:17:58.502057 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-0"]
Mar 08 04:17:58.528742 master-0 kubenswrapper[18592]: I0308 04:17:58.528660 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nw7st\" (UniqueName: \"kubernetes.io/projected/09bdbbb9-e374-47e8-8ed6-48c2ffd8e252-kube-api-access-nw7st\") pod \"dnsmasq-dns-78b99bbb9f-4bv2m\" (UID: \"09bdbbb9-e374-47e8-8ed6-48c2ffd8e252\") " pod="openstack/dnsmasq-dns-78b99bbb9f-4bv2m"
Mar 08 04:17:58.528742 master-0 kubenswrapper[18592]: I0308 04:17:58.528734 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/09bdbbb9-e374-47e8-8ed6-48c2ffd8e252-ovsdbserver-nb\") pod \"dnsmasq-dns-78b99bbb9f-4bv2m\" (UID: \"09bdbbb9-e374-47e8-8ed6-48c2ffd8e252\") " pod="openstack/dnsmasq-dns-78b99bbb9f-4bv2m"
Mar 08 04:17:58.531113 master-0 kubenswrapper[18592]: I0308 04:17:58.528822 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d-var-lib-ironic\") pod \"ironic-inspector-0\" (UID: \"3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d\") " pod="openstack/ironic-inspector-0"
Mar 08 04:17:58.531113 master-0 kubenswrapper[18592]: I0308 04:17:58.530192 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/09bdbbb9-e374-47e8-8ed6-48c2ffd8e252-ovsdbserver-nb\") pod \"dnsmasq-dns-78b99bbb9f-4bv2m\" (UID: \"09bdbbb9-e374-47e8-8ed6-48c2ffd8e252\") " pod="openstack/dnsmasq-dns-78b99bbb9f-4bv2m"
Mar 08 04:17:58.531113 master-0 kubenswrapper[18592]: I0308 04:17:58.530249 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d-etc-podinfo\") pod \"ironic-inspector-0\" (UID: \"3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d\") " pod="openstack/ironic-inspector-0"
Mar 08 04:17:58.531113 master-0 kubenswrapper[18592]: I0308 04:17:58.530336 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09bdbbb9-e374-47e8-8ed6-48c2ffd8e252-config\") pod \"dnsmasq-dns-78b99bbb9f-4bv2m\" (UID: \"09bdbbb9-e374-47e8-8ed6-48c2ffd8e252\") " pod="openstack/dnsmasq-dns-78b99bbb9f-4bv2m"
Mar 08 04:17:58.531113 master-0 kubenswrapper[18592]: I0308 04:17:58.530383 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d-scripts\") pod \"ironic-inspector-0\" (UID: \"3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d\") " pod="openstack/ironic-inspector-0"
Mar 08 04:17:58.531113 master-0 kubenswrapper[18592]: I0308 04:17:58.530447 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d-combined-ca-bundle\") pod \"ironic-inspector-0\" (UID: \"3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d\") " pod="openstack/ironic-inspector-0"
Mar 08 04:17:58.531113 master-0 kubenswrapper[18592]: I0308 04:17:58.530585 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-0\" (UID: \"3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d\") " pod="openstack/ironic-inspector-0"
Mar 08 04:17:58.531113 master-0 kubenswrapper[18592]: I0308 04:17:58.530614 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d-config\") pod \"ironic-inspector-0\" (UID: \"3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d\") " pod="openstack/ironic-inspector-0"
Mar 08 04:17:58.531113 master-0 kubenswrapper[18592]: I0308 04:17:58.530692 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/09bdbbb9-e374-47e8-8ed6-48c2ffd8e252-dns-svc\") pod \"dnsmasq-dns-78b99bbb9f-4bv2m\" (UID: \"09bdbbb9-e374-47e8-8ed6-48c2ffd8e252\") " pod="openstack/dnsmasq-dns-78b99bbb9f-4bv2m"
Mar 08 04:17:58.531113 master-0 kubenswrapper[18592]: I0308 04:17:58.530723 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/09bdbbb9-e374-47e8-8ed6-48c2ffd8e252-dns-swift-storage-0\") pod \"dnsmasq-dns-78b99bbb9f-4bv2m\" (UID: \"09bdbbb9-e374-47e8-8ed6-48c2ffd8e252\") " pod="openstack/dnsmasq-dns-78b99bbb9f-4bv2m"
Mar 08 04:17:58.531113 master-0 kubenswrapper[18592]: I0308 04:17:58.530769 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/09bdbbb9-e374-47e8-8ed6-48c2ffd8e252-ovsdbserver-sb\") pod \"dnsmasq-dns-78b99bbb9f-4bv2m\" (UID: \"09bdbbb9-e374-47e8-8ed6-48c2ffd8e252\") " pod="openstack/dnsmasq-dns-78b99bbb9f-4bv2m"
Mar 08 04:17:58.531113 master-0 kubenswrapper[18592]: I0308 04:17:58.530857 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qprv\" (UniqueName: \"kubernetes.io/projected/3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d-kube-api-access-2qprv\") pod \"ironic-inspector-0\" (UID: \"3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d\") " pod="openstack/ironic-inspector-0"
Mar 08 04:17:58.531742 master-0 kubenswrapper[18592]: I0308 04:17:58.531717 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09bdbbb9-e374-47e8-8ed6-48c2ffd8e252-config\") pod \"dnsmasq-dns-78b99bbb9f-4bv2m\" (UID: \"09bdbbb9-e374-47e8-8ed6-48c2ffd8e252\") " pod="openstack/dnsmasq-dns-78b99bbb9f-4bv2m"
Mar 08 04:17:58.534443 master-0 kubenswrapper[18592]: I0308 04:17:58.532307 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/09bdbbb9-e374-47e8-8ed6-48c2ffd8e252-dns-svc\") pod \"dnsmasq-dns-78b99bbb9f-4bv2m\" (UID: \"09bdbbb9-e374-47e8-8ed6-48c2ffd8e252\") " pod="openstack/dnsmasq-dns-78b99bbb9f-4bv2m"
Mar 08 04:17:58.534443 master-0 kubenswrapper[18592]: I0308 04:17:58.532860 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/09bdbbb9-e374-47e8-8ed6-48c2ffd8e252-dns-swift-storage-0\") pod \"dnsmasq-dns-78b99bbb9f-4bv2m\" (UID: \"09bdbbb9-e374-47e8-8ed6-48c2ffd8e252\") " pod="openstack/dnsmasq-dns-78b99bbb9f-4bv2m"
Mar 08 04:17:58.534443 master-0 kubenswrapper[18592]: I0308 04:17:58.534093 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/09bdbbb9-e374-47e8-8ed6-48c2ffd8e252-ovsdbserver-sb\") pod \"dnsmasq-dns-78b99bbb9f-4bv2m\" (UID: \"09bdbbb9-e374-47e8-8ed6-48c2ffd8e252\") " pod="openstack/dnsmasq-dns-78b99bbb9f-4bv2m"
Mar 08 04:17:58.550699 master-0 kubenswrapper[18592]: I0308 04:17:58.550655 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nw7st\" (UniqueName: \"kubernetes.io/projected/09bdbbb9-e374-47e8-8ed6-48c2ffd8e252-kube-api-access-nw7st\") pod \"dnsmasq-dns-78b99bbb9f-4bv2m\" (UID: \"09bdbbb9-e374-47e8-8ed6-48c2ffd8e252\") " pod="openstack/dnsmasq-dns-78b99bbb9f-4bv2m"
Mar 08 04:17:58.619905 master-0 kubenswrapper[18592]: I0308 04:17:58.617435 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78b99bbb9f-4bv2m"
Mar 08 04:17:58.632847 master-0 kubenswrapper[18592]: I0308 04:17:58.632782 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-0\" (UID: \"3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d\") " pod="openstack/ironic-inspector-0"
Mar 08 04:17:58.633079 master-0 kubenswrapper[18592]: I0308 04:17:58.633063 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d-config\") pod \"ironic-inspector-0\" (UID: \"3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d\") " pod="openstack/ironic-inspector-0"
Mar 08 04:17:58.633210 master-0 kubenswrapper[18592]: I0308 04:17:58.633196 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qprv\" (UniqueName: \"kubernetes.io/projected/3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d-kube-api-access-2qprv\") pod \"ironic-inspector-0\" (UID: \"3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d\") " pod="openstack/ironic-inspector-0"
Mar 08 04:17:58.633369 master-0 kubenswrapper[18592]: I0308 04:17:58.633356 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d-var-lib-ironic\") pod \"ironic-inspector-0\" (UID: \"3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d\") " pod="openstack/ironic-inspector-0"
Mar 08 04:17:58.633448 master-0 kubenswrapper[18592]: I0308 04:17:58.633435 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d-etc-podinfo\") pod \"ironic-inspector-0\" (UID: \"3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d\") " pod="openstack/ironic-inspector-0"
Mar 08 04:17:58.633543 master-0 kubenswrapper[18592]: I0308 04:17:58.633531 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d-scripts\") pod \"ironic-inspector-0\" (UID: \"3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d\") " pod="openstack/ironic-inspector-0"
Mar 08 04:17:58.633628 master-0 kubenswrapper[18592]: I0308 04:17:58.633615 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d-combined-ca-bundle\") pod \"ironic-inspector-0\" (UID: \"3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d\") " pod="openstack/ironic-inspector-0"
Mar 08 04:17:58.635651 master-0 kubenswrapper[18592]: I0308 04:17:58.635608 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-0\" (UID: \"3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d\") " pod="openstack/ironic-inspector-0"
Mar 08 04:17:58.640841 master-0 kubenswrapper[18592]: I0308 04:17:58.638636 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d-var-lib-ironic\") pod \"ironic-inspector-0\" (UID: \"3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d\") " pod="openstack/ironic-inspector-0"
Mar 08 04:17:58.649768 master-0 kubenswrapper[18592]: I0308 04:17:58.649734 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d-combined-ca-bundle\") pod \"ironic-inspector-0\" (UID: \"3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d\") " pod="openstack/ironic-inspector-0"
Mar 08 04:17:58.659651 master-0 kubenswrapper[18592]: I0308 04:17:58.659586 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d-scripts\") pod \"ironic-inspector-0\" (UID: \"3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d\") " pod="openstack/ironic-inspector-0"
Mar 08 04:17:58.661302 master-0 kubenswrapper[18592]: I0308 04:17:58.661199 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d-etc-podinfo\") pod \"ironic-inspector-0\" (UID: \"3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d\") " pod="openstack/ironic-inspector-0"
Mar 08 04:17:58.669182 master-0 kubenswrapper[18592]: I0308 04:17:58.668484 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d-config\") pod \"ironic-inspector-0\" (UID: \"3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d\") " pod="openstack/ironic-inspector-0"
Mar 08 04:17:58.686914 master-0 kubenswrapper[18592]: I0308 04:17:58.683799 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qprv\" (UniqueName: \"kubernetes.io/projected/3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d-kube-api-access-2qprv\") pod \"ironic-inspector-0\" (UID: \"3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d\") " pod="openstack/ironic-inspector-0"
Mar 08 04:17:58.806025 master-0 kubenswrapper[18592]: I0308 04:17:58.804523 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-0"
Mar 08 04:17:59.025606 master-0 kubenswrapper[18592]: I0308 04:17:59.025528 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-59f645d994-2wg6m"
Mar 08 04:17:59.515407 master-0 kubenswrapper[18592]: I0308 04:17:59.515365 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78b99bbb9f-4bv2m"]
Mar 08 04:17:59.713047 master-0 kubenswrapper[18592]: I0308 04:17:59.706806 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-0"]
Mar 08 04:18:00.049394 master-0 kubenswrapper[18592]: I0308 04:18:00.047737 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78b99bbb9f-4bv2m" event={"ID":"09bdbbb9-e374-47e8-8ed6-48c2ffd8e252","Type":"ContainerStarted","Data":"1f979ec6ce88865ea0f7dbf7ca0c057fc76c0c6b34e72ea8bfa2747405d9b91a"}
Mar 08 04:18:00.056239 master-0 kubenswrapper[18592]: I0308 04:18:00.055983 18592 generic.go:334] "Generic (PLEG): container finished" podID="c4dde472-61ed-49eb-aa34-0addbba05d94" containerID="0d4de652b4f62c68bbd94344ea7a6f0f182d6c5f80ef08f93bb5fdf5b7df6533" exitCode=137
Mar 08 04:18:00.056239 master-0 kubenswrapper[18592]: I0308 04:18:00.056042 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ff301-api-0" event={"ID":"c4dde472-61ed-49eb-aa34-0addbba05d94","Type":"ContainerDied","Data":"0d4de652b4f62c68bbd94344ea7a6f0f182d6c5f80ef08f93bb5fdf5b7df6533"}
Mar 08 04:18:00.325901 master-0 kubenswrapper[18592]: I0308 04:18:00.323003 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-afe2b-default-internal-api-0"]
Mar 08 04:18:00.325901 master-0 kubenswrapper[18592]: I0308 04:18:00.323342 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-afe2b-default-internal-api-0" podUID="fa83b8bc-f9dc-4376-855b-59ba17c2c0e1" containerName="glance-log" containerID="cri-o://5754359ee075d07402e8f61461cb8a36550cb96bb16b4338c76ba1df26aca2ca" gracePeriod=30
Mar 08 04:18:00.325901 master-0 kubenswrapper[18592]: I0308 04:18:00.324017 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-afe2b-default-internal-api-0" podUID="fa83b8bc-f9dc-4376-855b-59ba17c2c0e1" containerName="glance-httpd" containerID="cri-o://9f477e1a96bfab138a86ff6cf71109073bb51bdcf5d50327fc3da0149cb0c5c1" gracePeriod=30
Mar 08 04:18:01.255648 master-0 kubenswrapper[18592]: I0308 04:18:01.255576 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-6jgd8"]
Mar 08 04:18:01.257188 master-0 kubenswrapper[18592]: I0308 04:18:01.257157 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-6jgd8"
Mar 08 04:18:01.279944 master-0 kubenswrapper[18592]: I0308 04:18:01.279892 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-6jgd8"]
Mar 08 04:18:01.334491 master-0 kubenswrapper[18592]: I0308 04:18:01.334432 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-2306-account-create-update-c95sx"]
Mar 08 04:18:01.336789 master-0 kubenswrapper[18592]: I0308 04:18:01.336744 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-2306-account-create-update-c95sx"
Mar 08 04:18:01.344716 master-0 kubenswrapper[18592]: I0308 04:18:01.344676 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret"
Mar 08 04:18:01.372145 master-0 kubenswrapper[18592]: I0308 04:18:01.371849 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-f4tg6"]
Mar 08 04:18:01.373884 master-0 kubenswrapper[18592]: I0308 04:18:01.373840 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-f4tg6"
Mar 08 04:18:01.388987 master-0 kubenswrapper[18592]: I0308 04:18:01.381504 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-f4tg6"]
Mar 08 04:18:01.398342 master-0 kubenswrapper[18592]: I0308 04:18:01.398292 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-2306-account-create-update-c95sx"]
Mar 08 04:18:01.424884 master-0 kubenswrapper[18592]: I0308 04:18:01.424158 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2a5661f-15ae-4df4-ab1f-8539afd4d339-operator-scripts\") pod \"nova-api-db-create-6jgd8\" (UID: \"f2a5661f-15ae-4df4-ab1f-8539afd4d339\") " pod="openstack/nova-api-db-create-6jgd8"
Mar 08 04:18:01.424884 master-0 kubenswrapper[18592]: I0308 04:18:01.424575 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7j22z\" (UniqueName: \"kubernetes.io/projected/f2a5661f-15ae-4df4-ab1f-8539afd4d339-kube-api-access-7j22z\") pod \"nova-api-db-create-6jgd8\" (UID: \"f2a5661f-15ae-4df4-ab1f-8539afd4d339\") " pod="openstack/nova-api-db-create-6jgd8"
Mar 08 04:18:01.443523 master-0 kubenswrapper[18592]: I0308 04:18:01.443453 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-inspector-0"]
Mar 08 04:18:01.526899 master-0 kubenswrapper[18592]: I0308 04:18:01.526768 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwl6w\" (UniqueName: \"kubernetes.io/projected/3fefd13e-e178-4790-823e-458456886a84-kube-api-access-qwl6w\") pod \"nova-api-2306-account-create-update-c95sx\" (UID: \"3fefd13e-e178-4790-823e-458456886a84\") " pod="openstack/nova-api-2306-account-create-update-c95sx"
Mar 08 04:18:01.526899 master-0 kubenswrapper[18592]: I0308 04:18:01.526843 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2a5661f-15ae-4df4-ab1f-8539afd4d339-operator-scripts\") pod \"nova-api-db-create-6jgd8\" (UID: \"f2a5661f-15ae-4df4-ab1f-8539afd4d339\") " pod="openstack/nova-api-db-create-6jgd8"
Mar 08 04:18:01.527116 master-0 kubenswrapper[18592]: I0308 04:18:01.526915 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sj5dq\" (UniqueName: \"kubernetes.io/projected/bafa1006-fce0-4733-9706-a4de6df10ac7-kube-api-access-sj5dq\") pod \"nova-cell0-db-create-f4tg6\" (UID: \"bafa1006-fce0-4733-9706-a4de6df10ac7\") " pod="openstack/nova-cell0-db-create-f4tg6"
Mar 08 04:18:01.527251 master-0 kubenswrapper[18592]: I0308 04:18:01.527216 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7j22z\" (UniqueName: \"kubernetes.io/projected/f2a5661f-15ae-4df4-ab1f-8539afd4d339-kube-api-access-7j22z\") pod \"nova-api-db-create-6jgd8\" (UID: \"f2a5661f-15ae-4df4-ab1f-8539afd4d339\") " pod="openstack/nova-api-db-create-6jgd8"
Mar 08 04:18:01.527418 master-0 kubenswrapper[18592]: I0308 04:18:01.527393 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3fefd13e-e178-4790-823e-458456886a84-operator-scripts\") pod \"nova-api-2306-account-create-update-c95sx\" (UID: \"3fefd13e-e178-4790-823e-458456886a84\") " pod="openstack/nova-api-2306-account-create-update-c95sx"
Mar 08 04:18:01.527500 master-0 kubenswrapper[18592]: I0308 04:18:01.527479 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bafa1006-fce0-4733-9706-a4de6df10ac7-operator-scripts\") pod \"nova-cell0-db-create-f4tg6\" (UID: \"bafa1006-fce0-4733-9706-a4de6df10ac7\") " pod="openstack/nova-cell0-db-create-f4tg6"
Mar 08 04:18:01.527660 master-0 kubenswrapper[18592]: I0308 04:18:01.527622 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2a5661f-15ae-4df4-ab1f-8539afd4d339-operator-scripts\") pod \"nova-api-db-create-6jgd8\" (UID: \"f2a5661f-15ae-4df4-ab1f-8539afd4d339\") " pod="openstack/nova-api-db-create-6jgd8"
Mar 08 04:18:01.542843 master-0 kubenswrapper[18592]: I0308 04:18:01.542771 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-qrztf"]
Mar 08 04:18:01.544886 master-0 kubenswrapper[18592]: I0308 04:18:01.544717 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-qrztf"
Mar 08 04:18:01.575216 master-0 kubenswrapper[18592]: I0308 04:18:01.574039 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7j22z\" (UniqueName: \"kubernetes.io/projected/f2a5661f-15ae-4df4-ab1f-8539afd4d339-kube-api-access-7j22z\") pod \"nova-api-db-create-6jgd8\" (UID: \"f2a5661f-15ae-4df4-ab1f-8539afd4d339\") " pod="openstack/nova-api-db-create-6jgd8"
Mar 08 04:18:01.605994 master-0 kubenswrapper[18592]: I0308 04:18:01.592290 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-e145-account-create-update-nccdt"]
Mar 08 04:18:01.605994 master-0 kubenswrapper[18592]: I0308 04:18:01.594879 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-e145-account-create-update-nccdt"
Mar 08 04:18:01.605994 master-0 kubenswrapper[18592]: I0308 04:18:01.595795 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-6jgd8"
Mar 08 04:18:01.607070 master-0 kubenswrapper[18592]: I0308 04:18:01.606615 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret"
Mar 08 04:18:01.627578 master-0 kubenswrapper[18592]: I0308 04:18:01.621967 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-qrztf"]
Mar 08 04:18:01.638980 master-0 kubenswrapper[18592]: I0308 04:18:01.634340 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3fefd13e-e178-4790-823e-458456886a84-operator-scripts\") pod \"nova-api-2306-account-create-update-c95sx\" (UID: \"3fefd13e-e178-4790-823e-458456886a84\") " pod="openstack/nova-api-2306-account-create-update-c95sx"
Mar 08 04:18:01.638980 master-0 kubenswrapper[18592]: I0308 04:18:01.634413 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bafa1006-fce0-4733-9706-a4de6df10ac7-operator-scripts\") pod \"nova-cell0-db-create-f4tg6\" (UID: \"bafa1006-fce0-4733-9706-a4de6df10ac7\") " pod="openstack/nova-cell0-db-create-f4tg6"
Mar 08 04:18:01.638980 master-0 kubenswrapper[18592]: I0308 04:18:01.634497 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwl6w\" (UniqueName: \"kubernetes.io/projected/3fefd13e-e178-4790-823e-458456886a84-kube-api-access-qwl6w\") pod \"nova-api-2306-account-create-update-c95sx\" (UID: \"3fefd13e-e178-4790-823e-458456886a84\") " pod="openstack/nova-api-2306-account-create-update-c95sx"
Mar 08 04:18:01.638980 master-0 kubenswrapper[18592]: I0308 04:18:01.634515 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sj5dq\" (UniqueName: \"kubernetes.io/projected/bafa1006-fce0-4733-9706-a4de6df10ac7-kube-api-access-sj5dq\") pod \"nova-cell0-db-create-f4tg6\" (UID: \"bafa1006-fce0-4733-9706-a4de6df10ac7\") " pod="openstack/nova-cell0-db-create-f4tg6"
Mar 08 04:18:01.638980 master-0 kubenswrapper[18592]: I0308 04:18:01.635518 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3fefd13e-e178-4790-823e-458456886a84-operator-scripts\") pod \"nova-api-2306-account-create-update-c95sx\" (UID: \"3fefd13e-e178-4790-823e-458456886a84\") " pod="openstack/nova-api-2306-account-create-update-c95sx"
Mar 08 04:18:01.638980 master-0 kubenswrapper[18592]: I0308 04:18:01.635994 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bafa1006-fce0-4733-9706-a4de6df10ac7-operator-scripts\") pod \"nova-cell0-db-create-f4tg6\" (UID: \"bafa1006-fce0-4733-9706-a4de6df10ac7\") " pod="openstack/nova-cell0-db-create-f4tg6"
Mar 08 04:18:01.686954 master-0 kubenswrapper[18592]: I0308 04:18:01.679934 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sj5dq\" (UniqueName: \"kubernetes.io/projected/bafa1006-fce0-4733-9706-a4de6df10ac7-kube-api-access-sj5dq\") pod \"nova-cell0-db-create-f4tg6\" (UID: \"bafa1006-fce0-4733-9706-a4de6df10ac7\") " pod="openstack/nova-cell0-db-create-f4tg6"
Mar 08 04:18:01.686954 master-0 kubenswrapper[18592]: I0308 04:18:01.682563 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-e145-account-create-update-nccdt"]
Mar 08 04:18:01.696489 master-0 kubenswrapper[18592]: I0308 04:18:01.696428 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwl6w\" (UniqueName: \"kubernetes.io/projected/3fefd13e-e178-4790-823e-458456886a84-kube-api-access-qwl6w\") pod \"nova-api-2306-account-create-update-c95sx\" (UID: \"3fefd13e-e178-4790-823e-458456886a84\") " pod="openstack/nova-api-2306-account-create-update-c95sx"
Mar 08 04:18:01.734969 master-0 kubenswrapper[18592]: I0308 04:18:01.732893 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-f4tg6"
Mar 08 04:18:01.736939 master-0 kubenswrapper[18592]: I0308 04:18:01.736207 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c678d7e-e3ea-40d5-b265-cf42ac1139c6-operator-scripts\") pod \"nova-cell1-db-create-qrztf\" (UID: \"7c678d7e-e3ea-40d5-b265-cf42ac1139c6\") " pod="openstack/nova-cell1-db-create-qrztf"
Mar 08 04:18:01.736939 master-0 kubenswrapper[18592]: I0308 04:18:01.736270 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/955ce42c-5d68-4659-a993-85d566eb7c0c-operator-scripts\") pod \"nova-cell0-e145-account-create-update-nccdt\" (UID: \"955ce42c-5d68-4659-a993-85d566eb7c0c\") " pod="openstack/nova-cell0-e145-account-create-update-nccdt"
Mar 08 04:18:01.736939 master-0 kubenswrapper[18592]: I0308 04:18:01.736369 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqmq9\" (UniqueName: \"kubernetes.io/projected/7c678d7e-e3ea-40d5-b265-cf42ac1139c6-kube-api-access-lqmq9\") pod \"nova-cell1-db-create-qrztf\" (UID: \"7c678d7e-e3ea-40d5-b265-cf42ac1139c6\") " pod="openstack/nova-cell1-db-create-qrztf"
Mar 08 04:18:01.736939 master-0 kubenswrapper[18592]: I0308 04:18:01.736401 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzpv4\" (UniqueName: \"kubernetes.io/projected/955ce42c-5d68-4659-a993-85d566eb7c0c-kube-api-access-pzpv4\") pod \"nova-cell0-e145-account-create-update-nccdt\" (UID: \"955ce42c-5d68-4659-a993-85d566eb7c0c\") " pod="openstack/nova-cell0-e145-account-create-update-nccdt"
Mar 08 04:18:01.842850 master-0 kubenswrapper[18592]: I0308 04:18:01.838702 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqmq9\" (UniqueName: \"kubernetes.io/projected/7c678d7e-e3ea-40d5-b265-cf42ac1139c6-kube-api-access-lqmq9\") pod \"nova-cell1-db-create-qrztf\" (UID: \"7c678d7e-e3ea-40d5-b265-cf42ac1139c6\") " pod="openstack/nova-cell1-db-create-qrztf"
Mar 08 04:18:01.842850 master-0 kubenswrapper[18592]: I0308 04:18:01.839234 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzpv4\" (UniqueName: \"kubernetes.io/projected/955ce42c-5d68-4659-a993-85d566eb7c0c-kube-api-access-pzpv4\") pod \"nova-cell0-e145-account-create-update-nccdt\" (UID: \"955ce42c-5d68-4659-a993-85d566eb7c0c\") " pod="openstack/nova-cell0-e145-account-create-update-nccdt"
Mar 08 04:18:01.842850 master-0 kubenswrapper[18592]: I0308 04:18:01.839985 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c678d7e-e3ea-40d5-b265-cf42ac1139c6-operator-scripts\") pod \"nova-cell1-db-create-qrztf\" (UID: \"7c678d7e-e3ea-40d5-b265-cf42ac1139c6\") " pod="openstack/nova-cell1-db-create-qrztf"
Mar 08 04:18:01.842850 master-0 kubenswrapper[18592]: I0308 04:18:01.840040 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/955ce42c-5d68-4659-a993-85d566eb7c0c-operator-scripts\") pod \"nova-cell0-e145-account-create-update-nccdt\" (UID: \"955ce42c-5d68-4659-a993-85d566eb7c0c\") " pod="openstack/nova-cell0-e145-account-create-update-nccdt"
Mar 08 04:18:01.842850 master-0 kubenswrapper[18592]: I0308 04:18:01.842084 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/955ce42c-5d68-4659-a993-85d566eb7c0c-operator-scripts\") pod \"nova-cell0-e145-account-create-update-nccdt\" (UID: \"955ce42c-5d68-4659-a993-85d566eb7c0c\") " pod="openstack/nova-cell0-e145-account-create-update-nccdt"
Mar 08 04:18:01.846844 master-0 kubenswrapper[18592]: I0308 04:18:01.845296 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c678d7e-e3ea-40d5-b265-cf42ac1139c6-operator-scripts\") pod \"nova-cell1-db-create-qrztf\" (UID: \"7c678d7e-e3ea-40d5-b265-cf42ac1139c6\") " pod="openstack/nova-cell1-db-create-qrztf"
Mar 08 04:18:01.846844 master-0 kubenswrapper[18592]: I0308 04:18:01.845608 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-f75d-account-create-update-fxddp"]
Mar 08 04:18:01.868848 master-0 kubenswrapper[18592]: I0308 04:18:01.864937 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqmq9\" (UniqueName: \"kubernetes.io/projected/7c678d7e-e3ea-40d5-b265-cf42ac1139c6-kube-api-access-lqmq9\") pod \"nova-cell1-db-create-qrztf\" (UID: \"7c678d7e-e3ea-40d5-b265-cf42ac1139c6\") " pod="openstack/nova-cell1-db-create-qrztf"
Mar 08 04:18:01.885921 master-0 kubenswrapper[18592]: I0308 04:18:01.885817 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzpv4\" (UniqueName: \"kubernetes.io/projected/955ce42c-5d68-4659-a993-85d566eb7c0c-kube-api-access-pzpv4\") pod \"nova-cell0-e145-account-create-update-nccdt\" (UID: \"955ce42c-5d68-4659-a993-85d566eb7c0c\") " pod="openstack/nova-cell0-e145-account-create-update-nccdt"
Mar 08 04:18:01.892484 master-0 kubenswrapper[18592]: I0308 04:18:01.892430 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-f75d-account-create-update-fxddp"
Mar 08 04:18:01.894942 master-0 kubenswrapper[18592]: I0308 04:18:01.894906 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret"
Mar 08 04:18:01.899891 master-0 kubenswrapper[18592]: I0308 04:18:01.899825 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-f75d-account-create-update-fxddp"]
Mar 08 04:18:01.965098 master-0 kubenswrapper[18592]: I0308 04:18:01.965070 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-qrztf"
Mar 08 04:18:01.976197 master-0 kubenswrapper[18592]: I0308 04:18:01.976171 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-2306-account-create-update-c95sx"
Mar 08 04:18:02.051077 master-0 kubenswrapper[18592]: I0308 04:18:02.050880 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2km7f\" (UniqueName: \"kubernetes.io/projected/6a5fc841-ddfd-4704-9a4c-878bcbb98bcc-kube-api-access-2km7f\") pod \"nova-cell1-f75d-account-create-update-fxddp\" (UID: \"6a5fc841-ddfd-4704-9a4c-878bcbb98bcc\") " pod="openstack/nova-cell1-f75d-account-create-update-fxddp"
Mar 08 04:18:02.051077 master-0 kubenswrapper[18592]: I0308 04:18:02.051023 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6a5fc841-ddfd-4704-9a4c-878bcbb98bcc-operator-scripts\") pod \"nova-cell1-f75d-account-create-update-fxddp\" (UID: \"6a5fc841-ddfd-4704-9a4c-878bcbb98bcc\") " pod="openstack/nova-cell1-f75d-account-create-update-fxddp"
Mar 08 04:18:02.117116 master-0 kubenswrapper[18592]: I0308 04:18:02.116606 18592 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/nova-cell0-e145-account-create-update-nccdt" Mar 08 04:18:02.153515 master-0 kubenswrapper[18592]: I0308 04:18:02.153207 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2km7f\" (UniqueName: \"kubernetes.io/projected/6a5fc841-ddfd-4704-9a4c-878bcbb98bcc-kube-api-access-2km7f\") pod \"nova-cell1-f75d-account-create-update-fxddp\" (UID: \"6a5fc841-ddfd-4704-9a4c-878bcbb98bcc\") " pod="openstack/nova-cell1-f75d-account-create-update-fxddp" Mar 08 04:18:02.153515 master-0 kubenswrapper[18592]: I0308 04:18:02.153305 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6a5fc841-ddfd-4704-9a4c-878bcbb98bcc-operator-scripts\") pod \"nova-cell1-f75d-account-create-update-fxddp\" (UID: \"6a5fc841-ddfd-4704-9a4c-878bcbb98bcc\") " pod="openstack/nova-cell1-f75d-account-create-update-fxddp" Mar 08 04:18:02.154087 master-0 kubenswrapper[18592]: I0308 04:18:02.154041 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6a5fc841-ddfd-4704-9a4c-878bcbb98bcc-operator-scripts\") pod \"nova-cell1-f75d-account-create-update-fxddp\" (UID: \"6a5fc841-ddfd-4704-9a4c-878bcbb98bcc\") " pod="openstack/nova-cell1-f75d-account-create-update-fxddp" Mar 08 04:18:02.171955 master-0 kubenswrapper[18592]: I0308 04:18:02.171893 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2km7f\" (UniqueName: \"kubernetes.io/projected/6a5fc841-ddfd-4704-9a4c-878bcbb98bcc-kube-api-access-2km7f\") pod \"nova-cell1-f75d-account-create-update-fxddp\" (UID: \"6a5fc841-ddfd-4704-9a4c-878bcbb98bcc\") " pod="openstack/nova-cell1-f75d-account-create-update-fxddp" Mar 08 04:18:02.270185 master-0 kubenswrapper[18592]: I0308 04:18:02.269629 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-f75d-account-create-update-fxddp" Mar 08 04:18:04.146466 master-0 kubenswrapper[18592]: I0308 04:18:04.146326 18592 scope.go:117] "RemoveContainer" containerID="ecb55f322362dccbbff792a642555df63b70ceae0b4b089367e3d027710563b8" Mar 08 04:18:05.800889 master-0 kubenswrapper[18592]: I0308 04:18:05.797630 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-59f645d994-2wg6m" Mar 08 04:18:05.802302 master-0 kubenswrapper[18592]: I0308 04:18:05.802261 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-59f645d994-2wg6m" Mar 08 04:18:06.466676 master-0 kubenswrapper[18592]: I0308 04:18:06.466605 18592 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-ff301-api-0" podUID="c4dde472-61ed-49eb-aa34-0addbba05d94" containerName="cinder-api" probeResult="failure" output="Get \"http://10.128.0.225:8776/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 08 04:18:10.768094 master-0 kubenswrapper[18592]: I0308 04:18:10.768036 18592 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-ff301-api-0" Mar 08 04:18:10.859410 master-0 kubenswrapper[18592]: I0308 04:18:10.858453 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4dde472-61ed-49eb-aa34-0addbba05d94-logs\") pod \"c4dde472-61ed-49eb-aa34-0addbba05d94\" (UID: \"c4dde472-61ed-49eb-aa34-0addbba05d94\") " Mar 08 04:18:10.859410 master-0 kubenswrapper[18592]: I0308 04:18:10.858603 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c4dde472-61ed-49eb-aa34-0addbba05d94-config-data-custom\") pod \"c4dde472-61ed-49eb-aa34-0addbba05d94\" (UID: \"c4dde472-61ed-49eb-aa34-0addbba05d94\") " Mar 08 04:18:10.859410 master-0 kubenswrapper[18592]: I0308 04:18:10.858685 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4dde472-61ed-49eb-aa34-0addbba05d94-combined-ca-bundle\") pod \"c4dde472-61ed-49eb-aa34-0addbba05d94\" (UID: \"c4dde472-61ed-49eb-aa34-0addbba05d94\") " Mar 08 04:18:10.859410 master-0 kubenswrapper[18592]: I0308 04:18:10.858766 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8s8kw\" (UniqueName: \"kubernetes.io/projected/c4dde472-61ed-49eb-aa34-0addbba05d94-kube-api-access-8s8kw\") pod \"c4dde472-61ed-49eb-aa34-0addbba05d94\" (UID: \"c4dde472-61ed-49eb-aa34-0addbba05d94\") " Mar 08 04:18:10.859410 master-0 kubenswrapper[18592]: I0308 04:18:10.858792 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4dde472-61ed-49eb-aa34-0addbba05d94-config-data\") pod \"c4dde472-61ed-49eb-aa34-0addbba05d94\" (UID: \"c4dde472-61ed-49eb-aa34-0addbba05d94\") " Mar 08 04:18:10.859410 master-0 kubenswrapper[18592]: I0308 04:18:10.858844 18592 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c4dde472-61ed-49eb-aa34-0addbba05d94-etc-machine-id\") pod \"c4dde472-61ed-49eb-aa34-0addbba05d94\" (UID: \"c4dde472-61ed-49eb-aa34-0addbba05d94\") " Mar 08 04:18:10.859410 master-0 kubenswrapper[18592]: I0308 04:18:10.858900 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4dde472-61ed-49eb-aa34-0addbba05d94-scripts\") pod \"c4dde472-61ed-49eb-aa34-0addbba05d94\" (UID: \"c4dde472-61ed-49eb-aa34-0addbba05d94\") " Mar 08 04:18:10.867750 master-0 kubenswrapper[18592]: I0308 04:18:10.867547 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4dde472-61ed-49eb-aa34-0addbba05d94-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "c4dde472-61ed-49eb-aa34-0addbba05d94" (UID: "c4dde472-61ed-49eb-aa34-0addbba05d94"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 04:18:10.867750 master-0 kubenswrapper[18592]: I0308 04:18:10.867595 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4dde472-61ed-49eb-aa34-0addbba05d94-logs" (OuterVolumeSpecName: "logs") pod "c4dde472-61ed-49eb-aa34-0addbba05d94" (UID: "c4dde472-61ed-49eb-aa34-0addbba05d94"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 04:18:10.915731 master-0 kubenswrapper[18592]: I0308 04:18:10.914460 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4dde472-61ed-49eb-aa34-0addbba05d94-scripts" (OuterVolumeSpecName: "scripts") pod "c4dde472-61ed-49eb-aa34-0addbba05d94" (UID: "c4dde472-61ed-49eb-aa34-0addbba05d94"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:18:10.926072 master-0 kubenswrapper[18592]: I0308 04:18:10.926014 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4dde472-61ed-49eb-aa34-0addbba05d94-kube-api-access-8s8kw" (OuterVolumeSpecName: "kube-api-access-8s8kw") pod "c4dde472-61ed-49eb-aa34-0addbba05d94" (UID: "c4dde472-61ed-49eb-aa34-0addbba05d94"). InnerVolumeSpecName "kube-api-access-8s8kw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 04:18:10.978366 master-0 kubenswrapper[18592]: I0308 04:18:10.972259 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8s8kw\" (UniqueName: \"kubernetes.io/projected/c4dde472-61ed-49eb-aa34-0addbba05d94-kube-api-access-8s8kw\") on node \"master-0\" DevicePath \"\"" Mar 08 04:18:10.978366 master-0 kubenswrapper[18592]: I0308 04:18:10.972293 18592 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c4dde472-61ed-49eb-aa34-0addbba05d94-etc-machine-id\") on node \"master-0\" DevicePath \"\"" Mar 08 04:18:10.978366 master-0 kubenswrapper[18592]: I0308 04:18:10.972303 18592 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4dde472-61ed-49eb-aa34-0addbba05d94-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 04:18:10.978366 master-0 kubenswrapper[18592]: I0308 04:18:10.972313 18592 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4dde472-61ed-49eb-aa34-0addbba05d94-logs\") on node \"master-0\" DevicePath \"\"" Mar 08 04:18:10.984489 master-0 kubenswrapper[18592]: I0308 04:18:10.984302 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4dde472-61ed-49eb-aa34-0addbba05d94-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c4dde472-61ed-49eb-aa34-0addbba05d94" (UID: 
"c4dde472-61ed-49eb-aa34-0addbba05d94"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:18:11.070693 master-0 kubenswrapper[18592]: I0308 04:18:11.070659 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4dde472-61ed-49eb-aa34-0addbba05d94-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c4dde472-61ed-49eb-aa34-0addbba05d94" (UID: "c4dde472-61ed-49eb-aa34-0addbba05d94"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:18:11.074381 master-0 kubenswrapper[18592]: I0308 04:18:11.074337 18592 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c4dde472-61ed-49eb-aa34-0addbba05d94-config-data-custom\") on node \"master-0\" DevicePath \"\"" Mar 08 04:18:11.074381 master-0 kubenswrapper[18592]: I0308 04:18:11.074376 18592 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4dde472-61ed-49eb-aa34-0addbba05d94-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 04:18:11.158882 master-0 kubenswrapper[18592]: I0308 04:18:11.158778 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-afe2b-default-external-api-0" Mar 08 04:18:11.177062 master-0 kubenswrapper[18592]: I0308 04:18:11.176928 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4dde472-61ed-49eb-aa34-0addbba05d94-config-data" (OuterVolumeSpecName: "config-data") pod "c4dde472-61ed-49eb-aa34-0addbba05d94" (UID: "c4dde472-61ed-49eb-aa34-0addbba05d94"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:18:11.177529 master-0 kubenswrapper[18592]: I0308 04:18:11.177483 18592 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4dde472-61ed-49eb-aa34-0addbba05d94-config-data\") on node \"master-0\" DevicePath \"\"" Mar 08 04:18:11.282108 master-0 kubenswrapper[18592]: I0308 04:18:11.279238 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/topolvm.io^4bd1e53e-4aca-4bc2-a5d3-7e2d486d9cfb\") pod \"0eff6f29-794c-4597-b53f-c030263b2080\" (UID: \"0eff6f29-794c-4597-b53f-c030263b2080\") " Mar 08 04:18:11.282108 master-0 kubenswrapper[18592]: I0308 04:18:11.279289 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0eff6f29-794c-4597-b53f-c030263b2080-combined-ca-bundle\") pod \"0eff6f29-794c-4597-b53f-c030263b2080\" (UID: \"0eff6f29-794c-4597-b53f-c030263b2080\") " Mar 08 04:18:11.282108 master-0 kubenswrapper[18592]: I0308 04:18:11.279325 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0eff6f29-794c-4597-b53f-c030263b2080-public-tls-certs\") pod \"0eff6f29-794c-4597-b53f-c030263b2080\" (UID: \"0eff6f29-794c-4597-b53f-c030263b2080\") " Mar 08 04:18:11.282108 master-0 kubenswrapper[18592]: I0308 04:18:11.279597 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0eff6f29-794c-4597-b53f-c030263b2080-config-data\") pod \"0eff6f29-794c-4597-b53f-c030263b2080\" (UID: \"0eff6f29-794c-4597-b53f-c030263b2080\") " Mar 08 04:18:11.282108 master-0 kubenswrapper[18592]: I0308 04:18:11.279623 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/0eff6f29-794c-4597-b53f-c030263b2080-logs\") pod \"0eff6f29-794c-4597-b53f-c030263b2080\" (UID: \"0eff6f29-794c-4597-b53f-c030263b2080\") " Mar 08 04:18:11.282108 master-0 kubenswrapper[18592]: I0308 04:18:11.279669 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dktq\" (UniqueName: \"kubernetes.io/projected/0eff6f29-794c-4597-b53f-c030263b2080-kube-api-access-5dktq\") pod \"0eff6f29-794c-4597-b53f-c030263b2080\" (UID: \"0eff6f29-794c-4597-b53f-c030263b2080\") " Mar 08 04:18:11.282108 master-0 kubenswrapper[18592]: I0308 04:18:11.279707 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0eff6f29-794c-4597-b53f-c030263b2080-httpd-run\") pod \"0eff6f29-794c-4597-b53f-c030263b2080\" (UID: \"0eff6f29-794c-4597-b53f-c030263b2080\") " Mar 08 04:18:11.282108 master-0 kubenswrapper[18592]: I0308 04:18:11.279760 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0eff6f29-794c-4597-b53f-c030263b2080-scripts\") pod \"0eff6f29-794c-4597-b53f-c030263b2080\" (UID: \"0eff6f29-794c-4597-b53f-c030263b2080\") " Mar 08 04:18:11.289885 master-0 kubenswrapper[18592]: I0308 04:18:11.283444 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0eff6f29-794c-4597-b53f-c030263b2080-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "0eff6f29-794c-4597-b53f-c030263b2080" (UID: "0eff6f29-794c-4597-b53f-c030263b2080"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 04:18:11.289885 master-0 kubenswrapper[18592]: I0308 04:18:11.286724 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0eff6f29-794c-4597-b53f-c030263b2080-logs" (OuterVolumeSpecName: "logs") pod "0eff6f29-794c-4597-b53f-c030263b2080" (UID: "0eff6f29-794c-4597-b53f-c030263b2080"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 04:18:11.298043 master-0 kubenswrapper[18592]: I0308 04:18:11.297982 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0eff6f29-794c-4597-b53f-c030263b2080-kube-api-access-5dktq" (OuterVolumeSpecName: "kube-api-access-5dktq") pod "0eff6f29-794c-4597-b53f-c030263b2080" (UID: "0eff6f29-794c-4597-b53f-c030263b2080"). InnerVolumeSpecName "kube-api-access-5dktq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 04:18:11.305964 master-0 kubenswrapper[18592]: I0308 04:18:11.303814 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0eff6f29-794c-4597-b53f-c030263b2080-scripts" (OuterVolumeSpecName: "scripts") pod "0eff6f29-794c-4597-b53f-c030263b2080" (UID: "0eff6f29-794c-4597-b53f-c030263b2080"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:18:11.310653 master-0 kubenswrapper[18592]: I0308 04:18:11.306534 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-859d47fc89-z2wvz" event={"ID":"05ba9b98-7d2f-4a9b-80ad-60793d8279e8","Type":"ContainerStarted","Data":"1988a339b07b0c491c35385954db8882edaee30eeede55407a4d33cef13b5291"} Mar 08 04:18:11.316563 master-0 kubenswrapper[18592]: I0308 04:18:11.315258 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/topolvm.io^4bd1e53e-4aca-4bc2-a5d3-7e2d486d9cfb" (OuterVolumeSpecName: "glance") pod "0eff6f29-794c-4597-b53f-c030263b2080" (UID: "0eff6f29-794c-4597-b53f-c030263b2080"). InnerVolumeSpecName "pvc-acff9521-23da-47da-b539-1fad9dc0c8dd". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 08 04:18:11.316563 master-0 kubenswrapper[18592]: I0308 04:18:11.315333 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-neutron-agent-859d47fc89-z2wvz" Mar 08 04:18:11.327254 master-0 kubenswrapper[18592]: I0308 04:18:11.325449 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0eff6f29-794c-4597-b53f-c030263b2080-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0eff6f29-794c-4597-b53f-c030263b2080" (UID: "0eff6f29-794c-4597-b53f-c030263b2080"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:18:11.337155 master-0 kubenswrapper[18592]: I0308 04:18:11.337091 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d","Type":"ContainerStarted","Data":"5e7c35eacf799e359f26f39dac7fad3fbeae0ea0a390eb330b78c17d794fa5d9"} Mar 08 04:18:11.353178 master-0 kubenswrapper[18592]: I0308 04:18:11.353121 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ff301-api-0" event={"ID":"c4dde472-61ed-49eb-aa34-0addbba05d94","Type":"ContainerDied","Data":"4de883371e6530231daf13ab4069477c6bd9a0c8628952b090f46d31a4cadd84"} Mar 08 04:18:11.353178 master-0 kubenswrapper[18592]: I0308 04:18:11.353175 18592 scope.go:117] "RemoveContainer" containerID="0d4de652b4f62c68bbd94344ea7a6f0f182d6c5f80ef08f93bb5fdf5b7df6533" Mar 08 04:18:11.353392 master-0 kubenswrapper[18592]: I0308 04:18:11.353301 18592 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-ff301-api-0" Mar 08 04:18:11.375558 master-0 kubenswrapper[18592]: I0308 04:18:11.375415 18592 generic.go:334] "Generic (PLEG): container finished" podID="0eff6f29-794c-4597-b53f-c030263b2080" containerID="acb051cdf721850d97db6528a56f259be53e65432bf2f5a93ccebdde9aac4e11" exitCode=0 Mar 08 04:18:11.375558 master-0 kubenswrapper[18592]: I0308 04:18:11.375496 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-afe2b-default-external-api-0" event={"ID":"0eff6f29-794c-4597-b53f-c030263b2080","Type":"ContainerDied","Data":"acb051cdf721850d97db6528a56f259be53e65432bf2f5a93ccebdde9aac4e11"} Mar 08 04:18:11.375558 master-0 kubenswrapper[18592]: I0308 04:18:11.375522 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-afe2b-default-external-api-0" event={"ID":"0eff6f29-794c-4597-b53f-c030263b2080","Type":"ContainerDied","Data":"dfdbb9253f240f054c23481b82fe627e31b5016d69e891ea9e38d0591252690d"} Mar 08 04:18:11.375698 master-0 kubenswrapper[18592]: I0308 04:18:11.375593 18592 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-afe2b-default-external-api-0" Mar 08 04:18:11.385845 master-0 kubenswrapper[18592]: I0308 04:18:11.384436 18592 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0eff6f29-794c-4597-b53f-c030263b2080-logs\") on node \"master-0\" DevicePath \"\"" Mar 08 04:18:11.385845 master-0 kubenswrapper[18592]: I0308 04:18:11.384594 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dktq\" (UniqueName: \"kubernetes.io/projected/0eff6f29-794c-4597-b53f-c030263b2080-kube-api-access-5dktq\") on node \"master-0\" DevicePath \"\"" Mar 08 04:18:11.385845 master-0 kubenswrapper[18592]: I0308 04:18:11.384609 18592 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0eff6f29-794c-4597-b53f-c030263b2080-httpd-run\") on node \"master-0\" DevicePath \"\"" Mar 08 04:18:11.385845 master-0 kubenswrapper[18592]: I0308 04:18:11.384618 18592 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0eff6f29-794c-4597-b53f-c030263b2080-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 04:18:11.385845 master-0 kubenswrapper[18592]: I0308 04:18:11.384673 18592 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-acff9521-23da-47da-b539-1fad9dc0c8dd\" (UniqueName: \"kubernetes.io/csi/topolvm.io^4bd1e53e-4aca-4bc2-a5d3-7e2d486d9cfb\") on node \"master-0\" " Mar 08 04:18:11.385845 master-0 kubenswrapper[18592]: I0308 04:18:11.384688 18592 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0eff6f29-794c-4597-b53f-c030263b2080-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 04:18:11.385845 master-0 kubenswrapper[18592]: I0308 04:18:11.385193 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" 
event={"ID":"ee2a865f-ef65-4965-abfc-1425b7f0bf84","Type":"ContainerStarted","Data":"6df4c9987545dd52658911e0d1b2afbe102f2b65abb89831df19ffa11d31588b"} Mar 08 04:18:11.391319 master-0 kubenswrapper[18592]: I0308 04:18:11.390030 18592 generic.go:334] "Generic (PLEG): container finished" podID="fa83b8bc-f9dc-4376-855b-59ba17c2c0e1" containerID="9f477e1a96bfab138a86ff6cf71109073bb51bdcf5d50327fc3da0149cb0c5c1" exitCode=0 Mar 08 04:18:11.391319 master-0 kubenswrapper[18592]: I0308 04:18:11.390059 18592 generic.go:334] "Generic (PLEG): container finished" podID="fa83b8bc-f9dc-4376-855b-59ba17c2c0e1" containerID="5754359ee075d07402e8f61461cb8a36550cb96bb16b4338c76ba1df26aca2ca" exitCode=143 Mar 08 04:18:11.391319 master-0 kubenswrapper[18592]: I0308 04:18:11.390091 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-afe2b-default-internal-api-0" event={"ID":"fa83b8bc-f9dc-4376-855b-59ba17c2c0e1","Type":"ContainerDied","Data":"9f477e1a96bfab138a86ff6cf71109073bb51bdcf5d50327fc3da0149cb0c5c1"} Mar 08 04:18:11.391319 master-0 kubenswrapper[18592]: I0308 04:18:11.390111 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-afe2b-default-internal-api-0" event={"ID":"fa83b8bc-f9dc-4376-855b-59ba17c2c0e1","Type":"ContainerDied","Data":"5754359ee075d07402e8f61461cb8a36550cb96bb16b4338c76ba1df26aca2ca"} Mar 08 04:18:11.393560 master-0 kubenswrapper[18592]: I0308 04:18:11.392349 18592 generic.go:334] "Generic (PLEG): container finished" podID="09bdbbb9-e374-47e8-8ed6-48c2ffd8e252" containerID="3020c1f4e33f3277089f2e743f943eff3521ab8f7bc288529f4544ace38be504" exitCode=0 Mar 08 04:18:11.393560 master-0 kubenswrapper[18592]: I0308 04:18:11.392377 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78b99bbb9f-4bv2m" event={"ID":"09bdbbb9-e374-47e8-8ed6-48c2ffd8e252","Type":"ContainerDied","Data":"3020c1f4e33f3277089f2e743f943eff3521ab8f7bc288529f4544ace38be504"} Mar 08 04:18:11.415117 master-0 
kubenswrapper[18592]: I0308 04:18:11.415018 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.317404324 podStartE2EDuration="20.414998325s" podCreationTimestamp="2026-03-08 04:17:51 +0000 UTC" firstStartedPulling="2026-03-08 04:17:52.707442569 +0000 UTC m=+1484.806196919" lastFinishedPulling="2026-03-08 04:18:10.80503657 +0000 UTC m=+1502.903790920" observedRunningTime="2026-03-08 04:18:11.402044083 +0000 UTC m=+1503.500798433" watchObservedRunningTime="2026-03-08 04:18:11.414998325 +0000 UTC m=+1503.513752675" Mar 08 04:18:11.422002 master-0 kubenswrapper[18592]: I0308 04:18:11.421935 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:18:11.424086 master-0 kubenswrapper[18592]: I0308 04:18:11.424033 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0eff6f29-794c-4597-b53f-c030263b2080-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "0eff6f29-794c-4597-b53f-c030263b2080" (UID: "0eff6f29-794c-4597-b53f-c030263b2080"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:18:11.436388 master-0 kubenswrapper[18592]: I0308 04:18:11.436334 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0eff6f29-794c-4597-b53f-c030263b2080-config-data" (OuterVolumeSpecName: "config-data") pod "0eff6f29-794c-4597-b53f-c030263b2080" (UID: "0eff6f29-794c-4597-b53f-c030263b2080"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 04:18:11.440867 master-0 kubenswrapper[18592]: I0308 04:18:11.440838 18592 scope.go:117] "RemoveContainer" containerID="a139db48489a58eb26371d9b815d217b5366fbff77d3c77ff9b8e2feabd3a84d"
Mar 08 04:18:11.442755 master-0 kubenswrapper[18592]: I0308 04:18:11.442732 18592 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Mar 08 04:18:11.442899 master-0 kubenswrapper[18592]: I0308 04:18:11.442881 18592 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-acff9521-23da-47da-b539-1fad9dc0c8dd" (UniqueName: "kubernetes.io/csi/topolvm.io^4bd1e53e-4aca-4bc2-a5d3-7e2d486d9cfb") on node "master-0"
Mar 08 04:18:11.470652 master-0 kubenswrapper[18592]: I0308 04:18:11.446943 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-ff301-api-0"]
Mar 08 04:18:11.470652 master-0 kubenswrapper[18592]: I0308 04:18:11.467004 18592 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-ff301-api-0" podUID="c4dde472-61ed-49eb-aa34-0addbba05d94" containerName="cinder-api" probeResult="failure" output="Get \"http://10.128.0.225:8776/healthcheck\": dial tcp 10.128.0.225:8776: i/o timeout (Client.Timeout exceeded while awaiting headers)"
Mar 08 04:18:11.488459 master-0 kubenswrapper[18592]: I0308 04:18:11.486647 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-ff301-api-0"]
Mar 08 04:18:11.493655 master-0 kubenswrapper[18592]: I0308 04:18:11.491555 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa83b8bc-f9dc-4376-855b-59ba17c2c0e1-scripts\") pod \"fa83b8bc-f9dc-4376-855b-59ba17c2c0e1\" (UID: \"fa83b8bc-f9dc-4376-855b-59ba17c2c0e1\") "
Mar 08 04:18:11.493655 master-0 kubenswrapper[18592]: I0308 04:18:11.491673 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa83b8bc-f9dc-4376-855b-59ba17c2c0e1-internal-tls-certs\") pod \"fa83b8bc-f9dc-4376-855b-59ba17c2c0e1\" (UID: \"fa83b8bc-f9dc-4376-855b-59ba17c2c0e1\") "
Mar 08 04:18:11.493655 master-0 kubenswrapper[18592]: I0308 04:18:11.491741 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fa83b8bc-f9dc-4376-855b-59ba17c2c0e1-httpd-run\") pod \"fa83b8bc-f9dc-4376-855b-59ba17c2c0e1\" (UID: \"fa83b8bc-f9dc-4376-855b-59ba17c2c0e1\") "
Mar 08 04:18:11.493655 master-0 kubenswrapper[18592]: I0308 04:18:11.491778 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5qhv\" (UniqueName: \"kubernetes.io/projected/fa83b8bc-f9dc-4376-855b-59ba17c2c0e1-kube-api-access-t5qhv\") pod \"fa83b8bc-f9dc-4376-855b-59ba17c2c0e1\" (UID: \"fa83b8bc-f9dc-4376-855b-59ba17c2c0e1\") "
Mar 08 04:18:11.493655 master-0 kubenswrapper[18592]: I0308 04:18:11.491955 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/topolvm.io^613c9bf7-76cc-44ce-8d9f-8cdae5e6db9e\") pod \"fa83b8bc-f9dc-4376-855b-59ba17c2c0e1\" (UID: \"fa83b8bc-f9dc-4376-855b-59ba17c2c0e1\") "
Mar 08 04:18:11.493655 master-0 kubenswrapper[18592]: I0308 04:18:11.491976 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa83b8bc-f9dc-4376-855b-59ba17c2c0e1-combined-ca-bundle\") pod \"fa83b8bc-f9dc-4376-855b-59ba17c2c0e1\" (UID: \"fa83b8bc-f9dc-4376-855b-59ba17c2c0e1\") "
Mar 08 04:18:11.493655 master-0 kubenswrapper[18592]: I0308 04:18:11.491995 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa83b8bc-f9dc-4376-855b-59ba17c2c0e1-logs\") pod \"fa83b8bc-f9dc-4376-855b-59ba17c2c0e1\" (UID: \"fa83b8bc-f9dc-4376-855b-59ba17c2c0e1\") "
Mar 08 04:18:11.493655 master-0 kubenswrapper[18592]: I0308 04:18:11.492090 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa83b8bc-f9dc-4376-855b-59ba17c2c0e1-config-data\") pod \"fa83b8bc-f9dc-4376-855b-59ba17c2c0e1\" (UID: \"fa83b8bc-f9dc-4376-855b-59ba17c2c0e1\") "
Mar 08 04:18:11.493655 master-0 kubenswrapper[18592]: I0308 04:18:11.492740 18592 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0eff6f29-794c-4597-b53f-c030263b2080-config-data\") on node \"master-0\" DevicePath \"\""
Mar 08 04:18:11.493655 master-0 kubenswrapper[18592]: I0308 04:18:11.492754 18592 reconciler_common.go:293] "Volume detached for volume \"pvc-acff9521-23da-47da-b539-1fad9dc0c8dd\" (UniqueName: \"kubernetes.io/csi/topolvm.io^4bd1e53e-4aca-4bc2-a5d3-7e2d486d9cfb\") on node \"master-0\" DevicePath \"\""
Mar 08 04:18:11.493655 master-0 kubenswrapper[18592]: I0308 04:18:11.492767 18592 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0eff6f29-794c-4597-b53f-c030263b2080-public-tls-certs\") on node \"master-0\" DevicePath \"\""
Mar 08 04:18:11.508913 master-0 kubenswrapper[18592]: I0308 04:18:11.501761 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa83b8bc-f9dc-4376-855b-59ba17c2c0e1-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "fa83b8bc-f9dc-4376-855b-59ba17c2c0e1" (UID: "fa83b8bc-f9dc-4376-855b-59ba17c2c0e1"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 04:18:11.509461 master-0 kubenswrapper[18592]: I0308 04:18:11.509387 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa83b8bc-f9dc-4376-855b-59ba17c2c0e1-logs" (OuterVolumeSpecName: "logs") pod "fa83b8bc-f9dc-4376-855b-59ba17c2c0e1" (UID: "fa83b8bc-f9dc-4376-855b-59ba17c2c0e1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 04:18:11.521813 master-0 kubenswrapper[18592]: I0308 04:18:11.520439 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-ff301-api-0"]
Mar 08 04:18:11.521813 master-0 kubenswrapper[18592]: E0308 04:18:11.520983 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0eff6f29-794c-4597-b53f-c030263b2080" containerName="glance-httpd"
Mar 08 04:18:11.521813 master-0 kubenswrapper[18592]: I0308 04:18:11.520997 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="0eff6f29-794c-4597-b53f-c030263b2080" containerName="glance-httpd"
Mar 08 04:18:11.521813 master-0 kubenswrapper[18592]: E0308 04:18:11.521190 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa83b8bc-f9dc-4376-855b-59ba17c2c0e1" containerName="glance-log"
Mar 08 04:18:11.521813 master-0 kubenswrapper[18592]: I0308 04:18:11.521197 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa83b8bc-f9dc-4376-855b-59ba17c2c0e1" containerName="glance-log"
Mar 08 04:18:11.521813 master-0 kubenswrapper[18592]: E0308 04:18:11.521224 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4dde472-61ed-49eb-aa34-0addbba05d94" containerName="cinder-ff301-api-log"
Mar 08 04:18:11.521813 master-0 kubenswrapper[18592]: I0308 04:18:11.521238 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4dde472-61ed-49eb-aa34-0addbba05d94" containerName="cinder-ff301-api-log"
Mar 08 04:18:11.521813 master-0 kubenswrapper[18592]: E0308 04:18:11.521251 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa83b8bc-f9dc-4376-855b-59ba17c2c0e1" containerName="glance-httpd"
Mar 08 04:18:11.521813 master-0 kubenswrapper[18592]: I0308 04:18:11.521257 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa83b8bc-f9dc-4376-855b-59ba17c2c0e1" containerName="glance-httpd"
Mar 08 04:18:11.521813 master-0 kubenswrapper[18592]: E0308 04:18:11.521267 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0eff6f29-794c-4597-b53f-c030263b2080" containerName="glance-log"
Mar 08 04:18:11.521813 master-0 kubenswrapper[18592]: I0308 04:18:11.521273 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="0eff6f29-794c-4597-b53f-c030263b2080" containerName="glance-log"
Mar 08 04:18:11.521813 master-0 kubenswrapper[18592]: E0308 04:18:11.521287 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4dde472-61ed-49eb-aa34-0addbba05d94" containerName="cinder-api"
Mar 08 04:18:11.521813 master-0 kubenswrapper[18592]: I0308 04:18:11.521293 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4dde472-61ed-49eb-aa34-0addbba05d94" containerName="cinder-api"
Mar 08 04:18:11.521813 master-0 kubenswrapper[18592]: I0308 04:18:11.521512 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa83b8bc-f9dc-4376-855b-59ba17c2c0e1" containerName="glance-log"
Mar 08 04:18:11.521813 master-0 kubenswrapper[18592]: I0308 04:18:11.521522 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4dde472-61ed-49eb-aa34-0addbba05d94" containerName="cinder-api"
Mar 08 04:18:11.521813 master-0 kubenswrapper[18592]: I0308 04:18:11.521536 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="0eff6f29-794c-4597-b53f-c030263b2080" containerName="glance-httpd"
Mar 08 04:18:11.521813 master-0 kubenswrapper[18592]: I0308 04:18:11.521548 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4dde472-61ed-49eb-aa34-0addbba05d94" containerName="cinder-ff301-api-log"
Mar 08 04:18:11.521813 master-0 kubenswrapper[18592]: I0308 04:18:11.521583 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="0eff6f29-794c-4597-b53f-c030263b2080" containerName="glance-log"
Mar 08 04:18:11.521813 master-0 kubenswrapper[18592]: I0308 04:18:11.521590 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa83b8bc-f9dc-4376-855b-59ba17c2c0e1" containerName="glance-httpd"
Mar 08 04:18:11.527132 master-0 kubenswrapper[18592]: I0308 04:18:11.522765 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-ff301-api-0"
Mar 08 04:18:11.541643 master-0 kubenswrapper[18592]: I0308 04:18:11.540865 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa83b8bc-f9dc-4376-855b-59ba17c2c0e1-scripts" (OuterVolumeSpecName: "scripts") pod "fa83b8bc-f9dc-4376-855b-59ba17c2c0e1" (UID: "fa83b8bc-f9dc-4376-855b-59ba17c2c0e1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 04:18:11.541643 master-0 kubenswrapper[18592]: I0308 04:18:11.541362 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-ff301-api-config-data"
Mar 08 04:18:11.541643 master-0 kubenswrapper[18592]: I0308 04:18:11.541386 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc"
Mar 08 04:18:11.541643 master-0 kubenswrapper[18592]: I0308 04:18:11.541555 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa83b8bc-f9dc-4376-855b-59ba17c2c0e1-kube-api-access-t5qhv" (OuterVolumeSpecName: "kube-api-access-t5qhv") pod "fa83b8bc-f9dc-4376-855b-59ba17c2c0e1" (UID: "fa83b8bc-f9dc-4376-855b-59ba17c2c0e1"). InnerVolumeSpecName "kube-api-access-t5qhv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 04:18:11.541643 master-0 kubenswrapper[18592]: I0308 04:18:11.541578 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc"
Mar 08 04:18:11.571056 master-0 kubenswrapper[18592]: I0308 04:18:11.570948 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/topolvm.io^613c9bf7-76cc-44ce-8d9f-8cdae5e6db9e" (OuterVolumeSpecName: "glance") pod "fa83b8bc-f9dc-4376-855b-59ba17c2c0e1" (UID: "fa83b8bc-f9dc-4376-855b-59ba17c2c0e1"). InnerVolumeSpecName "pvc-9bbcd16a-3cea-4572-8bc5-480804def335". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 08 04:18:11.582394 master-0 kubenswrapper[18592]: I0308 04:18:11.582055 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa83b8bc-f9dc-4376-855b-59ba17c2c0e1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fa83b8bc-f9dc-4376-855b-59ba17c2c0e1" (UID: "fa83b8bc-f9dc-4376-855b-59ba17c2c0e1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 04:18:11.583171 master-0 kubenswrapper[18592]: I0308 04:18:11.583122 18592 scope.go:117] "RemoveContainer" containerID="acb051cdf721850d97db6528a56f259be53e65432bf2f5a93ccebdde9aac4e11"
Mar 08 04:18:11.595016 master-0 kubenswrapper[18592]: I0308 04:18:11.594926 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/588d2fd8-2c47-44e1-b3d9-d1f95c7f1616-config-data-custom\") pod \"cinder-ff301-api-0\" (UID: \"588d2fd8-2c47-44e1-b3d9-d1f95c7f1616\") " pod="openstack/cinder-ff301-api-0"
Mar 08 04:18:11.595016 master-0 kubenswrapper[18592]: I0308 04:18:11.594998 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/588d2fd8-2c47-44e1-b3d9-d1f95c7f1616-config-data\") pod \"cinder-ff301-api-0\" (UID: \"588d2fd8-2c47-44e1-b3d9-d1f95c7f1616\") " pod="openstack/cinder-ff301-api-0"
Mar 08 04:18:11.595156 master-0 kubenswrapper[18592]: I0308 04:18:11.595098 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4kfb\" (UniqueName: \"kubernetes.io/projected/588d2fd8-2c47-44e1-b3d9-d1f95c7f1616-kube-api-access-b4kfb\") pod \"cinder-ff301-api-0\" (UID: \"588d2fd8-2c47-44e1-b3d9-d1f95c7f1616\") " pod="openstack/cinder-ff301-api-0"
Mar 08 04:18:11.595439 master-0 kubenswrapper[18592]: I0308 04:18:11.595346 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/588d2fd8-2c47-44e1-b3d9-d1f95c7f1616-logs\") pod \"cinder-ff301-api-0\" (UID: \"588d2fd8-2c47-44e1-b3d9-d1f95c7f1616\") " pod="openstack/cinder-ff301-api-0"
Mar 08 04:18:11.595753 master-0 kubenswrapper[18592]: I0308 04:18:11.595447 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/588d2fd8-2c47-44e1-b3d9-d1f95c7f1616-etc-machine-id\") pod \"cinder-ff301-api-0\" (UID: \"588d2fd8-2c47-44e1-b3d9-d1f95c7f1616\") " pod="openstack/cinder-ff301-api-0"
Mar 08 04:18:11.595753 master-0 kubenswrapper[18592]: I0308 04:18:11.595550 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/588d2fd8-2c47-44e1-b3d9-d1f95c7f1616-combined-ca-bundle\") pod \"cinder-ff301-api-0\" (UID: \"588d2fd8-2c47-44e1-b3d9-d1f95c7f1616\") " pod="openstack/cinder-ff301-api-0"
Mar 08 04:18:11.595753 master-0 kubenswrapper[18592]: I0308 04:18:11.595623 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/588d2fd8-2c47-44e1-b3d9-d1f95c7f1616-internal-tls-certs\") pod \"cinder-ff301-api-0\" (UID: \"588d2fd8-2c47-44e1-b3d9-d1f95c7f1616\") " pod="openstack/cinder-ff301-api-0"
Mar 08 04:18:11.595753 master-0 kubenswrapper[18592]: I0308 04:18:11.595652 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/588d2fd8-2c47-44e1-b3d9-d1f95c7f1616-scripts\") pod \"cinder-ff301-api-0\" (UID: \"588d2fd8-2c47-44e1-b3d9-d1f95c7f1616\") " pod="openstack/cinder-ff301-api-0"
Mar 08 04:18:11.596242 master-0 kubenswrapper[18592]: I0308 04:18:11.595782 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/588d2fd8-2c47-44e1-b3d9-d1f95c7f1616-public-tls-certs\") pod \"cinder-ff301-api-0\" (UID: \"588d2fd8-2c47-44e1-b3d9-d1f95c7f1616\") " pod="openstack/cinder-ff301-api-0"
Mar 08 04:18:11.596242 master-0 kubenswrapper[18592]: I0308 04:18:11.595983 18592 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa83b8bc-f9dc-4376-855b-59ba17c2c0e1-scripts\") on node \"master-0\" DevicePath \"\""
Mar 08 04:18:11.596242 master-0 kubenswrapper[18592]: I0308 04:18:11.596000 18592 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fa83b8bc-f9dc-4376-855b-59ba17c2c0e1-httpd-run\") on node \"master-0\" DevicePath \"\""
Mar 08 04:18:11.596242 master-0 kubenswrapper[18592]: I0308 04:18:11.596015 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5qhv\" (UniqueName: \"kubernetes.io/projected/fa83b8bc-f9dc-4376-855b-59ba17c2c0e1-kube-api-access-t5qhv\") on node \"master-0\" DevicePath \"\""
Mar 08 04:18:11.596242 master-0 kubenswrapper[18592]: I0308 04:18:11.596040 18592 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-9bbcd16a-3cea-4572-8bc5-480804def335\" (UniqueName: \"kubernetes.io/csi/topolvm.io^613c9bf7-76cc-44ce-8d9f-8cdae5e6db9e\") on node \"master-0\" "
Mar 08 04:18:11.596242 master-0 kubenswrapper[18592]: I0308 04:18:11.596050 18592 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa83b8bc-f9dc-4376-855b-59ba17c2c0e1-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 08 04:18:11.596242 master-0 kubenswrapper[18592]: I0308 04:18:11.596059 18592 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa83b8bc-f9dc-4376-855b-59ba17c2c0e1-logs\") on node \"master-0\" DevicePath \"\""
Mar 08 04:18:11.603750 master-0 kubenswrapper[18592]: I0308 04:18:11.603706 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-ff301-api-0"]
Mar 08 04:18:11.606048 master-0 kubenswrapper[18592]: I0308 04:18:11.605939 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa83b8bc-f9dc-4376-855b-59ba17c2c0e1-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "fa83b8bc-f9dc-4376-855b-59ba17c2c0e1" (UID: "fa83b8bc-f9dc-4376-855b-59ba17c2c0e1"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 04:18:11.624791 master-0 kubenswrapper[18592]: I0308 04:18:11.624265 18592 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Mar 08 04:18:11.624791 master-0 kubenswrapper[18592]: I0308 04:18:11.624599 18592 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-9bbcd16a-3cea-4572-8bc5-480804def335" (UniqueName: "kubernetes.io/csi/topolvm.io^613c9bf7-76cc-44ce-8d9f-8cdae5e6db9e") on node "master-0"
Mar 08 04:18:11.665800 master-0 kubenswrapper[18592]: I0308 04:18:11.665732 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa83b8bc-f9dc-4376-855b-59ba17c2c0e1-config-data" (OuterVolumeSpecName: "config-data") pod "fa83b8bc-f9dc-4376-855b-59ba17c2c0e1" (UID: "fa83b8bc-f9dc-4376-855b-59ba17c2c0e1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 04:18:11.708321 master-0 kubenswrapper[18592]: I0308 04:18:11.706921 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/588d2fd8-2c47-44e1-b3d9-d1f95c7f1616-internal-tls-certs\") pod \"cinder-ff301-api-0\" (UID: \"588d2fd8-2c47-44e1-b3d9-d1f95c7f1616\") " pod="openstack/cinder-ff301-api-0"
Mar 08 04:18:11.708321 master-0 kubenswrapper[18592]: I0308 04:18:11.706963 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/588d2fd8-2c47-44e1-b3d9-d1f95c7f1616-scripts\") pod \"cinder-ff301-api-0\" (UID: \"588d2fd8-2c47-44e1-b3d9-d1f95c7f1616\") " pod="openstack/cinder-ff301-api-0"
Mar 08 04:18:11.708321 master-0 kubenswrapper[18592]: I0308 04:18:11.707024 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/588d2fd8-2c47-44e1-b3d9-d1f95c7f1616-public-tls-certs\") pod \"cinder-ff301-api-0\" (UID: \"588d2fd8-2c47-44e1-b3d9-d1f95c7f1616\") " pod="openstack/cinder-ff301-api-0"
Mar 08 04:18:11.708321 master-0 kubenswrapper[18592]: I0308 04:18:11.707044 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/588d2fd8-2c47-44e1-b3d9-d1f95c7f1616-config-data-custom\") pod \"cinder-ff301-api-0\" (UID: \"588d2fd8-2c47-44e1-b3d9-d1f95c7f1616\") " pod="openstack/cinder-ff301-api-0"
Mar 08 04:18:11.708321 master-0 kubenswrapper[18592]: I0308 04:18:11.707068 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/588d2fd8-2c47-44e1-b3d9-d1f95c7f1616-config-data\") pod \"cinder-ff301-api-0\" (UID: \"588d2fd8-2c47-44e1-b3d9-d1f95c7f1616\") " pod="openstack/cinder-ff301-api-0"
Mar 08 04:18:11.708321 master-0 kubenswrapper[18592]: I0308 04:18:11.707121 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4kfb\" (UniqueName: \"kubernetes.io/projected/588d2fd8-2c47-44e1-b3d9-d1f95c7f1616-kube-api-access-b4kfb\") pod \"cinder-ff301-api-0\" (UID: \"588d2fd8-2c47-44e1-b3d9-d1f95c7f1616\") " pod="openstack/cinder-ff301-api-0"
Mar 08 04:18:11.708321 master-0 kubenswrapper[18592]: I0308 04:18:11.707190 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/588d2fd8-2c47-44e1-b3d9-d1f95c7f1616-logs\") pod \"cinder-ff301-api-0\" (UID: \"588d2fd8-2c47-44e1-b3d9-d1f95c7f1616\") " pod="openstack/cinder-ff301-api-0"
Mar 08 04:18:11.708321 master-0 kubenswrapper[18592]: I0308 04:18:11.707223 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/588d2fd8-2c47-44e1-b3d9-d1f95c7f1616-etc-machine-id\") pod \"cinder-ff301-api-0\" (UID: \"588d2fd8-2c47-44e1-b3d9-d1f95c7f1616\") " pod="openstack/cinder-ff301-api-0"
Mar 08 04:18:11.708321 master-0 kubenswrapper[18592]: I0308 04:18:11.707268 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/588d2fd8-2c47-44e1-b3d9-d1f95c7f1616-combined-ca-bundle\") pod \"cinder-ff301-api-0\" (UID: \"588d2fd8-2c47-44e1-b3d9-d1f95c7f1616\") " pod="openstack/cinder-ff301-api-0"
Mar 08 04:18:11.708321 master-0 kubenswrapper[18592]: I0308 04:18:11.707330 18592 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa83b8bc-f9dc-4376-855b-59ba17c2c0e1-config-data\") on node \"master-0\" DevicePath \"\""
Mar 08 04:18:11.708321 master-0 kubenswrapper[18592]: I0308 04:18:11.707343 18592 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa83b8bc-f9dc-4376-855b-59ba17c2c0e1-internal-tls-certs\") on node \"master-0\" DevicePath \"\""
Mar 08 04:18:11.708321 master-0 kubenswrapper[18592]: I0308 04:18:11.707354 18592 reconciler_common.go:293] "Volume detached for volume \"pvc-9bbcd16a-3cea-4572-8bc5-480804def335\" (UniqueName: \"kubernetes.io/csi/topolvm.io^613c9bf7-76cc-44ce-8d9f-8cdae5e6db9e\") on node \"master-0\" DevicePath \"\""
Mar 08 04:18:11.710204 master-0 kubenswrapper[18592]: I0308 04:18:11.709895 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/588d2fd8-2c47-44e1-b3d9-d1f95c7f1616-logs\") pod \"cinder-ff301-api-0\" (UID: \"588d2fd8-2c47-44e1-b3d9-d1f95c7f1616\") " pod="openstack/cinder-ff301-api-0"
Mar 08 04:18:11.710204 master-0 kubenswrapper[18592]: I0308 04:18:11.709994 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/588d2fd8-2c47-44e1-b3d9-d1f95c7f1616-etc-machine-id\") pod \"cinder-ff301-api-0\" (UID: \"588d2fd8-2c47-44e1-b3d9-d1f95c7f1616\") " pod="openstack/cinder-ff301-api-0"
Mar 08 04:18:11.719527 master-0 kubenswrapper[18592]: I0308 04:18:11.715816 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/588d2fd8-2c47-44e1-b3d9-d1f95c7f1616-scripts\") pod \"cinder-ff301-api-0\" (UID: \"588d2fd8-2c47-44e1-b3d9-d1f95c7f1616\") " pod="openstack/cinder-ff301-api-0"
Mar 08 04:18:11.738106 master-0 kubenswrapper[18592]: I0308 04:18:11.737303 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4kfb\" (UniqueName: \"kubernetes.io/projected/588d2fd8-2c47-44e1-b3d9-d1f95c7f1616-kube-api-access-b4kfb\") pod \"cinder-ff301-api-0\" (UID: \"588d2fd8-2c47-44e1-b3d9-d1f95c7f1616\") " pod="openstack/cinder-ff301-api-0"
Mar 08 04:18:11.738292 master-0 kubenswrapper[18592]: I0308 04:18:11.738237 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/588d2fd8-2c47-44e1-b3d9-d1f95c7f1616-config-data-custom\") pod \"cinder-ff301-api-0\" (UID: \"588d2fd8-2c47-44e1-b3d9-d1f95c7f1616\") " pod="openstack/cinder-ff301-api-0"
Mar 08 04:18:11.739062 master-0 kubenswrapper[18592]: I0308 04:18:11.738741 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/588d2fd8-2c47-44e1-b3d9-d1f95c7f1616-config-data\") pod \"cinder-ff301-api-0\" (UID: \"588d2fd8-2c47-44e1-b3d9-d1f95c7f1616\") " pod="openstack/cinder-ff301-api-0"
Mar 08 04:18:11.739062 master-0 kubenswrapper[18592]: I0308 04:18:11.738918 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/588d2fd8-2c47-44e1-b3d9-d1f95c7f1616-combined-ca-bundle\") pod \"cinder-ff301-api-0\" (UID: \"588d2fd8-2c47-44e1-b3d9-d1f95c7f1616\") " pod="openstack/cinder-ff301-api-0"
Mar 08 04:18:11.742006 master-0 kubenswrapper[18592]: I0308 04:18:11.741231 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/588d2fd8-2c47-44e1-b3d9-d1f95c7f1616-internal-tls-certs\") pod \"cinder-ff301-api-0\" (UID: \"588d2fd8-2c47-44e1-b3d9-d1f95c7f1616\") " pod="openstack/cinder-ff301-api-0"
Mar 08 04:18:11.776984 master-0 kubenswrapper[18592]: I0308 04:18:11.776940 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/588d2fd8-2c47-44e1-b3d9-d1f95c7f1616-public-tls-certs\") pod \"cinder-ff301-api-0\" (UID: \"588d2fd8-2c47-44e1-b3d9-d1f95c7f1616\") " pod="openstack/cinder-ff301-api-0"
Mar 08 04:18:11.835128 master-0 kubenswrapper[18592]: I0308 04:18:11.834148 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-2306-account-create-update-c95sx"]
Mar 08 04:18:11.856303 master-0 kubenswrapper[18592]: I0308 04:18:11.851781 18592 scope.go:117] "RemoveContainer" containerID="9bdd4546b9740edeb6f0f584dc8c3feaebf706adb02f7a37561eaa759eeb812f"
Mar 08 04:18:11.856303 master-0 kubenswrapper[18592]: I0308 04:18:11.852788 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-ff301-api-0"
Mar 08 04:18:11.883711 master-0 kubenswrapper[18592]: I0308 04:18:11.875250 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-6jgd8"]
Mar 08 04:18:11.942844 master-0 kubenswrapper[18592]: I0308 04:18:11.940074 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-qrztf"]
Mar 08 04:18:11.959932 master-0 kubenswrapper[18592]: I0308 04:18:11.959841 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-f4tg6"]
Mar 08 04:18:12.003479 master-0 kubenswrapper[18592]: I0308 04:18:11.999889 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-afe2b-default-external-api-0"]
Mar 08 04:18:12.037077 master-0 kubenswrapper[18592]: I0308 04:18:12.037025 18592 scope.go:117] "RemoveContainer" containerID="acb051cdf721850d97db6528a56f259be53e65432bf2f5a93ccebdde9aac4e11"
Mar 08 04:18:12.037788 master-0 kubenswrapper[18592]: E0308 04:18:12.037759 18592 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"acb051cdf721850d97db6528a56f259be53e65432bf2f5a93ccebdde9aac4e11\": container with ID starting with acb051cdf721850d97db6528a56f259be53e65432bf2f5a93ccebdde9aac4e11 not found: ID does not exist" containerID="acb051cdf721850d97db6528a56f259be53e65432bf2f5a93ccebdde9aac4e11"
Mar 08 04:18:12.037864 master-0 kubenswrapper[18592]: I0308 04:18:12.037807 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acb051cdf721850d97db6528a56f259be53e65432bf2f5a93ccebdde9aac4e11"} err="failed to get container status \"acb051cdf721850d97db6528a56f259be53e65432bf2f5a93ccebdde9aac4e11\": rpc error: code = NotFound desc = could not find container \"acb051cdf721850d97db6528a56f259be53e65432bf2f5a93ccebdde9aac4e11\": container with ID starting with acb051cdf721850d97db6528a56f259be53e65432bf2f5a93ccebdde9aac4e11 not found: ID does not exist"
Mar 08 04:18:12.037864 master-0 kubenswrapper[18592]: I0308 04:18:12.037852 18592 scope.go:117] "RemoveContainer" containerID="9bdd4546b9740edeb6f0f584dc8c3feaebf706adb02f7a37561eaa759eeb812f"
Mar 08 04:18:12.041932 master-0 kubenswrapper[18592]: E0308 04:18:12.041794 18592 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bdd4546b9740edeb6f0f584dc8c3feaebf706adb02f7a37561eaa759eeb812f\": container with ID starting with 9bdd4546b9740edeb6f0f584dc8c3feaebf706adb02f7a37561eaa759eeb812f not found: ID does not exist" containerID="9bdd4546b9740edeb6f0f584dc8c3feaebf706adb02f7a37561eaa759eeb812f"
Mar 08 04:18:12.041932 master-0 kubenswrapper[18592]: I0308 04:18:12.041922 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bdd4546b9740edeb6f0f584dc8c3feaebf706adb02f7a37561eaa759eeb812f"} err="failed to get container status \"9bdd4546b9740edeb6f0f584dc8c3feaebf706adb02f7a37561eaa759eeb812f\": rpc error: code = NotFound desc = could not find container \"9bdd4546b9740edeb6f0f584dc8c3feaebf706adb02f7a37561eaa759eeb812f\": container with ID starting with 9bdd4546b9740edeb6f0f584dc8c3feaebf706adb02f7a37561eaa759eeb812f not found: ID does not exist"
Mar 08 04:18:12.073786 master-0 kubenswrapper[18592]: I0308 04:18:12.073561 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret"
Mar 08 04:18:12.084936 master-0 kubenswrapper[18592]: I0308 04:18:12.084888 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-afe2b-default-external-api-0"]
Mar 08 04:18:12.113420 master-0 kubenswrapper[18592]: I0308 04:18:12.105284 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret"
Mar 08 04:18:12.115576 master-0 kubenswrapper[18592]: I0308 04:18:12.115535 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-afe2b-default-external-api-0"]
Mar 08 04:18:12.118808 master-0 kubenswrapper[18592]: I0308 04:18:12.118625 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-afe2b-default-external-api-0"
Mar 08 04:18:12.124186 master-0 kubenswrapper[18592]: I0308 04:18:12.121666 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Mar 08 04:18:12.124186 master-0 kubenswrapper[18592]: I0308 04:18:12.122063 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-afe2b-default-external-config-data"
Mar 08 04:18:12.125639 master-0 kubenswrapper[18592]: I0308 04:18:12.125610 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret"
Mar 08 04:18:12.152339 master-0 kubenswrapper[18592]: I0308 04:18:12.152287 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1f5ab90-a4e1-47d7-9d79-22cc65e4295d-combined-ca-bundle\") pod \"glance-afe2b-default-external-api-0\" (UID: \"b1f5ab90-a4e1-47d7-9d79-22cc65e4295d\") " pod="openstack/glance-afe2b-default-external-api-0"
Mar 08 04:18:12.154044 master-0 kubenswrapper[18592]: I0308 04:18:12.152418 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnzhr\" (UniqueName: \"kubernetes.io/projected/b1f5ab90-a4e1-47d7-9d79-22cc65e4295d-kube-api-access-bnzhr\") pod \"glance-afe2b-default-external-api-0\" (UID: \"b1f5ab90-a4e1-47d7-9d79-22cc65e4295d\") " pod="openstack/glance-afe2b-default-external-api-0"
Mar 08 04:18:12.154044 master-0 kubenswrapper[18592]: I0308 04:18:12.152469 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-acff9521-23da-47da-b539-1fad9dc0c8dd\" (UniqueName: \"kubernetes.io/csi/topolvm.io^4bd1e53e-4aca-4bc2-a5d3-7e2d486d9cfb\") pod \"glance-afe2b-default-external-api-0\" (UID: \"b1f5ab90-a4e1-47d7-9d79-22cc65e4295d\") " pod="openstack/glance-afe2b-default-external-api-0"
Mar 08 04:18:12.154044 master-0 kubenswrapper[18592]: I0308 04:18:12.152491 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b1f5ab90-a4e1-47d7-9d79-22cc65e4295d-httpd-run\") pod \"glance-afe2b-default-external-api-0\" (UID: \"b1f5ab90-a4e1-47d7-9d79-22cc65e4295d\") " pod="openstack/glance-afe2b-default-external-api-0"
Mar 08 04:18:12.154044 master-0 kubenswrapper[18592]: I0308 04:18:12.152545 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1f5ab90-a4e1-47d7-9d79-22cc65e4295d-public-tls-certs\") pod \"glance-afe2b-default-external-api-0\" (UID: \"b1f5ab90-a4e1-47d7-9d79-22cc65e4295d\") " pod="openstack/glance-afe2b-default-external-api-0"
Mar 08 04:18:12.154044 master-0 kubenswrapper[18592]: I0308 04:18:12.152566 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1f5ab90-a4e1-47d7-9d79-22cc65e4295d-logs\") pod \"glance-afe2b-default-external-api-0\" (UID: \"b1f5ab90-a4e1-47d7-9d79-22cc65e4295d\") " pod="openstack/glance-afe2b-default-external-api-0"
Mar 08 04:18:12.154044 master-0 kubenswrapper[18592]: I0308 04:18:12.152600 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1f5ab90-a4e1-47d7-9d79-22cc65e4295d-config-data\") pod \"glance-afe2b-default-external-api-0\" (UID: \"b1f5ab90-a4e1-47d7-9d79-22cc65e4295d\") " pod="openstack/glance-afe2b-default-external-api-0"
Mar 08 04:18:12.154044 master-0 kubenswrapper[18592]: I0308 04:18:12.152655 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1f5ab90-a4e1-47d7-9d79-22cc65e4295d-scripts\") pod \"glance-afe2b-default-external-api-0\" (UID: \"b1f5ab90-a4e1-47d7-9d79-22cc65e4295d\") " pod="openstack/glance-afe2b-default-external-api-0"
Mar 08 04:18:12.230073 master-0 kubenswrapper[18592]: I0308 04:18:12.221668 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0eff6f29-794c-4597-b53f-c030263b2080" path="/var/lib/kubelet/pods/0eff6f29-794c-4597-b53f-c030263b2080/volumes"
Mar 08 04:18:12.259594 master-0 kubenswrapper[18592]: I0308 04:18:12.259541 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1f5ab90-a4e1-47d7-9d79-22cc65e4295d-config-data\") pod \"glance-afe2b-default-external-api-0\" (UID: \"b1f5ab90-a4e1-47d7-9d79-22cc65e4295d\") " pod="openstack/glance-afe2b-default-external-api-0"
Mar 08 04:18:12.259808 master-0 kubenswrapper[18592]: I0308 04:18:12.259644 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1f5ab90-a4e1-47d7-9d79-22cc65e4295d-scripts\") pod \"glance-afe2b-default-external-api-0\" (UID: \"b1f5ab90-a4e1-47d7-9d79-22cc65e4295d\") " pod="openstack/glance-afe2b-default-external-api-0"
Mar 08 04:18:12.259808 master-0 kubenswrapper[18592]: I0308 04:18:12.259683 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1f5ab90-a4e1-47d7-9d79-22cc65e4295d-combined-ca-bundle\") pod \"glance-afe2b-default-external-api-0\" (UID: \"b1f5ab90-a4e1-47d7-9d79-22cc65e4295d\") " pod="openstack/glance-afe2b-default-external-api-0"
Mar 08 04:18:12.259808 master-0 kubenswrapper[18592]: I0308 04:18:12.259760 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnzhr\" (UniqueName: \"kubernetes.io/projected/b1f5ab90-a4e1-47d7-9d79-22cc65e4295d-kube-api-access-bnzhr\") pod \"glance-afe2b-default-external-api-0\" (UID: \"b1f5ab90-a4e1-47d7-9d79-22cc65e4295d\") " pod="openstack/glance-afe2b-default-external-api-0"
Mar 08 04:18:12.259808 master-0 kubenswrapper[18592]: I0308 04:18:12.259804 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-acff9521-23da-47da-b539-1fad9dc0c8dd\" (UniqueName: \"kubernetes.io/csi/topolvm.io^4bd1e53e-4aca-4bc2-a5d3-7e2d486d9cfb\") pod \"glance-afe2b-default-external-api-0\" (UID: \"b1f5ab90-a4e1-47d7-9d79-22cc65e4295d\") " pod="openstack/glance-afe2b-default-external-api-0"
Mar 08 04:18:12.259950 master-0 kubenswrapper[18592]: I0308 04:18:12.259845 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b1f5ab90-a4e1-47d7-9d79-22cc65e4295d-httpd-run\") pod \"glance-afe2b-default-external-api-0\" (UID: \"b1f5ab90-a4e1-47d7-9d79-22cc65e4295d\") " pod="openstack/glance-afe2b-default-external-api-0"
Mar 08 04:18:12.259950 master-0 kubenswrapper[18592]: I0308 04:18:12.259924 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1f5ab90-a4e1-47d7-9d79-22cc65e4295d-public-tls-certs\") pod \"glance-afe2b-default-external-api-0\" (UID: \"b1f5ab90-a4e1-47d7-9d79-22cc65e4295d\") " pod="openstack/glance-afe2b-default-external-api-0"
Mar 08 04:18:12.260019 master-0 kubenswrapper[18592]: I0308 04:18:12.259948 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1f5ab90-a4e1-47d7-9d79-22cc65e4295d-logs\") pod \"glance-afe2b-default-external-api-0\" (UID: \"b1f5ab90-a4e1-47d7-9d79-22cc65e4295d\") " pod="openstack/glance-afe2b-default-external-api-0"
Mar 08 04:18:12.267132 master-0 kubenswrapper[18592]: I0308 04:18:12.260442 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1f5ab90-a4e1-47d7-9d79-22cc65e4295d-logs\") pod \"glance-afe2b-default-external-api-0\" (UID: \"b1f5ab90-a4e1-47d7-9d79-22cc65e4295d\") " pod="openstack/glance-afe2b-default-external-api-0"
Mar 08 04:18:12.267132 master-0 kubenswrapper[18592]: I0308 04:18:12.266528 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1f5ab90-a4e1-47d7-9d79-22cc65e4295d-config-data\") pod \"glance-afe2b-default-external-api-0\" (UID: \"b1f5ab90-a4e1-47d7-9d79-22cc65e4295d\") " pod="openstack/glance-afe2b-default-external-api-0"
Mar 08 04:18:12.273858 master-0 kubenswrapper[18592]: I0308 04:18:12.269526 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1f5ab90-a4e1-47d7-9d79-22cc65e4295d-scripts\") pod \"glance-afe2b-default-external-api-0\" (UID: \"b1f5ab90-a4e1-47d7-9d79-22cc65e4295d\") " pod="openstack/glance-afe2b-default-external-api-0"
Mar 08 04:18:12.273858 master-0 kubenswrapper[18592]: I0308 04:18:12.269689 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b1f5ab90-a4e1-47d7-9d79-22cc65e4295d-httpd-run\") pod \"glance-afe2b-default-external-api-0\" (UID: \"b1f5ab90-a4e1-47d7-9d79-22cc65e4295d\") " pod="openstack/glance-afe2b-default-external-api-0"
Mar 08 04:18:12.278422 master-0 kubenswrapper[18592]: I0308 04:18:12.278374 18592
csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 08 04:18:12.278516 master-0 kubenswrapper[18592]: I0308 04:18:12.278426 18592 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-acff9521-23da-47da-b539-1fad9dc0c8dd\" (UniqueName: \"kubernetes.io/csi/topolvm.io^4bd1e53e-4aca-4bc2-a5d3-7e2d486d9cfb\") pod \"glance-afe2b-default-external-api-0\" (UID: \"b1f5ab90-a4e1-47d7-9d79-22cc65e4295d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/4fa84c0fff972147ae10cb80fb512dbc343a3b9b44d5fb87069bb7869e732a9b/globalmount\"" pod="openstack/glance-afe2b-default-external-api-0" Mar 08 04:18:12.279484 master-0 kubenswrapper[18592]: I0308 04:18:12.279452 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4dde472-61ed-49eb-aa34-0addbba05d94" path="/var/lib/kubelet/pods/c4dde472-61ed-49eb-aa34-0addbba05d94/volumes" Mar 08 04:18:12.285973 master-0 kubenswrapper[18592]: I0308 04:18:12.280497 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1f5ab90-a4e1-47d7-9d79-22cc65e4295d-combined-ca-bundle\") pod \"glance-afe2b-default-external-api-0\" (UID: \"b1f5ab90-a4e1-47d7-9d79-22cc65e4295d\") " pod="openstack/glance-afe2b-default-external-api-0" Mar 08 04:18:12.285973 master-0 kubenswrapper[18592]: I0308 04:18:12.285634 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1f5ab90-a4e1-47d7-9d79-22cc65e4295d-public-tls-certs\") pod \"glance-afe2b-default-external-api-0\" (UID: \"b1f5ab90-a4e1-47d7-9d79-22cc65e4295d\") " pod="openstack/glance-afe2b-default-external-api-0" Mar 08 04:18:12.300970 master-0 kubenswrapper[18592]: I0308 04:18:12.296816 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnzhr\" (UniqueName: 
\"kubernetes.io/projected/b1f5ab90-a4e1-47d7-9d79-22cc65e4295d-kube-api-access-bnzhr\") pod \"glance-afe2b-default-external-api-0\" (UID: \"b1f5ab90-a4e1-47d7-9d79-22cc65e4295d\") " pod="openstack/glance-afe2b-default-external-api-0" Mar 08 04:18:12.301748 master-0 kubenswrapper[18592]: I0308 04:18:12.301704 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-afe2b-default-external-api-0"] Mar 08 04:18:12.301878 master-0 kubenswrapper[18592]: I0308 04:18:12.301864 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-e145-account-create-update-nccdt"] Mar 08 04:18:12.354758 master-0 kubenswrapper[18592]: I0308 04:18:12.353150 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-f75d-account-create-update-fxddp"] Mar 08 04:18:12.554051 master-0 kubenswrapper[18592]: I0308 04:18:12.553656 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-2306-account-create-update-c95sx" event={"ID":"3fefd13e-e178-4790-823e-458456886a84","Type":"ContainerStarted","Data":"0e5fb03d1b4b6e2402a387859a530ac2182eb8b50593092aff1fa76afe8555e5"} Mar 08 04:18:12.569429 master-0 kubenswrapper[18592]: I0308 04:18:12.569311 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-f4tg6" event={"ID":"bafa1006-fce0-4733-9706-a4de6df10ac7","Type":"ContainerStarted","Data":"3b658c1400db45e6a5d3b6f5b1f36b1efb3f5efc89b4df81fda23f6b1d4a0fd2"} Mar 08 04:18:12.575161 master-0 kubenswrapper[18592]: I0308 04:18:12.575103 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78b99bbb9f-4bv2m" event={"ID":"09bdbbb9-e374-47e8-8ed6-48c2ffd8e252","Type":"ContainerStarted","Data":"97c60d17ee3243d1ac5dc1fdcc1a240bb0578b08472552c1cb07b9e8c4f1bb6a"} Mar 08 04:18:12.575882 master-0 kubenswrapper[18592]: I0308 04:18:12.575834 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-78b99bbb9f-4bv2m" Mar 
08 04:18:12.578873 master-0 kubenswrapper[18592]: I0308 04:18:12.578833 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-qrztf" event={"ID":"7c678d7e-e3ea-40d5-b265-cf42ac1139c6","Type":"ContainerStarted","Data":"486cc29f84f61fa9cb79dc47d04d562c693dd43408de31003c55a9747a0be8b6"} Mar 08 04:18:12.581719 master-0 kubenswrapper[18592]: I0308 04:18:12.581432 18592 generic.go:334] "Generic (PLEG): container finished" podID="3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d" containerID="35f1c9766cb803ecc656092343100209989a43b0160e4c7dfd99c0740eab8879" exitCode=0 Mar 08 04:18:12.581719 master-0 kubenswrapper[18592]: I0308 04:18:12.581525 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d","Type":"ContainerDied","Data":"35f1c9766cb803ecc656092343100209989a43b0160e4c7dfd99c0740eab8879"} Mar 08 04:18:12.599568 master-0 kubenswrapper[18592]: I0308 04:18:12.599526 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"abe912ba-4a33-4634-a3fb-b6fb09b38d8e","Type":"ContainerStarted","Data":"b8a8913f0f4e234c799b2e4a7219880168db2e3381aa36a6d86a2dd1d0f4d190"} Mar 08 04:18:12.602681 master-0 kubenswrapper[18592]: I0308 04:18:12.602643 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-afe2b-default-internal-api-0" event={"ID":"fa83b8bc-f9dc-4376-855b-59ba17c2c0e1","Type":"ContainerDied","Data":"a3cda40a372fe08964e712fd6c2673cdcdb0a13bd49c0058145d1da01eec478c"} Mar 08 04:18:12.602751 master-0 kubenswrapper[18592]: I0308 04:18:12.602682 18592 scope.go:117] "RemoveContainer" containerID="9f477e1a96bfab138a86ff6cf71109073bb51bdcf5d50327fc3da0149cb0c5c1" Mar 08 04:18:12.602806 master-0 kubenswrapper[18592]: I0308 04:18:12.602786 18592 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:18:12.604784 master-0 kubenswrapper[18592]: I0308 04:18:12.604732 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-f75d-account-create-update-fxddp" event={"ID":"6a5fc841-ddfd-4704-9a4c-878bcbb98bcc","Type":"ContainerStarted","Data":"c90e11f982fc4650194b8d8ad4c966d088303c068b5708a70ac534b4391e8989"} Mar 08 04:18:12.606495 master-0 kubenswrapper[18592]: I0308 04:18:12.606453 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-6jgd8" event={"ID":"f2a5661f-15ae-4df4-ab1f-8539afd4d339","Type":"ContainerStarted","Data":"01b43a434a0b3b6ca898483c325e86d85f9584e1279ccba2667f71198698f23f"} Mar 08 04:18:12.607542 master-0 kubenswrapper[18592]: I0308 04:18:12.607504 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-e145-account-create-update-nccdt" event={"ID":"955ce42c-5d68-4659-a993-85d566eb7c0c","Type":"ContainerStarted","Data":"775808f9a1284a8c4fc8300ddf5a19e25df26151f4468c426b1edd1f947b26eb"} Mar 08 04:18:12.701256 master-0 kubenswrapper[18592]: I0308 04:18:12.698870 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-78b99bbb9f-4bv2m" podStartSLOduration=14.698850461 podStartE2EDuration="14.698850461s" podCreationTimestamp="2026-03-08 04:17:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 04:18:12.684461781 +0000 UTC m=+1504.783216131" watchObservedRunningTime="2026-03-08 04:18:12.698850461 +0000 UTC m=+1504.797604811" Mar 08 04:18:12.747220 master-0 kubenswrapper[18592]: I0308 04:18:12.746107 18592 scope.go:117] "RemoveContainer" containerID="5754359ee075d07402e8f61461cb8a36550cb96bb16b4338c76ba1df26aca2ca" Mar 08 04:18:12.815009 master-0 kubenswrapper[18592]: I0308 04:18:12.814954 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/cinder-ff301-api-0"] Mar 08 04:18:12.869420 master-0 kubenswrapper[18592]: W0308 04:18:12.869265 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod588d2fd8_2c47_44e1_b3d9_d1f95c7f1616.slice/crio-ad9c182c1577de31532c7d43257167ab666379de87448051718af95748a4b636 WatchSource:0}: Error finding container ad9c182c1577de31532c7d43257167ab666379de87448051718af95748a4b636: Status 404 returned error can't find the container with id ad9c182c1577de31532c7d43257167ab666379de87448051718af95748a4b636 Mar 08 04:18:13.345940 master-0 kubenswrapper[18592]: I0308 04:18:13.345778 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-afe2b-default-internal-api-0"] Mar 08 04:18:13.376024 master-0 kubenswrapper[18592]: I0308 04:18:13.375961 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-afe2b-default-internal-api-0"] Mar 08 04:18:13.419940 master-0 kubenswrapper[18592]: I0308 04:18:13.419881 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-afe2b-default-internal-api-0"] Mar 08 04:18:13.422072 master-0 kubenswrapper[18592]: I0308 04:18:13.422041 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:18:13.425223 master-0 kubenswrapper[18592]: I0308 04:18:13.425181 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-afe2b-default-internal-config-data" Mar 08 04:18:13.425397 master-0 kubenswrapper[18592]: I0308 04:18:13.425371 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 08 04:18:13.453532 master-0 kubenswrapper[18592]: I0308 04:18:13.452962 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7ff800ca-3f34-40c3-a4d1-329fe69e0c3a-httpd-run\") pod \"glance-afe2b-default-internal-api-0\" (UID: \"7ff800ca-3f34-40c3-a4d1-329fe69e0c3a\") " pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:18:13.453532 master-0 kubenswrapper[18592]: I0308 04:18:13.453030 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ff800ca-3f34-40c3-a4d1-329fe69e0c3a-logs\") pod \"glance-afe2b-default-internal-api-0\" (UID: \"7ff800ca-3f34-40c3-a4d1-329fe69e0c3a\") " pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:18:13.453532 master-0 kubenswrapper[18592]: I0308 04:18:13.453084 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ff800ca-3f34-40c3-a4d1-329fe69e0c3a-config-data\") pod \"glance-afe2b-default-internal-api-0\" (UID: \"7ff800ca-3f34-40c3-a4d1-329fe69e0c3a\") " pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:18:13.453532 master-0 kubenswrapper[18592]: I0308 04:18:13.453105 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-9bbcd16a-3cea-4572-8bc5-480804def335\" (UniqueName: 
\"kubernetes.io/csi/topolvm.io^613c9bf7-76cc-44ce-8d9f-8cdae5e6db9e\") pod \"glance-afe2b-default-internal-api-0\" (UID: \"7ff800ca-3f34-40c3-a4d1-329fe69e0c3a\") " pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:18:13.453532 master-0 kubenswrapper[18592]: I0308 04:18:13.453140 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlgsb\" (UniqueName: \"kubernetes.io/projected/7ff800ca-3f34-40c3-a4d1-329fe69e0c3a-kube-api-access-wlgsb\") pod \"glance-afe2b-default-internal-api-0\" (UID: \"7ff800ca-3f34-40c3-a4d1-329fe69e0c3a\") " pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:18:13.453532 master-0 kubenswrapper[18592]: I0308 04:18:13.453160 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ff800ca-3f34-40c3-a4d1-329fe69e0c3a-scripts\") pod \"glance-afe2b-default-internal-api-0\" (UID: \"7ff800ca-3f34-40c3-a4d1-329fe69e0c3a\") " pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:18:13.453532 master-0 kubenswrapper[18592]: I0308 04:18:13.453215 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ff800ca-3f34-40c3-a4d1-329fe69e0c3a-combined-ca-bundle\") pod \"glance-afe2b-default-internal-api-0\" (UID: \"7ff800ca-3f34-40c3-a4d1-329fe69e0c3a\") " pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:18:13.453532 master-0 kubenswrapper[18592]: I0308 04:18:13.453257 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ff800ca-3f34-40c3-a4d1-329fe69e0c3a-internal-tls-certs\") pod \"glance-afe2b-default-internal-api-0\" (UID: \"7ff800ca-3f34-40c3-a4d1-329fe69e0c3a\") " pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:18:13.457799 master-0 
kubenswrapper[18592]: I0308 04:18:13.455455 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-afe2b-default-internal-api-0"] Mar 08 04:18:13.555725 master-0 kubenswrapper[18592]: I0308 04:18:13.555650 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ff800ca-3f34-40c3-a4d1-329fe69e0c3a-internal-tls-certs\") pod \"glance-afe2b-default-internal-api-0\" (UID: \"7ff800ca-3f34-40c3-a4d1-329fe69e0c3a\") " pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:18:13.555953 master-0 kubenswrapper[18592]: I0308 04:18:13.555790 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7ff800ca-3f34-40c3-a4d1-329fe69e0c3a-httpd-run\") pod \"glance-afe2b-default-internal-api-0\" (UID: \"7ff800ca-3f34-40c3-a4d1-329fe69e0c3a\") " pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:18:13.555953 master-0 kubenswrapper[18592]: I0308 04:18:13.555851 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ff800ca-3f34-40c3-a4d1-329fe69e0c3a-logs\") pod \"glance-afe2b-default-internal-api-0\" (UID: \"7ff800ca-3f34-40c3-a4d1-329fe69e0c3a\") " pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:18:13.555953 master-0 kubenswrapper[18592]: I0308 04:18:13.555925 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ff800ca-3f34-40c3-a4d1-329fe69e0c3a-config-data\") pod \"glance-afe2b-default-internal-api-0\" (UID: \"7ff800ca-3f34-40c3-a4d1-329fe69e0c3a\") " pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:18:13.555953 master-0 kubenswrapper[18592]: I0308 04:18:13.555947 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-9bbcd16a-3cea-4572-8bc5-480804def335\" 
(UniqueName: \"kubernetes.io/csi/topolvm.io^613c9bf7-76cc-44ce-8d9f-8cdae5e6db9e\") pod \"glance-afe2b-default-internal-api-0\" (UID: \"7ff800ca-3f34-40c3-a4d1-329fe69e0c3a\") " pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:18:13.556088 master-0 kubenswrapper[18592]: I0308 04:18:13.555999 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlgsb\" (UniqueName: \"kubernetes.io/projected/7ff800ca-3f34-40c3-a4d1-329fe69e0c3a-kube-api-access-wlgsb\") pod \"glance-afe2b-default-internal-api-0\" (UID: \"7ff800ca-3f34-40c3-a4d1-329fe69e0c3a\") " pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:18:13.556088 master-0 kubenswrapper[18592]: I0308 04:18:13.556024 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ff800ca-3f34-40c3-a4d1-329fe69e0c3a-scripts\") pod \"glance-afe2b-default-internal-api-0\" (UID: \"7ff800ca-3f34-40c3-a4d1-329fe69e0c3a\") " pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:18:13.556155 master-0 kubenswrapper[18592]: I0308 04:18:13.556096 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ff800ca-3f34-40c3-a4d1-329fe69e0c3a-combined-ca-bundle\") pod \"glance-afe2b-default-internal-api-0\" (UID: \"7ff800ca-3f34-40c3-a4d1-329fe69e0c3a\") " pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:18:13.557010 master-0 kubenswrapper[18592]: I0308 04:18:13.556616 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7ff800ca-3f34-40c3-a4d1-329fe69e0c3a-httpd-run\") pod \"glance-afe2b-default-internal-api-0\" (UID: \"7ff800ca-3f34-40c3-a4d1-329fe69e0c3a\") " pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:18:13.557382 master-0 kubenswrapper[18592]: I0308 04:18:13.557366 18592 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ff800ca-3f34-40c3-a4d1-329fe69e0c3a-logs\") pod \"glance-afe2b-default-internal-api-0\" (UID: \"7ff800ca-3f34-40c3-a4d1-329fe69e0c3a\") " pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:18:13.558599 master-0 kubenswrapper[18592]: I0308 04:18:13.558487 18592 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 08 04:18:13.558599 master-0 kubenswrapper[18592]: I0308 04:18:13.558529 18592 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-9bbcd16a-3cea-4572-8bc5-480804def335\" (UniqueName: \"kubernetes.io/csi/topolvm.io^613c9bf7-76cc-44ce-8d9f-8cdae5e6db9e\") pod \"glance-afe2b-default-internal-api-0\" (UID: \"7ff800ca-3f34-40c3-a4d1-329fe69e0c3a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/13a0545ec1feabd7833445e4e7b9c29b3f2190095c44424ae5035fff8a8ec5c4/globalmount\"" pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:18:13.561385 master-0 kubenswrapper[18592]: I0308 04:18:13.561353 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ff800ca-3f34-40c3-a4d1-329fe69e0c3a-scripts\") pod \"glance-afe2b-default-internal-api-0\" (UID: \"7ff800ca-3f34-40c3-a4d1-329fe69e0c3a\") " pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:18:13.574410 master-0 kubenswrapper[18592]: I0308 04:18:13.569323 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ff800ca-3f34-40c3-a4d1-329fe69e0c3a-internal-tls-certs\") pod \"glance-afe2b-default-internal-api-0\" (UID: \"7ff800ca-3f34-40c3-a4d1-329fe69e0c3a\") " pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:18:13.574410 master-0 kubenswrapper[18592]: I0308 04:18:13.570582 18592 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ff800ca-3f34-40c3-a4d1-329fe69e0c3a-combined-ca-bundle\") pod \"glance-afe2b-default-internal-api-0\" (UID: \"7ff800ca-3f34-40c3-a4d1-329fe69e0c3a\") " pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:18:13.575951 master-0 kubenswrapper[18592]: I0308 04:18:13.575918 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ff800ca-3f34-40c3-a4d1-329fe69e0c3a-config-data\") pod \"glance-afe2b-default-internal-api-0\" (UID: \"7ff800ca-3f34-40c3-a4d1-329fe69e0c3a\") " pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:18:13.592342 master-0 kubenswrapper[18592]: I0308 04:18:13.576329 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlgsb\" (UniqueName: \"kubernetes.io/projected/7ff800ca-3f34-40c3-a4d1-329fe69e0c3a-kube-api-access-wlgsb\") pod \"glance-afe2b-default-internal-api-0\" (UID: \"7ff800ca-3f34-40c3-a4d1-329fe69e0c3a\") " pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:18:13.723847 master-0 kubenswrapper[18592]: I0308 04:18:13.720999 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-e145-account-create-update-nccdt" event={"ID":"955ce42c-5d68-4659-a993-85d566eb7c0c","Type":"ContainerStarted","Data":"3f72611aa2a83f58ed2ac02536d92f6772c99e81d4ff1f180b1b640f08dc25a9"} Mar 08 04:18:13.733842 master-0 kubenswrapper[18592]: I0308 04:18:13.725586 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-2306-account-create-update-c95sx" event={"ID":"3fefd13e-e178-4790-823e-458456886a84","Type":"ContainerStarted","Data":"1870beb02130dec3bf711629c227a521d8a55fdc7f4ce516dc9d419fdbd15d7c"} Mar 08 04:18:13.743263 master-0 kubenswrapper[18592]: I0308 04:18:13.743191 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-f4tg6" 
event={"ID":"bafa1006-fce0-4733-9706-a4de6df10ac7","Type":"ContainerStarted","Data":"a9efdc51bdd11fca1685cb40aa17e5acbaa143e7a3e9f12915b823c98dc81e5e"} Mar 08 04:18:13.754072 master-0 kubenswrapper[18592]: I0308 04:18:13.754002 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-e145-account-create-update-nccdt" podStartSLOduration=12.753981823 podStartE2EDuration="12.753981823s" podCreationTimestamp="2026-03-08 04:18:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 04:18:13.737486885 +0000 UTC m=+1505.836241235" watchObservedRunningTime="2026-03-08 04:18:13.753981823 +0000 UTC m=+1505.852736193" Mar 08 04:18:13.782910 master-0 kubenswrapper[18592]: I0308 04:18:13.763878 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-2306-account-create-update-c95sx" podStartSLOduration=12.763865781 podStartE2EDuration="12.763865781s" podCreationTimestamp="2026-03-08 04:18:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 04:18:13.760606903 +0000 UTC m=+1505.859361243" watchObservedRunningTime="2026-03-08 04:18:13.763865781 +0000 UTC m=+1505.862620131" Mar 08 04:18:13.782910 master-0 kubenswrapper[18592]: I0308 04:18:13.780278 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-qrztf" event={"ID":"7c678d7e-e3ea-40d5-b265-cf42ac1139c6","Type":"ContainerStarted","Data":"3cacfb137573a88f0a197cf79ea71f9406089db6fbd3e3c02e9515c621c0ccdf"} Mar 08 04:18:13.790853 master-0 kubenswrapper[18592]: I0308 04:18:13.784257 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ff301-api-0" event={"ID":"588d2fd8-2c47-44e1-b3d9-d1f95c7f1616","Type":"ContainerStarted","Data":"ad9c182c1577de31532c7d43257167ab666379de87448051718af95748a4b636"} 
Mar 08 04:18:13.790853 master-0 kubenswrapper[18592]: I0308 04:18:13.787762 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-f75d-account-create-update-fxddp" event={"ID":"6a5fc841-ddfd-4704-9a4c-878bcbb98bcc","Type":"ContainerStarted","Data":"a9e4fea2ba64e24a9bb73af0e5ee5e6cfb072593d414a9dd1d559cd6d7fd9248"} Mar 08 04:18:13.833979 master-0 kubenswrapper[18592]: I0308 04:18:13.833927 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-6jgd8" event={"ID":"f2a5661f-15ae-4df4-ab1f-8539afd4d339","Type":"ContainerStarted","Data":"2b74154b1248b8c3b12da0ca994416c32181cd7fe176de0ad730f3c342b5c2bb"} Mar 08 04:18:13.913861 master-0 kubenswrapper[18592]: I0308 04:18:13.913514 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-0" Mar 08 04:18:13.921870 master-0 kubenswrapper[18592]: I0308 04:18:13.921445 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-f4tg6" podStartSLOduration=12.921423692 podStartE2EDuration="12.921423692s" podCreationTimestamp="2026-03-08 04:18:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 04:18:13.800176648 +0000 UTC m=+1505.898930998" watchObservedRunningTime="2026-03-08 04:18:13.921423692 +0000 UTC m=+1506.020178042" Mar 08 04:18:13.922195 master-0 kubenswrapper[18592]: I0308 04:18:13.922157 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-qrztf" podStartSLOduration=12.922151573 podStartE2EDuration="12.922151573s" podCreationTimestamp="2026-03-08 04:18:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 04:18:13.821418856 +0000 UTC m=+1505.920173206" watchObservedRunningTime="2026-03-08 
04:18:13.922151573 +0000 UTC m=+1506.020905923" Mar 08 04:18:13.931928 master-0 kubenswrapper[18592]: I0308 04:18:13.931325 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-f75d-account-create-update-fxddp" podStartSLOduration=12.931311972 podStartE2EDuration="12.931311972s" podCreationTimestamp="2026-03-08 04:18:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 04:18:13.84955602 +0000 UTC m=+1505.948310370" watchObservedRunningTime="2026-03-08 04:18:13.931311972 +0000 UTC m=+1506.030066322" Mar 08 04:18:13.941435 master-0 kubenswrapper[18592]: I0308 04:18:13.941375 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-6jgd8" podStartSLOduration=12.941361524 podStartE2EDuration="12.941361524s" podCreationTimestamp="2026-03-08 04:18:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 04:18:13.874347844 +0000 UTC m=+1505.973102194" watchObservedRunningTime="2026-03-08 04:18:13.941361524 +0000 UTC m=+1506.040115874" Mar 08 04:18:14.072025 master-0 kubenswrapper[18592]: I0308 04:18:14.071660 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d-config\") pod \"3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d\" (UID: \"3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d\") " Mar 08 04:18:14.072025 master-0 kubenswrapper[18592]: I0308 04:18:14.071806 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d-var-lib-ironic\") pod \"3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d\" (UID: \"3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d\") " Mar 08 04:18:14.072025 master-0 
kubenswrapper[18592]: I0308 04:18:14.071887 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qprv\" (UniqueName: \"kubernetes.io/projected/3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d-kube-api-access-2qprv\") pod \"3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d\" (UID: \"3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d\") " Mar 08 04:18:14.072327 master-0 kubenswrapper[18592]: I0308 04:18:14.072082 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d-etc-podinfo\") pod \"3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d\" (UID: \"3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d\") " Mar 08 04:18:14.072327 master-0 kubenswrapper[18592]: I0308 04:18:14.072139 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d\" (UID: \"3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d\") " Mar 08 04:18:14.072327 master-0 kubenswrapper[18592]: I0308 04:18:14.072224 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d-scripts\") pod \"3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d\" (UID: \"3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d\") " Mar 08 04:18:14.072327 master-0 kubenswrapper[18592]: I0308 04:18:14.072312 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d-combined-ca-bundle\") pod \"3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d\" (UID: \"3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d\") " Mar 08 04:18:14.075562 master-0 kubenswrapper[18592]: I0308 04:18:14.075330 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/empty-dir/3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d-var-lib-ironic" (OuterVolumeSpecName: "var-lib-ironic") pod "3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d" (UID: "3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d"). InnerVolumeSpecName "var-lib-ironic". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 04:18:14.076288 master-0 kubenswrapper[18592]: I0308 04:18:14.076234 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d-var-lib-ironic-inspector-dhcp-hostsdir" (OuterVolumeSpecName: "var-lib-ironic-inspector-dhcp-hostsdir") pod "3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d" (UID: "3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d"). InnerVolumeSpecName "var-lib-ironic-inspector-dhcp-hostsdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 04:18:14.081045 master-0 kubenswrapper[18592]: I0308 04:18:14.080989 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d-config" (OuterVolumeSpecName: "config") pod "3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d" (UID: "3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:18:14.082992 master-0 kubenswrapper[18592]: I0308 04:18:14.082934 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d-etc-podinfo" (OuterVolumeSpecName: "etc-podinfo") pod "3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d" (UID: "3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d"). InnerVolumeSpecName "etc-podinfo". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 08 04:18:14.096863 master-0 kubenswrapper[18592]: I0308 04:18:14.096776 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d-kube-api-access-2qprv" (OuterVolumeSpecName: "kube-api-access-2qprv") pod "3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d" (UID: "3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d"). InnerVolumeSpecName "kube-api-access-2qprv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 04:18:14.101492 master-0 kubenswrapper[18592]: I0308 04:18:14.101419 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d-scripts" (OuterVolumeSpecName: "scripts") pod "3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d" (UID: "3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:18:14.137243 master-0 kubenswrapper[18592]: I0308 04:18:14.135042 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d" (UID: "3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:18:14.175081 master-0 kubenswrapper[18592]: I0308 04:18:14.173694 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa83b8bc-f9dc-4376-855b-59ba17c2c0e1" path="/var/lib/kubelet/pods/fa83b8bc-f9dc-4376-855b-59ba17c2c0e1/volumes" Mar 08 04:18:14.176316 master-0 kubenswrapper[18592]: I0308 04:18:14.175363 18592 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 04:18:14.176316 master-0 kubenswrapper[18592]: I0308 04:18:14.175404 18592 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 04:18:14.176316 master-0 kubenswrapper[18592]: I0308 04:18:14.175416 18592 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d-config\") on node \"master-0\" DevicePath \"\"" Mar 08 04:18:14.176316 master-0 kubenswrapper[18592]: I0308 04:18:14.175425 18592 reconciler_common.go:293] "Volume detached for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d-var-lib-ironic\") on node \"master-0\" DevicePath \"\"" Mar 08 04:18:14.176316 master-0 kubenswrapper[18592]: I0308 04:18:14.175436 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qprv\" (UniqueName: \"kubernetes.io/projected/3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d-kube-api-access-2qprv\") on node \"master-0\" DevicePath \"\"" Mar 08 04:18:14.176316 master-0 kubenswrapper[18592]: I0308 04:18:14.175445 18592 reconciler_common.go:293] "Volume detached for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d-etc-podinfo\") on node 
\"master-0\" DevicePath \"\"" Mar 08 04:18:14.176316 master-0 kubenswrapper[18592]: I0308 04:18:14.175477 18592 reconciler_common.go:293] "Volume detached for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d-var-lib-ironic-inspector-dhcp-hostsdir\") on node \"master-0\" DevicePath \"\"" Mar 08 04:18:14.805122 master-0 kubenswrapper[18592]: I0308 04:18:14.805047 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-59ffff478d-tll4r" Mar 08 04:18:14.877517 master-0 kubenswrapper[18592]: I0308 04:18:14.877452 18592 generic.go:334] "Generic (PLEG): container finished" podID="3fefd13e-e178-4790-823e-458456886a84" containerID="1870beb02130dec3bf711629c227a521d8a55fdc7f4ce516dc9d419fdbd15d7c" exitCode=0 Mar 08 04:18:14.877975 master-0 kubenswrapper[18592]: I0308 04:18:14.877541 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-2306-account-create-update-c95sx" event={"ID":"3fefd13e-e178-4790-823e-458456886a84","Type":"ContainerDied","Data":"1870beb02130dec3bf711629c227a521d8a55fdc7f4ce516dc9d419fdbd15d7c"} Mar 08 04:18:14.885665 master-0 kubenswrapper[18592]: I0308 04:18:14.885593 18592 generic.go:334] "Generic (PLEG): container finished" podID="bafa1006-fce0-4733-9706-a4de6df10ac7" containerID="a9efdc51bdd11fca1685cb40aa17e5acbaa143e7a3e9f12915b823c98dc81e5e" exitCode=0 Mar 08 04:18:14.885876 master-0 kubenswrapper[18592]: I0308 04:18:14.885669 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-f4tg6" event={"ID":"bafa1006-fce0-4733-9706-a4de6df10ac7","Type":"ContainerDied","Data":"a9efdc51bdd11fca1685cb40aa17e5acbaa143e7a3e9f12915b823c98dc81e5e"} Mar 08 04:18:14.896090 master-0 kubenswrapper[18592]: I0308 04:18:14.895026 18592 generic.go:334] "Generic (PLEG): container finished" podID="7c678d7e-e3ea-40d5-b265-cf42ac1139c6" 
containerID="3cacfb137573a88f0a197cf79ea71f9406089db6fbd3e3c02e9515c621c0ccdf" exitCode=0 Mar 08 04:18:14.896090 master-0 kubenswrapper[18592]: I0308 04:18:14.895104 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-qrztf" event={"ID":"7c678d7e-e3ea-40d5-b265-cf42ac1139c6","Type":"ContainerDied","Data":"3cacfb137573a88f0a197cf79ea71f9406089db6fbd3e3c02e9515c621c0ccdf"} Mar 08 04:18:14.902590 master-0 kubenswrapper[18592]: I0308 04:18:14.902536 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d","Type":"ContainerDied","Data":"5e7c35eacf799e359f26f39dac7fad3fbeae0ea0a390eb330b78c17d794fa5d9"} Mar 08 04:18:14.902590 master-0 kubenswrapper[18592]: I0308 04:18:14.902593 18592 scope.go:117] "RemoveContainer" containerID="35f1c9766cb803ecc656092343100209989a43b0160e4c7dfd99c0740eab8879" Mar 08 04:18:14.902791 master-0 kubenswrapper[18592]: I0308 04:18:14.902727 18592 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-0" Mar 08 04:18:14.908848 master-0 kubenswrapper[18592]: I0308 04:18:14.908788 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ff301-api-0" event={"ID":"588d2fd8-2c47-44e1-b3d9-d1f95c7f1616","Type":"ContainerStarted","Data":"4095528e702b808a2b354d3b70c6ace89eb862bc758aac78dd2d6352ef95bd12"} Mar 08 04:18:14.919286 master-0 kubenswrapper[18592]: I0308 04:18:14.919234 18592 generic.go:334] "Generic (PLEG): container finished" podID="6a5fc841-ddfd-4704-9a4c-878bcbb98bcc" containerID="a9e4fea2ba64e24a9bb73af0e5ee5e6cfb072593d414a9dd1d559cd6d7fd9248" exitCode=0 Mar 08 04:18:14.923160 master-0 kubenswrapper[18592]: I0308 04:18:14.919311 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-f75d-account-create-update-fxddp" event={"ID":"6a5fc841-ddfd-4704-9a4c-878bcbb98bcc","Type":"ContainerDied","Data":"a9e4fea2ba64e24a9bb73af0e5ee5e6cfb072593d414a9dd1d559cd6d7fd9248"} Mar 08 04:18:14.931660 master-0 kubenswrapper[18592]: I0308 04:18:14.931597 18592 generic.go:334] "Generic (PLEG): container finished" podID="f2a5661f-15ae-4df4-ab1f-8539afd4d339" containerID="2b74154b1248b8c3b12da0ca994416c32181cd7fe176de0ad730f3c342b5c2bb" exitCode=0 Mar 08 04:18:14.931781 master-0 kubenswrapper[18592]: I0308 04:18:14.931684 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-6jgd8" event={"ID":"f2a5661f-15ae-4df4-ab1f-8539afd4d339","Type":"ContainerDied","Data":"2b74154b1248b8c3b12da0ca994416c32181cd7fe176de0ad730f3c342b5c2bb"} Mar 08 04:18:14.933406 master-0 kubenswrapper[18592]: I0308 04:18:14.933375 18592 generic.go:334] "Generic (PLEG): container finished" podID="955ce42c-5d68-4659-a993-85d566eb7c0c" containerID="3f72611aa2a83f58ed2ac02536d92f6772c99e81d4ff1f180b1b640f08dc25a9" exitCode=0 Mar 08 04:18:14.933406 master-0 kubenswrapper[18592]: I0308 04:18:14.933402 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-e145-account-create-update-nccdt" event={"ID":"955ce42c-5d68-4659-a993-85d566eb7c0c","Type":"ContainerDied","Data":"3f72611aa2a83f58ed2ac02536d92f6772c99e81d4ff1f180b1b640f08dc25a9"} Mar 08 04:18:15.003355 master-0 kubenswrapper[18592]: I0308 04:18:15.003305 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-59ffff478d-tll4r" Mar 08 04:18:15.127902 master-0 kubenswrapper[18592]: I0308 04:18:15.123187 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-57f9f57fc6-pfwg4"] Mar 08 04:18:15.127902 master-0 kubenswrapper[18592]: I0308 04:18:15.123531 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-57f9f57fc6-pfwg4" podUID="9c77d97d-417a-48b7-a871-6800b12fbcb7" containerName="placement-log" containerID="cri-o://921d18396cd77485dd40de28e9920f77cdd014dc6f721b25d815f0d811d8d65b" gracePeriod=30 Mar 08 04:18:15.127902 master-0 kubenswrapper[18592]: I0308 04:18:15.123666 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-57f9f57fc6-pfwg4" podUID="9c77d97d-417a-48b7-a871-6800b12fbcb7" containerName="placement-api" containerID="cri-o://0f947b349b8be56666ad7c86e4264fb8308fb6f43e75ff44640009b2c1c98fbf" gracePeriod=30 Mar 08 04:18:15.219064 master-0 kubenswrapper[18592]: I0308 04:18:15.210985 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-inspector-0"] Mar 08 04:18:15.273807 master-0 kubenswrapper[18592]: I0308 04:18:15.273526 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-inspector-0"] Mar 08 04:18:15.290210 master-0 kubenswrapper[18592]: I0308 04:18:15.290175 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-acff9521-23da-47da-b539-1fad9dc0c8dd\" (UniqueName: \"kubernetes.io/csi/topolvm.io^4bd1e53e-4aca-4bc2-a5d3-7e2d486d9cfb\") pod \"glance-afe2b-default-external-api-0\" (UID: 
\"b1f5ab90-a4e1-47d7-9d79-22cc65e4295d\") " pod="openstack/glance-afe2b-default-external-api-0" Mar 08 04:18:15.319832 master-0 kubenswrapper[18592]: I0308 04:18:15.319762 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-inspector-0"] Mar 08 04:18:15.320626 master-0 kubenswrapper[18592]: E0308 04:18:15.320601 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d" containerName="ironic-python-agent-init" Mar 08 04:18:15.320731 master-0 kubenswrapper[18592]: I0308 04:18:15.320698 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d" containerName="ironic-python-agent-init" Mar 08 04:18:15.321204 master-0 kubenswrapper[18592]: I0308 04:18:15.321181 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d" containerName="ironic-python-agent-init" Mar 08 04:18:15.334303 master-0 kubenswrapper[18592]: I0308 04:18:15.325924 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-0" Mar 08 04:18:15.336981 master-0 kubenswrapper[18592]: I0308 04:18:15.336713 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ironic-inspector-public-svc" Mar 08 04:18:15.337336 master-0 kubenswrapper[18592]: I0308 04:18:15.337305 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-config-data" Mar 08 04:18:15.337516 master-0 kubenswrapper[18592]: I0308 04:18:15.337491 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-transport-url-ironic-inspector-transport" Mar 08 04:18:15.337642 master-0 kubenswrapper[18592]: I0308 04:18:15.337617 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-scripts" Mar 08 04:18:15.343759 master-0 kubenswrapper[18592]: I0308 04:18:15.343723 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ironic-inspector-internal-svc" Mar 08 04:18:15.355912 master-0 kubenswrapper[18592]: I0308 04:18:15.355655 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-0"] Mar 08 04:18:15.428385 master-0 kubenswrapper[18592]: I0308 04:18:15.428266 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/6936bf71-3ad4-47e9-8df6-9075d83086db-etc-podinfo\") pod \"ironic-inspector-0\" (UID: \"6936bf71-3ad4-47e9-8df6-9075d83086db\") " pod="openstack/ironic-inspector-0" Mar 08 04:18:15.428626 master-0 kubenswrapper[18592]: I0308 04:18:15.428610 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6936bf71-3ad4-47e9-8df6-9075d83086db-scripts\") pod \"ironic-inspector-0\" (UID: \"6936bf71-3ad4-47e9-8df6-9075d83086db\") " pod="openstack/ironic-inspector-0" Mar 08 04:18:15.428737 
master-0 kubenswrapper[18592]: I0308 04:18:15.428720 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/6936bf71-3ad4-47e9-8df6-9075d83086db-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-0\" (UID: \"6936bf71-3ad4-47e9-8df6-9075d83086db\") " pod="openstack/ironic-inspector-0" Mar 08 04:18:15.428874 master-0 kubenswrapper[18592]: I0308 04:18:15.428859 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/6936bf71-3ad4-47e9-8df6-9075d83086db-var-lib-ironic\") pod \"ironic-inspector-0\" (UID: \"6936bf71-3ad4-47e9-8df6-9075d83086db\") " pod="openstack/ironic-inspector-0" Mar 08 04:18:15.429020 master-0 kubenswrapper[18592]: I0308 04:18:15.429005 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44gnn\" (UniqueName: \"kubernetes.io/projected/6936bf71-3ad4-47e9-8df6-9075d83086db-kube-api-access-44gnn\") pod \"ironic-inspector-0\" (UID: \"6936bf71-3ad4-47e9-8df6-9075d83086db\") " pod="openstack/ironic-inspector-0" Mar 08 04:18:15.429163 master-0 kubenswrapper[18592]: I0308 04:18:15.429148 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6936bf71-3ad4-47e9-8df6-9075d83086db-combined-ca-bundle\") pod \"ironic-inspector-0\" (UID: \"6936bf71-3ad4-47e9-8df6-9075d83086db\") " pod="openstack/ironic-inspector-0" Mar 08 04:18:15.429267 master-0 kubenswrapper[18592]: I0308 04:18:15.429253 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6936bf71-3ad4-47e9-8df6-9075d83086db-internal-tls-certs\") pod \"ironic-inspector-0\" (UID: 
\"6936bf71-3ad4-47e9-8df6-9075d83086db\") " pod="openstack/ironic-inspector-0" Mar 08 04:18:15.429370 master-0 kubenswrapper[18592]: I0308 04:18:15.429358 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6936bf71-3ad4-47e9-8df6-9075d83086db-public-tls-certs\") pod \"ironic-inspector-0\" (UID: \"6936bf71-3ad4-47e9-8df6-9075d83086db\") " pod="openstack/ironic-inspector-0" Mar 08 04:18:15.429438 master-0 kubenswrapper[18592]: I0308 04:18:15.429427 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6936bf71-3ad4-47e9-8df6-9075d83086db-config\") pod \"ironic-inspector-0\" (UID: \"6936bf71-3ad4-47e9-8df6-9075d83086db\") " pod="openstack/ironic-inspector-0" Mar 08 04:18:15.504857 master-0 kubenswrapper[18592]: I0308 04:18:15.502109 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-afe2b-default-external-api-0" Mar 08 04:18:15.532913 master-0 kubenswrapper[18592]: I0308 04:18:15.531157 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6936bf71-3ad4-47e9-8df6-9075d83086db-public-tls-certs\") pod \"ironic-inspector-0\" (UID: \"6936bf71-3ad4-47e9-8df6-9075d83086db\") " pod="openstack/ironic-inspector-0" Mar 08 04:18:15.532913 master-0 kubenswrapper[18592]: I0308 04:18:15.531200 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6936bf71-3ad4-47e9-8df6-9075d83086db-config\") pod \"ironic-inspector-0\" (UID: \"6936bf71-3ad4-47e9-8df6-9075d83086db\") " pod="openstack/ironic-inspector-0" Mar 08 04:18:15.532913 master-0 kubenswrapper[18592]: I0308 04:18:15.531240 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" 
(UniqueName: \"kubernetes.io/downward-api/6936bf71-3ad4-47e9-8df6-9075d83086db-etc-podinfo\") pod \"ironic-inspector-0\" (UID: \"6936bf71-3ad4-47e9-8df6-9075d83086db\") " pod="openstack/ironic-inspector-0" Mar 08 04:18:15.532913 master-0 kubenswrapper[18592]: I0308 04:18:15.531273 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6936bf71-3ad4-47e9-8df6-9075d83086db-scripts\") pod \"ironic-inspector-0\" (UID: \"6936bf71-3ad4-47e9-8df6-9075d83086db\") " pod="openstack/ironic-inspector-0" Mar 08 04:18:15.532913 master-0 kubenswrapper[18592]: I0308 04:18:15.531327 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/6936bf71-3ad4-47e9-8df6-9075d83086db-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-0\" (UID: \"6936bf71-3ad4-47e9-8df6-9075d83086db\") " pod="openstack/ironic-inspector-0" Mar 08 04:18:15.532913 master-0 kubenswrapper[18592]: I0308 04:18:15.531369 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/6936bf71-3ad4-47e9-8df6-9075d83086db-var-lib-ironic\") pod \"ironic-inspector-0\" (UID: \"6936bf71-3ad4-47e9-8df6-9075d83086db\") " pod="openstack/ironic-inspector-0" Mar 08 04:18:15.532913 master-0 kubenswrapper[18592]: I0308 04:18:15.531433 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44gnn\" (UniqueName: \"kubernetes.io/projected/6936bf71-3ad4-47e9-8df6-9075d83086db-kube-api-access-44gnn\") pod \"ironic-inspector-0\" (UID: \"6936bf71-3ad4-47e9-8df6-9075d83086db\") " pod="openstack/ironic-inspector-0" Mar 08 04:18:15.532913 master-0 kubenswrapper[18592]: I0308 04:18:15.531495 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6936bf71-3ad4-47e9-8df6-9075d83086db-combined-ca-bundle\") pod \"ironic-inspector-0\" (UID: \"6936bf71-3ad4-47e9-8df6-9075d83086db\") " pod="openstack/ironic-inspector-0" Mar 08 04:18:15.532913 master-0 kubenswrapper[18592]: I0308 04:18:15.531532 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6936bf71-3ad4-47e9-8df6-9075d83086db-internal-tls-certs\") pod \"ironic-inspector-0\" (UID: \"6936bf71-3ad4-47e9-8df6-9075d83086db\") " pod="openstack/ironic-inspector-0" Mar 08 04:18:15.532913 master-0 kubenswrapper[18592]: I0308 04:18:15.532722 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/6936bf71-3ad4-47e9-8df6-9075d83086db-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-0\" (UID: \"6936bf71-3ad4-47e9-8df6-9075d83086db\") " pod="openstack/ironic-inspector-0" Mar 08 04:18:15.536424 master-0 kubenswrapper[18592]: I0308 04:18:15.534231 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/6936bf71-3ad4-47e9-8df6-9075d83086db-var-lib-ironic\") pod \"ironic-inspector-0\" (UID: \"6936bf71-3ad4-47e9-8df6-9075d83086db\") " pod="openstack/ironic-inspector-0" Mar 08 04:18:15.536424 master-0 kubenswrapper[18592]: I0308 04:18:15.536370 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6936bf71-3ad4-47e9-8df6-9075d83086db-public-tls-certs\") pod \"ironic-inspector-0\" (UID: \"6936bf71-3ad4-47e9-8df6-9075d83086db\") " pod="openstack/ironic-inspector-0" Mar 08 04:18:15.537805 master-0 kubenswrapper[18592]: I0308 04:18:15.537758 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6936bf71-3ad4-47e9-8df6-9075d83086db-internal-tls-certs\") pod \"ironic-inspector-0\" (UID: \"6936bf71-3ad4-47e9-8df6-9075d83086db\") " pod="openstack/ironic-inspector-0" Mar 08 04:18:15.538949 master-0 kubenswrapper[18592]: I0308 04:18:15.538917 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/6936bf71-3ad4-47e9-8df6-9075d83086db-config\") pod \"ironic-inspector-0\" (UID: \"6936bf71-3ad4-47e9-8df6-9075d83086db\") " pod="openstack/ironic-inspector-0" Mar 08 04:18:15.542492 master-0 kubenswrapper[18592]: I0308 04:18:15.542446 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/6936bf71-3ad4-47e9-8df6-9075d83086db-etc-podinfo\") pod \"ironic-inspector-0\" (UID: \"6936bf71-3ad4-47e9-8df6-9075d83086db\") " pod="openstack/ironic-inspector-0" Mar 08 04:18:15.542793 master-0 kubenswrapper[18592]: I0308 04:18:15.542748 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6936bf71-3ad4-47e9-8df6-9075d83086db-combined-ca-bundle\") pod \"ironic-inspector-0\" (UID: \"6936bf71-3ad4-47e9-8df6-9075d83086db\") " pod="openstack/ironic-inspector-0" Mar 08 04:18:15.545152 master-0 kubenswrapper[18592]: I0308 04:18:15.545111 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6936bf71-3ad4-47e9-8df6-9075d83086db-scripts\") pod \"ironic-inspector-0\" (UID: \"6936bf71-3ad4-47e9-8df6-9075d83086db\") " pod="openstack/ironic-inspector-0" Mar 08 04:18:15.550236 master-0 kubenswrapper[18592]: I0308 04:18:15.550185 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44gnn\" (UniqueName: \"kubernetes.io/projected/6936bf71-3ad4-47e9-8df6-9075d83086db-kube-api-access-44gnn\") pod \"ironic-inspector-0\" (UID: 
\"6936bf71-3ad4-47e9-8df6-9075d83086db\") " pod="openstack/ironic-inspector-0" Mar 08 04:18:15.660085 master-0 kubenswrapper[18592]: I0308 04:18:15.660027 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-0" Mar 08 04:18:15.953519 master-0 kubenswrapper[18592]: I0308 04:18:15.953404 18592 generic.go:334] "Generic (PLEG): container finished" podID="9c77d97d-417a-48b7-a871-6800b12fbcb7" containerID="921d18396cd77485dd40de28e9920f77cdd014dc6f721b25d815f0d811d8d65b" exitCode=143 Mar 08 04:18:15.954021 master-0 kubenswrapper[18592]: I0308 04:18:15.953991 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-57f9f57fc6-pfwg4" event={"ID":"9c77d97d-417a-48b7-a871-6800b12fbcb7","Type":"ContainerDied","Data":"921d18396cd77485dd40de28e9920f77cdd014dc6f721b25d815f0d811d8d65b"} Mar 08 04:18:15.960263 master-0 kubenswrapper[18592]: I0308 04:18:15.960216 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ff301-api-0" event={"ID":"588d2fd8-2c47-44e1-b3d9-d1f95c7f1616","Type":"ContainerStarted","Data":"146c41bdd73306ae3b7eee2b6945e665f633fd9e334337adcc086c85adeaf264"} Mar 08 04:18:15.960609 master-0 kubenswrapper[18592]: I0308 04:18:15.960577 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-ff301-api-0" Mar 08 04:18:15.998129 master-0 kubenswrapper[18592]: I0308 04:18:15.996636 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-ff301-api-0" podStartSLOduration=4.9966107619999995 podStartE2EDuration="4.996610762s" podCreationTimestamp="2026-03-08 04:18:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 04:18:15.988188473 +0000 UTC m=+1508.086942823" watchObservedRunningTime="2026-03-08 04:18:15.996610762 +0000 UTC m=+1508.095365112" Mar 08 04:18:16.145062 master-0 kubenswrapper[18592]: 
I0308 04:18:16.144919 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-9bbcd16a-3cea-4572-8bc5-480804def335\" (UniqueName: \"kubernetes.io/csi/topolvm.io^613c9bf7-76cc-44ce-8d9f-8cdae5e6db9e\") pod \"glance-afe2b-default-internal-api-0\" (UID: \"7ff800ca-3f34-40c3-a4d1-329fe69e0c3a\") " pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:18:16.149506 master-0 kubenswrapper[18592]: I0308 04:18:16.149200 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-afe2b-default-internal-api-0" Mar 08 04:18:16.244837 master-0 kubenswrapper[18592]: I0308 04:18:16.239943 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d" path="/var/lib/kubelet/pods/3dd3d4c3-c2d6-4ba5-8ea4-a020a22dbb2d/volumes" Mar 08 04:18:16.244837 master-0 kubenswrapper[18592]: I0308 04:18:16.240640 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-afe2b-default-external-api-0"] Mar 08 04:18:16.353086 master-0 kubenswrapper[18592]: I0308 04:18:16.346367 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-0"] Mar 08 04:18:16.400686 master-0 kubenswrapper[18592]: I0308 04:18:16.398561 18592 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-f75d-account-create-update-fxddp" Mar 08 04:18:16.467514 master-0 kubenswrapper[18592]: I0308 04:18:16.467398 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2km7f\" (UniqueName: \"kubernetes.io/projected/6a5fc841-ddfd-4704-9a4c-878bcbb98bcc-kube-api-access-2km7f\") pod \"6a5fc841-ddfd-4704-9a4c-878bcbb98bcc\" (UID: \"6a5fc841-ddfd-4704-9a4c-878bcbb98bcc\") " Mar 08 04:18:16.467688 master-0 kubenswrapper[18592]: I0308 04:18:16.467585 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6a5fc841-ddfd-4704-9a4c-878bcbb98bcc-operator-scripts\") pod \"6a5fc841-ddfd-4704-9a4c-878bcbb98bcc\" (UID: \"6a5fc841-ddfd-4704-9a4c-878bcbb98bcc\") " Mar 08 04:18:16.468747 master-0 kubenswrapper[18592]: I0308 04:18:16.468592 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a5fc841-ddfd-4704-9a4c-878bcbb98bcc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6a5fc841-ddfd-4704-9a4c-878bcbb98bcc" (UID: "6a5fc841-ddfd-4704-9a4c-878bcbb98bcc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:18:16.502334 master-0 kubenswrapper[18592]: I0308 04:18:16.501229 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a5fc841-ddfd-4704-9a4c-878bcbb98bcc-kube-api-access-2km7f" (OuterVolumeSpecName: "kube-api-access-2km7f") pod "6a5fc841-ddfd-4704-9a4c-878bcbb98bcc" (UID: "6a5fc841-ddfd-4704-9a4c-878bcbb98bcc"). InnerVolumeSpecName "kube-api-access-2km7f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 04:18:16.572405 master-0 kubenswrapper[18592]: I0308 04:18:16.572234 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2km7f\" (UniqueName: \"kubernetes.io/projected/6a5fc841-ddfd-4704-9a4c-878bcbb98bcc-kube-api-access-2km7f\") on node \"master-0\" DevicePath \"\"" Mar 08 04:18:16.572405 master-0 kubenswrapper[18592]: I0308 04:18:16.572305 18592 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6a5fc841-ddfd-4704-9a4c-878bcbb98bcc-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 04:18:16.805657 master-0 kubenswrapper[18592]: I0308 04:18:16.798131 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-6jgd8" Mar 08 04:18:16.805657 master-0 kubenswrapper[18592]: I0308 04:18:16.801550 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-qrztf" Mar 08 04:18:16.886101 master-0 kubenswrapper[18592]: I0308 04:18:16.882458 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c678d7e-e3ea-40d5-b265-cf42ac1139c6-operator-scripts\") pod \"7c678d7e-e3ea-40d5-b265-cf42ac1139c6\" (UID: \"7c678d7e-e3ea-40d5-b265-cf42ac1139c6\") " Mar 08 04:18:16.886101 master-0 kubenswrapper[18592]: I0308 04:18:16.882595 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7j22z\" (UniqueName: \"kubernetes.io/projected/f2a5661f-15ae-4df4-ab1f-8539afd4d339-kube-api-access-7j22z\") pod \"f2a5661f-15ae-4df4-ab1f-8539afd4d339\" (UID: \"f2a5661f-15ae-4df4-ab1f-8539afd4d339\") " Mar 08 04:18:16.886101 master-0 kubenswrapper[18592]: I0308 04:18:16.882841 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/f2a5661f-15ae-4df4-ab1f-8539afd4d339-operator-scripts\") pod \"f2a5661f-15ae-4df4-ab1f-8539afd4d339\" (UID: \"f2a5661f-15ae-4df4-ab1f-8539afd4d339\") " Mar 08 04:18:16.886101 master-0 kubenswrapper[18592]: I0308 04:18:16.882872 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqmq9\" (UniqueName: \"kubernetes.io/projected/7c678d7e-e3ea-40d5-b265-cf42ac1139c6-kube-api-access-lqmq9\") pod \"7c678d7e-e3ea-40d5-b265-cf42ac1139c6\" (UID: \"7c678d7e-e3ea-40d5-b265-cf42ac1139c6\") " Mar 08 04:18:16.886101 master-0 kubenswrapper[18592]: I0308 04:18:16.883907 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2a5661f-15ae-4df4-ab1f-8539afd4d339-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f2a5661f-15ae-4df4-ab1f-8539afd4d339" (UID: "f2a5661f-15ae-4df4-ab1f-8539afd4d339"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:18:16.886101 master-0 kubenswrapper[18592]: I0308 04:18:16.884072 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c678d7e-e3ea-40d5-b265-cf42ac1139c6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7c678d7e-e3ea-40d5-b265-cf42ac1139c6" (UID: "7c678d7e-e3ea-40d5-b265-cf42ac1139c6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:18:16.891040 master-0 kubenswrapper[18592]: I0308 04:18:16.890929 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2a5661f-15ae-4df4-ab1f-8539afd4d339-kube-api-access-7j22z" (OuterVolumeSpecName: "kube-api-access-7j22z") pod "f2a5661f-15ae-4df4-ab1f-8539afd4d339" (UID: "f2a5661f-15ae-4df4-ab1f-8539afd4d339"). InnerVolumeSpecName "kube-api-access-7j22z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 04:18:16.895598 master-0 kubenswrapper[18592]: I0308 04:18:16.895537 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c678d7e-e3ea-40d5-b265-cf42ac1139c6-kube-api-access-lqmq9" (OuterVolumeSpecName: "kube-api-access-lqmq9") pod "7c678d7e-e3ea-40d5-b265-cf42ac1139c6" (UID: "7c678d7e-e3ea-40d5-b265-cf42ac1139c6"). InnerVolumeSpecName "kube-api-access-lqmq9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 04:18:16.987865 master-0 kubenswrapper[18592]: I0308 04:18:16.987801 18592 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2a5661f-15ae-4df4-ab1f-8539afd4d339-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 04:18:16.988453 master-0 kubenswrapper[18592]: I0308 04:18:16.988433 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqmq9\" (UniqueName: \"kubernetes.io/projected/7c678d7e-e3ea-40d5-b265-cf42ac1139c6-kube-api-access-lqmq9\") on node \"master-0\" DevicePath \"\"" Mar 08 04:18:16.988548 master-0 kubenswrapper[18592]: I0308 04:18:16.988534 18592 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c678d7e-e3ea-40d5-b265-cf42ac1139c6-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 04:18:16.988633 master-0 kubenswrapper[18592]: I0308 04:18:16.988619 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7j22z\" (UniqueName: \"kubernetes.io/projected/f2a5661f-15ae-4df4-ab1f-8539afd4d339-kube-api-access-7j22z\") on node \"master-0\" DevicePath \"\"" Mar 08 04:18:17.092075 master-0 kubenswrapper[18592]: I0308 04:18:17.091630 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-6jgd8" 
event={"ID":"f2a5661f-15ae-4df4-ab1f-8539afd4d339","Type":"ContainerDied","Data":"01b43a434a0b3b6ca898483c325e86d85f9584e1279ccba2667f71198698f23f"} Mar 08 04:18:17.092075 master-0 kubenswrapper[18592]: I0308 04:18:17.091682 18592 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01b43a434a0b3b6ca898483c325e86d85f9584e1279ccba2667f71198698f23f" Mar 08 04:18:17.092075 master-0 kubenswrapper[18592]: I0308 04:18:17.091770 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-6jgd8" Mar 08 04:18:17.127639 master-0 kubenswrapper[18592]: I0308 04:18:17.127488 18592 generic.go:334] "Generic (PLEG): container finished" podID="6936bf71-3ad4-47e9-8df6-9075d83086db" containerID="c6aee6364ae89a7ec913186f391bca2e71e56afd4570406d72083350409f713d" exitCode=0 Mar 08 04:18:17.127639 master-0 kubenswrapper[18592]: I0308 04:18:17.127639 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"6936bf71-3ad4-47e9-8df6-9075d83086db","Type":"ContainerDied","Data":"c6aee6364ae89a7ec913186f391bca2e71e56afd4570406d72083350409f713d"} Mar 08 04:18:17.127988 master-0 kubenswrapper[18592]: I0308 04:18:17.127668 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"6936bf71-3ad4-47e9-8df6-9075d83086db","Type":"ContainerStarted","Data":"0c7745b41a584adce4694183c951863db2cacae149f27061669f64bba1331828"} Mar 08 04:18:17.170869 master-0 kubenswrapper[18592]: I0308 04:18:17.168671 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-afe2b-default-external-api-0" event={"ID":"b1f5ab90-a4e1-47d7-9d79-22cc65e4295d","Type":"ContainerStarted","Data":"f4579a240ae9ae834e94c1b4215d5fefa358db679d9b4a6dd3617001c65d5bd0"} Mar 08 04:18:17.177308 master-0 kubenswrapper[18592]: I0308 04:18:17.177251 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-qrztf" 
event={"ID":"7c678d7e-e3ea-40d5-b265-cf42ac1139c6","Type":"ContainerDied","Data":"486cc29f84f61fa9cb79dc47d04d562c693dd43408de31003c55a9747a0be8b6"} Mar 08 04:18:17.177308 master-0 kubenswrapper[18592]: I0308 04:18:17.177310 18592 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="486cc29f84f61fa9cb79dc47d04d562c693dd43408de31003c55a9747a0be8b6" Mar 08 04:18:17.177572 master-0 kubenswrapper[18592]: I0308 04:18:17.177315 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-qrztf" Mar 08 04:18:17.181108 master-0 kubenswrapper[18592]: I0308 04:18:17.180749 18592 generic.go:334] "Generic (PLEG): container finished" podID="abe912ba-4a33-4634-a3fb-b6fb09b38d8e" containerID="b8a8913f0f4e234c799b2e4a7219880168db2e3381aa36a6d86a2dd1d0f4d190" exitCode=0 Mar 08 04:18:17.181108 master-0 kubenswrapper[18592]: I0308 04:18:17.180871 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"abe912ba-4a33-4634-a3fb-b6fb09b38d8e","Type":"ContainerDied","Data":"b8a8913f0f4e234c799b2e4a7219880168db2e3381aa36a6d86a2dd1d0f4d190"} Mar 08 04:18:17.188714 master-0 kubenswrapper[18592]: I0308 04:18:17.188361 18592 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-f75d-account-create-update-fxddp" Mar 08 04:18:17.188714 master-0 kubenswrapper[18592]: I0308 04:18:17.188664 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-f75d-account-create-update-fxddp" event={"ID":"6a5fc841-ddfd-4704-9a4c-878bcbb98bcc","Type":"ContainerDied","Data":"c90e11f982fc4650194b8d8ad4c966d088303c068b5708a70ac534b4391e8989"} Mar 08 04:18:17.188714 master-0 kubenswrapper[18592]: I0308 04:18:17.188711 18592 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c90e11f982fc4650194b8d8ad4c966d088303c068b5708a70ac534b4391e8989" Mar 08 04:18:17.229337 master-0 kubenswrapper[18592]: I0308 04:18:17.229279 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-afe2b-default-internal-api-0"] Mar 08 04:18:17.236952 master-0 kubenswrapper[18592]: I0308 04:18:17.236919 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-2306-account-create-update-c95sx" Mar 08 04:18:17.276337 master-0 kubenswrapper[18592]: I0308 04:18:17.261192 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-e145-account-create-update-nccdt" Mar 08 04:18:17.287040 master-0 kubenswrapper[18592]: I0308 04:18:17.285886 18592 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-f4tg6" Mar 08 04:18:17.432665 master-0 kubenswrapper[18592]: I0308 04:18:17.432347 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzpv4\" (UniqueName: \"kubernetes.io/projected/955ce42c-5d68-4659-a993-85d566eb7c0c-kube-api-access-pzpv4\") pod \"955ce42c-5d68-4659-a993-85d566eb7c0c\" (UID: \"955ce42c-5d68-4659-a993-85d566eb7c0c\") " Mar 08 04:18:17.432665 master-0 kubenswrapper[18592]: I0308 04:18:17.432444 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwl6w\" (UniqueName: \"kubernetes.io/projected/3fefd13e-e178-4790-823e-458456886a84-kube-api-access-qwl6w\") pod \"3fefd13e-e178-4790-823e-458456886a84\" (UID: \"3fefd13e-e178-4790-823e-458456886a84\") " Mar 08 04:18:17.432665 master-0 kubenswrapper[18592]: I0308 04:18:17.432468 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3fefd13e-e178-4790-823e-458456886a84-operator-scripts\") pod \"3fefd13e-e178-4790-823e-458456886a84\" (UID: \"3fefd13e-e178-4790-823e-458456886a84\") " Mar 08 04:18:17.432665 master-0 kubenswrapper[18592]: I0308 04:18:17.432598 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sj5dq\" (UniqueName: \"kubernetes.io/projected/bafa1006-fce0-4733-9706-a4de6df10ac7-kube-api-access-sj5dq\") pod \"bafa1006-fce0-4733-9706-a4de6df10ac7\" (UID: \"bafa1006-fce0-4733-9706-a4de6df10ac7\") " Mar 08 04:18:17.432665 master-0 kubenswrapper[18592]: I0308 04:18:17.432628 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bafa1006-fce0-4733-9706-a4de6df10ac7-operator-scripts\") pod \"bafa1006-fce0-4733-9706-a4de6df10ac7\" (UID: \"bafa1006-fce0-4733-9706-a4de6df10ac7\") " Mar 08 04:18:17.433012 master-0 
kubenswrapper[18592]: I0308 04:18:17.432684 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/955ce42c-5d68-4659-a993-85d566eb7c0c-operator-scripts\") pod \"955ce42c-5d68-4659-a993-85d566eb7c0c\" (UID: \"955ce42c-5d68-4659-a993-85d566eb7c0c\") " Mar 08 04:18:17.433678 master-0 kubenswrapper[18592]: I0308 04:18:17.433605 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fefd13e-e178-4790-823e-458456886a84-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3fefd13e-e178-4790-823e-458456886a84" (UID: "3fefd13e-e178-4790-823e-458456886a84"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:18:17.433861 master-0 kubenswrapper[18592]: I0308 04:18:17.433757 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/955ce42c-5d68-4659-a993-85d566eb7c0c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "955ce42c-5d68-4659-a993-85d566eb7c0c" (UID: "955ce42c-5d68-4659-a993-85d566eb7c0c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:18:17.436619 master-0 kubenswrapper[18592]: I0308 04:18:17.435578 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/955ce42c-5d68-4659-a993-85d566eb7c0c-kube-api-access-pzpv4" (OuterVolumeSpecName: "kube-api-access-pzpv4") pod "955ce42c-5d68-4659-a993-85d566eb7c0c" (UID: "955ce42c-5d68-4659-a993-85d566eb7c0c"). InnerVolumeSpecName "kube-api-access-pzpv4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 04:18:17.437254 master-0 kubenswrapper[18592]: I0308 04:18:17.437212 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bafa1006-fce0-4733-9706-a4de6df10ac7-kube-api-access-sj5dq" (OuterVolumeSpecName: "kube-api-access-sj5dq") pod "bafa1006-fce0-4733-9706-a4de6df10ac7" (UID: "bafa1006-fce0-4733-9706-a4de6df10ac7"). InnerVolumeSpecName "kube-api-access-sj5dq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 04:18:17.437903 master-0 kubenswrapper[18592]: I0308 04:18:17.437849 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bafa1006-fce0-4733-9706-a4de6df10ac7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bafa1006-fce0-4733-9706-a4de6df10ac7" (UID: "bafa1006-fce0-4733-9706-a4de6df10ac7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:18:17.441229 master-0 kubenswrapper[18592]: I0308 04:18:17.441171 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fefd13e-e178-4790-823e-458456886a84-kube-api-access-qwl6w" (OuterVolumeSpecName: "kube-api-access-qwl6w") pod "3fefd13e-e178-4790-823e-458456886a84" (UID: "3fefd13e-e178-4790-823e-458456886a84"). InnerVolumeSpecName "kube-api-access-qwl6w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 04:18:17.536588 master-0 kubenswrapper[18592]: I0308 04:18:17.535926 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzpv4\" (UniqueName: \"kubernetes.io/projected/955ce42c-5d68-4659-a993-85d566eb7c0c-kube-api-access-pzpv4\") on node \"master-0\" DevicePath \"\"" Mar 08 04:18:17.543123 master-0 kubenswrapper[18592]: I0308 04:18:17.542783 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwl6w\" (UniqueName: \"kubernetes.io/projected/3fefd13e-e178-4790-823e-458456886a84-kube-api-access-qwl6w\") on node \"master-0\" DevicePath \"\"" Mar 08 04:18:17.543123 master-0 kubenswrapper[18592]: I0308 04:18:17.543132 18592 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3fefd13e-e178-4790-823e-458456886a84-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 04:18:17.543383 master-0 kubenswrapper[18592]: I0308 04:18:17.543150 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sj5dq\" (UniqueName: \"kubernetes.io/projected/bafa1006-fce0-4733-9706-a4de6df10ac7-kube-api-access-sj5dq\") on node \"master-0\" DevicePath \"\"" Mar 08 04:18:17.543383 master-0 kubenswrapper[18592]: I0308 04:18:17.543162 18592 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bafa1006-fce0-4733-9706-a4de6df10ac7-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 04:18:17.543383 master-0 kubenswrapper[18592]: I0308 04:18:17.543174 18592 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/955ce42c-5d68-4659-a993-85d566eb7c0c-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 04:18:18.221560 master-0 kubenswrapper[18592]: I0308 04:18:18.221507 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-afe2b-default-internal-api-0" event={"ID":"7ff800ca-3f34-40c3-a4d1-329fe69e0c3a","Type":"ContainerStarted","Data":"beabb2af4bb48453261185bda728d506cec928a2bcebef58af15f7810d86263a"} Mar 08 04:18:18.221560 master-0 kubenswrapper[18592]: I0308 04:18:18.221558 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-afe2b-default-internal-api-0" event={"ID":"7ff800ca-3f34-40c3-a4d1-329fe69e0c3a","Type":"ContainerStarted","Data":"1af0f4aefa4679522475eaaa3e2ef2b979057afa3504bd45ba67f42a4956c15e"} Mar 08 04:18:18.227418 master-0 kubenswrapper[18592]: I0308 04:18:18.226967 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-e145-account-create-update-nccdt" Mar 08 04:18:18.227578 master-0 kubenswrapper[18592]: I0308 04:18:18.227551 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-e145-account-create-update-nccdt" event={"ID":"955ce42c-5d68-4659-a993-85d566eb7c0c","Type":"ContainerDied","Data":"775808f9a1284a8c4fc8300ddf5a19e25df26151f4468c426b1edd1f947b26eb"} Mar 08 04:18:18.227639 master-0 kubenswrapper[18592]: I0308 04:18:18.227582 18592 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="775808f9a1284a8c4fc8300ddf5a19e25df26151f4468c426b1edd1f947b26eb" Mar 08 04:18:18.231908 master-0 kubenswrapper[18592]: I0308 04:18:18.231844 18592 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-2306-account-create-update-c95sx" Mar 08 04:18:18.232461 master-0 kubenswrapper[18592]: I0308 04:18:18.231832 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-2306-account-create-update-c95sx" event={"ID":"3fefd13e-e178-4790-823e-458456886a84","Type":"ContainerDied","Data":"0e5fb03d1b4b6e2402a387859a530ac2182eb8b50593092aff1fa76afe8555e5"} Mar 08 04:18:18.232461 master-0 kubenswrapper[18592]: I0308 04:18:18.232460 18592 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e5fb03d1b4b6e2402a387859a530ac2182eb8b50593092aff1fa76afe8555e5" Mar 08 04:18:18.235925 master-0 kubenswrapper[18592]: I0308 04:18:18.235876 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-f4tg6" event={"ID":"bafa1006-fce0-4733-9706-a4de6df10ac7","Type":"ContainerDied","Data":"3b658c1400db45e6a5d3b6f5b1f36b1efb3f5efc89b4df81fda23f6b1d4a0fd2"} Mar 08 04:18:18.235925 master-0 kubenswrapper[18592]: I0308 04:18:18.235918 18592 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b658c1400db45e6a5d3b6f5b1f36b1efb3f5efc89b4df81fda23f6b1d4a0fd2" Mar 08 04:18:18.236027 master-0 kubenswrapper[18592]: I0308 04:18:18.235970 18592 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-f4tg6" Mar 08 04:18:18.245165 master-0 kubenswrapper[18592]: I0308 04:18:18.245130 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-afe2b-default-external-api-0" event={"ID":"b1f5ab90-a4e1-47d7-9d79-22cc65e4295d","Type":"ContainerStarted","Data":"1d4e61387181181bf2303361396f1d2d82661bb197d0a2ae88fc58b77ef7a3d1"} Mar 08 04:18:18.253655 master-0 kubenswrapper[18592]: I0308 04:18:18.253097 18592 generic.go:334] "Generic (PLEG): container finished" podID="05ba9b98-7d2f-4a9b-80ad-60793d8279e8" containerID="1988a339b07b0c491c35385954db8882edaee30eeede55407a4d33cef13b5291" exitCode=1 Mar 08 04:18:18.253655 master-0 kubenswrapper[18592]: I0308 04:18:18.253160 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-859d47fc89-z2wvz" event={"ID":"05ba9b98-7d2f-4a9b-80ad-60793d8279e8","Type":"ContainerDied","Data":"1988a339b07b0c491c35385954db8882edaee30eeede55407a4d33cef13b5291"} Mar 08 04:18:18.253655 master-0 kubenswrapper[18592]: I0308 04:18:18.253208 18592 scope.go:117] "RemoveContainer" containerID="ecb55f322362dccbbff792a642555df63b70ceae0b4b089367e3d027710563b8" Mar 08 04:18:18.254415 master-0 kubenswrapper[18592]: I0308 04:18:18.253951 18592 scope.go:117] "RemoveContainer" containerID="1988a339b07b0c491c35385954db8882edaee30eeede55407a4d33cef13b5291" Mar 08 04:18:18.254415 master-0 kubenswrapper[18592]: E0308 04:18:18.254200 18592 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-neutron-agent\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ironic-neutron-agent pod=ironic-neutron-agent-859d47fc89-z2wvz_openstack(05ba9b98-7d2f-4a9b-80ad-60793d8279e8)\"" pod="openstack/ironic-neutron-agent-859d47fc89-z2wvz" podUID="05ba9b98-7d2f-4a9b-80ad-60793d8279e8" Mar 08 04:18:18.520482 master-0 kubenswrapper[18592]: I0308 04:18:18.520371 18592 kubelet.go:2542] "SyncLoop (probe)" 
probe="liveness" status="unhealthy" pod="openstack/ironic-neutron-agent-859d47fc89-z2wvz" Mar 08 04:18:18.621338 master-0 kubenswrapper[18592]: I0308 04:18:18.621287 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-78b99bbb9f-4bv2m" Mar 08 04:18:19.248321 master-0 kubenswrapper[18592]: I0308 04:18:19.248272 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-57f9f57fc6-pfwg4" Mar 08 04:18:19.281916 master-0 kubenswrapper[18592]: I0308 04:18:19.275918 18592 scope.go:117] "RemoveContainer" containerID="1988a339b07b0c491c35385954db8882edaee30eeede55407a4d33cef13b5291" Mar 08 04:18:19.281916 master-0 kubenswrapper[18592]: E0308 04:18:19.276249 18592 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-neutron-agent\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ironic-neutron-agent pod=ironic-neutron-agent-859d47fc89-z2wvz_openstack(05ba9b98-7d2f-4a9b-80ad-60793d8279e8)\"" pod="openstack/ironic-neutron-agent-859d47fc89-z2wvz" podUID="05ba9b98-7d2f-4a9b-80ad-60793d8279e8" Mar 08 04:18:19.281916 master-0 kubenswrapper[18592]: I0308 04:18:19.277420 18592 generic.go:334] "Generic (PLEG): container finished" podID="9c77d97d-417a-48b7-a871-6800b12fbcb7" containerID="0f947b349b8be56666ad7c86e4264fb8308fb6f43e75ff44640009b2c1c98fbf" exitCode=0 Mar 08 04:18:19.281916 master-0 kubenswrapper[18592]: I0308 04:18:19.277474 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-57f9f57fc6-pfwg4" event={"ID":"9c77d97d-417a-48b7-a871-6800b12fbcb7","Type":"ContainerDied","Data":"0f947b349b8be56666ad7c86e4264fb8308fb6f43e75ff44640009b2c1c98fbf"} Mar 08 04:18:19.281916 master-0 kubenswrapper[18592]: I0308 04:18:19.277505 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-57f9f57fc6-pfwg4" 
event={"ID":"9c77d97d-417a-48b7-a871-6800b12fbcb7","Type":"ContainerDied","Data":"e60e5c23a54b956e63f7848ad5385e7127ffdf922977750f5b2a5d49ffe3bc43"} Mar 08 04:18:19.281916 master-0 kubenswrapper[18592]: I0308 04:18:19.277524 18592 scope.go:117] "RemoveContainer" containerID="0f947b349b8be56666ad7c86e4264fb8308fb6f43e75ff44640009b2c1c98fbf" Mar 08 04:18:19.281916 master-0 kubenswrapper[18592]: I0308 04:18:19.277636 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-57f9f57fc6-pfwg4" Mar 08 04:18:19.287854 master-0 kubenswrapper[18592]: I0308 04:18:19.287130 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-afe2b-default-internal-api-0" event={"ID":"7ff800ca-3f34-40c3-a4d1-329fe69e0c3a","Type":"ContainerStarted","Data":"7d70e7207e46884be79b7f202777058308a93d61dd7490061f8d8164d0fa2585"} Mar 08 04:18:19.294338 master-0 kubenswrapper[18592]: I0308 04:18:19.294286 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-afe2b-default-external-api-0" event={"ID":"b1f5ab90-a4e1-47d7-9d79-22cc65e4295d","Type":"ContainerStarted","Data":"262f3202dfdf2ac45ae90b2cbccfad9c2ad0bcff97d0efec62f24f3b9b48a333"} Mar 08 04:18:19.340538 master-0 kubenswrapper[18592]: I0308 04:18:19.340439 18592 scope.go:117] "RemoveContainer" containerID="921d18396cd77485dd40de28e9920f77cdd014dc6f721b25d815f0d811d8d65b" Mar 08 04:18:19.394245 master-0 kubenswrapper[18592]: I0308 04:18:19.394180 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlbnw\" (UniqueName: \"kubernetes.io/projected/9c77d97d-417a-48b7-a871-6800b12fbcb7-kube-api-access-rlbnw\") pod \"9c77d97d-417a-48b7-a871-6800b12fbcb7\" (UID: \"9c77d97d-417a-48b7-a871-6800b12fbcb7\") " Mar 08 04:18:19.394245 master-0 kubenswrapper[18592]: I0308 04:18:19.394242 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9c77d97d-417a-48b7-a871-6800b12fbcb7-combined-ca-bundle\") pod \"9c77d97d-417a-48b7-a871-6800b12fbcb7\" (UID: \"9c77d97d-417a-48b7-a871-6800b12fbcb7\") " Mar 08 04:18:19.394495 master-0 kubenswrapper[18592]: I0308 04:18:19.394271 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c77d97d-417a-48b7-a871-6800b12fbcb7-scripts\") pod \"9c77d97d-417a-48b7-a871-6800b12fbcb7\" (UID: \"9c77d97d-417a-48b7-a871-6800b12fbcb7\") " Mar 08 04:18:19.394495 master-0 kubenswrapper[18592]: I0308 04:18:19.394364 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c77d97d-417a-48b7-a871-6800b12fbcb7-internal-tls-certs\") pod \"9c77d97d-417a-48b7-a871-6800b12fbcb7\" (UID: \"9c77d97d-417a-48b7-a871-6800b12fbcb7\") " Mar 08 04:18:19.394495 master-0 kubenswrapper[18592]: I0308 04:18:19.394388 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c77d97d-417a-48b7-a871-6800b12fbcb7-logs\") pod \"9c77d97d-417a-48b7-a871-6800b12fbcb7\" (UID: \"9c77d97d-417a-48b7-a871-6800b12fbcb7\") " Mar 08 04:18:19.394495 master-0 kubenswrapper[18592]: I0308 04:18:19.394468 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c77d97d-417a-48b7-a871-6800b12fbcb7-config-data\") pod \"9c77d97d-417a-48b7-a871-6800b12fbcb7\" (UID: \"9c77d97d-417a-48b7-a871-6800b12fbcb7\") " Mar 08 04:18:19.394612 master-0 kubenswrapper[18592]: I0308 04:18:19.394508 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c77d97d-417a-48b7-a871-6800b12fbcb7-public-tls-certs\") pod \"9c77d97d-417a-48b7-a871-6800b12fbcb7\" (UID: \"9c77d97d-417a-48b7-a871-6800b12fbcb7\") " Mar 08 04:18:19.395403 
master-0 kubenswrapper[18592]: I0308 04:18:19.395276 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c77d97d-417a-48b7-a871-6800b12fbcb7-logs" (OuterVolumeSpecName: "logs") pod "9c77d97d-417a-48b7-a871-6800b12fbcb7" (UID: "9c77d97d-417a-48b7-a871-6800b12fbcb7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 04:18:19.399563 master-0 kubenswrapper[18592]: I0308 04:18:19.399518 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c77d97d-417a-48b7-a871-6800b12fbcb7-scripts" (OuterVolumeSpecName: "scripts") pod "9c77d97d-417a-48b7-a871-6800b12fbcb7" (UID: "9c77d97d-417a-48b7-a871-6800b12fbcb7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:18:19.402275 master-0 kubenswrapper[18592]: I0308 04:18:19.402226 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c77d97d-417a-48b7-a871-6800b12fbcb7-kube-api-access-rlbnw" (OuterVolumeSpecName: "kube-api-access-rlbnw") pod "9c77d97d-417a-48b7-a871-6800b12fbcb7" (UID: "9c77d97d-417a-48b7-a871-6800b12fbcb7"). InnerVolumeSpecName "kube-api-access-rlbnw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 04:18:19.404586 master-0 kubenswrapper[18592]: I0308 04:18:19.404027 18592 scope.go:117] "RemoveContainer" containerID="0f947b349b8be56666ad7c86e4264fb8308fb6f43e75ff44640009b2c1c98fbf" Mar 08 04:18:19.404586 master-0 kubenswrapper[18592]: E0308 04:18:19.404554 18592 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f947b349b8be56666ad7c86e4264fb8308fb6f43e75ff44640009b2c1c98fbf\": container with ID starting with 0f947b349b8be56666ad7c86e4264fb8308fb6f43e75ff44640009b2c1c98fbf not found: ID does not exist" containerID="0f947b349b8be56666ad7c86e4264fb8308fb6f43e75ff44640009b2c1c98fbf" Mar 08 04:18:19.404691 master-0 kubenswrapper[18592]: I0308 04:18:19.404584 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f947b349b8be56666ad7c86e4264fb8308fb6f43e75ff44640009b2c1c98fbf"} err="failed to get container status \"0f947b349b8be56666ad7c86e4264fb8308fb6f43e75ff44640009b2c1c98fbf\": rpc error: code = NotFound desc = could not find container \"0f947b349b8be56666ad7c86e4264fb8308fb6f43e75ff44640009b2c1c98fbf\": container with ID starting with 0f947b349b8be56666ad7c86e4264fb8308fb6f43e75ff44640009b2c1c98fbf not found: ID does not exist" Mar 08 04:18:19.404691 master-0 kubenswrapper[18592]: I0308 04:18:19.404603 18592 scope.go:117] "RemoveContainer" containerID="921d18396cd77485dd40de28e9920f77cdd014dc6f721b25d815f0d811d8d65b" Mar 08 04:18:19.405176 master-0 kubenswrapper[18592]: E0308 04:18:19.404858 18592 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"921d18396cd77485dd40de28e9920f77cdd014dc6f721b25d815f0d811d8d65b\": container with ID starting with 921d18396cd77485dd40de28e9920f77cdd014dc6f721b25d815f0d811d8d65b not found: ID does not exist" 
containerID="921d18396cd77485dd40de28e9920f77cdd014dc6f721b25d815f0d811d8d65b" Mar 08 04:18:19.405176 master-0 kubenswrapper[18592]: I0308 04:18:19.404899 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"921d18396cd77485dd40de28e9920f77cdd014dc6f721b25d815f0d811d8d65b"} err="failed to get container status \"921d18396cd77485dd40de28e9920f77cdd014dc6f721b25d815f0d811d8d65b\": rpc error: code = NotFound desc = could not find container \"921d18396cd77485dd40de28e9920f77cdd014dc6f721b25d815f0d811d8d65b\": container with ID starting with 921d18396cd77485dd40de28e9920f77cdd014dc6f721b25d815f0d811d8d65b not found: ID does not exist" Mar 08 04:18:19.495431 master-0 kubenswrapper[18592]: I0308 04:18:19.495349 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c77d97d-417a-48b7-a871-6800b12fbcb7-config-data" (OuterVolumeSpecName: "config-data") pod "9c77d97d-417a-48b7-a871-6800b12fbcb7" (UID: "9c77d97d-417a-48b7-a871-6800b12fbcb7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 04:18:19.497314 master-0 kubenswrapper[18592]: I0308 04:18:19.497221 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rlbnw\" (UniqueName: \"kubernetes.io/projected/9c77d97d-417a-48b7-a871-6800b12fbcb7-kube-api-access-rlbnw\") on node \"master-0\" DevicePath \"\""
Mar 08 04:18:19.497639 master-0 kubenswrapper[18592]: I0308 04:18:19.497328 18592 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c77d97d-417a-48b7-a871-6800b12fbcb7-scripts\") on node \"master-0\" DevicePath \"\""
Mar 08 04:18:19.497639 master-0 kubenswrapper[18592]: I0308 04:18:19.497341 18592 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c77d97d-417a-48b7-a871-6800b12fbcb7-logs\") on node \"master-0\" DevicePath \"\""
Mar 08 04:18:19.497639 master-0 kubenswrapper[18592]: I0308 04:18:19.497350 18592 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c77d97d-417a-48b7-a871-6800b12fbcb7-config-data\") on node \"master-0\" DevicePath \"\""
Mar 08 04:18:19.516016 master-0 kubenswrapper[18592]: I0308 04:18:19.515920 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c77d97d-417a-48b7-a871-6800b12fbcb7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9c77d97d-417a-48b7-a871-6800b12fbcb7" (UID: "9c77d97d-417a-48b7-a871-6800b12fbcb7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 04:18:19.529289 master-0 kubenswrapper[18592]: I0308 04:18:19.529214 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c77d97d-417a-48b7-a871-6800b12fbcb7-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "9c77d97d-417a-48b7-a871-6800b12fbcb7" (UID: "9c77d97d-417a-48b7-a871-6800b12fbcb7"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 04:18:19.545749 master-0 kubenswrapper[18592]: I0308 04:18:19.545674 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c77d97d-417a-48b7-a871-6800b12fbcb7-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "9c77d97d-417a-48b7-a871-6800b12fbcb7" (UID: "9c77d97d-417a-48b7-a871-6800b12fbcb7"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 04:18:19.599695 master-0 kubenswrapper[18592]: I0308 04:18:19.599643 18592 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c77d97d-417a-48b7-a871-6800b12fbcb7-internal-tls-certs\") on node \"master-0\" DevicePath \"\""
Mar 08 04:18:19.599695 master-0 kubenswrapper[18592]: I0308 04:18:19.599686 18592 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c77d97d-417a-48b7-a871-6800b12fbcb7-public-tls-certs\") on node \"master-0\" DevicePath \"\""
Mar 08 04:18:19.599695 master-0 kubenswrapper[18592]: I0308 04:18:19.599697 18592 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c77d97d-417a-48b7-a871-6800b12fbcb7-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 08 04:18:19.948204 master-0 kubenswrapper[18592]: I0308 04:18:19.945886 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76b6c769dc-vwrbs"]
Mar 08 04:18:19.948204 master-0 kubenswrapper[18592]: I0308 04:18:19.946124 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-76b6c769dc-vwrbs" podUID="9d16547d-20e7-4839-b923-a3a01531ae0d" containerName="dnsmasq-dns" containerID="cri-o://c384a30be75094a05190f4bcfd0ae9c347bb91927d000c9e7d35d9d82e6139e3" gracePeriod=10
Mar 08 04:18:20.672929 master-0 kubenswrapper[18592]: I0308 04:18:20.672876 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-57f9f57fc6-pfwg4"]
Mar 08 04:18:20.721184 master-0 kubenswrapper[18592]: I0308 04:18:20.720982 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-57f9f57fc6-pfwg4"]
Mar 08 04:18:20.731427 master-0 kubenswrapper[18592]: I0308 04:18:20.727243 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-afe2b-default-external-api-0" podStartSLOduration=9.727188248000001 podStartE2EDuration="9.727188248s" podCreationTimestamp="2026-03-08 04:18:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 04:18:20.690173493 +0000 UTC m=+1512.788927843" watchObservedRunningTime="2026-03-08 04:18:20.727188248 +0000 UTC m=+1512.825942588"
Mar 08 04:18:20.768806 master-0 kubenswrapper[18592]: I0308 04:18:20.768724 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-afe2b-default-internal-api-0" podStartSLOduration=7.768698246 podStartE2EDuration="7.768698246s" podCreationTimestamp="2026-03-08 04:18:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 04:18:20.759154366 +0000 UTC m=+1512.857908716" watchObservedRunningTime="2026-03-08 04:18:20.768698246 +0000 UTC m=+1512.867452606"
Mar 08 04:18:22.059665 master-0 kubenswrapper[18592]: I0308 04:18:22.059588 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-jq2f2"]
Mar 08 04:18:22.060362 master-0 kubenswrapper[18592]: E0308 04:18:22.060089 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c77d97d-417a-48b7-a871-6800b12fbcb7" containerName="placement-log"
Mar 08 04:18:22.064854 master-0 kubenswrapper[18592]: I0308 04:18:22.061104 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c77d97d-417a-48b7-a871-6800b12fbcb7" containerName="placement-log"
Mar 08 04:18:22.064854 master-0 kubenswrapper[18592]: E0308 04:18:22.061142 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c77d97d-417a-48b7-a871-6800b12fbcb7" containerName="placement-api"
Mar 08 04:18:22.064854 master-0 kubenswrapper[18592]: I0308 04:18:22.061149 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c77d97d-417a-48b7-a871-6800b12fbcb7" containerName="placement-api"
Mar 08 04:18:22.064854 master-0 kubenswrapper[18592]: E0308 04:18:22.061162 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c678d7e-e3ea-40d5-b265-cf42ac1139c6" containerName="mariadb-database-create"
Mar 08 04:18:22.064854 master-0 kubenswrapper[18592]: I0308 04:18:22.061169 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c678d7e-e3ea-40d5-b265-cf42ac1139c6" containerName="mariadb-database-create"
Mar 08 04:18:22.064854 master-0 kubenswrapper[18592]: E0308 04:18:22.061186 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fefd13e-e178-4790-823e-458456886a84" containerName="mariadb-account-create-update"
Mar 08 04:18:22.064854 master-0 kubenswrapper[18592]: I0308 04:18:22.061192 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fefd13e-e178-4790-823e-458456886a84" containerName="mariadb-account-create-update"
Mar 08 04:18:22.064854 master-0 kubenswrapper[18592]: E0308 04:18:22.061212 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2a5661f-15ae-4df4-ab1f-8539afd4d339" containerName="mariadb-database-create"
Mar 08 04:18:22.064854 master-0 kubenswrapper[18592]: I0308 04:18:22.061218 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2a5661f-15ae-4df4-ab1f-8539afd4d339" containerName="mariadb-database-create"
Mar 08 04:18:22.064854 master-0 kubenswrapper[18592]: E0308 04:18:22.061235 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a5fc841-ddfd-4704-9a4c-878bcbb98bcc" containerName="mariadb-account-create-update"
Mar 08 04:18:22.064854 master-0 kubenswrapper[18592]: I0308 04:18:22.061241 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a5fc841-ddfd-4704-9a4c-878bcbb98bcc" containerName="mariadb-account-create-update"
Mar 08 04:18:22.064854 master-0 kubenswrapper[18592]: E0308 04:18:22.061261 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bafa1006-fce0-4733-9706-a4de6df10ac7" containerName="mariadb-database-create"
Mar 08 04:18:22.064854 master-0 kubenswrapper[18592]: I0308 04:18:22.061267 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="bafa1006-fce0-4733-9706-a4de6df10ac7" containerName="mariadb-database-create"
Mar 08 04:18:22.064854 master-0 kubenswrapper[18592]: E0308 04:18:22.061286 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="955ce42c-5d68-4659-a993-85d566eb7c0c" containerName="mariadb-account-create-update"
Mar 08 04:18:22.064854 master-0 kubenswrapper[18592]: I0308 04:18:22.061292 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="955ce42c-5d68-4659-a993-85d566eb7c0c" containerName="mariadb-account-create-update"
Mar 08 04:18:22.064854 master-0 kubenswrapper[18592]: I0308 04:18:22.061519 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="955ce42c-5d68-4659-a993-85d566eb7c0c" containerName="mariadb-account-create-update"
Mar 08 04:18:22.064854 master-0 kubenswrapper[18592]: I0308 04:18:22.061544 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="bafa1006-fce0-4733-9706-a4de6df10ac7" containerName="mariadb-database-create"
Mar 08 04:18:22.064854 master-0 kubenswrapper[18592]: I0308 04:18:22.061571 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c77d97d-417a-48b7-a871-6800b12fbcb7" containerName="placement-api"
Mar 08 04:18:22.064854 master-0 kubenswrapper[18592]: I0308 04:18:22.061583 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fefd13e-e178-4790-823e-458456886a84" containerName="mariadb-account-create-update"
Mar 08 04:18:22.064854 master-0 kubenswrapper[18592]: I0308 04:18:22.061590 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a5fc841-ddfd-4704-9a4c-878bcbb98bcc" containerName="mariadb-account-create-update"
Mar 08 04:18:22.064854 master-0 kubenswrapper[18592]: I0308 04:18:22.061608 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c678d7e-e3ea-40d5-b265-cf42ac1139c6" containerName="mariadb-database-create"
Mar 08 04:18:22.064854 master-0 kubenswrapper[18592]: I0308 04:18:22.061623 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2a5661f-15ae-4df4-ab1f-8539afd4d339" containerName="mariadb-database-create"
Mar 08 04:18:22.064854 master-0 kubenswrapper[18592]: I0308 04:18:22.061639 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c77d97d-417a-48b7-a871-6800b12fbcb7" containerName="placement-log"
Mar 08 04:18:22.064854 master-0 kubenswrapper[18592]: I0308 04:18:22.063113 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-jq2f2"
Mar 08 04:18:22.075514 master-0 kubenswrapper[18592]: I0308 04:18:22.073210 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts"
Mar 08 04:18:22.075514 master-0 kubenswrapper[18592]: I0308 04:18:22.073600 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Mar 08 04:18:22.084302 master-0 kubenswrapper[18592]: I0308 04:18:22.084234 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-jq2f2"]
Mar 08 04:18:22.166454 master-0 kubenswrapper[18592]: I0308 04:18:22.166405 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c77d97d-417a-48b7-a871-6800b12fbcb7" path="/var/lib/kubelet/pods/9c77d97d-417a-48b7-a871-6800b12fbcb7/volumes"
Mar 08 04:18:22.222841 master-0 kubenswrapper[18592]: I0308 04:18:22.220939 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/807694b2-9dea-4fae-b203-ee9b8331871d-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-jq2f2\" (UID: \"807694b2-9dea-4fae-b203-ee9b8331871d\") " pod="openstack/nova-cell0-conductor-db-sync-jq2f2"
Mar 08 04:18:22.222841 master-0 kubenswrapper[18592]: I0308 04:18:22.221199 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z49df\" (UniqueName: \"kubernetes.io/projected/807694b2-9dea-4fae-b203-ee9b8331871d-kube-api-access-z49df\") pod \"nova-cell0-conductor-db-sync-jq2f2\" (UID: \"807694b2-9dea-4fae-b203-ee9b8331871d\") " pod="openstack/nova-cell0-conductor-db-sync-jq2f2"
Mar 08 04:18:22.222841 master-0 kubenswrapper[18592]: I0308 04:18:22.221402 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/807694b2-9dea-4fae-b203-ee9b8331871d-config-data\") pod \"nova-cell0-conductor-db-sync-jq2f2\" (UID: \"807694b2-9dea-4fae-b203-ee9b8331871d\") " pod="openstack/nova-cell0-conductor-db-sync-jq2f2"
Mar 08 04:18:22.222841 master-0 kubenswrapper[18592]: I0308 04:18:22.221589 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/807694b2-9dea-4fae-b203-ee9b8331871d-scripts\") pod \"nova-cell0-conductor-db-sync-jq2f2\" (UID: \"807694b2-9dea-4fae-b203-ee9b8331871d\") " pod="openstack/nova-cell0-conductor-db-sync-jq2f2"
Mar 08 04:18:22.327905 master-0 kubenswrapper[18592]: I0308 04:18:22.327721 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/807694b2-9dea-4fae-b203-ee9b8331871d-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-jq2f2\" (UID: \"807694b2-9dea-4fae-b203-ee9b8331871d\") " pod="openstack/nova-cell0-conductor-db-sync-jq2f2"
Mar 08 04:18:22.328081 master-0 kubenswrapper[18592]: I0308 04:18:22.327925 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z49df\" (UniqueName: \"kubernetes.io/projected/807694b2-9dea-4fae-b203-ee9b8331871d-kube-api-access-z49df\") pod \"nova-cell0-conductor-db-sync-jq2f2\" (UID: \"807694b2-9dea-4fae-b203-ee9b8331871d\") " pod="openstack/nova-cell0-conductor-db-sync-jq2f2"
Mar 08 04:18:22.328081 master-0 kubenswrapper[18592]: I0308 04:18:22.328020 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/807694b2-9dea-4fae-b203-ee9b8331871d-config-data\") pod \"nova-cell0-conductor-db-sync-jq2f2\" (UID: \"807694b2-9dea-4fae-b203-ee9b8331871d\") " pod="openstack/nova-cell0-conductor-db-sync-jq2f2"
Mar 08 04:18:22.328164 master-0 kubenswrapper[18592]: I0308 04:18:22.328144 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/807694b2-9dea-4fae-b203-ee9b8331871d-scripts\") pod \"nova-cell0-conductor-db-sync-jq2f2\" (UID: \"807694b2-9dea-4fae-b203-ee9b8331871d\") " pod="openstack/nova-cell0-conductor-db-sync-jq2f2"
Mar 08 04:18:22.337839 master-0 kubenswrapper[18592]: I0308 04:18:22.336204 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/807694b2-9dea-4fae-b203-ee9b8331871d-scripts\") pod \"nova-cell0-conductor-db-sync-jq2f2\" (UID: \"807694b2-9dea-4fae-b203-ee9b8331871d\") " pod="openstack/nova-cell0-conductor-db-sync-jq2f2"
Mar 08 04:18:22.346887 master-0 kubenswrapper[18592]: I0308 04:18:22.340681 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/807694b2-9dea-4fae-b203-ee9b8331871d-config-data\") pod \"nova-cell0-conductor-db-sync-jq2f2\" (UID: \"807694b2-9dea-4fae-b203-ee9b8331871d\") " pod="openstack/nova-cell0-conductor-db-sync-jq2f2"
Mar 08 04:18:22.346887 master-0 kubenswrapper[18592]: I0308 04:18:22.341429 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/807694b2-9dea-4fae-b203-ee9b8331871d-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-jq2f2\" (UID: \"807694b2-9dea-4fae-b203-ee9b8331871d\") " pod="openstack/nova-cell0-conductor-db-sync-jq2f2"
Mar 08 04:18:22.350073 master-0 kubenswrapper[18592]: I0308 04:18:22.347668 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z49df\" (UniqueName: \"kubernetes.io/projected/807694b2-9dea-4fae-b203-ee9b8331871d-kube-api-access-z49df\") pod \"nova-cell0-conductor-db-sync-jq2f2\" (UID: \"807694b2-9dea-4fae-b203-ee9b8331871d\") " pod="openstack/nova-cell0-conductor-db-sync-jq2f2"
Mar 08 04:18:22.407847 master-0 kubenswrapper[18592]: I0308 04:18:22.407538 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-jq2f2"
Mar 08 04:18:23.398211 master-0 kubenswrapper[18592]: I0308 04:18:23.398148 18592 generic.go:334] "Generic (PLEG): container finished" podID="9d16547d-20e7-4839-b923-a3a01531ae0d" containerID="c384a30be75094a05190f4bcfd0ae9c347bb91927d000c9e7d35d9d82e6139e3" exitCode=0
Mar 08 04:18:23.398649 master-0 kubenswrapper[18592]: I0308 04:18:23.398247 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76b6c769dc-vwrbs" event={"ID":"9d16547d-20e7-4839-b923-a3a01531ae0d","Type":"ContainerDied","Data":"c384a30be75094a05190f4bcfd0ae9c347bb91927d000c9e7d35d9d82e6139e3"}
Mar 08 04:18:23.492140 master-0 kubenswrapper[18592]: I0308 04:18:23.485062 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76b6c769dc-vwrbs"
Mar 08 04:18:23.612923 master-0 kubenswrapper[18592]: I0308 04:18:23.607505 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-jq2f2"]
Mar 08 04:18:23.664338 master-0 kubenswrapper[18592]: I0308 04:18:23.664282 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9d16547d-20e7-4839-b923-a3a01531ae0d-ovsdbserver-nb\") pod \"9d16547d-20e7-4839-b923-a3a01531ae0d\" (UID: \"9d16547d-20e7-4839-b923-a3a01531ae0d\") "
Mar 08 04:18:23.664423 master-0 kubenswrapper[18592]: I0308 04:18:23.664362 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9d16547d-20e7-4839-b923-a3a01531ae0d-ovsdbserver-sb\") pod \"9d16547d-20e7-4839-b923-a3a01531ae0d\" (UID: \"9d16547d-20e7-4839-b923-a3a01531ae0d\") "
Mar 08 04:18:23.664456 master-0 kubenswrapper[18592]: I0308 04:18:23.664422 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d16547d-20e7-4839-b923-a3a01531ae0d-dns-svc\") pod \"9d16547d-20e7-4839-b923-a3a01531ae0d\" (UID: \"9d16547d-20e7-4839-b923-a3a01531ae0d\") "
Mar 08 04:18:23.664604 master-0 kubenswrapper[18592]: I0308 04:18:23.664521 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnm7x\" (UniqueName: \"kubernetes.io/projected/9d16547d-20e7-4839-b923-a3a01531ae0d-kube-api-access-vnm7x\") pod \"9d16547d-20e7-4839-b923-a3a01531ae0d\" (UID: \"9d16547d-20e7-4839-b923-a3a01531ae0d\") "
Mar 08 04:18:23.664661 master-0 kubenswrapper[18592]: I0308 04:18:23.664619 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9d16547d-20e7-4839-b923-a3a01531ae0d-dns-swift-storage-0\") pod \"9d16547d-20e7-4839-b923-a3a01531ae0d\" (UID: \"9d16547d-20e7-4839-b923-a3a01531ae0d\") "
Mar 08 04:18:23.664997 master-0 kubenswrapper[18592]: I0308 04:18:23.664777 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d16547d-20e7-4839-b923-a3a01531ae0d-config\") pod \"9d16547d-20e7-4839-b923-a3a01531ae0d\" (UID: \"9d16547d-20e7-4839-b923-a3a01531ae0d\") "
Mar 08 04:18:23.668797 master-0 kubenswrapper[18592]: I0308 04:18:23.668757 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d16547d-20e7-4839-b923-a3a01531ae0d-kube-api-access-vnm7x" (OuterVolumeSpecName: "kube-api-access-vnm7x") pod "9d16547d-20e7-4839-b923-a3a01531ae0d" (UID: "9d16547d-20e7-4839-b923-a3a01531ae0d"). InnerVolumeSpecName "kube-api-access-vnm7x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 04:18:23.744812 master-0 kubenswrapper[18592]: I0308 04:18:23.744429 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d16547d-20e7-4839-b923-a3a01531ae0d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9d16547d-20e7-4839-b923-a3a01531ae0d" (UID: "9d16547d-20e7-4839-b923-a3a01531ae0d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 04:18:23.745123 master-0 kubenswrapper[18592]: I0308 04:18:23.745096 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d16547d-20e7-4839-b923-a3a01531ae0d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9d16547d-20e7-4839-b923-a3a01531ae0d" (UID: "9d16547d-20e7-4839-b923-a3a01531ae0d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 04:18:23.754982 master-0 kubenswrapper[18592]: I0308 04:18:23.754908 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d16547d-20e7-4839-b923-a3a01531ae0d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9d16547d-20e7-4839-b923-a3a01531ae0d" (UID: "9d16547d-20e7-4839-b923-a3a01531ae0d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 04:18:23.761343 master-0 kubenswrapper[18592]: I0308 04:18:23.761294 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d16547d-20e7-4839-b923-a3a01531ae0d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9d16547d-20e7-4839-b923-a3a01531ae0d" (UID: "9d16547d-20e7-4839-b923-a3a01531ae0d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 04:18:23.768871 master-0 kubenswrapper[18592]: I0308 04:18:23.768177 18592 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d16547d-20e7-4839-b923-a3a01531ae0d-dns-svc\") on node \"master-0\" DevicePath \"\""
Mar 08 04:18:23.768871 master-0 kubenswrapper[18592]: I0308 04:18:23.768215 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vnm7x\" (UniqueName: \"kubernetes.io/projected/9d16547d-20e7-4839-b923-a3a01531ae0d-kube-api-access-vnm7x\") on node \"master-0\" DevicePath \"\""
Mar 08 04:18:23.768871 master-0 kubenswrapper[18592]: I0308 04:18:23.768225 18592 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9d16547d-20e7-4839-b923-a3a01531ae0d-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\""
Mar 08 04:18:23.768871 master-0 kubenswrapper[18592]: I0308 04:18:23.768235 18592 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9d16547d-20e7-4839-b923-a3a01531ae0d-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\""
Mar 08 04:18:23.768871 master-0 kubenswrapper[18592]: I0308 04:18:23.768243 18592 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9d16547d-20e7-4839-b923-a3a01531ae0d-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\""
Mar 08 04:18:23.775763 master-0 kubenswrapper[18592]: I0308 04:18:23.775719 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d16547d-20e7-4839-b923-a3a01531ae0d-config" (OuterVolumeSpecName: "config") pod "9d16547d-20e7-4839-b923-a3a01531ae0d" (UID: "9d16547d-20e7-4839-b923-a3a01531ae0d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 04:18:23.870210 master-0 kubenswrapper[18592]: I0308 04:18:23.870143 18592 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d16547d-20e7-4839-b923-a3a01531ae0d-config\") on node \"master-0\" DevicePath \"\""
Mar 08 04:18:24.039096 master-0 kubenswrapper[18592]: I0308 04:18:24.038956 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-ff301-api-0"
Mar 08 04:18:24.416940 master-0 kubenswrapper[18592]: I0308 04:18:24.416874 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-jq2f2" event={"ID":"807694b2-9dea-4fae-b203-ee9b8331871d","Type":"ContainerStarted","Data":"350c8445385cd81b530a000141ac55ab664ad8ee886ec71e205afd0686b27b08"}
Mar 08 04:18:24.421612 master-0 kubenswrapper[18592]: I0308 04:18:24.420310 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"abe912ba-4a33-4634-a3fb-b6fb09b38d8e","Type":"ContainerStarted","Data":"436fa8830779ba8629c29fae95b09526f25d16ee956367773b8f253f32f5e7d8"}
Mar 08 04:18:24.428321 master-0 kubenswrapper[18592]: I0308 04:18:24.427563 18592 generic.go:334] "Generic (PLEG): container finished" podID="6936bf71-3ad4-47e9-8df6-9075d83086db" containerID="1ac98d8c61f93e5f82a4e76c8d3f9ba82753b1f85b47deda4f055f8d8688aad7" exitCode=0
Mar 08 04:18:24.428321 master-0 kubenswrapper[18592]: I0308 04:18:24.427665 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"6936bf71-3ad4-47e9-8df6-9075d83086db","Type":"ContainerDied","Data":"1ac98d8c61f93e5f82a4e76c8d3f9ba82753b1f85b47deda4f055f8d8688aad7"}
Mar 08 04:18:24.433866 master-0 kubenswrapper[18592]: I0308 04:18:24.433799 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76b6c769dc-vwrbs" event={"ID":"9d16547d-20e7-4839-b923-a3a01531ae0d","Type":"ContainerDied","Data":"b7195fc3433b5d8f4b052236ef5a60bebcffbf9ade104bdad8c75baad2aa0c13"}
Mar 08 04:18:24.433969 master-0 kubenswrapper[18592]: I0308 04:18:24.433878 18592 scope.go:117] "RemoveContainer" containerID="c384a30be75094a05190f4bcfd0ae9c347bb91927d000c9e7d35d9d82e6139e3"
Mar 08 04:18:24.434016 master-0 kubenswrapper[18592]: I0308 04:18:24.433996 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76b6c769dc-vwrbs"
Mar 08 04:18:24.613449 master-0 kubenswrapper[18592]: I0308 04:18:24.613387 18592 scope.go:117] "RemoveContainer" containerID="75860010ea3ac6d113be7b206770088f41b8ca19b7b7d09993bfc651f1bf0c0f"
Mar 08 04:18:24.789837 master-0 kubenswrapper[18592]: I0308 04:18:24.789746 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76b6c769dc-vwrbs"]
Mar 08 04:18:25.003651 master-0 kubenswrapper[18592]: I0308 04:18:25.003595 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-76b6c769dc-vwrbs"]
Mar 08 04:18:25.455291 master-0 kubenswrapper[18592]: I0308 04:18:25.455156 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"6936bf71-3ad4-47e9-8df6-9075d83086db","Type":"ContainerStarted","Data":"2dc0477320aed025d1944015c9bc7a19e8055afdd6fb5ffc7da53023fcf402a5"}
Mar 08 04:18:25.502880 master-0 kubenswrapper[18592]: I0308 04:18:25.502801 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-afe2b-default-external-api-0"
Mar 08 04:18:25.502880 master-0 kubenswrapper[18592]: I0308 04:18:25.502882 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-afe2b-default-external-api-0"
Mar 08 04:18:25.539118 master-0 kubenswrapper[18592]: I0308 04:18:25.539072 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-afe2b-default-external-api-0"
Mar 08 04:18:25.557038 master-0 kubenswrapper[18592]: I0308 04:18:25.557007 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-afe2b-default-external-api-0"
Mar 08 04:18:26.162764 master-0 kubenswrapper[18592]: I0308 04:18:26.162644 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d16547d-20e7-4839-b923-a3a01531ae0d" path="/var/lib/kubelet/pods/9d16547d-20e7-4839-b923-a3a01531ae0d/volumes"
Mar 08 04:18:26.163479 master-0 kubenswrapper[18592]: I0308 04:18:26.163452 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-afe2b-default-internal-api-0"
Mar 08 04:18:26.163534 master-0 kubenswrapper[18592]: I0308 04:18:26.163486 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-afe2b-default-internal-api-0"
Mar 08 04:18:26.199125 master-0 kubenswrapper[18592]: I0308 04:18:26.198905 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-afe2b-default-internal-api-0"
Mar 08 04:18:26.213105 master-0 kubenswrapper[18592]: I0308 04:18:26.209899 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-afe2b-default-internal-api-0"
Mar 08 04:18:26.476247 master-0 kubenswrapper[18592]: I0308 04:18:26.472413 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"6936bf71-3ad4-47e9-8df6-9075d83086db","Type":"ContainerStarted","Data":"acf9a1a15965f8fe84a993011cd78c3034175b8a33c9693e69d62cf837ba75e2"}
Mar 08 04:18:26.476247 master-0 kubenswrapper[18592]: I0308 04:18:26.472466 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"6936bf71-3ad4-47e9-8df6-9075d83086db","Type":"ContainerStarted","Data":"75a68482b6de1bc9b9d984085b0b732eeeafc365e78c5bfd666324da46b4cd52"}
Mar 08 04:18:26.476247 master-0 kubenswrapper[18592]: I0308 04:18:26.472614 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-afe2b-default-internal-api-0"
Mar 08 04:18:26.476247 master-0 kubenswrapper[18592]: I0308 04:18:26.472639 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-afe2b-default-internal-api-0"
Mar 08 04:18:26.476247 master-0 kubenswrapper[18592]: I0308 04:18:26.472649 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-afe2b-default-external-api-0"
Mar 08 04:18:26.476247 master-0 kubenswrapper[18592]: I0308 04:18:26.472659 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-afe2b-default-external-api-0"
Mar 08 04:18:27.501467 master-0 kubenswrapper[18592]: I0308 04:18:27.501411 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"6936bf71-3ad4-47e9-8df6-9075d83086db","Type":"ContainerStarted","Data":"4a25d8359ca0cfe6c980b0d2690414b1a564b0f9fd432432c1a5a7dbcabfa84c"}
Mar 08 04:18:27.501467 master-0 kubenswrapper[18592]: I0308 04:18:27.501463 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"6936bf71-3ad4-47e9-8df6-9075d83086db","Type":"ContainerStarted","Data":"db0de84f7c1abc579f988c140bf280e9d019cd98f1be8b90de72927c0d5ea012"}
Mar 08 04:18:27.502124 master-0 kubenswrapper[18592]: I0308 04:18:27.501778 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-inspector-0"
Mar 08 04:18:27.503416 master-0 kubenswrapper[18592]: I0308 04:18:27.502515 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-inspector-0"
Mar 08 04:18:27.557498 master-0 kubenswrapper[18592]: I0308 04:18:27.556945 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-inspector-0" podStartSLOduration=6.582017756 podStartE2EDuration="12.556925274s" podCreationTimestamp="2026-03-08 04:18:15 +0000 UTC" firstStartedPulling="2026-03-08 04:18:17.130462743 +0000 UTC m=+1509.229217083" lastFinishedPulling="2026-03-08 04:18:23.105370251 +0000 UTC m=+1515.204124601" observedRunningTime="2026-03-08 04:18:27.529726725 +0000 UTC m=+1519.628481115" watchObservedRunningTime="2026-03-08 04:18:27.556925274 +0000 UTC m=+1519.655679624"
Mar 08 04:18:28.514196 master-0 kubenswrapper[18592]: I0308 04:18:28.514144 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-afe2b-default-internal-api-0"
Mar 08 04:18:28.514863 master-0 kubenswrapper[18592]: I0308 04:18:28.514810 18592 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 08 04:18:28.529335 master-0 kubenswrapper[18592]: I0308 04:18:28.529287 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-afe2b-default-external-api-0"
Mar 08 04:18:28.529595 master-0 kubenswrapper[18592]: I0308 04:18:28.529444 18592 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 08 04:18:28.623248 master-0 kubenswrapper[18592]: I0308 04:18:28.623194 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-afe2b-default-external-api-0"
Mar 08 04:18:28.712853 master-0 kubenswrapper[18592]: I0308 04:18:28.709771 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-afe2b-default-internal-api-0"
Mar 08 04:18:30.661737 master-0 kubenswrapper[18592]: I0308 04:18:30.661088 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-inspector-0"
Mar 08 04:18:30.661737 master-0 kubenswrapper[18592]: I0308 04:18:30.661148 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-inspector-0"
Mar 08 04:18:30.697850 master-0 kubenswrapper[18592]: I0308 04:18:30.696682 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-inspector-0"
Mar 08 04:18:31.562453 master-0 kubenswrapper[18592]: I0308 04:18:31.562410 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-inspector-0"
Mar 08 04:18:32.143430 master-0 kubenswrapper[18592]: I0308 04:18:32.143365 18592 scope.go:117] "RemoveContainer" containerID="1988a339b07b0c491c35385954db8882edaee30eeede55407a4d33cef13b5291"
Mar 08 04:18:32.143999 master-0 kubenswrapper[18592]: E0308 04:18:32.143974 18592 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-neutron-agent\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ironic-neutron-agent pod=ironic-neutron-agent-859d47fc89-z2wvz_openstack(05ba9b98-7d2f-4a9b-80ad-60793d8279e8)\"" pod="openstack/ironic-neutron-agent-859d47fc89-z2wvz" podUID="05ba9b98-7d2f-4a9b-80ad-60793d8279e8"
Mar 08 04:18:34.604866 master-0 kubenswrapper[18592]: I0308 04:18:34.604745 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-jq2f2" event={"ID":"807694b2-9dea-4fae-b203-ee9b8331871d","Type":"ContainerStarted","Data":"0b668349c154eaae9bcda090cd2e5e3d59b08c2f81653f2dae0d8d0fece4fdae"}
Mar 08 04:18:34.634014 master-0 kubenswrapper[18592]: I0308 04:18:34.633923 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-jq2f2" podStartSLOduration=2.781473917 podStartE2EDuration="12.633897969s" podCreationTimestamp="2026-03-08 04:18:22 +0000 UTC" firstStartedPulling="2026-03-08 04:18:23.630770888 +0000 UTC m=+1515.729525238" lastFinishedPulling="2026-03-08 04:18:33.48319494 +0000 UTC m=+1525.581949290" observedRunningTime="2026-03-08 04:18:34.632236633 +0000 UTC m=+1526.730990983" watchObservedRunningTime="2026-03-08 04:18:34.633897969 +0000 UTC m=+1526.732652319"
Mar 08 04:18:35.660496 master-0 kubenswrapper[18592]: I0308 04:18:35.660436 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ironic-inspector-0"
Mar 08 04:18:35.661615 master-0 kubenswrapper[18592]: I0308 04:18:35.661538 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ironic-inspector-0"
Mar 08 04:18:35.702427 master-0 kubenswrapper[18592]: I0308 04:18:35.702359 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ironic-inspector-0"
Mar 08 04:18:35.708530 master-0 kubenswrapper[18592]: I0308 04:18:35.708483 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ironic-inspector-0"
Mar 08 04:18:36.639148 master-0 kubenswrapper[18592]: I0308 04:18:36.639099 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-inspector-0"
Mar 08 04:18:36.643733 master-0 kubenswrapper[18592]: I0308 04:18:36.643688 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-inspector-0"
Mar 08 04:18:43.144245 master-0 kubenswrapper[18592]: I0308 04:18:43.144161 18592 scope.go:117] "RemoveContainer" containerID="1988a339b07b0c491c35385954db8882edaee30eeede55407a4d33cef13b5291"
Mar 08 04:18:43.768548 master-0 kubenswrapper[18592]: I0308 04:18:43.768444 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-859d47fc89-z2wvz" event={"ID":"05ba9b98-7d2f-4a9b-80ad-60793d8279e8","Type":"ContainerStarted","Data":"4f249c1429a227fe7e1b3f4b1392af6cc1162f7acb1e2f3a0dd407dd9602cb4e"}
Mar 08 04:18:43.770109 master-0 kubenswrapper[18592]: I0308 04:18:43.770017 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-neutron-agent-859d47fc89-z2wvz"
Mar 08 04:18:48.556869 master-0 kubenswrapper[18592]: I0308 04:18:48.554988 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-neutron-agent-859d47fc89-z2wvz"
Mar 08 04:18:50.857843 master-0 kubenswrapper[18592]: I0308 04:18:50.855756 18592 generic.go:334] "Generic (PLEG): container finished" podID="807694b2-9dea-4fae-b203-ee9b8331871d" containerID="0b668349c154eaae9bcda090cd2e5e3d59b08c2f81653f2dae0d8d0fece4fdae" exitCode=0
Mar 08 04:18:50.857843 master-0 kubenswrapper[18592]: I0308 04:18:50.855817 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-jq2f2" event={"ID":"807694b2-9dea-4fae-b203-ee9b8331871d","Type":"ContainerDied","Data":"0b668349c154eaae9bcda090cd2e5e3d59b08c2f81653f2dae0d8d0fece4fdae"}
Mar 08 04:18:52.394486 master-0 kubenswrapper[18592]: I0308 04:18:52.394446 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-jq2f2"
Mar 08 04:18:52.518750 master-0 kubenswrapper[18592]: I0308 04:18:52.518707 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/807694b2-9dea-4fae-b203-ee9b8331871d-config-data\") pod \"807694b2-9dea-4fae-b203-ee9b8331871d\" (UID: \"807694b2-9dea-4fae-b203-ee9b8331871d\") "
Mar 08 04:18:52.519545 master-0 kubenswrapper[18592]: I0308 04:18:52.519488 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/807694b2-9dea-4fae-b203-ee9b8331871d-scripts\") pod \"807694b2-9dea-4fae-b203-ee9b8331871d\" (UID: \"807694b2-9dea-4fae-b203-ee9b8331871d\") "
Mar 08 04:18:52.519883 master-0 kubenswrapper[18592]: I0308 04:18:52.519864 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/807694b2-9dea-4fae-b203-ee9b8331871d-combined-ca-bundle\") pod \"807694b2-9dea-4fae-b203-ee9b8331871d\" (UID: \"807694b2-9dea-4fae-b203-ee9b8331871d\") "
Mar 08 04:18:52.520101 master-0 kubenswrapper[18592]: I0308 04:18:52.520063 18592 reconciler_common.go:159]
"operationExecutor.UnmountVolume started for volume \"kube-api-access-z49df\" (UniqueName: \"kubernetes.io/projected/807694b2-9dea-4fae-b203-ee9b8331871d-kube-api-access-z49df\") pod \"807694b2-9dea-4fae-b203-ee9b8331871d\" (UID: \"807694b2-9dea-4fae-b203-ee9b8331871d\") " Mar 08 04:18:52.538785 master-0 kubenswrapper[18592]: I0308 04:18:52.537790 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/807694b2-9dea-4fae-b203-ee9b8331871d-kube-api-access-z49df" (OuterVolumeSpecName: "kube-api-access-z49df") pod "807694b2-9dea-4fae-b203-ee9b8331871d" (UID: "807694b2-9dea-4fae-b203-ee9b8331871d"). InnerVolumeSpecName "kube-api-access-z49df". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 04:18:52.541843 master-0 kubenswrapper[18592]: I0308 04:18:52.541741 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/807694b2-9dea-4fae-b203-ee9b8331871d-scripts" (OuterVolumeSpecName: "scripts") pod "807694b2-9dea-4fae-b203-ee9b8331871d" (UID: "807694b2-9dea-4fae-b203-ee9b8331871d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:18:52.553342 master-0 kubenswrapper[18592]: I0308 04:18:52.553251 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/807694b2-9dea-4fae-b203-ee9b8331871d-config-data" (OuterVolumeSpecName: "config-data") pod "807694b2-9dea-4fae-b203-ee9b8331871d" (UID: "807694b2-9dea-4fae-b203-ee9b8331871d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:18:52.553623 master-0 kubenswrapper[18592]: I0308 04:18:52.553542 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/807694b2-9dea-4fae-b203-ee9b8331871d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "807694b2-9dea-4fae-b203-ee9b8331871d" (UID: "807694b2-9dea-4fae-b203-ee9b8331871d"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:18:52.621906 master-0 kubenswrapper[18592]: I0308 04:18:52.621850 18592 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/807694b2-9dea-4fae-b203-ee9b8331871d-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 04:18:52.621906 master-0 kubenswrapper[18592]: I0308 04:18:52.621889 18592 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/807694b2-9dea-4fae-b203-ee9b8331871d-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 04:18:52.621906 master-0 kubenswrapper[18592]: I0308 04:18:52.621900 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z49df\" (UniqueName: \"kubernetes.io/projected/807694b2-9dea-4fae-b203-ee9b8331871d-kube-api-access-z49df\") on node \"master-0\" DevicePath \"\"" Mar 08 04:18:52.621906 master-0 kubenswrapper[18592]: I0308 04:18:52.621909 18592 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/807694b2-9dea-4fae-b203-ee9b8331871d-config-data\") on node \"master-0\" DevicePath \"\"" Mar 08 04:18:52.888915 master-0 kubenswrapper[18592]: I0308 04:18:52.888785 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-jq2f2" event={"ID":"807694b2-9dea-4fae-b203-ee9b8331871d","Type":"ContainerDied","Data":"350c8445385cd81b530a000141ac55ab664ad8ee886ec71e205afd0686b27b08"} Mar 08 04:18:52.888915 master-0 kubenswrapper[18592]: I0308 04:18:52.888859 18592 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="350c8445385cd81b530a000141ac55ab664ad8ee886ec71e205afd0686b27b08" Mar 08 04:18:52.889136 master-0 kubenswrapper[18592]: I0308 04:18:52.888935 18592 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-jq2f2" Mar 08 04:18:53.024582 master-0 kubenswrapper[18592]: I0308 04:18:53.024448 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 08 04:18:53.025256 master-0 kubenswrapper[18592]: E0308 04:18:53.025233 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d16547d-20e7-4839-b923-a3a01531ae0d" containerName="dnsmasq-dns" Mar 08 04:18:53.025256 master-0 kubenswrapper[18592]: I0308 04:18:53.025257 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d16547d-20e7-4839-b923-a3a01531ae0d" containerName="dnsmasq-dns" Mar 08 04:18:53.025453 master-0 kubenswrapper[18592]: E0308 04:18:53.025302 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="807694b2-9dea-4fae-b203-ee9b8331871d" containerName="nova-cell0-conductor-db-sync" Mar 08 04:18:53.025453 master-0 kubenswrapper[18592]: I0308 04:18:53.025311 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="807694b2-9dea-4fae-b203-ee9b8331871d" containerName="nova-cell0-conductor-db-sync" Mar 08 04:18:53.025453 master-0 kubenswrapper[18592]: E0308 04:18:53.025349 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d16547d-20e7-4839-b923-a3a01531ae0d" containerName="init" Mar 08 04:18:53.025453 master-0 kubenswrapper[18592]: I0308 04:18:53.025357 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d16547d-20e7-4839-b923-a3a01531ae0d" containerName="init" Mar 08 04:18:53.025668 master-0 kubenswrapper[18592]: I0308 04:18:53.025637 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d16547d-20e7-4839-b923-a3a01531ae0d" containerName="dnsmasq-dns" Mar 08 04:18:53.025712 master-0 kubenswrapper[18592]: I0308 04:18:53.025674 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="807694b2-9dea-4fae-b203-ee9b8331871d" containerName="nova-cell0-conductor-db-sync" Mar 08 04:18:53.026550 master-0 
kubenswrapper[18592]: I0308 04:18:53.026520 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 08 04:18:53.029572 master-0 kubenswrapper[18592]: I0308 04:18:53.029533 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 08 04:18:53.043061 master-0 kubenswrapper[18592]: I0308 04:18:53.043010 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 08 04:18:53.132930 master-0 kubenswrapper[18592]: I0308 04:18:53.132871 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/017acd52-088f-439e-8ff2-97c079c31eb3-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"017acd52-088f-439e-8ff2-97c079c31eb3\") " pod="openstack/nova-cell0-conductor-0" Mar 08 04:18:53.133173 master-0 kubenswrapper[18592]: I0308 04:18:53.133004 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7lw9\" (UniqueName: \"kubernetes.io/projected/017acd52-088f-439e-8ff2-97c079c31eb3-kube-api-access-j7lw9\") pod \"nova-cell0-conductor-0\" (UID: \"017acd52-088f-439e-8ff2-97c079c31eb3\") " pod="openstack/nova-cell0-conductor-0" Mar 08 04:18:53.133173 master-0 kubenswrapper[18592]: I0308 04:18:53.133054 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/017acd52-088f-439e-8ff2-97c079c31eb3-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"017acd52-088f-439e-8ff2-97c079c31eb3\") " pod="openstack/nova-cell0-conductor-0" Mar 08 04:18:53.236652 master-0 kubenswrapper[18592]: I0308 04:18:53.235019 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/017acd52-088f-439e-8ff2-97c079c31eb3-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"017acd52-088f-439e-8ff2-97c079c31eb3\") " pod="openstack/nova-cell0-conductor-0" Mar 08 04:18:53.236652 master-0 kubenswrapper[18592]: I0308 04:18:53.235116 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7lw9\" (UniqueName: \"kubernetes.io/projected/017acd52-088f-439e-8ff2-97c079c31eb3-kube-api-access-j7lw9\") pod \"nova-cell0-conductor-0\" (UID: \"017acd52-088f-439e-8ff2-97c079c31eb3\") " pod="openstack/nova-cell0-conductor-0" Mar 08 04:18:53.236652 master-0 kubenswrapper[18592]: I0308 04:18:53.235147 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/017acd52-088f-439e-8ff2-97c079c31eb3-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"017acd52-088f-439e-8ff2-97c079c31eb3\") " pod="openstack/nova-cell0-conductor-0" Mar 08 04:18:53.239052 master-0 kubenswrapper[18592]: I0308 04:18:53.239012 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/017acd52-088f-439e-8ff2-97c079c31eb3-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"017acd52-088f-439e-8ff2-97c079c31eb3\") " pod="openstack/nova-cell0-conductor-0" Mar 08 04:18:53.239344 master-0 kubenswrapper[18592]: I0308 04:18:53.239284 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/017acd52-088f-439e-8ff2-97c079c31eb3-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"017acd52-088f-439e-8ff2-97c079c31eb3\") " pod="openstack/nova-cell0-conductor-0" Mar 08 04:18:53.249453 master-0 kubenswrapper[18592]: I0308 04:18:53.249429 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7lw9\" (UniqueName: 
\"kubernetes.io/projected/017acd52-088f-439e-8ff2-97c079c31eb3-kube-api-access-j7lw9\") pod \"nova-cell0-conductor-0\" (UID: \"017acd52-088f-439e-8ff2-97c079c31eb3\") " pod="openstack/nova-cell0-conductor-0" Mar 08 04:18:53.369234 master-0 kubenswrapper[18592]: I0308 04:18:53.369195 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 08 04:18:54.115031 master-0 kubenswrapper[18592]: I0308 04:18:54.114577 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 08 04:18:54.119685 master-0 kubenswrapper[18592]: W0308 04:18:54.119332 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod017acd52_088f_439e_8ff2_97c079c31eb3.slice/crio-64720a76f22f05bafc7b71f340b7226fb28416ed5a7203301d17be4bd024660e WatchSource:0}: Error finding container 64720a76f22f05bafc7b71f340b7226fb28416ed5a7203301d17be4bd024660e: Status 404 returned error can't find the container with id 64720a76f22f05bafc7b71f340b7226fb28416ed5a7203301d17be4bd024660e Mar 08 04:18:54.918081 master-0 kubenswrapper[18592]: I0308 04:18:54.917992 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"017acd52-088f-439e-8ff2-97c079c31eb3","Type":"ContainerStarted","Data":"60267ec69317be5e7032f6c3eeba5424677a851533653d527e58399ba9567d77"} Mar 08 04:18:54.918081 master-0 kubenswrapper[18592]: I0308 04:18:54.918047 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"017acd52-088f-439e-8ff2-97c079c31eb3","Type":"ContainerStarted","Data":"64720a76f22f05bafc7b71f340b7226fb28416ed5a7203301d17be4bd024660e"} Mar 08 04:18:54.918410 master-0 kubenswrapper[18592]: I0308 04:18:54.918183 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 08 04:18:54.947204 master-0 
kubenswrapper[18592]: I0308 04:18:54.947124 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.947098756 podStartE2EDuration="2.947098756s" podCreationTimestamp="2026-03-08 04:18:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 04:18:54.937217178 +0000 UTC m=+1547.035971568" watchObservedRunningTime="2026-03-08 04:18:54.947098756 +0000 UTC m=+1547.045853106" Mar 08 04:19:03.397754 master-0 kubenswrapper[18592]: I0308 04:19:03.397650 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 08 04:19:03.928188 master-0 kubenswrapper[18592]: I0308 04:19:03.928119 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-gz2ch"] Mar 08 04:19:03.929563 master-0 kubenswrapper[18592]: I0308 04:19:03.929507 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-gz2ch" Mar 08 04:19:03.931746 master-0 kubenswrapper[18592]: I0308 04:19:03.931687 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 08 04:19:03.931881 master-0 kubenswrapper[18592]: I0308 04:19:03.931849 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 08 04:19:03.959457 master-0 kubenswrapper[18592]: I0308 04:19:03.959387 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-gz2ch"] Mar 08 04:19:03.974257 master-0 kubenswrapper[18592]: I0308 04:19:03.974205 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8dqd\" (UniqueName: \"kubernetes.io/projected/7b079179-65b3-4f1a-8c57-b4f84a718761-kube-api-access-j8dqd\") pod \"nova-cell0-cell-mapping-gz2ch\" (UID: \"7b079179-65b3-4f1a-8c57-b4f84a718761\") " pod="openstack/nova-cell0-cell-mapping-gz2ch" Mar 08 04:19:03.974936 master-0 kubenswrapper[18592]: I0308 04:19:03.974900 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b079179-65b3-4f1a-8c57-b4f84a718761-config-data\") pod \"nova-cell0-cell-mapping-gz2ch\" (UID: \"7b079179-65b3-4f1a-8c57-b4f84a718761\") " pod="openstack/nova-cell0-cell-mapping-gz2ch" Mar 08 04:19:03.975031 master-0 kubenswrapper[18592]: I0308 04:19:03.975007 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b079179-65b3-4f1a-8c57-b4f84a718761-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-gz2ch\" (UID: \"7b079179-65b3-4f1a-8c57-b4f84a718761\") " pod="openstack/nova-cell0-cell-mapping-gz2ch" Mar 08 04:19:03.975077 master-0 kubenswrapper[18592]: I0308 04:19:03.975047 18592 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b079179-65b3-4f1a-8c57-b4f84a718761-scripts\") pod \"nova-cell0-cell-mapping-gz2ch\" (UID: \"7b079179-65b3-4f1a-8c57-b4f84a718761\") " pod="openstack/nova-cell0-cell-mapping-gz2ch" Mar 08 04:19:04.073625 master-0 kubenswrapper[18592]: I0308 04:19:04.073396 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-compute-ironic-compute-0"] Mar 08 04:19:04.076815 master-0 kubenswrapper[18592]: I0308 04:19:04.076024 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-compute-ironic-compute-0" Mar 08 04:19:04.079299 master-0 kubenswrapper[18592]: I0308 04:19:04.077044 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b079179-65b3-4f1a-8c57-b4f84a718761-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-gz2ch\" (UID: \"7b079179-65b3-4f1a-8c57-b4f84a718761\") " pod="openstack/nova-cell0-cell-mapping-gz2ch" Mar 08 04:19:04.079299 master-0 kubenswrapper[18592]: I0308 04:19:04.077098 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b079179-65b3-4f1a-8c57-b4f84a718761-scripts\") pod \"nova-cell0-cell-mapping-gz2ch\" (UID: \"7b079179-65b3-4f1a-8c57-b4f84a718761\") " pod="openstack/nova-cell0-cell-mapping-gz2ch" Mar 08 04:19:04.079299 master-0 kubenswrapper[18592]: I0308 04:19:04.077244 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8dqd\" (UniqueName: \"kubernetes.io/projected/7b079179-65b3-4f1a-8c57-b4f84a718761-kube-api-access-j8dqd\") pod \"nova-cell0-cell-mapping-gz2ch\" (UID: \"7b079179-65b3-4f1a-8c57-b4f84a718761\") " pod="openstack/nova-cell0-cell-mapping-gz2ch" Mar 08 04:19:04.079299 master-0 kubenswrapper[18592]: I0308 
04:19:04.077272 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b079179-65b3-4f1a-8c57-b4f84a718761-config-data\") pod \"nova-cell0-cell-mapping-gz2ch\" (UID: \"7b079179-65b3-4f1a-8c57-b4f84a718761\") " pod="openstack/nova-cell0-cell-mapping-gz2ch" Mar 08 04:19:04.094952 master-0 kubenswrapper[18592]: I0308 04:19:04.094645 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-compute-ironic-compute-0"] Mar 08 04:19:04.097882 master-0 kubenswrapper[18592]: I0308 04:19:04.097489 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b079179-65b3-4f1a-8c57-b4f84a718761-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-gz2ch\" (UID: \"7b079179-65b3-4f1a-8c57-b4f84a718761\") " pod="openstack/nova-cell0-cell-mapping-gz2ch" Mar 08 04:19:04.107050 master-0 kubenswrapper[18592]: I0308 04:19:04.098053 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b079179-65b3-4f1a-8c57-b4f84a718761-scripts\") pod \"nova-cell0-cell-mapping-gz2ch\" (UID: \"7b079179-65b3-4f1a-8c57-b4f84a718761\") " pod="openstack/nova-cell0-cell-mapping-gz2ch" Mar 08 04:19:04.107050 master-0 kubenswrapper[18592]: I0308 04:19:04.098069 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-ironic-compute-config-data" Mar 08 04:19:04.107050 master-0 kubenswrapper[18592]: I0308 04:19:04.098197 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b079179-65b3-4f1a-8c57-b4f84a718761-config-data\") pod \"nova-cell0-cell-mapping-gz2ch\" (UID: \"7b079179-65b3-4f1a-8c57-b4f84a718761\") " pod="openstack/nova-cell0-cell-mapping-gz2ch" Mar 08 04:19:04.136860 master-0 kubenswrapper[18592]: I0308 04:19:04.134961 18592 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8dqd\" (UniqueName: \"kubernetes.io/projected/7b079179-65b3-4f1a-8c57-b4f84a718761-kube-api-access-j8dqd\") pod \"nova-cell0-cell-mapping-gz2ch\" (UID: \"7b079179-65b3-4f1a-8c57-b4f84a718761\") " pod="openstack/nova-cell0-cell-mapping-gz2ch" Mar 08 04:19:04.180007 master-0 kubenswrapper[18592]: I0308 04:19:04.178976 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e0a6cbb-243a-4dd1-86c7-1ee9d839a00d-config-data\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"0e0a6cbb-243a-4dd1-86c7-1ee9d839a00d\") " pod="openstack/nova-cell1-compute-ironic-compute-0" Mar 08 04:19:04.180007 master-0 kubenswrapper[18592]: I0308 04:19:04.179190 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2x8nk\" (UniqueName: \"kubernetes.io/projected/0e0a6cbb-243a-4dd1-86c7-1ee9d839a00d-kube-api-access-2x8nk\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"0e0a6cbb-243a-4dd1-86c7-1ee9d839a00d\") " pod="openstack/nova-cell1-compute-ironic-compute-0" Mar 08 04:19:04.180007 master-0 kubenswrapper[18592]: I0308 04:19:04.179255 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e0a6cbb-243a-4dd1-86c7-1ee9d839a00d-combined-ca-bundle\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"0e0a6cbb-243a-4dd1-86c7-1ee9d839a00d\") " pod="openstack/nova-cell1-compute-ironic-compute-0" Mar 08 04:19:04.220937 master-0 kubenswrapper[18592]: I0308 04:19:04.220877 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 08 04:19:04.229144 master-0 kubenswrapper[18592]: I0308 04:19:04.227602 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 08 04:19:04.234396 master-0 kubenswrapper[18592]: I0308 04:19:04.233943 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 08 04:19:04.246292 master-0 kubenswrapper[18592]: I0308 04:19:04.246220 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 08 04:19:04.258635 master-0 kubenswrapper[18592]: I0308 04:19:04.257328 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-gz2ch" Mar 08 04:19:04.283252 master-0 kubenswrapper[18592]: I0308 04:19:04.281909 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 08 04:19:04.288510 master-0 kubenswrapper[18592]: I0308 04:19:04.288445 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 08 04:19:04.291760 master-0 kubenswrapper[18592]: I0308 04:19:04.291652 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 08 04:19:04.295578 master-0 kubenswrapper[18592]: I0308 04:19:04.294046 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 08 04:19:04.300190 master-0 kubenswrapper[18592]: I0308 04:19:04.298113 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3baf7c23-212f-4e46-8734-6aebd220851f-logs\") pod \"nova-api-0\" (UID: \"3baf7c23-212f-4e46-8734-6aebd220851f\") " pod="openstack/nova-api-0" Mar 08 04:19:04.300190 master-0 kubenswrapper[18592]: I0308 04:19:04.298265 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e0a6cbb-243a-4dd1-86c7-1ee9d839a00d-config-data\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: 
\"0e0a6cbb-243a-4dd1-86c7-1ee9d839a00d\") " pod="openstack/nova-cell1-compute-ironic-compute-0" Mar 08 04:19:04.300190 master-0 kubenswrapper[18592]: I0308 04:19:04.298308 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3baf7c23-212f-4e46-8734-6aebd220851f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3baf7c23-212f-4e46-8734-6aebd220851f\") " pod="openstack/nova-api-0" Mar 08 04:19:04.300190 master-0 kubenswrapper[18592]: I0308 04:19:04.298400 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2x8nk\" (UniqueName: \"kubernetes.io/projected/0e0a6cbb-243a-4dd1-86c7-1ee9d839a00d-kube-api-access-2x8nk\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"0e0a6cbb-243a-4dd1-86c7-1ee9d839a00d\") " pod="openstack/nova-cell1-compute-ironic-compute-0" Mar 08 04:19:04.300190 master-0 kubenswrapper[18592]: I0308 04:19:04.298428 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7rkz\" (UniqueName: \"kubernetes.io/projected/3baf7c23-212f-4e46-8734-6aebd220851f-kube-api-access-k7rkz\") pod \"nova-api-0\" (UID: \"3baf7c23-212f-4e46-8734-6aebd220851f\") " pod="openstack/nova-api-0" Mar 08 04:19:04.300190 master-0 kubenswrapper[18592]: I0308 04:19:04.298454 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3baf7c23-212f-4e46-8734-6aebd220851f-config-data\") pod \"nova-api-0\" (UID: \"3baf7c23-212f-4e46-8734-6aebd220851f\") " pod="openstack/nova-api-0" Mar 08 04:19:04.300190 master-0 kubenswrapper[18592]: I0308 04:19:04.298486 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e0a6cbb-243a-4dd1-86c7-1ee9d839a00d-combined-ca-bundle\") pod 
\"nova-cell1-compute-ironic-compute-0\" (UID: \"0e0a6cbb-243a-4dd1-86c7-1ee9d839a00d\") " pod="openstack/nova-cell1-compute-ironic-compute-0" Mar 08 04:19:04.304345 master-0 kubenswrapper[18592]: I0308 04:19:04.304181 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e0a6cbb-243a-4dd1-86c7-1ee9d839a00d-config-data\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"0e0a6cbb-243a-4dd1-86c7-1ee9d839a00d\") " pod="openstack/nova-cell1-compute-ironic-compute-0" Mar 08 04:19:04.304715 master-0 kubenswrapper[18592]: I0308 04:19:04.304646 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e0a6cbb-243a-4dd1-86c7-1ee9d839a00d-combined-ca-bundle\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"0e0a6cbb-243a-4dd1-86c7-1ee9d839a00d\") " pod="openstack/nova-cell1-compute-ironic-compute-0" Mar 08 04:19:04.328346 master-0 kubenswrapper[18592]: I0308 04:19:04.328313 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2x8nk\" (UniqueName: \"kubernetes.io/projected/0e0a6cbb-243a-4dd1-86c7-1ee9d839a00d-kube-api-access-2x8nk\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"0e0a6cbb-243a-4dd1-86c7-1ee9d839a00d\") " pod="openstack/nova-cell1-compute-ironic-compute-0" Mar 08 04:19:04.372710 master-0 kubenswrapper[18592]: I0308 04:19:04.371608 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 08 04:19:04.377858 master-0 kubenswrapper[18592]: I0308 04:19:04.377238 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 04:19:04.380584 master-0 kubenswrapper[18592]: I0308 04:19:04.380554 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 08 04:19:04.406427 master-0 kubenswrapper[18592]: I0308 04:19:04.406381 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7rkz\" (UniqueName: \"kubernetes.io/projected/3baf7c23-212f-4e46-8734-6aebd220851f-kube-api-access-k7rkz\") pod \"nova-api-0\" (UID: \"3baf7c23-212f-4e46-8734-6aebd220851f\") " pod="openstack/nova-api-0" Mar 08 04:19:04.412068 master-0 kubenswrapper[18592]: I0308 04:19:04.412033 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3baf7c23-212f-4e46-8734-6aebd220851f-config-data\") pod \"nova-api-0\" (UID: \"3baf7c23-212f-4e46-8734-6aebd220851f\") " pod="openstack/nova-api-0" Mar 08 04:19:04.412332 master-0 kubenswrapper[18592]: I0308 04:19:04.412318 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3baf7c23-212f-4e46-8734-6aebd220851f-logs\") pod \"nova-api-0\" (UID: \"3baf7c23-212f-4e46-8734-6aebd220851f\") " pod="openstack/nova-api-0" Mar 08 04:19:04.425981 master-0 kubenswrapper[18592]: I0308 04:19:04.412613 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3baf7c23-212f-4e46-8734-6aebd220851f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3baf7c23-212f-4e46-8734-6aebd220851f\") " pod="openstack/nova-api-0" Mar 08 04:19:04.426539 master-0 kubenswrapper[18592]: I0308 04:19:04.413376 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 04:19:04.426629 master-0 kubenswrapper[18592]: I0308 04:19:04.421550 18592 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3baf7c23-212f-4e46-8734-6aebd220851f-logs\") pod \"nova-api-0\" (UID: \"3baf7c23-212f-4e46-8734-6aebd220851f\") " pod="openstack/nova-api-0" Mar 08 04:19:04.451878 master-0 kubenswrapper[18592]: I0308 04:19:04.443498 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3baf7c23-212f-4e46-8734-6aebd220851f-config-data\") pod \"nova-api-0\" (UID: \"3baf7c23-212f-4e46-8734-6aebd220851f\") " pod="openstack/nova-api-0" Mar 08 04:19:04.459921 master-0 kubenswrapper[18592]: I0308 04:19:04.457622 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7rkz\" (UniqueName: \"kubernetes.io/projected/3baf7c23-212f-4e46-8734-6aebd220851f-kube-api-access-k7rkz\") pod \"nova-api-0\" (UID: \"3baf7c23-212f-4e46-8734-6aebd220851f\") " pod="openstack/nova-api-0" Mar 08 04:19:04.472252 master-0 kubenswrapper[18592]: I0308 04:19:04.463522 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 04:19:04.472252 master-0 kubenswrapper[18592]: I0308 04:19:04.465195 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 04:19:04.472252 master-0 kubenswrapper[18592]: I0308 04:19:04.467123 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 08 04:19:04.502503 master-0 kubenswrapper[18592]: I0308 04:19:04.499353 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3baf7c23-212f-4e46-8734-6aebd220851f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3baf7c23-212f-4e46-8734-6aebd220851f\") " pod="openstack/nova-api-0" Mar 08 04:19:04.529995 master-0 kubenswrapper[18592]: I0308 04:19:04.529486 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3987d444-6d52-4bec-bcac-e9f562019bbf-config-data\") pod \"nova-metadata-0\" (UID: \"3987d444-6d52-4bec-bcac-e9f562019bbf\") " pod="openstack/nova-metadata-0" Mar 08 04:19:04.529995 master-0 kubenswrapper[18592]: I0308 04:19:04.529567 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkcc5\" (UniqueName: \"kubernetes.io/projected/3987d444-6d52-4bec-bcac-e9f562019bbf-kube-api-access-pkcc5\") pod \"nova-metadata-0\" (UID: \"3987d444-6d52-4bec-bcac-e9f562019bbf\") " pod="openstack/nova-metadata-0" Mar 08 04:19:04.529995 master-0 kubenswrapper[18592]: I0308 04:19:04.529609 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3987d444-6d52-4bec-bcac-e9f562019bbf-logs\") pod \"nova-metadata-0\" (UID: \"3987d444-6d52-4bec-bcac-e9f562019bbf\") " pod="openstack/nova-metadata-0" Mar 08 04:19:04.529995 master-0 kubenswrapper[18592]: I0308 04:19:04.529703 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/886aa90f-56dc-4e58-bafe-d3828e4f8781-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"886aa90f-56dc-4e58-bafe-d3828e4f8781\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 04:19:04.529995 master-0 kubenswrapper[18592]: I0308 04:19:04.529727 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/886aa90f-56dc-4e58-bafe-d3828e4f8781-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"886aa90f-56dc-4e58-bafe-d3828e4f8781\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 04:19:04.529995 master-0 kubenswrapper[18592]: I0308 04:19:04.529750 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74qjt\" (UniqueName: \"kubernetes.io/projected/886aa90f-56dc-4e58-bafe-d3828e4f8781-kube-api-access-74qjt\") pod \"nova-cell1-novncproxy-0\" (UID: \"886aa90f-56dc-4e58-bafe-d3828e4f8781\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 04:19:04.529995 master-0 kubenswrapper[18592]: I0308 04:19:04.529773 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrc85\" (UniqueName: \"kubernetes.io/projected/eb204bfb-9945-45b5-bf3c-e6d60826b959-kube-api-access-wrc85\") pod \"nova-scheduler-0\" (UID: \"eb204bfb-9945-45b5-bf3c-e6d60826b959\") " pod="openstack/nova-scheduler-0" Mar 08 04:19:04.529995 master-0 kubenswrapper[18592]: I0308 04:19:04.529799 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb204bfb-9945-45b5-bf3c-e6d60826b959-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"eb204bfb-9945-45b5-bf3c-e6d60826b959\") " pod="openstack/nova-scheduler-0" Mar 08 04:19:04.529995 master-0 kubenswrapper[18592]: I0308 04:19:04.529855 18592 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3987d444-6d52-4bec-bcac-e9f562019bbf-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3987d444-6d52-4bec-bcac-e9f562019bbf\") " pod="openstack/nova-metadata-0" Mar 08 04:19:04.529995 master-0 kubenswrapper[18592]: I0308 04:19:04.529874 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb204bfb-9945-45b5-bf3c-e6d60826b959-config-data\") pod \"nova-scheduler-0\" (UID: \"eb204bfb-9945-45b5-bf3c-e6d60826b959\") " pod="openstack/nova-scheduler-0" Mar 08 04:19:04.552045 master-0 kubenswrapper[18592]: I0308 04:19:04.550546 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 04:19:04.552045 master-0 kubenswrapper[18592]: I0308 04:19:04.551800 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-compute-ironic-compute-0" Mar 08 04:19:04.671641 master-0 kubenswrapper[18592]: I0308 04:19:04.628049 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58dd7bf96c-57r52"] Mar 08 04:19:04.671641 master-0 kubenswrapper[18592]: I0308 04:19:04.636436 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/886aa90f-56dc-4e58-bafe-d3828e4f8781-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"886aa90f-56dc-4e58-bafe-d3828e4f8781\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 04:19:04.671641 master-0 kubenswrapper[18592]: I0308 04:19:04.636495 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/886aa90f-56dc-4e58-bafe-d3828e4f8781-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"886aa90f-56dc-4e58-bafe-d3828e4f8781\") " 
pod="openstack/nova-cell1-novncproxy-0" Mar 08 04:19:04.671641 master-0 kubenswrapper[18592]: I0308 04:19:04.636529 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74qjt\" (UniqueName: \"kubernetes.io/projected/886aa90f-56dc-4e58-bafe-d3828e4f8781-kube-api-access-74qjt\") pod \"nova-cell1-novncproxy-0\" (UID: \"886aa90f-56dc-4e58-bafe-d3828e4f8781\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 04:19:04.671641 master-0 kubenswrapper[18592]: I0308 04:19:04.636557 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrc85\" (UniqueName: \"kubernetes.io/projected/eb204bfb-9945-45b5-bf3c-e6d60826b959-kube-api-access-wrc85\") pod \"nova-scheduler-0\" (UID: \"eb204bfb-9945-45b5-bf3c-e6d60826b959\") " pod="openstack/nova-scheduler-0" Mar 08 04:19:04.671641 master-0 kubenswrapper[18592]: I0308 04:19:04.636589 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb204bfb-9945-45b5-bf3c-e6d60826b959-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"eb204bfb-9945-45b5-bf3c-e6d60826b959\") " pod="openstack/nova-scheduler-0" Mar 08 04:19:04.671641 master-0 kubenswrapper[18592]: I0308 04:19:04.636615 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3987d444-6d52-4bec-bcac-e9f562019bbf-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3987d444-6d52-4bec-bcac-e9f562019bbf\") " pod="openstack/nova-metadata-0" Mar 08 04:19:04.671641 master-0 kubenswrapper[18592]: I0308 04:19:04.636632 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb204bfb-9945-45b5-bf3c-e6d60826b959-config-data\") pod \"nova-scheduler-0\" (UID: \"eb204bfb-9945-45b5-bf3c-e6d60826b959\") " pod="openstack/nova-scheduler-0" Mar 08 04:19:04.671641 
master-0 kubenswrapper[18592]: I0308 04:19:04.636802 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3987d444-6d52-4bec-bcac-e9f562019bbf-config-data\") pod \"nova-metadata-0\" (UID: \"3987d444-6d52-4bec-bcac-e9f562019bbf\") " pod="openstack/nova-metadata-0" Mar 08 04:19:04.671641 master-0 kubenswrapper[18592]: I0308 04:19:04.636902 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkcc5\" (UniqueName: \"kubernetes.io/projected/3987d444-6d52-4bec-bcac-e9f562019bbf-kube-api-access-pkcc5\") pod \"nova-metadata-0\" (UID: \"3987d444-6d52-4bec-bcac-e9f562019bbf\") " pod="openstack/nova-metadata-0" Mar 08 04:19:04.671641 master-0 kubenswrapper[18592]: I0308 04:19:04.637157 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58dd7bf96c-57r52" Mar 08 04:19:04.671641 master-0 kubenswrapper[18592]: I0308 04:19:04.646295 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3987d444-6d52-4bec-bcac-e9f562019bbf-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3987d444-6d52-4bec-bcac-e9f562019bbf\") " pod="openstack/nova-metadata-0" Mar 08 04:19:04.671641 master-0 kubenswrapper[18592]: I0308 04:19:04.657898 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3987d444-6d52-4bec-bcac-e9f562019bbf-logs\") pod \"nova-metadata-0\" (UID: \"3987d444-6d52-4bec-bcac-e9f562019bbf\") " pod="openstack/nova-metadata-0" Mar 08 04:19:04.671641 master-0 kubenswrapper[18592]: I0308 04:19:04.658444 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3987d444-6d52-4bec-bcac-e9f562019bbf-logs\") pod \"nova-metadata-0\" (UID: \"3987d444-6d52-4bec-bcac-e9f562019bbf\") " 
pod="openstack/nova-metadata-0" Mar 08 04:19:04.671641 master-0 kubenswrapper[18592]: I0308 04:19:04.662321 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/886aa90f-56dc-4e58-bafe-d3828e4f8781-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"886aa90f-56dc-4e58-bafe-d3828e4f8781\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 04:19:04.671641 master-0 kubenswrapper[18592]: I0308 04:19:04.670495 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb204bfb-9945-45b5-bf3c-e6d60826b959-config-data\") pod \"nova-scheduler-0\" (UID: \"eb204bfb-9945-45b5-bf3c-e6d60826b959\") " pod="openstack/nova-scheduler-0" Mar 08 04:19:04.675664 master-0 kubenswrapper[18592]: I0308 04:19:04.675539 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3987d444-6d52-4bec-bcac-e9f562019bbf-config-data\") pod \"nova-metadata-0\" (UID: \"3987d444-6d52-4bec-bcac-e9f562019bbf\") " pod="openstack/nova-metadata-0" Mar 08 04:19:04.689146 master-0 kubenswrapper[18592]: I0308 04:19:04.678761 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/886aa90f-56dc-4e58-bafe-d3828e4f8781-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"886aa90f-56dc-4e58-bafe-d3828e4f8781\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 04:19:04.689146 master-0 kubenswrapper[18592]: I0308 04:19:04.680049 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkcc5\" (UniqueName: \"kubernetes.io/projected/3987d444-6d52-4bec-bcac-e9f562019bbf-kube-api-access-pkcc5\") pod \"nova-metadata-0\" (UID: \"3987d444-6d52-4bec-bcac-e9f562019bbf\") " pod="openstack/nova-metadata-0" Mar 08 04:19:04.689146 master-0 kubenswrapper[18592]: I0308 04:19:04.680438 18592 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb204bfb-9945-45b5-bf3c-e6d60826b959-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"eb204bfb-9945-45b5-bf3c-e6d60826b959\") " pod="openstack/nova-scheduler-0" Mar 08 04:19:04.689146 master-0 kubenswrapper[18592]: I0308 04:19:04.680497 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74qjt\" (UniqueName: \"kubernetes.io/projected/886aa90f-56dc-4e58-bafe-d3828e4f8781-kube-api-access-74qjt\") pod \"nova-cell1-novncproxy-0\" (UID: \"886aa90f-56dc-4e58-bafe-d3828e4f8781\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 04:19:04.689146 master-0 kubenswrapper[18592]: I0308 04:19:04.683458 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrc85\" (UniqueName: \"kubernetes.io/projected/eb204bfb-9945-45b5-bf3c-e6d60826b959-kube-api-access-wrc85\") pod \"nova-scheduler-0\" (UID: \"eb204bfb-9945-45b5-bf3c-e6d60826b959\") " pod="openstack/nova-scheduler-0" Mar 08 04:19:04.689146 master-0 kubenswrapper[18592]: I0308 04:19:04.688610 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58dd7bf96c-57r52"] Mar 08 04:19:04.786292 master-0 kubenswrapper[18592]: I0308 04:19:04.786215 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 08 04:19:04.794576 master-0 kubenswrapper[18592]: I0308 04:19:04.794319 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fp75r\" (UniqueName: \"kubernetes.io/projected/5fbc5df3-3059-42a1-b753-e325df83f3bd-kube-api-access-fp75r\") pod \"dnsmasq-dns-58dd7bf96c-57r52\" (UID: \"5fbc5df3-3059-42a1-b753-e325df83f3bd\") " pod="openstack/dnsmasq-dns-58dd7bf96c-57r52" Mar 08 04:19:04.794576 master-0 kubenswrapper[18592]: I0308 04:19:04.794447 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5fbc5df3-3059-42a1-b753-e325df83f3bd-ovsdbserver-sb\") pod \"dnsmasq-dns-58dd7bf96c-57r52\" (UID: \"5fbc5df3-3059-42a1-b753-e325df83f3bd\") " pod="openstack/dnsmasq-dns-58dd7bf96c-57r52" Mar 08 04:19:04.794576 master-0 kubenswrapper[18592]: I0308 04:19:04.794490 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5fbc5df3-3059-42a1-b753-e325df83f3bd-config\") pod \"dnsmasq-dns-58dd7bf96c-57r52\" (UID: \"5fbc5df3-3059-42a1-b753-e325df83f3bd\") " pod="openstack/dnsmasq-dns-58dd7bf96c-57r52" Mar 08 04:19:04.794987 master-0 kubenswrapper[18592]: I0308 04:19:04.794591 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5fbc5df3-3059-42a1-b753-e325df83f3bd-dns-svc\") pod \"dnsmasq-dns-58dd7bf96c-57r52\" (UID: \"5fbc5df3-3059-42a1-b753-e325df83f3bd\") " pod="openstack/dnsmasq-dns-58dd7bf96c-57r52" Mar 08 04:19:04.794987 master-0 kubenswrapper[18592]: I0308 04:19:04.794615 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/5fbc5df3-3059-42a1-b753-e325df83f3bd-dns-swift-storage-0\") pod \"dnsmasq-dns-58dd7bf96c-57r52\" (UID: \"5fbc5df3-3059-42a1-b753-e325df83f3bd\") " pod="openstack/dnsmasq-dns-58dd7bf96c-57r52" Mar 08 04:19:04.794987 master-0 kubenswrapper[18592]: I0308 04:19:04.794629 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5fbc5df3-3059-42a1-b753-e325df83f3bd-ovsdbserver-nb\") pod \"dnsmasq-dns-58dd7bf96c-57r52\" (UID: \"5fbc5df3-3059-42a1-b753-e325df83f3bd\") " pod="openstack/dnsmasq-dns-58dd7bf96c-57r52" Mar 08 04:19:04.799088 master-0 kubenswrapper[18592]: I0308 04:19:04.799045 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 08 04:19:04.885705 master-0 kubenswrapper[18592]: I0308 04:19:04.885322 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 04:19:04.902798 master-0 kubenswrapper[18592]: I0308 04:19:04.897537 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5fbc5df3-3059-42a1-b753-e325df83f3bd-ovsdbserver-sb\") pod \"dnsmasq-dns-58dd7bf96c-57r52\" (UID: \"5fbc5df3-3059-42a1-b753-e325df83f3bd\") " pod="openstack/dnsmasq-dns-58dd7bf96c-57r52" Mar 08 04:19:04.902798 master-0 kubenswrapper[18592]: I0308 04:19:04.897620 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5fbc5df3-3059-42a1-b753-e325df83f3bd-config\") pod \"dnsmasq-dns-58dd7bf96c-57r52\" (UID: \"5fbc5df3-3059-42a1-b753-e325df83f3bd\") " pod="openstack/dnsmasq-dns-58dd7bf96c-57r52" Mar 08 04:19:04.902798 master-0 kubenswrapper[18592]: I0308 04:19:04.897720 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/5fbc5df3-3059-42a1-b753-e325df83f3bd-dns-svc\") pod \"dnsmasq-dns-58dd7bf96c-57r52\" (UID: \"5fbc5df3-3059-42a1-b753-e325df83f3bd\") " pod="openstack/dnsmasq-dns-58dd7bf96c-57r52" Mar 08 04:19:04.902798 master-0 kubenswrapper[18592]: I0308 04:19:04.897788 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5fbc5df3-3059-42a1-b753-e325df83f3bd-dns-swift-storage-0\") pod \"dnsmasq-dns-58dd7bf96c-57r52\" (UID: \"5fbc5df3-3059-42a1-b753-e325df83f3bd\") " pod="openstack/dnsmasq-dns-58dd7bf96c-57r52" Mar 08 04:19:04.902798 master-0 kubenswrapper[18592]: I0308 04:19:04.897805 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5fbc5df3-3059-42a1-b753-e325df83f3bd-ovsdbserver-nb\") pod \"dnsmasq-dns-58dd7bf96c-57r52\" (UID: \"5fbc5df3-3059-42a1-b753-e325df83f3bd\") " pod="openstack/dnsmasq-dns-58dd7bf96c-57r52" Mar 08 04:19:04.902798 master-0 kubenswrapper[18592]: I0308 04:19:04.897874 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fp75r\" (UniqueName: \"kubernetes.io/projected/5fbc5df3-3059-42a1-b753-e325df83f3bd-kube-api-access-fp75r\") pod \"dnsmasq-dns-58dd7bf96c-57r52\" (UID: \"5fbc5df3-3059-42a1-b753-e325df83f3bd\") " pod="openstack/dnsmasq-dns-58dd7bf96c-57r52" Mar 08 04:19:04.904019 master-0 kubenswrapper[18592]: I0308 04:19:04.903899 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5fbc5df3-3059-42a1-b753-e325df83f3bd-dns-svc\") pod \"dnsmasq-dns-58dd7bf96c-57r52\" (UID: \"5fbc5df3-3059-42a1-b753-e325df83f3bd\") " pod="openstack/dnsmasq-dns-58dd7bf96c-57r52" Mar 08 04:19:04.904271 master-0 kubenswrapper[18592]: I0308 04:19:04.904214 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/5fbc5df3-3059-42a1-b753-e325df83f3bd-ovsdbserver-sb\") pod \"dnsmasq-dns-58dd7bf96c-57r52\" (UID: \"5fbc5df3-3059-42a1-b753-e325df83f3bd\") " pod="openstack/dnsmasq-dns-58dd7bf96c-57r52" Mar 08 04:19:04.906645 master-0 kubenswrapper[18592]: I0308 04:19:04.906293 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5fbc5df3-3059-42a1-b753-e325df83f3bd-config\") pod \"dnsmasq-dns-58dd7bf96c-57r52\" (UID: \"5fbc5df3-3059-42a1-b753-e325df83f3bd\") " pod="openstack/dnsmasq-dns-58dd7bf96c-57r52" Mar 08 04:19:04.908410 master-0 kubenswrapper[18592]: I0308 04:19:04.908359 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5fbc5df3-3059-42a1-b753-e325df83f3bd-dns-swift-storage-0\") pod \"dnsmasq-dns-58dd7bf96c-57r52\" (UID: \"5fbc5df3-3059-42a1-b753-e325df83f3bd\") " pod="openstack/dnsmasq-dns-58dd7bf96c-57r52" Mar 08 04:19:04.916324 master-0 kubenswrapper[18592]: I0308 04:19:04.915739 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 04:19:04.918388 master-0 kubenswrapper[18592]: I0308 04:19:04.918359 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fp75r\" (UniqueName: \"kubernetes.io/projected/5fbc5df3-3059-42a1-b753-e325df83f3bd-kube-api-access-fp75r\") pod \"dnsmasq-dns-58dd7bf96c-57r52\" (UID: \"5fbc5df3-3059-42a1-b753-e325df83f3bd\") " pod="openstack/dnsmasq-dns-58dd7bf96c-57r52" Mar 08 04:19:04.919747 master-0 kubenswrapper[18592]: I0308 04:19:04.919708 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5fbc5df3-3059-42a1-b753-e325df83f3bd-ovsdbserver-nb\") pod \"dnsmasq-dns-58dd7bf96c-57r52\" (UID: \"5fbc5df3-3059-42a1-b753-e325df83f3bd\") " pod="openstack/dnsmasq-dns-58dd7bf96c-57r52" Mar 08 04:19:05.012609 master-0 kubenswrapper[18592]: I0308 04:19:05.010483 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58dd7bf96c-57r52" Mar 08 04:19:05.067899 master-0 kubenswrapper[18592]: I0308 04:19:05.064053 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-gz2ch"] Mar 08 04:19:05.086888 master-0 kubenswrapper[18592]: I0308 04:19:05.085654 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-gz2ch" event={"ID":"7b079179-65b3-4f1a-8c57-b4f84a718761","Type":"ContainerStarted","Data":"d8a24d0f3cb8b9e6c9edf95cb614b93f962110b11a405c31fe6d6fdea0de752a"} Mar 08 04:19:05.251686 master-0 kubenswrapper[18592]: I0308 04:19:05.250619 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-zqczq"] Mar 08 04:19:05.254618 master-0 kubenswrapper[18592]: I0308 04:19:05.252238 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-zqczq" Mar 08 04:19:05.267746 master-0 kubenswrapper[18592]: I0308 04:19:05.260162 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 08 04:19:05.267746 master-0 kubenswrapper[18592]: I0308 04:19:05.262010 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 08 04:19:05.286910 master-0 kubenswrapper[18592]: I0308 04:19:05.274873 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-zqczq"] Mar 08 04:19:05.331398 master-0 kubenswrapper[18592]: I0308 04:19:05.330075 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1953d5ca-ae5f-488f-998f-bea80ea7a09c-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-zqczq\" (UID: \"1953d5ca-ae5f-488f-998f-bea80ea7a09c\") " pod="openstack/nova-cell1-conductor-db-sync-zqczq" Mar 08 04:19:05.331398 master-0 kubenswrapper[18592]: I0308 04:19:05.330258 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qn7x4\" (UniqueName: \"kubernetes.io/projected/1953d5ca-ae5f-488f-998f-bea80ea7a09c-kube-api-access-qn7x4\") pod \"nova-cell1-conductor-db-sync-zqczq\" (UID: \"1953d5ca-ae5f-488f-998f-bea80ea7a09c\") " pod="openstack/nova-cell1-conductor-db-sync-zqczq" Mar 08 04:19:05.331398 master-0 kubenswrapper[18592]: I0308 04:19:05.330283 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1953d5ca-ae5f-488f-998f-bea80ea7a09c-scripts\") pod \"nova-cell1-conductor-db-sync-zqczq\" (UID: \"1953d5ca-ae5f-488f-998f-bea80ea7a09c\") " pod="openstack/nova-cell1-conductor-db-sync-zqczq" Mar 08 04:19:05.331398 master-0 kubenswrapper[18592]: 
I0308 04:19:05.330345 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1953d5ca-ae5f-488f-998f-bea80ea7a09c-config-data\") pod \"nova-cell1-conductor-db-sync-zqczq\" (UID: \"1953d5ca-ae5f-488f-998f-bea80ea7a09c\") " pod="openstack/nova-cell1-conductor-db-sync-zqczq" Mar 08 04:19:05.514237 master-0 kubenswrapper[18592]: I0308 04:19:05.514176 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qn7x4\" (UniqueName: \"kubernetes.io/projected/1953d5ca-ae5f-488f-998f-bea80ea7a09c-kube-api-access-qn7x4\") pod \"nova-cell1-conductor-db-sync-zqczq\" (UID: \"1953d5ca-ae5f-488f-998f-bea80ea7a09c\") " pod="openstack/nova-cell1-conductor-db-sync-zqczq" Mar 08 04:19:05.526482 master-0 kubenswrapper[18592]: I0308 04:19:05.520984 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-compute-ironic-compute-0"] Mar 08 04:19:05.527167 master-0 kubenswrapper[18592]: I0308 04:19:05.526951 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1953d5ca-ae5f-488f-998f-bea80ea7a09c-scripts\") pod \"nova-cell1-conductor-db-sync-zqczq\" (UID: \"1953d5ca-ae5f-488f-998f-bea80ea7a09c\") " pod="openstack/nova-cell1-conductor-db-sync-zqczq" Mar 08 04:19:05.527783 master-0 kubenswrapper[18592]: I0308 04:19:05.527718 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1953d5ca-ae5f-488f-998f-bea80ea7a09c-config-data\") pod \"nova-cell1-conductor-db-sync-zqczq\" (UID: \"1953d5ca-ae5f-488f-998f-bea80ea7a09c\") " pod="openstack/nova-cell1-conductor-db-sync-zqczq" Mar 08 04:19:05.528080 master-0 kubenswrapper[18592]: I0308 04:19:05.528033 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1953d5ca-ae5f-488f-998f-bea80ea7a09c-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-zqczq\" (UID: \"1953d5ca-ae5f-488f-998f-bea80ea7a09c\") " pod="openstack/nova-cell1-conductor-db-sync-zqczq" Mar 08 04:19:05.567235 master-0 kubenswrapper[18592]: I0308 04:19:05.567172 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1953d5ca-ae5f-488f-998f-bea80ea7a09c-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-zqczq\" (UID: \"1953d5ca-ae5f-488f-998f-bea80ea7a09c\") " pod="openstack/nova-cell1-conductor-db-sync-zqczq" Mar 08 04:19:05.617810 master-0 kubenswrapper[18592]: W0308 04:19:05.606959 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3baf7c23_212f_4e46_8734_6aebd220851f.slice/crio-70179208212f89cc314195ace3f70728cbf9f9394a33d5ce568dcd2ed70e70ae WatchSource:0}: Error finding container 70179208212f89cc314195ace3f70728cbf9f9394a33d5ce568dcd2ed70e70ae: Status 404 returned error can't find the container with id 70179208212f89cc314195ace3f70728cbf9f9394a33d5ce568dcd2ed70e70ae Mar 08 04:19:05.619091 master-0 kubenswrapper[18592]: I0308 04:19:05.619023 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1953d5ca-ae5f-488f-998f-bea80ea7a09c-scripts\") pod \"nova-cell1-conductor-db-sync-zqczq\" (UID: \"1953d5ca-ae5f-488f-998f-bea80ea7a09c\") " pod="openstack/nova-cell1-conductor-db-sync-zqczq" Mar 08 04:19:05.630726 master-0 kubenswrapper[18592]: I0308 04:19:05.621050 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 08 04:19:05.630726 master-0 kubenswrapper[18592]: I0308 04:19:05.623254 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qn7x4\" (UniqueName: 
\"kubernetes.io/projected/1953d5ca-ae5f-488f-998f-bea80ea7a09c-kube-api-access-qn7x4\") pod \"nova-cell1-conductor-db-sync-zqczq\" (UID: \"1953d5ca-ae5f-488f-998f-bea80ea7a09c\") " pod="openstack/nova-cell1-conductor-db-sync-zqczq"
Mar 08 04:19:05.630726 master-0 kubenswrapper[18592]: I0308 04:19:05.629646 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1953d5ca-ae5f-488f-998f-bea80ea7a09c-config-data\") pod \"nova-cell1-conductor-db-sync-zqczq\" (UID: \"1953d5ca-ae5f-488f-998f-bea80ea7a09c\") " pod="openstack/nova-cell1-conductor-db-sync-zqczq"
Mar 08 04:19:05.655360 master-0 kubenswrapper[18592]: I0308 04:19:05.651648 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 08 04:19:05.680567 master-0 kubenswrapper[18592]: I0308 04:19:05.680287 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-zqczq"
Mar 08 04:19:05.886062 master-0 kubenswrapper[18592]: I0308 04:19:05.886008 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58dd7bf96c-57r52"]
Mar 08 04:19:05.904870 master-0 kubenswrapper[18592]: I0308 04:19:05.903936 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 08 04:19:05.922537 master-0 kubenswrapper[18592]: W0308 04:19:05.921981 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5fbc5df3_3059_42a1_b753_e325df83f3bd.slice/crio-42171a16a3ca3dcb56e03a2da8f82d627b245ccfc2cd1d284816a44e3380ad26 WatchSource:0}: Error finding container 42171a16a3ca3dcb56e03a2da8f82d627b245ccfc2cd1d284816a44e3380ad26: Status 404 returned error can't find the container with id 42171a16a3ca3dcb56e03a2da8f82d627b245ccfc2cd1d284816a44e3380ad26
Mar 08 04:19:05.922537 master-0 kubenswrapper[18592]: W0308 04:19:05.922257 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3987d444_6d52_4bec_bcac_e9f562019bbf.slice/crio-4e25c7e7d2933a09f7fef50c9ad43bd81aefaf03f0d3b64caef52986875b5b78 WatchSource:0}: Error finding container 4e25c7e7d2933a09f7fef50c9ad43bd81aefaf03f0d3b64caef52986875b5b78: Status 404 returned error can't find the container with id 4e25c7e7d2933a09f7fef50c9ad43bd81aefaf03f0d3b64caef52986875b5b78
Mar 08 04:19:06.054847 master-0 kubenswrapper[18592]: I0308 04:19:06.054597 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 08 04:19:06.172240 master-0 kubenswrapper[18592]: I0308 04:19:06.169524 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"eb204bfb-9945-45b5-bf3c-e6d60826b959","Type":"ContainerStarted","Data":"124d28bee80f4c37d88684f837a94cfaa1a461cb0c09ffbfa1d992e53876912e"}
Mar 08 04:19:06.172240 master-0 kubenswrapper[18592]: I0308 04:19:06.169573 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-gz2ch" event={"ID":"7b079179-65b3-4f1a-8c57-b4f84a718761","Type":"ContainerStarted","Data":"45ad974296a648bc7b38887c2d33fcd930e3fd24f338de6b8310b1bfa6b5adf8"}
Mar 08 04:19:06.173100 master-0 kubenswrapper[18592]: I0308 04:19:06.173055 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3987d444-6d52-4bec-bcac-e9f562019bbf","Type":"ContainerStarted","Data":"4e25c7e7d2933a09f7fef50c9ad43bd81aefaf03f0d3b64caef52986875b5b78"}
Mar 08 04:19:06.175430 master-0 kubenswrapper[18592]: I0308 04:19:06.175389 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-compute-ironic-compute-0" event={"ID":"0e0a6cbb-243a-4dd1-86c7-1ee9d839a00d","Type":"ContainerStarted","Data":"4f0256e39e86b30bae999cb1bc6aba0b533c766a03a5273d4eb7bdb18fc497d6"}
Mar 08 04:19:06.178664 master-0 kubenswrapper[18592]: I0308 04:19:06.178630 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3baf7c23-212f-4e46-8734-6aebd220851f","Type":"ContainerStarted","Data":"70179208212f89cc314195ace3f70728cbf9f9394a33d5ce568dcd2ed70e70ae"}
Mar 08 04:19:06.191031 master-0 kubenswrapper[18592]: I0308 04:19:06.187045 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"886aa90f-56dc-4e58-bafe-d3828e4f8781","Type":"ContainerStarted","Data":"80af8771c9b2fd97c216a2c620111c84a1aa43886542cfb6de2c2eb8ddf910e9"}
Mar 08 04:19:06.195718 master-0 kubenswrapper[18592]: I0308 04:19:06.195516 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-gz2ch" podStartSLOduration=3.195500272 podStartE2EDuration="3.195500272s" podCreationTimestamp="2026-03-08 04:19:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 04:19:06.184509714 +0000 UTC m=+1558.283264064" watchObservedRunningTime="2026-03-08 04:19:06.195500272 +0000 UTC m=+1558.294254622"
Mar 08 04:19:06.197609 master-0 kubenswrapper[18592]: I0308 04:19:06.196550 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd7bf96c-57r52" event={"ID":"5fbc5df3-3059-42a1-b753-e325df83f3bd","Type":"ContainerStarted","Data":"42171a16a3ca3dcb56e03a2da8f82d627b245ccfc2cd1d284816a44e3380ad26"}
Mar 08 04:19:06.334536 master-0 kubenswrapper[18592]: I0308 04:19:06.333651 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-zqczq"]
Mar 08 04:19:07.225937 master-0 kubenswrapper[18592]: I0308 04:19:07.225803 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-zqczq" event={"ID":"1953d5ca-ae5f-488f-998f-bea80ea7a09c","Type":"ContainerStarted","Data":"621d660dd302ca66a361ccaab4d3a4be58c28c37fe95952c072e7ebb1c51670a"}
Mar 08 04:19:07.225937 master-0 kubenswrapper[18592]: I0308 04:19:07.225868 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-zqczq" event={"ID":"1953d5ca-ae5f-488f-998f-bea80ea7a09c","Type":"ContainerStarted","Data":"b0e838330aefa215ec7df93b4b2c747923cc51eb4b496941a60a5adacfb5bc88"}
Mar 08 04:19:07.228717 master-0 kubenswrapper[18592]: I0308 04:19:07.228651 18592 generic.go:334] "Generic (PLEG): container finished" podID="5fbc5df3-3059-42a1-b753-e325df83f3bd" containerID="e51ccc50fb947177adef20d9bab2569798703badf9faa82f42a9e8c66b4fba81" exitCode=0
Mar 08 04:19:07.229454 master-0 kubenswrapper[18592]: I0308 04:19:07.229375 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd7bf96c-57r52" event={"ID":"5fbc5df3-3059-42a1-b753-e325df83f3bd","Type":"ContainerDied","Data":"e51ccc50fb947177adef20d9bab2569798703badf9faa82f42a9e8c66b4fba81"}
Mar 08 04:19:07.274501 master-0 kubenswrapper[18592]: I0308 04:19:07.274293 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-zqczq" podStartSLOduration=2.274273897 podStartE2EDuration="2.274273897s" podCreationTimestamp="2026-03-08 04:19:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 04:19:07.256329029 +0000 UTC m=+1559.355083379" watchObservedRunningTime="2026-03-08 04:19:07.274273897 +0000 UTC m=+1559.373028237"
Mar 08 04:19:08.613129 master-0 kubenswrapper[18592]: I0308 04:19:08.612987 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 08 04:19:08.625455 master-0 kubenswrapper[18592]: I0308 04:19:08.625371 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 08 04:19:10.276155 master-0 kubenswrapper[18592]: I0308 04:19:10.276019 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3baf7c23-212f-4e46-8734-6aebd220851f","Type":"ContainerStarted","Data":"72aeac56b3682dc0408d7058c84d512338a216005814f3f0757ae58c815e221e"}
Mar 08 04:19:10.276155 master-0 kubenswrapper[18592]: I0308 04:19:10.276079 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3baf7c23-212f-4e46-8734-6aebd220851f","Type":"ContainerStarted","Data":"2b8fd88fc1f2577b6d2684b86aeff480750f4e201772fa6287ad7fca66f4f415"}
Mar 08 04:19:10.294347 master-0 kubenswrapper[18592]: I0308 04:19:10.294239 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="886aa90f-56dc-4e58-bafe-d3828e4f8781" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://ddcb223c20eb0a761e62a7bc4b77640e34d5f3bbe403c859a4e251b97e190e3e" gracePeriod=30
Mar 08 04:19:10.294347 master-0 kubenswrapper[18592]: I0308 04:19:10.294259 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"886aa90f-56dc-4e58-bafe-d3828e4f8781","Type":"ContainerStarted","Data":"ddcb223c20eb0a761e62a7bc4b77640e34d5f3bbe403c859a4e251b97e190e3e"}
Mar 08 04:19:10.304355 master-0 kubenswrapper[18592]: I0308 04:19:10.304136 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd7bf96c-57r52" event={"ID":"5fbc5df3-3059-42a1-b753-e325df83f3bd","Type":"ContainerStarted","Data":"594c4626b43e5156aef5f6f37841f7110536f38a59cde5003eb3067fec95e397"}
Mar 08 04:19:10.304515 master-0 kubenswrapper[18592]: I0308 04:19:10.304501 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-58dd7bf96c-57r52"
Mar 08 04:19:10.306572 master-0 kubenswrapper[18592]: I0308 04:19:10.306523 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"eb204bfb-9945-45b5-bf3c-e6d60826b959","Type":"ContainerStarted","Data":"9df132ac71f9ce548624a8ded8e723b1388a034f1518c2705d226d0da4ca21d5"}
Mar 08 04:19:10.310748 master-0 kubenswrapper[18592]: I0308 04:19:10.310712 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3987d444-6d52-4bec-bcac-e9f562019bbf","Type":"ContainerStarted","Data":"1e37089cb47bb604258a34a5e57bbdd2ea51494ac1bce6d7f42875de91135a78"}
Mar 08 04:19:10.310748 master-0 kubenswrapper[18592]: I0308 04:19:10.310747 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3987d444-6d52-4bec-bcac-e9f562019bbf","Type":"ContainerStarted","Data":"1e5e6657486555a1eead830522c65683cd72b3c89bac8b9a7ea458abf0f1b222"}
Mar 08 04:19:10.310904 master-0 kubenswrapper[18592]: I0308 04:19:10.310843 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3987d444-6d52-4bec-bcac-e9f562019bbf" containerName="nova-metadata-log" containerID="cri-o://1e5e6657486555a1eead830522c65683cd72b3c89bac8b9a7ea458abf0f1b222" gracePeriod=30
Mar 08 04:19:10.310941 master-0 kubenswrapper[18592]: I0308 04:19:10.310921 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3987d444-6d52-4bec-bcac-e9f562019bbf" containerName="nova-metadata-metadata" containerID="cri-o://1e37089cb47bb604258a34a5e57bbdd2ea51494ac1bce6d7f42875de91135a78" gracePeriod=30
Mar 08 04:19:10.576140 master-0 kubenswrapper[18592]: I0308 04:19:10.575917 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.888682328 podStartE2EDuration="6.57581919s" podCreationTimestamp="2026-03-08 04:19:04 +0000 UTC" firstStartedPulling="2026-03-08 04:19:05.633766658 +0000 UTC m=+1557.732521008" lastFinishedPulling="2026-03-08 04:19:09.32090351 +0000 UTC m=+1561.419657870" observedRunningTime="2026-03-08 04:19:10.538848996 +0000 UTC m=+1562.637603366" watchObservedRunningTime="2026-03-08 04:19:10.57581919 +0000 UTC m=+1562.674573540"
Mar 08 04:19:10.584344 master-0 kubenswrapper[18592]: I0308 04:19:10.584239 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.337457303 podStartE2EDuration="6.584215888s" podCreationTimestamp="2026-03-08 04:19:04 +0000 UTC" firstStartedPulling="2026-03-08 04:19:06.076976772 +0000 UTC m=+1558.175731122" lastFinishedPulling="2026-03-08 04:19:09.323735357 +0000 UTC m=+1561.422489707" observedRunningTime="2026-03-08 04:19:10.570009272 +0000 UTC m=+1562.668763622" watchObservedRunningTime="2026-03-08 04:19:10.584215888 +0000 UTC m=+1562.682970238"
Mar 08 04:19:10.634853 master-0 kubenswrapper[18592]: I0308 04:19:10.633770 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.97856341 podStartE2EDuration="6.633742984s" podCreationTimestamp="2026-03-08 04:19:04 +0000 UTC" firstStartedPulling="2026-03-08 04:19:05.636114622 +0000 UTC m=+1557.734868972" lastFinishedPulling="2026-03-08 04:19:09.291294186 +0000 UTC m=+1561.390048546" observedRunningTime="2026-03-08 04:19:10.601238891 +0000 UTC m=+1562.699993261" watchObservedRunningTime="2026-03-08 04:19:10.633742984 +0000 UTC m=+1562.732497334"
Mar 08 04:19:10.681521 master-0 kubenswrapper[18592]: I0308 04:19:10.680676 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.315109236 podStartE2EDuration="6.680652679s" podCreationTimestamp="2026-03-08 04:19:04 +0000 UTC" firstStartedPulling="2026-03-08 04:19:05.925799104 +0000 UTC m=+1558.024553454" lastFinishedPulling="2026-03-08 04:19:09.291342547 +0000 UTC m=+1561.390096897" observedRunningTime="2026-03-08 04:19:10.630882097 +0000 UTC m=+1562.729636447" watchObservedRunningTime="2026-03-08 04:19:10.680652679 +0000 UTC m=+1562.779407029"
Mar 08 04:19:10.687919 master-0 kubenswrapper[18592]: I0308 04:19:10.687260 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-58dd7bf96c-57r52" podStartSLOduration=6.687238568 podStartE2EDuration="6.687238568s" podCreationTimestamp="2026-03-08 04:19:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 04:19:10.660429409 +0000 UTC m=+1562.759183769" watchObservedRunningTime="2026-03-08 04:19:10.687238568 +0000 UTC m=+1562.785992908"
Mar 08 04:19:11.345916 master-0 kubenswrapper[18592]: I0308 04:19:11.345756 18592 generic.go:334] "Generic (PLEG): container finished" podID="3987d444-6d52-4bec-bcac-e9f562019bbf" containerID="1e5e6657486555a1eead830522c65683cd72b3c89bac8b9a7ea458abf0f1b222" exitCode=143
Mar 08 04:19:11.346475 master-0 kubenswrapper[18592]: I0308 04:19:11.345991 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3987d444-6d52-4bec-bcac-e9f562019bbf","Type":"ContainerDied","Data":"1e5e6657486555a1eead830522c65683cd72b3c89bac8b9a7ea458abf0f1b222"}
Mar 08 04:19:14.788275 master-0 kubenswrapper[18592]: I0308 04:19:14.788205 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Mar 08 04:19:14.799856 master-0 kubenswrapper[18592]: I0308 04:19:14.799753 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 08 04:19:14.799856 master-0 kubenswrapper[18592]: I0308 04:19:14.799836 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 08 04:19:14.894853 master-0 kubenswrapper[18592]: I0308 04:19:14.891079 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 08 04:19:14.894853 master-0 kubenswrapper[18592]: I0308 04:19:14.891150 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 08 04:19:14.918062 master-0 kubenswrapper[18592]: I0308 04:19:14.918015 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Mar 08 04:19:14.918062 master-0 kubenswrapper[18592]: I0308 04:19:14.918061 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Mar 08 04:19:14.951012 master-0 kubenswrapper[18592]: I0308 04:19:14.950968 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Mar 08 04:19:15.015723 master-0 kubenswrapper[18592]: I0308 04:19:15.015310 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-58dd7bf96c-57r52"
Mar 08 04:19:15.114039 master-0 kubenswrapper[18592]: I0308 04:19:15.113206 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78b99bbb9f-4bv2m"]
Mar 08 04:19:15.114039 master-0 kubenswrapper[18592]: I0308 04:19:15.113452 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-78b99bbb9f-4bv2m" podUID="09bdbbb9-e374-47e8-8ed6-48c2ffd8e252" containerName="dnsmasq-dns" containerID="cri-o://97c60d17ee3243d1ac5dc1fdcc1a240bb0578b08472552c1cb07b9e8c4f1bb6a" gracePeriod=10
Mar 08 04:19:15.491571 master-0 kubenswrapper[18592]: I0308 04:19:15.491508 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Mar 08 04:19:15.897108 master-0 kubenswrapper[18592]: I0308 04:19:15.897050 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="3baf7c23-212f-4e46-8734-6aebd220851f" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.128.1.0:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 08 04:19:15.898023 master-0 kubenswrapper[18592]: I0308 04:19:15.897299 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="3baf7c23-212f-4e46-8734-6aebd220851f" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.128.1.0:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 08 04:19:18.619074 master-0 kubenswrapper[18592]: I0308 04:19:18.618409 18592 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-78b99bbb9f-4bv2m" podUID="09bdbbb9-e374-47e8-8ed6-48c2ffd8e252" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.128.0.240:5353: connect: connection refused"
Mar 08 04:19:20.533674 master-0 kubenswrapper[18592]: I0308 04:19:20.533172 18592 generic.go:334] "Generic (PLEG): container finished" podID="09bdbbb9-e374-47e8-8ed6-48c2ffd8e252" containerID="97c60d17ee3243d1ac5dc1fdcc1a240bb0578b08472552c1cb07b9e8c4f1bb6a" exitCode=0
Mar 08 04:19:20.533674 master-0 kubenswrapper[18592]: I0308 04:19:20.533260 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78b99bbb9f-4bv2m" event={"ID":"09bdbbb9-e374-47e8-8ed6-48c2ffd8e252","Type":"ContainerDied","Data":"97c60d17ee3243d1ac5dc1fdcc1a240bb0578b08472552c1cb07b9e8c4f1bb6a"}
Mar 08 04:19:20.538601 master-0 kubenswrapper[18592]: I0308 04:19:20.537668 18592 generic.go:334] "Generic (PLEG): container finished" podID="7b079179-65b3-4f1a-8c57-b4f84a718761" containerID="45ad974296a648bc7b38887c2d33fcd930e3fd24f338de6b8310b1bfa6b5adf8" exitCode=0
Mar 08 04:19:20.538601 master-0 kubenswrapper[18592]: I0308 04:19:20.537756 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-gz2ch" event={"ID":"7b079179-65b3-4f1a-8c57-b4f84a718761","Type":"ContainerDied","Data":"45ad974296a648bc7b38887c2d33fcd930e3fd24f338de6b8310b1bfa6b5adf8"}
Mar 08 04:19:20.544234 master-0 kubenswrapper[18592]: I0308 04:19:20.542162 18592 generic.go:334] "Generic (PLEG): container finished" podID="1953d5ca-ae5f-488f-998f-bea80ea7a09c" containerID="621d660dd302ca66a361ccaab4d3a4be58c28c37fe95952c072e7ebb1c51670a" exitCode=0
Mar 08 04:19:20.544234 master-0 kubenswrapper[18592]: I0308 04:19:20.542204 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-zqczq" event={"ID":"1953d5ca-ae5f-488f-998f-bea80ea7a09c","Type":"ContainerDied","Data":"621d660dd302ca66a361ccaab4d3a4be58c28c37fe95952c072e7ebb1c51670a"}
Mar 08 04:19:20.797194 master-0 kubenswrapper[18592]: I0308 04:19:20.797075 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78b99bbb9f-4bv2m"
Mar 08 04:19:20.903073 master-0 kubenswrapper[18592]: I0308 04:19:20.901660 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/09bdbbb9-e374-47e8-8ed6-48c2ffd8e252-ovsdbserver-nb\") pod \"09bdbbb9-e374-47e8-8ed6-48c2ffd8e252\" (UID: \"09bdbbb9-e374-47e8-8ed6-48c2ffd8e252\") "
Mar 08 04:19:20.903073 master-0 kubenswrapper[18592]: I0308 04:19:20.901793 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/09bdbbb9-e374-47e8-8ed6-48c2ffd8e252-ovsdbserver-sb\") pod \"09bdbbb9-e374-47e8-8ed6-48c2ffd8e252\" (UID: \"09bdbbb9-e374-47e8-8ed6-48c2ffd8e252\") "
Mar 08 04:19:20.903073 master-0 kubenswrapper[18592]: I0308 04:19:20.901860 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/09bdbbb9-e374-47e8-8ed6-48c2ffd8e252-dns-svc\") pod \"09bdbbb9-e374-47e8-8ed6-48c2ffd8e252\" (UID: \"09bdbbb9-e374-47e8-8ed6-48c2ffd8e252\") "
Mar 08 04:19:20.903073 master-0 kubenswrapper[18592]: I0308 04:19:20.901923 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nw7st\" (UniqueName: \"kubernetes.io/projected/09bdbbb9-e374-47e8-8ed6-48c2ffd8e252-kube-api-access-nw7st\") pod \"09bdbbb9-e374-47e8-8ed6-48c2ffd8e252\" (UID: \"09bdbbb9-e374-47e8-8ed6-48c2ffd8e252\") "
Mar 08 04:19:20.903073 master-0 kubenswrapper[18592]: I0308 04:19:20.901962 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09bdbbb9-e374-47e8-8ed6-48c2ffd8e252-config\") pod \"09bdbbb9-e374-47e8-8ed6-48c2ffd8e252\" (UID: \"09bdbbb9-e374-47e8-8ed6-48c2ffd8e252\") "
Mar 08 04:19:20.903073 master-0 kubenswrapper[18592]: I0308 04:19:20.902070 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/09bdbbb9-e374-47e8-8ed6-48c2ffd8e252-dns-swift-storage-0\") pod \"09bdbbb9-e374-47e8-8ed6-48c2ffd8e252\" (UID: \"09bdbbb9-e374-47e8-8ed6-48c2ffd8e252\") "
Mar 08 04:19:20.910243 master-0 kubenswrapper[18592]: I0308 04:19:20.910186 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09bdbbb9-e374-47e8-8ed6-48c2ffd8e252-kube-api-access-nw7st" (OuterVolumeSpecName: "kube-api-access-nw7st") pod "09bdbbb9-e374-47e8-8ed6-48c2ffd8e252" (UID: "09bdbbb9-e374-47e8-8ed6-48c2ffd8e252"). InnerVolumeSpecName "kube-api-access-nw7st". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 04:19:20.958468 master-0 kubenswrapper[18592]: I0308 04:19:20.958406 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09bdbbb9-e374-47e8-8ed6-48c2ffd8e252-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "09bdbbb9-e374-47e8-8ed6-48c2ffd8e252" (UID: "09bdbbb9-e374-47e8-8ed6-48c2ffd8e252"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 04:19:20.964186 master-0 kubenswrapper[18592]: I0308 04:19:20.964151 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09bdbbb9-e374-47e8-8ed6-48c2ffd8e252-config" (OuterVolumeSpecName: "config") pod "09bdbbb9-e374-47e8-8ed6-48c2ffd8e252" (UID: "09bdbbb9-e374-47e8-8ed6-48c2ffd8e252"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 04:19:20.969773 master-0 kubenswrapper[18592]: I0308 04:19:20.969723 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09bdbbb9-e374-47e8-8ed6-48c2ffd8e252-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "09bdbbb9-e374-47e8-8ed6-48c2ffd8e252" (UID: "09bdbbb9-e374-47e8-8ed6-48c2ffd8e252"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 04:19:20.973943 master-0 kubenswrapper[18592]: I0308 04:19:20.973887 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09bdbbb9-e374-47e8-8ed6-48c2ffd8e252-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "09bdbbb9-e374-47e8-8ed6-48c2ffd8e252" (UID: "09bdbbb9-e374-47e8-8ed6-48c2ffd8e252"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 04:19:20.990065 master-0 kubenswrapper[18592]: I0308 04:19:20.989941 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09bdbbb9-e374-47e8-8ed6-48c2ffd8e252-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "09bdbbb9-e374-47e8-8ed6-48c2ffd8e252" (UID: "09bdbbb9-e374-47e8-8ed6-48c2ffd8e252"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 04:19:21.005542 master-0 kubenswrapper[18592]: I0308 04:19:21.005473 18592 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/09bdbbb9-e374-47e8-8ed6-48c2ffd8e252-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\""
Mar 08 04:19:21.005542 master-0 kubenswrapper[18592]: I0308 04:19:21.005524 18592 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/09bdbbb9-e374-47e8-8ed6-48c2ffd8e252-dns-svc\") on node \"master-0\" DevicePath \"\""
Mar 08 04:19:21.005542 master-0 kubenswrapper[18592]: I0308 04:19:21.005539 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nw7st\" (UniqueName: \"kubernetes.io/projected/09bdbbb9-e374-47e8-8ed6-48c2ffd8e252-kube-api-access-nw7st\") on node \"master-0\" DevicePath \"\""
Mar 08 04:19:21.005542 master-0 kubenswrapper[18592]: I0308 04:19:21.005557 18592 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09bdbbb9-e374-47e8-8ed6-48c2ffd8e252-config\") on node \"master-0\" DevicePath \"\""
Mar 08 04:19:21.005870 master-0 kubenswrapper[18592]: I0308 04:19:21.005572 18592 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/09bdbbb9-e374-47e8-8ed6-48c2ffd8e252-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\""
Mar 08 04:19:21.005870 master-0 kubenswrapper[18592]: I0308 04:19:21.005586 18592 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/09bdbbb9-e374-47e8-8ed6-48c2ffd8e252-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\""
Mar 08 04:19:21.560380 master-0 kubenswrapper[18592]: I0308 04:19:21.558559 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-compute-ironic-compute-0" event={"ID":"0e0a6cbb-243a-4dd1-86c7-1ee9d839a00d","Type":"ContainerStarted","Data":"476e6f8a0b1ef7019341ad8083de6165ef556f1afd201474705428e2c367a6b8"}
Mar 08 04:19:21.560380 master-0 kubenswrapper[18592]: I0308 04:19:21.560072 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-compute-ironic-compute-0"
Mar 08 04:19:21.564462 master-0 kubenswrapper[18592]: I0308 04:19:21.563061 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78b99bbb9f-4bv2m"
Mar 08 04:19:21.572924 master-0 kubenswrapper[18592]: I0308 04:19:21.572897 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78b99bbb9f-4bv2m" event={"ID":"09bdbbb9-e374-47e8-8ed6-48c2ffd8e252","Type":"ContainerDied","Data":"1f979ec6ce88865ea0f7dbf7ca0c057fc76c0c6b34e72ea8bfa2747405d9b91a"}
Mar 08 04:19:21.573639 master-0 kubenswrapper[18592]: I0308 04:19:21.573616 18592 scope.go:117] "RemoveContainer" containerID="97c60d17ee3243d1ac5dc1fdcc1a240bb0578b08472552c1cb07b9e8c4f1bb6a"
Mar 08 04:19:21.595277 master-0 kubenswrapper[18592]: I0308 04:19:21.595183 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-compute-ironic-compute-0" podStartSLOduration=3.636678412 podStartE2EDuration="18.595157663s" podCreationTimestamp="2026-03-08 04:19:03 +0000 UTC" firstStartedPulling="2026-03-08 04:19:05.510041656 +0000 UTC m=+1557.608796006" lastFinishedPulling="2026-03-08 04:19:20.468520877 +0000 UTC m=+1572.567275257" observedRunningTime="2026-03-08 04:19:21.588279755 +0000 UTC m=+1573.687034105" watchObservedRunningTime="2026-03-08 04:19:21.595157663 +0000 UTC m=+1573.693912023"
Mar 08 04:19:21.616856 master-0 kubenswrapper[18592]: I0308 04:19:21.614464 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-compute-ironic-compute-0"
Mar 08 04:19:21.630024 master-0 kubenswrapper[18592]: I0308 04:19:21.629985 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78b99bbb9f-4bv2m"]
Mar 08 04:19:21.644761 master-0 kubenswrapper[18592]: I0308 04:19:21.644691 18592 scope.go:117] "RemoveContainer" containerID="3020c1f4e33f3277089f2e743f943eff3521ab8f7bc288529f4544ace38be504"
Mar 08 04:19:21.654593 master-0 kubenswrapper[18592]: I0308 04:19:21.654535 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78b99bbb9f-4bv2m"]
Mar 08 04:19:22.159723 master-0 kubenswrapper[18592]: I0308 04:19:22.159596 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09bdbbb9-e374-47e8-8ed6-48c2ffd8e252" path="/var/lib/kubelet/pods/09bdbbb9-e374-47e8-8ed6-48c2ffd8e252/volumes"
Mar 08 04:19:22.196171 master-0 kubenswrapper[18592]: I0308 04:19:22.196133 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-zqczq"
Mar 08 04:19:22.217538 master-0 kubenswrapper[18592]: I0308 04:19:22.217500 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-gz2ch"
Mar 08 04:19:22.342143 master-0 kubenswrapper[18592]: I0308 04:19:22.341321 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b079179-65b3-4f1a-8c57-b4f84a718761-scripts\") pod \"7b079179-65b3-4f1a-8c57-b4f84a718761\" (UID: \"7b079179-65b3-4f1a-8c57-b4f84a718761\") "
Mar 08 04:19:22.342143 master-0 kubenswrapper[18592]: I0308 04:19:22.341437 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1953d5ca-ae5f-488f-998f-bea80ea7a09c-combined-ca-bundle\") pod \"1953d5ca-ae5f-488f-998f-bea80ea7a09c\" (UID: \"1953d5ca-ae5f-488f-998f-bea80ea7a09c\") "
Mar 08 04:19:22.342143 master-0 kubenswrapper[18592]: I0308 04:19:22.341472 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1953d5ca-ae5f-488f-998f-bea80ea7a09c-config-data\") pod \"1953d5ca-ae5f-488f-998f-bea80ea7a09c\" (UID: \"1953d5ca-ae5f-488f-998f-bea80ea7a09c\") "
Mar 08 04:19:22.342143 master-0 kubenswrapper[18592]: I0308 04:19:22.341551 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8dqd\" (UniqueName: \"kubernetes.io/projected/7b079179-65b3-4f1a-8c57-b4f84a718761-kube-api-access-j8dqd\") pod \"7b079179-65b3-4f1a-8c57-b4f84a718761\" (UID: \"7b079179-65b3-4f1a-8c57-b4f84a718761\") "
Mar 08 04:19:22.342143 master-0 kubenswrapper[18592]: I0308 04:19:22.341637 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b079179-65b3-4f1a-8c57-b4f84a718761-combined-ca-bundle\") pod \"7b079179-65b3-4f1a-8c57-b4f84a718761\" (UID: \"7b079179-65b3-4f1a-8c57-b4f84a718761\") "
Mar 08 04:19:22.342143 master-0 kubenswrapper[18592]: I0308 04:19:22.341795 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b079179-65b3-4f1a-8c57-b4f84a718761-config-data\") pod \"7b079179-65b3-4f1a-8c57-b4f84a718761\" (UID: \"7b079179-65b3-4f1a-8c57-b4f84a718761\") "
Mar 08 04:19:22.342143 master-0 kubenswrapper[18592]: I0308 04:19:22.341921 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1953d5ca-ae5f-488f-998f-bea80ea7a09c-scripts\") pod \"1953d5ca-ae5f-488f-998f-bea80ea7a09c\" (UID: \"1953d5ca-ae5f-488f-998f-bea80ea7a09c\") "
Mar 08 04:19:22.342143 master-0 kubenswrapper[18592]: I0308 04:19:22.341961 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qn7x4\" (UniqueName: \"kubernetes.io/projected/1953d5ca-ae5f-488f-998f-bea80ea7a09c-kube-api-access-qn7x4\") pod \"1953d5ca-ae5f-488f-998f-bea80ea7a09c\" (UID: \"1953d5ca-ae5f-488f-998f-bea80ea7a09c\") "
Mar 08 04:19:22.347318 master-0 kubenswrapper[18592]: I0308 04:19:22.346246 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1953d5ca-ae5f-488f-998f-bea80ea7a09c-kube-api-access-qn7x4" (OuterVolumeSpecName: "kube-api-access-qn7x4") pod "1953d5ca-ae5f-488f-998f-bea80ea7a09c" (UID: "1953d5ca-ae5f-488f-998f-bea80ea7a09c"). InnerVolumeSpecName "kube-api-access-qn7x4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 04:19:22.348938 master-0 kubenswrapper[18592]: I0308 04:19:22.348792 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b079179-65b3-4f1a-8c57-b4f84a718761-kube-api-access-j8dqd" (OuterVolumeSpecName: "kube-api-access-j8dqd") pod "7b079179-65b3-4f1a-8c57-b4f84a718761" (UID: "7b079179-65b3-4f1a-8c57-b4f84a718761"). InnerVolumeSpecName "kube-api-access-j8dqd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 04:19:22.349144 master-0 kubenswrapper[18592]: I0308 04:19:22.349016 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1953d5ca-ae5f-488f-998f-bea80ea7a09c-scripts" (OuterVolumeSpecName: "scripts") pod "1953d5ca-ae5f-488f-998f-bea80ea7a09c" (UID: "1953d5ca-ae5f-488f-998f-bea80ea7a09c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 04:19:22.349495 master-0 kubenswrapper[18592]: I0308 04:19:22.349464 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b079179-65b3-4f1a-8c57-b4f84a718761-scripts" (OuterVolumeSpecName: "scripts") pod "7b079179-65b3-4f1a-8c57-b4f84a718761" (UID: "7b079179-65b3-4f1a-8c57-b4f84a718761"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 04:19:22.374652 master-0 kubenswrapper[18592]: I0308 04:19:22.374602 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1953d5ca-ae5f-488f-998f-bea80ea7a09c-config-data" (OuterVolumeSpecName: "config-data") pod "1953d5ca-ae5f-488f-998f-bea80ea7a09c" (UID: "1953d5ca-ae5f-488f-998f-bea80ea7a09c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 04:19:22.379531 master-0 kubenswrapper[18592]: I0308 04:19:22.378540 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b079179-65b3-4f1a-8c57-b4f84a718761-config-data" (OuterVolumeSpecName: "config-data") pod "7b079179-65b3-4f1a-8c57-b4f84a718761" (UID: "7b079179-65b3-4f1a-8c57-b4f84a718761"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 04:19:22.379531 master-0 kubenswrapper[18592]: I0308 04:19:22.378846 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b079179-65b3-4f1a-8c57-b4f84a718761-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7b079179-65b3-4f1a-8c57-b4f84a718761" (UID: "7b079179-65b3-4f1a-8c57-b4f84a718761"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 04:19:22.379531 master-0 kubenswrapper[18592]: I0308 04:19:22.379350 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1953d5ca-ae5f-488f-998f-bea80ea7a09c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1953d5ca-ae5f-488f-998f-bea80ea7a09c" (UID: "1953d5ca-ae5f-488f-998f-bea80ea7a09c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 04:19:22.445321 master-0 kubenswrapper[18592]: I0308 04:19:22.445270 18592 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b079179-65b3-4f1a-8c57-b4f84a718761-scripts\") on node \"master-0\" DevicePath \"\""
Mar 08 04:19:22.445321 master-0 kubenswrapper[18592]: I0308 04:19:22.445313 18592 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1953d5ca-ae5f-488f-998f-bea80ea7a09c-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 08 04:19:22.445321 master-0 kubenswrapper[18592]: I0308 04:19:22.445326 18592 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1953d5ca-ae5f-488f-998f-bea80ea7a09c-config-data\") on node \"master-0\" DevicePath \"\""
Mar 08 04:19:22.445321 master-0 kubenswrapper[18592]: I0308 04:19:22.445335 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j8dqd\" (UniqueName: \"kubernetes.io/projected/7b079179-65b3-4f1a-8c57-b4f84a718761-kube-api-access-j8dqd\") on node \"master-0\" DevicePath \"\""
Mar 08 04:19:22.445668 master-0 kubenswrapper[18592]: I0308 04:19:22.445344 18592 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b079179-65b3-4f1a-8c57-b4f84a718761-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 08 04:19:22.445668 master-0 kubenswrapper[18592]: I0308 04:19:22.445353 18592 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b079179-65b3-4f1a-8c57-b4f84a718761-config-data\") on node \"master-0\" DevicePath \"\""
Mar 08 04:19:22.445668 master-0 kubenswrapper[18592]: I0308 04:19:22.445361 18592 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1953d5ca-ae5f-488f-998f-bea80ea7a09c-scripts\") on node \"master-0\" DevicePath \"\""
Mar 08 04:19:22.445668 master-0 kubenswrapper[18592]: I0308 04:19:22.445369 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qn7x4\" (UniqueName: \"kubernetes.io/projected/1953d5ca-ae5f-488f-998f-bea80ea7a09c-kube-api-access-qn7x4\") on node \"master-0\" DevicePath \"\""
Mar 08 04:19:22.577587 master-0 kubenswrapper[18592]: I0308 04:19:22.577428 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-gz2ch" event={"ID":"7b079179-65b3-4f1a-8c57-b4f84a718761","Type":"ContainerDied","Data":"d8a24d0f3cb8b9e6c9edf95cb614b93f962110b11a405c31fe6d6fdea0de752a"}
Mar 08 04:19:22.577587 master-0 kubenswrapper[18592]: I0308 04:19:22.577587 18592 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8a24d0f3cb8b9e6c9edf95cb614b93f962110b11a405c31fe6d6fdea0de752a"
Mar 08 04:19:22.578261 master-0 kubenswrapper[18592]: I0308 04:19:22.577682 18592 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-gz2ch" Mar 08 04:19:22.582525 master-0 kubenswrapper[18592]: I0308 04:19:22.582443 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-zqczq" event={"ID":"1953d5ca-ae5f-488f-998f-bea80ea7a09c","Type":"ContainerDied","Data":"b0e838330aefa215ec7df93b4b2c747923cc51eb4b496941a60a5adacfb5bc88"} Mar 08 04:19:22.582690 master-0 kubenswrapper[18592]: I0308 04:19:22.582532 18592 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b0e838330aefa215ec7df93b4b2c747923cc51eb4b496941a60a5adacfb5bc88" Mar 08 04:19:22.582690 master-0 kubenswrapper[18592]: I0308 04:19:22.582498 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-zqczq" Mar 08 04:19:22.697998 master-0 kubenswrapper[18592]: I0308 04:19:22.697783 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 08 04:19:22.698387 master-0 kubenswrapper[18592]: E0308 04:19:22.698362 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09bdbbb9-e374-47e8-8ed6-48c2ffd8e252" containerName="dnsmasq-dns" Mar 08 04:19:22.698387 master-0 kubenswrapper[18592]: I0308 04:19:22.698380 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="09bdbbb9-e374-47e8-8ed6-48c2ffd8e252" containerName="dnsmasq-dns" Mar 08 04:19:22.698507 master-0 kubenswrapper[18592]: E0308 04:19:22.698413 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b079179-65b3-4f1a-8c57-b4f84a718761" containerName="nova-manage" Mar 08 04:19:22.698507 master-0 kubenswrapper[18592]: I0308 04:19:22.698419 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b079179-65b3-4f1a-8c57-b4f84a718761" containerName="nova-manage" Mar 08 04:19:22.698507 master-0 kubenswrapper[18592]: E0308 04:19:22.698444 18592 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1953d5ca-ae5f-488f-998f-bea80ea7a09c" containerName="nova-cell1-conductor-db-sync" Mar 08 04:19:22.698507 master-0 kubenswrapper[18592]: I0308 04:19:22.698451 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="1953d5ca-ae5f-488f-998f-bea80ea7a09c" containerName="nova-cell1-conductor-db-sync" Mar 08 04:19:22.698507 master-0 kubenswrapper[18592]: E0308 04:19:22.698464 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09bdbbb9-e374-47e8-8ed6-48c2ffd8e252" containerName="init" Mar 08 04:19:22.698507 master-0 kubenswrapper[18592]: I0308 04:19:22.698470 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="09bdbbb9-e374-47e8-8ed6-48c2ffd8e252" containerName="init" Mar 08 04:19:22.698882 master-0 kubenswrapper[18592]: I0308 04:19:22.698695 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="1953d5ca-ae5f-488f-998f-bea80ea7a09c" containerName="nova-cell1-conductor-db-sync" Mar 08 04:19:22.698882 master-0 kubenswrapper[18592]: I0308 04:19:22.698730 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b079179-65b3-4f1a-8c57-b4f84a718761" containerName="nova-manage" Mar 08 04:19:22.698882 master-0 kubenswrapper[18592]: I0308 04:19:22.698741 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="09bdbbb9-e374-47e8-8ed6-48c2ffd8e252" containerName="dnsmasq-dns" Mar 08 04:19:22.699763 master-0 kubenswrapper[18592]: I0308 04:19:22.699727 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 08 04:19:22.702310 master-0 kubenswrapper[18592]: I0308 04:19:22.702177 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 08 04:19:22.729258 master-0 kubenswrapper[18592]: I0308 04:19:22.729043 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 08 04:19:22.792110 master-0 kubenswrapper[18592]: I0308 04:19:22.792045 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 08 04:19:22.792355 master-0 kubenswrapper[18592]: I0308 04:19:22.792328 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="3baf7c23-212f-4e46-8734-6aebd220851f" containerName="nova-api-log" containerID="cri-o://2b8fd88fc1f2577b6d2684b86aeff480750f4e201772fa6287ad7fca66f4f415" gracePeriod=30 Mar 08 04:19:22.792487 master-0 kubenswrapper[18592]: I0308 04:19:22.792468 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="3baf7c23-212f-4e46-8734-6aebd220851f" containerName="nova-api-api" containerID="cri-o://72aeac56b3682dc0408d7058c84d512338a216005814f3f0757ae58c815e221e" gracePeriod=30 Mar 08 04:19:22.829970 master-0 kubenswrapper[18592]: I0308 04:19:22.829535 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 04:19:22.829970 master-0 kubenswrapper[18592]: I0308 04:19:22.829789 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="eb204bfb-9945-45b5-bf3c-e6d60826b959" containerName="nova-scheduler-scheduler" containerID="cri-o://9df132ac71f9ce548624a8ded8e723b1388a034f1518c2705d226d0da4ca21d5" gracePeriod=30 Mar 08 04:19:22.858158 master-0 kubenswrapper[18592]: I0308 04:19:22.857778 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3590bdae-73b7-424b-897d-20a88497d3d0-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"3590bdae-73b7-424b-897d-20a88497d3d0\") " pod="openstack/nova-cell1-conductor-0" Mar 08 04:19:22.858158 master-0 kubenswrapper[18592]: I0308 04:19:22.857936 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmcn8\" (UniqueName: \"kubernetes.io/projected/3590bdae-73b7-424b-897d-20a88497d3d0-kube-api-access-qmcn8\") pod \"nova-cell1-conductor-0\" (UID: \"3590bdae-73b7-424b-897d-20a88497d3d0\") " pod="openstack/nova-cell1-conductor-0" Mar 08 04:19:22.858470 master-0 kubenswrapper[18592]: I0308 04:19:22.858447 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3590bdae-73b7-424b-897d-20a88497d3d0-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"3590bdae-73b7-424b-897d-20a88497d3d0\") " pod="openstack/nova-cell1-conductor-0" Mar 08 04:19:22.964388 master-0 kubenswrapper[18592]: I0308 04:19:22.964144 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3590bdae-73b7-424b-897d-20a88497d3d0-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"3590bdae-73b7-424b-897d-20a88497d3d0\") " pod="openstack/nova-cell1-conductor-0" Mar 08 04:19:22.964388 master-0 kubenswrapper[18592]: I0308 04:19:22.964205 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmcn8\" (UniqueName: \"kubernetes.io/projected/3590bdae-73b7-424b-897d-20a88497d3d0-kube-api-access-qmcn8\") pod \"nova-cell1-conductor-0\" (UID: \"3590bdae-73b7-424b-897d-20a88497d3d0\") " pod="openstack/nova-cell1-conductor-0" Mar 08 04:19:22.964388 master-0 kubenswrapper[18592]: I0308 04:19:22.964364 18592 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3590bdae-73b7-424b-897d-20a88497d3d0-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"3590bdae-73b7-424b-897d-20a88497d3d0\") " pod="openstack/nova-cell1-conductor-0" Mar 08 04:19:22.969840 master-0 kubenswrapper[18592]: I0308 04:19:22.969799 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3590bdae-73b7-424b-897d-20a88497d3d0-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"3590bdae-73b7-424b-897d-20a88497d3d0\") " pod="openstack/nova-cell1-conductor-0" Mar 08 04:19:22.973323 master-0 kubenswrapper[18592]: I0308 04:19:22.971296 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3590bdae-73b7-424b-897d-20a88497d3d0-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"3590bdae-73b7-424b-897d-20a88497d3d0\") " pod="openstack/nova-cell1-conductor-0" Mar 08 04:19:22.983534 master-0 kubenswrapper[18592]: I0308 04:19:22.983478 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmcn8\" (UniqueName: \"kubernetes.io/projected/3590bdae-73b7-424b-897d-20a88497d3d0-kube-api-access-qmcn8\") pod \"nova-cell1-conductor-0\" (UID: \"3590bdae-73b7-424b-897d-20a88497d3d0\") " pod="openstack/nova-cell1-conductor-0" Mar 08 04:19:23.020532 master-0 kubenswrapper[18592]: I0308 04:19:23.020488 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 08 04:19:23.536896 master-0 kubenswrapper[18592]: I0308 04:19:23.535959 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 08 04:19:23.603633 master-0 kubenswrapper[18592]: I0308 04:19:23.603560 18592 generic.go:334] "Generic (PLEG): container finished" podID="3baf7c23-212f-4e46-8734-6aebd220851f" containerID="2b8fd88fc1f2577b6d2684b86aeff480750f4e201772fa6287ad7fca66f4f415" exitCode=143 Mar 08 04:19:23.604449 master-0 kubenswrapper[18592]: I0308 04:19:23.603625 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3baf7c23-212f-4e46-8734-6aebd220851f","Type":"ContainerDied","Data":"2b8fd88fc1f2577b6d2684b86aeff480750f4e201772fa6287ad7fca66f4f415"} Mar 08 04:19:23.604785 master-0 kubenswrapper[18592]: I0308 04:19:23.604710 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"3590bdae-73b7-424b-897d-20a88497d3d0","Type":"ContainerStarted","Data":"3148a05632620cecaf6f61e1523a7495df3e72c4bf290941215b616ac24692d7"} Mar 08 04:19:24.621675 master-0 kubenswrapper[18592]: I0308 04:19:24.621596 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"3590bdae-73b7-424b-897d-20a88497d3d0","Type":"ContainerStarted","Data":"bdebcfbb7a051e823f8a7fa110bd1b220bcc8b9d5f9566258388a8f14811e52f"} Mar 08 04:19:24.622485 master-0 kubenswrapper[18592]: I0308 04:19:24.621725 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 08 04:19:24.686079 master-0 kubenswrapper[18592]: I0308 04:19:24.685537 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.685514258 podStartE2EDuration="2.685514258s" podCreationTimestamp="2026-03-08 04:19:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 04:19:24.675732971 +0000 UTC m=+1576.774487361" watchObservedRunningTime="2026-03-08 04:19:24.685514258 +0000 UTC m=+1576.784268618" Mar 08 04:19:24.919172 master-0 kubenswrapper[18592]: E0308 04:19:24.919086 18592 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9df132ac71f9ce548624a8ded8e723b1388a034f1518c2705d226d0da4ca21d5" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 08 04:19:24.921737 master-0 kubenswrapper[18592]: E0308 04:19:24.921661 18592 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9df132ac71f9ce548624a8ded8e723b1388a034f1518c2705d226d0da4ca21d5" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 08 04:19:24.923559 master-0 kubenswrapper[18592]: E0308 04:19:24.923491 18592 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9df132ac71f9ce548624a8ded8e723b1388a034f1518c2705d226d0da4ca21d5" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 08 04:19:24.923559 master-0 kubenswrapper[18592]: E0308 04:19:24.923543 18592 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="eb204bfb-9945-45b5-bf3c-e6d60826b959" containerName="nova-scheduler-scheduler" Mar 08 04:19:25.642812 master-0 kubenswrapper[18592]: I0308 04:19:25.642743 18592 generic.go:334] "Generic (PLEG): container finished" 
podID="abe912ba-4a33-4634-a3fb-b6fb09b38d8e" containerID="436fa8830779ba8629c29fae95b09526f25d16ee956367773b8f253f32f5e7d8" exitCode=0 Mar 08 04:19:25.643358 master-0 kubenswrapper[18592]: I0308 04:19:25.642886 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"abe912ba-4a33-4634-a3fb-b6fb09b38d8e","Type":"ContainerDied","Data":"436fa8830779ba8629c29fae95b09526f25d16ee956367773b8f253f32f5e7d8"} Mar 08 04:19:26.575092 master-0 kubenswrapper[18592]: I0308 04:19:26.575051 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 08 04:19:26.660974 master-0 kubenswrapper[18592]: I0308 04:19:26.660859 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3baf7c23-212f-4e46-8734-6aebd220851f-config-data\") pod \"3baf7c23-212f-4e46-8734-6aebd220851f\" (UID: \"3baf7c23-212f-4e46-8734-6aebd220851f\") " Mar 08 04:19:26.660974 master-0 kubenswrapper[18592]: I0308 04:19:26.660934 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3baf7c23-212f-4e46-8734-6aebd220851f-combined-ca-bundle\") pod \"3baf7c23-212f-4e46-8734-6aebd220851f\" (UID: \"3baf7c23-212f-4e46-8734-6aebd220851f\") " Mar 08 04:19:26.661441 master-0 kubenswrapper[18592]: I0308 04:19:26.660993 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3baf7c23-212f-4e46-8734-6aebd220851f-logs\") pod \"3baf7c23-212f-4e46-8734-6aebd220851f\" (UID: \"3baf7c23-212f-4e46-8734-6aebd220851f\") " Mar 08 04:19:26.661441 master-0 kubenswrapper[18592]: I0308 04:19:26.661138 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7rkz\" (UniqueName: 
\"kubernetes.io/projected/3baf7c23-212f-4e46-8734-6aebd220851f-kube-api-access-k7rkz\") pod \"3baf7c23-212f-4e46-8734-6aebd220851f\" (UID: \"3baf7c23-212f-4e46-8734-6aebd220851f\") " Mar 08 04:19:26.661973 master-0 kubenswrapper[18592]: I0308 04:19:26.661900 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3baf7c23-212f-4e46-8734-6aebd220851f-logs" (OuterVolumeSpecName: "logs") pod "3baf7c23-212f-4e46-8734-6aebd220851f" (UID: "3baf7c23-212f-4e46-8734-6aebd220851f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 04:19:26.661973 master-0 kubenswrapper[18592]: I0308 04:19:26.661912 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"abe912ba-4a33-4634-a3fb-b6fb09b38d8e","Type":"ContainerStarted","Data":"0ced4f7e3d2a40f74eb104a537093b96ef01a7441df6f02747c1450494cb1bf7"} Mar 08 04:19:26.664385 master-0 kubenswrapper[18592]: I0308 04:19:26.664346 18592 generic.go:334] "Generic (PLEG): container finished" podID="3baf7c23-212f-4e46-8734-6aebd220851f" containerID="72aeac56b3682dc0408d7058c84d512338a216005814f3f0757ae58c815e221e" exitCode=0 Mar 08 04:19:26.664472 master-0 kubenswrapper[18592]: I0308 04:19:26.664412 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3baf7c23-212f-4e46-8734-6aebd220851f","Type":"ContainerDied","Data":"72aeac56b3682dc0408d7058c84d512338a216005814f3f0757ae58c815e221e"} Mar 08 04:19:26.664517 master-0 kubenswrapper[18592]: I0308 04:19:26.664492 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3baf7c23-212f-4e46-8734-6aebd220851f","Type":"ContainerDied","Data":"70179208212f89cc314195ace3f70728cbf9f9394a33d5ce568dcd2ed70e70ae"} Mar 08 04:19:26.664552 master-0 kubenswrapper[18592]: I0308 04:19:26.664514 18592 scope.go:117] "RemoveContainer" 
containerID="72aeac56b3682dc0408d7058c84d512338a216005814f3f0757ae58c815e221e" Mar 08 04:19:26.664678 master-0 kubenswrapper[18592]: I0308 04:19:26.664660 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 08 04:19:26.667937 master-0 kubenswrapper[18592]: I0308 04:19:26.667897 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3baf7c23-212f-4e46-8734-6aebd220851f-kube-api-access-k7rkz" (OuterVolumeSpecName: "kube-api-access-k7rkz") pod "3baf7c23-212f-4e46-8734-6aebd220851f" (UID: "3baf7c23-212f-4e46-8734-6aebd220851f"). InnerVolumeSpecName "kube-api-access-k7rkz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 04:19:26.695402 master-0 kubenswrapper[18592]: I0308 04:19:26.695104 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3baf7c23-212f-4e46-8734-6aebd220851f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3baf7c23-212f-4e46-8734-6aebd220851f" (UID: "3baf7c23-212f-4e46-8734-6aebd220851f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:19:26.695402 master-0 kubenswrapper[18592]: I0308 04:19:26.695366 18592 scope.go:117] "RemoveContainer" containerID="2b8fd88fc1f2577b6d2684b86aeff480750f4e201772fa6287ad7fca66f4f415" Mar 08 04:19:26.709117 master-0 kubenswrapper[18592]: I0308 04:19:26.709071 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3baf7c23-212f-4e46-8734-6aebd220851f-config-data" (OuterVolumeSpecName: "config-data") pod "3baf7c23-212f-4e46-8734-6aebd220851f" (UID: "3baf7c23-212f-4e46-8734-6aebd220851f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:19:26.732405 master-0 kubenswrapper[18592]: I0308 04:19:26.732357 18592 scope.go:117] "RemoveContainer" containerID="72aeac56b3682dc0408d7058c84d512338a216005814f3f0757ae58c815e221e" Mar 08 04:19:26.732916 master-0 kubenswrapper[18592]: E0308 04:19:26.732884 18592 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72aeac56b3682dc0408d7058c84d512338a216005814f3f0757ae58c815e221e\": container with ID starting with 72aeac56b3682dc0408d7058c84d512338a216005814f3f0757ae58c815e221e not found: ID does not exist" containerID="72aeac56b3682dc0408d7058c84d512338a216005814f3f0757ae58c815e221e" Mar 08 04:19:26.733032 master-0 kubenswrapper[18592]: I0308 04:19:26.732920 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72aeac56b3682dc0408d7058c84d512338a216005814f3f0757ae58c815e221e"} err="failed to get container status \"72aeac56b3682dc0408d7058c84d512338a216005814f3f0757ae58c815e221e\": rpc error: code = NotFound desc = could not find container \"72aeac56b3682dc0408d7058c84d512338a216005814f3f0757ae58c815e221e\": container with ID starting with 72aeac56b3682dc0408d7058c84d512338a216005814f3f0757ae58c815e221e not found: ID does not exist" Mar 08 04:19:26.733032 master-0 kubenswrapper[18592]: I0308 04:19:26.732940 18592 scope.go:117] "RemoveContainer" containerID="2b8fd88fc1f2577b6d2684b86aeff480750f4e201772fa6287ad7fca66f4f415" Mar 08 04:19:26.733490 master-0 kubenswrapper[18592]: E0308 04:19:26.733445 18592 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b8fd88fc1f2577b6d2684b86aeff480750f4e201772fa6287ad7fca66f4f415\": container with ID starting with 2b8fd88fc1f2577b6d2684b86aeff480750f4e201772fa6287ad7fca66f4f415 not found: ID does not exist" 
containerID="2b8fd88fc1f2577b6d2684b86aeff480750f4e201772fa6287ad7fca66f4f415" Mar 08 04:19:26.733564 master-0 kubenswrapper[18592]: I0308 04:19:26.733495 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b8fd88fc1f2577b6d2684b86aeff480750f4e201772fa6287ad7fca66f4f415"} err="failed to get container status \"2b8fd88fc1f2577b6d2684b86aeff480750f4e201772fa6287ad7fca66f4f415\": rpc error: code = NotFound desc = could not find container \"2b8fd88fc1f2577b6d2684b86aeff480750f4e201772fa6287ad7fca66f4f415\": container with ID starting with 2b8fd88fc1f2577b6d2684b86aeff480750f4e201772fa6287ad7fca66f4f415 not found: ID does not exist" Mar 08 04:19:26.763371 master-0 kubenswrapper[18592]: I0308 04:19:26.763311 18592 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3baf7c23-212f-4e46-8734-6aebd220851f-config-data\") on node \"master-0\" DevicePath \"\"" Mar 08 04:19:26.763371 master-0 kubenswrapper[18592]: I0308 04:19:26.763359 18592 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3baf7c23-212f-4e46-8734-6aebd220851f-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 04:19:26.763371 master-0 kubenswrapper[18592]: I0308 04:19:26.763370 18592 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3baf7c23-212f-4e46-8734-6aebd220851f-logs\") on node \"master-0\" DevicePath \"\"" Mar 08 04:19:26.763371 master-0 kubenswrapper[18592]: I0308 04:19:26.763380 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7rkz\" (UniqueName: \"kubernetes.io/projected/3baf7c23-212f-4e46-8734-6aebd220851f-kube-api-access-k7rkz\") on node \"master-0\" DevicePath \"\"" Mar 08 04:19:26.823657 master-0 kubenswrapper[18592]: E0308 04:19:26.823606 18592 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" 
err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb204bfb_9945_45b5_bf3c_e6d60826b959.slice/crio-conmon-9df132ac71f9ce548624a8ded8e723b1388a034f1518c2705d226d0da4ca21d5.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb204bfb_9945_45b5_bf3c_e6d60826b959.slice/crio-9df132ac71f9ce548624a8ded8e723b1388a034f1518c2705d226d0da4ca21d5.scope\": RecentStats: unable to find data in memory cache]" Mar 08 04:19:27.061492 master-0 kubenswrapper[18592]: I0308 04:19:27.061438 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 08 04:19:27.082851 master-0 kubenswrapper[18592]: I0308 04:19:27.079482 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 08 04:19:27.111782 master-0 kubenswrapper[18592]: I0308 04:19:27.111029 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 08 04:19:27.111782 master-0 kubenswrapper[18592]: E0308 04:19:27.111608 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3baf7c23-212f-4e46-8734-6aebd220851f" containerName="nova-api-log" Mar 08 04:19:27.111782 master-0 kubenswrapper[18592]: I0308 04:19:27.111623 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="3baf7c23-212f-4e46-8734-6aebd220851f" containerName="nova-api-log" Mar 08 04:19:27.111782 master-0 kubenswrapper[18592]: E0308 04:19:27.111671 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3baf7c23-212f-4e46-8734-6aebd220851f" containerName="nova-api-api" Mar 08 04:19:27.111782 master-0 kubenswrapper[18592]: I0308 04:19:27.111677 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="3baf7c23-212f-4e46-8734-6aebd220851f" containerName="nova-api-api" Mar 08 04:19:27.112089 master-0 kubenswrapper[18592]: I0308 04:19:27.111977 18592 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="3baf7c23-212f-4e46-8734-6aebd220851f" containerName="nova-api-api"
Mar 08 04:19:27.112089 master-0 kubenswrapper[18592]: I0308 04:19:27.111993 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="3baf7c23-212f-4e46-8734-6aebd220851f" containerName="nova-api-log"
Mar 08 04:19:27.115051 master-0 kubenswrapper[18592]: I0308 04:19:27.115021 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 08 04:19:27.117381 master-0 kubenswrapper[18592]: I0308 04:19:27.117335 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 08 04:19:27.145807 master-0 kubenswrapper[18592]: I0308 04:19:27.145744 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 08 04:19:27.174933 master-0 kubenswrapper[18592]: I0308 04:19:27.174635 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 08 04:19:27.175471 master-0 kubenswrapper[18592]: I0308 04:19:27.175439 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaf095f0-60ad-43de-a29c-6cb21bf470ad-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"eaf095f0-60ad-43de-a29c-6cb21bf470ad\") " pod="openstack/nova-api-0"
Mar 08 04:19:27.175557 master-0 kubenswrapper[18592]: I0308 04:19:27.175531 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eaf095f0-60ad-43de-a29c-6cb21bf470ad-logs\") pod \"nova-api-0\" (UID: \"eaf095f0-60ad-43de-a29c-6cb21bf470ad\") " pod="openstack/nova-api-0"
Mar 08 04:19:27.175601 master-0 kubenswrapper[18592]: I0308 04:19:27.175583 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eaf095f0-60ad-43de-a29c-6cb21bf470ad-config-data\") pod \"nova-api-0\" (UID: \"eaf095f0-60ad-43de-a29c-6cb21bf470ad\") " pod="openstack/nova-api-0"
Mar 08 04:19:27.175641 master-0 kubenswrapper[18592]: I0308 04:19:27.175614 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whp7d\" (UniqueName: \"kubernetes.io/projected/eaf095f0-60ad-43de-a29c-6cb21bf470ad-kube-api-access-whp7d\") pod \"nova-api-0\" (UID: \"eaf095f0-60ad-43de-a29c-6cb21bf470ad\") " pod="openstack/nova-api-0"
Mar 08 04:19:27.282439 master-0 kubenswrapper[18592]: I0308 04:19:27.282376 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb204bfb-9945-45b5-bf3c-e6d60826b959-config-data\") pod \"eb204bfb-9945-45b5-bf3c-e6d60826b959\" (UID: \"eb204bfb-9945-45b5-bf3c-e6d60826b959\") "
Mar 08 04:19:27.282655 master-0 kubenswrapper[18592]: I0308 04:19:27.282628 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb204bfb-9945-45b5-bf3c-e6d60826b959-combined-ca-bundle\") pod \"eb204bfb-9945-45b5-bf3c-e6d60826b959\" (UID: \"eb204bfb-9945-45b5-bf3c-e6d60826b959\") "
Mar 08 04:19:27.282919 master-0 kubenswrapper[18592]: I0308 04:19:27.282894 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrc85\" (UniqueName: \"kubernetes.io/projected/eb204bfb-9945-45b5-bf3c-e6d60826b959-kube-api-access-wrc85\") pod \"eb204bfb-9945-45b5-bf3c-e6d60826b959\" (UID: \"eb204bfb-9945-45b5-bf3c-e6d60826b959\") "
Mar 08 04:19:27.284643 master-0 kubenswrapper[18592]: I0308 04:19:27.284596 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaf095f0-60ad-43de-a29c-6cb21bf470ad-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"eaf095f0-60ad-43de-a29c-6cb21bf470ad\") " pod="openstack/nova-api-0"
Mar 08 04:19:27.285686 master-0 kubenswrapper[18592]: I0308 04:19:27.285650 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eaf095f0-60ad-43de-a29c-6cb21bf470ad-logs\") pod \"nova-api-0\" (UID: \"eaf095f0-60ad-43de-a29c-6cb21bf470ad\") " pod="openstack/nova-api-0"
Mar 08 04:19:27.285835 master-0 kubenswrapper[18592]: I0308 04:19:27.285798 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eaf095f0-60ad-43de-a29c-6cb21bf470ad-config-data\") pod \"nova-api-0\" (UID: \"eaf095f0-60ad-43de-a29c-6cb21bf470ad\") " pod="openstack/nova-api-0"
Mar 08 04:19:27.285967 master-0 kubenswrapper[18592]: I0308 04:19:27.285928 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whp7d\" (UniqueName: \"kubernetes.io/projected/eaf095f0-60ad-43de-a29c-6cb21bf470ad-kube-api-access-whp7d\") pod \"nova-api-0\" (UID: \"eaf095f0-60ad-43de-a29c-6cb21bf470ad\") " pod="openstack/nova-api-0"
Mar 08 04:19:27.288039 master-0 kubenswrapper[18592]: I0308 04:19:27.287998 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eaf095f0-60ad-43de-a29c-6cb21bf470ad-logs\") pod \"nova-api-0\" (UID: \"eaf095f0-60ad-43de-a29c-6cb21bf470ad\") " pod="openstack/nova-api-0"
Mar 08 04:19:27.288450 master-0 kubenswrapper[18592]: I0308 04:19:27.288374 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb204bfb-9945-45b5-bf3c-e6d60826b959-kube-api-access-wrc85" (OuterVolumeSpecName: "kube-api-access-wrc85") pod "eb204bfb-9945-45b5-bf3c-e6d60826b959" (UID: "eb204bfb-9945-45b5-bf3c-e6d60826b959"). InnerVolumeSpecName "kube-api-access-wrc85". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 04:19:27.289248 master-0 kubenswrapper[18592]: I0308 04:19:27.289217 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaf095f0-60ad-43de-a29c-6cb21bf470ad-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"eaf095f0-60ad-43de-a29c-6cb21bf470ad\") " pod="openstack/nova-api-0"
Mar 08 04:19:27.291446 master-0 kubenswrapper[18592]: I0308 04:19:27.291411 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eaf095f0-60ad-43de-a29c-6cb21bf470ad-config-data\") pod \"nova-api-0\" (UID: \"eaf095f0-60ad-43de-a29c-6cb21bf470ad\") " pod="openstack/nova-api-0"
Mar 08 04:19:27.303687 master-0 kubenswrapper[18592]: I0308 04:19:27.303607 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whp7d\" (UniqueName: \"kubernetes.io/projected/eaf095f0-60ad-43de-a29c-6cb21bf470ad-kube-api-access-whp7d\") pod \"nova-api-0\" (UID: \"eaf095f0-60ad-43de-a29c-6cb21bf470ad\") " pod="openstack/nova-api-0"
Mar 08 04:19:27.317588 master-0 kubenswrapper[18592]: I0308 04:19:27.317536 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb204bfb-9945-45b5-bf3c-e6d60826b959-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eb204bfb-9945-45b5-bf3c-e6d60826b959" (UID: "eb204bfb-9945-45b5-bf3c-e6d60826b959"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 04:19:27.331677 master-0 kubenswrapper[18592]: I0308 04:19:27.331242 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb204bfb-9945-45b5-bf3c-e6d60826b959-config-data" (OuterVolumeSpecName: "config-data") pod "eb204bfb-9945-45b5-bf3c-e6d60826b959" (UID: "eb204bfb-9945-45b5-bf3c-e6d60826b959"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 04:19:27.394157 master-0 kubenswrapper[18592]: I0308 04:19:27.394117 18592 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb204bfb-9945-45b5-bf3c-e6d60826b959-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 08 04:19:27.394345 master-0 kubenswrapper[18592]: I0308 04:19:27.394332 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrc85\" (UniqueName: \"kubernetes.io/projected/eb204bfb-9945-45b5-bf3c-e6d60826b959-kube-api-access-wrc85\") on node \"master-0\" DevicePath \"\""
Mar 08 04:19:27.394407 master-0 kubenswrapper[18592]: I0308 04:19:27.394397 18592 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb204bfb-9945-45b5-bf3c-e6d60826b959-config-data\") on node \"master-0\" DevicePath \"\""
Mar 08 04:19:27.457491 master-0 kubenswrapper[18592]: I0308 04:19:27.457447 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 08 04:19:27.697148 master-0 kubenswrapper[18592]: I0308 04:19:27.697092 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"abe912ba-4a33-4634-a3fb-b6fb09b38d8e","Type":"ContainerStarted","Data":"a3443341ef902199bd680438464b9b7665271bdb09ad862f3e1c522787c77807"}
Mar 08 04:19:27.697148 master-0 kubenswrapper[18592]: I0308 04:19:27.697156 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"abe912ba-4a33-4634-a3fb-b6fb09b38d8e","Type":"ContainerStarted","Data":"c71b57997da8878a178cb02a99d8526854787f51058402851be0dedf8af56d19"}
Mar 08 04:19:27.697777 master-0 kubenswrapper[18592]: I0308 04:19:27.697172 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-conductor-0"
Mar 08 04:19:27.697777 master-0 kubenswrapper[18592]: I0308 04:19:27.697189 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-conductor-0"
Mar 08 04:19:27.701662 master-0 kubenswrapper[18592]: I0308 04:19:27.701611 18592 generic.go:334] "Generic (PLEG): container finished" podID="eb204bfb-9945-45b5-bf3c-e6d60826b959" containerID="9df132ac71f9ce548624a8ded8e723b1388a034f1518c2705d226d0da4ca21d5" exitCode=0
Mar 08 04:19:27.701751 master-0 kubenswrapper[18592]: I0308 04:19:27.701732 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"eb204bfb-9945-45b5-bf3c-e6d60826b959","Type":"ContainerDied","Data":"9df132ac71f9ce548624a8ded8e723b1388a034f1518c2705d226d0da4ca21d5"}
Mar 08 04:19:27.701812 master-0 kubenswrapper[18592]: I0308 04:19:27.701761 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"eb204bfb-9945-45b5-bf3c-e6d60826b959","Type":"ContainerDied","Data":"124d28bee80f4c37d88684f837a94cfaa1a461cb0c09ffbfa1d992e53876912e"}
Mar 08 04:19:27.701812 master-0 kubenswrapper[18592]: I0308 04:19:27.701779 18592 scope.go:117] "RemoveContainer" containerID="9df132ac71f9ce548624a8ded8e723b1388a034f1518c2705d226d0da4ca21d5"
Mar 08 04:19:27.703390 master-0 kubenswrapper[18592]: I0308 04:19:27.703288 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 08 04:19:27.743403 master-0 kubenswrapper[18592]: I0308 04:19:27.736738 18592 scope.go:117] "RemoveContainer" containerID="9df132ac71f9ce548624a8ded8e723b1388a034f1518c2705d226d0da4ca21d5"
Mar 08 04:19:27.743403 master-0 kubenswrapper[18592]: E0308 04:19:27.737548 18592 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9df132ac71f9ce548624a8ded8e723b1388a034f1518c2705d226d0da4ca21d5\": container with ID starting with 9df132ac71f9ce548624a8ded8e723b1388a034f1518c2705d226d0da4ca21d5 not found: ID does not exist" containerID="9df132ac71f9ce548624a8ded8e723b1388a034f1518c2705d226d0da4ca21d5"
Mar 08 04:19:27.743403 master-0 kubenswrapper[18592]: I0308 04:19:27.737600 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9df132ac71f9ce548624a8ded8e723b1388a034f1518c2705d226d0da4ca21d5"} err="failed to get container status \"9df132ac71f9ce548624a8ded8e723b1388a034f1518c2705d226d0da4ca21d5\": rpc error: code = NotFound desc = could not find container \"9df132ac71f9ce548624a8ded8e723b1388a034f1518c2705d226d0da4ca21d5\": container with ID starting with 9df132ac71f9ce548624a8ded8e723b1388a034f1518c2705d226d0da4ca21d5 not found: ID does not exist"
Mar 08 04:19:27.750438 master-0 kubenswrapper[18592]: I0308 04:19:27.750279 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-conductor-0" podStartSLOduration=74.699958208 podStartE2EDuration="1m54.750248127s" podCreationTimestamp="2026-03-08 04:17:33 +0000 UTC" firstStartedPulling="2026-03-08 04:17:43.029911798 +0000 UTC m=+1475.128666148" lastFinishedPulling="2026-03-08 04:18:23.080201707 +0000 UTC m=+1515.178956067" observedRunningTime="2026-03-08 04:19:27.727780196 +0000 UTC m=+1579.826534546" watchObservedRunningTime="2026-03-08 04:19:27.750248127 +0000 UTC m=+1579.849002477"
Mar 08 04:19:27.800849 master-0 kubenswrapper[18592]: I0308 04:19:27.800238 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 08 04:19:27.821504 master-0 kubenswrapper[18592]: I0308 04:19:27.820282 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 08 04:19:27.857208 master-0 kubenswrapper[18592]: I0308 04:19:27.857138 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Mar 08 04:19:27.857735 master-0 kubenswrapper[18592]: E0308 04:19:27.857707 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb204bfb-9945-45b5-bf3c-e6d60826b959" containerName="nova-scheduler-scheduler"
Mar 08 04:19:27.857735 master-0 kubenswrapper[18592]: I0308 04:19:27.857727 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb204bfb-9945-45b5-bf3c-e6d60826b959" containerName="nova-scheduler-scheduler"
Mar 08 04:19:27.858055 master-0 kubenswrapper[18592]: I0308 04:19:27.858029 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb204bfb-9945-45b5-bf3c-e6d60826b959" containerName="nova-scheduler-scheduler"
Mar 08 04:19:27.858816 master-0 kubenswrapper[18592]: I0308 04:19:27.858785 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 08 04:19:27.863775 master-0 kubenswrapper[18592]: I0308 04:19:27.863609 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Mar 08 04:19:27.878441 master-0 kubenswrapper[18592]: I0308 04:19:27.878395 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 08 04:19:27.906730 master-0 kubenswrapper[18592]: I0308 04:19:27.906655 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/924fe7b2-8c29-4604-a79a-64d2b5d42e11-config-data\") pod \"nova-scheduler-0\" (UID: \"924fe7b2-8c29-4604-a79a-64d2b5d42e11\") " pod="openstack/nova-scheduler-0"
Mar 08 04:19:27.907321 master-0 kubenswrapper[18592]: I0308 04:19:27.907257 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/924fe7b2-8c29-4604-a79a-64d2b5d42e11-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"924fe7b2-8c29-4604-a79a-64d2b5d42e11\") " pod="openstack/nova-scheduler-0"
Mar 08 04:19:27.907721 master-0 kubenswrapper[18592]: I0308 04:19:27.907681 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9q2n\" (UniqueName: \"kubernetes.io/projected/924fe7b2-8c29-4604-a79a-64d2b5d42e11-kube-api-access-t9q2n\") pod \"nova-scheduler-0\" (UID: \"924fe7b2-8c29-4604-a79a-64d2b5d42e11\") " pod="openstack/nova-scheduler-0"
Mar 08 04:19:27.919908 master-0 kubenswrapper[18592]: W0308 04:19:27.919774 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeaf095f0_60ad_43de_a29c_6cb21bf470ad.slice/crio-dd95339767acbe3df02c787593e43d5406b148c7264064cbb40fab0af998be14 WatchSource:0}: Error finding container dd95339767acbe3df02c787593e43d5406b148c7264064cbb40fab0af998be14: Status 404 returned error can't find the container with id dd95339767acbe3df02c787593e43d5406b148c7264064cbb40fab0af998be14
Mar 08 04:19:27.943883 master-0 kubenswrapper[18592]: I0308 04:19:27.943774 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 08 04:19:28.011972 master-0 kubenswrapper[18592]: I0308 04:19:28.011366 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/924fe7b2-8c29-4604-a79a-64d2b5d42e11-config-data\") pod \"nova-scheduler-0\" (UID: \"924fe7b2-8c29-4604-a79a-64d2b5d42e11\") " pod="openstack/nova-scheduler-0"
Mar 08 04:19:28.012096 master-0 kubenswrapper[18592]: I0308 04:19:28.012075 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/924fe7b2-8c29-4604-a79a-64d2b5d42e11-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"924fe7b2-8c29-4604-a79a-64d2b5d42e11\") " pod="openstack/nova-scheduler-0"
Mar 08 04:19:28.012189 master-0 kubenswrapper[18592]: I0308 04:19:28.012163 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9q2n\" (UniqueName: \"kubernetes.io/projected/924fe7b2-8c29-4604-a79a-64d2b5d42e11-kube-api-access-t9q2n\") pod \"nova-scheduler-0\" (UID: \"924fe7b2-8c29-4604-a79a-64d2b5d42e11\") " pod="openstack/nova-scheduler-0"
Mar 08 04:19:28.014571 master-0 kubenswrapper[18592]: I0308 04:19:28.014529 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/924fe7b2-8c29-4604-a79a-64d2b5d42e11-config-data\") pod \"nova-scheduler-0\" (UID: \"924fe7b2-8c29-4604-a79a-64d2b5d42e11\") " pod="openstack/nova-scheduler-0"
Mar 08 04:19:28.017410 master-0 kubenswrapper[18592]: I0308 04:19:28.017365 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/924fe7b2-8c29-4604-a79a-64d2b5d42e11-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"924fe7b2-8c29-4604-a79a-64d2b5d42e11\") " pod="openstack/nova-scheduler-0"
Mar 08 04:19:28.028461 master-0 kubenswrapper[18592]: I0308 04:19:28.028438 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9q2n\" (UniqueName: \"kubernetes.io/projected/924fe7b2-8c29-4604-a79a-64d2b5d42e11-kube-api-access-t9q2n\") pod \"nova-scheduler-0\" (UID: \"924fe7b2-8c29-4604-a79a-64d2b5d42e11\") " pod="openstack/nova-scheduler-0"
Mar 08 04:19:28.051949 master-0 kubenswrapper[18592]: I0308 04:19:28.051905 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0"
Mar 08 04:19:28.162175 master-0 kubenswrapper[18592]: I0308 04:19:28.161999 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3baf7c23-212f-4e46-8734-6aebd220851f" path="/var/lib/kubelet/pods/3baf7c23-212f-4e46-8734-6aebd220851f/volumes"
Mar 08 04:19:28.162675 master-0 kubenswrapper[18592]: I0308 04:19:28.162644 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb204bfb-9945-45b5-bf3c-e6d60826b959" path="/var/lib/kubelet/pods/eb204bfb-9945-45b5-bf3c-e6d60826b959/volumes"
Mar 08 04:19:28.191589 master-0 kubenswrapper[18592]: I0308 04:19:28.191542 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 08 04:19:28.697841 master-0 kubenswrapper[18592]: W0308 04:19:28.697765 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod924fe7b2_8c29_4604_a79a_64d2b5d42e11.slice/crio-45829af79144168cf1f8c4db2fa251a0c058146ada5c1f2149ce4f7f89f7fe85 WatchSource:0}: Error finding container 45829af79144168cf1f8c4db2fa251a0c058146ada5c1f2149ce4f7f89f7fe85: Status 404 returned error can't find the container with id 45829af79144168cf1f8c4db2fa251a0c058146ada5c1f2149ce4f7f89f7fe85
Mar 08 04:19:28.700606 master-0 kubenswrapper[18592]: I0308 04:19:28.700287 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 08 04:19:28.722443 master-0 kubenswrapper[18592]: I0308 04:19:28.721110 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"eaf095f0-60ad-43de-a29c-6cb21bf470ad","Type":"ContainerStarted","Data":"dee7832c5c7cec471f88b7543bbff442c90b1920f3bd0e442c98e3e21306bf13"}
Mar 08 04:19:28.722443 master-0 kubenswrapper[18592]: I0308 04:19:28.721164 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"eaf095f0-60ad-43de-a29c-6cb21bf470ad","Type":"ContainerStarted","Data":"580f07b20dc274b7ea10da5721b94468bab12dc8f6ca25465ab0c69b63f19c72"}
Mar 08 04:19:28.722443 master-0 kubenswrapper[18592]: I0308 04:19:28.721177 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"eaf095f0-60ad-43de-a29c-6cb21bf470ad","Type":"ContainerStarted","Data":"dd95339767acbe3df02c787593e43d5406b148c7264064cbb40fab0af998be14"}
Mar 08 04:19:28.731891 master-0 kubenswrapper[18592]: I0308 04:19:28.731811 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"924fe7b2-8c29-4604-a79a-64d2b5d42e11","Type":"ContainerStarted","Data":"45829af79144168cf1f8c4db2fa251a0c058146ada5c1f2149ce4f7f89f7fe85"}
Mar 08 04:19:28.746495 master-0 kubenswrapper[18592]: I0308 04:19:28.746416 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.746022985 podStartE2EDuration="1.746022985s" podCreationTimestamp="2026-03-08 04:19:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 04:19:28.741249996 +0000 UTC m=+1580.840004346" watchObservedRunningTime="2026-03-08 04:19:28.746022985 +0000 UTC m=+1580.844777345"
Mar 08 04:19:28.921397 master-0 kubenswrapper[18592]: I0308 04:19:28.921328 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ironic-conductor-0"
Mar 08 04:19:29.745855 master-0 kubenswrapper[18592]: I0308 04:19:29.745663 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"924fe7b2-8c29-4604-a79a-64d2b5d42e11","Type":"ContainerStarted","Data":"46c4ac6a03a7b1b9601ef3e9a177ca316ce89e93d57c72aea9367285d73c9e83"}
Mar 08 04:19:29.785457 master-0 kubenswrapper[18592]: I0308 04:19:29.785366 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.785346517 podStartE2EDuration="2.785346517s" podCreationTimestamp="2026-03-08 04:19:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 04:19:29.765322523 +0000 UTC m=+1581.864076873" watchObservedRunningTime="2026-03-08 04:19:29.785346517 +0000 UTC m=+1581.884100867"
Mar 08 04:19:30.342145 master-0 kubenswrapper[18592]: I0308 04:19:30.342092 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ironic-conductor-0"
Mar 08 04:19:30.865541 master-0 kubenswrapper[18592]: I0308 04:19:30.865470 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-conductor-0"
Mar 08 04:19:31.785031 master-0 kubenswrapper[18592]: I0308 04:19:31.784939 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-conductor-0"
Mar 08 04:19:33.192714 master-0 kubenswrapper[18592]: I0308 04:19:33.192600 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Mar 08 04:19:37.457837 master-0 kubenswrapper[18592]: I0308 04:19:37.457769 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 08 04:19:37.458624 master-0 kubenswrapper[18592]: I0308 04:19:37.457817 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 08 04:19:38.193015 master-0 kubenswrapper[18592]: I0308 04:19:38.192951 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Mar 08 04:19:38.248934 master-0 kubenswrapper[18592]: I0308 04:19:38.248879 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Mar 08 04:19:38.540500 master-0 kubenswrapper[18592]: I0308 04:19:38.540015 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="eaf095f0-60ad-43de-a29c-6cb21bf470ad" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.128.1.7:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 08 04:19:38.540500 master-0 kubenswrapper[18592]: I0308 04:19:38.540102 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="eaf095f0-60ad-43de-a29c-6cb21bf470ad" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.128.1.7:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 08 04:19:38.942537 master-0 kubenswrapper[18592]: I0308 04:19:38.942408 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Mar 08 04:19:40.711197 master-0 kubenswrapper[18592]: I0308 04:19:40.711152 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 08 04:19:40.815700 master-0 kubenswrapper[18592]: I0308 04:19:40.815640 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 08 04:19:40.830874 master-0 kubenswrapper[18592]: I0308 04:19:40.827090 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/886aa90f-56dc-4e58-bafe-d3828e4f8781-config-data\") pod \"886aa90f-56dc-4e58-bafe-d3828e4f8781\" (UID: \"886aa90f-56dc-4e58-bafe-d3828e4f8781\") "
Mar 08 04:19:40.830874 master-0 kubenswrapper[18592]: I0308 04:19:40.827233 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74qjt\" (UniqueName: \"kubernetes.io/projected/886aa90f-56dc-4e58-bafe-d3828e4f8781-kube-api-access-74qjt\") pod \"886aa90f-56dc-4e58-bafe-d3828e4f8781\" (UID: \"886aa90f-56dc-4e58-bafe-d3828e4f8781\") "
Mar 08 04:19:40.830874 master-0 kubenswrapper[18592]: I0308 04:19:40.827490 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/886aa90f-56dc-4e58-bafe-d3828e4f8781-combined-ca-bundle\") pod \"886aa90f-56dc-4e58-bafe-d3828e4f8781\" (UID: \"886aa90f-56dc-4e58-bafe-d3828e4f8781\") "
Mar 08 04:19:40.830874 master-0 kubenswrapper[18592]: I0308 04:19:40.830728 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/886aa90f-56dc-4e58-bafe-d3828e4f8781-kube-api-access-74qjt" (OuterVolumeSpecName: "kube-api-access-74qjt") pod "886aa90f-56dc-4e58-bafe-d3828e4f8781" (UID: "886aa90f-56dc-4e58-bafe-d3828e4f8781"). InnerVolumeSpecName "kube-api-access-74qjt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 04:19:40.868742 master-0 kubenswrapper[18592]: I0308 04:19:40.868637 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/886aa90f-56dc-4e58-bafe-d3828e4f8781-config-data" (OuterVolumeSpecName: "config-data") pod "886aa90f-56dc-4e58-bafe-d3828e4f8781" (UID: "886aa90f-56dc-4e58-bafe-d3828e4f8781"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 04:19:40.871844 master-0 kubenswrapper[18592]: I0308 04:19:40.871783 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/886aa90f-56dc-4e58-bafe-d3828e4f8781-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "886aa90f-56dc-4e58-bafe-d3828e4f8781" (UID: "886aa90f-56dc-4e58-bafe-d3828e4f8781"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 04:19:40.913660 master-0 kubenswrapper[18592]: I0308 04:19:40.913452 18592 generic.go:334] "Generic (PLEG): container finished" podID="3987d444-6d52-4bec-bcac-e9f562019bbf" containerID="1e37089cb47bb604258a34a5e57bbdd2ea51494ac1bce6d7f42875de91135a78" exitCode=137
Mar 08 04:19:40.913660 master-0 kubenswrapper[18592]: I0308 04:19:40.913517 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3987d444-6d52-4bec-bcac-e9f562019bbf","Type":"ContainerDied","Data":"1e37089cb47bb604258a34a5e57bbdd2ea51494ac1bce6d7f42875de91135a78"}
Mar 08 04:19:40.913660 master-0 kubenswrapper[18592]: I0308 04:19:40.913534 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 08 04:19:40.913660 master-0 kubenswrapper[18592]: I0308 04:19:40.913556 18592 scope.go:117] "RemoveContainer" containerID="1e37089cb47bb604258a34a5e57bbdd2ea51494ac1bce6d7f42875de91135a78"
Mar 08 04:19:40.913660 master-0 kubenswrapper[18592]: I0308 04:19:40.913544 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3987d444-6d52-4bec-bcac-e9f562019bbf","Type":"ContainerDied","Data":"4e25c7e7d2933a09f7fef50c9ad43bd81aefaf03f0d3b64caef52986875b5b78"}
Mar 08 04:19:40.917266 master-0 kubenswrapper[18592]: I0308 04:19:40.916949 18592 generic.go:334] "Generic (PLEG): container finished" podID="886aa90f-56dc-4e58-bafe-d3828e4f8781" containerID="ddcb223c20eb0a761e62a7bc4b77640e34d5f3bbe403c859a4e251b97e190e3e" exitCode=137
Mar 08 04:19:40.917266 master-0 kubenswrapper[18592]: I0308 04:19:40.916979 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"886aa90f-56dc-4e58-bafe-d3828e4f8781","Type":"ContainerDied","Data":"ddcb223c20eb0a761e62a7bc4b77640e34d5f3bbe403c859a4e251b97e190e3e"}
Mar 08 04:19:40.917266 master-0 kubenswrapper[18592]: I0308 04:19:40.916997 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"886aa90f-56dc-4e58-bafe-d3828e4f8781","Type":"ContainerDied","Data":"80af8771c9b2fd97c216a2c620111c84a1aa43886542cfb6de2c2eb8ddf910e9"}
Mar 08 04:19:40.918505 master-0 kubenswrapper[18592]: I0308 04:19:40.917410 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 08 04:19:40.936520 master-0 kubenswrapper[18592]: I0308 04:19:40.936476 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3987d444-6d52-4bec-bcac-e9f562019bbf-config-data\") pod \"3987d444-6d52-4bec-bcac-e9f562019bbf\" (UID: \"3987d444-6d52-4bec-bcac-e9f562019bbf\") "
Mar 08 04:19:40.936657 master-0 kubenswrapper[18592]: I0308 04:19:40.936623 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3987d444-6d52-4bec-bcac-e9f562019bbf-combined-ca-bundle\") pod \"3987d444-6d52-4bec-bcac-e9f562019bbf\" (UID: \"3987d444-6d52-4bec-bcac-e9f562019bbf\") "
Mar 08 04:19:40.936799 master-0 kubenswrapper[18592]: I0308 04:19:40.936734 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkcc5\" (UniqueName: \"kubernetes.io/projected/3987d444-6d52-4bec-bcac-e9f562019bbf-kube-api-access-pkcc5\") pod \"3987d444-6d52-4bec-bcac-e9f562019bbf\" (UID: \"3987d444-6d52-4bec-bcac-e9f562019bbf\") "
Mar 08 04:19:40.936799 master-0 kubenswrapper[18592]: I0308 04:19:40.936788 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3987d444-6d52-4bec-bcac-e9f562019bbf-logs\") pod \"3987d444-6d52-4bec-bcac-e9f562019bbf\" (UID: \"3987d444-6d52-4bec-bcac-e9f562019bbf\") "
Mar 08 04:19:40.937287 master-0 kubenswrapper[18592]: I0308 04:19:40.937268 18592 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/886aa90f-56dc-4e58-bafe-d3828e4f8781-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 08 04:19:40.937287 master-0 kubenswrapper[18592]: I0308 04:19:40.937286 18592 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/886aa90f-56dc-4e58-bafe-d3828e4f8781-config-data\") on node \"master-0\" DevicePath \"\""
Mar 08 04:19:40.937381 master-0 kubenswrapper[18592]: I0308 04:19:40.937296 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74qjt\" (UniqueName: \"kubernetes.io/projected/886aa90f-56dc-4e58-bafe-d3828e4f8781-kube-api-access-74qjt\") on node \"master-0\" DevicePath \"\""
Mar 08 04:19:40.937579 master-0 kubenswrapper[18592]: I0308 04:19:40.937515 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3987d444-6d52-4bec-bcac-e9f562019bbf-logs" (OuterVolumeSpecName: "logs") pod "3987d444-6d52-4bec-bcac-e9f562019bbf" (UID: "3987d444-6d52-4bec-bcac-e9f562019bbf"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 04:19:40.940165 master-0 kubenswrapper[18592]: I0308 04:19:40.940105 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3987d444-6d52-4bec-bcac-e9f562019bbf-kube-api-access-pkcc5" (OuterVolumeSpecName: "kube-api-access-pkcc5") pod "3987d444-6d52-4bec-bcac-e9f562019bbf" (UID: "3987d444-6d52-4bec-bcac-e9f562019bbf"). InnerVolumeSpecName "kube-api-access-pkcc5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 04:19:40.963271 master-0 kubenswrapper[18592]: I0308 04:19:40.963225 18592 scope.go:117] "RemoveContainer" containerID="1e5e6657486555a1eead830522c65683cd72b3c89bac8b9a7ea458abf0f1b222"
Mar 08 04:19:40.966040 master-0 kubenswrapper[18592]: I0308 04:19:40.965982 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 08 04:19:40.967626 master-0 kubenswrapper[18592]: I0308 04:19:40.967591 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3987d444-6d52-4bec-bcac-e9f562019bbf-config-data" (OuterVolumeSpecName: "config-data") pod "3987d444-6d52-4bec-bcac-e9f562019bbf" (UID: "3987d444-6d52-4bec-bcac-e9f562019bbf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 04:19:40.983552 master-0 kubenswrapper[18592]: I0308 04:19:40.983496 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 08 04:19:40.997051 master-0 kubenswrapper[18592]: I0308 04:19:40.996992 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3987d444-6d52-4bec-bcac-e9f562019bbf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3987d444-6d52-4bec-bcac-e9f562019bbf" (UID: "3987d444-6d52-4bec-bcac-e9f562019bbf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 04:19:40.997994 master-0 kubenswrapper[18592]: I0308 04:19:40.997951 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 08 04:19:40.998691 master-0 kubenswrapper[18592]: E0308 04:19:40.998610 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3987d444-6d52-4bec-bcac-e9f562019bbf" containerName="nova-metadata-log"
Mar 08 04:19:40.998691 master-0 kubenswrapper[18592]: I0308 04:19:40.998634 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="3987d444-6d52-4bec-bcac-e9f562019bbf" containerName="nova-metadata-log"
Mar 08 04:19:40.998691 master-0 kubenswrapper[18592]: E0308 04:19:40.998657 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="886aa90f-56dc-4e58-bafe-d3828e4f8781" containerName="nova-cell1-novncproxy-novncproxy"
Mar 08 04:19:40.998691 master-0 kubenswrapper[18592]: I0308 04:19:40.998668 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="886aa90f-56dc-4e58-bafe-d3828e4f8781" containerName="nova-cell1-novncproxy-novncproxy"
Mar 08 04:19:40.998874 master-0 kubenswrapper[18592]: E0308 04:19:40.998699 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3987d444-6d52-4bec-bcac-e9f562019bbf" containerName="nova-metadata-metadata"
Mar 08 04:19:40.998874 master-0 kubenswrapper[18592]: I0308 04:19:40.998709 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="3987d444-6d52-4bec-bcac-e9f562019bbf" containerName="nova-metadata-metadata"
Mar 08 04:19:40.999097 master-0 kubenswrapper[18592]: I0308 04:19:40.999072 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="3987d444-6d52-4bec-bcac-e9f562019bbf" containerName="nova-metadata-log"
Mar 08 04:19:40.999142 master-0 kubenswrapper[18592]: I0308 04:19:40.999101 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="886aa90f-56dc-4e58-bafe-d3828e4f8781" containerName="nova-cell1-novncproxy-novncproxy"
Mar 08 04:19:40.999142 master-0 kubenswrapper[18592]: I0308 04:19:40.999137 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="3987d444-6d52-4bec-bcac-e9f562019bbf" containerName="nova-metadata-metadata"
Mar 08 04:19:41.000223 master-0 kubenswrapper[18592]: I0308 04:19:41.000193 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 08 04:19:41.010205 master-0 kubenswrapper[18592]: I0308 04:19:41.010165 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 08 04:19:41.014165 master-0 kubenswrapper[18592]: I0308 04:19:41.013310 18592 scope.go:117] "RemoveContainer" containerID="1e37089cb47bb604258a34a5e57bbdd2ea51494ac1bce6d7f42875de91135a78"
Mar 08 04:19:41.014409 master-0 kubenswrapper[18592]: I0308 04:19:41.014377 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt"
Mar 08 04:19:41.014566 master-0 kubenswrapper[18592]: I0308 04:19:41.014524 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Mar 08 04:19:41.014841 master-0 kubenswrapper[18592]: I0308 04:19:41.014753 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc"
Mar 08 04:19:41.023931 master-0 kubenswrapper[18592]: E0308 04:19:41.023774 18592 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e37089cb47bb604258a34a5e57bbdd2ea51494ac1bce6d7f42875de91135a78\": container with ID starting with 1e37089cb47bb604258a34a5e57bbdd2ea51494ac1bce6d7f42875de91135a78 not found: ID does not exist" containerID="1e37089cb47bb604258a34a5e57bbdd2ea51494ac1bce6d7f42875de91135a78"
Mar 08 04:19:41.023931 master-0 kubenswrapper[18592]: I0308 04:19:41.023863 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e37089cb47bb604258a34a5e57bbdd2ea51494ac1bce6d7f42875de91135a78"} err="failed to get container status \"1e37089cb47bb604258a34a5e57bbdd2ea51494ac1bce6d7f42875de91135a78\": rpc error: code = NotFound desc = could not find container \"1e37089cb47bb604258a34a5e57bbdd2ea51494ac1bce6d7f42875de91135a78\": container with ID starting with 1e37089cb47bb604258a34a5e57bbdd2ea51494ac1bce6d7f42875de91135a78 not found: ID does not exist"
Mar 08 04:19:41.023931 master-0 kubenswrapper[18592]: I0308 04:19:41.023898 18592 scope.go:117] "RemoveContainer" containerID="1e5e6657486555a1eead830522c65683cd72b3c89bac8b9a7ea458abf0f1b222"
Mar 08 04:19:41.024289 master-0 kubenswrapper[18592]: E0308 04:19:41.024253 18592 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e5e6657486555a1eead830522c65683cd72b3c89bac8b9a7ea458abf0f1b222\": container with ID starting with 1e5e6657486555a1eead830522c65683cd72b3c89bac8b9a7ea458abf0f1b222 not found: ID does not exist" containerID="1e5e6657486555a1eead830522c65683cd72b3c89bac8b9a7ea458abf0f1b222"
Mar 08 04:19:41.024335 master-0 kubenswrapper[18592]: I0308 04:19:41.024288 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e5e6657486555a1eead830522c65683cd72b3c89bac8b9a7ea458abf0f1b222"} err="failed to get container status \"1e5e6657486555a1eead830522c65683cd72b3c89bac8b9a7ea458abf0f1b222\": rpc error: code = NotFound desc = could not find container \"1e5e6657486555a1eead830522c65683cd72b3c89bac8b9a7ea458abf0f1b222\": container with ID starting with 1e5e6657486555a1eead830522c65683cd72b3c89bac8b9a7ea458abf0f1b222 not found: ID does not exist"
Mar 08 04:19:41.024369 master-0 kubenswrapper[18592]: I0308 04:19:41.024338 18592 scope.go:117] "RemoveContainer" containerID="ddcb223c20eb0a761e62a7bc4b77640e34d5f3bbe403c859a4e251b97e190e3e"
Mar 08 04:19:41.039518 master-0 kubenswrapper[18592]: I0308 04:19:41.039447 18592 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3987d444-6d52-4bec-bcac-e9f562019bbf-config-data\") on node \"master-0\" DevicePath \"\"" Mar 08 04:19:41.039518 master-0 kubenswrapper[18592]: I0308 04:19:41.039516 18592 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3987d444-6d52-4bec-bcac-e9f562019bbf-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 04:19:41.039844 master-0 kubenswrapper[18592]: I0308 04:19:41.039528 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pkcc5\" (UniqueName: \"kubernetes.io/projected/3987d444-6d52-4bec-bcac-e9f562019bbf-kube-api-access-pkcc5\") on node \"master-0\" DevicePath \"\"" Mar 08 04:19:41.039844 master-0 kubenswrapper[18592]: I0308 04:19:41.039539 18592 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3987d444-6d52-4bec-bcac-e9f562019bbf-logs\") on node \"master-0\" DevicePath \"\"" Mar 08 04:19:41.063405 master-0 kubenswrapper[18592]: I0308 04:19:41.063365 18592 scope.go:117] "RemoveContainer" containerID="ddcb223c20eb0a761e62a7bc4b77640e34d5f3bbe403c859a4e251b97e190e3e" Mar 08 04:19:41.063901 master-0 kubenswrapper[18592]: E0308 04:19:41.063818 18592 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddcb223c20eb0a761e62a7bc4b77640e34d5f3bbe403c859a4e251b97e190e3e\": container with ID starting with ddcb223c20eb0a761e62a7bc4b77640e34d5f3bbe403c859a4e251b97e190e3e not found: ID does not exist" containerID="ddcb223c20eb0a761e62a7bc4b77640e34d5f3bbe403c859a4e251b97e190e3e" Mar 08 04:19:41.063982 master-0 kubenswrapper[18592]: I0308 04:19:41.063906 18592 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ddcb223c20eb0a761e62a7bc4b77640e34d5f3bbe403c859a4e251b97e190e3e"} err="failed to get container status \"ddcb223c20eb0a761e62a7bc4b77640e34d5f3bbe403c859a4e251b97e190e3e\": rpc error: code = NotFound desc = could not find container \"ddcb223c20eb0a761e62a7bc4b77640e34d5f3bbe403c859a4e251b97e190e3e\": container with ID starting with ddcb223c20eb0a761e62a7bc4b77640e34d5f3bbe403c859a4e251b97e190e3e not found: ID does not exist" Mar 08 04:19:41.142321 master-0 kubenswrapper[18592]: I0308 04:19:41.141946 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffcc095d-2c9f-44dc-9bef-6f6f962e8d70-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ffcc095d-2c9f-44dc-9bef-6f6f962e8d70\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 04:19:41.142321 master-0 kubenswrapper[18592]: I0308 04:19:41.142017 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kb6qb\" (UniqueName: \"kubernetes.io/projected/ffcc095d-2c9f-44dc-9bef-6f6f962e8d70-kube-api-access-kb6qb\") pod \"nova-cell1-novncproxy-0\" (UID: \"ffcc095d-2c9f-44dc-9bef-6f6f962e8d70\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 04:19:41.142321 master-0 kubenswrapper[18592]: I0308 04:19:41.142159 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/ffcc095d-2c9f-44dc-9bef-6f6f962e8d70-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"ffcc095d-2c9f-44dc-9bef-6f6f962e8d70\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 04:19:41.142321 master-0 kubenswrapper[18592]: I0308 04:19:41.142277 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ffcc095d-2c9f-44dc-9bef-6f6f962e8d70-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"ffcc095d-2c9f-44dc-9bef-6f6f962e8d70\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 04:19:41.142321 master-0 kubenswrapper[18592]: I0308 04:19:41.142339 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffcc095d-2c9f-44dc-9bef-6f6f962e8d70-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ffcc095d-2c9f-44dc-9bef-6f6f962e8d70\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 04:19:41.245259 master-0 kubenswrapper[18592]: I0308 04:19:41.244300 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/ffcc095d-2c9f-44dc-9bef-6f6f962e8d70-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"ffcc095d-2c9f-44dc-9bef-6f6f962e8d70\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 04:19:41.245259 master-0 kubenswrapper[18592]: I0308 04:19:41.244454 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/ffcc095d-2c9f-44dc-9bef-6f6f962e8d70-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"ffcc095d-2c9f-44dc-9bef-6f6f962e8d70\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 04:19:41.245259 master-0 kubenswrapper[18592]: I0308 04:19:41.244494 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffcc095d-2c9f-44dc-9bef-6f6f962e8d70-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ffcc095d-2c9f-44dc-9bef-6f6f962e8d70\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 04:19:41.245259 master-0 kubenswrapper[18592]: I0308 04:19:41.244556 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/ffcc095d-2c9f-44dc-9bef-6f6f962e8d70-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ffcc095d-2c9f-44dc-9bef-6f6f962e8d70\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 04:19:41.245259 master-0 kubenswrapper[18592]: I0308 04:19:41.244578 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kb6qb\" (UniqueName: \"kubernetes.io/projected/ffcc095d-2c9f-44dc-9bef-6f6f962e8d70-kube-api-access-kb6qb\") pod \"nova-cell1-novncproxy-0\" (UID: \"ffcc095d-2c9f-44dc-9bef-6f6f962e8d70\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 04:19:41.249112 master-0 kubenswrapper[18592]: I0308 04:19:41.249067 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/ffcc095d-2c9f-44dc-9bef-6f6f962e8d70-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"ffcc095d-2c9f-44dc-9bef-6f6f962e8d70\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 04:19:41.249668 master-0 kubenswrapper[18592]: I0308 04:19:41.249617 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffcc095d-2c9f-44dc-9bef-6f6f962e8d70-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ffcc095d-2c9f-44dc-9bef-6f6f962e8d70\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 04:19:41.250961 master-0 kubenswrapper[18592]: I0308 04:19:41.250605 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/ffcc095d-2c9f-44dc-9bef-6f6f962e8d70-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"ffcc095d-2c9f-44dc-9bef-6f6f962e8d70\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 04:19:41.253225 master-0 kubenswrapper[18592]: I0308 04:19:41.253164 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ffcc095d-2c9f-44dc-9bef-6f6f962e8d70-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ffcc095d-2c9f-44dc-9bef-6f6f962e8d70\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 04:19:41.287129 master-0 kubenswrapper[18592]: I0308 04:19:41.287002 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 04:19:41.298413 master-0 kubenswrapper[18592]: I0308 04:19:41.298368 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kb6qb\" (UniqueName: \"kubernetes.io/projected/ffcc095d-2c9f-44dc-9bef-6f6f962e8d70-kube-api-access-kb6qb\") pod \"nova-cell1-novncproxy-0\" (UID: \"ffcc095d-2c9f-44dc-9bef-6f6f962e8d70\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 04:19:41.335156 master-0 kubenswrapper[18592]: I0308 04:19:41.335089 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 04:19:41.342875 master-0 kubenswrapper[18592]: I0308 04:19:41.342853 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 08 04:19:41.359696 master-0 kubenswrapper[18592]: I0308 04:19:41.359642 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 08 04:19:41.371655 master-0 kubenswrapper[18592]: I0308 04:19:41.371361 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 04:19:41.377968 master-0 kubenswrapper[18592]: I0308 04:19:41.377560 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 08 04:19:41.377968 master-0 kubenswrapper[18592]: I0308 04:19:41.377813 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 08 04:19:41.382479 master-0 kubenswrapper[18592]: I0308 04:19:41.382423 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 04:19:41.449527 master-0 kubenswrapper[18592]: I0308 04:19:41.449475 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fg8hq\" (UniqueName: \"kubernetes.io/projected/351657d7-7d32-440d-94a9-46bfb27871a4-kube-api-access-fg8hq\") pod \"nova-metadata-0\" (UID: \"351657d7-7d32-440d-94a9-46bfb27871a4\") " pod="openstack/nova-metadata-0" Mar 08 04:19:41.449798 master-0 kubenswrapper[18592]: I0308 04:19:41.449779 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/351657d7-7d32-440d-94a9-46bfb27871a4-config-data\") pod \"nova-metadata-0\" (UID: \"351657d7-7d32-440d-94a9-46bfb27871a4\") " pod="openstack/nova-metadata-0" Mar 08 04:19:41.450018 master-0 kubenswrapper[18592]: I0308 04:19:41.449999 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/351657d7-7d32-440d-94a9-46bfb27871a4-logs\") pod \"nova-metadata-0\" (UID: \"351657d7-7d32-440d-94a9-46bfb27871a4\") " pod="openstack/nova-metadata-0" Mar 08 04:19:41.450157 master-0 kubenswrapper[18592]: I0308 04:19:41.450138 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/351657d7-7d32-440d-94a9-46bfb27871a4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"351657d7-7d32-440d-94a9-46bfb27871a4\") " pod="openstack/nova-metadata-0" Mar 08 04:19:41.450355 master-0 kubenswrapper[18592]: I0308 04:19:41.450336 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/351657d7-7d32-440d-94a9-46bfb27871a4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"351657d7-7d32-440d-94a9-46bfb27871a4\") " pod="openstack/nova-metadata-0" Mar 08 04:19:41.551965 master-0 kubenswrapper[18592]: I0308 04:19:41.551919 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/351657d7-7d32-440d-94a9-46bfb27871a4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"351657d7-7d32-440d-94a9-46bfb27871a4\") " pod="openstack/nova-metadata-0" Mar 08 04:19:41.552201 master-0 kubenswrapper[18592]: I0308 04:19:41.551981 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fg8hq\" (UniqueName: \"kubernetes.io/projected/351657d7-7d32-440d-94a9-46bfb27871a4-kube-api-access-fg8hq\") pod \"nova-metadata-0\" (UID: \"351657d7-7d32-440d-94a9-46bfb27871a4\") " pod="openstack/nova-metadata-0" Mar 08 04:19:41.552201 master-0 kubenswrapper[18592]: I0308 04:19:41.552002 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/351657d7-7d32-440d-94a9-46bfb27871a4-config-data\") pod \"nova-metadata-0\" (UID: \"351657d7-7d32-440d-94a9-46bfb27871a4\") " pod="openstack/nova-metadata-0" Mar 08 04:19:41.552303 master-0 kubenswrapper[18592]: I0308 04:19:41.552274 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/351657d7-7d32-440d-94a9-46bfb27871a4-logs\") pod \"nova-metadata-0\" (UID: \"351657d7-7d32-440d-94a9-46bfb27871a4\") " pod="openstack/nova-metadata-0" Mar 08 04:19:41.552359 master-0 kubenswrapper[18592]: I0308 04:19:41.552332 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/351657d7-7d32-440d-94a9-46bfb27871a4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"351657d7-7d32-440d-94a9-46bfb27871a4\") " pod="openstack/nova-metadata-0" Mar 08 04:19:41.553169 master-0 kubenswrapper[18592]: I0308 04:19:41.553087 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/351657d7-7d32-440d-94a9-46bfb27871a4-logs\") pod \"nova-metadata-0\" (UID: \"351657d7-7d32-440d-94a9-46bfb27871a4\") " pod="openstack/nova-metadata-0" Mar 08 04:19:41.556217 master-0 kubenswrapper[18592]: I0308 04:19:41.556186 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/351657d7-7d32-440d-94a9-46bfb27871a4-config-data\") pod \"nova-metadata-0\" (UID: \"351657d7-7d32-440d-94a9-46bfb27871a4\") " pod="openstack/nova-metadata-0" Mar 08 04:19:41.557246 master-0 kubenswrapper[18592]: I0308 04:19:41.557212 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/351657d7-7d32-440d-94a9-46bfb27871a4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"351657d7-7d32-440d-94a9-46bfb27871a4\") " pod="openstack/nova-metadata-0" Mar 08 04:19:41.567682 master-0 kubenswrapper[18592]: I0308 04:19:41.567626 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/351657d7-7d32-440d-94a9-46bfb27871a4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"351657d7-7d32-440d-94a9-46bfb27871a4\") " 
pod="openstack/nova-metadata-0" Mar 08 04:19:41.579717 master-0 kubenswrapper[18592]: I0308 04:19:41.578876 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fg8hq\" (UniqueName: \"kubernetes.io/projected/351657d7-7d32-440d-94a9-46bfb27871a4-kube-api-access-fg8hq\") pod \"nova-metadata-0\" (UID: \"351657d7-7d32-440d-94a9-46bfb27871a4\") " pod="openstack/nova-metadata-0" Mar 08 04:19:41.801834 master-0 kubenswrapper[18592]: I0308 04:19:41.801744 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 04:19:41.827087 master-0 kubenswrapper[18592]: W0308 04:19:41.827015 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podffcc095d_2c9f_44dc_9bef_6f6f962e8d70.slice/crio-20b11381831ea588e837eee29bf8115c7fec6572373d2d7ca65fd9b971be0528 WatchSource:0}: Error finding container 20b11381831ea588e837eee29bf8115c7fec6572373d2d7ca65fd9b971be0528: Status 404 returned error can't find the container with id 20b11381831ea588e837eee29bf8115c7fec6572373d2d7ca65fd9b971be0528 Mar 08 04:19:41.828762 master-0 kubenswrapper[18592]: I0308 04:19:41.828711 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 08 04:19:41.934883 master-0 kubenswrapper[18592]: I0308 04:19:41.934794 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"ffcc095d-2c9f-44dc-9bef-6f6f962e8d70","Type":"ContainerStarted","Data":"20b11381831ea588e837eee29bf8115c7fec6572373d2d7ca65fd9b971be0528"} Mar 08 04:19:42.160586 master-0 kubenswrapper[18592]: I0308 04:19:42.160506 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3987d444-6d52-4bec-bcac-e9f562019bbf" path="/var/lib/kubelet/pods/3987d444-6d52-4bec-bcac-e9f562019bbf/volumes" Mar 08 04:19:42.161396 master-0 kubenswrapper[18592]: I0308 04:19:42.161359 18592 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="886aa90f-56dc-4e58-bafe-d3828e4f8781" path="/var/lib/kubelet/pods/886aa90f-56dc-4e58-bafe-d3828e4f8781/volumes" Mar 08 04:19:42.322511 master-0 kubenswrapper[18592]: W0308 04:19:42.322444 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod351657d7_7d32_440d_94a9_46bfb27871a4.slice/crio-5d706115ff8f139ddc612bd9f55faf254fd521eb7961da4b68a614ddd94b8596 WatchSource:0}: Error finding container 5d706115ff8f139ddc612bd9f55faf254fd521eb7961da4b68a614ddd94b8596: Status 404 returned error can't find the container with id 5d706115ff8f139ddc612bd9f55faf254fd521eb7961da4b68a614ddd94b8596 Mar 08 04:19:42.352190 master-0 kubenswrapper[18592]: I0308 04:19:42.352059 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 04:19:42.957353 master-0 kubenswrapper[18592]: I0308 04:19:42.957284 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"ffcc095d-2c9f-44dc-9bef-6f6f962e8d70","Type":"ContainerStarted","Data":"9d19c5bf526022eb8c257e63fa7e7fa4035c7539e7e8ff3763f85580bca5e834"} Mar 08 04:19:42.962842 master-0 kubenswrapper[18592]: I0308 04:19:42.962791 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"351657d7-7d32-440d-94a9-46bfb27871a4","Type":"ContainerStarted","Data":"8d2ab13b5231ff92fa88342510da9fcbc544f34fe981c9e7bbd3206434ee661c"} Mar 08 04:19:42.962948 master-0 kubenswrapper[18592]: I0308 04:19:42.962846 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"351657d7-7d32-440d-94a9-46bfb27871a4","Type":"ContainerStarted","Data":"4a2defff732ad3eb5100d1e4751d09aad32cc280f036126795e5f22a656d064b"} Mar 08 04:19:42.962948 master-0 kubenswrapper[18592]: I0308 04:19:42.962857 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-metadata-0" event={"ID":"351657d7-7d32-440d-94a9-46bfb27871a4","Type":"ContainerStarted","Data":"5d706115ff8f139ddc612bd9f55faf254fd521eb7961da4b68a614ddd94b8596"} Mar 08 04:19:42.998569 master-0 kubenswrapper[18592]: I0308 04:19:42.998422 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.998387059 podStartE2EDuration="2.998387059s" podCreationTimestamp="2026-03-08 04:19:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 04:19:42.978986362 +0000 UTC m=+1595.077740712" watchObservedRunningTime="2026-03-08 04:19:42.998387059 +0000 UTC m=+1595.097141459" Mar 08 04:19:43.019781 master-0 kubenswrapper[18592]: I0308 04:19:43.019688 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.019666307 podStartE2EDuration="2.019666307s" podCreationTimestamp="2026-03-08 04:19:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 04:19:42.999648813 +0000 UTC m=+1595.098403163" watchObservedRunningTime="2026-03-08 04:19:43.019666307 +0000 UTC m=+1595.118420667" Mar 08 04:19:46.344582 master-0 kubenswrapper[18592]: I0308 04:19:46.344547 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 08 04:19:46.802745 master-0 kubenswrapper[18592]: I0308 04:19:46.802663 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 08 04:19:46.802745 master-0 kubenswrapper[18592]: I0308 04:19:46.802741 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 08 04:19:47.465287 master-0 kubenswrapper[18592]: I0308 04:19:47.465220 18592 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openstack/nova-api-0" Mar 08 04:19:47.470488 master-0 kubenswrapper[18592]: I0308 04:19:47.465771 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 08 04:19:47.470488 master-0 kubenswrapper[18592]: I0308 04:19:47.468892 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 08 04:19:47.474350 master-0 kubenswrapper[18592]: I0308 04:19:47.474285 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 08 04:19:48.079212 master-0 kubenswrapper[18592]: I0308 04:19:48.079111 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 08 04:19:48.084634 master-0 kubenswrapper[18592]: I0308 04:19:48.084573 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 08 04:19:48.371495 master-0 kubenswrapper[18592]: I0308 04:19:48.371020 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78466d865f-kkm92"] Mar 08 04:19:48.376411 master-0 kubenswrapper[18592]: I0308 04:19:48.374896 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78466d865f-kkm92" Mar 08 04:19:48.387145 master-0 kubenswrapper[18592]: I0308 04:19:48.387090 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78466d865f-kkm92"] Mar 08 04:19:48.454225 master-0 kubenswrapper[18592]: I0308 04:19:48.453931 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37219418-787e-46f1-b4a9-afd765b3c33b-dns-svc\") pod \"dnsmasq-dns-78466d865f-kkm92\" (UID: \"37219418-787e-46f1-b4a9-afd765b3c33b\") " pod="openstack/dnsmasq-dns-78466d865f-kkm92" Mar 08 04:19:48.455305 master-0 kubenswrapper[18592]: I0308 04:19:48.454141 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/37219418-787e-46f1-b4a9-afd765b3c33b-dns-swift-storage-0\") pod \"dnsmasq-dns-78466d865f-kkm92\" (UID: \"37219418-787e-46f1-b4a9-afd765b3c33b\") " pod="openstack/dnsmasq-dns-78466d865f-kkm92" Mar 08 04:19:48.456450 master-0 kubenswrapper[18592]: I0308 04:19:48.456418 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37219418-787e-46f1-b4a9-afd765b3c33b-config\") pod \"dnsmasq-dns-78466d865f-kkm92\" (UID: \"37219418-787e-46f1-b4a9-afd765b3c33b\") " pod="openstack/dnsmasq-dns-78466d865f-kkm92" Mar 08 04:19:48.459078 master-0 kubenswrapper[18592]: I0308 04:19:48.459043 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjpwf\" (UniqueName: \"kubernetes.io/projected/37219418-787e-46f1-b4a9-afd765b3c33b-kube-api-access-mjpwf\") pod \"dnsmasq-dns-78466d865f-kkm92\" (UID: \"37219418-787e-46f1-b4a9-afd765b3c33b\") " pod="openstack/dnsmasq-dns-78466d865f-kkm92" Mar 08 04:19:48.459143 master-0 kubenswrapper[18592]: I0308 
04:19:48.459119 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/37219418-787e-46f1-b4a9-afd765b3c33b-ovsdbserver-nb\") pod \"dnsmasq-dns-78466d865f-kkm92\" (UID: \"37219418-787e-46f1-b4a9-afd765b3c33b\") " pod="openstack/dnsmasq-dns-78466d865f-kkm92" Mar 08 04:19:48.459388 master-0 kubenswrapper[18592]: I0308 04:19:48.459342 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/37219418-787e-46f1-b4a9-afd765b3c33b-ovsdbserver-sb\") pod \"dnsmasq-dns-78466d865f-kkm92\" (UID: \"37219418-787e-46f1-b4a9-afd765b3c33b\") " pod="openstack/dnsmasq-dns-78466d865f-kkm92" Mar 08 04:19:48.562064 master-0 kubenswrapper[18592]: I0308 04:19:48.561424 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/37219418-787e-46f1-b4a9-afd765b3c33b-ovsdbserver-nb\") pod \"dnsmasq-dns-78466d865f-kkm92\" (UID: \"37219418-787e-46f1-b4a9-afd765b3c33b\") " pod="openstack/dnsmasq-dns-78466d865f-kkm92" Mar 08 04:19:48.562064 master-0 kubenswrapper[18592]: I0308 04:19:48.561543 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/37219418-787e-46f1-b4a9-afd765b3c33b-ovsdbserver-sb\") pod \"dnsmasq-dns-78466d865f-kkm92\" (UID: \"37219418-787e-46f1-b4a9-afd765b3c33b\") " pod="openstack/dnsmasq-dns-78466d865f-kkm92" Mar 08 04:19:48.562064 master-0 kubenswrapper[18592]: I0308 04:19:48.561572 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37219418-787e-46f1-b4a9-afd765b3c33b-dns-svc\") pod \"dnsmasq-dns-78466d865f-kkm92\" (UID: \"37219418-787e-46f1-b4a9-afd765b3c33b\") " pod="openstack/dnsmasq-dns-78466d865f-kkm92" Mar 08 
04:19:48.562064 master-0 kubenswrapper[18592]: I0308 04:19:48.561705 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/37219418-787e-46f1-b4a9-afd765b3c33b-dns-swift-storage-0\") pod \"dnsmasq-dns-78466d865f-kkm92\" (UID: \"37219418-787e-46f1-b4a9-afd765b3c33b\") " pod="openstack/dnsmasq-dns-78466d865f-kkm92"
Mar 08 04:19:48.562064 master-0 kubenswrapper[18592]: I0308 04:19:48.561789 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37219418-787e-46f1-b4a9-afd765b3c33b-config\") pod \"dnsmasq-dns-78466d865f-kkm92\" (UID: \"37219418-787e-46f1-b4a9-afd765b3c33b\") " pod="openstack/dnsmasq-dns-78466d865f-kkm92"
Mar 08 04:19:48.562064 master-0 kubenswrapper[18592]: I0308 04:19:48.561833 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjpwf\" (UniqueName: \"kubernetes.io/projected/37219418-787e-46f1-b4a9-afd765b3c33b-kube-api-access-mjpwf\") pod \"dnsmasq-dns-78466d865f-kkm92\" (UID: \"37219418-787e-46f1-b4a9-afd765b3c33b\") " pod="openstack/dnsmasq-dns-78466d865f-kkm92"
Mar 08 04:19:48.562877 master-0 kubenswrapper[18592]: I0308 04:19:48.562527 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/37219418-787e-46f1-b4a9-afd765b3c33b-ovsdbserver-nb\") pod \"dnsmasq-dns-78466d865f-kkm92\" (UID: \"37219418-787e-46f1-b4a9-afd765b3c33b\") " pod="openstack/dnsmasq-dns-78466d865f-kkm92"
Mar 08 04:19:48.563930 master-0 kubenswrapper[18592]: I0308 04:19:48.563010 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/37219418-787e-46f1-b4a9-afd765b3c33b-ovsdbserver-sb\") pod \"dnsmasq-dns-78466d865f-kkm92\" (UID: \"37219418-787e-46f1-b4a9-afd765b3c33b\") " pod="openstack/dnsmasq-dns-78466d865f-kkm92"
Mar 08 04:19:48.563930 master-0 kubenswrapper[18592]: I0308 04:19:48.563147 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/37219418-787e-46f1-b4a9-afd765b3c33b-dns-swift-storage-0\") pod \"dnsmasq-dns-78466d865f-kkm92\" (UID: \"37219418-787e-46f1-b4a9-afd765b3c33b\") " pod="openstack/dnsmasq-dns-78466d865f-kkm92"
Mar 08 04:19:48.563930 master-0 kubenswrapper[18592]: I0308 04:19:48.563605 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37219418-787e-46f1-b4a9-afd765b3c33b-dns-svc\") pod \"dnsmasq-dns-78466d865f-kkm92\" (UID: \"37219418-787e-46f1-b4a9-afd765b3c33b\") " pod="openstack/dnsmasq-dns-78466d865f-kkm92"
Mar 08 04:19:48.563930 master-0 kubenswrapper[18592]: I0308 04:19:48.563882 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37219418-787e-46f1-b4a9-afd765b3c33b-config\") pod \"dnsmasq-dns-78466d865f-kkm92\" (UID: \"37219418-787e-46f1-b4a9-afd765b3c33b\") " pod="openstack/dnsmasq-dns-78466d865f-kkm92"
Mar 08 04:19:48.579291 master-0 kubenswrapper[18592]: I0308 04:19:48.579243 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjpwf\" (UniqueName: \"kubernetes.io/projected/37219418-787e-46f1-b4a9-afd765b3c33b-kube-api-access-mjpwf\") pod \"dnsmasq-dns-78466d865f-kkm92\" (UID: \"37219418-787e-46f1-b4a9-afd765b3c33b\") " pod="openstack/dnsmasq-dns-78466d865f-kkm92"
Mar 08 04:19:48.752171 master-0 kubenswrapper[18592]: I0308 04:19:48.752121 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78466d865f-kkm92"
Mar 08 04:19:49.263704 master-0 kubenswrapper[18592]: I0308 04:19:49.262691 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78466d865f-kkm92"]
Mar 08 04:19:50.102368 master-0 kubenswrapper[18592]: I0308 04:19:50.102305 18592 generic.go:334] "Generic (PLEG): container finished" podID="37219418-787e-46f1-b4a9-afd765b3c33b" containerID="ae74cfdfa724389f3feb57b1e610054861503018d194d36e6be421c87244a76c" exitCode=0
Mar 08 04:19:50.102368 master-0 kubenswrapper[18592]: I0308 04:19:50.102345 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78466d865f-kkm92" event={"ID":"37219418-787e-46f1-b4a9-afd765b3c33b","Type":"ContainerDied","Data":"ae74cfdfa724389f3feb57b1e610054861503018d194d36e6be421c87244a76c"}
Mar 08 04:19:50.102368 master-0 kubenswrapper[18592]: I0308 04:19:50.102381 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78466d865f-kkm92" event={"ID":"37219418-787e-46f1-b4a9-afd765b3c33b","Type":"ContainerStarted","Data":"937f95566f5408369d74a144e6b6556dfc1e394e0292971b67ac4e2cf3531295"}
Mar 08 04:19:50.717859 master-0 kubenswrapper[18592]: I0308 04:19:50.715636 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 08 04:19:51.137324 master-0 kubenswrapper[18592]: I0308 04:19:51.137188 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78466d865f-kkm92" event={"ID":"37219418-787e-46f1-b4a9-afd765b3c33b","Type":"ContainerStarted","Data":"d4384265432a84ba5335cfbfdba55eb5761a319c9ba278134299fd1d3b18b5d5"}
Mar 08 04:19:51.137324 master-0 kubenswrapper[18592]: I0308 04:19:51.137250 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="eaf095f0-60ad-43de-a29c-6cb21bf470ad" containerName="nova-api-log" containerID="cri-o://580f07b20dc274b7ea10da5721b94468bab12dc8f6ca25465ab0c69b63f19c72" gracePeriod=30
Mar 08 04:19:51.137971 master-0 kubenswrapper[18592]: I0308 04:19:51.137390 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="eaf095f0-60ad-43de-a29c-6cb21bf470ad" containerName="nova-api-api" containerID="cri-o://dee7832c5c7cec471f88b7543bbff442c90b1920f3bd0e442c98e3e21306bf13" gracePeriod=30
Mar 08 04:19:51.185865 master-0 kubenswrapper[18592]: I0308 04:19:51.185766 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-78466d865f-kkm92" podStartSLOduration=3.185747196 podStartE2EDuration="3.185747196s" podCreationTimestamp="2026-03-08 04:19:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 04:19:51.175467337 +0000 UTC m=+1603.274221687" watchObservedRunningTime="2026-03-08 04:19:51.185747196 +0000 UTC m=+1603.284501546"
Mar 08 04:19:51.343776 master-0 kubenswrapper[18592]: I0308 04:19:51.343728 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0"
Mar 08 04:19:51.367172 master-0 kubenswrapper[18592]: I0308 04:19:51.367103 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0"
Mar 08 04:19:51.802596 master-0 kubenswrapper[18592]: I0308 04:19:51.802528 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Mar 08 04:19:51.802596 master-0 kubenswrapper[18592]: I0308 04:19:51.802595 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Mar 08 04:19:52.166904 master-0 kubenswrapper[18592]: I0308 04:19:52.166224 18592 generic.go:334] "Generic (PLEG): container finished" podID="eaf095f0-60ad-43de-a29c-6cb21bf470ad" containerID="580f07b20dc274b7ea10da5721b94468bab12dc8f6ca25465ab0c69b63f19c72" exitCode=143
Mar 08 04:19:52.166904 master-0 kubenswrapper[18592]: I0308 04:19:52.166464 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"eaf095f0-60ad-43de-a29c-6cb21bf470ad","Type":"ContainerDied","Data":"580f07b20dc274b7ea10da5721b94468bab12dc8f6ca25465ab0c69b63f19c72"}
Mar 08 04:19:52.166904 master-0 kubenswrapper[18592]: I0308 04:19:52.166536 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-78466d865f-kkm92"
Mar 08 04:19:52.194840 master-0 kubenswrapper[18592]: I0308 04:19:52.194410 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0"
Mar 08 04:19:52.407846 master-0 kubenswrapper[18592]: I0308 04:19:52.400199 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-sr2h2"]
Mar 08 04:19:52.407846 master-0 kubenswrapper[18592]: I0308 04:19:52.402154 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-sr2h2"
Mar 08 04:19:52.431911 master-0 kubenswrapper[18592]: I0308 04:19:52.427472 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts"
Mar 08 04:19:52.431911 master-0 kubenswrapper[18592]: I0308 04:19:52.428062 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data"
Mar 08 04:19:52.457972 master-0 kubenswrapper[18592]: I0308 04:19:52.452883 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-host-discover-fvmbl"]
Mar 08 04:19:52.457972 master-0 kubenswrapper[18592]: I0308 04:19:52.455984 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-host-discover-fvmbl"
Mar 08 04:19:52.465846 master-0 kubenswrapper[18592]: I0308 04:19:52.463720 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-sr2h2"]
Mar 08 04:19:52.475333 master-0 kubenswrapper[18592]: I0308 04:19:52.475274 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55182ece-5dae-4bb2-9d2c-cf0fbc3bb55e-combined-ca-bundle\") pod \"nova-cell1-host-discover-fvmbl\" (UID: \"55182ece-5dae-4bb2-9d2c-cf0fbc3bb55e\") " pod="openstack/nova-cell1-host-discover-fvmbl"
Mar 08 04:19:52.475333 master-0 kubenswrapper[18592]: I0308 04:19:52.475334 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c462f4de-c8cf-4b54-9936-0fb278b41547-config-data\") pod \"nova-cell1-cell-mapping-sr2h2\" (UID: \"c462f4de-c8cf-4b54-9936-0fb278b41547\") " pod="openstack/nova-cell1-cell-mapping-sr2h2"
Mar 08 04:19:52.475574 master-0 kubenswrapper[18592]: I0308 04:19:52.475382 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56p5b\" (UniqueName: \"kubernetes.io/projected/c462f4de-c8cf-4b54-9936-0fb278b41547-kube-api-access-56p5b\") pod \"nova-cell1-cell-mapping-sr2h2\" (UID: \"c462f4de-c8cf-4b54-9936-0fb278b41547\") " pod="openstack/nova-cell1-cell-mapping-sr2h2"
Mar 08 04:19:52.475574 master-0 kubenswrapper[18592]: I0308 04:19:52.475436 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55182ece-5dae-4bb2-9d2c-cf0fbc3bb55e-config-data\") pod \"nova-cell1-host-discover-fvmbl\" (UID: \"55182ece-5dae-4bb2-9d2c-cf0fbc3bb55e\") " pod="openstack/nova-cell1-host-discover-fvmbl"
Mar 08 04:19:52.475574 master-0 kubenswrapper[18592]: I0308 04:19:52.475453 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c462f4de-c8cf-4b54-9936-0fb278b41547-scripts\") pod \"nova-cell1-cell-mapping-sr2h2\" (UID: \"c462f4de-c8cf-4b54-9936-0fb278b41547\") " pod="openstack/nova-cell1-cell-mapping-sr2h2"
Mar 08 04:19:52.475574 master-0 kubenswrapper[18592]: I0308 04:19:52.475493 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c462f4de-c8cf-4b54-9936-0fb278b41547-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-sr2h2\" (UID: \"c462f4de-c8cf-4b54-9936-0fb278b41547\") " pod="openstack/nova-cell1-cell-mapping-sr2h2"
Mar 08 04:19:52.475574 master-0 kubenswrapper[18592]: I0308 04:19:52.475571 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpmjg\" (UniqueName: \"kubernetes.io/projected/55182ece-5dae-4bb2-9d2c-cf0fbc3bb55e-kube-api-access-xpmjg\") pod \"nova-cell1-host-discover-fvmbl\" (UID: \"55182ece-5dae-4bb2-9d2c-cf0fbc3bb55e\") " pod="openstack/nova-cell1-host-discover-fvmbl"
Mar 08 04:19:52.475715 master-0 kubenswrapper[18592]: I0308 04:19:52.475613 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55182ece-5dae-4bb2-9d2c-cf0fbc3bb55e-scripts\") pod \"nova-cell1-host-discover-fvmbl\" (UID: \"55182ece-5dae-4bb2-9d2c-cf0fbc3bb55e\") " pod="openstack/nova-cell1-host-discover-fvmbl"
Mar 08 04:19:52.479910 master-0 kubenswrapper[18592]: I0308 04:19:52.479801 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-host-discover-fvmbl"]
Mar 08 04:19:52.580920 master-0 kubenswrapper[18592]: I0308 04:19:52.580793 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpmjg\" (UniqueName: \"kubernetes.io/projected/55182ece-5dae-4bb2-9d2c-cf0fbc3bb55e-kube-api-access-xpmjg\") pod \"nova-cell1-host-discover-fvmbl\" (UID: \"55182ece-5dae-4bb2-9d2c-cf0fbc3bb55e\") " pod="openstack/nova-cell1-host-discover-fvmbl"
Mar 08 04:19:52.581189 master-0 kubenswrapper[18592]: I0308 04:19:52.580945 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55182ece-5dae-4bb2-9d2c-cf0fbc3bb55e-scripts\") pod \"nova-cell1-host-discover-fvmbl\" (UID: \"55182ece-5dae-4bb2-9d2c-cf0fbc3bb55e\") " pod="openstack/nova-cell1-host-discover-fvmbl"
Mar 08 04:19:52.581189 master-0 kubenswrapper[18592]: I0308 04:19:52.581037 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55182ece-5dae-4bb2-9d2c-cf0fbc3bb55e-combined-ca-bundle\") pod \"nova-cell1-host-discover-fvmbl\" (UID: \"55182ece-5dae-4bb2-9d2c-cf0fbc3bb55e\") " pod="openstack/nova-cell1-host-discover-fvmbl"
Mar 08 04:19:52.581344 master-0 kubenswrapper[18592]: I0308 04:19:52.581183 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c462f4de-c8cf-4b54-9936-0fb278b41547-config-data\") pod \"nova-cell1-cell-mapping-sr2h2\" (UID: \"c462f4de-c8cf-4b54-9936-0fb278b41547\") " pod="openstack/nova-cell1-cell-mapping-sr2h2"
Mar 08 04:19:52.581344 master-0 kubenswrapper[18592]: I0308 04:19:52.581296 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56p5b\" (UniqueName: \"kubernetes.io/projected/c462f4de-c8cf-4b54-9936-0fb278b41547-kube-api-access-56p5b\") pod \"nova-cell1-cell-mapping-sr2h2\" (UID: \"c462f4de-c8cf-4b54-9936-0fb278b41547\") " pod="openstack/nova-cell1-cell-mapping-sr2h2"
Mar 08 04:19:52.581474 master-0 kubenswrapper[18592]: I0308 04:19:52.581393 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55182ece-5dae-4bb2-9d2c-cf0fbc3bb55e-config-data\") pod \"nova-cell1-host-discover-fvmbl\" (UID: \"55182ece-5dae-4bb2-9d2c-cf0fbc3bb55e\") " pod="openstack/nova-cell1-host-discover-fvmbl"
Mar 08 04:19:52.581474 master-0 kubenswrapper[18592]: I0308 04:19:52.581415 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c462f4de-c8cf-4b54-9936-0fb278b41547-scripts\") pod \"nova-cell1-cell-mapping-sr2h2\" (UID: \"c462f4de-c8cf-4b54-9936-0fb278b41547\") " pod="openstack/nova-cell1-cell-mapping-sr2h2"
Mar 08 04:19:52.581780 master-0 kubenswrapper[18592]: I0308 04:19:52.581615 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c462f4de-c8cf-4b54-9936-0fb278b41547-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-sr2h2\" (UID: \"c462f4de-c8cf-4b54-9936-0fb278b41547\") " pod="openstack/nova-cell1-cell-mapping-sr2h2"
Mar 08 04:19:52.585274 master-0 kubenswrapper[18592]: I0308 04:19:52.585234 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55182ece-5dae-4bb2-9d2c-cf0fbc3bb55e-config-data\") pod \"nova-cell1-host-discover-fvmbl\" (UID: \"55182ece-5dae-4bb2-9d2c-cf0fbc3bb55e\") " pod="openstack/nova-cell1-host-discover-fvmbl"
Mar 08 04:19:52.586292 master-0 kubenswrapper[18592]: I0308 04:19:52.586242 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55182ece-5dae-4bb2-9d2c-cf0fbc3bb55e-scripts\") pod \"nova-cell1-host-discover-fvmbl\" (UID: \"55182ece-5dae-4bb2-9d2c-cf0fbc3bb55e\") " pod="openstack/nova-cell1-host-discover-fvmbl"
Mar 08 04:19:52.587417 master-0 kubenswrapper[18592]: I0308 04:19:52.587378 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c462f4de-c8cf-4b54-9936-0fb278b41547-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-sr2h2\" (UID: \"c462f4de-c8cf-4b54-9936-0fb278b41547\") " pod="openstack/nova-cell1-cell-mapping-sr2h2"
Mar 08 04:19:52.593701 master-0 kubenswrapper[18592]: I0308 04:19:52.593655 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c462f4de-c8cf-4b54-9936-0fb278b41547-config-data\") pod \"nova-cell1-cell-mapping-sr2h2\" (UID: \"c462f4de-c8cf-4b54-9936-0fb278b41547\") " pod="openstack/nova-cell1-cell-mapping-sr2h2"
Mar 08 04:19:52.595779 master-0 kubenswrapper[18592]: I0308 04:19:52.595741 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55182ece-5dae-4bb2-9d2c-cf0fbc3bb55e-combined-ca-bundle\") pod \"nova-cell1-host-discover-fvmbl\" (UID: \"55182ece-5dae-4bb2-9d2c-cf0fbc3bb55e\") " pod="openstack/nova-cell1-host-discover-fvmbl"
Mar 08 04:19:52.596048 master-0 kubenswrapper[18592]: I0308 04:19:52.596014 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56p5b\" (UniqueName: \"kubernetes.io/projected/c462f4de-c8cf-4b54-9936-0fb278b41547-kube-api-access-56p5b\") pod \"nova-cell1-cell-mapping-sr2h2\" (UID: \"c462f4de-c8cf-4b54-9936-0fb278b41547\") " pod="openstack/nova-cell1-cell-mapping-sr2h2"
Mar 08 04:19:52.599027 master-0 kubenswrapper[18592]: I0308 04:19:52.598967 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpmjg\" (UniqueName: \"kubernetes.io/projected/55182ece-5dae-4bb2-9d2c-cf0fbc3bb55e-kube-api-access-xpmjg\") pod \"nova-cell1-host-discover-fvmbl\" (UID: \"55182ece-5dae-4bb2-9d2c-cf0fbc3bb55e\") " pod="openstack/nova-cell1-host-discover-fvmbl"
Mar 08 04:19:52.600208 master-0 kubenswrapper[18592]: I0308 04:19:52.600093 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c462f4de-c8cf-4b54-9936-0fb278b41547-scripts\") pod \"nova-cell1-cell-mapping-sr2h2\" (UID: \"c462f4de-c8cf-4b54-9936-0fb278b41547\") " pod="openstack/nova-cell1-cell-mapping-sr2h2"
Mar 08 04:19:52.739855 master-0 kubenswrapper[18592]: I0308 04:19:52.738939 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-sr2h2"
Mar 08 04:19:52.774675 master-0 kubenswrapper[18592]: I0308 04:19:52.773925 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-host-discover-fvmbl"
Mar 08 04:19:52.823860 master-0 kubenswrapper[18592]: I0308 04:19:52.822049 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="351657d7-7d32-440d-94a9-46bfb27871a4" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.128.1.10:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 08 04:19:52.823860 master-0 kubenswrapper[18592]: I0308 04:19:52.822369 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="351657d7-7d32-440d-94a9-46bfb27871a4" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.128.1.10:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 08 04:19:53.322400 master-0 kubenswrapper[18592]: I0308 04:19:53.318002 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-sr2h2"]
Mar 08 04:19:53.325675 master-0 kubenswrapper[18592]: I0308 04:19:53.325605 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-host-discover-fvmbl"]
Mar 08 04:19:53.325796 master-0 kubenswrapper[18592]: W0308 04:19:53.325713 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod55182ece_5dae_4bb2_9d2c_cf0fbc3bb55e.slice/crio-0925b77484ab99d48b4ec8476597508088651bee7a462a9c16edeef8fbc7fcba WatchSource:0}: Error finding container 0925b77484ab99d48b4ec8476597508088651bee7a462a9c16edeef8fbc7fcba: Status 404 returned error can't find the container with id 0925b77484ab99d48b4ec8476597508088651bee7a462a9c16edeef8fbc7fcba
Mar 08 04:19:54.212467 master-0 kubenswrapper[18592]: I0308 04:19:54.212404 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-sr2h2" event={"ID":"c462f4de-c8cf-4b54-9936-0fb278b41547","Type":"ContainerStarted","Data":"9aeac887db28754d3c3881529c598d3ae3e5127023ee8dd5e1bba09f0cf10681"}
Mar 08 04:19:54.212467 master-0 kubenswrapper[18592]: I0308 04:19:54.212469 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-sr2h2" event={"ID":"c462f4de-c8cf-4b54-9936-0fb278b41547","Type":"ContainerStarted","Data":"1cc48b1d11a9b40a147fe582bfddf81a3f454351560aa8f2592e7efdff8a7d50"}
Mar 08 04:19:54.230561 master-0 kubenswrapper[18592]: I0308 04:19:54.230498 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-host-discover-fvmbl" event={"ID":"55182ece-5dae-4bb2-9d2c-cf0fbc3bb55e","Type":"ContainerStarted","Data":"f7fcc5facccab763e3578828ba33e72d72a24f268851ad3dfc9473fa31086bd2"}
Mar 08 04:19:54.230561 master-0 kubenswrapper[18592]: I0308 04:19:54.230566 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-host-discover-fvmbl" event={"ID":"55182ece-5dae-4bb2-9d2c-cf0fbc3bb55e","Type":"ContainerStarted","Data":"0925b77484ab99d48b4ec8476597508088651bee7a462a9c16edeef8fbc7fcba"}
Mar 08 04:19:54.249958 master-0 kubenswrapper[18592]: I0308 04:19:54.249785 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-sr2h2" podStartSLOduration=2.249762426 podStartE2EDuration="2.249762426s" podCreationTimestamp="2026-03-08 04:19:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 04:19:54.239147667 +0000 UTC m=+1606.337902037" watchObservedRunningTime="2026-03-08 04:19:54.249762426 +0000 UTC m=+1606.348516776"
Mar 08 04:19:54.264071 master-0 kubenswrapper[18592]: I0308 04:19:54.263973 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-host-discover-fvmbl" podStartSLOduration=2.263954401 podStartE2EDuration="2.263954401s" podCreationTimestamp="2026-03-08 04:19:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 04:19:54.258653567 +0000 UTC m=+1606.357407917" watchObservedRunningTime="2026-03-08 04:19:54.263954401 +0000 UTC m=+1606.362708741"
Mar 08 04:19:54.881223 master-0 kubenswrapper[18592]: I0308 04:19:54.881169 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 08 04:19:54.957093 master-0 kubenswrapper[18592]: I0308 04:19:54.954749 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whp7d\" (UniqueName: \"kubernetes.io/projected/eaf095f0-60ad-43de-a29c-6cb21bf470ad-kube-api-access-whp7d\") pod \"eaf095f0-60ad-43de-a29c-6cb21bf470ad\" (UID: \"eaf095f0-60ad-43de-a29c-6cb21bf470ad\") "
Mar 08 04:19:54.957093 master-0 kubenswrapper[18592]: I0308 04:19:54.954906 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eaf095f0-60ad-43de-a29c-6cb21bf470ad-logs\") pod \"eaf095f0-60ad-43de-a29c-6cb21bf470ad\" (UID: \"eaf095f0-60ad-43de-a29c-6cb21bf470ad\") "
Mar 08 04:19:54.957093 master-0 kubenswrapper[18592]: I0308 04:19:54.954953 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eaf095f0-60ad-43de-a29c-6cb21bf470ad-config-data\") pod \"eaf095f0-60ad-43de-a29c-6cb21bf470ad\" (UID: \"eaf095f0-60ad-43de-a29c-6cb21bf470ad\") "
Mar 08 04:19:54.957093 master-0 kubenswrapper[18592]: I0308 04:19:54.955044 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaf095f0-60ad-43de-a29c-6cb21bf470ad-combined-ca-bundle\") pod \"eaf095f0-60ad-43de-a29c-6cb21bf470ad\" (UID: \"eaf095f0-60ad-43de-a29c-6cb21bf470ad\") "
Mar 08 04:19:54.958401 master-0 kubenswrapper[18592]: I0308 04:19:54.958048 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eaf095f0-60ad-43de-a29c-6cb21bf470ad-logs" (OuterVolumeSpecName: "logs") pod "eaf095f0-60ad-43de-a29c-6cb21bf470ad" (UID: "eaf095f0-60ad-43de-a29c-6cb21bf470ad"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 04:19:54.960307 master-0 kubenswrapper[18592]: I0308 04:19:54.960263 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eaf095f0-60ad-43de-a29c-6cb21bf470ad-kube-api-access-whp7d" (OuterVolumeSpecName: "kube-api-access-whp7d") pod "eaf095f0-60ad-43de-a29c-6cb21bf470ad" (UID: "eaf095f0-60ad-43de-a29c-6cb21bf470ad"). InnerVolumeSpecName "kube-api-access-whp7d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 04:19:55.008079 master-0 kubenswrapper[18592]: I0308 04:19:55.007379 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eaf095f0-60ad-43de-a29c-6cb21bf470ad-config-data" (OuterVolumeSpecName: "config-data") pod "eaf095f0-60ad-43de-a29c-6cb21bf470ad" (UID: "eaf095f0-60ad-43de-a29c-6cb21bf470ad"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 04:19:55.011473 master-0 kubenswrapper[18592]: I0308 04:19:55.011370 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eaf095f0-60ad-43de-a29c-6cb21bf470ad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eaf095f0-60ad-43de-a29c-6cb21bf470ad" (UID: "eaf095f0-60ad-43de-a29c-6cb21bf470ad"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 04:19:55.062294 master-0 kubenswrapper[18592]: I0308 04:19:55.062221 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whp7d\" (UniqueName: \"kubernetes.io/projected/eaf095f0-60ad-43de-a29c-6cb21bf470ad-kube-api-access-whp7d\") on node \"master-0\" DevicePath \"\""
Mar 08 04:19:55.062294 master-0 kubenswrapper[18592]: I0308 04:19:55.062271 18592 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eaf095f0-60ad-43de-a29c-6cb21bf470ad-logs\") on node \"master-0\" DevicePath \"\""
Mar 08 04:19:55.062294 master-0 kubenswrapper[18592]: I0308 04:19:55.062284 18592 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eaf095f0-60ad-43de-a29c-6cb21bf470ad-config-data\") on node \"master-0\" DevicePath \"\""
Mar 08 04:19:55.062294 master-0 kubenswrapper[18592]: I0308 04:19:55.062293 18592 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaf095f0-60ad-43de-a29c-6cb21bf470ad-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 08 04:19:55.263853 master-0 kubenswrapper[18592]: I0308 04:19:55.258612 18592 generic.go:334] "Generic (PLEG): container finished" podID="eaf095f0-60ad-43de-a29c-6cb21bf470ad" containerID="dee7832c5c7cec471f88b7543bbff442c90b1920f3bd0e442c98e3e21306bf13" exitCode=0
Mar 08 04:19:55.263853 master-0 kubenswrapper[18592]: I0308 04:19:55.258690 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 08 04:19:55.263853 master-0 kubenswrapper[18592]: I0308 04:19:55.258752 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"eaf095f0-60ad-43de-a29c-6cb21bf470ad","Type":"ContainerDied","Data":"dee7832c5c7cec471f88b7543bbff442c90b1920f3bd0e442c98e3e21306bf13"}
Mar 08 04:19:55.263853 master-0 kubenswrapper[18592]: I0308 04:19:55.258790 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"eaf095f0-60ad-43de-a29c-6cb21bf470ad","Type":"ContainerDied","Data":"dd95339767acbe3df02c787593e43d5406b148c7264064cbb40fab0af998be14"}
Mar 08 04:19:55.263853 master-0 kubenswrapper[18592]: I0308 04:19:55.258809 18592 scope.go:117] "RemoveContainer" containerID="dee7832c5c7cec471f88b7543bbff442c90b1920f3bd0e442c98e3e21306bf13"
Mar 08 04:19:55.307844 master-0 kubenswrapper[18592]: I0308 04:19:55.304579 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 08 04:19:55.311890 master-0 kubenswrapper[18592]: I0308 04:19:55.311857 18592 scope.go:117] "RemoveContainer" containerID="580f07b20dc274b7ea10da5721b94468bab12dc8f6ca25465ab0c69b63f19c72"
Mar 08 04:19:55.319001 master-0 kubenswrapper[18592]: I0308 04:19:55.318333 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Mar 08 04:19:55.350638 master-0 kubenswrapper[18592]: I0308 04:19:55.350583 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 08 04:19:55.354842 master-0 kubenswrapper[18592]: E0308 04:19:55.351163 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaf095f0-60ad-43de-a29c-6cb21bf470ad" containerName="nova-api-api"
Mar 08 04:19:55.354842 master-0 kubenswrapper[18592]: I0308 04:19:55.351184 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaf095f0-60ad-43de-a29c-6cb21bf470ad" containerName="nova-api-api"
Mar 08 04:19:55.354842 master-0 kubenswrapper[18592]: E0308 04:19:55.351231 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaf095f0-60ad-43de-a29c-6cb21bf470ad" containerName="nova-api-log"
Mar 08 04:19:55.354842 master-0 kubenswrapper[18592]: I0308 04:19:55.351237 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaf095f0-60ad-43de-a29c-6cb21bf470ad" containerName="nova-api-log"
Mar 08 04:19:55.354842 master-0 kubenswrapper[18592]: I0308 04:19:55.351519 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="eaf095f0-60ad-43de-a29c-6cb21bf470ad" containerName="nova-api-api"
Mar 08 04:19:55.354842 master-0 kubenswrapper[18592]: I0308 04:19:55.351571 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="eaf095f0-60ad-43de-a29c-6cb21bf470ad" containerName="nova-api-log"
Mar 08 04:19:55.354842 master-0 kubenswrapper[18592]: I0308 04:19:55.353004 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 08 04:19:55.364843 master-0 kubenswrapper[18592]: I0308 04:19:55.361182 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 08 04:19:55.364843 master-0 kubenswrapper[18592]: I0308 04:19:55.362527 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Mar 08 04:19:55.364843 master-0 kubenswrapper[18592]: I0308 04:19:55.363797 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 08 04:19:55.364843 master-0 kubenswrapper[18592]: I0308 04:19:55.363953 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Mar 08 04:19:55.376953 master-0 kubenswrapper[18592]: I0308 04:19:55.375277 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eaedf508-6629-4535-9b0d-8d1e6aa649c2-public-tls-certs\") pod \"nova-api-0\" (UID: \"eaedf508-6629-4535-9b0d-8d1e6aa649c2\") " pod="openstack/nova-api-0"
Mar 08 04:19:55.376953 master-0 kubenswrapper[18592]: I0308 04:19:55.375335 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eaedf508-6629-4535-9b0d-8d1e6aa649c2-config-data\") pod \"nova-api-0\" (UID: \"eaedf508-6629-4535-9b0d-8d1e6aa649c2\") " pod="openstack/nova-api-0"
Mar 08 04:19:55.376953 master-0 kubenswrapper[18592]: I0308 04:19:55.375364 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eaedf508-6629-4535-9b0d-8d1e6aa649c2-logs\") pod \"nova-api-0\" (UID: \"eaedf508-6629-4535-9b0d-8d1e6aa649c2\") " pod="openstack/nova-api-0"
Mar 08 04:19:55.376953 master-0 kubenswrapper[18592]: I0308 04:19:55.375384 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlrtt\" (UniqueName: \"kubernetes.io/projected/eaedf508-6629-4535-9b0d-8d1e6aa649c2-kube-api-access-zlrtt\") pod \"nova-api-0\" (UID: \"eaedf508-6629-4535-9b0d-8d1e6aa649c2\") " pod="openstack/nova-api-0"
Mar 08 04:19:55.376953 master-0 kubenswrapper[18592]: I0308 04:19:55.375405 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eaedf508-6629-4535-9b0d-8d1e6aa649c2-internal-tls-certs\") pod \"nova-api-0\" (UID: \"eaedf508-6629-4535-9b0d-8d1e6aa649c2\") " pod="openstack/nova-api-0"
Mar 08 04:19:55.376953 master-0 kubenswrapper[18592]: I0308 04:19:55.375488 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaedf508-6629-4535-9b0d-8d1e6aa649c2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"eaedf508-6629-4535-9b0d-8d1e6aa649c2\") " pod="openstack/nova-api-0"
Mar 08 04:19:55.386755 master-0 kubenswrapper[18592]: I0308 04:19:55.386701 18592 scope.go:117] "RemoveContainer" containerID="dee7832c5c7cec471f88b7543bbff442c90b1920f3bd0e442c98e3e21306bf13"
Mar 08 04:19:55.388550 master-0 kubenswrapper[18592]: E0308 04:19:55.388490 18592 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dee7832c5c7cec471f88b7543bbff442c90b1920f3bd0e442c98e3e21306bf13\": container with ID starting with dee7832c5c7cec471f88b7543bbff442c90b1920f3bd0e442c98e3e21306bf13 not found: ID does not exist" containerID="dee7832c5c7cec471f88b7543bbff442c90b1920f3bd0e442c98e3e21306bf13"
Mar 08 04:19:55.388725 master-0 kubenswrapper[18592]: I0308 04:19:55.388564 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dee7832c5c7cec471f88b7543bbff442c90b1920f3bd0e442c98e3e21306bf13"} err="failed to get container status \"dee7832c5c7cec471f88b7543bbff442c90b1920f3bd0e442c98e3e21306bf13\": rpc error: code = NotFound desc = could not find container \"dee7832c5c7cec471f88b7543bbff442c90b1920f3bd0e442c98e3e21306bf13\": container with ID starting with dee7832c5c7cec471f88b7543bbff442c90b1920f3bd0e442c98e3e21306bf13 not found: ID does not exist"
Mar 08 04:19:55.388725 master-0 kubenswrapper[18592]: I0308 04:19:55.388591 18592 scope.go:117] "RemoveContainer" containerID="580f07b20dc274b7ea10da5721b94468bab12dc8f6ca25465ab0c69b63f19c72"
Mar 08 04:19:55.401322 master-0 kubenswrapper[18592]: E0308 04:19:55.401267 18592 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"580f07b20dc274b7ea10da5721b94468bab12dc8f6ca25465ab0c69b63f19c72\": container with ID starting with 580f07b20dc274b7ea10da5721b94468bab12dc8f6ca25465ab0c69b63f19c72 not found: ID does not exist" containerID="580f07b20dc274b7ea10da5721b94468bab12dc8f6ca25465ab0c69b63f19c72"
Mar 08 04:19:55.401322 master-0 kubenswrapper[18592]: I0308 04:19:55.401308 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"580f07b20dc274b7ea10da5721b94468bab12dc8f6ca25465ab0c69b63f19c72"} err="failed to get container status \"580f07b20dc274b7ea10da5721b94468bab12dc8f6ca25465ab0c69b63f19c72\": rpc error: code = NotFound desc = could not find container \"580f07b20dc274b7ea10da5721b94468bab12dc8f6ca25465ab0c69b63f19c72\": container with ID starting with 580f07b20dc274b7ea10da5721b94468bab12dc8f6ca25465ab0c69b63f19c72 not found: ID does not exist"
Mar 08 04:19:55.478660 master-0 kubenswrapper[18592]: I0308 04:19:55.477813 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eaedf508-6629-4535-9b0d-8d1e6aa649c2-public-tls-certs\") pod \"nova-api-0\" (UID: \"eaedf508-6629-4535-9b0d-8d1e6aa649c2\") " pod="openstack/nova-api-0"
Mar 08 04:19:55.478660 master-0 kubenswrapper[18592]: I0308 04:19:55.477874 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eaedf508-6629-4535-9b0d-8d1e6aa649c2-config-data\") pod \"nova-api-0\" (UID: \"eaedf508-6629-4535-9b0d-8d1e6aa649c2\") " pod="openstack/nova-api-0"
Mar 08 04:19:55.478660 master-0 kubenswrapper[18592]: I0308 04:19:55.477922 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eaedf508-6629-4535-9b0d-8d1e6aa649c2-logs\") pod \"nova-api-0\" (UID: \"eaedf508-6629-4535-9b0d-8d1e6aa649c2\") " pod="openstack/nova-api-0"
Mar 08 04:19:55.478660 master-0 kubenswrapper[18592]: I0308 04:19:55.477944 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlrtt\" (UniqueName: \"kubernetes.io/projected/eaedf508-6629-4535-9b0d-8d1e6aa649c2-kube-api-access-zlrtt\") pod \"nova-api-0\" (UID:
\"eaedf508-6629-4535-9b0d-8d1e6aa649c2\") " pod="openstack/nova-api-0" Mar 08 04:19:55.478660 master-0 kubenswrapper[18592]: I0308 04:19:55.478403 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eaedf508-6629-4535-9b0d-8d1e6aa649c2-logs\") pod \"nova-api-0\" (UID: \"eaedf508-6629-4535-9b0d-8d1e6aa649c2\") " pod="openstack/nova-api-0" Mar 08 04:19:55.479207 master-0 kubenswrapper[18592]: I0308 04:19:55.479175 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eaedf508-6629-4535-9b0d-8d1e6aa649c2-internal-tls-certs\") pod \"nova-api-0\" (UID: \"eaedf508-6629-4535-9b0d-8d1e6aa649c2\") " pod="openstack/nova-api-0" Mar 08 04:19:55.479602 master-0 kubenswrapper[18592]: I0308 04:19:55.479574 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaedf508-6629-4535-9b0d-8d1e6aa649c2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"eaedf508-6629-4535-9b0d-8d1e6aa649c2\") " pod="openstack/nova-api-0" Mar 08 04:19:55.484230 master-0 kubenswrapper[18592]: I0308 04:19:55.484165 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eaedf508-6629-4535-9b0d-8d1e6aa649c2-internal-tls-certs\") pod \"nova-api-0\" (UID: \"eaedf508-6629-4535-9b0d-8d1e6aa649c2\") " pod="openstack/nova-api-0" Mar 08 04:19:55.484785 master-0 kubenswrapper[18592]: I0308 04:19:55.484749 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaedf508-6629-4535-9b0d-8d1e6aa649c2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"eaedf508-6629-4535-9b0d-8d1e6aa649c2\") " pod="openstack/nova-api-0" Mar 08 04:19:55.487690 master-0 kubenswrapper[18592]: I0308 04:19:55.487658 18592 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eaedf508-6629-4535-9b0d-8d1e6aa649c2-config-data\") pod \"nova-api-0\" (UID: \"eaedf508-6629-4535-9b0d-8d1e6aa649c2\") " pod="openstack/nova-api-0" Mar 08 04:19:55.495971 master-0 kubenswrapper[18592]: I0308 04:19:55.495908 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eaedf508-6629-4535-9b0d-8d1e6aa649c2-public-tls-certs\") pod \"nova-api-0\" (UID: \"eaedf508-6629-4535-9b0d-8d1e6aa649c2\") " pod="openstack/nova-api-0" Mar 08 04:19:55.504181 master-0 kubenswrapper[18592]: I0308 04:19:55.504152 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlrtt\" (UniqueName: \"kubernetes.io/projected/eaedf508-6629-4535-9b0d-8d1e6aa649c2-kube-api-access-zlrtt\") pod \"nova-api-0\" (UID: \"eaedf508-6629-4535-9b0d-8d1e6aa649c2\") " pod="openstack/nova-api-0" Mar 08 04:19:55.718097 master-0 kubenswrapper[18592]: I0308 04:19:55.718028 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 08 04:19:56.202848 master-0 kubenswrapper[18592]: I0308 04:19:56.198872 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eaf095f0-60ad-43de-a29c-6cb21bf470ad" path="/var/lib/kubelet/pods/eaf095f0-60ad-43de-a29c-6cb21bf470ad/volumes" Mar 08 04:19:56.308290 master-0 kubenswrapper[18592]: I0308 04:19:56.308239 18592 generic.go:334] "Generic (PLEG): container finished" podID="55182ece-5dae-4bb2-9d2c-cf0fbc3bb55e" containerID="f7fcc5facccab763e3578828ba33e72d72a24f268851ad3dfc9473fa31086bd2" exitCode=0 Mar 08 04:19:56.308443 master-0 kubenswrapper[18592]: I0308 04:19:56.308330 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-host-discover-fvmbl" event={"ID":"55182ece-5dae-4bb2-9d2c-cf0fbc3bb55e","Type":"ContainerDied","Data":"f7fcc5facccab763e3578828ba33e72d72a24f268851ad3dfc9473fa31086bd2"} Mar 08 04:19:56.309014 master-0 kubenswrapper[18592]: W0308 04:19:56.308975 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeaedf508_6629_4535_9b0d_8d1e6aa649c2.slice/crio-c11cecd723656c857e147109f040c97e97d5fa2b248cb0c3dead891105a7facf WatchSource:0}: Error finding container c11cecd723656c857e147109f040c97e97d5fa2b248cb0c3dead891105a7facf: Status 404 returned error can't find the container with id c11cecd723656c857e147109f040c97e97d5fa2b248cb0c3dead891105a7facf Mar 08 04:19:56.324396 master-0 kubenswrapper[18592]: I0308 04:19:56.324354 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 08 04:19:57.361783 master-0 kubenswrapper[18592]: I0308 04:19:57.361701 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"eaedf508-6629-4535-9b0d-8d1e6aa649c2","Type":"ContainerStarted","Data":"007f8a2aeee91b917eda7f979f907ed8bde9d72ecfdb409489c3d0224f2467e5"} Mar 08 04:19:57.366340 master-0 kubenswrapper[18592]: I0308 
04:19:57.365448 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"eaedf508-6629-4535-9b0d-8d1e6aa649c2","Type":"ContainerStarted","Data":"cf20e49358e02721067296d2563306b6b931747c29e5649929a96f15f7ef68f0"} Mar 08 04:19:57.366340 master-0 kubenswrapper[18592]: I0308 04:19:57.365546 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"eaedf508-6629-4535-9b0d-8d1e6aa649c2","Type":"ContainerStarted","Data":"c11cecd723656c857e147109f040c97e97d5fa2b248cb0c3dead891105a7facf"} Mar 08 04:19:57.406570 master-0 kubenswrapper[18592]: I0308 04:19:57.406477 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.406458254 podStartE2EDuration="2.406458254s" podCreationTimestamp="2026-03-08 04:19:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 04:19:57.396236946 +0000 UTC m=+1609.494991306" watchObservedRunningTime="2026-03-08 04:19:57.406458254 +0000 UTC m=+1609.505212604" Mar 08 04:19:57.894987 master-0 kubenswrapper[18592]: I0308 04:19:57.894928 18592 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-host-discover-fvmbl" Mar 08 04:19:58.076635 master-0 kubenswrapper[18592]: I0308 04:19:58.076521 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xpmjg\" (UniqueName: \"kubernetes.io/projected/55182ece-5dae-4bb2-9d2c-cf0fbc3bb55e-kube-api-access-xpmjg\") pod \"55182ece-5dae-4bb2-9d2c-cf0fbc3bb55e\" (UID: \"55182ece-5dae-4bb2-9d2c-cf0fbc3bb55e\") " Mar 08 04:19:58.076635 master-0 kubenswrapper[18592]: I0308 04:19:58.076568 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55182ece-5dae-4bb2-9d2c-cf0fbc3bb55e-combined-ca-bundle\") pod \"55182ece-5dae-4bb2-9d2c-cf0fbc3bb55e\" (UID: \"55182ece-5dae-4bb2-9d2c-cf0fbc3bb55e\") " Mar 08 04:19:58.076957 master-0 kubenswrapper[18592]: I0308 04:19:58.076709 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55182ece-5dae-4bb2-9d2c-cf0fbc3bb55e-scripts\") pod \"55182ece-5dae-4bb2-9d2c-cf0fbc3bb55e\" (UID: \"55182ece-5dae-4bb2-9d2c-cf0fbc3bb55e\") " Mar 08 04:19:58.076957 master-0 kubenswrapper[18592]: I0308 04:19:58.076814 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55182ece-5dae-4bb2-9d2c-cf0fbc3bb55e-config-data\") pod \"55182ece-5dae-4bb2-9d2c-cf0fbc3bb55e\" (UID: \"55182ece-5dae-4bb2-9d2c-cf0fbc3bb55e\") " Mar 08 04:19:58.102257 master-0 kubenswrapper[18592]: I0308 04:19:58.101922 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55182ece-5dae-4bb2-9d2c-cf0fbc3bb55e-scripts" (OuterVolumeSpecName: "scripts") pod "55182ece-5dae-4bb2-9d2c-cf0fbc3bb55e" (UID: "55182ece-5dae-4bb2-9d2c-cf0fbc3bb55e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:19:58.102525 master-0 kubenswrapper[18592]: I0308 04:19:58.102423 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55182ece-5dae-4bb2-9d2c-cf0fbc3bb55e-kube-api-access-xpmjg" (OuterVolumeSpecName: "kube-api-access-xpmjg") pod "55182ece-5dae-4bb2-9d2c-cf0fbc3bb55e" (UID: "55182ece-5dae-4bb2-9d2c-cf0fbc3bb55e"). InnerVolumeSpecName "kube-api-access-xpmjg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 04:19:58.126510 master-0 kubenswrapper[18592]: I0308 04:19:58.126452 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55182ece-5dae-4bb2-9d2c-cf0fbc3bb55e-config-data" (OuterVolumeSpecName: "config-data") pod "55182ece-5dae-4bb2-9d2c-cf0fbc3bb55e" (UID: "55182ece-5dae-4bb2-9d2c-cf0fbc3bb55e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:19:58.145766 master-0 kubenswrapper[18592]: I0308 04:19:58.145684 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55182ece-5dae-4bb2-9d2c-cf0fbc3bb55e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "55182ece-5dae-4bb2-9d2c-cf0fbc3bb55e" (UID: "55182ece-5dae-4bb2-9d2c-cf0fbc3bb55e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:19:58.179640 master-0 kubenswrapper[18592]: I0308 04:19:58.179570 18592 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55182ece-5dae-4bb2-9d2c-cf0fbc3bb55e-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 04:19:58.179640 master-0 kubenswrapper[18592]: I0308 04:19:58.179606 18592 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55182ece-5dae-4bb2-9d2c-cf0fbc3bb55e-config-data\") on node \"master-0\" DevicePath \"\"" Mar 08 04:19:58.179640 master-0 kubenswrapper[18592]: I0308 04:19:58.179617 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xpmjg\" (UniqueName: \"kubernetes.io/projected/55182ece-5dae-4bb2-9d2c-cf0fbc3bb55e-kube-api-access-xpmjg\") on node \"master-0\" DevicePath \"\"" Mar 08 04:19:58.179640 master-0 kubenswrapper[18592]: I0308 04:19:58.179627 18592 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55182ece-5dae-4bb2-9d2c-cf0fbc3bb55e-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 04:19:58.372948 master-0 kubenswrapper[18592]: I0308 04:19:58.372846 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-host-discover-fvmbl" event={"ID":"55182ece-5dae-4bb2-9d2c-cf0fbc3bb55e","Type":"ContainerDied","Data":"0925b77484ab99d48b4ec8476597508088651bee7a462a9c16edeef8fbc7fcba"} Mar 08 04:19:58.373469 master-0 kubenswrapper[18592]: I0308 04:19:58.373449 18592 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0925b77484ab99d48b4ec8476597508088651bee7a462a9c16edeef8fbc7fcba" Mar 08 04:19:58.373535 master-0 kubenswrapper[18592]: I0308 04:19:58.373069 18592 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-host-discover-fvmbl" Mar 08 04:19:58.376251 master-0 kubenswrapper[18592]: I0308 04:19:58.376227 18592 generic.go:334] "Generic (PLEG): container finished" podID="c462f4de-c8cf-4b54-9936-0fb278b41547" containerID="9aeac887db28754d3c3881529c598d3ae3e5127023ee8dd5e1bba09f0cf10681" exitCode=0 Mar 08 04:19:58.376400 master-0 kubenswrapper[18592]: I0308 04:19:58.376317 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-sr2h2" event={"ID":"c462f4de-c8cf-4b54-9936-0fb278b41547","Type":"ContainerDied","Data":"9aeac887db28754d3c3881529c598d3ae3e5127023ee8dd5e1bba09f0cf10681"} Mar 08 04:19:58.754996 master-0 kubenswrapper[18592]: I0308 04:19:58.754953 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-78466d865f-kkm92" Mar 08 04:19:58.871341 master-0 kubenswrapper[18592]: I0308 04:19:58.870936 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58dd7bf96c-57r52"] Mar 08 04:19:58.871341 master-0 kubenswrapper[18592]: I0308 04:19:58.871265 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-58dd7bf96c-57r52" podUID="5fbc5df3-3059-42a1-b753-e325df83f3bd" containerName="dnsmasq-dns" containerID="cri-o://594c4626b43e5156aef5f6f37841f7110536f38a59cde5003eb3067fec95e397" gracePeriod=10 Mar 08 04:19:59.399349 master-0 kubenswrapper[18592]: I0308 04:19:59.399289 18592 generic.go:334] "Generic (PLEG): container finished" podID="5fbc5df3-3059-42a1-b753-e325df83f3bd" containerID="594c4626b43e5156aef5f6f37841f7110536f38a59cde5003eb3067fec95e397" exitCode=0 Mar 08 04:19:59.399927 master-0 kubenswrapper[18592]: I0308 04:19:59.399527 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd7bf96c-57r52" 
event={"ID":"5fbc5df3-3059-42a1-b753-e325df83f3bd","Type":"ContainerDied","Data":"594c4626b43e5156aef5f6f37841f7110536f38a59cde5003eb3067fec95e397"} Mar 08 04:19:59.399927 master-0 kubenswrapper[18592]: I0308 04:19:59.399562 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd7bf96c-57r52" event={"ID":"5fbc5df3-3059-42a1-b753-e325df83f3bd","Type":"ContainerDied","Data":"42171a16a3ca3dcb56e03a2da8f82d627b245ccfc2cd1d284816a44e3380ad26"} Mar 08 04:19:59.399927 master-0 kubenswrapper[18592]: I0308 04:19:59.399579 18592 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="42171a16a3ca3dcb56e03a2da8f82d627b245ccfc2cd1d284816a44e3380ad26" Mar 08 04:19:59.472154 master-0 kubenswrapper[18592]: I0308 04:19:59.472110 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58dd7bf96c-57r52" Mar 08 04:19:59.642151 master-0 kubenswrapper[18592]: I0308 04:19:59.640601 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5fbc5df3-3059-42a1-b753-e325df83f3bd-config\") pod \"5fbc5df3-3059-42a1-b753-e325df83f3bd\" (UID: \"5fbc5df3-3059-42a1-b753-e325df83f3bd\") " Mar 08 04:19:59.642151 master-0 kubenswrapper[18592]: I0308 04:19:59.640722 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5fbc5df3-3059-42a1-b753-e325df83f3bd-dns-svc\") pod \"5fbc5df3-3059-42a1-b753-e325df83f3bd\" (UID: \"5fbc5df3-3059-42a1-b753-e325df83f3bd\") " Mar 08 04:19:59.642151 master-0 kubenswrapper[18592]: I0308 04:19:59.640872 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fp75r\" (UniqueName: \"kubernetes.io/projected/5fbc5df3-3059-42a1-b753-e325df83f3bd-kube-api-access-fp75r\") pod \"5fbc5df3-3059-42a1-b753-e325df83f3bd\" (UID: \"5fbc5df3-3059-42a1-b753-e325df83f3bd\") " 
Mar 08 04:19:59.642151 master-0 kubenswrapper[18592]: I0308 04:19:59.640917 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5fbc5df3-3059-42a1-b753-e325df83f3bd-ovsdbserver-sb\") pod \"5fbc5df3-3059-42a1-b753-e325df83f3bd\" (UID: \"5fbc5df3-3059-42a1-b753-e325df83f3bd\") " Mar 08 04:19:59.642151 master-0 kubenswrapper[18592]: I0308 04:19:59.640943 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5fbc5df3-3059-42a1-b753-e325df83f3bd-dns-swift-storage-0\") pod \"5fbc5df3-3059-42a1-b753-e325df83f3bd\" (UID: \"5fbc5df3-3059-42a1-b753-e325df83f3bd\") " Mar 08 04:19:59.642151 master-0 kubenswrapper[18592]: I0308 04:19:59.640961 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5fbc5df3-3059-42a1-b753-e325df83f3bd-ovsdbserver-nb\") pod \"5fbc5df3-3059-42a1-b753-e325df83f3bd\" (UID: \"5fbc5df3-3059-42a1-b753-e325df83f3bd\") " Mar 08 04:19:59.662579 master-0 kubenswrapper[18592]: I0308 04:19:59.662504 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fbc5df3-3059-42a1-b753-e325df83f3bd-kube-api-access-fp75r" (OuterVolumeSpecName: "kube-api-access-fp75r") pod "5fbc5df3-3059-42a1-b753-e325df83f3bd" (UID: "5fbc5df3-3059-42a1-b753-e325df83f3bd"). InnerVolumeSpecName "kube-api-access-fp75r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 04:19:59.703034 master-0 kubenswrapper[18592]: I0308 04:19:59.702792 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5fbc5df3-3059-42a1-b753-e325df83f3bd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5fbc5df3-3059-42a1-b753-e325df83f3bd" (UID: "5fbc5df3-3059-42a1-b753-e325df83f3bd"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:19:59.722293 master-0 kubenswrapper[18592]: I0308 04:19:59.721623 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5fbc5df3-3059-42a1-b753-e325df83f3bd-config" (OuterVolumeSpecName: "config") pod "5fbc5df3-3059-42a1-b753-e325df83f3bd" (UID: "5fbc5df3-3059-42a1-b753-e325df83f3bd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:19:59.734236 master-0 kubenswrapper[18592]: I0308 04:19:59.734203 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5fbc5df3-3059-42a1-b753-e325df83f3bd-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "5fbc5df3-3059-42a1-b753-e325df83f3bd" (UID: "5fbc5df3-3059-42a1-b753-e325df83f3bd"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:19:59.735441 master-0 kubenswrapper[18592]: I0308 04:19:59.735381 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5fbc5df3-3059-42a1-b753-e325df83f3bd-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5fbc5df3-3059-42a1-b753-e325df83f3bd" (UID: "5fbc5df3-3059-42a1-b753-e325df83f3bd"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:19:59.743812 master-0 kubenswrapper[18592]: I0308 04:19:59.743759 18592 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5fbc5df3-3059-42a1-b753-e325df83f3bd-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Mar 08 04:19:59.743887 master-0 kubenswrapper[18592]: I0308 04:19:59.743808 18592 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5fbc5df3-3059-42a1-b753-e325df83f3bd-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Mar 08 04:19:59.743887 master-0 kubenswrapper[18592]: I0308 04:19:59.743844 18592 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5fbc5df3-3059-42a1-b753-e325df83f3bd-config\") on node \"master-0\" DevicePath \"\"" Mar 08 04:19:59.743887 master-0 kubenswrapper[18592]: I0308 04:19:59.743859 18592 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5fbc5df3-3059-42a1-b753-e325df83f3bd-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 08 04:19:59.743887 master-0 kubenswrapper[18592]: I0308 04:19:59.743871 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fp75r\" (UniqueName: \"kubernetes.io/projected/5fbc5df3-3059-42a1-b753-e325df83f3bd-kube-api-access-fp75r\") on node \"master-0\" DevicePath \"\"" Mar 08 04:19:59.748498 master-0 kubenswrapper[18592]: I0308 04:19:59.748454 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5fbc5df3-3059-42a1-b753-e325df83f3bd-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5fbc5df3-3059-42a1-b753-e325df83f3bd" (UID: "5fbc5df3-3059-42a1-b753-e325df83f3bd"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:19:59.763510 master-0 kubenswrapper[18592]: I0308 04:19:59.763161 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-sr2h2" Mar 08 04:19:59.852876 master-0 kubenswrapper[18592]: I0308 04:19:59.852808 18592 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5fbc5df3-3059-42a1-b753-e325df83f3bd-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Mar 08 04:19:59.954167 master-0 kubenswrapper[18592]: I0308 04:19:59.954120 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c462f4de-c8cf-4b54-9936-0fb278b41547-scripts\") pod \"c462f4de-c8cf-4b54-9936-0fb278b41547\" (UID: \"c462f4de-c8cf-4b54-9936-0fb278b41547\") " Mar 08 04:19:59.954340 master-0 kubenswrapper[18592]: I0308 04:19:59.954265 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c462f4de-c8cf-4b54-9936-0fb278b41547-config-data\") pod \"c462f4de-c8cf-4b54-9936-0fb278b41547\" (UID: \"c462f4de-c8cf-4b54-9936-0fb278b41547\") " Mar 08 04:19:59.954524 master-0 kubenswrapper[18592]: I0308 04:19:59.954505 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56p5b\" (UniqueName: \"kubernetes.io/projected/c462f4de-c8cf-4b54-9936-0fb278b41547-kube-api-access-56p5b\") pod \"c462f4de-c8cf-4b54-9936-0fb278b41547\" (UID: \"c462f4de-c8cf-4b54-9936-0fb278b41547\") " Mar 08 04:19:59.954566 master-0 kubenswrapper[18592]: I0308 04:19:59.954544 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c462f4de-c8cf-4b54-9936-0fb278b41547-combined-ca-bundle\") pod \"c462f4de-c8cf-4b54-9936-0fb278b41547\" (UID: 
\"c462f4de-c8cf-4b54-9936-0fb278b41547\") " Mar 08 04:19:59.962548 master-0 kubenswrapper[18592]: I0308 04:19:59.962486 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c462f4de-c8cf-4b54-9936-0fb278b41547-scripts" (OuterVolumeSpecName: "scripts") pod "c462f4de-c8cf-4b54-9936-0fb278b41547" (UID: "c462f4de-c8cf-4b54-9936-0fb278b41547"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:19:59.962664 master-0 kubenswrapper[18592]: I0308 04:19:59.962586 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c462f4de-c8cf-4b54-9936-0fb278b41547-kube-api-access-56p5b" (OuterVolumeSpecName: "kube-api-access-56p5b") pod "c462f4de-c8cf-4b54-9936-0fb278b41547" (UID: "c462f4de-c8cf-4b54-9936-0fb278b41547"). InnerVolumeSpecName "kube-api-access-56p5b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 04:19:59.983106 master-0 kubenswrapper[18592]: I0308 04:19:59.983050 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c462f4de-c8cf-4b54-9936-0fb278b41547-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c462f4de-c8cf-4b54-9936-0fb278b41547" (UID: "c462f4de-c8cf-4b54-9936-0fb278b41547"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:19:59.992658 master-0 kubenswrapper[18592]: I0308 04:19:59.992609 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c462f4de-c8cf-4b54-9936-0fb278b41547-config-data" (OuterVolumeSpecName: "config-data") pod "c462f4de-c8cf-4b54-9936-0fb278b41547" (UID: "c462f4de-c8cf-4b54-9936-0fb278b41547"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:20:00.056920 master-0 kubenswrapper[18592]: I0308 04:20:00.056874 18592 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c462f4de-c8cf-4b54-9936-0fb278b41547-config-data\") on node \"master-0\" DevicePath \"\"" Mar 08 04:20:00.056920 master-0 kubenswrapper[18592]: I0308 04:20:00.056914 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56p5b\" (UniqueName: \"kubernetes.io/projected/c462f4de-c8cf-4b54-9936-0fb278b41547-kube-api-access-56p5b\") on node \"master-0\" DevicePath \"\"" Mar 08 04:20:00.057136 master-0 kubenswrapper[18592]: I0308 04:20:00.056929 18592 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c462f4de-c8cf-4b54-9936-0fb278b41547-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 04:20:00.057136 master-0 kubenswrapper[18592]: I0308 04:20:00.056938 18592 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c462f4de-c8cf-4b54-9936-0fb278b41547-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 04:20:00.412268 master-0 kubenswrapper[18592]: I0308 04:20:00.412153 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58dd7bf96c-57r52" Mar 08 04:20:00.412268 master-0 kubenswrapper[18592]: I0308 04:20:00.412192 18592 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-sr2h2" Mar 08 04:20:00.412268 master-0 kubenswrapper[18592]: I0308 04:20:00.412194 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-sr2h2" event={"ID":"c462f4de-c8cf-4b54-9936-0fb278b41547","Type":"ContainerDied","Data":"1cc48b1d11a9b40a147fe582bfddf81a3f454351560aa8f2592e7efdff8a7d50"} Mar 08 04:20:00.412802 master-0 kubenswrapper[18592]: I0308 04:20:00.412290 18592 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1cc48b1d11a9b40a147fe582bfddf81a3f454351560aa8f2592e7efdff8a7d50" Mar 08 04:20:00.454681 master-0 kubenswrapper[18592]: I0308 04:20:00.454610 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58dd7bf96c-57r52"] Mar 08 04:20:00.483197 master-0 kubenswrapper[18592]: I0308 04:20:00.483138 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58dd7bf96c-57r52"] Mar 08 04:20:00.622649 master-0 kubenswrapper[18592]: I0308 04:20:00.622589 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 04:20:00.622898 master-0 kubenswrapper[18592]: I0308 04:20:00.622856 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="924fe7b2-8c29-4604-a79a-64d2b5d42e11" containerName="nova-scheduler-scheduler" containerID="cri-o://46c4ac6a03a7b1b9601ef3e9a177ca316ce89e93d57c72aea9367285d73c9e83" gracePeriod=30 Mar 08 04:20:00.637123 master-0 kubenswrapper[18592]: I0308 04:20:00.637065 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 08 04:20:00.637336 master-0 kubenswrapper[18592]: I0308 04:20:00.637290 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="eaedf508-6629-4535-9b0d-8d1e6aa649c2" containerName="nova-api-log" 
containerID="cri-o://cf20e49358e02721067296d2563306b6b931747c29e5649929a96f15f7ef68f0" gracePeriod=30 Mar 08 04:20:00.638406 master-0 kubenswrapper[18592]: I0308 04:20:00.637433 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="eaedf508-6629-4535-9b0d-8d1e6aa649c2" containerName="nova-api-api" containerID="cri-o://007f8a2aeee91b917eda7f979f907ed8bde9d72ecfdb409489c3d0224f2467e5" gracePeriod=30 Mar 08 04:20:00.737537 master-0 kubenswrapper[18592]: I0308 04:20:00.737375 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 04:20:00.737723 master-0 kubenswrapper[18592]: I0308 04:20:00.737600 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="351657d7-7d32-440d-94a9-46bfb27871a4" containerName="nova-metadata-log" containerID="cri-o://4a2defff732ad3eb5100d1e4751d09aad32cc280f036126795e5f22a656d064b" gracePeriod=30 Mar 08 04:20:00.737776 master-0 kubenswrapper[18592]: I0308 04:20:00.737735 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="351657d7-7d32-440d-94a9-46bfb27871a4" containerName="nova-metadata-metadata" containerID="cri-o://8d2ab13b5231ff92fa88342510da9fcbc544f34fe981c9e7bbd3206434ee661c" gracePeriod=30 Mar 08 04:20:01.232527 master-0 kubenswrapper[18592]: I0308 04:20:01.232430 18592 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 08 04:20:01.284426 master-0 kubenswrapper[18592]: I0308 04:20:01.284390 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaedf508-6629-4535-9b0d-8d1e6aa649c2-combined-ca-bundle\") pod \"eaedf508-6629-4535-9b0d-8d1e6aa649c2\" (UID: \"eaedf508-6629-4535-9b0d-8d1e6aa649c2\") " Mar 08 04:20:01.284748 master-0 kubenswrapper[18592]: I0308 04:20:01.284730 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eaedf508-6629-4535-9b0d-8d1e6aa649c2-internal-tls-certs\") pod \"eaedf508-6629-4535-9b0d-8d1e6aa649c2\" (UID: \"eaedf508-6629-4535-9b0d-8d1e6aa649c2\") " Mar 08 04:20:01.284971 master-0 kubenswrapper[18592]: I0308 04:20:01.284958 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eaedf508-6629-4535-9b0d-8d1e6aa649c2-public-tls-certs\") pod \"eaedf508-6629-4535-9b0d-8d1e6aa649c2\" (UID: \"eaedf508-6629-4535-9b0d-8d1e6aa649c2\") " Mar 08 04:20:01.285137 master-0 kubenswrapper[18592]: I0308 04:20:01.285123 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eaedf508-6629-4535-9b0d-8d1e6aa649c2-config-data\") pod \"eaedf508-6629-4535-9b0d-8d1e6aa649c2\" (UID: \"eaedf508-6629-4535-9b0d-8d1e6aa649c2\") " Mar 08 04:20:01.285300 master-0 kubenswrapper[18592]: I0308 04:20:01.285285 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zlrtt\" (UniqueName: \"kubernetes.io/projected/eaedf508-6629-4535-9b0d-8d1e6aa649c2-kube-api-access-zlrtt\") pod \"eaedf508-6629-4535-9b0d-8d1e6aa649c2\" (UID: \"eaedf508-6629-4535-9b0d-8d1e6aa649c2\") " Mar 08 04:20:01.285393 master-0 kubenswrapper[18592]: I0308 04:20:01.285381 18592 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eaedf508-6629-4535-9b0d-8d1e6aa649c2-logs\") pod \"eaedf508-6629-4535-9b0d-8d1e6aa649c2\" (UID: \"eaedf508-6629-4535-9b0d-8d1e6aa649c2\") " Mar 08 04:20:01.286206 master-0 kubenswrapper[18592]: I0308 04:20:01.286190 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eaedf508-6629-4535-9b0d-8d1e6aa649c2-logs" (OuterVolumeSpecName: "logs") pod "eaedf508-6629-4535-9b0d-8d1e6aa649c2" (UID: "eaedf508-6629-4535-9b0d-8d1e6aa649c2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 04:20:01.298861 master-0 kubenswrapper[18592]: I0308 04:20:01.298799 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eaedf508-6629-4535-9b0d-8d1e6aa649c2-kube-api-access-zlrtt" (OuterVolumeSpecName: "kube-api-access-zlrtt") pod "eaedf508-6629-4535-9b0d-8d1e6aa649c2" (UID: "eaedf508-6629-4535-9b0d-8d1e6aa649c2"). InnerVolumeSpecName "kube-api-access-zlrtt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 04:20:01.336837 master-0 kubenswrapper[18592]: I0308 04:20:01.336746 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eaedf508-6629-4535-9b0d-8d1e6aa649c2-config-data" (OuterVolumeSpecName: "config-data") pod "eaedf508-6629-4535-9b0d-8d1e6aa649c2" (UID: "eaedf508-6629-4535-9b0d-8d1e6aa649c2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:20:01.337857 master-0 kubenswrapper[18592]: I0308 04:20:01.337687 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eaedf508-6629-4535-9b0d-8d1e6aa649c2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eaedf508-6629-4535-9b0d-8d1e6aa649c2" (UID: "eaedf508-6629-4535-9b0d-8d1e6aa649c2"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:20:01.349268 master-0 kubenswrapper[18592]: I0308 04:20:01.349215 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eaedf508-6629-4535-9b0d-8d1e6aa649c2-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "eaedf508-6629-4535-9b0d-8d1e6aa649c2" (UID: "eaedf508-6629-4535-9b0d-8d1e6aa649c2"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:20:01.373085 master-0 kubenswrapper[18592]: I0308 04:20:01.373034 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eaedf508-6629-4535-9b0d-8d1e6aa649c2-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "eaedf508-6629-4535-9b0d-8d1e6aa649c2" (UID: "eaedf508-6629-4535-9b0d-8d1e6aa649c2"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:20:01.387156 master-0 kubenswrapper[18592]: I0308 04:20:01.387110 18592 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eaedf508-6629-4535-9b0d-8d1e6aa649c2-public-tls-certs\") on node \"master-0\" DevicePath \"\"" Mar 08 04:20:01.387156 master-0 kubenswrapper[18592]: I0308 04:20:01.387150 18592 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eaedf508-6629-4535-9b0d-8d1e6aa649c2-config-data\") on node \"master-0\" DevicePath \"\"" Mar 08 04:20:01.387436 master-0 kubenswrapper[18592]: I0308 04:20:01.387166 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zlrtt\" (UniqueName: \"kubernetes.io/projected/eaedf508-6629-4535-9b0d-8d1e6aa649c2-kube-api-access-zlrtt\") on node \"master-0\" DevicePath \"\"" Mar 08 04:20:01.387436 master-0 kubenswrapper[18592]: I0308 04:20:01.387179 18592 reconciler_common.go:293] "Volume 
detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eaedf508-6629-4535-9b0d-8d1e6aa649c2-logs\") on node \"master-0\" DevicePath \"\"" Mar 08 04:20:01.387436 master-0 kubenswrapper[18592]: I0308 04:20:01.387190 18592 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaedf508-6629-4535-9b0d-8d1e6aa649c2-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 04:20:01.387436 master-0 kubenswrapper[18592]: I0308 04:20:01.387200 18592 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eaedf508-6629-4535-9b0d-8d1e6aa649c2-internal-tls-certs\") on node \"master-0\" DevicePath \"\"" Mar 08 04:20:01.432858 master-0 kubenswrapper[18592]: I0308 04:20:01.423665 18592 generic.go:334] "Generic (PLEG): container finished" podID="351657d7-7d32-440d-94a9-46bfb27871a4" containerID="4a2defff732ad3eb5100d1e4751d09aad32cc280f036126795e5f22a656d064b" exitCode=143 Mar 08 04:20:01.432858 master-0 kubenswrapper[18592]: I0308 04:20:01.423730 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"351657d7-7d32-440d-94a9-46bfb27871a4","Type":"ContainerDied","Data":"4a2defff732ad3eb5100d1e4751d09aad32cc280f036126795e5f22a656d064b"} Mar 08 04:20:01.432858 master-0 kubenswrapper[18592]: I0308 04:20:01.425230 18592 generic.go:334] "Generic (PLEG): container finished" podID="eaedf508-6629-4535-9b0d-8d1e6aa649c2" containerID="007f8a2aeee91b917eda7f979f907ed8bde9d72ecfdb409489c3d0224f2467e5" exitCode=0 Mar 08 04:20:01.432858 master-0 kubenswrapper[18592]: I0308 04:20:01.425252 18592 generic.go:334] "Generic (PLEG): container finished" podID="eaedf508-6629-4535-9b0d-8d1e6aa649c2" containerID="cf20e49358e02721067296d2563306b6b931747c29e5649929a96f15f7ef68f0" exitCode=143 Mar 08 04:20:01.432858 master-0 kubenswrapper[18592]: I0308 04:20:01.425269 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-0" event={"ID":"eaedf508-6629-4535-9b0d-8d1e6aa649c2","Type":"ContainerDied","Data":"007f8a2aeee91b917eda7f979f907ed8bde9d72ecfdb409489c3d0224f2467e5"} Mar 08 04:20:01.432858 master-0 kubenswrapper[18592]: I0308 04:20:01.425287 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"eaedf508-6629-4535-9b0d-8d1e6aa649c2","Type":"ContainerDied","Data":"cf20e49358e02721067296d2563306b6b931747c29e5649929a96f15f7ef68f0"} Mar 08 04:20:01.432858 master-0 kubenswrapper[18592]: I0308 04:20:01.425298 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"eaedf508-6629-4535-9b0d-8d1e6aa649c2","Type":"ContainerDied","Data":"c11cecd723656c857e147109f040c97e97d5fa2b248cb0c3dead891105a7facf"} Mar 08 04:20:01.432858 master-0 kubenswrapper[18592]: I0308 04:20:01.425314 18592 scope.go:117] "RemoveContainer" containerID="007f8a2aeee91b917eda7f979f907ed8bde9d72ecfdb409489c3d0224f2467e5" Mar 08 04:20:01.432858 master-0 kubenswrapper[18592]: I0308 04:20:01.425444 18592 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 08 04:20:01.476279 master-0 kubenswrapper[18592]: I0308 04:20:01.476215 18592 scope.go:117] "RemoveContainer" containerID="cf20e49358e02721067296d2563306b6b931747c29e5649929a96f15f7ef68f0" Mar 08 04:20:01.478086 master-0 kubenswrapper[18592]: I0308 04:20:01.477931 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 08 04:20:01.503614 master-0 kubenswrapper[18592]: I0308 04:20:01.503475 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 08 04:20:01.535557 master-0 kubenswrapper[18592]: I0308 04:20:01.525111 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 08 04:20:01.535557 master-0 kubenswrapper[18592]: E0308 04:20:01.525843 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fbc5df3-3059-42a1-b753-e325df83f3bd" containerName="init" Mar 08 04:20:01.535557 master-0 kubenswrapper[18592]: I0308 04:20:01.525863 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fbc5df3-3059-42a1-b753-e325df83f3bd" containerName="init" Mar 08 04:20:01.535557 master-0 kubenswrapper[18592]: E0308 04:20:01.525891 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55182ece-5dae-4bb2-9d2c-cf0fbc3bb55e" containerName="nova-manage" Mar 08 04:20:01.535557 master-0 kubenswrapper[18592]: I0308 04:20:01.525900 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="55182ece-5dae-4bb2-9d2c-cf0fbc3bb55e" containerName="nova-manage" Mar 08 04:20:01.535557 master-0 kubenswrapper[18592]: E0308 04:20:01.525922 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaedf508-6629-4535-9b0d-8d1e6aa649c2" containerName="nova-api-log" Mar 08 04:20:01.535557 master-0 kubenswrapper[18592]: I0308 04:20:01.525931 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaedf508-6629-4535-9b0d-8d1e6aa649c2" containerName="nova-api-log" Mar 08 04:20:01.535557 master-0 
kubenswrapper[18592]: E0308 04:20:01.525948 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c462f4de-c8cf-4b54-9936-0fb278b41547" containerName="nova-manage" Mar 08 04:20:01.535557 master-0 kubenswrapper[18592]: I0308 04:20:01.525957 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="c462f4de-c8cf-4b54-9936-0fb278b41547" containerName="nova-manage" Mar 08 04:20:01.535557 master-0 kubenswrapper[18592]: E0308 04:20:01.525981 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaedf508-6629-4535-9b0d-8d1e6aa649c2" containerName="nova-api-api" Mar 08 04:20:01.535557 master-0 kubenswrapper[18592]: I0308 04:20:01.525990 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaedf508-6629-4535-9b0d-8d1e6aa649c2" containerName="nova-api-api" Mar 08 04:20:01.535557 master-0 kubenswrapper[18592]: E0308 04:20:01.526011 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fbc5df3-3059-42a1-b753-e325df83f3bd" containerName="dnsmasq-dns" Mar 08 04:20:01.535557 master-0 kubenswrapper[18592]: I0308 04:20:01.526019 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fbc5df3-3059-42a1-b753-e325df83f3bd" containerName="dnsmasq-dns" Mar 08 04:20:01.535557 master-0 kubenswrapper[18592]: I0308 04:20:01.528691 18592 scope.go:117] "RemoveContainer" containerID="007f8a2aeee91b917eda7f979f907ed8bde9d72ecfdb409489c3d0224f2467e5" Mar 08 04:20:01.535557 master-0 kubenswrapper[18592]: I0308 04:20:01.528898 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="55182ece-5dae-4bb2-9d2c-cf0fbc3bb55e" containerName="nova-manage" Mar 08 04:20:01.535557 master-0 kubenswrapper[18592]: I0308 04:20:01.528945 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="c462f4de-c8cf-4b54-9936-0fb278b41547" containerName="nova-manage" Mar 08 04:20:01.535557 master-0 kubenswrapper[18592]: I0308 04:20:01.528966 18592 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="eaedf508-6629-4535-9b0d-8d1e6aa649c2" containerName="nova-api-api" Mar 08 04:20:01.535557 master-0 kubenswrapper[18592]: I0308 04:20:01.528992 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fbc5df3-3059-42a1-b753-e325df83f3bd" containerName="dnsmasq-dns" Mar 08 04:20:01.535557 master-0 kubenswrapper[18592]: I0308 04:20:01.529020 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="eaedf508-6629-4535-9b0d-8d1e6aa649c2" containerName="nova-api-log" Mar 08 04:20:01.535557 master-0 kubenswrapper[18592]: E0308 04:20:01.531188 18592 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"007f8a2aeee91b917eda7f979f907ed8bde9d72ecfdb409489c3d0224f2467e5\": container with ID starting with 007f8a2aeee91b917eda7f979f907ed8bde9d72ecfdb409489c3d0224f2467e5 not found: ID does not exist" containerID="007f8a2aeee91b917eda7f979f907ed8bde9d72ecfdb409489c3d0224f2467e5" Mar 08 04:20:01.535557 master-0 kubenswrapper[18592]: I0308 04:20:01.531222 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"007f8a2aeee91b917eda7f979f907ed8bde9d72ecfdb409489c3d0224f2467e5"} err="failed to get container status \"007f8a2aeee91b917eda7f979f907ed8bde9d72ecfdb409489c3d0224f2467e5\": rpc error: code = NotFound desc = could not find container \"007f8a2aeee91b917eda7f979f907ed8bde9d72ecfdb409489c3d0224f2467e5\": container with ID starting with 007f8a2aeee91b917eda7f979f907ed8bde9d72ecfdb409489c3d0224f2467e5 not found: ID does not exist" Mar 08 04:20:01.535557 master-0 kubenswrapper[18592]: I0308 04:20:01.531247 18592 scope.go:117] "RemoveContainer" containerID="cf20e49358e02721067296d2563306b6b931747c29e5649929a96f15f7ef68f0" Mar 08 04:20:01.535557 master-0 kubenswrapper[18592]: I0308 04:20:01.532023 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 08 04:20:01.535557 master-0 kubenswrapper[18592]: E0308 04:20:01.533233 18592 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf20e49358e02721067296d2563306b6b931747c29e5649929a96f15f7ef68f0\": container with ID starting with cf20e49358e02721067296d2563306b6b931747c29e5649929a96f15f7ef68f0 not found: ID does not exist" containerID="cf20e49358e02721067296d2563306b6b931747c29e5649929a96f15f7ef68f0" Mar 08 04:20:01.535557 master-0 kubenswrapper[18592]: I0308 04:20:01.533254 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf20e49358e02721067296d2563306b6b931747c29e5649929a96f15f7ef68f0"} err="failed to get container status \"cf20e49358e02721067296d2563306b6b931747c29e5649929a96f15f7ef68f0\": rpc error: code = NotFound desc = could not find container \"cf20e49358e02721067296d2563306b6b931747c29e5649929a96f15f7ef68f0\": container with ID starting with cf20e49358e02721067296d2563306b6b931747c29e5649929a96f15f7ef68f0 not found: ID does not exist" Mar 08 04:20:01.535557 master-0 kubenswrapper[18592]: I0308 04:20:01.533268 18592 scope.go:117] "RemoveContainer" containerID="007f8a2aeee91b917eda7f979f907ed8bde9d72ecfdb409489c3d0224f2467e5" Mar 08 04:20:01.537592 master-0 kubenswrapper[18592]: I0308 04:20:01.537061 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"007f8a2aeee91b917eda7f979f907ed8bde9d72ecfdb409489c3d0224f2467e5"} err="failed to get container status \"007f8a2aeee91b917eda7f979f907ed8bde9d72ecfdb409489c3d0224f2467e5\": rpc error: code = NotFound desc = could not find container \"007f8a2aeee91b917eda7f979f907ed8bde9d72ecfdb409489c3d0224f2467e5\": container with ID starting with 007f8a2aeee91b917eda7f979f907ed8bde9d72ecfdb409489c3d0224f2467e5 not found: ID does not exist" Mar 08 04:20:01.537592 master-0 kubenswrapper[18592]: I0308 
04:20:01.537091 18592 scope.go:117] "RemoveContainer" containerID="cf20e49358e02721067296d2563306b6b931747c29e5649929a96f15f7ef68f0" Mar 08 04:20:01.537592 master-0 kubenswrapper[18592]: I0308 04:20:01.537333 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf20e49358e02721067296d2563306b6b931747c29e5649929a96f15f7ef68f0"} err="failed to get container status \"cf20e49358e02721067296d2563306b6b931747c29e5649929a96f15f7ef68f0\": rpc error: code = NotFound desc = could not find container \"cf20e49358e02721067296d2563306b6b931747c29e5649929a96f15f7ef68f0\": container with ID starting with cf20e49358e02721067296d2563306b6b931747c29e5649929a96f15f7ef68f0 not found: ID does not exist" Mar 08 04:20:01.537592 master-0 kubenswrapper[18592]: I0308 04:20:01.537448 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 08 04:20:01.537751 master-0 kubenswrapper[18592]: I0308 04:20:01.537606 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 08 04:20:01.537751 master-0 kubenswrapper[18592]: I0308 04:20:01.537720 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 08 04:20:01.579148 master-0 kubenswrapper[18592]: I0308 04:20:01.539745 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 08 04:20:01.696028 master-0 kubenswrapper[18592]: I0308 04:20:01.695964 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b94212a-655e-4381-b7f0-d195a9157e27-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4b94212a-655e-4381-b7f0-d195a9157e27\") " pod="openstack/nova-api-0" Mar 08 04:20:01.696232 master-0 kubenswrapper[18592]: I0308 04:20:01.696083 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b94212a-655e-4381-b7f0-d195a9157e27-logs\") pod \"nova-api-0\" (UID: \"4b94212a-655e-4381-b7f0-d195a9157e27\") " pod="openstack/nova-api-0" Mar 08 04:20:01.696232 master-0 kubenswrapper[18592]: I0308 04:20:01.696200 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b94212a-655e-4381-b7f0-d195a9157e27-internal-tls-certs\") pod \"nova-api-0\" (UID: \"4b94212a-655e-4381-b7f0-d195a9157e27\") " pod="openstack/nova-api-0" Mar 08 04:20:01.696292 master-0 kubenswrapper[18592]: I0308 04:20:01.696256 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b94212a-655e-4381-b7f0-d195a9157e27-config-data\") pod \"nova-api-0\" (UID: \"4b94212a-655e-4381-b7f0-d195a9157e27\") " pod="openstack/nova-api-0" Mar 08 04:20:01.696355 master-0 kubenswrapper[18592]: I0308 04:20:01.696325 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b94212a-655e-4381-b7f0-d195a9157e27-public-tls-certs\") pod \"nova-api-0\" (UID: \"4b94212a-655e-4381-b7f0-d195a9157e27\") " pod="openstack/nova-api-0" Mar 08 04:20:01.696806 master-0 kubenswrapper[18592]: I0308 04:20:01.696767 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74k96\" (UniqueName: \"kubernetes.io/projected/4b94212a-655e-4381-b7f0-d195a9157e27-kube-api-access-74k96\") pod \"nova-api-0\" (UID: \"4b94212a-655e-4381-b7f0-d195a9157e27\") " pod="openstack/nova-api-0" Mar 08 04:20:01.799312 master-0 kubenswrapper[18592]: I0308 04:20:01.799177 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4b94212a-655e-4381-b7f0-d195a9157e27-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4b94212a-655e-4381-b7f0-d195a9157e27\") " pod="openstack/nova-api-0" Mar 08 04:20:01.800077 master-0 kubenswrapper[18592]: I0308 04:20:01.800024 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b94212a-655e-4381-b7f0-d195a9157e27-logs\") pod \"nova-api-0\" (UID: \"4b94212a-655e-4381-b7f0-d195a9157e27\") " pod="openstack/nova-api-0" Mar 08 04:20:01.800264 master-0 kubenswrapper[18592]: I0308 04:20:01.800236 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b94212a-655e-4381-b7f0-d195a9157e27-internal-tls-certs\") pod \"nova-api-0\" (UID: \"4b94212a-655e-4381-b7f0-d195a9157e27\") " pod="openstack/nova-api-0" Mar 08 04:20:01.800361 master-0 kubenswrapper[18592]: I0308 04:20:01.800338 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b94212a-655e-4381-b7f0-d195a9157e27-config-data\") pod \"nova-api-0\" (UID: \"4b94212a-655e-4381-b7f0-d195a9157e27\") " pod="openstack/nova-api-0" Mar 08 04:20:01.800402 master-0 kubenswrapper[18592]: I0308 04:20:01.800361 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b94212a-655e-4381-b7f0-d195a9157e27-public-tls-certs\") pod \"nova-api-0\" (UID: \"4b94212a-655e-4381-b7f0-d195a9157e27\") " pod="openstack/nova-api-0" Mar 08 04:20:01.800500 master-0 kubenswrapper[18592]: I0308 04:20:01.800465 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b94212a-655e-4381-b7f0-d195a9157e27-logs\") pod \"nova-api-0\" (UID: \"4b94212a-655e-4381-b7f0-d195a9157e27\") " pod="openstack/nova-api-0" Mar 08 04:20:01.801072 master-0 
kubenswrapper[18592]: I0308 04:20:01.801040 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74k96\" (UniqueName: \"kubernetes.io/projected/4b94212a-655e-4381-b7f0-d195a9157e27-kube-api-access-74k96\") pod \"nova-api-0\" (UID: \"4b94212a-655e-4381-b7f0-d195a9157e27\") " pod="openstack/nova-api-0" Mar 08 04:20:01.802746 master-0 kubenswrapper[18592]: I0308 04:20:01.802717 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b94212a-655e-4381-b7f0-d195a9157e27-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4b94212a-655e-4381-b7f0-d195a9157e27\") " pod="openstack/nova-api-0" Mar 08 04:20:01.803800 master-0 kubenswrapper[18592]: I0308 04:20:01.803765 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b94212a-655e-4381-b7f0-d195a9157e27-internal-tls-certs\") pod \"nova-api-0\" (UID: \"4b94212a-655e-4381-b7f0-d195a9157e27\") " pod="openstack/nova-api-0" Mar 08 04:20:01.804190 master-0 kubenswrapper[18592]: I0308 04:20:01.804174 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b94212a-655e-4381-b7f0-d195a9157e27-config-data\") pod \"nova-api-0\" (UID: \"4b94212a-655e-4381-b7f0-d195a9157e27\") " pod="openstack/nova-api-0" Mar 08 04:20:01.804481 master-0 kubenswrapper[18592]: I0308 04:20:01.804459 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b94212a-655e-4381-b7f0-d195a9157e27-public-tls-certs\") pod \"nova-api-0\" (UID: \"4b94212a-655e-4381-b7f0-d195a9157e27\") " pod="openstack/nova-api-0" Mar 08 04:20:01.822006 master-0 kubenswrapper[18592]: I0308 04:20:01.820534 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74k96\" (UniqueName: 
\"kubernetes.io/projected/4b94212a-655e-4381-b7f0-d195a9157e27-kube-api-access-74k96\") pod \"nova-api-0\" (UID: \"4b94212a-655e-4381-b7f0-d195a9157e27\") " pod="openstack/nova-api-0" Mar 08 04:20:01.914022 master-0 kubenswrapper[18592]: I0308 04:20:01.913957 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 08 04:20:02.162547 master-0 kubenswrapper[18592]: I0308 04:20:02.161515 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fbc5df3-3059-42a1-b753-e325df83f3bd" path="/var/lib/kubelet/pods/5fbc5df3-3059-42a1-b753-e325df83f3bd/volumes" Mar 08 04:20:02.163918 master-0 kubenswrapper[18592]: I0308 04:20:02.162971 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eaedf508-6629-4535-9b0d-8d1e6aa649c2" path="/var/lib/kubelet/pods/eaedf508-6629-4535-9b0d-8d1e6aa649c2/volumes" Mar 08 04:20:02.452300 master-0 kubenswrapper[18592]: I0308 04:20:02.452068 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 08 04:20:03.194331 master-0 kubenswrapper[18592]: E0308 04:20:03.194265 18592 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="46c4ac6a03a7b1b9601ef3e9a177ca316ce89e93d57c72aea9367285d73c9e83" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 08 04:20:03.196523 master-0 kubenswrapper[18592]: E0308 04:20:03.196473 18592 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="46c4ac6a03a7b1b9601ef3e9a177ca316ce89e93d57c72aea9367285d73c9e83" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 08 04:20:03.198041 master-0 kubenswrapper[18592]: E0308 04:20:03.198006 18592 log.go:32] "ExecSync cmd from runtime service failed" 
err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="46c4ac6a03a7b1b9601ef3e9a177ca316ce89e93d57c72aea9367285d73c9e83" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 08 04:20:03.198134 master-0 kubenswrapper[18592]: E0308 04:20:03.198040 18592 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="924fe7b2-8c29-4604-a79a-64d2b5d42e11" containerName="nova-scheduler-scheduler" Mar 08 04:20:03.458896 master-0 kubenswrapper[18592]: I0308 04:20:03.458677 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4b94212a-655e-4381-b7f0-d195a9157e27","Type":"ContainerStarted","Data":"e1bdfb88e836b70256d8be3719b3a54d2a01f6334e596d5bab92e479cbe2710d"} Mar 08 04:20:03.458896 master-0 kubenswrapper[18592]: I0308 04:20:03.458744 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4b94212a-655e-4381-b7f0-d195a9157e27","Type":"ContainerStarted","Data":"b2326ad91fa8234e56be8b05a350c3d08e63ce4619212988900c2a5eb65eb7e8"} Mar 08 04:20:03.458896 master-0 kubenswrapper[18592]: I0308 04:20:03.458765 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4b94212a-655e-4381-b7f0-d195a9157e27","Type":"ContainerStarted","Data":"eaf52fd0f8053dd0635fbc14bf9e932486cc2cdc6eb6d621f151f6fb023bf464"} Mar 08 04:20:03.487168 master-0 kubenswrapper[18592]: I0308 04:20:03.487046 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.487023793 podStartE2EDuration="2.487023793s" podCreationTimestamp="2026-03-08 04:20:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-08 04:20:03.479144169 +0000 UTC m=+1615.577898539" watchObservedRunningTime="2026-03-08 04:20:03.487023793 +0000 UTC m=+1615.585778153" Mar 08 04:20:04.479595 master-0 kubenswrapper[18592]: I0308 04:20:04.479522 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"351657d7-7d32-440d-94a9-46bfb27871a4","Type":"ContainerDied","Data":"8d2ab13b5231ff92fa88342510da9fcbc544f34fe981c9e7bbd3206434ee661c"} Mar 08 04:20:04.479595 master-0 kubenswrapper[18592]: I0308 04:20:04.479474 18592 generic.go:334] "Generic (PLEG): container finished" podID="351657d7-7d32-440d-94a9-46bfb27871a4" containerID="8d2ab13b5231ff92fa88342510da9fcbc544f34fe981c9e7bbd3206434ee661c" exitCode=0 Mar 08 04:20:04.480521 master-0 kubenswrapper[18592]: I0308 04:20:04.479697 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"351657d7-7d32-440d-94a9-46bfb27871a4","Type":"ContainerDied","Data":"5d706115ff8f139ddc612bd9f55faf254fd521eb7961da4b68a614ddd94b8596"} Mar 08 04:20:04.480521 master-0 kubenswrapper[18592]: I0308 04:20:04.479720 18592 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d706115ff8f139ddc612bd9f55faf254fd521eb7961da4b68a614ddd94b8596" Mar 08 04:20:04.515666 master-0 kubenswrapper[18592]: I0308 04:20:04.515576 18592 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 04:20:04.576764 master-0 kubenswrapper[18592]: I0308 04:20:04.575196 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/351657d7-7d32-440d-94a9-46bfb27871a4-logs\") pod \"351657d7-7d32-440d-94a9-46bfb27871a4\" (UID: \"351657d7-7d32-440d-94a9-46bfb27871a4\") " Mar 08 04:20:04.576764 master-0 kubenswrapper[18592]: I0308 04:20:04.575514 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/351657d7-7d32-440d-94a9-46bfb27871a4-nova-metadata-tls-certs\") pod \"351657d7-7d32-440d-94a9-46bfb27871a4\" (UID: \"351657d7-7d32-440d-94a9-46bfb27871a4\") " Mar 08 04:20:04.576764 master-0 kubenswrapper[18592]: I0308 04:20:04.575557 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fg8hq\" (UniqueName: \"kubernetes.io/projected/351657d7-7d32-440d-94a9-46bfb27871a4-kube-api-access-fg8hq\") pod \"351657d7-7d32-440d-94a9-46bfb27871a4\" (UID: \"351657d7-7d32-440d-94a9-46bfb27871a4\") " Mar 08 04:20:04.576764 master-0 kubenswrapper[18592]: I0308 04:20:04.575606 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/351657d7-7d32-440d-94a9-46bfb27871a4-config-data\") pod \"351657d7-7d32-440d-94a9-46bfb27871a4\" (UID: \"351657d7-7d32-440d-94a9-46bfb27871a4\") " Mar 08 04:20:04.576764 master-0 kubenswrapper[18592]: I0308 04:20:04.575917 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/351657d7-7d32-440d-94a9-46bfb27871a4-combined-ca-bundle\") pod \"351657d7-7d32-440d-94a9-46bfb27871a4\" (UID: \"351657d7-7d32-440d-94a9-46bfb27871a4\") " Mar 08 04:20:04.581323 master-0 kubenswrapper[18592]: I0308 04:20:04.581232 18592 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/351657d7-7d32-440d-94a9-46bfb27871a4-logs" (OuterVolumeSpecName: "logs") pod "351657d7-7d32-440d-94a9-46bfb27871a4" (UID: "351657d7-7d32-440d-94a9-46bfb27871a4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 04:20:04.603380 master-0 kubenswrapper[18592]: I0308 04:20:04.603298 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/351657d7-7d32-440d-94a9-46bfb27871a4-kube-api-access-fg8hq" (OuterVolumeSpecName: "kube-api-access-fg8hq") pod "351657d7-7d32-440d-94a9-46bfb27871a4" (UID: "351657d7-7d32-440d-94a9-46bfb27871a4"). InnerVolumeSpecName "kube-api-access-fg8hq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 04:20:04.631726 master-0 kubenswrapper[18592]: I0308 04:20:04.631638 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/351657d7-7d32-440d-94a9-46bfb27871a4-config-data" (OuterVolumeSpecName: "config-data") pod "351657d7-7d32-440d-94a9-46bfb27871a4" (UID: "351657d7-7d32-440d-94a9-46bfb27871a4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:20:04.646409 master-0 kubenswrapper[18592]: I0308 04:20:04.646274 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/351657d7-7d32-440d-94a9-46bfb27871a4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "351657d7-7d32-440d-94a9-46bfb27871a4" (UID: "351657d7-7d32-440d-94a9-46bfb27871a4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:20:04.663262 master-0 kubenswrapper[18592]: I0308 04:20:04.663202 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/351657d7-7d32-440d-94a9-46bfb27871a4-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "351657d7-7d32-440d-94a9-46bfb27871a4" (UID: "351657d7-7d32-440d-94a9-46bfb27871a4"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:20:04.678819 master-0 kubenswrapper[18592]: I0308 04:20:04.678756 18592 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/351657d7-7d32-440d-94a9-46bfb27871a4-config-data\") on node \"master-0\" DevicePath \"\"" Mar 08 04:20:04.678819 master-0 kubenswrapper[18592]: I0308 04:20:04.678808 18592 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/351657d7-7d32-440d-94a9-46bfb27871a4-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 04:20:04.679067 master-0 kubenswrapper[18592]: I0308 04:20:04.678860 18592 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/351657d7-7d32-440d-94a9-46bfb27871a4-logs\") on node \"master-0\" DevicePath \"\"" Mar 08 04:20:04.679067 master-0 kubenswrapper[18592]: I0308 04:20:04.678882 18592 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/351657d7-7d32-440d-94a9-46bfb27871a4-nova-metadata-tls-certs\") on node \"master-0\" DevicePath \"\"" Mar 08 04:20:04.679067 master-0 kubenswrapper[18592]: I0308 04:20:04.678905 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fg8hq\" (UniqueName: \"kubernetes.io/projected/351657d7-7d32-440d-94a9-46bfb27871a4-kube-api-access-fg8hq\") on node \"master-0\" DevicePath \"\"" Mar 08 04:20:05.517426 
master-0 kubenswrapper[18592]: I0308 04:20:05.517322 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 04:20:05.595382 master-0 kubenswrapper[18592]: I0308 04:20:05.595291 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 04:20:05.615729 master-0 kubenswrapper[18592]: I0308 04:20:05.615650 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 04:20:05.644042 master-0 kubenswrapper[18592]: I0308 04:20:05.643978 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 08 04:20:05.644646 master-0 kubenswrapper[18592]: E0308 04:20:05.644615 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="351657d7-7d32-440d-94a9-46bfb27871a4" containerName="nova-metadata-metadata" Mar 08 04:20:05.644646 master-0 kubenswrapper[18592]: I0308 04:20:05.644642 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="351657d7-7d32-440d-94a9-46bfb27871a4" containerName="nova-metadata-metadata" Mar 08 04:20:05.644733 master-0 kubenswrapper[18592]: E0308 04:20:05.644717 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="351657d7-7d32-440d-94a9-46bfb27871a4" containerName="nova-metadata-log" Mar 08 04:20:05.644733 master-0 kubenswrapper[18592]: I0308 04:20:05.644725 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="351657d7-7d32-440d-94a9-46bfb27871a4" containerName="nova-metadata-log" Mar 08 04:20:05.645004 master-0 kubenswrapper[18592]: I0308 04:20:05.644981 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="351657d7-7d32-440d-94a9-46bfb27871a4" containerName="nova-metadata-metadata" Mar 08 04:20:05.645055 master-0 kubenswrapper[18592]: I0308 04:20:05.645026 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="351657d7-7d32-440d-94a9-46bfb27871a4" containerName="nova-metadata-log" Mar 08 04:20:05.646941 
master-0 kubenswrapper[18592]: I0308 04:20:05.646903 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 04:20:05.648106 master-0 kubenswrapper[18592]: I0308 04:20:05.648065 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 04:20:05.653451 master-0 kubenswrapper[18592]: I0308 04:20:05.653413 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 08 04:20:05.654166 master-0 kubenswrapper[18592]: I0308 04:20:05.654119 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 08 04:20:05.807646 master-0 kubenswrapper[18592]: I0308 04:20:05.807507 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6708fcdc-73fb-496e-8be6-9bff2a926ce2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6708fcdc-73fb-496e-8be6-9bff2a926ce2\") " pod="openstack/nova-metadata-0" Mar 08 04:20:05.807646 master-0 kubenswrapper[18592]: I0308 04:20:05.807629 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6708fcdc-73fb-496e-8be6-9bff2a926ce2-config-data\") pod \"nova-metadata-0\" (UID: \"6708fcdc-73fb-496e-8be6-9bff2a926ce2\") " pod="openstack/nova-metadata-0" Mar 08 04:20:05.807926 master-0 kubenswrapper[18592]: I0308 04:20:05.807751 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6708fcdc-73fb-496e-8be6-9bff2a926ce2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6708fcdc-73fb-496e-8be6-9bff2a926ce2\") " pod="openstack/nova-metadata-0" Mar 08 04:20:05.807926 master-0 kubenswrapper[18592]: I0308 04:20:05.807883 18592 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6708fcdc-73fb-496e-8be6-9bff2a926ce2-logs\") pod \"nova-metadata-0\" (UID: \"6708fcdc-73fb-496e-8be6-9bff2a926ce2\") " pod="openstack/nova-metadata-0" Mar 08 04:20:05.808049 master-0 kubenswrapper[18592]: I0308 04:20:05.808021 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9jd6\" (UniqueName: \"kubernetes.io/projected/6708fcdc-73fb-496e-8be6-9bff2a926ce2-kube-api-access-q9jd6\") pod \"nova-metadata-0\" (UID: \"6708fcdc-73fb-496e-8be6-9bff2a926ce2\") " pod="openstack/nova-metadata-0" Mar 08 04:20:05.909987 master-0 kubenswrapper[18592]: I0308 04:20:05.909939 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6708fcdc-73fb-496e-8be6-9bff2a926ce2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6708fcdc-73fb-496e-8be6-9bff2a926ce2\") " pod="openstack/nova-metadata-0" Mar 08 04:20:05.910304 master-0 kubenswrapper[18592]: I0308 04:20:05.910282 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6708fcdc-73fb-496e-8be6-9bff2a926ce2-config-data\") pod \"nova-metadata-0\" (UID: \"6708fcdc-73fb-496e-8be6-9bff2a926ce2\") " pod="openstack/nova-metadata-0" Mar 08 04:20:05.910481 master-0 kubenswrapper[18592]: I0308 04:20:05.910463 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6708fcdc-73fb-496e-8be6-9bff2a926ce2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6708fcdc-73fb-496e-8be6-9bff2a926ce2\") " pod="openstack/nova-metadata-0" Mar 08 04:20:05.910637 master-0 kubenswrapper[18592]: I0308 04:20:05.910617 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6708fcdc-73fb-496e-8be6-9bff2a926ce2-logs\") pod \"nova-metadata-0\" (UID: \"6708fcdc-73fb-496e-8be6-9bff2a926ce2\") " pod="openstack/nova-metadata-0" Mar 08 04:20:05.910816 master-0 kubenswrapper[18592]: I0308 04:20:05.910798 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9jd6\" (UniqueName: \"kubernetes.io/projected/6708fcdc-73fb-496e-8be6-9bff2a926ce2-kube-api-access-q9jd6\") pod \"nova-metadata-0\" (UID: \"6708fcdc-73fb-496e-8be6-9bff2a926ce2\") " pod="openstack/nova-metadata-0" Mar 08 04:20:05.912044 master-0 kubenswrapper[18592]: I0308 04:20:05.911341 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6708fcdc-73fb-496e-8be6-9bff2a926ce2-logs\") pod \"nova-metadata-0\" (UID: \"6708fcdc-73fb-496e-8be6-9bff2a926ce2\") " pod="openstack/nova-metadata-0" Mar 08 04:20:05.915638 master-0 kubenswrapper[18592]: I0308 04:20:05.915574 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6708fcdc-73fb-496e-8be6-9bff2a926ce2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6708fcdc-73fb-496e-8be6-9bff2a926ce2\") " pod="openstack/nova-metadata-0" Mar 08 04:20:05.916779 master-0 kubenswrapper[18592]: I0308 04:20:05.916708 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6708fcdc-73fb-496e-8be6-9bff2a926ce2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6708fcdc-73fb-496e-8be6-9bff2a926ce2\") " pod="openstack/nova-metadata-0" Mar 08 04:20:05.919604 master-0 kubenswrapper[18592]: I0308 04:20:05.919575 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6708fcdc-73fb-496e-8be6-9bff2a926ce2-config-data\") pod \"nova-metadata-0\" (UID: 
\"6708fcdc-73fb-496e-8be6-9bff2a926ce2\") " pod="openstack/nova-metadata-0" Mar 08 04:20:05.944943 master-0 kubenswrapper[18592]: I0308 04:20:05.944797 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9jd6\" (UniqueName: \"kubernetes.io/projected/6708fcdc-73fb-496e-8be6-9bff2a926ce2-kube-api-access-q9jd6\") pod \"nova-metadata-0\" (UID: \"6708fcdc-73fb-496e-8be6-9bff2a926ce2\") " pod="openstack/nova-metadata-0" Mar 08 04:20:05.971892 master-0 kubenswrapper[18592]: I0308 04:20:05.971841 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 04:20:06.178476 master-0 kubenswrapper[18592]: I0308 04:20:06.178366 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="351657d7-7d32-440d-94a9-46bfb27871a4" path="/var/lib/kubelet/pods/351657d7-7d32-440d-94a9-46bfb27871a4/volumes" Mar 08 04:20:06.603547 master-0 kubenswrapper[18592]: I0308 04:20:06.594268 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 04:20:07.503013 master-0 kubenswrapper[18592]: I0308 04:20:07.502652 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 04:20:07.561290 master-0 kubenswrapper[18592]: I0308 04:20:07.561241 18592 generic.go:334] "Generic (PLEG): container finished" podID="924fe7b2-8c29-4604-a79a-64d2b5d42e11" containerID="46c4ac6a03a7b1b9601ef3e9a177ca316ce89e93d57c72aea9367285d73c9e83" exitCode=0 Mar 08 04:20:07.561497 master-0 kubenswrapper[18592]: I0308 04:20:07.561300 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"924fe7b2-8c29-4604-a79a-64d2b5d42e11","Type":"ContainerDied","Data":"46c4ac6a03a7b1b9601ef3e9a177ca316ce89e93d57c72aea9367285d73c9e83"} Mar 08 04:20:07.561497 master-0 kubenswrapper[18592]: I0308 04:20:07.561300 18592 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 04:20:07.561497 master-0 kubenswrapper[18592]: I0308 04:20:07.561374 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"924fe7b2-8c29-4604-a79a-64d2b5d42e11","Type":"ContainerDied","Data":"45829af79144168cf1f8c4db2fa251a0c058146ada5c1f2149ce4f7f89f7fe85"} Mar 08 04:20:07.561497 master-0 kubenswrapper[18592]: I0308 04:20:07.561400 18592 scope.go:117] "RemoveContainer" containerID="46c4ac6a03a7b1b9601ef3e9a177ca316ce89e93d57c72aea9367285d73c9e83" Mar 08 04:20:07.567441 master-0 kubenswrapper[18592]: I0308 04:20:07.567409 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6708fcdc-73fb-496e-8be6-9bff2a926ce2","Type":"ContainerStarted","Data":"0c2f708377d16870323881ea310cd3471b0e258f5ee8f1e35cef4beedd6389bf"} Mar 08 04:20:07.567544 master-0 kubenswrapper[18592]: I0308 04:20:07.567530 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6708fcdc-73fb-496e-8be6-9bff2a926ce2","Type":"ContainerStarted","Data":"932e0fd8794202697cd6b5f2e79e5d88a15d6d39dccbdc22193238aae2bfb41f"} Mar 08 04:20:07.567627 master-0 kubenswrapper[18592]: I0308 04:20:07.567614 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6708fcdc-73fb-496e-8be6-9bff2a926ce2","Type":"ContainerStarted","Data":"65fccf68f1b10142712b8bf30b6191e94582965bc746743a8e48a386b34007a1"} Mar 08 04:20:07.615882 master-0 kubenswrapper[18592]: I0308 04:20:07.611461 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.611438667 podStartE2EDuration="2.611438667s" podCreationTimestamp="2026-03-08 04:20:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 04:20:07.595857763 +0000 UTC m=+1619.694612113" 
watchObservedRunningTime="2026-03-08 04:20:07.611438667 +0000 UTC m=+1619.710193017" Mar 08 04:20:07.615882 master-0 kubenswrapper[18592]: I0308 04:20:07.614958 18592 scope.go:117] "RemoveContainer" containerID="46c4ac6a03a7b1b9601ef3e9a177ca316ce89e93d57c72aea9367285d73c9e83" Mar 08 04:20:07.617845 master-0 kubenswrapper[18592]: E0308 04:20:07.617477 18592 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46c4ac6a03a7b1b9601ef3e9a177ca316ce89e93d57c72aea9367285d73c9e83\": container with ID starting with 46c4ac6a03a7b1b9601ef3e9a177ca316ce89e93d57c72aea9367285d73c9e83 not found: ID does not exist" containerID="46c4ac6a03a7b1b9601ef3e9a177ca316ce89e93d57c72aea9367285d73c9e83" Mar 08 04:20:07.617845 master-0 kubenswrapper[18592]: I0308 04:20:07.617528 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46c4ac6a03a7b1b9601ef3e9a177ca316ce89e93d57c72aea9367285d73c9e83"} err="failed to get container status \"46c4ac6a03a7b1b9601ef3e9a177ca316ce89e93d57c72aea9367285d73c9e83\": rpc error: code = NotFound desc = could not find container \"46c4ac6a03a7b1b9601ef3e9a177ca316ce89e93d57c72aea9367285d73c9e83\": container with ID starting with 46c4ac6a03a7b1b9601ef3e9a177ca316ce89e93d57c72aea9367285d73c9e83 not found: ID does not exist" Mar 08 04:20:07.670903 master-0 kubenswrapper[18592]: I0308 04:20:07.670748 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/924fe7b2-8c29-4604-a79a-64d2b5d42e11-combined-ca-bundle\") pod \"924fe7b2-8c29-4604-a79a-64d2b5d42e11\" (UID: \"924fe7b2-8c29-4604-a79a-64d2b5d42e11\") " Mar 08 04:20:07.671103 master-0 kubenswrapper[18592]: I0308 04:20:07.670973 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/924fe7b2-8c29-4604-a79a-64d2b5d42e11-config-data\") pod \"924fe7b2-8c29-4604-a79a-64d2b5d42e11\" (UID: \"924fe7b2-8c29-4604-a79a-64d2b5d42e11\") " Mar 08 04:20:07.671103 master-0 kubenswrapper[18592]: I0308 04:20:07.671000 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9q2n\" (UniqueName: \"kubernetes.io/projected/924fe7b2-8c29-4604-a79a-64d2b5d42e11-kube-api-access-t9q2n\") pod \"924fe7b2-8c29-4604-a79a-64d2b5d42e11\" (UID: \"924fe7b2-8c29-4604-a79a-64d2b5d42e11\") " Mar 08 04:20:07.676481 master-0 kubenswrapper[18592]: I0308 04:20:07.676429 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/924fe7b2-8c29-4604-a79a-64d2b5d42e11-kube-api-access-t9q2n" (OuterVolumeSpecName: "kube-api-access-t9q2n") pod "924fe7b2-8c29-4604-a79a-64d2b5d42e11" (UID: "924fe7b2-8c29-4604-a79a-64d2b5d42e11"). InnerVolumeSpecName "kube-api-access-t9q2n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 04:20:07.697064 master-0 kubenswrapper[18592]: I0308 04:20:07.697004 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/924fe7b2-8c29-4604-a79a-64d2b5d42e11-config-data" (OuterVolumeSpecName: "config-data") pod "924fe7b2-8c29-4604-a79a-64d2b5d42e11" (UID: "924fe7b2-8c29-4604-a79a-64d2b5d42e11"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:20:07.721911 master-0 kubenswrapper[18592]: I0308 04:20:07.721861 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/924fe7b2-8c29-4604-a79a-64d2b5d42e11-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "924fe7b2-8c29-4604-a79a-64d2b5d42e11" (UID: "924fe7b2-8c29-4604-a79a-64d2b5d42e11"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:20:07.773630 master-0 kubenswrapper[18592]: I0308 04:20:07.773553 18592 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/924fe7b2-8c29-4604-a79a-64d2b5d42e11-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 04:20:07.773630 master-0 kubenswrapper[18592]: I0308 04:20:07.773604 18592 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/924fe7b2-8c29-4604-a79a-64d2b5d42e11-config-data\") on node \"master-0\" DevicePath \"\"" Mar 08 04:20:07.773630 master-0 kubenswrapper[18592]: I0308 04:20:07.773613 18592 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t9q2n\" (UniqueName: \"kubernetes.io/projected/924fe7b2-8c29-4604-a79a-64d2b5d42e11-kube-api-access-t9q2n\") on node \"master-0\" DevicePath \"\"" Mar 08 04:20:07.903872 master-0 kubenswrapper[18592]: I0308 04:20:07.902334 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 04:20:07.915852 master-0 kubenswrapper[18592]: I0308 04:20:07.915751 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 04:20:07.950158 master-0 kubenswrapper[18592]: I0308 04:20:07.950065 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 04:20:07.951649 master-0 kubenswrapper[18592]: E0308 04:20:07.951615 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="924fe7b2-8c29-4604-a79a-64d2b5d42e11" containerName="nova-scheduler-scheduler" Mar 08 04:20:07.951854 master-0 kubenswrapper[18592]: I0308 04:20:07.951811 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="924fe7b2-8c29-4604-a79a-64d2b5d42e11" containerName="nova-scheduler-scheduler" Mar 08 04:20:07.952645 master-0 kubenswrapper[18592]: I0308 04:20:07.952607 18592 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="924fe7b2-8c29-4604-a79a-64d2b5d42e11" containerName="nova-scheduler-scheduler" Mar 08 04:20:07.957358 master-0 kubenswrapper[18592]: I0308 04:20:07.957318 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 04:20:07.961188 master-0 kubenswrapper[18592]: I0308 04:20:07.960158 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 08 04:20:07.977053 master-0 kubenswrapper[18592]: I0308 04:20:07.977005 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 04:20:08.078811 master-0 kubenswrapper[18592]: I0308 04:20:08.078757 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52hhp\" (UniqueName: \"kubernetes.io/projected/330d8046-3011-4ca0-9240-0c55e8117d1d-kube-api-access-52hhp\") pod \"nova-scheduler-0\" (UID: \"330d8046-3011-4ca0-9240-0c55e8117d1d\") " pod="openstack/nova-scheduler-0" Mar 08 04:20:08.079027 master-0 kubenswrapper[18592]: I0308 04:20:08.078934 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/330d8046-3011-4ca0-9240-0c55e8117d1d-config-data\") pod \"nova-scheduler-0\" (UID: \"330d8046-3011-4ca0-9240-0c55e8117d1d\") " pod="openstack/nova-scheduler-0" Mar 08 04:20:08.079027 master-0 kubenswrapper[18592]: I0308 04:20:08.078987 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/330d8046-3011-4ca0-9240-0c55e8117d1d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"330d8046-3011-4ca0-9240-0c55e8117d1d\") " pod="openstack/nova-scheduler-0" Mar 08 04:20:08.163999 master-0 kubenswrapper[18592]: I0308 04:20:08.163591 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="924fe7b2-8c29-4604-a79a-64d2b5d42e11" path="/var/lib/kubelet/pods/924fe7b2-8c29-4604-a79a-64d2b5d42e11/volumes" Mar 08 04:20:08.184599 master-0 kubenswrapper[18592]: I0308 04:20:08.184539 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/330d8046-3011-4ca0-9240-0c55e8117d1d-config-data\") pod \"nova-scheduler-0\" (UID: \"330d8046-3011-4ca0-9240-0c55e8117d1d\") " pod="openstack/nova-scheduler-0" Mar 08 04:20:08.184800 master-0 kubenswrapper[18592]: I0308 04:20:08.184645 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/330d8046-3011-4ca0-9240-0c55e8117d1d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"330d8046-3011-4ca0-9240-0c55e8117d1d\") " pod="openstack/nova-scheduler-0" Mar 08 04:20:08.184800 master-0 kubenswrapper[18592]: I0308 04:20:08.184753 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52hhp\" (UniqueName: \"kubernetes.io/projected/330d8046-3011-4ca0-9240-0c55e8117d1d-kube-api-access-52hhp\") pod \"nova-scheduler-0\" (UID: \"330d8046-3011-4ca0-9240-0c55e8117d1d\") " pod="openstack/nova-scheduler-0" Mar 08 04:20:08.190106 master-0 kubenswrapper[18592]: I0308 04:20:08.189809 18592 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 08 04:20:08.196916 master-0 kubenswrapper[18592]: I0308 04:20:08.196800 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/330d8046-3011-4ca0-9240-0c55e8117d1d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"330d8046-3011-4ca0-9240-0c55e8117d1d\") " pod="openstack/nova-scheduler-0" Mar 08 04:20:08.204997 master-0 kubenswrapper[18592]: I0308 04:20:08.204902 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/330d8046-3011-4ca0-9240-0c55e8117d1d-config-data\") pod \"nova-scheduler-0\" (UID: \"330d8046-3011-4ca0-9240-0c55e8117d1d\") " pod="openstack/nova-scheduler-0" Mar 08 04:20:08.207809 master-0 kubenswrapper[18592]: I0308 04:20:08.207640 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52hhp\" (UniqueName: \"kubernetes.io/projected/330d8046-3011-4ca0-9240-0c55e8117d1d-kube-api-access-52hhp\") pod \"nova-scheduler-0\" (UID: \"330d8046-3011-4ca0-9240-0c55e8117d1d\") " pod="openstack/nova-scheduler-0" Mar 08 04:20:08.297958 master-0 kubenswrapper[18592]: I0308 04:20:08.297904 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 04:20:08.990100 master-0 kubenswrapper[18592]: W0308 04:20:08.990025 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod330d8046_3011_4ca0_9240_0c55e8117d1d.slice/crio-17d5561789fcd62c42f5084d2152ad03d619d6bc8f3af3c336de2b41dd4ebc4e WatchSource:0}: Error finding container 17d5561789fcd62c42f5084d2152ad03d619d6bc8f3af3c336de2b41dd4ebc4e: Status 404 returned error can't find the container with id 17d5561789fcd62c42f5084d2152ad03d619d6bc8f3af3c336de2b41dd4ebc4e Mar 08 04:20:09.004351 master-0 kubenswrapper[18592]: I0308 04:20:09.004270 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 04:20:09.601941 master-0 kubenswrapper[18592]: I0308 04:20:09.601716 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"330d8046-3011-4ca0-9240-0c55e8117d1d","Type":"ContainerStarted","Data":"03985a5166284de76bf1958feee58fe3e4238a212e98833294e23660f326510a"} Mar 08 04:20:09.601941 master-0 kubenswrapper[18592]: I0308 04:20:09.601777 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"330d8046-3011-4ca0-9240-0c55e8117d1d","Type":"ContainerStarted","Data":"17d5561789fcd62c42f5084d2152ad03d619d6bc8f3af3c336de2b41dd4ebc4e"} Mar 08 04:20:09.635754 master-0 kubenswrapper[18592]: I0308 04:20:09.635689 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.635666382 podStartE2EDuration="2.635666382s" podCreationTimestamp="2026-03-08 04:20:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 04:20:09.622945896 +0000 UTC m=+1621.721700256" watchObservedRunningTime="2026-03-08 04:20:09.635666382 +0000 UTC m=+1621.734420732" Mar 08 04:20:10.972427 master-0 kubenswrapper[18592]: I0308 04:20:10.972346 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 08 04:20:10.972427 master-0 kubenswrapper[18592]: I0308 04:20:10.972402 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 08 04:20:11.915338 master-0 kubenswrapper[18592]: I0308 04:20:11.915194 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 08 04:20:11.915338 master-0 kubenswrapper[18592]: I0308 04:20:11.915258 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 08 04:20:12.930978 master-0 kubenswrapper[18592]: I0308 04:20:12.930918 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4b94212a-655e-4381-b7f0-d195a9157e27" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.128.1.15:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 04:20:12.931500 master-0 kubenswrapper[18592]: I0308 04:20:12.931109 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" 
podUID="4b94212a-655e-4381-b7f0-d195a9157e27" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.128.1.15:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 08 04:20:13.298585 master-0 kubenswrapper[18592]: I0308 04:20:13.298468 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 08 04:20:15.972620 master-0 kubenswrapper[18592]: I0308 04:20:15.972548 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 08 04:20:15.973604 master-0 kubenswrapper[18592]: I0308 04:20:15.973574 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 08 04:20:16.992114 master-0 kubenswrapper[18592]: I0308 04:20:16.992033 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="6708fcdc-73fb-496e-8be6-9bff2a926ce2" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.128.1.16:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 04:20:16.994274 master-0 kubenswrapper[18592]: I0308 04:20:16.992053 18592 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="6708fcdc-73fb-496e-8be6-9bff2a926ce2" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.128.1.16:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 08 04:20:18.302435 master-0 kubenswrapper[18592]: I0308 04:20:18.302338 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 08 04:20:18.373267 master-0 kubenswrapper[18592]: I0308 04:20:18.373217 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 08 04:20:18.782398 master-0 kubenswrapper[18592]: I0308 04:20:18.782349 18592 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 08 04:20:21.926603 master-0 kubenswrapper[18592]: I0308 04:20:21.926499 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 08 04:20:21.927594 master-0 kubenswrapper[18592]: I0308 04:20:21.927285 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 08 04:20:21.927594 master-0 kubenswrapper[18592]: I0308 04:20:21.927486 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 08 04:20:21.945611 master-0 kubenswrapper[18592]: I0308 04:20:21.945537 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 08 04:20:22.827678 master-0 kubenswrapper[18592]: I0308 04:20:22.827602 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 08 04:20:22.841336 master-0 kubenswrapper[18592]: I0308 04:20:22.841231 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 08 04:20:25.978146 master-0 kubenswrapper[18592]: I0308 04:20:25.978072 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 08 04:20:25.980907 master-0 kubenswrapper[18592]: I0308 04:20:25.980858 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 08 04:20:25.983371 master-0 kubenswrapper[18592]: I0308 04:20:25.983321 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 08 04:20:26.902666 master-0 kubenswrapper[18592]: I0308 04:20:26.902553 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 08 04:20:53.633843 master-0 kubenswrapper[18592]: I0308 04:20:53.633124 18592 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["sushy-emulator/sushy-emulator-78f6d7d749-hbj9h"] Mar 08 04:20:53.633843 master-0 kubenswrapper[18592]: I0308 04:20:53.633465 18592 kuberuntime_container.go:808] "Killing container with a grace period" pod="sushy-emulator/sushy-emulator-78f6d7d749-hbj9h" podUID="5a03f3b9-7c27-4c2d-8a8e-c342aeb10529" containerName="sushy-emulator" containerID="cri-o://39a7d69d7fa5fbc824daf31104d7fa6c79f37c222ffbf496e54783c529e2ef3f" gracePeriod=30 Mar 08 04:20:54.344065 master-0 kubenswrapper[18592]: I0308 04:20:54.343992 18592 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="sushy-emulator/sushy-emulator-78f6d7d749-hbj9h" Mar 08 04:20:54.377040 master-0 kubenswrapper[18592]: I0308 04:20:54.376973 18592 generic.go:334] "Generic (PLEG): container finished" podID="5a03f3b9-7c27-4c2d-8a8e-c342aeb10529" containerID="39a7d69d7fa5fbc824daf31104d7fa6c79f37c222ffbf496e54783c529e2ef3f" exitCode=0 Mar 08 04:20:54.377040 master-0 kubenswrapper[18592]: I0308 04:20:54.377028 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/sushy-emulator-78f6d7d749-hbj9h" event={"ID":"5a03f3b9-7c27-4c2d-8a8e-c342aeb10529","Type":"ContainerDied","Data":"39a7d69d7fa5fbc824daf31104d7fa6c79f37c222ffbf496e54783c529e2ef3f"} Mar 08 04:20:54.377040 master-0 kubenswrapper[18592]: I0308 04:20:54.377058 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/sushy-emulator-78f6d7d749-hbj9h" event={"ID":"5a03f3b9-7c27-4c2d-8a8e-c342aeb10529","Type":"ContainerDied","Data":"169ba855e99c242c62ce625cd99ee1ae8b7ddb2407177178462abe5503ad814c"} Mar 08 04:20:54.377533 master-0 kubenswrapper[18592]: I0308 04:20:54.377076 18592 scope.go:117] "RemoveContainer" containerID="39a7d69d7fa5fbc824daf31104d7fa6c79f37c222ffbf496e54783c529e2ef3f" Mar 08 04:20:54.377533 master-0 kubenswrapper[18592]: I0308 04:20:54.377207 18592 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="sushy-emulator/sushy-emulator-78f6d7d749-hbj9h" Mar 08 04:20:54.437206 master-0 kubenswrapper[18592]: I0308 04:20:54.437167 18592 scope.go:117] "RemoveContainer" containerID="39a7d69d7fa5fbc824daf31104d7fa6c79f37c222ffbf496e54783c529e2ef3f" Mar 08 04:20:54.438149 master-0 kubenswrapper[18592]: E0308 04:20:54.438101 18592 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39a7d69d7fa5fbc824daf31104d7fa6c79f37c222ffbf496e54783c529e2ef3f\": container with ID starting with 39a7d69d7fa5fbc824daf31104d7fa6c79f37c222ffbf496e54783c529e2ef3f not found: ID does not exist" containerID="39a7d69d7fa5fbc824daf31104d7fa6c79f37c222ffbf496e54783c529e2ef3f" Mar 08 04:20:54.438226 master-0 kubenswrapper[18592]: I0308 04:20:54.438156 18592 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39a7d69d7fa5fbc824daf31104d7fa6c79f37c222ffbf496e54783c529e2ef3f"} err="failed to get container status \"39a7d69d7fa5fbc824daf31104d7fa6c79f37c222ffbf496e54783c529e2ef3f\": rpc error: code = NotFound desc = could not find container \"39a7d69d7fa5fbc824daf31104d7fa6c79f37c222ffbf496e54783c529e2ef3f\": container with ID starting with 39a7d69d7fa5fbc824daf31104d7fa6c79f37c222ffbf496e54783c529e2ef3f not found: ID does not exist" Mar 08 04:20:54.468856 master-0 kubenswrapper[18592]: I0308 04:20:54.466314 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/5a03f3b9-7c27-4c2d-8a8e-c342aeb10529-os-client-config\") pod \"5a03f3b9-7c27-4c2d-8a8e-c342aeb10529\" (UID: \"5a03f3b9-7c27-4c2d-8a8e-c342aeb10529\") " Mar 08 04:20:54.468856 master-0 kubenswrapper[18592]: I0308 04:20:54.466791 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-656fq\" (UniqueName: 
\"kubernetes.io/projected/5a03f3b9-7c27-4c2d-8a8e-c342aeb10529-kube-api-access-656fq\") pod \"5a03f3b9-7c27-4c2d-8a8e-c342aeb10529\" (UID: \"5a03f3b9-7c27-4c2d-8a8e-c342aeb10529\") " Mar 08 04:20:54.468856 master-0 kubenswrapper[18592]: I0308 04:20:54.466906 18592 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/5a03f3b9-7c27-4c2d-8a8e-c342aeb10529-sushy-emulator-config\") pod \"5a03f3b9-7c27-4c2d-8a8e-c342aeb10529\" (UID: \"5a03f3b9-7c27-4c2d-8a8e-c342aeb10529\") " Mar 08 04:20:54.471911 master-0 kubenswrapper[18592]: I0308 04:20:54.470953 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a03f3b9-7c27-4c2d-8a8e-c342aeb10529-sushy-emulator-config" (OuterVolumeSpecName: "sushy-emulator-config") pod "5a03f3b9-7c27-4c2d-8a8e-c342aeb10529" (UID: "5a03f3b9-7c27-4c2d-8a8e-c342aeb10529"). InnerVolumeSpecName "sushy-emulator-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 04:20:54.482712 master-0 kubenswrapper[18592]: I0308 04:20:54.482658 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a03f3b9-7c27-4c2d-8a8e-c342aeb10529-kube-api-access-656fq" (OuterVolumeSpecName: "kube-api-access-656fq") pod "5a03f3b9-7c27-4c2d-8a8e-c342aeb10529" (UID: "5a03f3b9-7c27-4c2d-8a8e-c342aeb10529"). InnerVolumeSpecName "kube-api-access-656fq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 04:20:54.484968 master-0 kubenswrapper[18592]: I0308 04:20:54.484916 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["sushy-emulator/sushy-emulator-84965d5d88-wwnkd"] Mar 08 04:20:54.485352 master-0 kubenswrapper[18592]: I0308 04:20:54.485320 18592 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a03f3b9-7c27-4c2d-8a8e-c342aeb10529-os-client-config" (OuterVolumeSpecName: "os-client-config") pod "5a03f3b9-7c27-4c2d-8a8e-c342aeb10529" (UID: "5a03f3b9-7c27-4c2d-8a8e-c342aeb10529"). InnerVolumeSpecName "os-client-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:20:54.485872 master-0 kubenswrapper[18592]: E0308 04:20:54.485813 18592 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a03f3b9-7c27-4c2d-8a8e-c342aeb10529" containerName="sushy-emulator" Mar 08 04:20:54.485872 master-0 kubenswrapper[18592]: I0308 04:20:54.485870 18592 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a03f3b9-7c27-4c2d-8a8e-c342aeb10529" containerName="sushy-emulator" Mar 08 04:20:54.486267 master-0 kubenswrapper[18592]: I0308 04:20:54.486232 18592 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a03f3b9-7c27-4c2d-8a8e-c342aeb10529" containerName="sushy-emulator" Mar 08 04:20:54.488679 master-0 kubenswrapper[18592]: I0308 04:20:54.488561 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="sushy-emulator/sushy-emulator-84965d5d88-wwnkd" Mar 08 04:20:54.499766 master-0 kubenswrapper[18592]: I0308 04:20:54.499629 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/sushy-emulator-84965d5d88-wwnkd"] Mar 08 04:20:54.570407 master-0 kubenswrapper[18592]: I0308 04:20:54.570344 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgpt4\" (UniqueName: \"kubernetes.io/projected/503eadb9-7904-4ee7-8f02-037baf4500bf-kube-api-access-jgpt4\") pod \"sushy-emulator-84965d5d88-wwnkd\" (UID: \"503eadb9-7904-4ee7-8f02-037baf4500bf\") " pod="sushy-emulator/sushy-emulator-84965d5d88-wwnkd" Mar 08 04:20:54.570575 master-0 kubenswrapper[18592]: I0308 04:20:54.570534 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/503eadb9-7904-4ee7-8f02-037baf4500bf-sushy-emulator-config\") pod \"sushy-emulator-84965d5d88-wwnkd\" (UID: \"503eadb9-7904-4ee7-8f02-037baf4500bf\") " pod="sushy-emulator/sushy-emulator-84965d5d88-wwnkd" Mar 08 04:20:54.570777 master-0 kubenswrapper[18592]: I0308 04:20:54.570733 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/503eadb9-7904-4ee7-8f02-037baf4500bf-os-client-config\") pod \"sushy-emulator-84965d5d88-wwnkd\" (UID: \"503eadb9-7904-4ee7-8f02-037baf4500bf\") " pod="sushy-emulator/sushy-emulator-84965d5d88-wwnkd" Mar 08 04:20:54.571145 master-0 kubenswrapper[18592]: I0308 04:20:54.571010 18592 reconciler_common.go:293] "Volume detached for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/5a03f3b9-7c27-4c2d-8a8e-c342aeb10529-os-client-config\") on node \"master-0\" DevicePath \"\"" Mar 08 04:20:54.571145 master-0 kubenswrapper[18592]: I0308 04:20:54.571035 18592 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-656fq\" (UniqueName: \"kubernetes.io/projected/5a03f3b9-7c27-4c2d-8a8e-c342aeb10529-kube-api-access-656fq\") on node \"master-0\" DevicePath \"\"" Mar 08 04:20:54.571145 master-0 kubenswrapper[18592]: I0308 04:20:54.571050 18592 reconciler_common.go:293] "Volume detached for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/5a03f3b9-7c27-4c2d-8a8e-c342aeb10529-sushy-emulator-config\") on node \"master-0\" DevicePath \"\"" Mar 08 04:20:54.673637 master-0 kubenswrapper[18592]: I0308 04:20:54.673562 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgpt4\" (UniqueName: \"kubernetes.io/projected/503eadb9-7904-4ee7-8f02-037baf4500bf-kube-api-access-jgpt4\") pod \"sushy-emulator-84965d5d88-wwnkd\" (UID: \"503eadb9-7904-4ee7-8f02-037baf4500bf\") " pod="sushy-emulator/sushy-emulator-84965d5d88-wwnkd" Mar 08 04:20:54.674395 master-0 kubenswrapper[18592]: I0308 04:20:54.673763 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/503eadb9-7904-4ee7-8f02-037baf4500bf-sushy-emulator-config\") pod \"sushy-emulator-84965d5d88-wwnkd\" (UID: \"503eadb9-7904-4ee7-8f02-037baf4500bf\") " pod="sushy-emulator/sushy-emulator-84965d5d88-wwnkd" Mar 08 04:20:54.674395 master-0 kubenswrapper[18592]: I0308 04:20:54.673850 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/503eadb9-7904-4ee7-8f02-037baf4500bf-os-client-config\") pod \"sushy-emulator-84965d5d88-wwnkd\" (UID: \"503eadb9-7904-4ee7-8f02-037baf4500bf\") " pod="sushy-emulator/sushy-emulator-84965d5d88-wwnkd" Mar 08 04:20:54.675020 master-0 kubenswrapper[18592]: I0308 04:20:54.674971 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sushy-emulator-config\" (UniqueName: 
\"kubernetes.io/configmap/503eadb9-7904-4ee7-8f02-037baf4500bf-sushy-emulator-config\") pod \"sushy-emulator-84965d5d88-wwnkd\" (UID: \"503eadb9-7904-4ee7-8f02-037baf4500bf\") " pod="sushy-emulator/sushy-emulator-84965d5d88-wwnkd" Mar 08 04:20:54.678858 master-0 kubenswrapper[18592]: I0308 04:20:54.678770 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/503eadb9-7904-4ee7-8f02-037baf4500bf-os-client-config\") pod \"sushy-emulator-84965d5d88-wwnkd\" (UID: \"503eadb9-7904-4ee7-8f02-037baf4500bf\") " pod="sushy-emulator/sushy-emulator-84965d5d88-wwnkd" Mar 08 04:20:54.692344 master-0 kubenswrapper[18592]: I0308 04:20:54.692291 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgpt4\" (UniqueName: \"kubernetes.io/projected/503eadb9-7904-4ee7-8f02-037baf4500bf-kube-api-access-jgpt4\") pod \"sushy-emulator-84965d5d88-wwnkd\" (UID: \"503eadb9-7904-4ee7-8f02-037baf4500bf\") " pod="sushy-emulator/sushy-emulator-84965d5d88-wwnkd" Mar 08 04:20:54.736837 master-0 kubenswrapper[18592]: I0308 04:20:54.736749 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["sushy-emulator/sushy-emulator-78f6d7d749-hbj9h"] Mar 08 04:20:54.754062 master-0 kubenswrapper[18592]: I0308 04:20:54.753957 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["sushy-emulator/sushy-emulator-78f6d7d749-hbj9h"] Mar 08 04:20:54.873417 master-0 kubenswrapper[18592]: I0308 04:20:54.873363 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="sushy-emulator/sushy-emulator-84965d5d88-wwnkd" Mar 08 04:20:55.522141 master-0 kubenswrapper[18592]: W0308 04:20:55.522080 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod503eadb9_7904_4ee7_8f02_037baf4500bf.slice/crio-3bd1295f5a2344db41284b388a3bee5514fc3954f5f19c720b6c2ddfa95baefb WatchSource:0}: Error finding container 3bd1295f5a2344db41284b388a3bee5514fc3954f5f19c720b6c2ddfa95baefb: Status 404 returned error can't find the container with id 3bd1295f5a2344db41284b388a3bee5514fc3954f5f19c720b6c2ddfa95baefb Mar 08 04:20:55.525718 master-0 kubenswrapper[18592]: I0308 04:20:55.525664 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/sushy-emulator-84965d5d88-wwnkd"] Mar 08 04:20:56.159942 master-0 kubenswrapper[18592]: I0308 04:20:56.159819 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a03f3b9-7c27-4c2d-8a8e-c342aeb10529" path="/var/lib/kubelet/pods/5a03f3b9-7c27-4c2d-8a8e-c342aeb10529/volumes" Mar 08 04:20:56.410364 master-0 kubenswrapper[18592]: I0308 04:20:56.410219 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/sushy-emulator-84965d5d88-wwnkd" event={"ID":"503eadb9-7904-4ee7-8f02-037baf4500bf","Type":"ContainerStarted","Data":"61554c74d30c1f059bd76b5b8db253e64aeec0c19e2cee2da7969bb38e5d3216"} Mar 08 04:20:56.410364 master-0 kubenswrapper[18592]: I0308 04:20:56.410295 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/sushy-emulator-84965d5d88-wwnkd" event={"ID":"503eadb9-7904-4ee7-8f02-037baf4500bf","Type":"ContainerStarted","Data":"3bd1295f5a2344db41284b388a3bee5514fc3954f5f19c720b6c2ddfa95baefb"} Mar 08 04:20:56.434577 master-0 kubenswrapper[18592]: I0308 04:20:56.434466 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="sushy-emulator/sushy-emulator-84965d5d88-wwnkd" podStartSLOduration=2.434440702 
podStartE2EDuration="2.434440702s" podCreationTimestamp="2026-03-08 04:20:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 04:20:56.428029017 +0000 UTC m=+1668.526783397" watchObservedRunningTime="2026-03-08 04:20:56.434440702 +0000 UTC m=+1668.533195082" Mar 08 04:21:04.873883 master-0 kubenswrapper[18592]: I0308 04:21:04.873773 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="sushy-emulator/sushy-emulator-84965d5d88-wwnkd" Mar 08 04:21:04.875066 master-0 kubenswrapper[18592]: I0308 04:21:04.873934 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="sushy-emulator/sushy-emulator-84965d5d88-wwnkd" Mar 08 04:21:04.889443 master-0 kubenswrapper[18592]: I0308 04:21:04.889352 18592 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="sushy-emulator/sushy-emulator-84965d5d88-wwnkd" Mar 08 04:21:05.537545 master-0 kubenswrapper[18592]: I0308 04:21:05.537450 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="sushy-emulator/sushy-emulator-84965d5d88-wwnkd" Mar 08 04:21:11.256558 master-0 kubenswrapper[18592]: I0308 04:21:11.256438 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-664cb58b85-lrnks_634c0f6d-bce6-42cf-9253-80d1bcc7c507/cluster-samples-operator/0.log" Mar 08 04:21:11.262107 master-0 kubenswrapper[18592]: I0308 04:21:11.262067 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-664cb58b85-lrnks_634c0f6d-bce6-42cf-9253-80d1bcc7c507/cluster-samples-operator-watch/0.log" Mar 08 04:22:23.897755 master-0 kubenswrapper[18592]: I0308 04:22:23.897664 18592 scope.go:117] "RemoveContainer" containerID="de1e46bba5defdbd0fdb2eccd135173714e135192f882fbe3a7a4118754955b4" Mar 08 04:22:23.941574 master-0 kubenswrapper[18592]: 
I0308 04:22:23.941526 18592 scope.go:117] "RemoveContainer" containerID="714c452b7ac9c59920d86a7a152becd45938f5747a5e5273faeb7e0a69bf031d" Mar 08 04:23:24.073145 master-0 kubenswrapper[18592]: I0308 04:23:24.073026 18592 scope.go:117] "RemoveContainer" containerID="815a2ab3203863df3a809e029c429539a9587ad09ba883aa8520d64bec13105d" Mar 08 04:23:24.124809 master-0 kubenswrapper[18592]: I0308 04:23:24.124721 18592 scope.go:117] "RemoveContainer" containerID="3394f4094cde91116b129fc4d22534c98fbc6ce883bd0bd22cae7316a2296746" Mar 08 04:25:24.319972 master-0 kubenswrapper[18592]: I0308 04:25:24.319903 18592 scope.go:117] "RemoveContainer" containerID="e51ccc50fb947177adef20d9bab2569798703badf9faa82f42a9e8c66b4fba81" Mar 08 04:25:24.349422 master-0 kubenswrapper[18592]: I0308 04:25:24.349328 18592 scope.go:117] "RemoveContainer" containerID="594c4626b43e5156aef5f6f37841f7110536f38a59cde5003eb3067fec95e397" Mar 08 04:25:59.095859 master-0 kubenswrapper[18592]: I0308 04:25:59.094414 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-f96f-account-create-update-69kzj"] Mar 08 04:25:59.125358 master-0 kubenswrapper[18592]: I0308 04:25:59.125251 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-f96f-account-create-update-69kzj"] Mar 08 04:26:00.047968 master-0 kubenswrapper[18592]: I0308 04:26:00.047898 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-gc5cl"] Mar 08 04:26:00.093499 master-0 kubenswrapper[18592]: I0308 04:26:00.064306 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-6m2qh"] Mar 08 04:26:00.093499 master-0 kubenswrapper[18592]: I0308 04:26:00.078492 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-gc5cl"] Mar 08 04:26:00.093499 master-0 kubenswrapper[18592]: I0308 04:26:00.091634 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-6m2qh"] Mar 08 
04:26:00.157550 master-0 kubenswrapper[18592]: I0308 04:26:00.157502 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d70e50d-f856-4cb6-bebe-58e2584f70dc" path="/var/lib/kubelet/pods/5d70e50d-f856-4cb6-bebe-58e2584f70dc/volumes" Mar 08 04:26:00.159120 master-0 kubenswrapper[18592]: I0308 04:26:00.159095 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b7bbafa-ea6f-47ac-b71f-1cddb5b4adf6" path="/var/lib/kubelet/pods/6b7bbafa-ea6f-47ac-b71f-1cddb5b4adf6/volumes" Mar 08 04:26:00.161304 master-0 kubenswrapper[18592]: I0308 04:26:00.161282 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7ce4a7b-aee1-4b45-a17c-73289f866b0d" path="/var/lib/kubelet/pods/d7ce4a7b-aee1-4b45-a17c-73289f866b0d/volumes" Mar 08 04:26:01.098306 master-0 kubenswrapper[18592]: I0308 04:26:01.098250 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-d6d7-account-create-update-gmt7m"] Mar 08 04:26:01.108767 master-0 kubenswrapper[18592]: I0308 04:26:01.108702 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-d6d7-account-create-update-gmt7m"] Mar 08 04:26:02.163889 master-0 kubenswrapper[18592]: I0308 04:26:02.163779 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83dacd91-8bfe-4f52-a178-b5b51f46a968" path="/var/lib/kubelet/pods/83dacd91-8bfe-4f52-a178-b5b51f46a968/volumes" Mar 08 04:26:04.066742 master-0 kubenswrapper[18592]: I0308 04:26:04.066657 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-8cd0-account-create-update-95lhk"] Mar 08 04:26:04.084368 master-0 kubenswrapper[18592]: I0308 04:26:04.084301 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-jqtb7"] Mar 08 04:26:04.097101 master-0 kubenswrapper[18592]: I0308 04:26:04.097032 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-8cd0-account-create-update-95lhk"] Mar 08 
04:26:04.108495 master-0 kubenswrapper[18592]: I0308 04:26:04.108421 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-jqtb7"] Mar 08 04:26:04.161543 master-0 kubenswrapper[18592]: I0308 04:26:04.161478 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e539cde-6478-4bc2-9695-96613a0ef358" path="/var/lib/kubelet/pods/6e539cde-6478-4bc2-9695-96613a0ef358/volumes" Mar 08 04:26:04.162957 master-0 kubenswrapper[18592]: I0308 04:26:04.162914 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe552f6f-dbcf-4c72-9829-797c7f222d57" path="/var/lib/kubelet/pods/fe552f6f-dbcf-4c72-9829-797c7f222d57/volumes" Mar 08 04:26:07.057594 master-0 kubenswrapper[18592]: I0308 04:26:07.057500 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-5qfwd"] Mar 08 04:26:07.079119 master-0 kubenswrapper[18592]: I0308 04:26:07.079041 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-5qfwd"] Mar 08 04:26:08.165579 master-0 kubenswrapper[18592]: I0308 04:26:08.165496 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eed04369-6074-4ce6-9455-f47fe669f97d" path="/var/lib/kubelet/pods/eed04369-6074-4ce6-9455-f47fe669f97d/volumes" Mar 08 04:26:24.438405 master-0 kubenswrapper[18592]: I0308 04:26:24.438335 18592 scope.go:117] "RemoveContainer" containerID="cd921792f536fe4f46f0df1900361cf86d7579deb07378440159dccd53a673ed" Mar 08 04:26:24.462519 master-0 kubenswrapper[18592]: I0308 04:26:24.462469 18592 scope.go:117] "RemoveContainer" containerID="e01d199363f07c5fa381899fbf3fe69617c329f6230fb5d4428dd4c45d145b4d" Mar 08 04:26:24.502905 master-0 kubenswrapper[18592]: I0308 04:26:24.502805 18592 scope.go:117] "RemoveContainer" containerID="8d2ab13b5231ff92fa88342510da9fcbc544f34fe981c9e7bbd3206434ee661c" Mar 08 04:26:24.531008 master-0 kubenswrapper[18592]: I0308 04:26:24.530971 18592 scope.go:117] 
"RemoveContainer" containerID="bb11fccfedd8503f4a2d9dc61da670f56dde7f182f3884caef5d346ac9ecdd2d" Mar 08 04:26:24.550250 master-0 kubenswrapper[18592]: I0308 04:26:24.550108 18592 scope.go:117] "RemoveContainer" containerID="ec2577ddda46c4d9d1b163c8303b77fd7ba8ba3bfd5a776087bd44130924b8ae" Mar 08 04:26:24.577663 master-0 kubenswrapper[18592]: I0308 04:26:24.577614 18592 scope.go:117] "RemoveContainer" containerID="4b5d687e97584a9c47540ce435bbb44ed95fee498a0368023874569c8ebdc477" Mar 08 04:26:24.602529 master-0 kubenswrapper[18592]: I0308 04:26:24.602400 18592 scope.go:117] "RemoveContainer" containerID="c1f24c7b7642331135e0d3993c516efd5a955c5d4702f432c9fda8e8c1673ae3" Mar 08 04:26:24.627790 master-0 kubenswrapper[18592]: I0308 04:26:24.627744 18592 scope.go:117] "RemoveContainer" containerID="4a2defff732ad3eb5100d1e4751d09aad32cc280f036126795e5f22a656d064b" Mar 08 04:26:24.653841 master-0 kubenswrapper[18592]: I0308 04:26:24.653677 18592 scope.go:117] "RemoveContainer" containerID="7b154f228ef2e8736ddaa958d956365bfb8eab1e1b87238098a4bd93879e4e96" Mar 08 04:26:37.082975 master-0 kubenswrapper[18592]: I0308 04:26:37.082895 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-k9b8j"] Mar 08 04:26:37.106961 master-0 kubenswrapper[18592]: I0308 04:26:37.105882 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-a1de-account-create-update-2qztj"] Mar 08 04:26:37.124729 master-0 kubenswrapper[18592]: I0308 04:26:37.120890 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-3190-account-create-update-p2mgv"] Mar 08 04:26:37.153842 master-0 kubenswrapper[18592]: I0308 04:26:37.147663 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-7954b"] Mar 08 04:26:37.158814 master-0 kubenswrapper[18592]: I0308 04:26:37.158766 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-7wxjf"] Mar 08 04:26:37.169204 master-0 
kubenswrapper[18592]: I0308 04:26:37.169133 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-k9b8j"] Mar 08 04:26:37.178947 master-0 kubenswrapper[18592]: I0308 04:26:37.178889 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-a1de-account-create-update-2qztj"] Mar 08 04:26:37.188149 master-0 kubenswrapper[18592]: I0308 04:26:37.188099 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-3190-account-create-update-p2mgv"] Mar 08 04:26:37.198124 master-0 kubenswrapper[18592]: I0308 04:26:37.198058 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-7954b"] Mar 08 04:26:37.210514 master-0 kubenswrapper[18592]: I0308 04:26:37.210454 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-7wxjf"] Mar 08 04:26:38.166536 master-0 kubenswrapper[18592]: I0308 04:26:38.166456 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30a31504-5cfd-45af-ae76-d76a8fdb816a" path="/var/lib/kubelet/pods/30a31504-5cfd-45af-ae76-d76a8fdb816a/volumes" Mar 08 04:26:38.168550 master-0 kubenswrapper[18592]: I0308 04:26:38.168505 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e36975b-0c3c-492b-bce9-60a58ceeaf47" path="/var/lib/kubelet/pods/7e36975b-0c3c-492b-bce9-60a58ceeaf47/volumes" Mar 08 04:26:38.170814 master-0 kubenswrapper[18592]: I0308 04:26:38.170775 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce83c2f4-a8a4-4cda-ae8c-3cb1197decca" path="/var/lib/kubelet/pods/ce83c2f4-a8a4-4cda-ae8c-3cb1197decca/volumes" Mar 08 04:26:38.176760 master-0 kubenswrapper[18592]: I0308 04:26:38.176717 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7105244-a23e-43d2-9d49-3cb1b0edf604" path="/var/lib/kubelet/pods/e7105244-a23e-43d2-9d49-3cb1b0edf604/volumes" Mar 08 04:26:38.181800 master-0 kubenswrapper[18592]: I0308 04:26:38.181753 18592 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4af0f8d-1061-4359-ad23-aa58a643206d" path="/var/lib/kubelet/pods/f4af0f8d-1061-4359-ad23-aa58a643206d/volumes" Mar 08 04:26:44.069107 master-0 kubenswrapper[18592]: I0308 04:26:44.069020 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-f2htj"] Mar 08 04:26:44.082520 master-0 kubenswrapper[18592]: I0308 04:26:44.082403 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-f2htj"] Mar 08 04:26:44.163802 master-0 kubenswrapper[18592]: I0308 04:26:44.163691 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3d9ef75-99d9-448e-9fc5-e14b49976a0a" path="/var/lib/kubelet/pods/b3d9ef75-99d9-448e-9fc5-e14b49976a0a/volumes" Mar 08 04:26:50.063851 master-0 kubenswrapper[18592]: I0308 04:26:50.062919 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-e561-account-create-update-pmf7m"] Mar 08 04:26:50.079440 master-0 kubenswrapper[18592]: I0308 04:26:50.079376 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-e561-account-create-update-pmf7m"] Mar 08 04:26:50.165603 master-0 kubenswrapper[18592]: I0308 04:26:50.165516 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81d0a12f-c85b-41ef-a076-efa3dd40f9aa" path="/var/lib/kubelet/pods/81d0a12f-c85b-41ef-a076-efa3dd40f9aa/volumes" Mar 08 04:26:51.067347 master-0 kubenswrapper[18592]: I0308 04:26:51.067154 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-db-create-rk2j4"] Mar 08 04:26:51.080519 master-0 kubenswrapper[18592]: I0308 04:26:51.080446 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-db-create-rk2j4"] Mar 08 04:26:52.162736 master-0 kubenswrapper[18592]: I0308 04:26:52.162667 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f03ac81-08d4-4053-9a33-ae4228fb3d3b" 
path="/var/lib/kubelet/pods/1f03ac81-08d4-4053-9a33-ae4228fb3d3b/volumes" Mar 08 04:27:10.233782 master-0 kubenswrapper[18592]: I0308 04:27:10.232157 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-7ttxm"] Mar 08 04:27:10.250705 master-0 kubenswrapper[18592]: I0308 04:27:10.248146 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-7ttxm"] Mar 08 04:27:12.162646 master-0 kubenswrapper[18592]: I0308 04:27:12.162577 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad7d2ffe-d3ac-4a74-ae47-46241d6c4769" path="/var/lib/kubelet/pods/ad7d2ffe-d3ac-4a74-ae47-46241d6c4769/volumes" Mar 08 04:27:17.071654 master-0 kubenswrapper[18592]: I0308 04:27:17.071576 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-vhcff"] Mar 08 04:27:17.090500 master-0 kubenswrapper[18592]: I0308 04:27:17.090426 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-xr25c"] Mar 08 04:27:17.123285 master-0 kubenswrapper[18592]: I0308 04:27:17.123107 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-vhcff"] Mar 08 04:27:17.155942 master-0 kubenswrapper[18592]: I0308 04:27:17.155814 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-xr25c"] Mar 08 04:27:18.155165 master-0 kubenswrapper[18592]: I0308 04:27:18.154935 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf" path="/var/lib/kubelet/pods/3f1dfc4e-ddac-4bbe-bc95-2e75a5d9cabf/volumes" Mar 08 04:27:18.158041 master-0 kubenswrapper[18592]: I0308 04:27:18.157678 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0eaefda-7b75-43b9-8d18-8c1476db321d" path="/var/lib/kubelet/pods/d0eaefda-7b75-43b9-8d18-8c1476db321d/volumes" Mar 08 04:27:24.904280 master-0 kubenswrapper[18592]: I0308 04:27:24.904203 18592 scope.go:117] 
"RemoveContainer" containerID="c721ea8060313fa5fc0c7961cf3fa40bc577a54204648b62c16a81b5d46812eb"
Mar 08 04:27:24.954850 master-0 kubenswrapper[18592]: I0308 04:27:24.954767 18592 scope.go:117] "RemoveContainer" containerID="9f789bd4f6b5c3120378d674cd8d7777595ed8a68da18849338703d979a42656"
Mar 08 04:27:24.986785 master-0 kubenswrapper[18592]: I0308 04:27:24.986719 18592 scope.go:117] "RemoveContainer" containerID="18b5cd4d1d14d3966303213f9dee13d75278af393efa6352d9cb626117413a9f"
Mar 08 04:27:25.049177 master-0 kubenswrapper[18592]: I0308 04:27:25.045647 18592 scope.go:117] "RemoveContainer" containerID="6c4cfb4394d4b2d083f3fdd5d6040a7cb6d0ea33dc2f9b2a15af4fadfce19341"
Mar 08 04:27:25.054754 master-0 kubenswrapper[18592]: I0308 04:27:25.054678 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-ff301-db-sync-bgkm7"]
Mar 08 04:27:25.073508 master-0 kubenswrapper[18592]: I0308 04:27:25.073439 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-ff301-db-sync-bgkm7"]
Mar 08 04:27:25.096626 master-0 kubenswrapper[18592]: I0308 04:27:25.095092 18592 scope.go:117] "RemoveContainer" containerID="c4b1d930d8c3a49e276f6417985c5bdea4b518aeb0512687ad85f96adb6eb5ba"
Mar 08 04:27:25.126126 master-0 kubenswrapper[18592]: I0308 04:27:25.126076 18592 scope.go:117] "RemoveContainer" containerID="98456ce5c9197901e88dfeb6003bdec1816e2f7af6cb2c9c669b43b5b248a93c"
Mar 08 04:27:25.154845 master-0 kubenswrapper[18592]: I0308 04:27:25.154783 18592 scope.go:117] "RemoveContainer" containerID="4dfcd2877205aebea28a238cd509c700166963d1bb10dd5b57731fe5381fb617"
Mar 08 04:27:25.188785 master-0 kubenswrapper[18592]: I0308 04:27:25.188710 18592 scope.go:117] "RemoveContainer" containerID="82a5ba7a653d1cea9eca76348be272af00817196df0f66112fccb1c20e585194"
Mar 08 04:27:25.224910 master-0 kubenswrapper[18592]: I0308 04:27:25.224818 18592 scope.go:117] "RemoveContainer" containerID="2d05c434f6340493ea5c71599940121bca0203f9dbd9d148e7d7cd6836dd18dd"
Mar 08 04:27:25.257196 master-0 kubenswrapper[18592]: I0308 04:27:25.257099 18592 scope.go:117] "RemoveContainer" containerID="95cd58650c1743a863844b11c731a9d0decec809efde463936ffaea0589654db"
Mar 08 04:27:25.301700 master-0 kubenswrapper[18592]: I0308 04:27:25.301663 18592 scope.go:117] "RemoveContainer" containerID="fddf0c276ff3e965c6a7d21cfe20238c47c133412bb2794d15ef17ace13bbc58"
Mar 08 04:27:26.169961 master-0 kubenswrapper[18592]: I0308 04:27:26.169785 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="117a9c49-cd48-4a2c-bdee-10bb60588b20" path="/var/lib/kubelet/pods/117a9c49-cd48-4a2c-bdee-10bb60588b20/volumes"
Mar 08 04:27:32.081640 master-0 kubenswrapper[18592]: I0308 04:27:32.081107 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-db-sync-78kcs"]
Mar 08 04:27:32.099991 master-0 kubenswrapper[18592]: I0308 04:27:32.099916 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-db-sync-78kcs"]
Mar 08 04:27:32.161063 master-0 kubenswrapper[18592]: I0308 04:27:32.160971 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c11a4533-a895-42a7-8c17-d6f421276ae0" path="/var/lib/kubelet/pods/c11a4533-a895-42a7-8c17-d6f421276ae0/volumes"
Mar 08 04:27:39.052050 master-0 kubenswrapper[18592]: I0308 04:27:39.051981 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-inspector-db-create-kgmn4"]
Mar 08 04:27:39.064334 master-0 kubenswrapper[18592]: I0308 04:27:39.064273 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-inspector-ff09-account-create-update-79kwt"]
Mar 08 04:27:39.075361 master-0 kubenswrapper[18592]: I0308 04:27:39.075307 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-inspector-db-create-kgmn4"]
Mar 08 04:27:39.086132 master-0 kubenswrapper[18592]: I0308 04:27:39.086082 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-inspector-ff09-account-create-update-79kwt"]
Mar 08 04:27:40.165526 master-0 kubenswrapper[18592]: I0308 04:27:40.165431 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70814fbc-2d62-4cc7-b815-5b791cdc5e0a" path="/var/lib/kubelet/pods/70814fbc-2d62-4cc7-b815-5b791cdc5e0a/volumes"
Mar 08 04:27:40.166363 master-0 kubenswrapper[18592]: I0308 04:27:40.166261 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a3f947a-748d-4cfd-a500-d759b58d22f4" path="/var/lib/kubelet/pods/9a3f947a-748d-4cfd-a500-d759b58d22f4/volumes"
Mar 08 04:27:56.048868 master-0 kubenswrapper[18592]: I0308 04:27:56.048586 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-inspector-db-sync-fswtl"]
Mar 08 04:27:56.063802 master-0 kubenswrapper[18592]: I0308 04:27:56.063732 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-inspector-db-sync-fswtl"]
Mar 08 04:27:56.160399 master-0 kubenswrapper[18592]: I0308 04:27:56.160326 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67779b28-c7a4-43b1-bc15-a8401880ff2e" path="/var/lib/kubelet/pods/67779b28-c7a4-43b1-bc15-a8401880ff2e/volumes"
Mar 08 04:28:16.085956 master-0 kubenswrapper[18592]: I0308 04:28:16.085877 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-6jgd8"]
Mar 08 04:28:16.105856 master-0 kubenswrapper[18592]: I0308 04:28:16.105754 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-f75d-account-create-update-fxddp"]
Mar 08 04:28:16.121590 master-0 kubenswrapper[18592]: I0308 04:28:16.121477 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-6jgd8"]
Mar 08 04:28:16.132081 master-0 kubenswrapper[18592]: I0308 04:28:16.132026 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-qrztf"]
Mar 08 04:28:16.168566 master-0 kubenswrapper[18592]: I0308 04:28:16.168457 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2a5661f-15ae-4df4-ab1f-8539afd4d339" path="/var/lib/kubelet/pods/f2a5661f-15ae-4df4-ab1f-8539afd4d339/volumes"
Mar 08 04:28:16.171072 master-0 kubenswrapper[18592]: I0308 04:28:16.171015 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-f75d-account-create-update-fxddp"]
Mar 08 04:28:16.171188 master-0 kubenswrapper[18592]: I0308 04:28:16.171077 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-qrztf"]
Mar 08 04:28:18.069058 master-0 kubenswrapper[18592]: I0308 04:28:18.068127 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-e145-account-create-update-nccdt"]
Mar 08 04:28:18.086068 master-0 kubenswrapper[18592]: I0308 04:28:18.085993 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-2306-account-create-update-c95sx"]
Mar 08 04:28:18.099139 master-0 kubenswrapper[18592]: I0308 04:28:18.099049 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-e145-account-create-update-nccdt"]
Mar 08 04:28:18.110184 master-0 kubenswrapper[18592]: I0308 04:28:18.110115 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-f4tg6"]
Mar 08 04:28:18.118349 master-0 kubenswrapper[18592]: I0308 04:28:18.118286 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-2306-account-create-update-c95sx"]
Mar 08 04:28:18.126911 master-0 kubenswrapper[18592]: I0308 04:28:18.126845 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-f4tg6"]
Mar 08 04:28:18.175778 master-0 kubenswrapper[18592]: I0308 04:28:18.175673 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fefd13e-e178-4790-823e-458456886a84" path="/var/lib/kubelet/pods/3fefd13e-e178-4790-823e-458456886a84/volumes"
Mar 08 04:28:18.177432 master-0 kubenswrapper[18592]: I0308 04:28:18.177372 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a5fc841-ddfd-4704-9a4c-878bcbb98bcc" path="/var/lib/kubelet/pods/6a5fc841-ddfd-4704-9a4c-878bcbb98bcc/volumes"
Mar 08 04:28:18.179388 master-0 kubenswrapper[18592]: I0308 04:28:18.179335 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c678d7e-e3ea-40d5-b265-cf42ac1139c6" path="/var/lib/kubelet/pods/7c678d7e-e3ea-40d5-b265-cf42ac1139c6/volumes"
Mar 08 04:28:18.180729 master-0 kubenswrapper[18592]: I0308 04:28:18.180675 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="955ce42c-5d68-4659-a993-85d566eb7c0c" path="/var/lib/kubelet/pods/955ce42c-5d68-4659-a993-85d566eb7c0c/volumes"
Mar 08 04:28:18.182950 master-0 kubenswrapper[18592]: I0308 04:28:18.182902 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bafa1006-fce0-4733-9706-a4de6df10ac7" path="/var/lib/kubelet/pods/bafa1006-fce0-4733-9706-a4de6df10ac7/volumes"
Mar 08 04:28:25.633022 master-0 kubenswrapper[18592]: I0308 04:28:25.632912 18592 scope.go:117] "RemoveContainer" containerID="99a98feadbf2914137693643285f4a1f8a079354fbc62898f0077f7d7a80a784"
Mar 08 04:28:25.671112 master-0 kubenswrapper[18592]: I0308 04:28:25.671060 18592 scope.go:117] "RemoveContainer" containerID="7028d5939f93d9c1447410c6d8f2d3738a8f4d02a3d683a2410839cec945d10f"
Mar 08 04:28:25.707707 master-0 kubenswrapper[18592]: I0308 04:28:25.707649 18592 scope.go:117] "RemoveContainer" containerID="3f72611aa2a83f58ed2ac02536d92f6772c99e81d4ff1f180b1b640f08dc25a9"
Mar 08 04:28:25.746500 master-0 kubenswrapper[18592]: I0308 04:28:25.746426 18592 scope.go:117] "RemoveContainer" containerID="490068217b99e1b9eb5597b0547cfb888c0fa4b0491eeaa5a5cf7358d93d627e"
Mar 08 04:28:25.792895 master-0 kubenswrapper[18592]: I0308 04:28:25.792801 18592 scope.go:117] "RemoveContainer" containerID="3cacfb137573a88f0a197cf79ea71f9406089db6fbd3e3c02e9515c621c0ccdf"
Mar 08 04:28:25.830984 master-0 kubenswrapper[18592]: I0308 04:28:25.830808 18592 scope.go:117] "RemoveContainer" containerID="8d506f5f75e3ed6df7f7d128e438fa040648c2bb9e229937c4c8eb73ef731bd2"
Mar 08 04:28:25.874728 master-0 kubenswrapper[18592]: I0308 04:28:25.872968 18592 scope.go:117] "RemoveContainer" containerID="681a96c101abae53751c5a962ecc82bf1bc3fb04bf52b8267a5914f994ae00ad"
Mar 08 04:28:25.916934 master-0 kubenswrapper[18592]: I0308 04:28:25.916810 18592 scope.go:117] "RemoveContainer" containerID="a9e4fea2ba64e24a9bb73af0e5ee5e6cfb072593d414a9dd1d559cd6d7fd9248"
Mar 08 04:28:25.965533 master-0 kubenswrapper[18592]: I0308 04:28:25.965454 18592 scope.go:117] "RemoveContainer" containerID="1870beb02130dec3bf711629c227a521d8a55fdc7f4ce516dc9d419fdbd15d7c"
Mar 08 04:28:25.998087 master-0 kubenswrapper[18592]: I0308 04:28:25.997981 18592 scope.go:117] "RemoveContainer" containerID="2b74154b1248b8c3b12da0ca994416c32181cd7fe176de0ad730f3c342b5c2bb"
Mar 08 04:28:26.028029 master-0 kubenswrapper[18592]: I0308 04:28:26.027903 18592 scope.go:117] "RemoveContainer" containerID="da944f53c1cacc5ea7072e22c65947bc2fd94150ad9087fd44c3193c4dffd65c"
Mar 08 04:28:26.061217 master-0 kubenswrapper[18592]: I0308 04:28:26.061153 18592 scope.go:117] "RemoveContainer" containerID="a9efdc51bdd11fca1685cb40aa17e5acbaa143e7a3e9f12915b823c98dc81e5e"
Mar 08 04:28:52.115221 master-0 kubenswrapper[18592]: I0308 04:28:52.114930 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-jq2f2"]
Mar 08 04:28:52.131655 master-0 kubenswrapper[18592]: I0308 04:28:52.131601 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-jq2f2"]
Mar 08 04:28:52.162953 master-0 kubenswrapper[18592]: I0308 04:28:52.162898 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="807694b2-9dea-4fae-b203-ee9b8331871d" path="/var/lib/kubelet/pods/807694b2-9dea-4fae-b203-ee9b8331871d/volumes"
Mar 08 04:29:22.091548 master-0 kubenswrapper[18592]: I0308 04:29:22.091457 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-gz2ch"]
Mar 08 04:29:22.111136 master-0 kubenswrapper[18592]: I0308 04:29:22.111043 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-zqczq"]
Mar 08 04:29:22.123668 master-0 kubenswrapper[18592]: I0308 04:29:22.123585 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-zqczq"]
Mar 08 04:29:22.136726 master-0 kubenswrapper[18592]: I0308 04:29:22.136567 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-gz2ch"]
Mar 08 04:29:22.158940 master-0 kubenswrapper[18592]: I0308 04:29:22.158818 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1953d5ca-ae5f-488f-998f-bea80ea7a09c" path="/var/lib/kubelet/pods/1953d5ca-ae5f-488f-998f-bea80ea7a09c/volumes"
Mar 08 04:29:22.159620 master-0 kubenswrapper[18592]: I0308 04:29:22.159585 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b079179-65b3-4f1a-8c57-b4f84a718761" path="/var/lib/kubelet/pods/7b079179-65b3-4f1a-8c57-b4f84a718761/volumes"
Mar 08 04:29:26.403408 master-0 kubenswrapper[18592]: I0308 04:29:26.403341 18592 scope.go:117] "RemoveContainer" containerID="45ad974296a648bc7b38887c2d33fcd930e3fd24f338de6b8310b1bfa6b5adf8"
Mar 08 04:29:26.426080 master-0 kubenswrapper[18592]: I0308 04:29:26.426034 18592 scope.go:117] "RemoveContainer" containerID="621d660dd302ca66a361ccaab4d3a4be58c28c37fe95952c072e7ebb1c51670a"
Mar 08 04:29:26.461847 master-0 kubenswrapper[18592]: I0308 04:29:26.459048 18592 scope.go:117] "RemoveContainer" containerID="0b668349c154eaae9bcda090cd2e5e3d59b08c2f81653f2dae0d8d0fece4fdae"
Mar 08 04:29:58.087360 master-0 kubenswrapper[18592]: I0308 04:29:58.087267 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-host-discover-fvmbl"]
Mar 08 04:29:58.104294 master-0 kubenswrapper[18592]: I0308 04:29:58.104045 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-host-discover-fvmbl"]
Mar 08 04:29:58.168847 master-0 kubenswrapper[18592]: I0308 04:29:58.168772 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55182ece-5dae-4bb2-9d2c-cf0fbc3bb55e" path="/var/lib/kubelet/pods/55182ece-5dae-4bb2-9d2c-cf0fbc3bb55e/volumes"
Mar 08 04:30:00.072268 master-0 kubenswrapper[18592]: I0308 04:30:00.072193 18592 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-sr2h2"]
Mar 08 04:30:00.085411 master-0 kubenswrapper[18592]: I0308 04:30:00.085243 18592 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-sr2h2"]
Mar 08 04:30:00.166631 master-0 kubenswrapper[18592]: I0308 04:30:00.166519 18592 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c462f4de-c8cf-4b54-9936-0fb278b41547" path="/var/lib/kubelet/pods/c462f4de-c8cf-4b54-9936-0fb278b41547/volumes"
Mar 08 04:30:26.552963 master-0 kubenswrapper[18592]: I0308 04:30:26.552868 18592 scope.go:117] "RemoveContainer" containerID="9aeac887db28754d3c3881529c598d3ae3e5127023ee8dd5e1bba09f0cf10681"
Mar 08 04:30:26.586304 master-0 kubenswrapper[18592]: I0308 04:30:26.586131 18592 scope.go:117] "RemoveContainer" containerID="f7fcc5facccab763e3578828ba33e72d72a24f268851ad3dfc9473fa31086bd2"
Mar 08 04:43:32.522585 master-0 kubenswrapper[18592]: I0308 04:43:32.522487 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-x7bcc/must-gather-z9c4w"]
Mar 08 04:43:32.525930 master-0 kubenswrapper[18592]: I0308 04:43:32.525855 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-x7bcc/must-gather-z9c4w"
Mar 08 04:43:32.536285 master-0 kubenswrapper[18592]: I0308 04:43:32.536215 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-x7bcc/must-gather-vgjtx"]
Mar 08 04:43:32.536528 master-0 kubenswrapper[18592]: I0308 04:43:32.536345 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-x7bcc"/"kube-root-ca.crt"
Mar 08 04:43:32.538311 master-0 kubenswrapper[18592]: I0308 04:43:32.538275 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-x7bcc/must-gather-vgjtx"
Mar 08 04:43:32.540787 master-0 kubenswrapper[18592]: I0308 04:43:32.540734 18592 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-x7bcc"/"openshift-service-ca.crt"
Mar 08 04:43:32.557785 master-0 kubenswrapper[18592]: I0308 04:43:32.554086 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-x7bcc/must-gather-z9c4w"]
Mar 08 04:43:32.583806 master-0 kubenswrapper[18592]: I0308 04:43:32.573479 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-x7bcc/must-gather-vgjtx"]
Mar 08 04:43:32.626181 master-0 kubenswrapper[18592]: I0308 04:43:32.626043 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjf5j\" (UniqueName: \"kubernetes.io/projected/d5401389-c36c-43cf-9446-1c25ba9798e2-kube-api-access-sjf5j\") pod \"must-gather-z9c4w\" (UID: \"d5401389-c36c-43cf-9446-1c25ba9798e2\") " pod="openshift-must-gather-x7bcc/must-gather-z9c4w"
Mar 08 04:43:32.626181 master-0 kubenswrapper[18592]: I0308 04:43:32.626159 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d5401389-c36c-43cf-9446-1c25ba9798e2-must-gather-output\") pod \"must-gather-z9c4w\" (UID: \"d5401389-c36c-43cf-9446-1c25ba9798e2\") " pod="openshift-must-gather-x7bcc/must-gather-z9c4w"
Mar 08 04:43:32.729095 master-0 kubenswrapper[18592]: I0308 04:43:32.728412 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/65ebe42a-55a9-49a8-915e-92ff016d0927-must-gather-output\") pod \"must-gather-vgjtx\" (UID: \"65ebe42a-55a9-49a8-915e-92ff016d0927\") " pod="openshift-must-gather-x7bcc/must-gather-vgjtx"
Mar 08 04:43:32.729095 master-0 kubenswrapper[18592]: I0308 04:43:32.728475 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-669dd\" (UniqueName: \"kubernetes.io/projected/65ebe42a-55a9-49a8-915e-92ff016d0927-kube-api-access-669dd\") pod \"must-gather-vgjtx\" (UID: \"65ebe42a-55a9-49a8-915e-92ff016d0927\") " pod="openshift-must-gather-x7bcc/must-gather-vgjtx"
Mar 08 04:43:32.729095 master-0 kubenswrapper[18592]: I0308 04:43:32.728519 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjf5j\" (UniqueName: \"kubernetes.io/projected/d5401389-c36c-43cf-9446-1c25ba9798e2-kube-api-access-sjf5j\") pod \"must-gather-z9c4w\" (UID: \"d5401389-c36c-43cf-9446-1c25ba9798e2\") " pod="openshift-must-gather-x7bcc/must-gather-z9c4w"
Mar 08 04:43:32.729095 master-0 kubenswrapper[18592]: I0308 04:43:32.728623 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d5401389-c36c-43cf-9446-1c25ba9798e2-must-gather-output\") pod \"must-gather-z9c4w\" (UID: \"d5401389-c36c-43cf-9446-1c25ba9798e2\") " pod="openshift-must-gather-x7bcc/must-gather-z9c4w"
Mar 08 04:43:32.736492 master-0 kubenswrapper[18592]: I0308 04:43:32.736459 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d5401389-c36c-43cf-9446-1c25ba9798e2-must-gather-output\") pod \"must-gather-z9c4w\" (UID: \"d5401389-c36c-43cf-9446-1c25ba9798e2\") " pod="openshift-must-gather-x7bcc/must-gather-z9c4w"
Mar 08 04:43:32.807466 master-0 kubenswrapper[18592]: I0308 04:43:32.807315 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjf5j\" (UniqueName: \"kubernetes.io/projected/d5401389-c36c-43cf-9446-1c25ba9798e2-kube-api-access-sjf5j\") pod \"must-gather-z9c4w\" (UID: \"d5401389-c36c-43cf-9446-1c25ba9798e2\") " pod="openshift-must-gather-x7bcc/must-gather-z9c4w"
Mar 08 04:43:32.830576 master-0 kubenswrapper[18592]: I0308 04:43:32.830501 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/65ebe42a-55a9-49a8-915e-92ff016d0927-must-gather-output\") pod \"must-gather-vgjtx\" (UID: \"65ebe42a-55a9-49a8-915e-92ff016d0927\") " pod="openshift-must-gather-x7bcc/must-gather-vgjtx"
Mar 08 04:43:32.830576 master-0 kubenswrapper[18592]: I0308 04:43:32.830568 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-669dd\" (UniqueName: \"kubernetes.io/projected/65ebe42a-55a9-49a8-915e-92ff016d0927-kube-api-access-669dd\") pod \"must-gather-vgjtx\" (UID: \"65ebe42a-55a9-49a8-915e-92ff016d0927\") " pod="openshift-must-gather-x7bcc/must-gather-vgjtx"
Mar 08 04:43:32.831711 master-0 kubenswrapper[18592]: I0308 04:43:32.831686 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/65ebe42a-55a9-49a8-915e-92ff016d0927-must-gather-output\") pod \"must-gather-vgjtx\" (UID: \"65ebe42a-55a9-49a8-915e-92ff016d0927\") " pod="openshift-must-gather-x7bcc/must-gather-vgjtx"
Mar 08 04:43:32.851733 master-0 kubenswrapper[18592]: I0308 04:43:32.850551 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-669dd\" (UniqueName: \"kubernetes.io/projected/65ebe42a-55a9-49a8-915e-92ff016d0927-kube-api-access-669dd\") pod \"must-gather-vgjtx\" (UID: \"65ebe42a-55a9-49a8-915e-92ff016d0927\") " pod="openshift-must-gather-x7bcc/must-gather-vgjtx"
Mar 08 04:43:32.853328 master-0 kubenswrapper[18592]: I0308 04:43:32.853260 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-x7bcc/must-gather-z9c4w"
Mar 08 04:43:32.918345 master-0 kubenswrapper[18592]: I0308 04:43:32.918288 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-x7bcc/must-gather-vgjtx"
Mar 08 04:43:33.355884 master-0 kubenswrapper[18592]: I0308 04:43:33.355776 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-x7bcc/must-gather-z9c4w"]
Mar 08 04:43:33.361582 master-0 kubenswrapper[18592]: I0308 04:43:33.361546 18592 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 08 04:43:33.384840 master-0 kubenswrapper[18592]: I0308 04:43:33.384778 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x7bcc/must-gather-z9c4w" event={"ID":"d5401389-c36c-43cf-9446-1c25ba9798e2","Type":"ContainerStarted","Data":"e0420347040860ec4dae4529aa42541296ef562b1300c39f371c57cfe0786647"}
Mar 08 04:43:33.504950 master-0 kubenswrapper[18592]: W0308 04:43:33.504895 18592 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65ebe42a_55a9_49a8_915e_92ff016d0927.slice/crio-976bd1a771503e06024be60c3539bdf2618933ddcee059fe25d98248bee5856f WatchSource:0}: Error finding container 976bd1a771503e06024be60c3539bdf2618933ddcee059fe25d98248bee5856f: Status 404 returned error can't find the container with id 976bd1a771503e06024be60c3539bdf2618933ddcee059fe25d98248bee5856f
Mar 08 04:43:33.514338 master-0 kubenswrapper[18592]: I0308 04:43:33.514269 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-x7bcc/must-gather-vgjtx"]
Mar 08 04:43:34.398421 master-0 kubenswrapper[18592]: I0308 04:43:34.398366 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x7bcc/must-gather-vgjtx" event={"ID":"65ebe42a-55a9-49a8-915e-92ff016d0927","Type":"ContainerStarted","Data":"976bd1a771503e06024be60c3539bdf2618933ddcee059fe25d98248bee5856f"}
Mar 08 04:43:35.421742 master-0 kubenswrapper[18592]: I0308 04:43:35.421621 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x7bcc/must-gather-z9c4w" event={"ID":"d5401389-c36c-43cf-9446-1c25ba9798e2","Type":"ContainerStarted","Data":"d9af5b7aeba3a272d429dd8f93431c2d35f1fec2c5f1c90363670b83a89fac29"}
Mar 08 04:43:36.433431 master-0 kubenswrapper[18592]: I0308 04:43:36.433336 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x7bcc/must-gather-z9c4w" event={"ID":"d5401389-c36c-43cf-9446-1c25ba9798e2","Type":"ContainerStarted","Data":"192c9e6a0289e1534b135e208eb3364a47bf2a620b38bb1dcf995042189c42f2"}
Mar 08 04:43:36.482845 master-0 kubenswrapper[18592]: I0308 04:43:36.482099 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-x7bcc/must-gather-z9c4w" podStartSLOduration=3.170679156 podStartE2EDuration="4.482084616s" podCreationTimestamp="2026-03-08 04:43:32 +0000 UTC" firstStartedPulling="2026-03-08 04:43:33.361494743 +0000 UTC m=+3025.460249093" lastFinishedPulling="2026-03-08 04:43:34.672900203 +0000 UTC m=+3026.771654553" observedRunningTime="2026-03-08 04:43:36.479928409 +0000 UTC m=+3028.578682759" watchObservedRunningTime="2026-03-08 04:43:36.482084616 +0000 UTC m=+3028.580838966"
Mar 08 04:43:37.321486 master-0 kubenswrapper[18592]: I0308 04:43:37.321428 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-version_cluster-version-operator-8c9c967c7-zq9rp_2262647b-c315-477a-93bd-f168c1810475/cluster-version-operator/1.log"
Mar 08 04:43:37.604802 master-0 kubenswrapper[18592]: I0308 04:43:37.604684 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-version_cluster-version-operator-8c9c967c7-zq9rp_2262647b-c315-477a-93bd-f168c1810475/cluster-version-operator/0.log"
Mar 08 04:43:41.361900 master-0 kubenswrapper[18592]: I0308 04:43:41.360936 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5dcbbd79cf-fhftn_6d20e92b-2799-4d90-9bfb-6175bebe39b3/nmstate-console-plugin/0.log"
Mar 08 04:43:41.409917 master-0 kubenswrapper[18592]: I0308 04:43:41.409845 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-2c54k_ff56c314-c380-421c-93d1-4e3bb9dd6b08/nmstate-handler/0.log"
Mar 08 04:43:41.481280 master-0 kubenswrapper[18592]: I0308 04:43:41.481230 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-6ddmc_9c2a57e8-799a-43c3-8aa2-3638e37db81a/nmstate-metrics/0.log"
Mar 08 04:43:41.507513 master-0 kubenswrapper[18592]: I0308 04:43:41.507461 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-6ddmc_9c2a57e8-799a-43c3-8aa2-3638e37db81a/kube-rbac-proxy/0.log"
Mar 08 04:43:41.518817 master-0 kubenswrapper[18592]: I0308 04:43:41.518770 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-75c5dccd6c-ljtmb_4e3ba138-68bb-41cd-976b-97275ee70a53/nmstate-operator/0.log"
Mar 08 04:43:41.533888 master-0 kubenswrapper[18592]: I0308 04:43:41.531951 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-786f45cff4-ssvrb_8aa28996-0851-498b-ba48-9c51fc1676cd/nmstate-webhook/0.log"
Mar 08 04:43:41.913790 master-0 kubenswrapper[18592]: I0308 04:43:41.911509 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-wxdl4_51a5fdc3-c642-430a-9da8-448bd0cceae0/controller/0.log"
Mar 08 04:43:42.017744 master-0 kubenswrapper[18592]: I0308 04:43:42.016103 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-wxdl4_51a5fdc3-c642-430a-9da8-448bd0cceae0/kube-rbac-proxy/0.log"
Mar 08 04:43:42.053101 master-0 kubenswrapper[18592]: I0308 04:43:42.053064 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f69bb_e1532f83-fda2-443b-9bdb-aa4de5c66a13/controller/0.log"
Mar 08 04:43:42.976466 master-0 kubenswrapper[18592]: I0308 04:43:42.976276 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_29c709c82970b529e7b9b895aa92ef05/etcdctl/0.log"
Mar 08 04:43:43.302540 master-0 kubenswrapper[18592]: I0308 04:43:43.302399 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_29c709c82970b529e7b9b895aa92ef05/etcd/0.log"
Mar 08 04:43:43.329002 master-0 kubenswrapper[18592]: I0308 04:43:43.328957 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_29c709c82970b529e7b9b895aa92ef05/etcd-metrics/0.log"
Mar 08 04:43:43.371729 master-0 kubenswrapper[18592]: I0308 04:43:43.371686 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_29c709c82970b529e7b9b895aa92ef05/etcd-readyz/0.log"
Mar 08 04:43:43.385532 master-0 kubenswrapper[18592]: I0308 04:43:43.385486 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_29c709c82970b529e7b9b895aa92ef05/etcd-rev/0.log"
Mar 08 04:43:43.408541 master-0 kubenswrapper[18592]: I0308 04:43:43.408506 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_29c709c82970b529e7b9b895aa92ef05/setup/0.log"
Mar 08 04:43:43.425043 master-0 kubenswrapper[18592]: I0308 04:43:43.424978 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_29c709c82970b529e7b9b895aa92ef05/etcd-ensure-env-vars/0.log"
Mar 08 04:43:43.447762 master-0 kubenswrapper[18592]: I0308 04:43:43.447723 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_29c709c82970b529e7b9b895aa92ef05/etcd-resources-copy/0.log"
Mar 08 04:43:43.522428 master-0 kubenswrapper[18592]: I0308 04:43:43.522373 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_installer-1-master-0_9b02b2c9-9ee3-436c-aa36-0c23e5fdf07b/installer/0.log"
Mar 08 04:43:43.575010 master-0 kubenswrapper[18592]: I0308 04:43:43.574251 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f69bb_e1532f83-fda2-443b-9bdb-aa4de5c66a13/frr/0.log"
Mar 08 04:43:43.575010 master-0 kubenswrapper[18592]: I0308 04:43:43.574545 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_installer-2-master-0_d84e0373-988e-47db-be73-5690d18beba3/installer/0.log"
Mar 08 04:43:43.583529 master-0 kubenswrapper[18592]: I0308 04:43:43.583489 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f69bb_e1532f83-fda2-443b-9bdb-aa4de5c66a13/reloader/0.log"
Mar 08 04:43:43.593872 master-0 kubenswrapper[18592]: I0308 04:43:43.591632 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f69bb_e1532f83-fda2-443b-9bdb-aa4de5c66a13/frr-metrics/0.log"
Mar 08 04:43:43.601801 master-0 kubenswrapper[18592]: I0308 04:43:43.601765 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f69bb_e1532f83-fda2-443b-9bdb-aa4de5c66a13/kube-rbac-proxy/0.log"
Mar 08 04:43:43.610176 master-0 kubenswrapper[18592]: I0308 04:43:43.610154 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f69bb_e1532f83-fda2-443b-9bdb-aa4de5c66a13/kube-rbac-proxy-frr/0.log"
Mar 08 04:43:43.618050 master-0 kubenswrapper[18592]: I0308 04:43:43.617978 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f69bb_e1532f83-fda2-443b-9bdb-aa4de5c66a13/cp-frr-files/0.log"
Mar 08 04:43:43.626363 master-0 kubenswrapper[18592]: I0308 04:43:43.626040 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f69bb_e1532f83-fda2-443b-9bdb-aa4de5c66a13/cp-reloader/0.log"
Mar 08 04:43:43.634454 master-0 kubenswrapper[18592]: I0308 04:43:43.634410 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f69bb_e1532f83-fda2-443b-9bdb-aa4de5c66a13/cp-metrics/0.log"
Mar 08 04:43:43.644607 master-0 kubenswrapper[18592]: I0308 04:43:43.644578 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7f989f654f-lwx48_9097e49a-572d-40cd-8657-4dacb3d0f33b/frr-k8s-webhook-server/0.log"
Mar 08 04:43:43.667364 master-0 kubenswrapper[18592]: I0308 04:43:43.667314 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-c7f497896-fbcg7_d0a838e8-3b7d-4368-99bc-21c83edb251a/manager/0.log"
Mar 08 04:43:43.676693 master-0 kubenswrapper[18592]: I0308 04:43:43.676639 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7d4ccb5df5-l4btv_ba434b94-1eeb-4d82-89a2-b3c8fa4a998d/webhook-server/0.log"
Mar 08 04:43:44.069776 master-0 kubenswrapper[18592]: I0308 04:43:44.066295 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-msbrs_423b03ec-93e1-4baa-8530-8e3bba6eccb0/speaker/0.log"
Mar 08 04:43:44.085864 master-0 kubenswrapper[18592]: I0308 04:43:44.081240 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-msbrs_423b03ec-93e1-4baa-8530-8e3bba6eccb0/kube-rbac-proxy/0.log"
Mar 08 04:43:44.906406 master-0 kubenswrapper[18592]: I0308 04:43:44.906365 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/assisted-installer_assisted-installer-controller-66tqt_7d1b9b65-dba7-48fc-bc59-faa8f3cfcca7/assisted-installer-controller/0.log"
Mar 08 04:43:45.800545 master-0 kubenswrapper[18592]: I0308 04:43:45.800500 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-6fd77597b5-w649n_836c1be1-26de-4840-8ba6-9d34a751aebc/oauth-openshift/0.log"
Mar 08 04:43:46.581606 master-0 kubenswrapper[18592]: I0308 04:43:46.581529 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x7bcc/must-gather-vgjtx" event={"ID":"65ebe42a-55a9-49a8-915e-92ff016d0927","Type":"ContainerStarted","Data":"0a4eb28fddd5a4a4a8fe8e73144b253da46ab0e3353f1f2e09ed18402618067d"}
Mar 08 04:43:46.581606 master-0 kubenswrapper[18592]: I0308 04:43:46.581605 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x7bcc/must-gather-vgjtx" event={"ID":"65ebe42a-55a9-49a8-915e-92ff016d0927","Type":"ContainerStarted","Data":"348b73c53f74487d4b9b374c1bf3075d546065611f7b8b19fd6d11fe0dcf2fcb"}
Mar 08 04:43:46.607137 master-0 kubenswrapper[18592]: I0308 04:43:46.607077 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-x7bcc/must-gather-vgjtx" podStartSLOduration=2.216189233 podStartE2EDuration="14.606154292s" podCreationTimestamp="2026-03-08 04:43:32 +0000 UTC" firstStartedPulling="2026-03-08 04:43:33.507113715 +0000 UTC m=+3025.605868085" lastFinishedPulling="2026-03-08 04:43:45.897078794 +0000 UTC m=+3037.995833144" observedRunningTime="2026-03-08 04:43:46.604388385 +0000 UTC m=+3038.703142765" watchObservedRunningTime="2026-03-08 04:43:46.606154292 +0000 UTC m=+3038.704908642"
Mar 08 04:43:46.938312 master-0 kubenswrapper[18592]: I0308 04:43:46.938252 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-7c6989d6c4-slm72_a60bc804-52e7-422a-87fd-ac4c5aa90cb3/authentication-operator/1.log"
Mar 08 04:43:46.976479 master-0 kubenswrapper[18592]: I0308 04:43:46.976430 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-7c6989d6c4-slm72_a60bc804-52e7-422a-87fd-ac4c5aa90cb3/authentication-operator/2.log"
Mar 08 04:43:47.829072 master-0 kubenswrapper[18592]: I0308 04:43:47.827697 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-79f8cd6fdd-zmrb9_acaa7c53-f877-480c-8f36-58af35e0e305/router/0.log"
Mar 08 04:43:48.004552 master-0 kubenswrapper[18592]: I0308 04:43:48.004487 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-x7bcc/perf-node-gather-daemonset-sstbw"]
Mar 08 04:43:48.006282 master-0 kubenswrapper[18592]: I0308 04:43:48.006252 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-x7bcc/perf-node-gather-daemonset-sstbw"
Mar 08 04:43:48.019143 master-0 kubenswrapper[18592]: I0308 04:43:48.019095 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-x7bcc/perf-node-gather-daemonset-sstbw"]
Mar 08 04:43:48.059056 master-0 kubenswrapper[18592]: I0308 04:43:48.055471 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjkpk\" (UniqueName: \"kubernetes.io/projected/aca519db-5371-4d69-9f12-3d6b62d44535-kube-api-access-tjkpk\") pod \"perf-node-gather-daemonset-sstbw\" (UID: \"aca519db-5371-4d69-9f12-3d6b62d44535\") " pod="openshift-must-gather-x7bcc/perf-node-gather-daemonset-sstbw"
Mar 08 04:43:48.059056 master-0 kubenswrapper[18592]: I0308 04:43:48.055539 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/aca519db-5371-4d69-9f12-3d6b62d44535-proc\") pod \"perf-node-gather-daemonset-sstbw\" (UID: \"aca519db-5371-4d69-9f12-3d6b62d44535\") " pod="openshift-must-gather-x7bcc/perf-node-gather-daemonset-sstbw"
Mar 08 04:43:48.059056 master-0 kubenswrapper[18592]: I0308 04:43:48.055594 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/aca519db-5371-4d69-9f12-3d6b62d44535-lib-modules\") pod \"perf-node-gather-daemonset-sstbw\" (UID: \"aca519db-5371-4d69-9f12-3d6b62d44535\") " pod="openshift-must-gather-x7bcc/perf-node-gather-daemonset-sstbw"
Mar 08 04:43:48.059056 master-0 kubenswrapper[18592]: I0308 04:43:48.055727 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/aca519db-5371-4d69-9f12-3d6b62d44535-sys\") pod \"perf-node-gather-daemonset-sstbw\" (UID: \"aca519db-5371-4d69-9f12-3d6b62d44535\") "
pod="openshift-must-gather-x7bcc/perf-node-gather-daemonset-sstbw" Mar 08 04:43:48.059056 master-0 kubenswrapper[18592]: I0308 04:43:48.055750 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/aca519db-5371-4d69-9f12-3d6b62d44535-podres\") pod \"perf-node-gather-daemonset-sstbw\" (UID: \"aca519db-5371-4d69-9f12-3d6b62d44535\") " pod="openshift-must-gather-x7bcc/perf-node-gather-daemonset-sstbw" Mar 08 04:43:48.162777 master-0 kubenswrapper[18592]: I0308 04:43:48.162654 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjkpk\" (UniqueName: \"kubernetes.io/projected/aca519db-5371-4d69-9f12-3d6b62d44535-kube-api-access-tjkpk\") pod \"perf-node-gather-daemonset-sstbw\" (UID: \"aca519db-5371-4d69-9f12-3d6b62d44535\") " pod="openshift-must-gather-x7bcc/perf-node-gather-daemonset-sstbw" Mar 08 04:43:48.162777 master-0 kubenswrapper[18592]: I0308 04:43:48.162714 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/aca519db-5371-4d69-9f12-3d6b62d44535-proc\") pod \"perf-node-gather-daemonset-sstbw\" (UID: \"aca519db-5371-4d69-9f12-3d6b62d44535\") " pod="openshift-must-gather-x7bcc/perf-node-gather-daemonset-sstbw" Mar 08 04:43:48.163019 master-0 kubenswrapper[18592]: I0308 04:43:48.162784 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/aca519db-5371-4d69-9f12-3d6b62d44535-lib-modules\") pod \"perf-node-gather-daemonset-sstbw\" (UID: \"aca519db-5371-4d69-9f12-3d6b62d44535\") " pod="openshift-must-gather-x7bcc/perf-node-gather-daemonset-sstbw" Mar 08 04:43:48.163019 master-0 kubenswrapper[18592]: I0308 04:43:48.162945 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/aca519db-5371-4d69-9f12-3d6b62d44535-sys\") pod \"perf-node-gather-daemonset-sstbw\" (UID: \"aca519db-5371-4d69-9f12-3d6b62d44535\") " pod="openshift-must-gather-x7bcc/perf-node-gather-daemonset-sstbw" Mar 08 04:43:48.163019 master-0 kubenswrapper[18592]: I0308 04:43:48.162971 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/aca519db-5371-4d69-9f12-3d6b62d44535-podres\") pod \"perf-node-gather-daemonset-sstbw\" (UID: \"aca519db-5371-4d69-9f12-3d6b62d44535\") " pod="openshift-must-gather-x7bcc/perf-node-gather-daemonset-sstbw" Mar 08 04:43:48.164220 master-0 kubenswrapper[18592]: I0308 04:43:48.164163 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/aca519db-5371-4d69-9f12-3d6b62d44535-podres\") pod \"perf-node-gather-daemonset-sstbw\" (UID: \"aca519db-5371-4d69-9f12-3d6b62d44535\") " pod="openshift-must-gather-x7bcc/perf-node-gather-daemonset-sstbw" Mar 08 04:43:48.164297 master-0 kubenswrapper[18592]: I0308 04:43:48.164203 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/aca519db-5371-4d69-9f12-3d6b62d44535-lib-modules\") pod \"perf-node-gather-daemonset-sstbw\" (UID: \"aca519db-5371-4d69-9f12-3d6b62d44535\") " pod="openshift-must-gather-x7bcc/perf-node-gather-daemonset-sstbw" Mar 08 04:43:48.164419 master-0 kubenswrapper[18592]: I0308 04:43:48.164357 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/aca519db-5371-4d69-9f12-3d6b62d44535-sys\") pod \"perf-node-gather-daemonset-sstbw\" (UID: \"aca519db-5371-4d69-9f12-3d6b62d44535\") " pod="openshift-must-gather-x7bcc/perf-node-gather-daemonset-sstbw" Mar 08 04:43:48.164419 master-0 kubenswrapper[18592]: I0308 04:43:48.164410 18592 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/aca519db-5371-4d69-9f12-3d6b62d44535-proc\") pod \"perf-node-gather-daemonset-sstbw\" (UID: \"aca519db-5371-4d69-9f12-3d6b62d44535\") " pod="openshift-must-gather-x7bcc/perf-node-gather-daemonset-sstbw" Mar 08 04:43:48.180594 master-0 kubenswrapper[18592]: I0308 04:43:48.180549 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjkpk\" (UniqueName: \"kubernetes.io/projected/aca519db-5371-4d69-9f12-3d6b62d44535-kube-api-access-tjkpk\") pod \"perf-node-gather-daemonset-sstbw\" (UID: \"aca519db-5371-4d69-9f12-3d6b62d44535\") " pod="openshift-must-gather-x7bcc/perf-node-gather-daemonset-sstbw" Mar 08 04:43:48.330904 master-0 kubenswrapper[18592]: I0308 04:43:48.330817 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-x7bcc/perf-node-gather-daemonset-sstbw" Mar 08 04:43:48.704865 master-0 kubenswrapper[18592]: I0308 04:43:48.701082 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-oauth-apiserver_apiserver-7695b9f8b5-4jpgl_76ba45a2-8945-4afe-b913-126c26725867/oauth-apiserver/0.log" Mar 08 04:43:48.717717 master-0 kubenswrapper[18592]: I0308 04:43:48.717666 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-oauth-apiserver_apiserver-7695b9f8b5-4jpgl_76ba45a2-8945-4afe-b913-126c26725867/fix-audit-permissions/0.log" Mar 08 04:43:48.937257 master-0 kubenswrapper[18592]: I0308 04:43:48.937204 18592 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-x7bcc/perf-node-gather-daemonset-sstbw"] Mar 08 04:43:49.628634 master-0 kubenswrapper[18592]: I0308 04:43:49.625500 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x7bcc/perf-node-gather-daemonset-sstbw" event={"ID":"aca519db-5371-4d69-9f12-3d6b62d44535","Type":"ContainerStarted","Data":"547b0c4bbb711f6306638f78f4606a83f200ac7c87ee70c580aeb5e29f27371e"} Mar 08 
04:43:49.628634 master-0 kubenswrapper[18592]: I0308 04:43:49.625555 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x7bcc/perf-node-gather-daemonset-sstbw" event={"ID":"aca519db-5371-4d69-9f12-3d6b62d44535","Type":"ContainerStarted","Data":"b6324e2f2b3eca48f8a19b9db2eed862e88329077335bef042f5c27508b2b660"} Mar 08 04:43:49.628634 master-0 kubenswrapper[18592]: I0308 04:43:49.625674 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-must-gather-x7bcc/perf-node-gather-daemonset-sstbw" Mar 08 04:43:49.642457 master-0 kubenswrapper[18592]: I0308 04:43:49.642389 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-x7bcc/perf-node-gather-daemonset-sstbw" podStartSLOduration=2.642373835 podStartE2EDuration="2.642373835s" podCreationTimestamp="2026-03-08 04:43:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 04:43:49.638901283 +0000 UTC m=+3041.737655633" watchObservedRunningTime="2026-03-08 04:43:49.642373835 +0000 UTC m=+3041.741128185" Mar 08 04:43:49.657085 master-0 kubenswrapper[18592]: I0308 04:43:49.656945 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-69576476f7-bv67b_1b700d17-83d2-46c8-afbc-e5774822eabe/kube-rbac-proxy/0.log" Mar 08 04:43:49.686733 master-0 kubenswrapper[18592]: I0308 04:43:49.686685 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-69576476f7-bv67b_1b700d17-83d2-46c8-afbc-e5774822eabe/cluster-autoscaler-operator/0.log" Mar 08 04:43:49.689449 master-0 kubenswrapper[18592]: I0308 04:43:49.689425 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-69576476f7-bv67b_1b700d17-83d2-46c8-afbc-e5774822eabe/cluster-autoscaler-operator/1.log" Mar 08 
04:43:49.739306 master-0 kubenswrapper[18592]: I0308 04:43:49.739176 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-jghp5_d831cb23-7411-4072-8273-c167d9afca28/cluster-baremetal-operator/2.log" Mar 08 04:43:49.740116 master-0 kubenswrapper[18592]: I0308 04:43:49.740060 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-jghp5_d831cb23-7411-4072-8273-c167d9afca28/cluster-baremetal-operator/3.log" Mar 08 04:43:49.759694 master-0 kubenswrapper[18592]: I0308 04:43:49.759633 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-jghp5_d831cb23-7411-4072-8273-c167d9afca28/baremetal-kube-rbac-proxy/0.log" Mar 08 04:43:49.779094 master-0 kubenswrapper[18592]: I0308 04:43:49.779059 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-6686554ddc-7bcsk_139881ee-6cfa-4a7e-b002-63cece048d16/control-plane-machine-set-operator/1.log" Mar 08 04:43:49.779363 master-0 kubenswrapper[18592]: I0308 04:43:49.779227 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-6686554ddc-7bcsk_139881ee-6cfa-4a7e-b002-63cece048d16/control-plane-machine-set-operator/0.log" Mar 08 04:43:49.807657 master-0 kubenswrapper[18592]: I0308 04:43:49.807620 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-84bf6db4f9-tdrf8_b70adfe9-94f1-44bc-85ce-498e5f0a1ca7/kube-rbac-proxy/0.log" Mar 08 04:43:49.821057 master-0 kubenswrapper[18592]: I0308 04:43:49.821008 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-84bf6db4f9-tdrf8_b70adfe9-94f1-44bc-85ce-498e5f0a1ca7/machine-api-operator/1.log" Mar 08 04:43:49.822389 master-0 kubenswrapper[18592]: 
I0308 04:43:49.822060 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-84bf6db4f9-tdrf8_b70adfe9-94f1-44bc-85ce-498e5f0a1ca7/machine-api-operator/0.log" Mar 08 04:43:50.126891 master-0 kubenswrapper[18592]: I0308 04:43:50.126771 18592 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-x7bcc/master-0-debug-bwjc5"] Mar 08 04:43:50.128609 master-0 kubenswrapper[18592]: I0308 04:43:50.128568 18592 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-x7bcc/master-0-debug-bwjc5" Mar 08 04:43:50.226253 master-0 kubenswrapper[18592]: I0308 04:43:50.226081 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bf3e6a8d-926f-4423-aedb-f2906a1dcb64-host\") pod \"master-0-debug-bwjc5\" (UID: \"bf3e6a8d-926f-4423-aedb-f2906a1dcb64\") " pod="openshift-must-gather-x7bcc/master-0-debug-bwjc5" Mar 08 04:43:50.226253 master-0 kubenswrapper[18592]: I0308 04:43:50.226237 18592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7x4pl\" (UniqueName: \"kubernetes.io/projected/bf3e6a8d-926f-4423-aedb-f2906a1dcb64-kube-api-access-7x4pl\") pod \"master-0-debug-bwjc5\" (UID: \"bf3e6a8d-926f-4423-aedb-f2906a1dcb64\") " pod="openshift-must-gather-x7bcc/master-0-debug-bwjc5" Mar 08 04:43:50.328211 master-0 kubenswrapper[18592]: I0308 04:43:50.328149 18592 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7x4pl\" (UniqueName: \"kubernetes.io/projected/bf3e6a8d-926f-4423-aedb-f2906a1dcb64-kube-api-access-7x4pl\") pod \"master-0-debug-bwjc5\" (UID: \"bf3e6a8d-926f-4423-aedb-f2906a1dcb64\") " pod="openshift-must-gather-x7bcc/master-0-debug-bwjc5" Mar 08 04:43:50.328424 master-0 kubenswrapper[18592]: I0308 04:43:50.328319 18592 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bf3e6a8d-926f-4423-aedb-f2906a1dcb64-host\") pod \"master-0-debug-bwjc5\" (UID: \"bf3e6a8d-926f-4423-aedb-f2906a1dcb64\") " pod="openshift-must-gather-x7bcc/master-0-debug-bwjc5" Mar 08 04:43:50.329750 master-0 kubenswrapper[18592]: I0308 04:43:50.329714 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bf3e6a8d-926f-4423-aedb-f2906a1dcb64-host\") pod \"master-0-debug-bwjc5\" (UID: \"bf3e6a8d-926f-4423-aedb-f2906a1dcb64\") " pod="openshift-must-gather-x7bcc/master-0-debug-bwjc5" Mar 08 04:43:50.353375 master-0 kubenswrapper[18592]: I0308 04:43:50.353327 18592 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7x4pl\" (UniqueName: \"kubernetes.io/projected/bf3e6a8d-926f-4423-aedb-f2906a1dcb64-kube-api-access-7x4pl\") pod \"master-0-debug-bwjc5\" (UID: \"bf3e6a8d-926f-4423-aedb-f2906a1dcb64\") " pod="openshift-must-gather-x7bcc/master-0-debug-bwjc5" Mar 08 04:43:50.444367 master-0 kubenswrapper[18592]: I0308 04:43:50.444304 18592 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-x7bcc/master-0-debug-bwjc5" Mar 08 04:43:50.636655 master-0 kubenswrapper[18592]: I0308 04:43:50.636596 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x7bcc/master-0-debug-bwjc5" event={"ID":"bf3e6a8d-926f-4423-aedb-f2906a1dcb64","Type":"ContainerStarted","Data":"946384dabf0ee9e9008bf277df0ed42a85608969b1905d4bea582099f761fefe"} Mar 08 04:43:51.088651 master-0 kubenswrapper[18592]: I0308 04:43:51.088533 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7c8df9b496-f89bv_33ed331b-89e9-45f8-ab3c-4533a77cc7b6/cluster-cloud-controller-manager/0.log" Mar 08 04:43:51.089233 master-0 kubenswrapper[18592]: I0308 04:43:51.089194 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7c8df9b496-f89bv_33ed331b-89e9-45f8-ab3c-4533a77cc7b6/cluster-cloud-controller-manager/1.log" Mar 08 04:43:51.103025 master-0 kubenswrapper[18592]: I0308 04:43:51.102980 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7c8df9b496-f89bv_33ed331b-89e9-45f8-ab3c-4533a77cc7b6/config-sync-controllers/0.log" Mar 08 04:43:51.103793 master-0 kubenswrapper[18592]: I0308 04:43:51.103736 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7c8df9b496-f89bv_33ed331b-89e9-45f8-ab3c-4533a77cc7b6/config-sync-controllers/1.log" Mar 08 04:43:51.129094 master-0 kubenswrapper[18592]: I0308 04:43:51.129039 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7c8df9b496-f89bv_33ed331b-89e9-45f8-ab3c-4533a77cc7b6/kube-rbac-proxy/0.log" Mar 08 
04:43:51.986849 master-0 kubenswrapper[18592]: I0308 04:43:51.986057 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-ff301-api-0_588d2fd8-2c47-44e1-b3d9-d1f95c7f1616/cinder-ff301-api-log/0.log" Mar 08 04:43:52.130576 master-0 kubenswrapper[18592]: I0308 04:43:52.129610 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-ff301-api-0_588d2fd8-2c47-44e1-b3d9-d1f95c7f1616/cinder-api/0.log" Mar 08 04:43:52.244911 master-0 kubenswrapper[18592]: I0308 04:43:52.242001 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-ff301-backup-0_6ad32be0-4591-419d-9d36-83abd556234b/cinder-backup/0.log" Mar 08 04:43:52.277506 master-0 kubenswrapper[18592]: I0308 04:43:52.277466 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-ff301-backup-0_6ad32be0-4591-419d-9d36-83abd556234b/probe/0.log" Mar 08 04:43:52.356969 master-0 kubenswrapper[18592]: I0308 04:43:52.356906 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-ff301-scheduler-0_08bdf9f0-9d7a-436b-9f65-a313c8d71f69/cinder-scheduler/0.log" Mar 08 04:43:52.382698 master-0 kubenswrapper[18592]: I0308 04:43:52.382640 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-ff301-scheduler-0_08bdf9f0-9d7a-436b-9f65-a313c8d71f69/probe/0.log" Mar 08 04:43:52.447417 master-0 kubenswrapper[18592]: I0308 04:43:52.444836 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-ff301-volume-lvm-iscsi-0_f6c6845e-2f27-4421-b2ee-5d5892a8f5c9/cinder-volume/0.log" Mar 08 04:43:52.477507 master-0 kubenswrapper[18592]: I0308 04:43:52.477460 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-ff301-volume-lvm-iscsi-0_f6c6845e-2f27-4421-b2ee-5d5892a8f5c9/probe/0.log" Mar 08 04:43:52.488186 master-0 kubenswrapper[18592]: I0308 04:43:52.488070 18592 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_dnsmasq-dns-78466d865f-kkm92_37219418-787e-46f1-b4a9-afd765b3c33b/dnsmasq-dns/0.log" Mar 08 04:43:52.498340 master-0 kubenswrapper[18592]: I0308 04:43:52.497210 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78466d865f-kkm92_37219418-787e-46f1-b4a9-afd765b3c33b/init/0.log" Mar 08 04:43:52.593683 master-0 kubenswrapper[18592]: I0308 04:43:52.593620 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-afe2b-default-external-api-0_b1f5ab90-a4e1-47d7-9d79-22cc65e4295d/glance-log/0.log" Mar 08 04:43:52.611900 master-0 kubenswrapper[18592]: I0308 04:43:52.610054 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-afe2b-default-external-api-0_b1f5ab90-a4e1-47d7-9d79-22cc65e4295d/glance-httpd/0.log" Mar 08 04:43:52.716014 master-0 kubenswrapper[18592]: I0308 04:43:52.714352 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-afe2b-default-internal-api-0_7ff800ca-3f34-40c3-a4d1-329fe69e0c3a/glance-log/0.log" Mar 08 04:43:52.726973 master-0 kubenswrapper[18592]: I0308 04:43:52.726924 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-afe2b-default-internal-api-0_7ff800ca-3f34-40c3-a4d1-329fe69e0c3a/glance-httpd/0.log" Mar 08 04:43:52.764493 master-0 kubenswrapper[18592]: I0308 04:43:52.756579 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-5b8d48c5b6-nqv28_e39f8cef-9071-463a-af1a-4728e182c6c2/ironic-api-log/0.log" Mar 08 04:43:52.795487 master-0 kubenswrapper[18592]: I0308 04:43:52.795453 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-5b8d48c5b6-nqv28_e39f8cef-9071-463a-af1a-4728e182c6c2/ironic-api/0.log" Mar 08 04:43:52.805128 master-0 kubenswrapper[18592]: I0308 04:43:52.805106 18592 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ironic-5b8d48c5b6-nqv28_e39f8cef-9071-463a-af1a-4728e182c6c2/init/0.log" Mar 08 04:43:52.826685 master-0 kubenswrapper[18592]: I0308 04:43:52.826639 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_abe912ba-4a33-4634-a3fb-b6fb09b38d8e/ironic-conductor/0.log" Mar 08 04:43:52.835805 master-0 kubenswrapper[18592]: I0308 04:43:52.834590 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_abe912ba-4a33-4634-a3fb-b6fb09b38d8e/httpboot/0.log" Mar 08 04:43:52.850014 master-0 kubenswrapper[18592]: I0308 04:43:52.849473 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_abe912ba-4a33-4634-a3fb-b6fb09b38d8e/dnsmasq/0.log" Mar 08 04:43:52.861821 master-0 kubenswrapper[18592]: I0308 04:43:52.859098 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_abe912ba-4a33-4634-a3fb-b6fb09b38d8e/init/0.log" Mar 08 04:43:52.865978 master-0 kubenswrapper[18592]: I0308 04:43:52.865085 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_abe912ba-4a33-4634-a3fb-b6fb09b38d8e/ironic-python-agent-init/0.log" Mar 08 04:43:53.409323 master-0 kubenswrapper[18592]: I0308 04:43:53.409274 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-credential-operator_cloud-credential-operator-55d85b7b47-5v6gs_c0861ccd-5e86-4277-9082-95f3133508a0/kube-rbac-proxy/0.log" Mar 08 04:43:53.461618 master-0 kubenswrapper[18592]: I0308 04:43:53.461522 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-credential-operator_cloud-credential-operator-55d85b7b47-5v6gs_c0861ccd-5e86-4277-9082-95f3133508a0/cloud-credential-operator/0.log" Mar 08 04:43:53.543904 master-0 kubenswrapper[18592]: I0308 04:43:53.543647 18592 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ironic-conductor-0_abe912ba-4a33-4634-a3fb-b6fb09b38d8e/pxe-init/0.log" Mar 08 04:43:53.611431 master-0 kubenswrapper[18592]: I0308 04:43:53.611385 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-inspector-0_6936bf71-3ad4-47e9-8df6-9075d83086db/ironic-inspector-httpd/0.log" Mar 08 04:43:53.656922 master-0 kubenswrapper[18592]: I0308 04:43:53.656858 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-inspector-0_6936bf71-3ad4-47e9-8df6-9075d83086db/ironic-inspector/0.log" Mar 08 04:43:53.670768 master-0 kubenswrapper[18592]: I0308 04:43:53.670660 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-inspector-0_6936bf71-3ad4-47e9-8df6-9075d83086db/inspector-httpboot/0.log" Mar 08 04:43:53.679132 master-0 kubenswrapper[18592]: I0308 04:43:53.679058 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-inspector-0_6936bf71-3ad4-47e9-8df6-9075d83086db/ramdisk-logs/0.log" Mar 08 04:43:53.696922 master-0 kubenswrapper[18592]: I0308 04:43:53.696372 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-inspector-0_6936bf71-3ad4-47e9-8df6-9075d83086db/inspector-dnsmasq/0.log" Mar 08 04:43:53.717880 master-0 kubenswrapper[18592]: I0308 04:43:53.716538 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-inspector-0_6936bf71-3ad4-47e9-8df6-9075d83086db/ironic-python-agent-init/0.log" Mar 08 04:43:53.739063 master-0 kubenswrapper[18592]: I0308 04:43:53.739013 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-inspector-0_6936bf71-3ad4-47e9-8df6-9075d83086db/inspector-pxe-init/0.log" Mar 08 04:43:53.755901 master-0 kubenswrapper[18592]: I0308 04:43:53.752914 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-neutron-agent-859d47fc89-z2wvz_05ba9b98-7d2f-4a9b-80ad-60793d8279e8/ironic-neutron-agent/3.log" Mar 08 
04:43:53.755901 master-0 kubenswrapper[18592]: I0308 04:43:53.754907 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-neutron-agent-859d47fc89-z2wvz_05ba9b98-7d2f-4a9b-80ad-60793d8279e8/ironic-neutron-agent/2.log" Mar 08 04:43:53.825533 master-0 kubenswrapper[18592]: I0308 04:43:53.825483 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-56d7944fd-t2xf9_081bf4e2-02c9-4cca-b699-6148f8aaa219/keystone-api/0.log" Mar 08 04:43:55.573368 master-0 kubenswrapper[18592]: I0308 04:43:55.573325 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-64488f9d78-vfgfp_0918ba32-8e55-48d0-8e50-027c0dcb4bbd/openshift-config-operator/0.log" Mar 08 04:43:55.577456 master-0 kubenswrapper[18592]: I0308 04:43:55.577415 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-64488f9d78-vfgfp_0918ba32-8e55-48d0-8e50-027c0dcb4bbd/openshift-config-operator/1.log" Mar 08 04:43:55.596380 master-0 kubenswrapper[18592]: I0308 04:43:55.596348 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-64488f9d78-vfgfp_0918ba32-8e55-48d0-8e50-027c0dcb4bbd/openshift-api/0.log" Mar 08 04:43:56.827114 master-0 kubenswrapper[18592]: I0308 04:43:56.824528 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-6c7fb6b958-mr9k6_48ab3c8e-a2bd-4380-9e8d-a41d515a989d/console-operator/0.log" Mar 08 04:43:58.170546 master-0 kubenswrapper[18592]: I0308 04:43:58.170387 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5795766c6-lq2d8_dc4dec3e-6f82-404f-9aad-9bad08ca306c/console/0.log" Mar 08 04:43:58.237522 master-0 kubenswrapper[18592]: I0308 04:43:58.237413 18592 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_downloads-84f57b9877-n666k_f8fc59f5-7a53-4075-8005-8fdb2b45ccb5/download-server/0.log" Mar 08 04:43:58.368249 master-0 kubenswrapper[18592]: I0308 04:43:58.368188 18592 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-x7bcc/perf-node-gather-daemonset-sstbw" Mar 08 04:43:59.311322 master-0 kubenswrapper[18592]: I0308 04:43:59.311262 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_cluster-storage-operator-6fbfc8dc8f-nm8fj_b3eea925-73b3-4693-8f0e-6dd26107f60a/cluster-storage-operator/3.log" Mar 08 04:43:59.312307 master-0 kubenswrapper[18592]: I0308 04:43:59.312285 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_cluster-storage-operator-6fbfc8dc8f-nm8fj_b3eea925-73b3-4693-8f0e-6dd26107f60a/cluster-storage-operator/2.log" Mar 08 04:43:59.337273 master-0 kubenswrapper[18592]: I0308 04:43:59.337220 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-h4qlp_9ec89e27-4360-48f2-a7ca-5d823bda4510/snapshot-controller/4.log" Mar 08 04:43:59.338108 master-0 kubenswrapper[18592]: I0308 04:43:59.338085 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-h4qlp_9ec89e27-4360-48f2-a7ca-5d823bda4510/snapshot-controller/5.log" Mar 08 04:43:59.382307 master-0 kubenswrapper[18592]: I0308 04:43:59.382223 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-operator-5685fbc7d-xhbrl_52b495ac-bb28-44f3-b925-3c54f86d5ec4/csi-snapshot-controller-operator/0.log" Mar 08 04:44:00.416869 master-0 kubenswrapper[18592]: I0308 04:44:00.416793 18592 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-dns-operator_dns-operator-589895fbb7-xttlz_8efdcef9-9b31-4567-b7f9-cb59a894273d/dns-operator/0.log" Mar 08 04:44:00.451976 master-0 kubenswrapper[18592]: I0308 04:44:00.451431 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns-operator_dns-operator-589895fbb7-xttlz_8efdcef9-9b31-4567-b7f9-cb59a894273d/kube-rbac-proxy/0.log" Mar 08 04:44:01.550202 master-0 kubenswrapper[18592]: I0308 04:44:01.550168 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-4pjsn_7b485db9-29b5-45a1-a4fb-b4264c6bf2d6/dns/0.log" Mar 08 04:44:01.566506 master-0 kubenswrapper[18592]: I0308 04:44:01.566471 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-4pjsn_7b485db9-29b5-45a1-a4fb-b4264c6bf2d6/kube-rbac-proxy/0.log" Mar 08 04:44:01.597257 master-0 kubenswrapper[18592]: I0308 04:44:01.597127 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-wjl9v_d2cd5b23-e622-4b96-aee8-dbc942b73b4a/dns-node-resolver/0.log" Mar 08 04:44:01.743631 master-0 kubenswrapper[18592]: E0308 04:44:01.743414 18592 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 192.168.32.10:49262->192.168.32.10:42143: write tcp 192.168.32.10:49262->192.168.32.10:42143: write: broken pipe Mar 08 04:44:02.676931 master-0 kubenswrapper[18592]: I0308 04:44:02.676842 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-5884b9cd56-vzms7_5a7752f9-7b9a-451f-997a-e9f696d38b34/etcd-operator/2.log" Mar 08 04:44:02.692506 master-0 kubenswrapper[18592]: I0308 04:44:02.690903 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-5884b9cd56-vzms7_5a7752f9-7b9a-451f-997a-e9f696d38b34/etcd-operator/3.log" Mar 08 04:44:03.824308 master-0 kubenswrapper[18592]: I0308 04:44:03.822920 18592 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-etcd_etcd-master-0_29c709c82970b529e7b9b895aa92ef05/etcdctl/0.log"
Mar 08 04:44:04.046668 master-0 kubenswrapper[18592]: I0308 04:44:04.046608 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_29c709c82970b529e7b9b895aa92ef05/etcd/0.log"
Mar 08 04:44:04.066223 master-0 kubenswrapper[18592]: I0308 04:44:04.064263 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_29c709c82970b529e7b9b895aa92ef05/etcd-metrics/0.log"
Mar 08 04:44:04.077328 master-0 kubenswrapper[18592]: I0308 04:44:04.077198 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_29c709c82970b529e7b9b895aa92ef05/etcd-readyz/0.log"
Mar 08 04:44:04.091452 master-0 kubenswrapper[18592]: I0308 04:44:04.091091 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_29c709c82970b529e7b9b895aa92ef05/etcd-rev/0.log"
Mar 08 04:44:04.101764 master-0 kubenswrapper[18592]: I0308 04:44:04.101723 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_29c709c82970b529e7b9b895aa92ef05/setup/0.log"
Mar 08 04:44:04.120849 master-0 kubenswrapper[18592]: I0308 04:44:04.120788 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_29c709c82970b529e7b9b895aa92ef05/etcd-ensure-env-vars/0.log"
Mar 08 04:44:04.133125 master-0 kubenswrapper[18592]: I0308 04:44:04.133080 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_29c709c82970b529e7b9b895aa92ef05/etcd-resources-copy/0.log"
Mar 08 04:44:04.207654 master-0 kubenswrapper[18592]: I0308 04:44:04.207586 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_installer-1-master-0_9b02b2c9-9ee3-436c-aa36-0c23e5fdf07b/installer/0.log"
Mar 08 04:44:04.289966 master-0 kubenswrapper[18592]: I0308 04:44:04.289021 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_installer-2-master-0_d84e0373-988e-47db-be73-5690d18beba3/installer/0.log"
Mar 08 04:44:04.951877 master-0 kubenswrapper[18592]: I0308 04:44:04.951815 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_d174d635-30c7-4d7d-a077-aa6436a5675a/memcached/0.log"
Mar 08 04:44:05.120869 master-0 kubenswrapper[18592]: I0308 04:44:05.118312 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-67869b4f85-qhzqs_b300a420-49d9-40f0-9c31-7a2ff20dcbfc/neutron-api/0.log"
Mar 08 04:44:05.167917 master-0 kubenswrapper[18592]: I0308 04:44:05.165117 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-67869b4f85-qhzqs_b300a420-49d9-40f0-9c31-7a2ff20dcbfc/neutron-httpd/0.log"
Mar 08 04:44:05.272450 master-0 kubenswrapper[18592]: I0308 04:44:05.271223 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_4b94212a-655e-4381-b7f0-d195a9157e27/nova-api-log/0.log"
Mar 08 04:44:05.543845 master-0 kubenswrapper[18592]: I0308 04:44:05.542587 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_4b94212a-655e-4381-b7f0-d195a9157e27/nova-api-api/0.log"
Mar 08 04:44:05.622476 master-0 kubenswrapper[18592]: I0308 04:44:05.622405 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_017acd52-088f-439e-8ff2-97c079c31eb3/nova-cell0-conductor-conductor/0.log"
Mar 08 04:44:05.687892 master-0 kubenswrapper[18592]: I0308 04:44:05.687857 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-compute-ironic-compute-0_0e0a6cbb-243a-4dd1-86c7-1ee9d839a00d/nova-cell1-compute-ironic-compute-compute/0.log"
Mar 08 04:44:05.773899 master-0 kubenswrapper[18592]: I0308 04:44:05.768361 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_3590bdae-73b7-424b-897d-20a88497d3d0/nova-cell1-conductor-conductor/0.log"
Mar 08 04:44:05.831453 master-0 kubenswrapper[18592]: I0308 04:44:05.831367 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_cluster-image-registry-operator-86d6d77c7c-572xh_69eb8ba2-7bfb-4433-8951-08f89e7bcb5f/cluster-image-registry-operator/0.log"
Mar 08 04:44:05.853834 master-0 kubenswrapper[18592]: I0308 04:44:05.853780 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_ffcc095d-2c9f-44dc-9bef-6f6f962e8d70/nova-cell1-novncproxy-novncproxy/0.log"
Mar 08 04:44:05.919602 master-0 kubenswrapper[18592]: I0308 04:44:05.919555 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-n77ps_83000169-b43a-41b4-9e1c-43d99dade81a/node-ca/0.log"
Mar 08 04:44:05.952809 master-0 kubenswrapper[18592]: I0308 04:44:05.952478 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_6708fcdc-73fb-496e-8be6-9bff2a926ce2/nova-metadata-log/0.log"
Mar 08 04:44:06.567688 master-0 kubenswrapper[18592]: I0308 04:44:06.567642 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_6708fcdc-73fb-496e-8be6-9bff2a926ce2/nova-metadata-metadata/0.log"
Mar 08 04:44:06.671560 master-0 kubenswrapper[18592]: I0308 04:44:06.671515 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_330d8046-3011-4ca0-9240-0c55e8117d1d/nova-scheduler-scheduler/0.log"
Mar 08 04:44:06.694040 master-0 kubenswrapper[18592]: I0308 04:44:06.691038 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_17b43cd8-4413-4958-9473-bbc5448585dc/galera/0.log"
Mar 08 04:44:06.707881 master-0 kubenswrapper[18592]: I0308 04:44:06.707622 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_17b43cd8-4413-4958-9473-bbc5448585dc/mysql-bootstrap/0.log"
Mar 08 04:44:06.731028 master-0 kubenswrapper[18592]: I0308 04:44:06.730981 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_7d8ff61a-7e75-41ec-9314-40ff5a0fea03/galera/0.log"
Mar 08 04:44:06.745236 master-0 kubenswrapper[18592]: I0308 04:44:06.745184 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_7d8ff61a-7e75-41ec-9314-40ff5a0fea03/mysql-bootstrap/0.log"
Mar 08 04:44:06.753155 master-0 kubenswrapper[18592]: I0308 04:44:06.752250 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_ee2a865f-ef65-4965-abfc-1425b7f0bf84/openstackclient/0.log"
Mar 08 04:44:06.782308 master-0 kubenswrapper[18592]: I0308 04:44:06.781705 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-hkfg8_bf9606de-37b2-4dc9-8b6b-adfb1d81f6d1/ovn-controller/0.log"
Mar 08 04:44:06.802036 master-0 kubenswrapper[18592]: I0308 04:44:06.801985 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-5lz68_fa276697-ebfd-42a0-b269-60e71b01056c/openstack-network-exporter/0.log"
Mar 08 04:44:06.816457 master-0 kubenswrapper[18592]: I0308 04:44:06.815171 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-2lfv5_9fb36151-5aa5-462a-b8da-a082585a5a26/ovsdb-server/0.log"
Mar 08 04:44:06.828262 master-0 kubenswrapper[18592]: I0308 04:44:06.828149 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-2lfv5_9fb36151-5aa5-462a-b8da-a082585a5a26/ovs-vswitchd/0.log"
Mar 08 04:44:06.837433 master-0 kubenswrapper[18592]: I0308 04:44:06.837397 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-677db989d6-t77qr_c9de4939-680a-4e3e-89fd-e20ecb8b10f2/ingress-operator/0.log"
Mar 08 04:44:06.844875 master-0 kubenswrapper[18592]: I0308 04:44:06.842330 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-2lfv5_9fb36151-5aa5-462a-b8da-a082585a5a26/ovsdb-server-init/0.log"
Mar 08 04:44:06.844875 master-0 kubenswrapper[18592]: I0308 04:44:06.843310 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-677db989d6-t77qr_c9de4939-680a-4e3e-89fd-e20ecb8b10f2/ingress-operator/1.log"
Mar 08 04:44:06.855095 master-0 kubenswrapper[18592]: I0308 04:44:06.854918 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_8e48f207-5d13-41af-9187-97f528daeb55/ovn-northd/0.log"
Mar 08 04:44:06.860267 master-0 kubenswrapper[18592]: I0308 04:44:06.860177 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_8e48f207-5d13-41af-9187-97f528daeb55/openstack-network-exporter/0.log"
Mar 08 04:44:06.861499 master-0 kubenswrapper[18592]: I0308 04:44:06.860717 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-677db989d6-t77qr_c9de4939-680a-4e3e-89fd-e20ecb8b10f2/kube-rbac-proxy/0.log"
Mar 08 04:44:06.874647 master-0 kubenswrapper[18592]: I0308 04:44:06.873430 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_19d803f7-e454-4197-833f-539d8f1926ca/ovsdbserver-nb/0.log"
Mar 08 04:44:06.878137 master-0 kubenswrapper[18592]: I0308 04:44:06.877902 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_19d803f7-e454-4197-833f-539d8f1926ca/openstack-network-exporter/0.log"
Mar 08 04:44:06.897807 master-0 kubenswrapper[18592]: I0308 04:44:06.896618 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_9d6e9be4-a15c-476b-a4e3-cbbbc2c3c264/ovsdbserver-sb/0.log"
Mar 08 04:44:06.905878 master-0 kubenswrapper[18592]: I0308 04:44:06.903712 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_9d6e9be4-a15c-476b-a4e3-cbbbc2c3c264/openstack-network-exporter/0.log"
Mar 08 04:44:06.956460 master-0 kubenswrapper[18592]: I0308 04:44:06.951649 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-59ffff478d-tll4r_29953134-4064-46e8-9791-cf9c07d7f106/placement-log/0.log"
Mar 08 04:44:06.978289 master-0 kubenswrapper[18592]: I0308 04:44:06.978242 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-59ffff478d-tll4r_29953134-4064-46e8-9791-cf9c07d7f106/placement-api/0.log"
Mar 08 04:44:07.000748 master-0 kubenswrapper[18592]: I0308 04:44:07.000686 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_466c2b13-2b27-4a83-911c-db97d66490a5/rabbitmq/0.log"
Mar 08 04:44:07.005753 master-0 kubenswrapper[18592]: I0308 04:44:07.005089 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_466c2b13-2b27-4a83-911c-db97d66490a5/setup-container/0.log"
Mar 08 04:44:07.066407 master-0 kubenswrapper[18592]: I0308 04:44:07.066359 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_6cc32df9-dcb4-43f3-b78d-f992b0488bf1/rabbitmq/0.log"
Mar 08 04:44:07.073049 master-0 kubenswrapper[18592]: I0308 04:44:07.073003 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_6cc32df9-dcb4-43f3-b78d-f992b0488bf1/setup-container/0.log"
Mar 08 04:44:07.178894 master-0 kubenswrapper[18592]: I0308 04:44:07.176266 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-59f645d994-2wg6m_987b967c-b06a-4079-ac6a-46f89e7c1b49/proxy-httpd/0.log"
Mar 08 04:44:07.204450 master-0 kubenswrapper[18592]: I0308 04:44:07.203953 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-59f645d994-2wg6m_987b967c-b06a-4079-ac6a-46f89e7c1b49/proxy-server/0.log"
Mar 08 04:44:07.221842 master-0 kubenswrapper[18592]: I0308 04:44:07.219239 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-hmnb8_284c1d20-bb45-4e62-9ebe-76fdb2e4fd01/swift-ring-rebalance/0.log"
Mar 08 04:44:07.249844 master-0 kubenswrapper[18592]: I0308 04:44:07.246720 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2166af23-aec1-40ee-9114-2a0ffa1c7f11/account-server/0.log"
Mar 08 04:44:07.274847 master-0 kubenswrapper[18592]: I0308 04:44:07.274057 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2166af23-aec1-40ee-9114-2a0ffa1c7f11/account-replicator/0.log"
Mar 08 04:44:07.285843 master-0 kubenswrapper[18592]: I0308 04:44:07.284316 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2166af23-aec1-40ee-9114-2a0ffa1c7f11/account-auditor/0.log"
Mar 08 04:44:07.295890 master-0 kubenswrapper[18592]: I0308 04:44:07.293281 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2166af23-aec1-40ee-9114-2a0ffa1c7f11/account-reaper/0.log"
Mar 08 04:44:07.304734 master-0 kubenswrapper[18592]: I0308 04:44:07.302973 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2166af23-aec1-40ee-9114-2a0ffa1c7f11/container-server/0.log"
Mar 08 04:44:07.327287 master-0 kubenswrapper[18592]: I0308 04:44:07.327167 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2166af23-aec1-40ee-9114-2a0ffa1c7f11/container-replicator/0.log"
Mar 08 04:44:07.338212 master-0 kubenswrapper[18592]: I0308 04:44:07.337705 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2166af23-aec1-40ee-9114-2a0ffa1c7f11/container-auditor/0.log"
Mar 08 04:44:07.343231 master-0 kubenswrapper[18592]: I0308 04:44:07.343192 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2166af23-aec1-40ee-9114-2a0ffa1c7f11/container-updater/0.log"
Mar 08 04:44:07.359299 master-0 kubenswrapper[18592]: I0308 04:44:07.359243 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2166af23-aec1-40ee-9114-2a0ffa1c7f11/object-server/0.log"
Mar 08 04:44:07.379125 master-0 kubenswrapper[18592]: I0308 04:44:07.379029 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2166af23-aec1-40ee-9114-2a0ffa1c7f11/object-replicator/0.log"
Mar 08 04:44:07.484845 master-0 kubenswrapper[18592]: I0308 04:44:07.481963 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2166af23-aec1-40ee-9114-2a0ffa1c7f11/object-auditor/0.log"
Mar 08 04:44:07.493846 master-0 kubenswrapper[18592]: I0308 04:44:07.491934 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2166af23-aec1-40ee-9114-2a0ffa1c7f11/object-updater/0.log"
Mar 08 04:44:07.501304 master-0 kubenswrapper[18592]: I0308 04:44:07.501259 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2166af23-aec1-40ee-9114-2a0ffa1c7f11/object-expirer/0.log"
Mar 08 04:44:07.509363 master-0 kubenswrapper[18592]: I0308 04:44:07.509329 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2166af23-aec1-40ee-9114-2a0ffa1c7f11/rsync/0.log"
Mar 08 04:44:07.516391 master-0 kubenswrapper[18592]: I0308 04:44:07.516355 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2166af23-aec1-40ee-9114-2a0ffa1c7f11/swift-recon-cron/0.log"
Mar 08 04:44:07.995061 master-0 kubenswrapper[18592]: I0308 04:44:07.994945 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-xs7l6_e51a412e-9068-4720-aec4-f07e8fc465c9/serve-healthcheck-canary/0.log"
Mar 08 04:44:08.815484 master-0 kubenswrapper[18592]: I0308 04:44:08.815361 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-8f89dfddd-4mr6p_ee586416-6f56-4ea4-ad62-95de1e6df23b/insights-operator/4.log"
Mar 08 04:44:08.844012 master-0 kubenswrapper[18592]: I0308 04:44:08.843962 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-8f89dfddd-4mr6p_ee586416-6f56-4ea4-ad62-95de1e6df23b/insights-operator/5.log"
Mar 08 04:44:08.903337 master-0 kubenswrapper[18592]: I0308 04:44:08.903282 18592 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x7bcc/master-0-debug-bwjc5" event={"ID":"bf3e6a8d-926f-4423-aedb-f2906a1dcb64","Type":"ContainerStarted","Data":"16cf6940cafa7c24aee98fc1e65a6c0fcb1310263179ab32d8a7561e0d0e5b39"}
Mar 08 04:44:08.931911 master-0 kubenswrapper[18592]: I0308 04:44:08.923755 18592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-x7bcc/master-0-debug-bwjc5" podStartSLOduration=1.663018957 podStartE2EDuration="18.923738236s" podCreationTimestamp="2026-03-08 04:43:50 +0000 UTC" firstStartedPulling="2026-03-08 04:43:50.482872088 +0000 UTC m=+3042.581626428" lastFinishedPulling="2026-03-08 04:44:07.743591357 +0000 UTC m=+3059.842345707" observedRunningTime="2026-03-08 04:44:08.917366967 +0000 UTC m=+3061.016121317" watchObservedRunningTime="2026-03-08 04:44:08.923738236 +0000 UTC m=+3061.022492586"
Mar 08 04:44:11.170396 master-0 kubenswrapper[18592]: I0308 04:44:11.170345 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_c68746bf-7562-4da0-85e5-dce1ad9786b3/alertmanager/0.log"
Mar 08 04:44:11.184952 master-0 kubenswrapper[18592]: I0308 04:44:11.184911 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_c68746bf-7562-4da0-85e5-dce1ad9786b3/config-reloader/0.log"
Mar 08 04:44:11.200004 master-0 kubenswrapper[18592]: I0308 04:44:11.199948 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_c68746bf-7562-4da0-85e5-dce1ad9786b3/kube-rbac-proxy-web/0.log"
Mar 08 04:44:11.215655 master-0 kubenswrapper[18592]: I0308 04:44:11.215610 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_c68746bf-7562-4da0-85e5-dce1ad9786b3/kube-rbac-proxy/0.log"
Mar 08 04:44:11.229490 master-0 kubenswrapper[18592]: I0308 04:44:11.229438 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_c68746bf-7562-4da0-85e5-dce1ad9786b3/kube-rbac-proxy-metric/0.log"
Mar 08 04:44:11.250158 master-0 kubenswrapper[18592]: I0308 04:44:11.249972 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_c68746bf-7562-4da0-85e5-dce1ad9786b3/prom-label-proxy/0.log"
Mar 08 04:44:11.265707 master-0 kubenswrapper[18592]: I0308 04:44:11.265656 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_c68746bf-7562-4da0-85e5-dce1ad9786b3/init-config-reloader/0.log"
Mar 08 04:44:11.339191 master-0 kubenswrapper[18592]: I0308 04:44:11.339122 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-674cbfbd9d-clqwj_0418ff42-7eac-4266-97b5-4df88623d066/cluster-monitoring-operator/0.log"
Mar 08 04:44:11.362046 master-0 kubenswrapper[18592]: I0308 04:44:11.361983 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-68b88f8cb5-22jdz_93ebbf2d-6b34-40ae-9f2e-f861e8a20183/kube-state-metrics/0.log"
Mar 08 04:44:11.382071 master-0 kubenswrapper[18592]: I0308 04:44:11.382010 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-68b88f8cb5-22jdz_93ebbf2d-6b34-40ae-9f2e-f861e8a20183/kube-rbac-proxy-main/0.log"
Mar 08 04:44:11.394979 master-0 kubenswrapper[18592]: I0308 04:44:11.394933 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-68b88f8cb5-22jdz_93ebbf2d-6b34-40ae-9f2e-f861e8a20183/kube-rbac-proxy-self/0.log"
Mar 08 04:44:11.413855 master-0 kubenswrapper[18592]: I0308 04:44:11.413782 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-7c975945c4-kbb6q_934ced6d-5bc3-4215-8b97-059f77762cfd/metrics-server/0.log"
Mar 08 04:44:11.427554 master-0 kubenswrapper[18592]: I0308 04:44:11.427466 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-64589489d-z2spc_8b79990f-516d-4eb7-bc3f-bf63ff11f105/monitoring-plugin/0.log"
Mar 08 04:44:11.446927 master-0 kubenswrapper[18592]: I0308 04:44:11.446882 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-f2sxm_10d13e6c-631d-4753-b564-fd88ceb7d358/node-exporter/0.log"
Mar 08 04:44:11.463270 master-0 kubenswrapper[18592]: I0308 04:44:11.463232 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-f2sxm_10d13e6c-631d-4753-b564-fd88ceb7d358/kube-rbac-proxy/0.log"
Mar 08 04:44:11.474253 master-0 kubenswrapper[18592]: I0308 04:44:11.474216 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-f2sxm_10d13e6c-631d-4753-b564-fd88ceb7d358/init-textfile/0.log"
Mar 08 04:44:11.489345 master-0 kubenswrapper[18592]: I0308 04:44:11.489297 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-74cc79fd76-gtrmb_fa83f817-2611-4894-9bad-d9c8640520b3/kube-rbac-proxy-main/0.log"
Mar 08 04:44:11.509238 master-0 kubenswrapper[18592]: I0308 04:44:11.509187 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-74cc79fd76-gtrmb_fa83f817-2611-4894-9bad-d9c8640520b3/kube-rbac-proxy-self/0.log"
Mar 08 04:44:11.519042 master-0 kubenswrapper[18592]: I0308 04:44:11.518996 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-74cc79fd76-gtrmb_fa83f817-2611-4894-9bad-d9c8640520b3/openshift-state-metrics/0.log"
Mar 08 04:44:11.560450 master-0 kubenswrapper[18592]: I0308 04:44:11.560405 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_0eece71b-beb6-49f4-96cd-6b7476337ded/prometheus/0.log"
Mar 08 04:44:11.572304 master-0 kubenswrapper[18592]: I0308 04:44:11.572262 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_0eece71b-beb6-49f4-96cd-6b7476337ded/config-reloader/0.log"
Mar 08 04:44:11.585026 master-0 kubenswrapper[18592]: I0308 04:44:11.584554 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_0eece71b-beb6-49f4-96cd-6b7476337ded/thanos-sidecar/0.log"
Mar 08 04:44:11.600792 master-0 kubenswrapper[18592]: I0308 04:44:11.600756 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_0eece71b-beb6-49f4-96cd-6b7476337ded/kube-rbac-proxy-web/0.log"
Mar 08 04:44:11.611351 master-0 kubenswrapper[18592]: I0308 04:44:11.611304 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_0eece71b-beb6-49f4-96cd-6b7476337ded/kube-rbac-proxy/0.log"
Mar 08 04:44:11.624161 master-0 kubenswrapper[18592]: I0308 04:44:11.624121 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_0eece71b-beb6-49f4-96cd-6b7476337ded/kube-rbac-proxy-thanos/0.log"
Mar 08 04:44:11.640156 master-0 kubenswrapper[18592]: I0308 04:44:11.640118 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_0eece71b-beb6-49f4-96cd-6b7476337ded/init-config-reloader/0.log"
Mar 08 04:44:11.659592 master-0 kubenswrapper[18592]: I0308 04:44:11.659550 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5ff8674d55-lv2h9_46b636ff-fb55-4e68-9836-04e46bd462ee/prometheus-operator/0.log"
Mar 08 04:44:11.669646 master-0 kubenswrapper[18592]: I0308 04:44:11.669613 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5ff8674d55-lv2h9_46b636ff-fb55-4e68-9836-04e46bd462ee/kube-rbac-proxy/0.log"
Mar 08 04:44:11.687013 master-0 kubenswrapper[18592]: I0308 04:44:11.686969 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-8464df8497-kkrx6_ec003aa0-e60e-4c9b-8110-48502405d3a7/prometheus-operator-admission-webhook/0.log"
Mar 08 04:44:11.700795 master-0 kubenswrapper[18592]: I0308 04:44:11.700758 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-648fbfb658-kprxt_b1ad8862-127f-4a2c-9846-57fef0b5cdb6/telemeter-client/1.log"
Mar 08 04:44:11.701438 master-0 kubenswrapper[18592]: I0308 04:44:11.701412 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-648fbfb658-kprxt_b1ad8862-127f-4a2c-9846-57fef0b5cdb6/telemeter-client/0.log"
Mar 08 04:44:11.711225 master-0 kubenswrapper[18592]: I0308 04:44:11.711186 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-648fbfb658-kprxt_b1ad8862-127f-4a2c-9846-57fef0b5cdb6/reload/0.log"
Mar 08 04:44:11.725859 master-0 kubenswrapper[18592]: I0308 04:44:11.724300 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-648fbfb658-kprxt_b1ad8862-127f-4a2c-9846-57fef0b5cdb6/kube-rbac-proxy/0.log"
Mar 08 04:44:11.743637 master-0 kubenswrapper[18592]: I0308 04:44:11.743588 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7cfbff6469-pjfrs_04bf080c-cf49-4717-abdf-f247a4cdbf46/thanos-query/0.log"
Mar 08 04:44:11.753084 master-0 kubenswrapper[18592]: I0308 04:44:11.753045 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7cfbff6469-pjfrs_04bf080c-cf49-4717-abdf-f247a4cdbf46/kube-rbac-proxy-web/0.log"
Mar 08 04:44:11.765969 master-0 kubenswrapper[18592]: I0308 04:44:11.765932 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7cfbff6469-pjfrs_04bf080c-cf49-4717-abdf-f247a4cdbf46/kube-rbac-proxy/0.log"
Mar 08 04:44:11.780005 master-0 kubenswrapper[18592]: I0308 04:44:11.779952 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7cfbff6469-pjfrs_04bf080c-cf49-4717-abdf-f247a4cdbf46/prom-label-proxy/0.log"
Mar 08 04:44:11.790888 master-0 kubenswrapper[18592]: I0308 04:44:11.790849 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7cfbff6469-pjfrs_04bf080c-cf49-4717-abdf-f247a4cdbf46/kube-rbac-proxy-rules/0.log"
Mar 08 04:44:11.801375 master-0 kubenswrapper[18592]: I0308 04:44:11.801336 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7cfbff6469-pjfrs_04bf080c-cf49-4717-abdf-f247a4cdbf46/kube-rbac-proxy-metrics/0.log"
Mar 08 04:44:13.958775 master-0 kubenswrapper[18592]: I0308 04:44:13.958730 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-wxdl4_51a5fdc3-c642-430a-9da8-448bd0cceae0/controller/0.log"
Mar 08 04:44:13.989561 master-0 kubenswrapper[18592]: I0308 04:44:13.989475 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-wxdl4_51a5fdc3-c642-430a-9da8-448bd0cceae0/kube-rbac-proxy/0.log"
Mar 08 04:44:14.012126 master-0 kubenswrapper[18592]: I0308 04:44:14.012082 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f69bb_e1532f83-fda2-443b-9bdb-aa4de5c66a13/controller/0.log"
Mar 08 04:44:15.163881 master-0 kubenswrapper[18592]: I0308 04:44:15.157644 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f69bb_e1532f83-fda2-443b-9bdb-aa4de5c66a13/frr/0.log"
Mar 08 04:44:15.173375 master-0 kubenswrapper[18592]: I0308 04:44:15.173286 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f69bb_e1532f83-fda2-443b-9bdb-aa4de5c66a13/reloader/0.log"
Mar 08 04:44:15.185147 master-0 kubenswrapper[18592]: I0308 04:44:15.185110 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f69bb_e1532f83-fda2-443b-9bdb-aa4de5c66a13/frr-metrics/0.log"
Mar 08 04:44:15.199158 master-0 kubenswrapper[18592]: I0308 04:44:15.199063 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f69bb_e1532f83-fda2-443b-9bdb-aa4de5c66a13/kube-rbac-proxy/0.log"
Mar 08 04:44:15.210688 master-0 kubenswrapper[18592]: I0308 04:44:15.210643 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f69bb_e1532f83-fda2-443b-9bdb-aa4de5c66a13/kube-rbac-proxy-frr/0.log"
Mar 08 04:44:15.229207 master-0 kubenswrapper[18592]: I0308 04:44:15.225464 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f69bb_e1532f83-fda2-443b-9bdb-aa4de5c66a13/cp-frr-files/0.log"
Mar 08 04:44:15.242704 master-0 kubenswrapper[18592]: I0308 04:44:15.242643 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f69bb_e1532f83-fda2-443b-9bdb-aa4de5c66a13/cp-reloader/0.log"
Mar 08 04:44:15.254156 master-0 kubenswrapper[18592]: I0308 04:44:15.253837 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f69bb_e1532f83-fda2-443b-9bdb-aa4de5c66a13/cp-metrics/0.log"
Mar 08 04:44:15.267723 master-0 kubenswrapper[18592]: I0308 04:44:15.267646 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7f989f654f-lwx48_9097e49a-572d-40cd-8657-4dacb3d0f33b/frr-k8s-webhook-server/0.log"
Mar 08 04:44:15.292217 master-0 kubenswrapper[18592]: I0308 04:44:15.292167 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-c7f497896-fbcg7_d0a838e8-3b7d-4368-99bc-21c83edb251a/manager/0.log"
Mar 08 04:44:15.302907 master-0 kubenswrapper[18592]: I0308 04:44:15.302858 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7d4ccb5df5-l4btv_ba434b94-1eeb-4d82-89a2-b3c8fa4a998d/webhook-server/0.log"
Mar 08 04:44:15.672775 master-0 kubenswrapper[18592]: I0308 04:44:15.672718 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-msbrs_423b03ec-93e1-4baa-8530-8e3bba6eccb0/speaker/0.log"
Mar 08 04:44:15.694381 master-0 kubenswrapper[18592]: I0308 04:44:15.694334 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-msbrs_423b03ec-93e1-4baa-8530-8e3bba6eccb0/kube-rbac-proxy/0.log"
Mar 08 04:44:16.302584 master-0 kubenswrapper[18592]: I0308 04:44:16.302524 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_084a9f42e05a600999ddea68e965db4422238b5f1e47c90a13e7441bc7j2tq7_1e476107-935e-4d00-9290-a8e242341c5a/extract/0.log"
Mar 08 04:44:16.308885 master-0 kubenswrapper[18592]: I0308 04:44:16.308849 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_084a9f42e05a600999ddea68e965db4422238b5f1e47c90a13e7441bc7j2tq7_1e476107-935e-4d00-9290-a8e242341c5a/util/0.log"
Mar 08 04:44:16.324651 master-0 kubenswrapper[18592]: I0308 04:44:16.324596 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_084a9f42e05a600999ddea68e965db4422238b5f1e47c90a13e7441bc7j2tq7_1e476107-935e-4d00-9290-a8e242341c5a/pull/0.log"
Mar 08 04:44:17.150906 master-0 kubenswrapper[18592]: I0308 04:44:17.150777 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-wxdl4_51a5fdc3-c642-430a-9da8-448bd0cceae0/controller/0.log"
Mar 08 04:44:17.159589 master-0 kubenswrapper[18592]: I0308 04:44:17.159535 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-wxdl4_51a5fdc3-c642-430a-9da8-448bd0cceae0/kube-rbac-proxy/0.log"
Mar 08 04:44:17.222630 master-0 kubenswrapper[18592]: I0308 04:44:17.222559 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f69bb_e1532f83-fda2-443b-9bdb-aa4de5c66a13/controller/0.log"
Mar 08 04:44:18.581159 master-0 kubenswrapper[18592]: I0308 04:44:18.581111 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_cluster-node-tuning-operator-66c7586884-qjv52_232c421d-96f0-4894-b8d8-74f43d02bbd3/cluster-node-tuning-operator/0.log"
Mar 08 04:44:18.629000 master-0 kubenswrapper[18592]: I0308 04:44:18.628915 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-bpwdb_e187516f-8f33-4c17-81d6-60c10b580bb0/tuned/0.log"
Mar 08 04:44:18.767115 master-0 kubenswrapper[18592]: I0308 04:44:18.767071 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f69bb_e1532f83-fda2-443b-9bdb-aa4de5c66a13/frr/0.log"
Mar 08 04:44:18.777153 master-0 kubenswrapper[18592]: I0308 04:44:18.777129 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f69bb_e1532f83-fda2-443b-9bdb-aa4de5c66a13/reloader/0.log"
Mar 08 04:44:18.784788 master-0 kubenswrapper[18592]: I0308 04:44:18.784760 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f69bb_e1532f83-fda2-443b-9bdb-aa4de5c66a13/frr-metrics/0.log"
Mar 08 04:44:18.796678 master-0 kubenswrapper[18592]: I0308 04:44:18.796420 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f69bb_e1532f83-fda2-443b-9bdb-aa4de5c66a13/kube-rbac-proxy/0.log"
Mar 08 04:44:18.809581 master-0 kubenswrapper[18592]: I0308 04:44:18.809436 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f69bb_e1532f83-fda2-443b-9bdb-aa4de5c66a13/kube-rbac-proxy-frr/0.log"
Mar 08 04:44:18.816690 master-0 kubenswrapper[18592]: I0308 04:44:18.816657 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f69bb_e1532f83-fda2-443b-9bdb-aa4de5c66a13/cp-frr-files/0.log"
Mar 08 04:44:18.843544 master-0 kubenswrapper[18592]: I0308 04:44:18.843430 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f69bb_e1532f83-fda2-443b-9bdb-aa4de5c66a13/cp-reloader/0.log"
Mar 08 04:44:18.853168 master-0 kubenswrapper[18592]: I0308 04:44:18.853137 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f69bb_e1532f83-fda2-443b-9bdb-aa4de5c66a13/cp-metrics/0.log"
Mar 08 04:44:18.865803 master-0 kubenswrapper[18592]: I0308 04:44:18.865696 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7f989f654f-lwx48_9097e49a-572d-40cd-8657-4dacb3d0f33b/frr-k8s-webhook-server/0.log"
Mar 08 04:44:18.900817 master-0 kubenswrapper[18592]: I0308 04:44:18.900774 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-c7f497896-fbcg7_d0a838e8-3b7d-4368-99bc-21c83edb251a/manager/0.log"
Mar 08 04:44:18.910326 master-0 kubenswrapper[18592]: I0308 04:44:18.910285 18592 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7d4ccb5df5-l4btv_ba434b94-1eeb-4d82-89a2-b3c8fa4a998d/webhook-server/0.log"